hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fc7b98903c66b23ec526b11b5a94c210bef7305f | 2,568 | py | Python | crunch-scrunch/src/codegen/make_tuple_ptypefamily.py | noslowerdna/crunch | 571b90c03e3010e7bb9badf4e6e441ab2164be56 | [
"Apache-2.0"
] | 84 | 2015-01-06T07:39:29.000Z | 2022-01-21T07:14:25.000Z | crunch-scrunch/src/codegen/make_tuple_ptypefamily.py | noslowerdna/crunch | 571b90c03e3010e7bb9badf4e6e441ab2164be56 | [
"Apache-2.0"
] | 12 | 2016-03-21T21:26:42.000Z | 2021-02-02T17:20:05.000Z | crunch-scrunch/src/codegen/make_tuple_ptypefamily.py | noslowerdna/crunch | 571b90c03e3010e7bb9badf4e6e441ab2164be56 | [
"Apache-2.0"
] | 86 | 2015-02-16T17:13:00.000Z | 2021-11-10T23:11:24.000Z | #!/usr/bin/env python
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
print """/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/"""
print "package org.apache.crunch.scrunch\n"
print "import org.apache.crunch.TupleN"
print "import org.apache.crunch.types.PType\n"
print "trait GeneratedTuplePTypeFamily extends BasePTypeFamily {"
print " import GeneratedTupleHelper._\n"
for j in range(5, 23):
lets = letters[0:j]
types = ", ".join(lets)
args = ", ".join(["p%d: PType[%s]" % (x, l) for (x, l) in enumerate(lets)])
print " def tuple%d[%s](%s) = {" % (j, types, args)
inout = ",".join(["t.get(%d).asInstanceOf[%s]" % (x, l) for (x, l) in enumerate(lets)])
print " val in = (t: TupleN) => (%s)" % inout
outin = ", ".join(lets)
outout = ", ".join(["t._%d" % (1 + x) for x in range(j)])
print " val out = (t: (%s)) => tupleN(%s)" % (outin, outout)
derout = ", ".join(["p%d" % x for x in range(j)])
print " derived(classOf[(%s)], in, out, ptf.tuples(%s))" % (outin, derout)
print " }\n"
print "}\n"
| 42.098361 | 89 | 0.69743 | 383 | 2,568 | 4.671018 | 0.315927 | 0.067077 | 0.029067 | 0.035774 | 0.759083 | 0.730017 | 0.730017 | 0.709894 | 0.709894 | 0.709894 | 0 | 0.006173 | 0.179907 | 2,568 | 60 | 90 | 42.8 | 0.843305 | 0.301012 | 0 | 0.054054 | 0 | 0 | 0.69668 | 0.115363 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.081081 | null | null | 0.324324 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5d7b1b2c2c946a875d85aac4aec97e90f213d33f | 117,680 | py | Python | pygsti/modelpacks/smq2Q_XXYYII.py | drewrisinger/pyGSTi | dd4ad669931c7f75e026456470cf33ac5b682d0d | [
"Apache-2.0"
] | 1 | 2021-12-19T15:11:09.000Z | 2021-12-19T15:11:09.000Z | pygsti/modelpacks/smq2Q_XXYYII.py | drewrisinger/pyGSTi | dd4ad669931c7f75e026456470cf33ac5b682d0d | [
"Apache-2.0"
] | null | null | null | pygsti/modelpacks/smq2Q_XXYYII.py | drewrisinger/pyGSTi | dd4ad669931c7f75e026456470cf33ac5b682d0d | [
"Apache-2.0"
] | null | null | null | """
Variables for working with the 2-qubit model containing the gates
I*I, I*X(pi/2), I*Y(pi/2), X(pi/2)*I, Y(pi/2)*I, X(pi/2)*X(pi/2),
Y(pi/2)*Y(pi/2), X(pi/2)*Y(pi/2), and Y(pi/2)*X(pi/2).
"""
#***************************************************************************************************
# Copyright 2015, 2019 National Technology & Engineering Solutions of Sandia, LLC (NTESS).
# Under the terms of Contract DE-NA0003525 with NTESS, the U.S. Government retains certain rights
# in this software.
# Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except
# in compliance with the License. You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0 or in the LICENSE file in the root pyGSTi directory.
#***************************************************************************************************
from collections import OrderedDict
from pygsti.construction import circuitconstruction as _strc
from pygsti.construction import modelconstruction as _setc
from pygsti.modelpacks._modelpack import GSTModelPack, RBModelPack
class _Module(GSTModelPack, RBModelPack):
description = ("I*I, I*X(pi/2), I*Y(pi/2), X(pi/2)*I, Y(pi/2)*I, X(pi/2)*X(pi/2), "
"Y(pi/2)*Y(pi/2), X(pi/2)*Y(pi/2), and Y(pi/2)*X(pi/2) gates")
gates = [(), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)]
_sslbls = (0, 1)
_germs = [((), ), (('Gxpi2', 0), ), (('Gypi2', 0), ), (('Gxpi2', 1), ), (('Gypi2', 1), ), (('Gxxpi2', 0, 1), ), (('Gxypi2', 0, 1), ),
(('Gxypi2', 0, 1), ), (('Gyypi2', 0, 1), ), (('Gxpi2', 0), ('Gypi2', 0)), (('Gxpi2', 1), ('Gypi2', 1)),
(('Gypi2', 1), ('Gypi2', 0)), (('Gxpi2', 1), ('Gxpi2', 0)), (('Gxpi2', 1), ('Gypi2', 0)), (('Gypi2', 1), ('Gxpi2', 0)),
((), ('Gxpi2', 1)), ((), ('Gypi2', 1)), ((), ('Gypi2', 0)), (('Gypi2', 1), ('Gxxpi2', 0, 1)), (('Gypi2', 0), ('Gxxpi2', 0, 1)),
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gyypi2', 0, 1)), (('Gxpi2', 1), ('Gyypi2', 0, 1)),
(('Gypi2', 0), ('Gxypi2', 0, 1)), (('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0)), (('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1)),
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0)),
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1)), (('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 0)), (('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0)),
(('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 1)), (('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 0)), (('Gxpi2', 1), ('Gypi2', 0), ('Gypi2', 1)),
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 0)), (('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0)), (('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 0)),
(('Gxpi2', 0), ('Gypi2', 0), ()), (('Gxpi2', 0), (), ('Gypi2', 0)), (('Gxpi2', 0), (), ()), (('Gypi2', 0), (), ()),
(('Gxpi2', 1), ('Gypi2', 1), ()), (('Gxpi2', 1), (), ('Gypi2', 1)), (('Gxpi2', 1), (), ()), (('Gypi2', 1), (), ()),
((), ('Gxpi2', 1), ('Gypi2', 0)), ((), ('Gypi2', 1), ('Gypi2', 0)), ((), ('Gypi2', 0), ('Gxpi2', 1)),
(('Gxpi2', 0), ('Gypi2', 0), ('Gxxpi2', 0, 1)), (('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)),
(('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)), (('Gypi2', 1), ('Gypi2', 0), ('Gxxpi2', 0, 1)),
(('Gxpi2', 1), ('Gxpi2', 1), ('Gxxpi2', 0, 1)), (('Gxpi2', 1), ('Gypi2', 1), ('Gxxpi2', 0, 1)),
(('Gxpi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0)), ((), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)),
(('Gxpi2', 1), ('Gxxpi2', 0, 1), ('Gypi2', 1)), (('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gypi2', 0)),
(('Gxpi2', 0), ('Gxpi2', 0), ('Gxxpi2', 0, 1)), (('Gxpi2', 0), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gypi2', 0), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)), (('Gypi2', 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxpi2', 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gypi2', 0), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gypi2', 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gypi2', 0), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gypi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1)), (('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gypi2', 0)), (('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gypi2', 0), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1)),
(('Gypi2', 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1)), (('Gxpi2', 0), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1)), (('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gypi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxpi2', 0), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1)),
(('Gypi2', 0), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gypi2', 0), ('Gypi2', 0), ('Gyypi2', 0, 1)),
(('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxpi2', 0)),
(('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 1)), (('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0)),
(('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1)), (('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0)),
(('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 1), ('Gypi2', 0)), (('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)),
(('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)), (('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()),
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()), (('Gypi2', 1), ('Gxpi2', 0), (), ()), (('Gxpi2', 1), (), (), ('Gxpi2', 0)),
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0)), (('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 1)),
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gypi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 0)),
(('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)),
(('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 1)),
(('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gypi2', 1)),
(('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0)),
(('Gypi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)),
(('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 0)),
(('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 0)),
(('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gypi2', 1)),
((), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)),
(('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0)),
(('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)),
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1)),
(('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0)),
(('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1)),
(('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1)),
(('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1)),
(('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0)),
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 0), ('Gxpi2', 0)),
(('Gypi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1)),
(('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 1)),
(('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 1)),
(('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gypi2', 0)),
(('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 1)),
((), ('Gypi2', 0), ('Gxpi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxxpi2', 0, 1)),
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 1), (), ('Gypi2', 1)),
(('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 1)),
(('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0))]
_germs_lite = [((), ), (('Gxpi2', 0), ), (('Gypi2', 0), ), (('Gxpi2', 1), ), (('Gypi2', 1), ), (('Gxxpi2', 0, 1), ),
(('Gxypi2', 0, 1), ), (('Gxypi2', 0, 1), ), (('Gyypi2', 0, 1), ), (('Gxpi2', 0), ('Gypi2', 0)),
(('Gxpi2', 1), ('Gypi2', 1)), (('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0)), (('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1)),
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)),
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)), (('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxpi2', 0)),
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 0)),
(('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)),
(('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0)),
(('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)),
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1)),
(('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 1))]
_fiducials = [(), (('Gxpi2', 1), ), (('Gypi2', 1), ), (('Gxpi2', 1), ('Gxpi2', 1)), (('Gxpi2', 0), ), (('Gxpi2', 0), ('Gxpi2', 1)),
(('Gxpi2', 0), ('Gypi2', 1)), (('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)), (('Gypi2', 0), ), (('Gypi2', 0), ('Gxpi2', 1)),
(('Gypi2', 0), ('Gypi2', 1)), (('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)), (('Gxpi2', 0), ('Gxpi2', 0)),
(('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 1)), (('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 1)),
(('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 1))]
_prepfiducials = [(), (('Gxpi2', 1), ), (('Gypi2', 1), ), (('Gxpi2', 1), ('Gxpi2', 1)), (('Gxpi2', 0), ), (('Gxpi2', 0), ('Gxpi2', 1)),
(('Gxpi2', 0), ('Gypi2', 1)), (('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)), (('Gypi2', 0), ), (('Gypi2', 0), ('Gxpi2', 1)),
(('Gypi2', 0), ('Gypi2', 1)), (('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)), (('Gxpi2', 0), ('Gxpi2', 0)),
(('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 1)), (('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 1)),
(('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 1))]
_measfiducials = [(), (('Gxpi2', 1), ), (('Gypi2', 1), ), (('Gxpi2', 1), ('Gxpi2', 1)), (('Gxpi2', 0), ), (('Gypi2', 0), ),
(('Gxpi2', 0), ('Gxpi2', 0)), (('Gxpi2', 0), ('Gxpi2', 1)), (('Gxpi2', 0), ('Gypi2', 1)), (('Gypi2', 0), ('Gxpi2', 1)),
(('Gypi2', 0), ('Gypi2', 1))]
_clifford_compilation = OrderedDict([
('Gc0c0', [(), (), (), (), (), (), ()]), ('Gc0c1', [('Gypi2', 1), ('Gxpi2', 1), (), (), (), (), ()]),
('Gc0c2', [('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc0c3', [('Gxpi2', 1), ('Gxpi2', 1), (), (), (), (), ()]),
('Gc0c4', [('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc0c5', [('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc0c6', [('Gypi2', 1), ('Gypi2', 1), (), (), (), (), ()]),
('Gc0c7', [('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc0c8', [('Gxpi2', 1), ('Gypi2', 1), (), (), (), (), ()]),
('Gc0c9', [('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc0c10', [('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc0c11', [('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc0c12', [('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc0c13', [('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc0c14', [('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc0c15', [('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), (), ()]), ('Gc0c16', [('Gxpi2', 1), (), (), (), (), (), ()]),
('Gc0c17', [('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc0c18', [('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc0c19', [('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), (), ()]),
('Gc0c20', [('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc0c21', [('Gypi2', 1), (), (), (), (), (), ()]),
('Gc0c22', [('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc0c23', [('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc1c0', [('Gypi2', 0), ('Gxpi2', 0), (), (), (), (), ()]),
('Gc1c1', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), (), ()]),
('Gc1c2', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc1c3', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), (), ()]),
('Gc1c4', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc1c5', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc1c6', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc1c7', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc1c8', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc1c9', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc1c10', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc1c11', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc1c12', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc1c13', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc1c14', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc1c15', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc1c16', [('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), (), ()]),
('Gc1c17', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc1c18', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc1c19', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc1c20', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc1c21', [('Gyypi2', 0, 1), ('Gxpi2', 0), (), (), (), (), ()]),
('Gc1c22', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc1c23', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc2c0', [('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c1', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c2', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ()]),
('Gc2c3', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c4', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ()]),
('Gc2c5', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c6', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c7', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c8', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c9', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c10', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c11', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c12', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c13', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c14', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1),
('Gxpi2', 1)]),
('Gc2c15', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c16', [('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c17', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c18', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ()]),
('Gc2c19', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c20', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ()]),
('Gc2c21', [('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()]),
('Gc2c22', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ()]),
('Gc2c23', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ()]),
('Gc3c0', [('Gxpi2', 0), ('Gxpi2', 0), (), (), (), (), ()]),
('Gc3c1', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), (), ()]),
('Gc3c2', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc3c3', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), (), ()]),
('Gc3c4', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc3c5', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc3c6', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc3c7', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc3c8', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc3c9', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc3c10', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc3c11', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc3c12', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc3c13', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc3c14', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc3c15', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc3c16', [('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), (), (), ()]),
('Gc3c17', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc3c18', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc3c19', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc3c20', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc3c21', [('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), (), ()]),
('Gc3c22', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc3c23', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc4c0', [('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c1', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c2', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ()]),
('Gc4c3', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c4', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ()]),
('Gc4c5', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c6', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c7', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c8', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c9', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c10', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c11', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c12', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c13', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c14', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1),
('Gxpi2', 1)]),
('Gc4c15', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c16', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c17', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c18', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ()]),
('Gc4c19', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c20', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ()]),
('Gc4c21', [('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ()]),
('Gc4c22', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ()]),
('Gc4c23', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ()]),
('Gc5c0', [('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc5c1', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc5c2', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc5c3', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc5c4', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc5c5', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc5c6', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc5c7', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc5c8', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc5c9', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc5c10', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc5c11', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc5c12', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc5c13', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc5c14', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc5c15', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc5c16', [('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc5c17', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc5c18', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc5c19', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc5c20', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc5c21', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc5c22', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), ()]),
('Gc5c23', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc6c0', [('Gypi2', 0), ('Gypi2', 0), (), (), (), (), ()]),
('Gc6c1', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc6c2', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc6c3', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc6c4', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc6c5', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc6c6', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), (), ()]),
('Gc6c7', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc6c8', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), (), ()]),
('Gc6c9', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc6c10', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc6c11', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc6c12', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc6c13', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc6c14', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc6c15', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc6c16', [('Gxypi2', 0, 1), ('Gypi2', 0), (), (), (), (), ()]),
('Gc6c17', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc6c18', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc6c19', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc6c20', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc6c21', [('Gyypi2', 0, 1), ('Gypi2', 0), (), (), (), (), ()]),
('Gc6c22', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc6c23', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc7c0', [('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc7c1', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc7c2', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc7c3', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc7c4', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc7c5', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc7c6', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc7c7', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), ()]),
('Gc7c8', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc7c9', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc7c10', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), ()]),
('Gc7c11', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc7c12', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc7c13', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc7c14', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc7c15', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc7c16', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc7c17', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc7c18', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc7c19', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc7c20', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc7c21', [('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc7c22', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), ()]),
('Gc7c23', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc8c0', [('Gxpi2', 0), ('Gypi2', 0), (), (), (), (), ()]),
('Gc8c1', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc8c2', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc8c3', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), (), ()]),
('Gc8c4', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc8c5', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc8c6', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), (), ()]),
('Gc8c7', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc8c8', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), (), ()]),
('Gc8c9', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc8c10', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc8c11', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc8c12', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc8c13', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc8c14', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc8c15', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc8c16', [('Gxxpi2', 0, 1), ('Gypi2', 0), (), (), (), (), ()]),
('Gc8c17', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc8c18', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc8c19', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), (), ()]),
('Gc8c20', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc8c21', [('Gxypi2', 0, 1), ('Gypi2', 0), (), (), (), (), ()]),
('Gc8c22', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc8c23', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc9c0', [('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc9c1', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc9c2', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc9c3', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc9c4', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc9c5', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc9c6', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc9c7', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc9c8', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc9c9', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc9c10', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc9c11', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc9c12', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc9c13', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc9c14', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc9c15', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc9c16', [('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc9c17', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc9c18', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc9c19', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc9c20', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc9c21', [('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc9c22', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), ()]),
('Gc9c23', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc10c0', [('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc10c1', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc10c2', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc10c3', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc10c4', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc10c5', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc10c6', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc10c7', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), ()]),
('Gc10c8', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc10c9', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc10c10', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), (), ()]),
('Gc10c11', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc10c12', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc10c13', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc10c14', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc10c15', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc10c16', [('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc10c17', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc10c18', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc10c19', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), ()]),
('Gc10c20', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc10c21', [('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), (), ()]),
('Gc10c22', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), ()]),
('Gc10c23', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc11c0', [('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc11c1', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc11c2', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc11c3', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc11c4', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc11c5', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc11c6', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc11c7', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc11c8', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc11c9', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc11c10', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), (), (), ()]),
('Gc11c11', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), (), (), ()]),
('Gc11c12', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc11c13', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc11c14', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc11c15', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc11c16', [('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc11c17', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc11c18', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc11c19', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), ()]),
('Gc11c20', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc11c21', [('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), (), (), ()]),
('Gc11c22', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), ()]),
('Gc11c23', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), ()]),
('Gc12c0', [('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc12c1', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc12c2', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc12c3', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc12c4', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc12c5', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc12c6', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc12c7', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc12c8', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc12c9', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc12c10', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc12c11', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc12c12', [('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc12c13', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc12c14', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc12c15', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc12c16', [('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc12c17', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc12c18', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc12c19', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc12c20', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc12c21', [('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc12c22', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc12c23', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc13c0', [('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc13c1', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc13c2', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc13c3', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc13c4', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc13c5', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc13c6', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc13c7', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc13c8', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc13c9', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc13c10', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc13c11', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc13c12', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc13c13', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc13c14', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc13c15', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc13c16', [('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc13c17', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc13c18', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc13c19', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc13c20', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc13c21', [('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc13c22', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc13c23', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc14c0', [('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c1', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c2', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0)]),
('Gc14c3', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c4', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0)]),
('Gc14c5', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c6', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c7', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c8', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c9', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c10', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c11', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c12', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c13', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c14', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)]),
('Gc14c15', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c16', [('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c17', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c18', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c19', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c20', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c21', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c22', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc14c23', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0)]),
('Gc15c0', [('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), (), ()]),
('Gc15c1', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc15c2', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc15c3', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc15c4', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc15c5', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc15c6', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc15c7', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc15c8', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc15c9', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc15c10', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc15c11', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc15c12', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc15c13', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc15c14', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc15c15', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), ()]),
('Gc15c16', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), (), ()]),
('Gc15c17', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc15c18', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc15c19', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), ()]),
('Gc15c20', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc15c21', [('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), (), ()]),
('Gc15c22', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc15c23', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc16c0', [('Gxpi2', 0), (), (), (), (), (), ()]),
('Gc16c1', [('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), (), ()]),
('Gc16c2', [('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc16c3', [('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), (), (), ()]),
('Gc16c4', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc16c5', [('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc16c6', [('Gxypi2', 0, 1), ('Gypi2', 1), (), (), (), (), ()]),
('Gc16c7', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc16c8', [('Gxxpi2', 0, 1), ('Gypi2', 1), (), (), (), (), ()]),
('Gc16c9', [('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc16c10', [('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc16c11', [('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc16c12', [('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc16c13', [('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc16c14', [('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc16c15', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), (), ()]),
('Gc16c16', [('Gxxpi2', 0, 1), (), (), (), (), (), ()]),
('Gc16c17', [('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc16c18', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc16c19', [('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), (), ()]),
('Gc16c20', [('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc16c21', [('Gxypi2', 0, 1), (), (), (), (), (), ()]),
('Gc16c22', [('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc16c23', [('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc17c0', [('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc17c1', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc17c2', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc17c3', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc17c4', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc17c5', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc17c6', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc17c7', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc17c8', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), (), (), (), ()]),
('Gc17c9', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc17c10', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc17c11', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc17c12', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc17c13', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc17c14', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc17c15', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc17c16', [('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc17c17', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), (), (), (), ()]),
('Gc17c18', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc17c19', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc17c20', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc17c21', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), (), (), ()]),
('Gc17c22', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc17c23', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc18c0', [('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c1', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c2', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ()]),
('Gc18c3', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c4', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ()]),
('Gc18c5', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc18c6', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c7', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc18c8', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c9', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc18c10', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc18c11', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc18c12', [('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c13', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c14', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc18c15', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c16', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c17', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c18', [('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc18c19', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c20', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc18c21', [('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc18c22', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), ()]),
('Gc18c23', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc19c0', [('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), (), (), ()]),
('Gc19c1', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc19c2', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc19c3', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc19c4', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc19c5', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc19c6', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc19c7', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc19c8', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), (), (), ()]),
('Gc19c9', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc19c10', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), ()]),
('Gc19c11', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), (), (), ()]),
('Gc19c12', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc19c13', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc19c14', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc19c15', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), ()]),
('Gc19c16', [('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), (), ()]),
('Gc19c17', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), (), (), ()]),
('Gc19c18', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc19c19', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), (), (), ()]),
('Gc19c20', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc19c21', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), (), (), ()]),
('Gc19c22', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc19c23', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc20c0', [('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c1', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c2', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ()]),
('Gc20c3', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c4', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ()]),
('Gc20c5', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc20c6', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c7', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc20c8', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c9', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc20c10', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc20c11', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc20c12', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c13', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c14', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc20c15', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c16', [('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c17', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c18', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc20c19', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c20', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc20c21', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 0), (), ()]),
('Gc20c22', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), ()]),
('Gc20c23', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc21c0', [('Gypi2', 0), (), (), (), (), (), ()]),
('Gc21c1', [('Gyypi2', 0, 1), ('Gxpi2', 1), (), (), (), (), ()]),
('Gc21c2', [('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()]),
('Gc21c3', [('Gxypi2', 0, 1), ('Gxpi2', 1), (), (), (), (), ()]),
('Gc21c4', [('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ()]),
('Gc21c5', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc21c6', [('Gyypi2', 0, 1), ('Gypi2', 1), (), (), (), (), ()]),
('Gc21c7', [('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc21c8', [('Gxypi2', 0, 1), ('Gypi2', 1), (), (), (), (), ()]),
('Gc21c9', [('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc21c10', [('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), ()]),
('Gc21c11', [('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), (), (), ()]),
('Gc21c12', [('Gyypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc21c13', [('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc21c14', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc21c15', [('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), (), ()]),
('Gc21c16', [('Gxypi2', 0, 1), (), (), (), (), (), ()]),
('Gc21c17', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), (), (), (), ()]),
('Gc21c18', [('Gyypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc21c19', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), (), (), (), ()]),
('Gc21c20', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), (), ()]),
('Gc21c21', [('Gyypi2', 0, 1), (), (), (), (), (), ()]),
('Gc21c22', [('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), (), ()]),
('Gc21c23', [('Gxypi2', 0, 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), (), ()]),
('Gc22c0', [('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c1', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c2', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 1), ()]),
('Gc22c3', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c4', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ()]),
('Gc22c5', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), ()]),
('Gc22c6', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c7', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), ()]),
('Gc22c8', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c9', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), ()]),
('Gc22c10', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), (), ()]),
('Gc22c11', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gypi2', 0), (), ()]),
('Gc22c12', [('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c13', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c14', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc22c15', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c16', [('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c17', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c18', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), ()]),
('Gc22c19', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c20', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), (), ()]),
('Gc22c21', [('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), (), ()]),
('Gc22c22', [('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1), (), ()]),
('Gc22c23', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), ()]),
('Gc23c0', [('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c1', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c2', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gypi2', 1), ()]),
('Gc23c3', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c4', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ()]),
('Gc23c5', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc23c6', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c7', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc23c8', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c9', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc23c10', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc23c11', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), (), ()]),
('Gc23c12', [('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c13', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c14', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 1)]),
('Gc23c15', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c16', [('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c17', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c18', [('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc23c19', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c20', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), (), ()]),
('Gc23c21', [('Gxypi2', 0, 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), (), ()]),
('Gc23c22', [('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), (), ()]),
('Gc23c23', [('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), (), ()])
])
global_fidPairs = [(0, 4), (0, 5), (1, 6), (2, 0), (2, 4), (2, 10), (3, 1), (3, 3), (3, 4), (3, 10), (4, 3), (4, 4),
(4, 5), (4, 6), (4, 10), (5, 2), (6, 5), (7, 1), (7, 2), (7, 5), (7, 7), (7, 8), (7, 10), (8, 6),
(8, 7), (8, 10), (9, 8), (10, 6), (10, 9), (10, 10), (11, 1), (11, 8), (12, 3), (13, 5), (13, 6),
(14, 4), (14, 7)]
_pergerm_fidPairsDict = {
(('Gxpi2', 1), ): [(0, 5), (1, 0), (1, 1), (2, 2), (2, 5), (2, 9), (3, 3), (3, 4), (3, 8), (4, 0), (4, 2), (4, 7),
(4, 8), (4, 10), (5, 0), (5, 1), (5, 2), (5, 6), (5, 8), (6, 7), (6, 8), (6, 9), (7, 0), (7, 4),
(8, 5), (8, 9), (9, 5), (10, 8), (10, 10), (12, 2), (12, 4), (12, 7), (13, 2), (13, 3), (13, 9),
(14, 0), (14, 5), (14, 6), (15, 5), (15, 8), (15, 9)],
(('Gyxpi2', 0, 1), ): [(0, 5), (0, 9), (1, 6), (3, 1), (3, 2), (5, 0), (5, 4), (6, 0), (6, 8), (9, 7), (10, 9), (11, 1),
(11, 4), (14, 4), (14, 9), (15, 5), (15, 7)],
(('Gypi2', 0), ): [(3, 1), (4, 1), (4, 2), (5, 0), (5, 1), (5, 7), (6, 0), (6, 8), (7, 2), (7, 4), (7, 9), (8, 0),
(8, 7), (9, 2), (9, 3), (10, 9), (10, 10), (14, 7), (14, 9), (15, 10)],
(('Gypi2', 1), ): [(0, 0), (0, 7), (1, 1), (3, 5), (3, 6), (4, 2), (4, 4), (4, 5), (5, 3), (5, 7), (7, 1), (7, 8),
(8, 5), (9, 4), (9, 5), (9, 9), (10, 5), (11, 5), (11, 6), (11, 8), (11, 10), (12, 0), (12, 3),
(13, 10), (14, 0), (14, 5), (14, 6), (14, 7), (15, 0), (15, 6), (15, 9)],
(('Gyypi2', 0, 1), ): [(0, 6), (0, 8), (0, 10), (1, 0), (1, 1), (1, 3), (2, 9), (3, 8), (4, 4), (4, 7), (5, 7), (6, 1),
(7, 0), (7, 8), (9, 10), (10, 5), (11, 5), (12, 5), (12, 6), (14, 0), (15, 0), (15, 6), (15, 8)],
(('Gxxpi2', 0, 1), ): [(0, 0), (1, 5), (2, 4), (3, 3), (3, 5), (5, 2), (6, 1), (6, 8), (6, 10), (8, 6), (10, 2), (10, 8),
(10, 10), (11, 8), (12, 1), (13, 1), (13, 4), (13, 6), (13, 10), (14, 8), (15, 3)],
((), ): [(0, 8), (1, 0), (1, 1), (1, 3), (1, 10), (2, 5), (2, 9), (3, 3), (3, 9), (4, 3), (4, 8), (5, 0),
(5, 5), (5, 7), (6, 4), (6, 6), (6, 8), (6, 10), (7, 0), (7, 2), (7, 3), (7, 4), (7, 6), (7, 10),
(8, 3), (8, 5), (9, 3), (9, 4), (9, 5), (9, 6), (9, 8), (9, 9), (10, 3), (10, 9), (10, 10), (11, 1),
(11, 5), (12, 5), (12, 7), (12, 9), (13, 0), (13, 10), (14, 0), (14, 1), (14, 2), (14, 6), (15, 0),
(15, 5), (15, 6), (15, 7), (15, 8)],
(('Gxypi2', 0, 1), ): [(1, 1), (2, 8), (3, 0), (3, 2), (3, 6), (4, 7), (7, 2), (8, 6), (9, 1), (9, 7), (9, 9), (10, 2),
(10, 10), (11, 8), (12, 6), (13, 2), (13, 7), (14, 2), (15, 5)],
(('Gxpi2', 0), ): [(0, 7), (1, 1), (1, 7), (2, 7), (3, 3), (4, 9), (5, 4), (7, 2), (7, 10), (8, 2), (9, 2), (9, 8),
(9, 9), (10, 1), (10, 10), (11, 2), (11, 5), (11, 6), (13, 2), (14, 7), (15, 2), (15, 3)],
(('Gypi2', 1), ('Gypi2', 0)): [(0, 6), (0, 8), (0, 10), (1, 0), (1, 1), (1, 3), (2, 9), (3, 8), (4, 4), (4, 7), (5, 7),
(6, 1), (7, 0), (7, 8), (9, 10), (10, 5), (11, 5), (12, 5), (12, 6), (14, 0), (15, 0), (15, 6),
(15, 8)],
(('Gxxpi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8), (12, 2),
(12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxpi2', 0)): [(1, 1), (2, 8), (3, 0), (3, 2), (3, 6), (4, 7), (7, 2), (8, 6), (9, 1), (9, 7), (9, 9),
(10, 2), (10, 10), (11, 8), (12, 6), (13, 2), (13, 7), (14, 2), (15, 5)],
(('Gypi2', 0), ('Gxxpi2', 0, 1)): [(1, 10), (2, 10), (4, 8), (5, 5), (5, 6), (6, 10), (7, 0), (7, 5), (7, 6), (7, 8), (8, 5),
(12, 5), (13, 0), (13, 2), (14, 1)],
((), ('Gypi2', 1)): [(0, 0), (0, 7), (1, 1), (3, 5), (3, 6), (4, 2), (4, 4), (4, 5), (5, 3), (5, 7), (7, 1), (7, 8),
(8, 5), (9, 4), (9, 5), (9, 9), (10, 5), (11, 5), (11, 6), (11, 8), (11, 10), (12, 0), (12, 3),
(13, 10), (14, 0), (14, 5), (14, 6), (14, 7), (15, 0), (15, 6), (15, 9)],
(('Gxxpi2', 0, 1), ('Gxypi2', 0, 1)): [(1, 1), (2, 5), (4, 3), (5, 5), (6, 3), (7, 1), (10, 2), (10, 5), (11, 2), (11, 5), (12, 7),
(12, 10), (13, 0), (13, 4), (14, 5)],
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 9), (1, 1), (1, 9), (2, 7), (3, 4), (4, 4), (4, 10), (6, 0), (6, 3), (7, 0), (9, 4),
(11, 5), (12, 4), (13, 7), (14, 0)],
(('Gxpi2', 1), ('Gypi2', 1)): [(1, 0), (1, 10), (4, 0), (4, 4), (4, 7), (4, 8), (5, 5), (7, 6), (8, 9), (9, 9), (10, 2),
(10, 8), (11, 10), (12, 6), (12, 9), (13, 9), (15, 1)],
(('Gxpi2', 1), ('Gyypi2', 0, 1)): [(3, 0), (4, 4), (5, 1), (5, 8), (6, 5), (7, 3), (8, 6), (8, 7), (9, 5), (10, 3), (11, 4),
(14, 0), (14, 6), (14, 9), (15, 5)],
(('Gxpi2', 1), ('Gxpi2', 0)): [(0, 0), (1, 5), (2, 4), (3, 3), (3, 5), (5, 2), (6, 1), (6, 8), (6, 10), (8, 6), (10, 2),
(10, 8), (10, 10), (11, 8), (12, 1), (13, 1), (13, 4), (13, 6), (13, 10), (14, 8), (15, 3)],
((), ('Gypi2', 0)): [(3, 1), (4, 1), (4, 2), (5, 0), (5, 1), (5, 7), (6, 0), (6, 8), (7, 2), (7, 4), (7, 9), (8, 0),
(8, 7), (9, 2), (9, 3), (10, 9), (10, 10), (14, 7), (14, 9), (15, 10)],
(('Gxpi2', 1), ('Gypi2', 0)): [(0, 5), (0, 9), (1, 6), (3, 1), (3, 2), (5, 0), (5, 4), (6, 0), (6, 8), (9, 7), (10, 9),
(11, 1), (11, 4), (14, 4), (14, 9), (15, 5), (15, 7)],
(('Gxxpi2', 0, 1), ('Gyxpi2', 0, 1)): [(1, 10), (2, 10), (4, 8), (5, 5), (5, 6), (6, 10), (7, 0), (7, 5), (7, 6), (7, 8), (8, 5),
(12, 5), (13, 0), (13, 2), (14, 1)],
(('Gypi2', 0), ('Gxypi2', 0, 1)): [(0, 9), (1, 1), (1, 9), (2, 7), (3, 4), (4, 4), (4, 10), (6, 0), (6, 3), (7, 0), (9, 4),
(11, 5), (12, 4), (13, 7), (14, 0)],
((), ('Gxpi2', 1)): [(0, 5), (1, 0), (1, 1), (2, 2), (2, 5), (2, 9), (3, 3), (3, 4), (3, 8), (4, 0), (4, 2), (4, 7),
(4, 8), (4, 10), (5, 0), (5, 1), (5, 2), (5, 6), (5, 8), (6, 7), (6, 8), (6, 9), (7, 0),
(7, 4), (8, 5), (8, 9), (9, 5), (10, 8), (10, 10), (12, 2), (12, 4), (12, 7), (13, 2), (13, 3),
(13, 9), (14, 0), (14, 5), (14, 6), (15, 5), (15, 8), (15, 9)],
(('Gxpi2', 0), ('Gypi2', 0)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxxpi2', 0, 1)): [(0, 6), (3, 0), (5, 0), (6, 7), (7, 1), (8, 3), (9, 9), (10, 4), (10, 9), (12, 9), (13, 2),
(14, 5), (14, 8), (14, 10), (15, 6)],
(('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gyypi2', 0, 1), ('Gyxpi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 3), (0, 9), (2, 3), (2, 6), (3, 10), (5, 7), (6, 0), (7, 2), (7, 6), (7, 7),
(8, 1), (8, 5), (9, 4), (14, 10)],
(('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gxpi2', 0), ('Gxxpi2', 0, 1)): [(0, 6), (1, 3), (1, 7), (1, 10), (2, 10), (4, 1), (5, 1), (5, 5), (7, 3), (8, 2),
(8, 3), (9, 8), (10, 1), (10, 6), (10, 10), (11, 7), (15, 3)],
(('Gxxpi2', 0, 1), ('Gyxpi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 0)): [(1, 10), (2, 10), (4, 8), (5, 5), (5, 6), (6, 10), (7, 0), (7, 5), (7, 6), (7, 8),
(8, 5), (12, 5), (13, 0), (13, 2), (14, 1)],
(('Gxxpi2', 0, 1), ('Gyxpi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 1)): [(0, 6), (3, 0), (5, 0), (6, 7), (7, 1), (8, 3), (9, 9), (10, 4), (10, 9), (12, 9),
(13, 2), (14, 5), (14, 8), (14, 10), (15, 6)],
(('Gxpi2', 0), ('Gyxpi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), (), ()): [(0, 7), (1, 1), (1, 7), (2, 7), (3, 3), (4, 9), (5, 4), (7, 2), (7, 10), (8, 2), (9, 2),
(9, 8), (9, 9), (10, 1), (10, 10), (11, 2), (11, 5), (11, 6), (13, 2), (14, 7), (15, 2),
(15, 3)],
((), ('Gypi2', 1), ('Gypi2', 0)): [(0, 6), (0, 8), (0, 10), (1, 0), (1, 1), (1, 3), (2, 9), (3, 8), (4, 4), (4, 7), (5, 7),
(6, 1), (7, 0), (7, 8), (9, 10), (10, 5), (11, 5), (12, 5), (12, 6), (14, 0), (15, 0),
(15, 6), (15, 8)],
(('Gypi2', 1), ('Gyxpi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gyxpi2', 0, 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 0)): [(0, 1), (4, 2), (4, 7), (6, 7), (8, 3), (9, 5), (9, 7), (10, 0), (10, 4), (10, 5),
(11, 2), (11, 9), (14, 6), (14, 8), (15, 3)],
(('Gypi2', 1), (), ()): [(0, 0), (0, 7), (1, 1), (3, 5), (3, 6), (4, 2), (4, 4), (4, 5), (5, 3), (5, 7), (7, 1),
(7, 8), (8, 5), (9, 4), (9, 5), (9, 9), (10, 5), (11, 5), (11, 6), (11, 8), (11, 10),
(12, 0), (12, 3), (13, 10), (14, 0), (14, 5), (14, 6), (14, 7), (15, 0), (15, 6),
(15, 9)],
(('Gypi2', 1), ('Gyxpi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0)): [(1, 5), (3, 3), (4, 1), (6, 1), (6, 6), (6, 8), (8, 6), (10, 10), (11, 8), (13, 1),
(13, 4), (13, 6), (13, 10), (14, 8), (15, 3)],
(('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gyxpi2', 0, 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), (), ()): [(3, 1), (4, 1), (4, 2), (5, 0), (5, 1), (5, 7), (6, 0), (6, 8), (7, 2), (7, 4), (7, 9),
(8, 0), (8, 7), (9, 2), (9, 3), (10, 9), (10, 10), (14, 7), (14, 9), (15, 10)],
((), ('Gxpi2', 1), ('Gypi2', 0)): [(0, 5), (0, 9), (1, 6), (3, 1), (3, 2), (5, 0), (5, 4), (6, 0), (6, 8), (9, 7), (10, 9),
(11, 1), (11, 4), (14, 4), (14, 9), (15, 5), (15, 7)],
(('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gypi2', 0)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gxxpi2', 0, 1), ('Gypi2', 1)): [(0, 3), (1, 0), (1, 4), (3, 10), (4, 3), (5, 7), (7, 2), (7, 4), (7, 7), (7, 8), (8, 1),
(8, 5), (8, 7), (8, 9), (9, 2), (9, 6), (10, 3), (14, 10), (15, 4)],
(('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gypi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 0), ('Gypi2', 1)): [(3, 0), (4, 4), (5, 1), (5, 8), (6, 5), (7, 3), (8, 6), (8, 7), (9, 5), (10, 3),
(11, 4), (14, 0), (14, 6), (14, 9), (15, 5)],
(('Gypi2', 1), ('Gypi2', 0), ('Gxxpi2', 0, 1)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), (), ('Gypi2', 0)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 0)): [(1, 7), (2, 2), (4, 8), (7, 2), (7, 10), (8, 6), (9, 8), (9, 9), (10, 1), (11, 4),
(11, 9), (12, 8), (12, 9), (13, 0), (13, 1), (13, 9)],
(('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gyypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 2), (1, 0), (1, 4), (1, 9), (2, 4), (2, 10), (4, 3), (7, 4), (7, 8), (8, 7), (8, 9),
(9, 2), (9, 6), (10, 3), (15, 4)],
(('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1)): [(0, 4), (0, 5), (0, 7), (1, 1), (1, 6), (2, 3), (4, 10), (5, 4), (6, 8),
(7, 4), (7, 10), (8, 8), (8, 9), (10, 5), (11, 5), (11, 6), (11, 9), (13, 10), (14, 1),
(14, 9)],
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gypi2', 0), ()): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1)): [(0, 0), (0, 6), (1, 0), (1, 10), (4, 0), (4, 4), (4, 7), (4, 8), (5, 5), (6, 7), (7, 6),
(8, 9), (9, 9), (10, 2), (10, 8), (11, 10), (12, 6), (12, 9), (13, 1), (13, 9),
(15, 1)],
(('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
((), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)): [(1, 5), (3, 3), (4, 1), (6, 1), (6, 6), (6, 8), (8, 6), (10, 10), (11, 8), (13, 1),
(13, 4), (13, 6), (13, 10), (14, 8), (15, 3)],
(('Gypi2', 0), ('Gypi2', 0), ('Gyypi2', 0, 1)): [(0, 2), (1, 1), (1, 4), (2, 1), (2, 10), (3, 10), (4, 0), (5, 3), (5, 7), (6, 4),
(6, 10), (8, 2), (8, 3), (9, 0), (10, 8), (11, 1), (11, 7), (13, 1), (13, 8)],
(('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0)): [(0, 9), (1, 1), (1, 9), (2, 7), (3, 4), (4, 4), (4, 10), (6, 0), (6, 3), (7, 0), (9, 4),
(11, 5), (12, 4), (13, 7), (14, 0)],
(('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 0)): [(0, 9), (1, 1), (1, 9), (2, 7), (3, 4), (4, 4), (4, 10), (6, 0), (6, 3), (7, 0), (9, 4),
(11, 5), (12, 4), (13, 7), (14, 0)],
((), ('Gypi2', 0), ('Gxpi2', 1)): [(0, 5), (0, 9), (1, 6), (3, 1), (3, 2), (5, 0), (5, 4), (6, 0), (6, 8), (9, 7), (10, 9),
(11, 1), (11, 4), (14, 4), (14, 9), (15, 5), (15, 7)],
(('Gypi2', 1), ('Gyxpi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0)): [(0, 6), (3, 0), (5, 0), (6, 7), (7, 1), (8, 3), (9, 9), (10, 4), (10, 9), (12, 9),
(13, 2), (14, 5), (14, 8), (14, 10), (15, 6)],
(('Gxpi2', 1), ('Gxpi2', 1), ('Gxxpi2', 0, 1)): [(0, 0), (1, 5), (2, 4), (3, 3), (3, 5), (5, 2), (6, 1), (6, 8), (6, 10), (8, 6),
(10, 2), (10, 8), (10, 10), (11, 8), (12, 1), (13, 1), (13, 4), (13, 6), (13, 10),
(14, 8), (15, 3)],
(('Gxpi2', 0), ('Gypi2', 0), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gyxpi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxypi2', 0, 1), ('Gyxpi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gyypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 2), (1, 0), (1, 4), (1, 9), (2, 4), (2, 10), (4, 3), (7, 4), (7, 8), (8, 7), (8, 9),
(9, 2), (9, 6), (10, 3), (15, 4)],
(('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxypi2', 0, 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 1), ()): [(1, 0), (1, 10), (4, 0), (4, 4), (4, 7), (4, 8), (5, 5), (7, 6), (8, 9), (9, 9),
(10, 2), (10, 8), (11, 10), (12, 6), (12, 9), (13, 9), (15, 1)],
(('Gyypi2', 0, 1), ('Gyxpi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), (), ('Gypi2', 1)): [(1, 0), (1, 10), (4, 0), (4, 4), (4, 7), (4, 8), (5, 5), (7, 6), (8, 9), (9, 9),
(10, 2), (10, 8), (11, 10), (12, 6), (12, 9), (13, 9), (15, 1)],
(('Gxpi2', 1), (), ()): [(0, 5), (1, 0), (1, 1), (2, 2), (2, 5), (2, 9), (3, 3), (3, 4), (3, 8), (4, 0), (4, 2),
(4, 7), (4, 8), (4, 10), (5, 0), (5, 1), (5, 2), (5, 6), (5, 8), (6, 7), (6, 8), (6, 9),
(7, 0), (7, 4), (8, 5), (8, 9), (9, 5), (10, 8), (10, 10), (12, 2), (12, 4), (12, 7),
(13, 2), (13, 3), (13, 9), (14, 0), (14, 5), (14, 6), (15, 5), (15, 8), (15, 9)],
(('Gypi2', 0), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gyxpi2', 0, 1), ('Gxypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 1)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0),
(9, 3), (9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), (), (), ('Gxpi2', 0)): [(0, 0), (1, 5), (2, 4), (3, 3), (3, 5), (5, 2), (6, 1), (6, 8), (6, 10), (8, 6),
(10, 2), (10, 8), (10, 10), (11, 8), (12, 1), (13, 1), (13, 4), (13, 6),
(13, 10), (14, 8), (15, 3)],
(('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0),
(9, 3), (9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyxpi2', 0, 1), ('Gyxpi2', 0, 1)): [(1, 10), (2, 10), (4, 8), (5, 5), (5, 6), (6, 10), (7, 0), (7, 5), (7, 6),
(7, 8), (8, 5), (12, 5), (13, 0), (13, 2), (14, 1)],
(('Gypi2', 1), ('Gxpi2', 0), (), ()): [(1, 1), (2, 8), (3, 0), (3, 2), (3, 6), (4, 7), (7, 2), (8, 6), (9, 1), (9, 7),
(9, 9), (10, 2), (10, 10), (11, 8), (12, 6), (13, 2), (13, 7), (14, 2), (15, 5)],
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gypi2', 0), ('Gypi2', 0)): [(0, 1), (0, 3), (0, 9), (2, 3), (2, 6), (3, 10), (5, 7), (6, 0), (7, 2), (7, 6),
(7, 7), (8, 1), (8, 5), (9, 4), (14, 10)],
(('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1)): [(0, 5), (0, 9), (1, 6), (3, 1), (3, 2), (5, 0), (5, 4), (6, 0), (6, 8), (9, 7),
(10, 9), (11, 1), (11, 4), (14, 4), (14, 9), (15, 5), (15, 7)],
(('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 1), ('Gypi2', 0)): [(0, 2), (1, 1), (1, 4), (2, 1), (2, 10), (3, 10), (4, 0), (5, 3), (5, 7), (6, 4),
(6, 10), (8, 2), (8, 3), (9, 0), (10, 8), (11, 1), (11, 7), (13, 1), (13, 8)],
(('Gyxpi2', 0, 1), ('Gyxpi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)): [(1, 10), (2, 10), (4, 8), (5, 5), (5, 6), (6, 10), (7, 0), (7, 5), (7, 6),
(7, 8), (8, 5), (12, 5), (13, 0), (13, 2), (14, 1)],
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ()): [(0, 4), (0, 5), (0, 7), (1, 1), (1, 6), (2, 3), (4, 10), (5, 4), (6, 8), (7, 4),
(7, 10), (8, 8), (8, 9), (10, 5), (11, 5), (11, 6), (11, 9), (13, 10), (14, 1),
(14, 9)],
(('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)): [(1, 10), (2, 10), (4, 8), (5, 5), (5, 6), (6, 10), (7, 0), (7, 5), (7, 6),
(7, 8), (8, 5), (12, 5), (13, 0), (13, 2), (14, 1)],
(('Gyxpi2', 0, 1), ('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxpi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8), (5, 5), (7, 0),
(9, 3), (9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1)): [(1, 0), (1, 10), (4, 0), (4, 4), (4, 7), (4, 8), (5, 5), (7, 6), (8, 9), (9, 9),
(10, 2), (10, 8), (11, 10), (12, 6), (12, 9), (13, 9), (15, 1)],
(('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0), ()): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
((), ('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1), ('Gyypi2', 0, 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gyxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 0), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)): [(3, 0), (4, 4), (5, 1), (5, 8), (6, 5), (7, 3), (8, 6), (8, 7), (9, 5),
(10, 3), (11, 4), (14, 0), (14, 6), (14, 9), (15, 5)],
(('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 0)): [(1, 5), (3, 3), (4, 1), (6, 1), (6, 6), (6, 8), (8, 6), (10, 10), (11, 8),
(13, 1), (13, 4), (13, 6), (13, 10), (14, 8), (15, 3)],
(('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10),
(10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gxpi2', 0), ('Gxpi2', 0)): [(1, 1), (2, 5), (4, 3), (5, 5), (6, 3), (7, 1), (10, 2), (10, 5),
(11, 2), (11, 5), (12, 7), (12, 10), (13, 0), (13, 4), (14, 5)],
(('Gypi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 1), ('Gypi2', 1)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8),
(5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8), (12, 2), (12, 6),
(14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8),
(5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8), (12, 2), (12, 6),
(14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9),
(9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9),
(9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxypi2', 0, 1), ('Gyxpi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9),
(9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 1)): [(0, 4), (0, 6), (1, 1), (2, 2), (4, 1), (4, 3), (5, 1), (5, 3),
(6, 10), (8, 2), (8, 8), (9, 4), (10, 7), (12, 1), (13, 2),
(15, 6), (15, 9)],
(('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gypi2', 0)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10), (3, 8),
(5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8), (12, 2), (12, 6),
(14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9),
(9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 0)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9),
(9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 0), ('Gypi2', 1), ('Gypi2', 0), ('Gxpi2', 1)): [(0, 3), (1, 0), (1, 4), (3, 10), (4, 3), (5, 7), (7, 2), (7, 4),
(7, 7), (7, 8), (8, 1), (8, 5), (8, 7), (8, 9), (9, 2), (9, 6),
(10, 3), (14, 10), (15, 4)],
(('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gypi2', 1)): [(1, 0), (1, 10), (4, 0), (4, 4), (4, 7), (4, 8), (5, 5), (7, 6),
(8, 9), (9, 9), (10, 2), (10, 8), (11, 10), (12, 6), (12, 9),
(13, 9), (15, 1)],
(('Gypi2', 0), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3), (9, 9),
(9, 10), (10, 8), (12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 0), ('Gxpi2', 1), (), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6),
(15, 0), (15, 5)],
((), ('Gypi2', 0), ('Gxpi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxxpi2', 0, 1)): [(0, 1), (0, 2), (0, 5), (1, 3), (1, 9), (2, 4), (2, 10),
(3, 8), (5, 5), (7, 0), (9, 3), (9, 9), (9, 10), (10, 8),
(12, 2), (12, 6), (14, 6), (15, 0), (15, 5)],
(('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0), (9, 3),
(9, 9), (9, 10), (10, 8), (12, 2), (12, 6), (14, 6),
(15, 0), (15, 5)],
(('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 1)): [(0, 1), (0, 5), (1, 3), (3, 8), (5, 5), (7, 0),
(9, 3), (9, 9), (9, 10), (10, 8), (12, 2), (12, 6),
(14, 6), (15, 0), (15, 5)],
(('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0)): [(1, 1), (2, 5), (4, 3), (5, 5), (6, 3), (7, 1),
(10, 2), (10, 5), (11, 2), (11, 5), (12, 7),
(12, 10), (13, 0), (13, 4), (14, 5)]
}
global_fidPairs_lite = None
_pergerm_fidPairsDict_lite = {
((),): [
(0, 3), (0, 4), (0, 6), (0, 7), (0, 8), (0, 9), (1, 1),
(1, 4), (1, 5), (1, 9), (2, 1), (2, 3), (2, 4), (2, 6),
(2, 7), (3, 5), (3, 7), (4, 0), (4, 2), (4, 4), (4, 9),
(5, 0), (5, 2), (5, 7), (5, 9), (5, 10), (6, 0), (6, 1),
(6, 2), (6, 3), (6, 4), (6, 8), (6, 9), (7, 6), (7, 7),
(8, 0), (8, 2), (8, 6), (8, 7), (8, 10), (9, 2), (9, 8),
(9, 9), (9, 10), (10, 4), (10, 7), (10, 8), (10, 9),
(10, 10), (11, 3), (11, 5), (11, 8), (11, 10), (12, 8),
(13, 2), (13, 7), (13, 9), (13, 10), (14, 1), (14, 2),
(14, 7), (14, 8), (14, 10), (15, 4), (15, 8)],
(('Gxpi2', 0),): [
(0, 8), (1, 2), (1, 7), (2, 4), (2, 6), (2, 7), (3, 3),
(3, 5), (3, 8), (4, 8), (5, 0), (5, 4), (5, 5), (5, 7),
(6, 8), (7, 9), (8, 3), (8, 5), (9, 0), (9, 2), (9, 9),
(10, 2), (10, 8), (10, 10), (11, 7), (11, 10), (12, 10),
(13, 1), (13, 2), (13, 6), (13, 9), (14, 0), (14, 5),
(14, 8), (15, 10)],
(('Gypi2', 0),): [
(1, 0), (1, 1), (1, 3), (1, 4), (2, 1), (2, 4), (3, 1),
(3, 2), (3, 8), (4, 3), (4, 5), (5, 7), (6, 0), (6, 1),
(6, 2), (6, 4), (6, 8), (7, 4), (7, 7), (8, 1), (9, 4),
(10, 2), (10, 7), (11, 2), (11, 6), (11, 7), (11, 10),
(12, 2), (12, 3), (12, 8), (13, 9), (13, 10), (15, 6),
(15, 9)],
(('Gxpi2', 1),): [
(0, 4), (0, 9), (1, 5), (1, 8), (1, 9), (1, 10), (2, 2),
(2, 6), (2, 7), (3, 5), (4, 1), (4, 7), (5, 0), (5, 2),
(5, 5), (5, 6), (5, 7), (5, 8), (6, 6), (6, 8), (7, 1),
(7, 5), (7, 9), (7, 10), (8, 5), (8, 6), (9, 0), (9, 4),
(9, 5), (9, 7), (10, 2), (10, 10), (12, 1), (12, 3),
(12, 9), (13, 4), (13, 7), (13, 8), (14, 2), (14, 6),
(14, 8), (15, 5)],
(('Gypi2', 1),): [
(0, 3), (2, 7), (3, 2), (3, 3), (3, 6), (4, 6), (4, 8),
(5, 7), (5, 10), (6, 8), (8, 0), (8, 4), (8, 10), (9, 9),
(10, 6), (12, 1), (12, 3), (12, 8), (13, 0), (13, 1),
(13, 6), (13, 10), (14, 0), (14, 1), (15, 0), (15, 5),
(15, 8)],
(('Gxxpi2', 0, 1),): [
(1, 1), (1, 4), (3, 0), (3, 2), (3, 4), (3, 6), (3, 9),
(4, 0), (4, 8), (7, 0), (7, 8), (8, 2), (8, 4), (9, 8),
(10, 10), (11, 6), (11, 10), (12, 4), (12, 8), (13, 6),
(14, 5)],
(('Gxypi2', 0, 1),): [
(0, 10), (1, 1), (2, 1), (2, 6), (2, 7), (3, 1), (5, 1),
(5, 4), (5, 8), (7, 4), (8, 1), (9, 0), (9, 9), (10, 0),
(12, 2), (12, 6), (13, 1), (13, 5), (14, 0), (14, 4),
(14, 9), (15, 5)],
(('Gyypi2', 0, 1),): [
(0, 5), (0, 10), (1, 7), (2, 6), (2, 9), (3, 10), (4, 0),
(4, 1), (4, 9), (5, 7), (6, 8), (8, 6), (9, 1), (9, 2),
(9, 6), (9, 8), (10, 0), (10, 5), (13, 2), (13, 3), (14, 2)],
(('Gxpi2', 0), ('Gypi2', 0)): [
(0, 1), (0, 4), (0, 7), (0, 8), (1, 10), (2, 1), (2, 3),
(2, 7), (5, 5), (8, 0), (8, 2), (10, 2), (13, 6), (15, 0),
(15, 6)],
(('Gxpi2', 1), ('Gypi2', 1)): [
(1, 3), (2, 0), (3, 6), (5, 3), (5, 7), (5, 8), (6, 6),
(6, 7), (7, 10), (8, 9), (8, 10), (11, 5), (12, 0), (12, 5),
(12, 6), (12, 10), (13, 6), (13, 9), (14, 1), (14, 2),
(15, 4)],
(('Gxpi2', 0), ('Gxpi2', 0), ('Gypi2', 0)): [
(0, 1), (0, 4), (0, 7), (0, 8), (1, 10), (2, 1), (2, 3),
(2, 7), (5, 5), (8, 0), (8, 2), (10, 2), (13, 6), (15, 0),
(15, 6)],
(('Gxpi2', 1), ('Gxpi2', 1), ('Gypi2', 1)): [
(0, 0), (0, 2), (0, 3), (1, 2), (2, 4), (3, 8), (4, 5),
(4, 7), (5, 7), (6, 4), (6, 9), (8, 5), (8, 7), (8, 9),
(9, 1), (10, 3), (10, 4), (11, 8), (11, 10), (12, 4),
(12, 10), (14, 9), (15, 5), (15, 8)],
(('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxypi2', 0, 1)): [
(0, 10), (3, 6), (4, 1), (6, 3), (6, 4), (6, 10), (7, 1),
(8, 1), (8, 10), (9, 7), (10, 6), (10, 8), (11, 5), (13, 1),
(15, 4)],
(('Gxxpi2', 0, 1), ('Gxypi2', 0, 1), ('Gyypi2', 0, 1)): [
(0, 1), (0, 4), (0, 7), (0, 8), (1, 10), (2, 1), (2, 3),
(2, 7), (5, 5), (8, 0), (8, 2), (10, 2), (13, 6), (15, 0),
(15, 6)],
(('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gxypi2', 0, 1), ('Gxpi2', 0)): [
(1, 0), (1, 5), (1, 6), (2, 0), (6, 3), (7, 0), (8, 2),
(10, 9), (11, 2), (11, 10), (12, 10), (13, 5), (13, 10),
(15, 0), (15, 7)],
(('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1), ('Gxpi2', 0)): [
(1, 0), (2, 9), (3, 10), (4, 3), (6, 4), (6, 9), (7, 10),
(9, 7), (10, 2), (10, 9), (10, 10), (11, 0), (12, 0),
(13, 6), (13, 9)],
(('Gxpi2', 0), ('Gyypi2', 0, 1), ('Gypi2', 0), ('Gxxpi2', 0, 1), ('Gxxpi2', 0, 1)): [
(0, 1), (0, 4), (0, 7), (0, 8), (1, 10), (2, 1), (2, 3),
(2, 7), (5, 5), (8, 0), (8, 2), (10, 2), (13, 6), (15, 0),
(15, 6)],
(('Gxpi2', 0), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 1), ('Gypi2', 0)): [
(0, 1), (0, 4), (0, 7), (0, 8), (1, 10), (2, 1), (2, 3),
(2, 7), (5, 5), (8, 0), (8, 2), (10, 2), (13, 6), (15, 0),
(15, 6)],
(('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 1), ('Gypi2', 0), ('Gxpi2', 1), ('Gxpi2', 1)): [
(0, 1), (0, 3), (0, 4), (0, 7), (0, 8), (1, 10), (2, 1),
(2, 3), (2, 7), (3, 5), (4, 2), (5, 5), (8, 0), (8, 2),
(10, 2), (10, 8), (11, 2), (13, 6), (14, 3), (14, 5),
(15, 0), (15, 7), (15, 9)],
(('Gxypi2', 0, 1), ('Gxypi2', 0, 1), ('Gxpi2', 1), ('Gypi2', 1), ('Gxxpi2', 0, 1), ('Gxpi2', 1)): [
(0, 2), (1, 5), (1, 7), (1, 8), (2, 10), (6, 1), (7, 7),
(10, 5), (11, 2), (11, 10), (14, 0), (14, 2), (14, 3),
(14, 9), (14, 10)],
(('Gypi2', 0), ('Gxpi2', 0), ('Gypi2', 1), ('Gxpi2', 0), ('Gxpi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gypi2', 1)): [
(0, 1), (0, 4), (0, 7), (0, 8), (1, 10), (2, 1), (2, 3),
(2, 7), (5, 5), (8, 0), (8, 2), (10, 2), (13, 6), (15, 0),
(15, 6)],
}
def _target_model(self, sslbls):
return self._build_explicit_target_model(
sslbls,
[(), ('Gxpi2', 1), ('Gypi2', 1), ('Gxpi2', 0), ('Gypi2', 0), ('Gxxpi2', 0, 1),
('Gyypi2', 0, 1), ('Gxypi2', 0, 1), ('Gyxpi2', 0, 1)],
['I({0}):I({1})', 'I({0}):X(pi/2,{1})', 'I({0}):Y(pi/2,{1})', 'X(pi/2,{0}):I({1})', 'Y(pi/2,{0}):I({1})',
'X(pi/2,{0}):X(pi/2,{1})', 'Y(pi/2,{0}):Y(pi/2,{1})',
'X(pi/2,{0}):Y(pi/2,{1})', 'Y(pi/2,{0}):X(pi/2,{1})'],
effectLabels=['00', '01', '10', '11'],
effectExpressions=['0', '1', '2', '3'])
import sys
sys.modules[__name__] = _Module()
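The `sys.modules[__name__] = _Module()` assignment above is a standard Python trick: the module's entry in `sys.modules` is replaced by a class instance, so the "module" can expose property-style, lazily computed attributes. A minimal self-contained sketch of the same mechanism — the module name `demo_lazy_module` and the `answer` attribute are invented for illustration, not taken from this codebase:

```python
import sys
import types


class _LazyModule(types.ModuleType):
    """A module object whose attributes can be computed on first access."""

    @property
    def answer(self):
        # Evaluated lazily, only when the attribute is read,
        # instead of eagerly at import time.
        return 42


# Register the instance under a made-up name; `import` consults
# sys.modules first, so subsequent imports resolve to this object.
sys.modules["demo_lazy_module"] = _LazyModule("demo_lazy_module")

import demo_lazy_module

print(demo_lazy_module.answer)  # -> 42
```

Because properties live on the class, this works on any Python 3 version; PEP 562 (`module.__getattr__`) offers a lighter alternative when only a fallback hook is needed.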
5d90115e22fe914534a0e75ef527792a7aedd37e | 255 | py | Python | _broken/caffe-segnet-stuff/backend/__init__.py | chengjianglong/clab | 504a111a5ffbaa119dc64b30c8f7cb14288923a8 | ["Apache-2.0"] | 1 | 2020-10-15T00:03:40.000Z | 2020-10-15T00:03:40.000Z
# -*- coding: utf-8 -*-
"""
python -c "import ubelt._internal as a; a.autogen_init('pysseg.backend')"
"""
# flake8: noqa
# from pysseg.backend import iface_caffe
# from pysseg.backend import find_segnet_caffe
# from pysseg.backend import batch_norm_stats
5dc816e52f0c1008e34a3b8250af76b6388b4428 | 13,256 | py | Python | tests/unit/beacons/test_wtmp_beacon.py | dwfreed/salt | ee11ae520f005e5be824a397982e888111606b11 | ["Apache-2.0"] | null | null | null
# coding: utf-8
# Python libs
from __future__ import absolute_import
import datetime
import logging
import sys
# Salt testing libs
from tests.support.unit import skipIf, TestCase
from tests.support.mock import NO_MOCK, NO_MOCK_REASON, patch, MagicMock, mock_open
from tests.support.mixins import LoaderModuleMockMixin
# Salt libs
import salt.beacons.wtmp as wtmp
# pylint: disable=import-error
try:
import dateutil.parser as dateutil_parser # pylint: disable=unused-import
_TIME_SUPPORTED = True
except ImportError:
_TIME_SUPPORTED = False
if sys.version_info >= (3,):
raw = bytes('\x07\x00\x00\x00H\x18\x00\x00pts/14\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00s/14gareth\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00::1\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x13I\xc5YZf\x05\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', 'utf-8')
pack = (7, 6216, b'pts/14\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', b's/14', b'gareth\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', b'::1\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', 0, 0, 0, 1506101523, 353882, 0, 0, 0, 16777216)
else:
raw = b'\x07\x00\x00\x00H\x18\x00\x00pts/14\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00s/14gareth\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00::1\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x13I\xc5YZf\x05\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
pack = (7, 6216, 'pts/14\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', 's/14', 'gareth\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', '::1\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', 0, 0, 0, 1506101523, 353882, 0, 0, 0, 16777216)
log = logging.getLogger(__name__)
@skipIf(NO_MOCK, NO_MOCK_REASON)
class WTMPBeaconTestCase(TestCase, LoaderModuleMockMixin):
'''
    Test case for salt.beacons.wtmp
'''
def setup_loader_modules(self):
return {
wtmp: {
'__context__': {'wtmp.loc': 2},
'__salt__': {},
}
}
def test_non_list_config(self):
config = {}
ret = wtmp.validate(config)
self.assertEqual(ret, (False, 'Configuration for wtmp beacon must'
' be a list.'))
def test_empty_config(self):
config = [{}]
ret = wtmp.validate(config)
self.assertEqual(ret, (True, 'Valid beacon configuration'))
def test_no_match(self):
config = [{'users': {'gareth': {'time_range': {'end': '09-22-2017 5pm',
'start': '09-22-2017 3pm'}}}}
]
ret = wtmp.validate(config)
self.assertEqual(ret, (True, 'Valid beacon configuration'))
with patch('salt.utils.files.fopen', mock_open()) as m_open:
ret = wtmp.beacon(config)
m_open.assert_called_with(wtmp.WTMP, 'rb')
self.assertEqual(ret, [])
def test_invalid_users(self):
config = [{'users': ['gareth']}]
ret = wtmp.validate(config)
self.assertEqual(ret, (False, 'User configuration for wtmp beacon must be a dictionary.'))
def test_invalid_groups(self):
config = [{'groups': ['docker']}]
ret = wtmp.validate(config)
self.assertEqual(ret, (False, 'Group configuration for wtmp beacon must be a dictionary.'))
def test_default_invalid_time_range(self):
config = [{'defaults': {'time_range': {'start': '3pm'}}}]
ret = wtmp.validate(config)
self.assertEqual(ret, (False, 'The time_range parameter for wtmp beacon must contain start & end options.'))
def test_users_invalid_time_range(self):
config = [{'users': {'gareth': {'time_range': {'start': '3pm'}}}}]
ret = wtmp.validate(config)
self.assertEqual(ret, (False, 'The time_range parameter for wtmp beacon must contain start & end options.'))
def test_groups_invalid_time_range(self):
config = [{'groups': {'docker': {'time_range': {'start': '3pm'}}}}]
ret = wtmp.validate(config)
self.assertEqual(ret, (False, 'The time_range parameter for wtmp beacon must contain start & end options.'))
def test_match(self):
with patch('salt.utils.files.fopen',
mock_open(read_data=raw)):
with patch('struct.unpack',
MagicMock(return_value=pack)):
config = [{'users': {'gareth': {}}}]
ret = wtmp.validate(config)
self.assertEqual(ret, (True, 'Valid beacon configuration'))
_expected = [{'PID': 6216,
'line': 'pts/14',
'session': 0,
'time': 0,
'exit_status': 0,
'inittab': 's/14',
'type': 7,
'addr': 1506101523,
'hostname': '::1',
'user': 'gareth'}]
ret = wtmp.beacon(config)
log.debug('{}'.format(ret))
self.assertEqual(ret, _expected)
@skipIf(not _TIME_SUPPORTED, 'dateutil.parser is missing.')
def test_match_time(self):
with patch('salt.utils.files.fopen',
mock_open(read_data=raw)):
mock_now = datetime.datetime(2017, 9, 22, 16, 0, 0, 0)
with patch('datetime.datetime', MagicMock()), \
patch('datetime.datetime.now',
MagicMock(return_value=mock_now)):
with patch('struct.unpack',
MagicMock(return_value=pack)):
config = [{'users': {'gareth': {'time': {'end': '09-22-2017 5pm',
'start': '09-22-2017 3pm'}}}}
]
ret = wtmp.validate(config)
self.assertEqual(ret, (True, 'Valid beacon configuration'))
_expected = [{'PID': 6216,
'line': 'pts/14',
'session': 0,
'time': 0,
'exit_status': 0,
'inittab': 's/14',
'type': 7,
'addr': 1506101523,
'hostname': '::1',
'user': 'gareth'}]
ret = wtmp.beacon(config)
self.assertEqual(ret, _expected)
def test_match_group(self):
for groupadd in ('salt.modules.aix_group',
'salt.modules.mac_group',
'salt.modules.pw_group',
'salt.modules.solaris_group',
'salt.modules.win_groupadd'):
mock_group_info = {'passwd': 'x',
'gid': 100,
'name': 'users',
'members': ['gareth']}
with patch('salt.utils.files.fopen',
mock_open(read_data=raw)):
with patch('time.time',
MagicMock(return_value=1506121200)):
with patch('struct.unpack',
MagicMock(return_value=pack)):
with patch('{0}.info'.format(groupadd),
new=MagicMock(return_value=mock_group_info)):
config = [{'group': {'users': {'time': {'end': '09-22-2017 5pm',
'start': '09-22-2017 3pm'}}}}
]
ret = wtmp.validate(config)
self.assertEqual(ret,
(True, 'Valid beacon configuration'))
_expected = [{'PID': 6216,
'line': 'pts/14',
'session': 0,
'time': 0,
'exit_status': 0,
'inittab': 's/14',
'type': 7,
'addr': 1506101523,
'hostname': '::1',
'user': 'gareth'}]
ret = wtmp.beacon(config)
self.assertEqual(ret, _expected)
5dfdc211c9a055722addc2ccffead83b3ad4835c | 4,677 | py | Python | tests/functional/api/test_moderation.py | pombredanne/h | 9c4c2dc0d53ed5bed5183936c24b4c27b23070b4 | ["BSD-2-Clause"] | null | null | null
import pytest
class TestPutHide:
def test_it_returns_http_204_for_group_creator(
self, app, group_annotation, user_with_token
):
_, token = user_with_token
headers = {"Authorization": str("Bearer {}".format(token.value))}
res = app.put(
"/api/annotations/{id}/hide".format(id=group_annotation.id), headers=headers
)
# The creator of a group has moderation rights over the annotations in that group
assert res.status_code == 204
def test_it_returns_http_404_if_annotation_is_in_world_group(
self, app, world_annotation, user_with_token
):
_, token = user_with_token
headers = {"Authorization": str("Bearer {}".format(token.value))}
res = app.put(
"/api/annotations/{id}/hide".format(id=world_annotation.id),
headers=headers,
expect_errors=True,
)
# The current user does not have moderation rights on the world group
assert res.status_code == 404
def test_it_returns_http_404_if_no_authn(self, app, group_annotation):
res = app.put(
"/api/annotations/{id}/hide".format(id=group_annotation.id),
expect_errors=True,
)
assert res.status_code == 404
def test_it_returns_http_404_if_annotation_is_private(
self, app, private_group_annotation, user_with_token
):
_, token = user_with_token
headers = {"Authorization": str("Bearer {}".format(token.value))}
res = app.put(
"/api/annotations/{id}/hide".format(id=private_group_annotation.id),
headers=headers,
expect_errors=True,
)
# private annotations cannot be moderated
assert res.status_code == 404
class TestDeleteHide:
def test_it_returns_http_204_for_group_creator(
self, app, group_annotation, user_with_token
):
_, token = user_with_token
headers = {"Authorization": str("Bearer {}".format(token.value))}
res = app.delete(
"/api/annotations/{id}/hide".format(id=group_annotation.id), headers=headers
)
# The creator of a group has moderation rights over the annotations in that group
assert res.status_code == 204
def test_it_returns_http_404_if_annotation_is_in_world_group(
self, app, world_annotation, user_with_token
):
_, token = user_with_token
headers = {"Authorization": str("Bearer {}".format(token.value))}
res = app.delete(
"/api/annotations/{id}/hide".format(id=world_annotation.id),
headers=headers,
expect_errors=True,
)
# The current user does not have moderation rights on the world group
assert res.status_code == 404
def test_it_returns_http_404_if_no_authn(self, app, group_annotation):
res = app.delete(
"/api/annotations/{id}/hide".format(id=group_annotation.id),
expect_errors=True,
)
assert res.status_code == 404
def test_it_returns_http_404_if_annotation_is_private(
self, app, private_group_annotation, user_with_token
):
_, token = user_with_token
        headers = {"Authorization": "Bearer {}".format(token.value)}
res = app.delete(
"/api/annotations/{id}/hide".format(id=private_group_annotation.id),
headers=headers,
expect_errors=True,
)
# private annotations cannot be moderated
assert res.status_code == 404
@pytest.fixture
def user(db_session, factories):
user = factories.User()
db_session.commit()
return user
@pytest.fixture
def group(user, db_session, factories):
group = factories.Group(creator=user)
db_session.commit()
return group
@pytest.fixture
def world_annotation(user, db_session, factories):
ann = factories.Annotation(userid=user.userid, groupid="__world__", shared=True)
db_session.commit()
return ann
@pytest.fixture
def group_annotation(group, db_session, factories):
ann = factories.Annotation(
userid="acct:someone@example.com", groupid=group.pubid, shared=True
)
db_session.commit()
return ann
@pytest.fixture
def private_group_annotation(group, db_session, factories):
ann = factories.Annotation(
userid="acct:someone@example.com", groupid=group.pubid, shared=False
)
db_session.commit()
return ann
@pytest.fixture
def user_with_token(user, db_session, factories):
token = factories.DeveloperToken(userid=user.userid)
db_session.add(token)
db_session.commit()
return (user, token)
| 30.174194 | 89 | 0.656831 | 572 | 4,677 | 5.106643 | 0.141608 | 0.071893 | 0.057857 | 0.043821 | 0.882917 | 0.855871 | 0.855871 | 0.840123 | 0.826429 | 0.826429 | 0 | 0.013536 | 0.241822 | 4,677 | 154 | 90 | 30.37013 | 0.810209 | 0.08018 | 0 | 0.741071 | 0 | 0 | 0.092433 | 0.059604 | 0 | 0 | 0 | 0 | 0.071429 | 1 | 0.125 | false | 0 | 0.008929 | 0 | 0.205357 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f8d92c7eb97496d203cf36bacc046af125a10fab | 172 | py | Python | pyretries/__init__.py | phantomii/retries-decorator | 62b9acd2c10eb6283a9e6879c6769a240edd8dd6 | [
"Apache-2.0"
] | null | null | null | pyretries/__init__.py | phantomii/retries-decorator | 62b9acd2c10eb6283a9e6879c6769a240edd8dd6 | [
"Apache-2.0"
] | 1 | 2019-02-06T14:00:52.000Z | 2019-02-06T14:00:52.000Z | pyretries/__init__.py | phantomii/retries-decorator | 62b9acd2c10eb6283a9e6879c6769a240edd8dd6 | [
"Apache-2.0"
] | null | null | null | from pyretries import decorators # noqa
from pyretries import defaults # noqa
from pyretries import network # noqa
__all__ = ["decorators", "defaults", "network"]
| 28.666667 | 47 | 0.732558 | 19 | 172 | 6.421053 | 0.421053 | 0.319672 | 0.467213 | 0.377049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186047 | 172 | 5 | 48 | 34.4 | 0.871429 | 0.081395 | 0 | 0 | 0 | 0 | 0.162338 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5d017021ac4125df9c137ba7d1121d969e94eda2 | 108 | py | Python | run_serpent_with_raven/calc_li7.py | jbae11/serpent-rom-raven | d19321f67fb4850390bd3857e79e1f32138cb490 | [
"BSD-3-Clause"
] | 1 | 2020-12-21T02:02:39.000Z | 2020-12-21T02:02:39.000Z | run_serpent_with_raven/calc_li7.py | jbae11/serpent-rom-raven | d19321f67fb4850390bd3857e79e1f32138cb490 | [
"BSD-3-Clause"
] | 1 | 2018-07-29T13:51:23.000Z | 2018-07-29T13:51:23.000Z | run_serpent_with_raven/calc_li7.py | jbae11/serpent-rom-raven | d19321f67fb4850390bd3857e79e1f32138cb490 | [
"BSD-3-Clause"
] | 2 | 2018-03-29T15:41:52.000Z | 2018-03-29T15:42:55.000Z | import mass_frac_calc
def evaluate(self):
return mass_frac_calc.return_value('li7', self.u233_mole_frac) | 36 | 66 | 0.814815 | 18 | 108 | 4.5 | 0.666667 | 0.197531 | 0.296296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040816 | 0.092593 | 108 | 3 | 66 | 36 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0.027523 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
5d02cfffeefe34d7bffad5ba628767aa248340ce | 19,061 | py | Python | mediaMicroservices/gen-py/media_service/MovieIdService.py | rodrigo-bruno/DeathStarBench | c9ce09aaf7c1298a7c88efacd1010a71db0fa59d | [
"Apache-2.0"
] | 364 | 2019-04-28T01:45:37.000Z | 2022-03-31T15:08:03.000Z | mediaMicroservices/gen-py/media_service/MovieIdService.py | rodrigo-bruno/DeathStarBench | c9ce09aaf7c1298a7c88efacd1010a71db0fa59d | [
"Apache-2.0"
] | 111 | 2019-04-15T11:08:49.000Z | 2022-03-31T17:39:16.000Z | mediaMicroservices/gen-py/media_service/MovieIdService.py | rodrigo-bruno/DeathStarBench | c9ce09aaf7c1298a7c88efacd1010a71db0fa59d | [
"Apache-2.0"
] | 229 | 2019-05-14T08:55:57.000Z | 2022-03-31T03:14:55.000Z | #
# Autogenerated by Thrift Compiler (0.12.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(object):
def UploadMovieId(self, req_id, title, rating, carrier):
"""
Parameters:
- req_id
- title
- rating
- carrier
"""
pass
def RegisterMovieId(self, req_id, title, movie_id, carrier):
"""
Parameters:
- req_id
- title
- movie_id
- carrier
"""
pass
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def UploadMovieId(self, req_id, title, rating, carrier):
"""
Parameters:
- req_id
- title
- rating
- carrier
"""
self.send_UploadMovieId(req_id, title, rating, carrier)
self.recv_UploadMovieId()
def send_UploadMovieId(self, req_id, title, rating, carrier):
self._oprot.writeMessageBegin('UploadMovieId', TMessageType.CALL, self._seqid)
args = UploadMovieId_args()
args.req_id = req_id
args.title = title
args.rating = rating
args.carrier = carrier
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_UploadMovieId(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = UploadMovieId_result()
result.read(iprot)
iprot.readMessageEnd()
if result.se is not None:
raise result.se
return
def RegisterMovieId(self, req_id, title, movie_id, carrier):
"""
Parameters:
- req_id
- title
- movie_id
- carrier
"""
self.send_RegisterMovieId(req_id, title, movie_id, carrier)
self.recv_RegisterMovieId()
def send_RegisterMovieId(self, req_id, title, movie_id, carrier):
self._oprot.writeMessageBegin('RegisterMovieId', TMessageType.CALL, self._seqid)
args = RegisterMovieId_args()
args.req_id = req_id
args.title = title
args.movie_id = movie_id
args.carrier = carrier
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_RegisterMovieId(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = RegisterMovieId_result()
result.read(iprot)
iprot.readMessageEnd()
if result.se is not None:
raise result.se
return
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["UploadMovieId"] = Processor.process_UploadMovieId
self._processMap["RegisterMovieId"] = Processor.process_RegisterMovieId
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_UploadMovieId(self, seqid, iprot, oprot):
args = UploadMovieId_args()
args.read(iprot)
iprot.readMessageEnd()
result = UploadMovieId_result()
try:
self._handler.UploadMovieId(args.req_id, args.title, args.rating, args.carrier)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except ServiceException as se:
msg_type = TMessageType.REPLY
result.se = se
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("UploadMovieId", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_RegisterMovieId(self, seqid, iprot, oprot):
args = RegisterMovieId_args()
args.read(iprot)
iprot.readMessageEnd()
result = RegisterMovieId_result()
try:
self._handler.RegisterMovieId(args.req_id, args.title, args.movie_id, args.carrier)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except ServiceException as se:
msg_type = TMessageType.REPLY
result.se = se
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("RegisterMovieId", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
# HELPER FUNCTIONS AND STRUCTURES
class UploadMovieId_args(object):
"""
Attributes:
- req_id
- title
- rating
- carrier
"""
def __init__(self, req_id=None, title=None, rating=None, carrier=None,):
self.req_id = req_id
self.title = title
self.rating = rating
self.carrier = carrier
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I64:
self.req_id = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.title = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.rating = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.MAP:
self.carrier = {}
(_ktype52, _vtype53, _size51) = iprot.readMapBegin()
for _i55 in range(_size51):
_key56 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
_val57 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.carrier[_key56] = _val57
iprot.readMapEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UploadMovieId_args')
if self.req_id is not None:
oprot.writeFieldBegin('req_id', TType.I64, 1)
oprot.writeI64(self.req_id)
oprot.writeFieldEnd()
if self.title is not None:
oprot.writeFieldBegin('title', TType.STRING, 2)
oprot.writeString(self.title.encode('utf-8') if sys.version_info[0] == 2 else self.title)
oprot.writeFieldEnd()
if self.rating is not None:
oprot.writeFieldBegin('rating', TType.I32, 3)
oprot.writeI32(self.rating)
oprot.writeFieldEnd()
if self.carrier is not None:
oprot.writeFieldBegin('carrier', TType.MAP, 4)
oprot.writeMapBegin(TType.STRING, TType.STRING, len(self.carrier))
for kiter58, viter59 in self.carrier.items():
oprot.writeString(kiter58.encode('utf-8') if sys.version_info[0] == 2 else kiter58)
oprot.writeString(viter59.encode('utf-8') if sys.version_info[0] == 2 else viter59)
oprot.writeMapEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UploadMovieId_args)
UploadMovieId_args.thrift_spec = (
None, # 0
(1, TType.I64, 'req_id', None, None, ), # 1
(2, TType.STRING, 'title', 'UTF8', None, ), # 2
(3, TType.I32, 'rating', None, None, ), # 3
(4, TType.MAP, 'carrier', (TType.STRING, 'UTF8', TType.STRING, 'UTF8', False), None, ), # 4
)
class UploadMovieId_result(object):
"""
Attributes:
- se
"""
def __init__(self, se=None,):
self.se = se
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.se = ServiceException()
self.se.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UploadMovieId_result')
if self.se is not None:
oprot.writeFieldBegin('se', TType.STRUCT, 1)
self.se.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UploadMovieId_result)
UploadMovieId_result.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'se', [ServiceException, None], None, ), # 1
)
class RegisterMovieId_args(object):
"""
Attributes:
- req_id
- title
- movie_id
- carrier
"""
def __init__(self, req_id=None, title=None, movie_id=None, carrier=None,):
self.req_id = req_id
self.title = title
self.movie_id = movie_id
self.carrier = carrier
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I64:
self.req_id = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.title = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.movie_id = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.MAP:
self.carrier = {}
(_ktype61, _vtype62, _size60) = iprot.readMapBegin()
for _i64 in range(_size60):
_key65 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
_val66 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.carrier[_key65] = _val66
iprot.readMapEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('RegisterMovieId_args')
if self.req_id is not None:
oprot.writeFieldBegin('req_id', TType.I64, 1)
oprot.writeI64(self.req_id)
oprot.writeFieldEnd()
if self.title is not None:
oprot.writeFieldBegin('title', TType.STRING, 2)
oprot.writeString(self.title.encode('utf-8') if sys.version_info[0] == 2 else self.title)
oprot.writeFieldEnd()
if self.movie_id is not None:
oprot.writeFieldBegin('movie_id', TType.STRING, 3)
oprot.writeString(self.movie_id.encode('utf-8') if sys.version_info[0] == 2 else self.movie_id)
oprot.writeFieldEnd()
if self.carrier is not None:
oprot.writeFieldBegin('carrier', TType.MAP, 4)
oprot.writeMapBegin(TType.STRING, TType.STRING, len(self.carrier))
for kiter67, viter68 in self.carrier.items():
oprot.writeString(kiter67.encode('utf-8') if sys.version_info[0] == 2 else kiter67)
oprot.writeString(viter68.encode('utf-8') if sys.version_info[0] == 2 else viter68)
oprot.writeMapEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(RegisterMovieId_args)
RegisterMovieId_args.thrift_spec = (
None, # 0
(1, TType.I64, 'req_id', None, None, ), # 1
(2, TType.STRING, 'title', 'UTF8', None, ), # 2
(3, TType.STRING, 'movie_id', 'UTF8', None, ), # 3
(4, TType.MAP, 'carrier', (TType.STRING, 'UTF8', TType.STRING, 'UTF8', False), None, ), # 4
)
class RegisterMovieId_result(object):
"""
Attributes:
- se
"""
def __init__(self, se=None,):
self.se = se
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.se = ServiceException()
self.se.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('RegisterMovieId_result')
if self.se is not None:
oprot.writeFieldBegin('se', TType.STRUCT, 1)
self.se.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(RegisterMovieId_result)
RegisterMovieId_result.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'se', [ServiceException, None], None, ), # 1
)
fix_spec(all_structs)
del all_structs
| 34.71949 | 134 | 0.580715 | 2,055 | 19,061 | 5.181509 | 0.093431 | 0.016905 | 0.024512 | 0.023666 | 0.797145 | 0.782964 | 0.759955 | 0.740702 | 0.736664 | 0.729902 | 0 | 0.014375 | 0.313887 | 19,061 | 548 | 135 | 34.782847 | 0.799816 | 0.028383 | 0 | 0.758294 | 1 | 0 | 0.032858 | 0.001211 | 0 | 0 | 0 | 0 | 0 | 1 | 0.097156 | false | 0.004739 | 0.018957 | 0.028436 | 0.199052 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
537a1810281449ee6acab0f5bf6338ea1f270088 | 116 | py | Python | simuvex/simuvex/engines/vex/statements/storeg.py | Ruide/angr-dev | 964dc80c758e25c698c2cbcc454ef5954c5fa0a0 | [
"BSD-2-Clause"
] | 86 | 2015-08-06T23:25:07.000Z | 2022-02-17T14:58:22.000Z | simuvex/simuvex/engines/vex/statements/storeg.py | Ruide/angr-dev | 964dc80c758e25c698c2cbcc454ef5954c5fa0a0 | [
"BSD-2-Clause"
] | 132 | 2015-09-10T19:06:59.000Z | 2018-10-04T20:36:45.000Z | simuvex/simuvex/engines/vex/statements/storeg.py | Ruide/angr-dev | 964dc80c758e25c698c2cbcc454ef5954c5fa0a0 | [
"BSD-2-Clause"
] | 80 | 2015-08-07T10:30:20.000Z | 2020-03-21T14:45:28.000Z | print '... Importing simuvex/engines/vex/statements/storeg.py ...'
from angr.engines.vex.statements.storeg import *
| 38.666667 | 66 | 0.767241 | 15 | 116 | 5.933333 | 0.733333 | 0.224719 | 0.449438 | 0.58427 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.077586 | 116 | 2 | 67 | 58 | 0.831776 | 0 | 0 | 0 | 0 | 0 | 0.5 | 0.344828 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 1 | null | null | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 8 |
53a095f59976134346f9927af694249aa1dcf661 | 22,148 | py | Python | sudo_rm_rf/dnn/models/two_step_tdcn.py | ishine/sudo_rm_rf | ec3fae1e2c9d85710f933a600f3ab93f92468dee | [
"MIT"
] | 134 | 2020-07-14T05:57:06.000Z | 2022-03-29T07:17:00.000Z | sudo_rm_rf/dnn/models/two_step_tdcn.py | ishine/sudo_rm_rf | ec3fae1e2c9d85710f933a600f3ab93f92468dee | [
"MIT"
] | 12 | 2020-08-04T02:32:01.000Z | 2022-03-27T17:04:32.000Z | sudo_rm_rf/dnn/models/two_step_tdcn.py | ishine/sudo_rm_rf | ec3fae1e2c9d85710f933a600f3ab93f92468dee | [
"MIT"
] | 21 | 2020-07-15T03:46:27.000Z | 2022-03-28T05:51:56.000Z | """!
@brief Two step TDCN model from:
https://github.com/etzinis/two_step_mask_learning/tree/master/two_step_mask_learning/dnn/models
@author Efthymios Tzinis {etzinis2@illinois.edu}
@copyright University of Illinois at Urbana-Champaign
"""
import torch
import torch.nn as nn
import os
import glob2
import datetime
class TDCN(nn.Module):
# Simplified TCN layer
class TCN(nn.Module):
def __init__(self, B, H, P, D):
super(TDCN.TCN, self).__init__()
self.m = nn.ModuleList([
nn.Conv1d(in_channels=B, out_channels=H, kernel_size=1),
nn.PReLU(),
GlobalLayerNorm(H),
# nn.BatchNorm1d(H),
nn.Conv1d(in_channels=H, out_channels=H, kernel_size=P,
padding=(D * (P - 1)) // 2, dilation=D, groups=H),
nn.PReLU(),
GlobalLayerNorm(H),
# nn.BatchNorm1d(H),
nn.Conv1d(in_channels=H, out_channels=B, kernel_size=1),
])
def forward(self, x):
y = x.clone()
for l in self.m:
y = l(y)
return x + y
# Set things up
def __init__(self, N, L, B, H, P, X, R, S=1):
super(TDCN, self).__init__()
# Number of sources to produce
self.S, self.N, self.L, self.B, self.H, self.P = S, N, L, B, H, P
self.X, self.R = X, R
# Front end
self.fe = nn.ModuleList([
nn.Conv1d(in_channels=1, out_channels=N,
kernel_size=L, stride=L // 2, padding=L // 2),
nn.ReLU(),
])
# Norm before the rest, and apply one more dense layer
self.ln = GlobalLayerNorm(N)
# self.ln = nn.BatchNorm1d(N)
self.l1 = nn.Conv1d(in_channels=N, out_channels=B, kernel_size=1)
# Separation module
self.sm = nn.ModuleList([
TDCN.TCN(B=B, H=H, P=P, D=2 ** d)
for _ in range(R) for d in range(X)])
if B != N:
# self.ln_bef_out_reshape = GlobalLayerNorm(B)
self.reshape_before_masks = nn.Conv1d(in_channels=B,
out_channels=N,
kernel_size=1)
# self.ln_bef_masks = nn.GlobalLayerNorm(S * N)
# Masks layer
self.m = nn.Conv2d(in_channels=1,
out_channels=S,
kernel_size=(N + 1, 1),
padding=(N - N // 2, 0))
# Back end
self.be = nn.ConvTranspose1d(in_channels=N * S, out_channels=S,
output_padding=(L // 2) - 1, kernel_size=L,
stride=L // 2, padding=L // 2,
groups=S)
# self.ln_mask_in = nn.BatchNorm1d(self.N)
self.ln_mask_in = GlobalLayerNorm(self.N)
# Forward pass
def forward(self, x):
# Front end
for l in self.fe:
x = l(x)
# Split paths
s = x.clone()
# Separation module
x = self.ln(x)
x = self.l1(x)
for l in self.sm:
x = l(x)
if self.B != self.N:
# x = self.ln_bef_out_reshape(x)
x = self.reshape_before_masks(x)
x = self.ln_mask_in(x)
# Get masks and apply them
x = self.m(x.unsqueeze(1))
x = nn.functional.relu(x)
if self.S == 1:
x = torch.sigmoid(x)
else:
x = nn.functional.softmax(x, dim=1)
x = x * s.unsqueeze(1)
del s
# Back end
return self.be(x.view(x.shape[0], -1, x.shape[-1]))
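The front-end `Conv1d` above uses `kernel_size=L`, `stride=L // 2`, `padding=L // 2`, so the number of encoded frames follows the standard convolution output-length formula. A minimal standalone sketch (the `T` and `L` values below are illustrative, not taken from any config in this file):

```python
# Conv1d output length: floor((T + 2*pad - kernel) / stride) + 1,
# with kernel=L, stride=L // 2, pad=L // 2 as in the front end above.
def frontend_out_len(T, L):
    return (T + 2 * (L // 2) - L) // (L // 2) + 1
```

For even `L` the padding exactly cancels the kernel, giving `T // (L // 2) + 1` frames; the `ConvTranspose1d` back end with the same `L`, stride, padding, and `output_padding=(L // 2) - 1` inverts this geometry.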
@classmethod
def save(cls, model, path, optimizer, epoch,
tr_loss=None, cv_loss=None):
package = cls.serialize(model, optimizer, epoch,
tr_loss=tr_loss, cv_loss=cv_loss)
torch.save(package, path)
@classmethod
def load(cls, path):
package = torch.load(path, map_location=lambda storage, loc: storage)
model = cls.load_model_from_package(package)
return model
@classmethod
def load_model_from_package(cls, package):
model = cls(N=package['N'],
L=package['L'],
B=package['B'],
H=package['H'],
P=package['P'],
X=package['X'],
R=package['R'],
S=package['S'])
model.load_state_dict(package['state_dict'])
return model
@classmethod
def load_best_model(cls, models_dir, freq_res, sample_res):
dir_id = 'tasnet_L_{}_N_{}'.format(sample_res, freq_res)
dir_path = os.path.join(models_dir, dir_id)
best_path = glob2.glob(dir_path + '/best_*')[0]
return cls.load(best_path)
@staticmethod
def serialize(model, optimizer, epoch, tr_loss=None, cv_loss=None):
package = {
'N': model.N,
'L': model.L,
'B': model.B,
'H': model.H,
'P': model.P,
'X': model.X,
'R': model.R,
'S': model.S,
'state_dict': model.state_dict(),
'optim_dict': optimizer.state_dict(),
'epoch': epoch,
}
if tr_loss is not None:
package['tr_loss'] = tr_loss
package['cv_loss'] = cv_loss
return package
@classmethod
def encode_model_identifier(cls,
metric_name,
metric_value):
ts = datetime.datetime.now().strftime("%Y-%m-%d-%H:%M:%s")
file_identifiers = [metric_name, str(metric_value)]
model_identifier = "_".join(file_identifiers + [ts])
return model_identifier
@classmethod
def decode_model_identifier(cls,
model_identifier):
identifiers = model_identifier.split("_")
ts = identifiers[-1].split('.pt')[0]
[metric_name, metric_value] = identifiers[:-1]
return metric_name, float(metric_value), ts
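The encode/decode pair above round-trips a metric name, a metric value, and a timestamp through a single underscore-joined filename stem. A plain-Python sketch of that scheme (the metric name, value, and timestamp here are made up for illustration):

```python
# Mirror of encode_model_identifier / decode_model_identifier.
ident = "_".join(["SISDR", str(9.87), "2020-01-01-12:00:1577880000"]) + ".pt"

parts = ident.split("_")
ts = parts[-1].split(".pt")[0]
metric_name, metric_value = parts[0], float(parts[1])
```

Note that because decoding splits on `"_"`, a metric name that itself contains an underscore would break the `[metric_name, metric_value] = identifiers[:-1]` unpacking above.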
@classmethod
def encode_dir_name(cls, model):
model_dir_name = 'tasnet_L_{}_N_{}'.format(model.L, model.N)
return model_dir_name
@classmethod
def get_best_checkpoint_path(cls, model_dir_path):
best_paths = glob2.glob(model_dir_path + '/best_*')
if best_paths:
return best_paths[0]
else:
return None
@classmethod
def get_current_checkpoint_path(cls, model_dir_path):
current_paths = glob2.glob(model_dir_path + '/current_*')
if current_paths:
return current_paths[0]
else:
return None
@classmethod
def save_if_best(cls, save_dir, model, optimizer, epoch,
tr_loss, cv_loss, cv_loss_name):
model_dir_path = os.path.join(save_dir, cls.encode_dir_name(model))
if not os.path.exists(model_dir_path):
print("Creating non-existing model states directory... {}"
"".format(model_dir_path))
os.makedirs(model_dir_path)
current_path = cls.get_current_checkpoint_path(model_dir_path)
models_to_remove = []
if current_path is not None:
models_to_remove = [current_path]
best_path = cls.get_best_checkpoint_path(model_dir_path)
file_id = cls.encode_model_identifier(cv_loss_name, cv_loss)
if best_path is not None:
best_fileid = os.path.basename(best_path)
_, best_metric_value, _ = cls.decode_model_identifier(
best_fileid.split('best_')[-1])
else:
best_metric_value = -99999999
if float(cv_loss) > float(best_metric_value):
if best_path is not None:
models_to_remove.append(best_path)
save_path = os.path.join(model_dir_path, 'best_' + file_id + '.pt')
cls.save(model, save_path, optimizer, epoch,
tr_loss=tr_loss, cv_loss=cv_loss)
save_path = os.path.join(model_dir_path, 'current_' + file_id + '.pt')
cls.save(model, save_path, optimizer, epoch,
tr_loss=tr_loss, cv_loss=cv_loss)
try:
for model_path in models_to_remove:
os.remove(model_path)
        except OSError:
            print("Warning: Error in removing {} ...".format(model_path))
class GlobalLayerNorm(nn.Module):
"""Global Layer Normalization (gLN)"""
def __init__(self, channel_size):
super(GlobalLayerNorm, self).__init__()
self.gamma = nn.Parameter(torch.empty((1, channel_size, 1)))
self.beta = nn.Parameter(torch.empty((1, channel_size, 1)))
self.reset_parameters()
def reset_parameters(self):
self.gamma.data.fill_(1)
self.beta.data.zero_()
def forward(self, y):
"""
Args:
y: [M, N, K], M is batch size, N is channel size, K is length
Returns:
gLN_y: [M, N, K]
"""
# TODO: in torch 1.0, torch.mean() support dim list
mean = y.mean(dim=1, keepdim=True).mean(dim=2,
keepdim=True) # [M, 1, 1]
var = (torch.pow(y - mean, 2)).mean(dim=1,
keepdim=True).mean(dim=2,
keepdim=True)
gLN_y = (self.gamma * (y - mean) /
torch.pow(var + 10e-8, 0.5) + self.beta)
return gLN_y
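A pure-Python sketch of the gLN statistics for one batch item (the shapes and values are illustrative): the mean and variance are pooled over both the channel and time axes, so the whole item is normalized by a single pair of statistics.

```python
# gLN on a single [N=2, K=3] item: one mean/variance for the whole item.
y = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]

flat = [v for row in y for v in row]
mean = sum(flat) / len(flat)
var = sum((v - mean) ** 2 for v in flat) / len(flat)
eps = 10e-8  # same constant as in forward() above
gln = [[(v - mean) / (var + eps) ** 0.5 for v in row] for row in y]
```

With `gamma=1` and `beta=0` (the `reset_parameters` defaults), the output has approximately zero mean and unit variance across the whole item.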
class CepstralNorm(nn.Module):
    """Cepstral Layer Normalization (cLN): per-channel statistics over time"""
def __init__(self, channel_size):
super(CepstralNorm, self).__init__()
self.gamma = nn.Parameter(torch.empty((1, channel_size, 1)))
self.beta = nn.Parameter(torch.empty((1, channel_size, 1)))
self.reset_parameters()
def reset_parameters(self):
self.gamma.data.fill_(1)
self.beta.data.zero_()
def forward(self, y):
"""
Args:
y: [M, N, K], M is batch size, N is channel size, K is length
Returns:
gLN_y: [M, N, K]
"""
mean = y.mean(dim=2, keepdim=True)
var = ((y - mean)**2).mean(dim=2, keepdim=True)
gLN_y = (self.gamma * (y - mean) /
torch.pow(var + 10e-8, 0.5) + self.beta)
return gLN_y
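By contrast, `CepstralNorm` pools statistics over the time axis only, per channel, so two channels that differ only by scale and offset normalize to the same values. A plain-Python sketch (illustrative values):

```python
# cLN on a single [N=2, K=3] item: per-channel mean/variance over time.
y = [[1.0, 2.0, 3.0],
     [10.0, 20.0, 30.0]]
eps = 10e-8

cln = []
for row in y:
    m = sum(row) / len(row)
    v = sum((x - m) ** 2 for x in row) / len(row)
    cln.append([(x - m) / (v + eps) ** 0.5 for x in row])
```

Under gLN the second channel above would dominate the shared statistics; under cLN both channels come out identical.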
class ResidualTN(nn.Module):
# Simplified TCN layer
class TCN(nn.Module):
def __init__(self, B, H, P, D):
super(ResidualTN.TCN, self).__init__()
self.m = nn.ModuleList([
nn.Conv1d(in_channels=B, out_channels=H, kernel_size=1),
nn.PReLU(),
# GlobalLayerNorm(H),
CepstralNorm(H),
nn.Conv1d(in_channels=H, out_channels=H, kernel_size=P,
padding=(D * (P - 1)) // 2, dilation=D, groups=H),
nn.PReLU(),
# GlobalLayerNorm(H),
CepstralNorm(H),
nn.Conv1d(in_channels=H, out_channels=B, kernel_size=1),
])
def forward(self, x):
y = x.clone()
for l in self.m:
y = l(y)
return x + y
    # Set things up
    def __init__(self, N, L, B, H, P, X, R, S=1):
        super(ResidualTN, self).__init__()

        # Number of sources to produce
        self.S, self.N, self.L, self.B, self.H, self.P = S, N, L, B, H, P
        self.X, self.R = X, R

        # Front end
        self.fe = nn.ModuleList([
            nn.Conv1d(in_channels=1, out_channels=N,
                      kernel_size=L, stride=L // 2, padding=L // 2),
            nn.ReLU(),
        ])
        self.ln = nn.BatchNorm1d(N)
        self.l1 = nn.Conv1d(in_channels=N, out_channels=B, kernel_size=1)

        # Separation module
        # Residual connections: each entry lists the layers (-1 denotes the
        # separation input) whose outputs feed into that layer.
        self.residual_to_from = [[] for _ in range(R * X)]
        self.residual_to_from[8] = [-1]
        self.residual_to_from[16] = [-1, 8]
        self.residual_to_from[24] = [-1, 8, 16]
        self.residual_to_from[11] = [3]
        self.residual_to_from[19] = [3, 11]
        self.residual_to_from[27] = [3, 11, 19]

        # Map each layer that receives residuals to its 1x1 mixing conv
        self.layer_to_dense = {}
        j = 0
        for i, res_connections in enumerate(self.residual_to_from):
            if len(res_connections):
                self.layer_to_dense[i] = j
                j += 1
        self.residual_denses = nn.ModuleList([
            nn.Conv1d(in_channels=len(res_connections) * B,
                      out_channels=B, kernel_size=1)
            for res_connections in self.residual_to_from
            if len(res_connections) > 0
        ])

        # One normalization per distinct residual source
        self.prev_connections = {}
        self.residual_norms = []
        k = 0
        for res_from in self.residual_to_from:
            for res_ind in res_from:
                if res_ind not in self.prev_connections:
                    self.prev_connections[res_ind] = k
                    k += 1
                    self.residual_norms.append(CepstralNorm(B))
        self.residual_norms = nn.ModuleList(self.residual_norms)

        self.sm = nn.ModuleList(
            [ResidualTN.TCN(B=B, H=H, P=P, D=2 ** d)
             for _ in range(R) for d in range(X)])

        if B != N:
            self.reshape_before_masks = nn.Conv1d(in_channels=B,
                                                  out_channels=N,
                                                  kernel_size=1)

        # Masks layer
        self.m = nn.Conv2d(in_channels=1,
                           out_channels=S,
                           kernel_size=(N + 1, 1),
                           padding=(N - N // 2, 0))

        # Back end
        self.be = nn.ConvTranspose1d(in_channels=N * S, out_channels=S,
                                     output_padding=(L // 2) - 1, kernel_size=L,
                                     stride=L // 2, padding=L // 2,
                                     groups=S)
        self.ln_mask_in = nn.BatchNorm1d(self.N)
    # Forward pass
    def forward(self, x):
        # Front end
        for l in self.fe:
            x = l(x)

        # Split paths
        encoded_mixture = x.clone()

        # Separation module
        x = self.ln(x)
        x = self.l1(x)
        separation_input = x.clone()

        layer_outputs = []
        for l, tcn in enumerate(self.sm):
            # Gather residuals feeding this layer
            residual_outputs = []
            for k, res_ind in enumerate(self.residual_to_from[l]):
                if res_ind == -1:
                    residual_outputs.append(self.residual_norms[
                        self.prev_connections[res_ind]](
                        separation_input))
                else:
                    residual_outputs.append(self.residual_norms[
                        self.prev_connections[res_ind]](
                        layer_outputs[res_ind]))
            if residual_outputs:
                if len(residual_outputs) == 1:
                    residuals = residual_outputs[0]
                else:
                    # Before concatenation normalize everything
                    residuals = torch.cat(residual_outputs, dim=1)
                x = tcn(x + self.residual_denses[
                    self.layer_to_dense[l]](residuals))
            else:
                x = tcn(x)
            if l in [8, 16, 24, 3, 11, 19]:
                layer_outputs.append(x.clone())
            else:
                layer_outputs.append(None)

        if self.B != self.N:
            # x = self.ln_bef_out_reshape(x)
            x = self.reshape_before_masks(x)
        x = self.ln_mask_in(x)

        # Get masks and apply them
        x = self.m(x.unsqueeze(1))
        x = nn.functional.relu(x)
        if self.S == 1:
            x = torch.sigmoid(x)
        else:
            x = nn.functional.softmax(x, dim=1)
        x = x * encoded_mixture.unsqueeze(1)
        del encoded_mixture

        # Back end
        return self.be(x.view(x.shape[0], -1, x.shape[-1]))
    @classmethod
    def save(cls, model, path, optimizer, epoch,
             tr_loss=None, cv_loss=None):
        package = cls.serialize(model, optimizer, epoch,
                                tr_loss=tr_loss, cv_loss=cv_loss)
        torch.save(package, path)

    @classmethod
    def load(cls, path):
        package = torch.load(path, map_location=lambda storage, loc: storage)
        model = cls.load_model_from_package(package)
        return model

    @classmethod
    def load_model_from_package(cls, package):
        model = cls(N=package['N'],
                    L=package['L'],
                    B=package['B'],
                    H=package['H'],
                    P=package['P'],
                    X=package['X'],
                    R=package['R'],
                    S=package['S'])
        model.load_state_dict(package['state_dict'])
        return model

    @classmethod
    def load_best_model(cls, models_dir, freq_res, sample_res):
        dir_id = 'residualTN_new_L_{}_N_{}'.format(sample_res, freq_res)
        dir_path = os.path.join(models_dir, dir_id)
        best_path = glob2.glob(dir_path + '/best_*')[0]
        return cls.load(best_path)

    @staticmethod
    def serialize(model, optimizer, epoch, tr_loss=None, cv_loss=None):
        package = {
            'N': model.N,
            'L': model.L,
            'B': model.B,
            'H': model.H,
            'P': model.P,
            'X': model.X,
            'R': model.R,
            'S': model.S,
            'state_dict': model.state_dict(),
            'optim_dict': optimizer.state_dict(),
            'epoch': epoch,
        }
        if tr_loss is not None:
            package['tr_loss'] = tr_loss
            package['cv_loss'] = cv_loss
        return package

    @classmethod
    def encode_model_identifier(cls,
                                metric_name,
                                metric_value):
        ts = datetime.datetime.now().strftime("%Y-%m-%d-%H:%M:%S")
        file_identifiers = [metric_name, str(metric_value)]
        model_identifier = "_".join(file_identifiers + [ts])
        return model_identifier

    @classmethod
    def decode_model_identifier(cls,
                                model_identifier):
        identifiers = model_identifier.split("_")
        ts = identifiers[-1].split('.pt')[0]
        [metric_name, metric_value] = identifiers[:-1]
        return metric_name, float(metric_value), ts

    @classmethod
    def encode_dir_name(cls, model):
        model_dir_name = 'residualTN_new_L_{}_N_{}'.format(
            model.L, model.N)
        return model_dir_name

    @classmethod
    def get_best_checkpoint_path(cls, model_dir_path):
        best_paths = glob2.glob(model_dir_path + '/best_*')
        if best_paths:
            return best_paths[0]
        else:
            return None

    @classmethod
    def get_current_checkpoint_path(cls, model_dir_path):
        current_paths = glob2.glob(model_dir_path + '/current_*')
        if current_paths:
            return current_paths[0]
        else:
            return None

    @classmethod
    def save_if_best(cls, save_dir, model, optimizer, epoch,
                     tr_loss, cv_loss, cv_loss_name):
        model_dir_path = os.path.join(save_dir, cls.encode_dir_name(model))
        if not os.path.exists(model_dir_path):
            print("Creating non-existing model states directory... {}"
                  "".format(model_dir_path))
            os.makedirs(model_dir_path)

        current_path = cls.get_current_checkpoint_path(model_dir_path)
        models_to_remove = []
        if current_path is not None:
            models_to_remove = [current_path]
        best_path = cls.get_best_checkpoint_path(model_dir_path)
        file_id = cls.encode_model_identifier(cv_loss_name, cv_loss)

        if best_path is not None:
            best_fileid = os.path.basename(best_path)
            _, best_metric_value, _ = cls.decode_model_identifier(
                best_fileid.split('best_')[-1])
        else:
            best_metric_value = -99999999

        if float(cv_loss) > float(best_metric_value):
            if best_path is not None:
                models_to_remove.append(best_path)
            save_path = os.path.join(model_dir_path, 'best_' + file_id + '.pt')
            cls.save(model, save_path, optimizer, epoch,
                     tr_loss=tr_loss, cv_loss=cv_loss)
        save_path = os.path.join(model_dir_path, 'current_' + file_id + '.pt')
        cls.save(model, save_path, optimizer, epoch,
                 tr_loss=tr_loss, cv_loss=cv_loss)
        try:
            for model_path in models_to_remove:
                os.remove(model_path)
        except OSError:
            print("Warning: Error in removing {} ...".format(model_path))
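
# The index bookkeeping in ResidualTN.__init__ can be traced without torch.
# A minimal standalone sketch (assumption: the hard-coded residual wiring
# above with R=4, X=8): every layer that receives residuals gets a 1x1
# mixing-conv slot, and every distinct source (-1 = separation input) gets
# a normalization slot, both assigned in first-seen order.

```python
residual_to_from = [[] for _ in range(4 * 8)]  # R * X = 32 TCN layers
residual_to_from[8] = [-1]
residual_to_from[16] = [-1, 8]
residual_to_from[24] = [-1, 8, 16]
residual_to_from[11] = [3]
residual_to_from[19] = [3, 11]
residual_to_from[27] = [3, 11, 19]

# Layers with incoming residuals, in index order, get dense-mixer slots
layer_to_dense = {}
for i, srcs in enumerate(residual_to_from):
    if srcs:
        layer_to_dense[i] = len(layer_to_dense)

# Distinct residual sources, in first-seen order, get normalization slots
prev_connections = {}
for srcs in residual_to_from:
    for s in srcs:
        if s not in prev_connections:
            prev_connections[s] = len(prev_connections)

print(layer_to_dense)    # six receiving layers: 8, 11, 16, 19, 24, 27
print(prev_connections)  # six distinct sources: -1, 3, 8, 11, 16, 19
```

# With this mapping, residual_denses[layer_to_dense[l]] mixes the (possibly
# concatenated) residuals of layer l, and residual_norms[prev_connections[s]]
# normalizes the output of source s, exactly as forward() indexes them.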
if __name__ == "__main__":
    import torch
    import os, sys
    model = TDCN(
        B=256,
        H=512,
        P=3,
        R=4,
        X=8,
        L=21,
        N=256,
        S=2)
    print('Testing Forward pass')
    if sys.argv[1] == 'cuda':
        os.environ['CUDA_VISIBLE_DEVICES'] = sys.argv[2]
        model = model.cuda()
        dummy_input = torch.rand(1, 1, 32000).cuda()
    elif sys.argv[1] == 'cpu':
        dummy_input = torch.rand(1, 1, 32000)

    # import pdb; pdb.set_trace()
    import time
    now = time.time()
    pred_sources = model.forward(dummy_input)
    print(pred_sources.size())
    print('Elapsed: {}'.format(time.time() - now))

    try:
        from thop import profile
        macs, params = profile(model, inputs=(dummy_input,))
        print('MACS and params')
        print(round(macs / 10 ** 6, 2), round(params / 10 ** 6, 2))

        # Note: this import shadows thop's profile imported above
        from pytorch_memlab import profile

        @profile
        def work():
            pred_sources = model.forward(dummy_input)

        work()
    except ImportError:
        print('Could not find the profiler')

    numparams = 0
    for f in model.parameters():
        if f.requires_grad:
            numparams += f.numel()
    print('Trainable Parameters: {}'.format(numparams))
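
# The checkpoint file naming used by encode_model_identifier /
# decode_model_identifier round-trips "<metric>_<value>_<timestamp>".
# Standalone sketch (assumptions: encode_identifier / decode_identifier are
# hypothetical helper names, and the metric name contains no underscores,
# which the '_'-split relies on):

```python
import datetime


def encode_identifier(metric_name, metric_value):
    # e.g. "sisdr_15.5_2021-06-23-14:20:50"
    ts = datetime.datetime.now().strftime("%Y-%m-%d-%H:%M:%S")
    return "_".join([metric_name, str(metric_value), ts])


def decode_identifier(model_identifier):
    # strip a trailing ".pt" from the timestamp field, as saved paths carry it
    metric_name, metric_value, ts = model_identifier.split("_")
    return metric_name, float(metric_value), ts.split('.pt')[0]


name, value, ts = decode_identifier(encode_identifier("sisdr", 15.5))
assert (name, value) == ("sisdr", 15.5)
```

# save_if_best compares the decoded metric value of the stored "best_*" file
# against the new cv_loss, which is why the value must survive a float()
# round trip through this string format.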
# tests/unit/utils/scenario_validation_test.py (repo: rkm/bluebird, license: MIT)
"""
Tests for the scenario validation
"""
from bluebird.utils.scenario_validation import validate_json_scenario
from tests.data import TEST_SCENARIO
def test_scenario_validation():
    assert not validate_json_scenario(TEST_SCENARIO)
# tests/test_data.py (repo: albernsrya/ukis-pysat, license: Apache-2.0)
import os
import unittest
from pathlib import Path
from tempfile import gettempdir
import pystac
import requests_mock
from shapely.geometry import Polygon
from ukis_pysat.data import Source
from ukis_pysat.members import Datahub, Platform
os.environ["EARTHEXPLORER_USER"] = "Tim"
os.environ["EARTHEXPLORER_PW"] = "TheEnchanter"
os.environ["SCIHUB_USER"] = "Tim"
os.environ["SCIHUB_PW"] = "TheEnchanter"
catalog_path = Path(__file__).parents[0] / "testfiles" / "catalog.json"
target_dir = Path(__file__).parents[0] / "testfiles"
aoi_4326 = target_dir / "aoi_4326.geojson"
aoi_3857 = target_dir / "aoi_3857.geojson"
aoi_bbox = (11.90, 51.46, 11.94, 51.50)
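
# aoi_bbox above follows the (minx, miny, maxx, maxy) convention. A
# dependency-free sketch of turning such a bbox into a closed exterior
# ring (illustrative only; bbox_to_ring is a hypothetical helper, not how
# Source._prep_aoi is actually implemented):

```python
def bbox_to_ring(bbox):
    """(minx, miny, maxx, maxy) -> closed counter-clockwise ring."""
    minx, miny, maxx, maxy = bbox
    return (
        (minx, miny), (maxx, miny), (maxx, maxy), (minx, maxy), (minx, miny)
    )


ring = bbox_to_ring((11.90, 51.46, 11.94, 51.50))
assert ring[0] == ring[-1]  # ring is closed
```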
class DataTest(unittest.TestCase):
    def test_init_stac_catalog(self):
        with Source(datahub=Datahub.STAC_local, catalog=catalog_path) as src:
            self.assertTrue(isinstance(src.api, pystac.catalog.Catalog))

    def test_init_stac_url(self):
        with Source(datahub=Datahub.STAC_API, url=r"https://earth-search.aws.element84.com/v0/") as src:
            self.assertEqual(src.api.url, r"https://earth-search.aws.element84.com/v0/")

    def test_init_exception_other_hub(self):
        with self.assertRaises(
            NotImplementedError, msg=f"Hub is not supported [STAC_local, STAC_API, EarthExplorer, " f"Scihub]."
        ):
            Source(datahub="Hub")

    def test_init_exception_other_enum(self):
        with self.assertRaises(AttributeError):
            Source(datahub=Datahub.Hub)

    def test_exception_false_aoi(self):
        with Source(datahub=Datahub.STAC_local, catalog=catalog_path) as src:
            with self.assertRaises(TypeError, msg=f"aoi must be of type string or tuple"):
                src._prep_aoi(1)

    def test_aoi_geointerface(self):
        geom = Source._prep_aoi(
            {
                "type": "Polygon",
                "coordinates": (
                    ((0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, -1.0), (0.0, 0.0)),
                    ((0.1, 0.1), (0.1, 0.2), (0.2, 0.2), (0.2, 0.1), (0.1, 0.1)),
                ),
            }
        )
        self.assertIsInstance(geom, Polygon)
        self.assertEqual(tuple(geom.exterior.coords), ((0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, -1.0), (0.0, 0.0)))
        self.assertEqual(len(geom.interiors), 1)

    @unittest.skip("Skip until we find a better test or this also runs with Github Actions")
    def test_query_metadata_stac_local(self):
        with Source(datahub=Datahub.STAC_local, catalog=catalog_path) as src:
            meta = src.query_metadata(
                platform=Platform.Sentinel2,
                date=("20200220", "20200222"),
                aoi=aoi_3857,
                cloud_cover=(90, 100),
            )
            cat = src._init_catalog()
            for item in meta:
                cat.add_item(item)
            item = cat.get_item("S2A_MSIL1C_20200221T102041_N0209_R065_T32UPC_20200221T110731")
            self.assertEqual(item.properties.get("srcuuid"), "ae674e64-013d-4898-a6d7-096d7b02bdde")
            cat.normalize_hrefs(Path(gettempdir()).as_posix())
            cat.validate_all()

    @requests_mock.Mocker(real_http=True)
    def test_query_metadata_scihub(self, m):
        m.get(
"https://apihub.copernicus.eu/apihub/search?format=json&rows=100&start=0&q=beginPosition%3A%5B%222020-02-24T00%3A00%3A00Z%22+TO+%222020-02-25T00%3A00%3A00Z%22%5D+platformname%3A%22Sentinel-1%22+footprint%3A%22Intersects%28POLYGON+%28%2811.90274575621129+51.46641523383226%2C+11.90274575621129+51.50095226908388%2C+11.94774352810161+51.50095226908388%2C+11.94774352810161+51.46641523383226%2C+11.90274575621129+51.46641523383226%29%29%29%22",
content=b'{"feed":{"xmlns:opensearch":"http://a9.com/-/spec/opensearch/1.1/","xmlns":"http://www.w3.org/2005/Atom","title":"Sentinels Scientific Data Hub search results for: beginPosition:[\\"2020-02-24T00:00:00Z\\" TO \\"2020-02-25T00:00:00Z\\"] platformname:\\"Sentinel-1\\" footprint:\\"Intersects(POLYGON ((11.90274575621129 51.46641523383226, 11.90274575621129 51.50095226908388, 11.94774352810161 51.50095226908388, 11.94774352810161 51.46641523383226, 11.90274575621129 51.46641523383226)))\\"","subtitle":"Displaying 3 results. Request done in 0.003 seconds.","updated":"2021-06-23T14:20:50.811Z","author":{"name":"Sentinels Scientific Data Hub"},"id":"https://apihub.copernicus.eu/apihub/search?q=beginPosition:[\\"2020-02-24T00:00:00Z\\" TO \\"2020-02-25T00:00:00Z\\"] platformname:\\"Sentinel-1\\" footprint:\\"Intersects(POLYGON ((11.90274575621129 51.46641523383226, 11.90274575621129 51.50095226908388, 11.94774352810161 51.50095226908388, 11.94774352810161 51.46641523383226, 11.90274575621129 51.46641523383226)))\\"","opensearch:totalResults":"3","opensearch:startIndex":"0","opensearch:itemsPerPage":"100","opensearch:Query":{"role":"request","searchTerms":"beginPosition:[\\"2020-02-24T00:00:00Z\\" TO \\"2020-02-25T00:00:00Z\\"] platformname:\\"Sentinel-1\\" footprint:\\"Intersects(POLYGON ((11.90274575621129 51.46641523383226, 11.90274575621129 51.50095226908388, 11.94774352810161 51.50095226908388, 11.94774352810161 51.46641523383226, 11.90274575621129 51.46641523383226)))\\"","startPage":"1"},"link":[{"rel":"self","type":"application/json","href":"https://apihub.copernicus.eu/apihub/search?q=beginPosition:[\\"2020-02-24T00:00:00Z\\" TO \\"2020-02-25T00:00:00Z\\"] platformname:\\"Sentinel-1\\" footprint:\\"Intersects(POLYGON ((11.90274575621129 51.46641523383226, 11.90274575621129 51.50095226908388, 11.94774352810161 51.50095226908388, 11.94774352810161 51.46641523383226, 11.90274575621129 
51.46641523383226)))\\"&start=0&rows=100&format=json"},{"rel":"first","type":"application/json","href":"https://apihub.copernicus.eu/apihub/search?q=beginPosition:[\\"2020-02-24T00:00:00Z\\" TO \\"2020-02-25T00:00:00Z\\"] platformname:\\"Sentinel-1\\" footprint:\\"Intersects(POLYGON ((11.90274575621129 51.46641523383226, 11.90274575621129 51.50095226908388, 11.94774352810161 51.50095226908388, 11.94774352810161 51.46641523383226, 11.90274575621129 51.46641523383226)))\\"&start=0&rows=100&format=json"},{"rel":"last","type":"application/json","href":"https://apihub.copernicus.eu/apihub/search?q=beginPosition:[\\"2020-02-24T00:00:00Z\\" TO \\"2020-02-25T00:00:00Z\\"] platformname:\\"Sentinel-1\\" footprint:\\"Intersects(POLYGON ((11.90274575621129 51.46641523383226, 11.90274575621129 51.50095226908388, 11.94774352810161 51.50095226908388, 11.94774352810161 51.46641523383226, 11.90274575621129 51.46641523383226)))\\"&start=2&rows=100&format=json"},{"rel":"search","type":"application/opensearchdescription+xml","href":"opensearch_description.xml"}],"entry":[{"title":"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6","link":[{"href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8a611d5b-f9d9-437e-9f55-eca18cf79fd4\')/$value"},{"rel":"alternative","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8a611d5b-f9d9-437e-9f55-eca18cf79fd4\')/"},{"rel":"icon","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8a611d5b-f9d9-437e-9f55-eca18cf79fd4\')/Products(\'Quicklook\')/$value"}],"id":"8a611d5b-f9d9-437e-9f55-eca18cf79fd4","summary":"Date: 2020-02-24T05:25:28.861Z, Instrument: SAR-C SAR, Mode: VV VH, Satellite: Sentinel-1, Size: 7.65 
GB","ondemand":"false","date":[{"name":"beginposition","content":"2020-02-24T05:25:28.861Z"},{"name":"endposition","content":"2020-02-24T05:25:55.96Z"},{"name":"ingestiondate","content":"2020-02-24T09:44:57.338Z"}],"int":[{"name":"missiondatatakeid","content":"236786"},{"name":"slicenumber","content":"16"},{"name":"orbitnumber","content":"31390"},{"name":"lastorbitnumber","content":"31390"},{"name":"relativeorbitnumber","content":"168"},{"name":"lastrelativeorbitnumber","content":"168"}],"str":[{"name":"sensoroperationalmode","content":"IW"},{"name":"swathidentifier","content":"IW1 IW2 IW3"},{"name":"orbitdirection","content":"DESCENDING"},{"name":"producttype","content":"SLC"},{"name":"timeliness","content":"Fast-24h"},{"name":"platformname","content":"Sentinel-1"},{"name":"platformidentifier","content":"2014-016A"},{"name":"instrumentname","content":"Synthetic Aperture Radar (C-band)"},{"name":"instrumentshortname","content":"SAR-C SAR"},{"name":"filename","content":"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6.SAFE"},{"name":"format","content":"SAFE"},{"name":"productclass","content":"S"},{"name":"polarisationmode","content":"VV VH"},{"name":"acquisitiontype","content":"NOMINAL"},{"name":"status","content":"ARCHIVED"},{"name":"size","content":"7.65 GB"},{"name":"gmlfootprint","content":"<gml:Polygon srsName=\\"http://www.opengis.net/gml/srs/epsg.xml#4326\\" xmlns:gml=\\"http://www.opengis.net/gml\\">\\n <gml:outerBoundaryIs>\\n <gml:LinearRing>\\n <gml:coordinates>50.907326,13.664250 51.312340,10.003944 52.933346,10.404160 52.525398,14.201497 50.907326,13.664250</gml:coordinates>\\n </gml:LinearRing>\\n </gml:outerBoundaryIs>\\n</gml:Polygon>"},{"name":"footprint","content":"MULTIPOLYGON (((13.66425 50.907326, 14.201497 52.525398, 10.40416 52.933346, 10.003944 51.31234, 13.66425 
50.907326)))"},{"name":"identifier","content":"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6"},{"name":"uuid","content":"8a611d5b-f9d9-437e-9f55-eca18cf79fd4"}]},{"title":"S1A_IW_GRDH_1SDV_20200224T052530_20200224T052555_031390_039CF2_EBED","link":[{"href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8b78a444-e6a4-48bc-9aa8-6b6a00cfcd80\')/$value"},{"rel":"alternative","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8b78a444-e6a4-48bc-9aa8-6b6a00cfcd80\')/"},{"rel":"icon","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8b78a444-e6a4-48bc-9aa8-6b6a00cfcd80\')/Products(\'Quicklook\')/$value"}],"id":"8b78a444-e6a4-48bc-9aa8-6b6a00cfcd80","summary":"Date: 2020-02-24T05:25:30.023Z, Instrument: SAR-C SAR, Mode: VV VH, Satellite: Sentinel-1, Size: 1.64 GB","ondemand":"false","date":[{"name":"beginposition","content":"2020-02-24T05:25:30.023Z"},{"name":"endposition","content":"2020-02-24T05:25:55.021Z"},{"name":"ingestiondate","content":"2020-02-24T09:29:17.891Z"}],"int":[{"name":"missiondatatakeid","content":"236786"},{"name":"slicenumber","content":"16"},{"name":"orbitnumber","content":"31390"},{"name":"lastorbitnumber","content":"31390"},{"name":"relativeorbitnumber","content":"168"},{"name":"lastrelativeorbitnumber","content":"168"}],"str":[{"name":"sensoroperationalmode","content":"IW"},{"name":"swathidentifier","content":"IW"},{"name":"orbitdirection","content":"DESCENDING"},{"name":"producttype","content":"GRD"},{"name":"timeliness","content":"Fast-24h"},{"name":"platformname","content":"Sentinel-1"},{"name":"platformidentifier","content":"2014-016A"},{"name":"instrumentname","content":"Synthetic Aperture Radar (C-band)"},{"name":"instrumentshortname","content":"SAR-C SAR"},{"name":"filename","content":"S1A_IW_GRDH_1SDV_20200224T052530_20200224T052555_031390_039CF2_EBED.SAFE"},{"name":"format","content":"SAFE"},{"name":"productclass","content":"S"},{"name":"polarisationmode","content":"VV 
VH"},{"name":"acquisitiontype","content":"NOMINAL"},{"name":"status","content":"ARCHIVED"},{"name":"size","content":"1.64 GB"},{"name":"gmlfootprint","content":"<gml:Polygon srsName=\\"http://www.opengis.net/gml/srs/epsg.xml#4326\\" xmlns:gml=\\"http://www.opengis.net/gml\\">\\n <gml:outerBoundaryIs>\\n <gml:LinearRing>\\n <gml:coordinates>50.963913,13.678375 51.374710,9.951296 52.869320,10.326388 52.455978,14.178555 50.963913,13.678375</gml:coordinates>\\n </gml:LinearRing>\\n </gml:outerBoundaryIs>\\n</gml:Polygon>"},{"name":"footprint","content":"MULTIPOLYGON (((13.678375 50.963913, 14.178555 52.455978, 10.326388 52.86932, 9.951296 51.37471, 13.678375 50.963913)))"},{"name":"identifier","content":"S1A_IW_GRDH_1SDV_20200224T052530_20200224T052555_031390_039CF2_EBED"},{"name":"uuid","content":"8b78a444-e6a4-48bc-9aa8-6b6a00cfcd80"}]},{"title":"S1A_IW_RAW__0SDV_20200224T052526_20200224T052558_031390_039CF2_16A0","link":[{"href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'82cde615-bd22-44bb-98ca-b3e5d2811d32\')/$value"},{"rel":"alternative","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'82cde615-bd22-44bb-98ca-b3e5d2811d32\')/"},{"rel":"icon","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'82cde615-bd22-44bb-98ca-b3e5d2811d32\')/Products(\'Quicklook\')/$value"}],"id":"82cde615-bd22-44bb-98ca-b3e5d2811d32","summary":"Date: 2020-02-24T05:25:26.325Z, Instrument: SAR-C SAR, Mode: VH VV, Satellite: Sentinel-1, Size: 1.52 
GB","ondemand":"false","date":[{"name":"beginposition","content":"2020-02-24T05:25:26.325Z"},{"name":"endposition","content":"2020-02-24T05:25:58.724Z"},{"name":"ingestiondate","content":"2020-02-24T07:34:15.963Z"}],"int":[{"name":"missiondatatakeid","content":"236786"},{"name":"slicenumber","content":"16"},{"name":"orbitnumber","content":"31390"},{"name":"lastorbitnumber","content":"31390"},{"name":"relativeorbitnumber","content":"168"},{"name":"lastrelativeorbitnumber","content":"168"}],"str":[{"name":"sensoroperationalmode","content":"IW"},{"name":"orbitdirection","content":"DESCENDING"},{"name":"producttype","content":"RAW"},{"name":"platformname","content":"Sentinel-1"},{"name":"platformidentifier","content":"2014-016A"},{"name":"instrumentname","content":"Synthetic Aperture Radar (C-band)"},{"name":"instrumentshortname","content":"SAR-C SAR"},{"name":"filename","content":"S1A_IW_RAW__0SDV_20200224T052526_20200224T052558_031390_039CF2_16A0.SAFE"},{"name":"format","content":"SAFE"},{"name":"productclass","content":"S"},{"name":"polarisationmode","content":"VH VV"},{"name":"acquisitiontype","content":"NOMINAL"},{"name":"status","content":"ARCHIVED"},{"name":"size","content":"1.52 GB"},{"name":"gmlfootprint","content":"<gml:Polygon srsName=\\"http://www.opengis.net/gml/srs/epsg.xml#4326\\" xmlns:gml=\\"http://www.opengis.net/gml\\">\\n <gml:outerBoundaryIs>\\n <gml:LinearRing>\\n <gml:coordinates>52.8520,10.6192 50.9032,10.1431 50.6090,13.6194 52.5508,14.2489 52.8520,10.6192 52.8520,10.6192</gml:coordinates>\\n </gml:LinearRing>\\n </gml:outerBoundaryIs>\\n</gml:Polygon>"},{"name":"footprint","content":"MULTIPOLYGON (((13.6194 50.609, 14.2489 52.5508, 10.6192 52.852, 10.1431 50.9032, 13.6194 50.609)))"},{"name":"identifier","content":"S1A_IW_RAW__0SDV_20200224T052526_20200224T052558_031390_039CF2_16A0"},{"name":"productconsolidation","content":"SLICE"},{"name":"uuid","content":"82cde615-bd22-44bb-98ca-b3e5d2811d32"}]}]}}',
        )
        with Source(datahub=Datahub.Scihub) as src:
            meta = src.query_metadata(
                platform=Platform.Sentinel1,
                date=("20200224", "20200225"),
                aoi=aoi_4326,
            )
            cat = src._init_catalog()
            for item in meta:
                cat.add_item(item)
            item = cat.get_item("S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6")
            self.assertEqual(item.properties.get("srcuuid"), "8a611d5b-f9d9-437e-9f55-eca18cf79fd4")
            cat.normalize_hrefs(Path(gettempdir()).as_posix())
            cat.validate_all()

    @requests_mock.Mocker(real_http=True)
    def test_query_metadata_earth_explorer(self, m):
        m.post(
            "https://m2m.cr.usgs.gov/api/api/json/stable/login",
            content=b'{"requestId": 241064318, "version": "stable", "data": "token", "errorCode": null, "errorMessage": null, "sessionId": 51372983}',
        )
        m.get(
            "https://m2m.cr.usgs.gov/api/api/json/stable/scene-search",
content=b'{"requestId": 241178020, "version": "stable", "sessionId": 51396676, "data": {"results": [{"browse":[{"id":"5e83d0b86f2c3061","browseRotationEnabled":null,"browseName":"LandsatLook Natural Color Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1.jpg"},{"id":"5e83d0b85da62c02","browseRotationEnabled":null,"browseName":"LandsatLook Thermal Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_TIR.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00_tir","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_TIR.jpg"},{"id":"5e83d0b8b814a0e6","browseRotationEnabled":null,"browseName":"LandsatLook Quality Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_QB.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00_qb","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_QB.jpg"}],"cloudCover":"12.47","entityId":"LC81930242020082LGN00","displayId":"LC08_L1TP_193024_20200322_20200326_01_T1","orderingId":null,"metadata":[{"id":"5e83d0b82af07b21","fieldName":"Landsat Product 
Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#landsat_product_id","value":"LC08_L1TP_193024_20200322_20200326_01_T1"},{"id":"5e83d0b88275745","fieldName":"Landsat Scene Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#landsat_scene_id","value":"LC81930242020082LGN00"},{"id":"5e83d0b92ff6b5e8","fieldName":"Acquisition Date","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"2020\\/03\\/22"},{"id":"5e83d0b9332fd122","fieldName":"Collection Category","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"T1"},{"id":"5e83d0b9948e2596","fieldName":"Collection Number","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":1},{"id":"5e83d0b922c5f981","fieldName":"WRS Path","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 193"},{"id":"5e83d0b94328e34e","fieldName":"WRS Row","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 024"},{"id":"5e83d0b9a34ebe01","fieldName":"Target WRS Path","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 193"},{"id":"5e83d0b8852f2e23","fieldName":"Target WRS Row","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 024"},{"id":"5e83d0b9173b715a","fieldName":"Nadir\\/Off Nadir","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"NADIR"},{"id":"5e83d0b820eaa311","fieldName":"Roll Angle","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#roll_angle","value":"-0.001"},{"id":"5e83d0b8379c9dae","fieldName":"Date L-1 
Generated","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#date_l1_generated","value":"2020\\/03\\/26"},{"id":"5e83d0b9a78d1dbe","fieldName":"Start Time","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#start_time","value":"2020:082:10:02:29.1296010"},{"id":"5e83d0b887e40f8e","fieldName":"Stop Time","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#stop_time","value":"2020:082:10:03:00.8996000"},{"id":"5e83d0b926d7d304","fieldName":"Station Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#distribution_site","value":"LGN"},{"id":"5e83d0b91ad332e3","fieldName":"Day\\/Night Indicator","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#day_or_night","value":"DAY"},{"id":"5e83d0b864f5722e","fieldName":"Land Cloud Cover","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cloud_cover_land","value":"12.47"},{"id":"5e83d0b92e9d1b11","fieldName":"Scene Cloud Cover","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cloud_cover","value":"12.47"},{"id":"5e83d0b991213d01","fieldName":"Ground Control Points Model","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ground_control_points_model","value":253},{"id":"5e83d0b9e4a26b2a","fieldName":"Ground Control Points Version","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ground_control_points_version","value":4},{"id":"5e83d0b9bb19d4cc","fieldName":"Geometric RMSE Model (meters)","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model","value":"8.296"},{"id":"5e83d0b8f922f1d3","fieldName":"Geometric RMSE Model X","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model_x","value":"5.809"},{"id":"5e83d0b93b83213d","fieldName":"Geometric RMSE Model 
Y","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model_y","value":"5.924"},{"id":"5e83d0b9c9fa1556","fieldName":"Image Quality","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#image_quality_landsat_8","value":9},{"id":"5e83d0b8a926bb3e","fieldName":"Processing Software Version","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#processing_software_version","value":"LPGS_13.1.0"},{"id":"5e83d0b84092b361","fieldName":"Sun Elevation L1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sun_elevation","value":"36.94009856"},{"id":"5e83d0b8bf763033","fieldName":"Sun Azimuth L1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sun_azimuth","value":"156.99969272"},{"id":"5e83d0b97cb734c3","fieldName":"TIRS SSM Model","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#tirs_ssm_model","value":"FINAL"},{"id":"5e83d0b861614fa4","fieldName":"Data Type Level-1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#data_type_l1","value":"OLI_TIRS_L1TP"},{"id":"5e83d0b993a4fa4a","fieldName":"Sensor Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sensor_id ","value":"OLI_TIRS"},{"id":"5e83d0b9de23d772","fieldName":"Panchromatic Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#panchromatic_lines","value":16301},{"id":"5e83d0b8100f0577","fieldName":"Panchromatic Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#panchromatic_samples","value":16101},{"id":"5e83d0b92bc96899","fieldName":"Reflective Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#reflective_lines","value":8151},{"id":"5e83d0b8bbb70baf","fieldName":"Reflective 
Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#reflective_samples","value":8051},{"id":"5e83d0b955642ca4","fieldName":"Thermal Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#thermal_lines","value":8151},{"id":"5e83d0b96b1a0d35","fieldName":"Thermal Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#thermal_samples","value":8051},{"id":"5e83d0b9104800de","fieldName":"Map Projection Level-1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#map_projection_l1","value":"UTM"},{"id":"5e83d0b9fb0dce2d","fieldName":"UTM Zone","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#utm_zone","value":33},{"id":"5e83d0b8fd64f557","fieldName":"Datum","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#datum","value":"WGS84"},{"id":"5e83d0b84cc2632e","fieldName":"Ellipsoid","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ellipsoid","value":"WGS84"},{"id":"5e83d0b98f651440","fieldName":"Grid Cell Size Panchromatic","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_panchromatic","value":"15.00"},{"id":"5e83d0b9c6eaa87d","fieldName":"Grid Cell Size Reflective","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_reflective","value":"30.00"},{"id":"5e83d0b8c810ced","fieldName":"Grid Cell Size Thermal","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_thermal","value":"30.00"},{"id":"5e83d0b8a40bafa4","fieldName":"Bias Parameter File Name OLI","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#bpf_name_oli","value":"LO8BPF20200322095233_20200322113039.01"},{"id":"5e83d0b964a91d93","fieldName":"Bias Parameter File Name 
TIRS","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#bpf_name_tirs","value":"LT8BPF20200310060739_20200324104153.01"},{"id":"5e83d0b9b5d81214","fieldName":"Calibration Parameter File","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cpf_name","value":"LC08CPF_20200101_20200331_01.04"},{"id":"5e83d0b9c67b22a3","fieldName":"RLUT File Name","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#rlut_file_name","value":"LC08RLUT_20150303_20431231_01_12.h5"},{"id":"5e83d0b9a610a996","fieldName":"Center Latitude","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"51°41\'35.34\\"N"},{"id":"5e83d0b94fad23df","fieldName":"Center Longitude","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"12°48\'15.73\\"E"},{"id":"5e83d0b9c1d54551","fieldName":"UL Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"52°45\'50.58\\"N"},{"id":"5e83d0b8d91068e","fieldName":"UL Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"11°49\'23.52\\"E"},{"id":"5e83d0b94ff6f17e","fieldName":"UR Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"52°17\'44.92\\"N"},{"id":"5e83d0b9a76119fe","fieldName":"UR Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"14°32\'32.78\\"E"},{"id":"5e83d0b9f29120e3","fieldName":"LL Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"51°03\'53.57\\"N"},{"id":"5e83d0b988d2162b","fieldName":"LL Corner 
Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"11°06\'35.75\\"E"},{"id":"5e83d0b96162233","fieldName":"LR Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"50°36\'17.24\\"N"},{"id":"5e83d0b894a09772","fieldName":"LR Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"13°43\'57.40\\"E"},{"id":"5e83d0b87f203a10","fieldName":"Center Latitude dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"51.69315"},{"id":"5e83d0b9b2a9a299","fieldName":"Center Longitude dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"12.80437"},{"id":"5e83d0b8bc51bf5b","fieldName":"UL Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"52.76405"},{"id":"5e83d0b95071b3bf","fieldName":"UL Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"11.82320"},{"id":"5e83d0b9842b0429","fieldName":"UR Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"52.29581"},{"id":"5e83d0b931a16a9","fieldName":"UR Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"14.54244"},{"id":"5e83d0b8abad8ec9","fieldName":"LL Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"51.06488"},{"id":"5e83d0b92a9532e0","fieldName":"LL Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"11.10993"},{"id":"5e83d0b834fea374","fieldName":"LR Corner Lat 
dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"50.60479"},{"id":"5e83d0b9f619dbbe","fieldName":"LR Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"13.73261"}],"hasCustomizedMetadata":null,"options":{"bulk":true,"download":true,"order":true,"secondary":false},"selected":{"bulk":false,"compare":false,"order":false},"spatialBounds":{"type":"Polygon","coordinates":[[[11.10993,50.60479],[11.10993,52.76405],[14.54244,52.76405],[14.54244,50.60479],[11.10993,50.60479]]]},"spatialCoverage":{"type":"Polygon","coordinates":[[[11.10993,51.06488],[13.73261,50.60479],[14.54244,52.29581],[11.8232,52.76405],[11.10993,51.06488]]]},"temporalCoverage":{"endDate":"2020-03-22 00:00:00","startDate":"2020-03-22 00:00:00"},"publishDate":"2020-03-22 09:56:49"}],"recordsReturned": 1,"totalHits": 1,"isCustomized": false,"numExcluded": 0,"startingNumber": 1,"nextRecord": 1}, "errorCode": null, "errorMessage": null}',
        )
        m.get(
            "https://m2m.cr.usgs.gov/api/api/json/stable/logout",
            content=b'{"errorCode":null,"error":"","data":true,"api_version":"1.4.1","access_level":"user","catalog_id":"EE","executionTime":0.4}',
        )
        with Source(datahub=Datahub.EarthExplorer) as src:
            meta = src.query_metadata(
                platform=Platform.Landsat8,
                date=("20200310", "20200325"),
                aoi=aoi_bbox,
                cloud_cover=(0, 20),
            )
            cat = src._init_catalog()
            for item in meta:
                cat.add_item(item)
            item = cat.get_item("LC08_L1TP_193024_20200322_20200326_01_T1")
            self.assertEqual(item.properties.get("srcuuid"), "LC81930242020082LGN00")
            cat.normalize_hrefs(Path(gettempdir()).as_posix())
            cat.validate_all()

    @requests_mock.Mocker(real_http=True)
    def test_query_metadata_srcid_scihub(self, m):
        m.get(
            "https://apihub.copernicus.eu/apihub/search?format=json&rows=100&start=0&q=identifier%3A%22S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6%22",
content=b'{"feed":{"xmlns:opensearch":"http://a9.com/-/spec/opensearch/1.1/","xmlns":"http://www.w3.org/2005/Atom","title":"Sentinels Scientific Data Hub search results for: identifier:\\"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6\\"","subtitle":"Displaying 1 results. Request done in 0.038 seconds.","updated":"2021-06-23T14:14:46.076Z","author":{"name":"Sentinels Scientific Data Hub"},"id":"https://apihub.copernicus.eu/apihub/search?q=identifier:\\"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6\\"","opensearch:totalResults":"1","opensearch:startIndex":"0","opensearch:itemsPerPage":"100","opensearch:Query":{"role":"request","searchTerms":"identifier:\\"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6\\"","startPage":"1"},"link":[{"rel":"self","type":"application/json","href":"https://apihub.copernicus.eu/apihub/search?q=identifier:\\"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6\\"&start=0&rows=100&format=json"},{"rel":"first","type":"application/json","href":"https://apihub.copernicus.eu/apihub/search?q=identifier:\\"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6\\"&start=0&rows=100&format=json"},{"rel":"last","type":"application/json","href":"https://apihub.copernicus.eu/apihub/search?q=identifier:\\"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6\\"&start=0&rows=100&format=json"},{"rel":"search","type":"application/opensearchdescription+xml","href":"opensearch_description.xml"}],"entry":{"title":"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6","link":[{"href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8a611d5b-f9d9-437e-9f55-eca18cf79fd4\')/$value"},{"rel":"alternative","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8a611d5b-f9d9-437e-9f55-eca18cf79fd4\')/"},{"rel":"icon","href":"https://apihub.copernicus.eu/apihub/odata/v1/Products(\'8a611d5b-f9d9-437e-9f55-eca18cf79fd4\')/Products(\'Qu
icklook\')/$value"}],"id":"8a611d5b-f9d9-437e-9f55-eca18cf79fd4","summary":"Date: 2020-02-24T05:25:28.861Z, Instrument: SAR-C SAR, Mode: VV VH, Satellite: Sentinel-1, Size: 7.65 GB","ondemand":"false","date":[{"name":"beginposition","content":"2020-02-24T05:25:28.861Z"},{"name":"endposition","content":"2020-02-24T05:25:55.96Z"},{"name":"ingestiondate","content":"2020-02-24T09:44:57.338Z"}],"int":[{"name":"missiondatatakeid","content":"236786"},{"name":"slicenumber","content":"16"},{"name":"orbitnumber","content":"31390"},{"name":"lastorbitnumber","content":"31390"},{"name":"relativeorbitnumber","content":"168"},{"name":"lastrelativeorbitnumber","content":"168"}],"str":[{"name":"sensoroperationalmode","content":"IW"},{"name":"swathidentifier","content":"IW1 IW2 IW3"},{"name":"orbitdirection","content":"DESCENDING"},{"name":"producttype","content":"SLC"},{"name":"timeliness","content":"Fast-24h"},{"name":"platformname","content":"Sentinel-1"},{"name":"platformidentifier","content":"2014-016A"},{"name":"instrumentname","content":"Synthetic Aperture Radar (C-band)"},{"name":"instrumentshortname","content":"SAR-C SAR"},{"name":"filename","content":"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6.SAFE"},{"name":"format","content":"SAFE"},{"name":"productclass","content":"S"},{"name":"polarisationmode","content":"VV VH"},{"name":"acquisitiontype","content":"NOMINAL"},{"name":"status","content":"ARCHIVED"},{"name":"size","content":"7.65 GB"},{"name":"gmlfootprint","content":"<gml:Polygon srsName=\\"http://www.opengis.net/gml/srs/epsg.xml#4326\\" xmlns:gml=\\"http://www.opengis.net/gml\\">\\n <gml:outerBoundaryIs>\\n <gml:LinearRing>\\n <gml:coordinates>50.907326,13.664250 51.312340,10.003944 52.933346,10.404160 52.525398,14.201497 50.907326,13.664250</gml:coordinates>\\n </gml:LinearRing>\\n </gml:outerBoundaryIs>\\n</gml:Polygon>"},{"name":"footprint","content":"MULTIPOLYGON (((13.66425 50.907326, 14.201497 52.525398, 10.40416 52.933346, 10.003944 
51.31234, 13.66425 50.907326)))"},{"name":"identifier","content":"S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6"},{"name":"uuid","content":"8a611d5b-f9d9-437e-9f55-eca18cf79fd4"}]}}}',
        )
        with Source(datahub=Datahub.Scihub) as src:
            meta = src.query_metadata_srcid(
                platform=Platform.Sentinel1,
                srcid="S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6",
            )
            cat = src._init_catalog()
            for item in meta:
                cat.add_item(item)
            item = cat.get_item("S1A_IW_SLC__1SDV_20200224T052528_20200224T052555_031390_039CF2_BEA6")
            self.assertEqual(item.properties.get("srcuuid"), "8a611d5b-f9d9-437e-9f55-eca18cf79fd4")
            cat.normalize_hrefs(Path(gettempdir()).as_posix())
            item.validate()

    @requests_mock.Mocker(real_http=True)
    def test_query_metadata_srcid_earth_explorer(self, m):
        m.post(
            "https://m2m.cr.usgs.gov/api/api/json/stable/login",
            content=b'{"requestId": 241064318, "version": "stable", "data": "token", "errorCode": null, "errorMessage": null, "sessionId": 51372983}',
        )
        m.get(
            "https://m2m.cr.usgs.gov/api/api/json/stable/scene-list-add",
content=b'{"requestId": 241159456, "version": "stable", "sessionId": 51393323, "data": {"browse":[{"id":"5e83d0b86f2c3061","browseRotationEnabled":false,"browseName":"LandsatLook Natural Color Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1.jpg"},{"id":"5e83d0b85da62c02","browseRotationEnabled":false,"browseName":"LandsatLook Thermal Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_TIR.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00_tir","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_TIR.jpg"},{"id":"5e83d0b8b814a0e6","browseRotationEnabled":false,"browseName":"LandsatLook Quality Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_QB.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00_qb","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_QB.jpg"}],"cloudCover":"12.47","entityId":"LC81930242020082LGN00","displayId":"LC08_L1TP_193024_20200322_20200326_01_T1","orderingId":"LC81930242020082LGN00","metadata":[{"id":"5e83d0b82af07b21","fieldName":"Landsat Product 
Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#landsat_product_id","value":"LC08_L1TP_193024_20200322_20200326_01_T1"},{"id":"5e83d0b88275745","fieldName":"Landsat Scene Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#landsat_scene_id","value":"LC81930242020082LGN00"},{"id":"5e83d0b92ff6b5e8","fieldName":"Acquisition Date","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"2020\\/03\\/22"},{"id":"5e83d0b9332fd122","fieldName":"Collection Category","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"T1"},{"id":"5e83d0b9948e2596","fieldName":"Collection Number","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":1},{"id":"5e83d0b922c5f981","fieldName":"WRS Path","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 193"},{"id":"5e83d0b94328e34e","fieldName":"WRS Row","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 024"},{"id":"5e83d0b9a34ebe01","fieldName":"Target WRS Path","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 193"},{"id":"5e83d0b8852f2e23","fieldName":"Target WRS Row","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 024"},{"id":"5e83d0b9173b715a","fieldName":"Nadir\\/Off Nadir","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"NADIR"},{"id":"5e83d0b820eaa311","fieldName":"Roll Angle","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#roll_angle","value":"-0.001"},{"id":"5e83d0b8379c9dae","fieldName":"Date L-1 
Generated","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#date_l1_generated","value":"2020\\/03\\/26"},{"id":"5e83d0b9a78d1dbe","fieldName":"Start Time","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#start_time","value":"2020:082:10:02:29.1296010"},{"id":"5e83d0b887e40f8e","fieldName":"Stop Time","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#stop_time","value":"2020:082:10:03:00.8996000"},{"id":"5e83d0b926d7d304","fieldName":"Station Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#distribution_site","value":"LGN"},{"id":"5e83d0b91ad332e3","fieldName":"Day\\/Night Indicator","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#day_or_night","value":"DAY"},{"id":"5e83d0b864f5722e","fieldName":"Land Cloud Cover","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cloud_cover_land","value":"12.47"},{"id":"5e83d0b92e9d1b11","fieldName":"Scene Cloud Cover","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cloud_cover","value":"12.47"},{"id":"5e83d0b991213d01","fieldName":"Ground Control Points Model","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ground_control_points_model","value":253},{"id":"5e83d0b9e4a26b2a","fieldName":"Ground Control Points Version","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ground_control_points_version","value":4},{"id":"5e83d0b9bb19d4cc","fieldName":"Geometric RMSE Model (meters)","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model","value":"8.296"},{"id":"5e83d0b8f922f1d3","fieldName":"Geometric RMSE Model X","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model_x","value":"5.809"},{"id":"5e83d0b93b83213d","fieldName":"Geometric RMSE Model 
Y","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model_y","value":"5.924"},{"id":"5e83d0b9c9fa1556","fieldName":"Image Quality","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#image_quality_landsat_8","value":9},{"id":"5e83d0b8a926bb3e","fieldName":"Processing Software Version","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#processing_software_version","value":"LPGS_13.1.0"},{"id":"5e83d0b84092b361","fieldName":"Sun Elevation L1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sun_elevation","value":"36.94009856"},{"id":"5e83d0b8bf763033","fieldName":"Sun Azimuth L1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sun_azimuth","value":"156.99969272"},{"id":"5e83d0b97cb734c3","fieldName":"TIRS SSM Model","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#tirs_ssm_model","value":"FINAL"},{"id":"5e83d0b861614fa4","fieldName":"Data Type Level-1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#data_type_l1","value":"OLI_TIRS_L1TP"},{"id":"5e83d0b993a4fa4a","fieldName":"Sensor Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sensor_id ","value":"OLI_TIRS"},{"id":"5e83d0b9de23d772","fieldName":"Panchromatic Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#panchromatic_lines","value":16301},{"id":"5e83d0b8100f0577","fieldName":"Panchromatic Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#panchromatic_samples","value":16101},{"id":"5e83d0b92bc96899","fieldName":"Reflective Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#reflective_lines","value":8151},{"id":"5e83d0b8bbb70baf","fieldName":"Reflective 
Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#reflective_samples","value":8051},{"id":"5e83d0b955642ca4","fieldName":"Thermal Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#thermal_lines","value":8151},{"id":"5e83d0b96b1a0d35","fieldName":"Thermal Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#thermal_samples","value":8051},{"id":"5e83d0b9104800de","fieldName":"Map Projection Level-1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#map_projection_l1","value":"UTM"},{"id":"5e83d0b9fb0dce2d","fieldName":"UTM Zone","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#utm_zone","value":33},{"id":"5e83d0b8fd64f557","fieldName":"Datum","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#datum","value":"WGS84"},{"id":"5e83d0b84cc2632e","fieldName":"Ellipsoid","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ellipsoid","value":"WGS84"},{"id":"5e83d0b98f651440","fieldName":"Grid Cell Size Panchromatic","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_panchromatic","value":"15.00"},{"id":"5e83d0b9c6eaa87d","fieldName":"Grid Cell Size Reflective","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_reflective","value":"30.00"},{"id":"5e83d0b8c810ced","fieldName":"Grid Cell Size Thermal","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_thermal","value":"30.00"},{"id":"5e83d0b8a40bafa4","fieldName":"Bias Parameter File Name OLI","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#bpf_name_oli","value":"LO8BPF20200322095233_20200322113039.01"},{"id":"5e83d0b964a91d93","fieldName":"Bias Parameter File Name 
TIRS","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#bpf_name_tirs","value":"LT8BPF20200310060739_20200324104153.01"},{"id":"5e83d0b9b5d81214","fieldName":"Calibration Parameter File","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cpf_name","value":"LC08CPF_20200101_20200331_01.04"},{"id":"5e83d0b9c67b22a3","fieldName":"RLUT File Name","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#rlut_file_name","value":"LC08RLUT_20150303_20431231_01_12.h5"},{"id":"5e83d0b9a610a996","fieldName":"Center Latitude","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"51°41\'35.34\\"N"},{"id":"5e83d0b94fad23df","fieldName":"Center Longitude","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"12°48\'15.73\\"E"},{"id":"5e83d0b9c1d54551","fieldName":"UL Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"52°45\'50.58\\"N"},{"id":"5e83d0b8d91068e","fieldName":"UL Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"11°49\'23.52\\"E"},{"id":"5e83d0b94ff6f17e","fieldName":"UR Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"52°17\'44.92\\"N"},{"id":"5e83d0b9a76119fe","fieldName":"UR Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"14°32\'32.78\\"E"},{"id":"5e83d0b9f29120e3","fieldName":"LL Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"51°03\'53.57\\"N"},{"id":"5e83d0b988d2162b","fieldName":"LL Corner 
Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"11°06\'35.75\\"E"},{"id":"5e83d0b96162233","fieldName":"LR Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"50°36\'17.24\\"N"},{"id":"5e83d0b894a09772","fieldName":"LR Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"13°43\'57.40\\"E"},{"id":"5e83d0b87f203a10","fieldName":"Center Latitude dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"51.69315"},{"id":"5e83d0b9b2a9a299","fieldName":"Center Longitude dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"12.80437"},{"id":"5e83d0b8bc51bf5b","fieldName":"UL Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"52.76405"},{"id":"5e83d0b95071b3bf","fieldName":"UL Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"11.82320"},{"id":"5e83d0b9842b0429","fieldName":"UR Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"52.29581"},{"id":"5e83d0b931a16a9","fieldName":"UR Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"14.54244"},{"id":"5e83d0b8abad8ec9","fieldName":"LL Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"51.06488"},{"id":"5e83d0b92a9532e0","fieldName":"LL Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"11.10993"},{"id":"5e83d0b834fea374","fieldName":"LR Corner Lat 
dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"50.60479"},{"id":"5e83d0b9f619dbbe","fieldName":"LR Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"13.73261"}],"hasCustomizedMetadata":false,"options":{"bulk":true,"download":true,"order":true,"secondary":false},"selected":null,"spatialBounds":{"type":"Polygon","coordinates":[[[11.10993,50.60479],[11.10993,52.76405],[14.54244,52.76405],[14.54244,50.60479],[11.10993,50.60479]]]},"spatialCoverage":{"type":"Polygon","coordinates":[[[11.10993,51.06488],[13.73261,50.60479],[14.54244,52.29581],[11.8232,52.76405],[11.10993,51.06488]]]},"temporalCoverage":{"endDate":"2020-03-22 00:00:00","startDate":"2020-03-22 00:00:00"},"publishDate":"2020-03-22T09:56:49"}, "errorCode": null, "errorMessage": null}',
        )
        m.get(
            "https://m2m.cr.usgs.gov/api/api/json/stable/scene-list-get",
            content=b'{"requestId": 241160510, "version": "stable", "sessionId": 51393558, "data": [{"entityId":"LC81930242020082LGN00","datasetName":"landsat_8_c1"}], "errorCode": null, "errorMessage": null}',
        )
        m.get(
            "https://m2m.cr.usgs.gov/api/api/json/stable/scene-list-remove",
            content=b'{"requestId": 241161482, "version": "stable", "sessionId": 51393761, "data": null, "errorCode": null, "errorMessage": null}',
        )
        m.get(
            "https://m2m.cr.usgs.gov/api/api/json/stable/scene-metadata",
content=b'{"requestId": 241162488, "version": "stable", "sessionId": 51393950, "data": {"browse":[{"id":"5e83d0b86f2c3061","browseRotationEnabled":false,"browseName":"LandsatLook Natural Color Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1.jpg"},{"id":"5e83d0b85da62c02","browseRotationEnabled":false,"browseName":"LandsatLook Thermal Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_TIR.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00_tir","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_TIR.jpg"},{"id":"5e83d0b8b814a0e6","browseRotationEnabled":false,"browseName":"LandsatLook Quality Preview Image","browsePath":"https:\\/\\/ims.cr.usgs.gov\\/browse\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_QB.jpg","overlayPath":"https:\\/\\/ims.cr.usgs.gov\\/wms\\/landsat_8_c1?sceneId=lc81930242020082lgn00_qb","overlayType":"dmid_wms","thumbnailPath":"https:\\/\\/ims.cr.usgs.gov\\/thumbnail\\/landsat_8_c1\\/2020\\/193\\/024\\/LC08_L1TP_193024_20200322_20200326_01_T1_QB.jpg"}],"cloudCover":"12.47","entityId":"LC81930242020082LGN00","displayId":"LC08_L1TP_193024_20200322_20200326_01_T1","orderingId":"LC81930242020082LGN00","metadata":[{"id":"5e83d0b82af07b21","fieldName":"Landsat Product 
Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#landsat_product_id","value":"LC08_L1TP_193024_20200322_20200326_01_T1"},{"id":"5e83d0b88275745","fieldName":"Landsat Scene Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#landsat_scene_id","value":"LC81930242020082LGN00"},{"id":"5e83d0b92ff6b5e8","fieldName":"Acquisition Date","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"2020\\/03\\/22"},{"id":"5e83d0b9332fd122","fieldName":"Collection Category","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"T1"},{"id":"5e83d0b9948e2596","fieldName":"Collection Number","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":1},{"id":"5e83d0b922c5f981","fieldName":"WRS Path","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 193"},{"id":"5e83d0b94328e34e","fieldName":"WRS Row","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 024"},{"id":"5e83d0b9a34ebe01","fieldName":"Target WRS Path","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 193"},{"id":"5e83d0b8852f2e23","fieldName":"Target WRS Row","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":" 024"},{"id":"5e83d0b9173b715a","fieldName":"Nadir\\/Off Nadir","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_c2_dictionary.html#acquisition_date","value":"NADIR"},{"id":"5e83d0b820eaa311","fieldName":"Roll Angle","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#roll_angle","value":"-0.001"},{"id":"5e83d0b8379c9dae","fieldName":"Date L-1 
Generated","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#date_l1_generated","value":"2020\\/03\\/26"},{"id":"5e83d0b9a78d1dbe","fieldName":"Start Time","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#start_time","value":"2020:082:10:02:29.1296010"},{"id":"5e83d0b887e40f8e","fieldName":"Stop Time","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#stop_time","value":"2020:082:10:03:00.8996000"},{"id":"5e83d0b926d7d304","fieldName":"Station Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#distribution_site","value":"LGN"},{"id":"5e83d0b91ad332e3","fieldName":"Day\\/Night Indicator","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#day_or_night","value":"DAY"},{"id":"5e83d0b864f5722e","fieldName":"Land Cloud Cover","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cloud_cover_land","value":"12.47"},{"id":"5e83d0b92e9d1b11","fieldName":"Scene Cloud Cover","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cloud_cover","value":"12.47"},{"id":"5e83d0b991213d01","fieldName":"Ground Control Points Model","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ground_control_points_model","value":253},{"id":"5e83d0b9e4a26b2a","fieldName":"Ground Control Points Version","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ground_control_points_version","value":4},{"id":"5e83d0b9bb19d4cc","fieldName":"Geometric RMSE Model (meters)","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model","value":"8.296"},{"id":"5e83d0b8f922f1d3","fieldName":"Geometric RMSE Model X","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model_x","value":"5.809"},{"id":"5e83d0b93b83213d","fieldName":"Geometric RMSE Model 
Y","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#geometric_rmse_model_y","value":"5.924"},{"id":"5e83d0b9c9fa1556","fieldName":"Image Quality","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#image_quality_landsat_8","value":9},{"id":"5e83d0b8a926bb3e","fieldName":"Processing Software Version","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#processing_software_version","value":"LPGS_13.1.0"},{"id":"5e83d0b84092b361","fieldName":"Sun Elevation L1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sun_elevation","value":"36.94009856"},{"id":"5e83d0b8bf763033","fieldName":"Sun Azimuth L1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sun_azimuth","value":"156.99969272"},{"id":"5e83d0b97cb734c3","fieldName":"TIRS SSM Model","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#tirs_ssm_model","value":"FINAL"},{"id":"5e83d0b861614fa4","fieldName":"Data Type Level-1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#data_type_l1","value":"OLI_TIRS_L1TP"},{"id":"5e83d0b993a4fa4a","fieldName":"Sensor Identifier","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#sensor_id ","value":"OLI_TIRS"},{"id":"5e83d0b9de23d772","fieldName":"Panchromatic Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#panchromatic_lines","value":16301},{"id":"5e83d0b8100f0577","fieldName":"Panchromatic Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#panchromatic_samples","value":16101},{"id":"5e83d0b92bc96899","fieldName":"Reflective Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#reflective_lines","value":8151},{"id":"5e83d0b8bbb70baf","fieldName":"Reflective 
Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#reflective_samples","value":8051},{"id":"5e83d0b955642ca4","fieldName":"Thermal Lines","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#thermal_lines","value":8151},{"id":"5e83d0b96b1a0d35","fieldName":"Thermal Samples","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#thermal_samples","value":8051},{"id":"5e83d0b9104800de","fieldName":"Map Projection Level-1","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#map_projection_l1","value":"UTM"},{"id":"5e83d0b9fb0dce2d","fieldName":"UTM Zone","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#utm_zone","value":33},{"id":"5e83d0b8fd64f557","fieldName":"Datum","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#datum","value":"WGS84"},{"id":"5e83d0b84cc2632e","fieldName":"Ellipsoid","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#ellipsoid","value":"WGS84"},{"id":"5e83d0b98f651440","fieldName":"Grid Cell Size Panchromatic","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_panchromatic","value":"15.00"},{"id":"5e83d0b9c6eaa87d","fieldName":"Grid Cell Size Reflective","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_reflective","value":"30.00"},{"id":"5e83d0b8c810ced","fieldName":"Grid Cell Size Thermal","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#grid_cell_size_thermal","value":"30.00"},{"id":"5e83d0b8a40bafa4","fieldName":"Bias Parameter File Name OLI","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#bpf_name_oli","value":"LO8BPF20200322095233_20200322113039.01"},{"id":"5e83d0b964a91d93","fieldName":"Bias Parameter File Name 
TIRS","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#bpf_name_tirs","value":"LT8BPF20200310060739_20200324104153.01"},{"id":"5e83d0b9b5d81214","fieldName":"Calibration Parameter File","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#cpf_name","value":"LC08CPF_20200101_20200331_01.04"},{"id":"5e83d0b9c67b22a3","fieldName":"RLUT File Name","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#rlut_file_name","value":"LC08RLUT_20150303_20431231_01_12.h5"},{"id":"5e83d0b9a610a996","fieldName":"Center Latitude","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"51°41\'35.34\\"N"},{"id":"5e83d0b94fad23df","fieldName":"Center Longitude","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"12°48\'15.73\\"E"},{"id":"5e83d0b9c1d54551","fieldName":"UL Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"52°45\'50.58\\"N"},{"id":"5e83d0b8d91068e","fieldName":"UL Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"11°49\'23.52\\"E"},{"id":"5e83d0b94ff6f17e","fieldName":"UR Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"52°17\'44.92\\"N"},{"id":"5e83d0b9a76119fe","fieldName":"UR Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"14°32\'32.78\\"E"},{"id":"5e83d0b9f29120e3","fieldName":"LL Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"51°03\'53.57\\"N"},{"id":"5e83d0b988d2162b","fieldName":"LL Corner 
Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"11°06\'35.75\\"E"},{"id":"5e83d0b96162233","fieldName":"LR Corner Lat","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"50°36\'17.24\\"N"},{"id":"5e83d0b894a09772","fieldName":"LR Corner Long","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_degrees","value":"13°43\'57.40\\"E"},{"id":"5e83d0b87f203a10","fieldName":"Center Latitude dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"51.69315"},{"id":"5e83d0b9b2a9a299","fieldName":"Center Longitude dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"12.80437"},{"id":"5e83d0b8bc51bf5b","fieldName":"UL Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"52.76405"},{"id":"5e83d0b95071b3bf","fieldName":"UL Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"11.82320"},{"id":"5e83d0b9842b0429","fieldName":"UR Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"52.29581"},{"id":"5e83d0b931a16a9","fieldName":"UR Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"14.54244"},{"id":"5e83d0b8abad8ec9","fieldName":"LL Corner Lat dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"51.06488"},{"id":"5e83d0b92a9532e0","fieldName":"LL Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"11.10993"},{"id":"5e83d0b834fea374","fieldName":"LR Corner Lat 
dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"50.60479"},{"id":"5e83d0b9f619dbbe","fieldName":"LR Corner Long dec","dictionaryLink":"https:\\/\\/lta.cr.usgs.gov\\/DD\\/landsat_dictionary.html#coordinates_decimal","value":"13.73261"}],"hasCustomizedMetadata":false,"options":{"bulk":true,"download":true,"order":true,"secondary":false},"selected":null,"spatialBounds":{"type":"Polygon","coordinates":[[[11.10993,50.60479],[11.10993,52.76405],[14.54244,52.76405],[14.54244,50.60479],[11.10993,50.60479]]]},"spatialCoverage":{"type":"Polygon","coordinates":[[[11.10993,51.06488],[13.73261,50.60479],[14.54244,52.29581],[11.8232,52.76405],[11.10993,51.06488]]]},"temporalCoverage":{"endDate":"2020-03-22 00:00:00","startDate":"2020-03-22 00:00:00"},"publishDate":"2020-03-22T09:56:49"}, "errorCode": null, "errorMessage": null}',
        )
        m.get(
            "https://m2m.cr.usgs.gov/api/api/json/stable/logout",
            content=b'{"errorCode":null,"error":"","data":true,"api_version":"1.4.1","access_level":"user","catalog_id":"EE","executionTime":0.4}',
        )
        with Source(datahub=Datahub.EarthExplorer) as src:
            meta = src.query_metadata_srcid(
                platform=Platform.Landsat8,
                srcid="LC08_L1TP_193024_20200322_20200326_01_T1",
            )
            cat = src._init_catalog()
            for item in meta:
                cat.add_item(item)
            item = cat.get_item("LC08_L1TP_193024_20200322_20200326_01_T1")
            self.assertEqual(item.properties.get("srcuuid"), "LC81930242020082LGN00")
            cat.normalize_hrefs(Path(gettempdir()).as_posix())
            item.validate()


if __name__ == "__main__":
    unittest.main()
| 342.618557 | 14,343 | 0.716044 | 8,408 | 66,468 | 5.524619 | 0.087536 | 0.030613 | 0.045919 | 0.103851 | 0.941659 | 0.932553 | 0.925642 | 0.924027 | 0.920583 | 0.912984 | 0 | 0.150105 | 0.04372 | 66,468 | 193 | 14,344 | 344.393782 | 0.580693 | 0 | 0 | 0.421053 | 0 | 0.25731 | 0.851011 | 0.640037 | 0 | 0 | 0 | 0 | 0.076023 | 1 | 0.064327 | false | 0 | 0.052632 | 0 | 0.122807 | 0.017544 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
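The test above registers canned HTTP responses on a mock adapter (`m.get(url, content=...)`) so the EarthExplorer client can be exercised offline. A minimal stdlib-only sketch of the same idea, stubbing the session with `unittest.mock` instead of an adapter; the `fetch_logout_status` helper and its response handling are illustrative assumptions, not part of the project:

```python
import json
from unittest import mock


def fetch_logout_status(session):
    """Call the logout endpoint via the given session and report success."""
    resp = session.get("https://m2m.cr.usgs.gov/api/api/json/stable/logout")
    payload = json.loads(resp.content)  # json.loads accepts bytes
    return payload["errorCode"] is None


# Stub the session: any .get() returns a canned response object,
# much like m.get(...) registers a canned body for a URL above.
session = mock.Mock()
session.get.return_value = mock.Mock(
    content=b'{"errorCode": null, "data": true}'
)
assert fetch_logout_status(session) is True
```

The same canned-response pattern scales to the large metadata payloads mocked above: each URL gets one frozen body, so the assertions stay deterministic.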
ab11d1c7447e7b4dffc7ed203c16e17c53c917c8 | 81 | py | Python | Drawing_a_Shape.py | PiggyAwesome/Learn-Python-Full-Course-for-Beginners-Tutorial-code | c164492a757cb825b73af1014f95aef884ac49af | [
"Unlicense"
] | 2 | 2021-08-11T15:53:16.000Z | 2021-09-13T13:43:59.000Z | Drawing_a_Shape.py | PiggyAwesome/Learn-Python-Full-Course-for-Beginners-Tutorial-code | c164492a757cb825b73af1014f95aef884ac49af | [
"Unlicense"
] | null | null | null | Drawing_a_Shape.py | PiggyAwesome/Learn-Python-Full-Course-for-Beginners-Tutorial-code | c164492a757cb825b73af1014f95aef884ac49af | [
"Unlicense"
] | null | null | null | # Drawing a Shape
print(" /|")
print(" / |")
print(" / |")
print("/___|") | 16.2 | 18 | 0.444444 | 7 | 81 | 4.714286 | 0.571429 | 0.909091 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.246914 | 81 | 5 | 19 | 16.2 | 0.540984 | 0.185185 | 0 | 0.75 | 0 | 0 | 0.327869 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
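The four `print` calls above draw one fixed right triangle. A hedged generalization that takes the height as a parameter (the function name and return-a-list design are my own, not from the tutorial):

```python
def triangle(height):
    """Return the lines of a right triangle like the one printed above."""
    # Each sloped row: shrinking left padding, "/", growing inner gap, "|".
    lines = [" " * (height - 1 - r) + "/" + " " * r + "|"
             for r in range(height - 1)]
    # Base row: "/" followed by underscores up to the vertical side.
    lines.append("/" + "_" * (height - 1) + "|")
    return lines


for line in triangle(4):
    print(line)
```

With `height=4` this reproduces the four hard-coded rows; larger values scale the same shape.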
ab3dd6ed345dd55cb0a1ca5fdd800ac2c5e265ee | 880 | py | Python | src/algoritmia/problems/sorting/__init__.py | DavidLlorens/algoritmia | 40ca0a89ea6de9b633fa5f697f0a28cae70816a2 | [
"MIT"
] | 6 | 2018-09-15T15:09:10.000Z | 2022-02-27T01:23:11.000Z | src/algoritmia/problems/sorting/__init__.py | JeromeIllgner/algoritmia | 406afe7206f2411557859bf03480c16db7dcce0d | [
"MIT"
] | null | null | null | src/algoritmia/problems/sorting/__init__.py | JeromeIllgner/algoritmia | 406afe7206f2411557859bf03480c16db7dcce0d | [
"MIT"
] | 5 | 2018-07-10T20:19:55.000Z | 2021-03-31T03:32:22.000Z | from algoritmia.problems.sorting.interfaces import ISorter, IInPlaceSorter
from algoritmia.problems.sorting.bubblesort import InPlaceBubbleSorter
from algoritmia.problems.sorting.selectionsort import InPlaceSelectionSorter
from algoritmia.problems.sorting.insertionsort import InPlaceInsertionSorter
from algoritmia.problems.sorting.heapsort import HeapSorter
from algoritmia.problems.sorting.mergesort import InPlaceMergeSorter, InPlaceMergesortProblem, MergeSorter, MergesortProblem, ThresholdedInPlaceMergeSorter
from algoritmia.problems.sorting.quicksort import BasicInPlaceQuickSorter, BasicQuickSorter, BasicSemiIterativeInPlaceQuickSorter, InPlaceQuickSorter, RandomizedInPlaceQuickSorter, RandomizedSemiIterativeInPlaceQuickSorter, SemiIterativeInPlaceQuickSorter, SemiIterativeInPlaceQuickSorter1
from algoritmia.problems.sorting.countingsort import CountingSorter | 110 | 290 | 0.902273 | 68 | 880 | 11.676471 | 0.485294 | 0.141058 | 0.221662 | 0.292191 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001205 | 0.056818 | 880 | 8 | 291 | 110 | 0.955422 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
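The `__init__.py` above re-exports the package's sorter classes, including several in-place variants. As a rough sketch of what an in-place sorter behind such an interface can look like (the class and the `sort` method name are my assumptions for illustration; the real algoritmia API may differ):

```python
class InPlaceInsertionSorterSketch:
    """Illustrative in-place insertion sort behind a minimal sorter interface."""

    def sort(self, v):
        # Classic insertion sort: grow a sorted prefix, mutating v in place.
        for i in range(1, len(v)):
            key = v[i]
            j = i - 1
            # Shift elements larger than key one slot to the right.
            while j >= 0 and v[j] > key:
                v[j + 1] = v[j]
                j -= 1
            v[j + 1] = key
        return v


data = [5, 2, 9, 1]
InPlaceInsertionSorterSketch().sort(data)
# data has been sorted in place
```

Separating "in-place" sorters (mutate the input) from plain sorters (return a new sequence), as the import list above does, keeps the memory contract explicit at the type level.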
ab4a81b40d6edb37791bf456a91777a8ec920695 | 2,485 | py | Python | cloud_functions/main.py | gaurav-aiml/ecommerece-data-pipeline | 1c1aeebb2040c61d0fccd1949f9f80741a272712 | [
"MIT"
] | 2 | 2021-04-23T01:56:05.000Z | 2021-09-25T05:25:00.000Z | cloud_functions/main.py | gaurav-aiml/ecommerece-data-pipeline | 1c1aeebb2040c61d0fccd1949f9f80741a272712 | [
"MIT"
] | null | null | null | cloud_functions/main.py | gaurav-aiml/ecommerece-data-pipeline | 1c1aeebb2040c61d0fccd1949f9f80741a272712 | [
"MIT"
] | null | null | null | import os
from google.cloud import bigquery
def upload_visited_gcs_to_bq(data, context):
'''
Google cloud function to copy data a newly written object in the GCS bucket to a Big Query table
This function is triggered everytime airflow dag writes visit count data to Google Cloud Storage
'''
client = bigquery.Client()
dataset_id = "ecom_user_data"
dataset_ref = client.dataset(dataset_id)
job_config = bigquery.job.LoadJobConfig()
job_config.create_disposition = bigquery.CreateDisposition.CREATE_IF_NEEDED
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
job_config.autodetect = True
job_config.ignore_unknown_values = True
job_config.source_format = bigquery.SourceFormat.PARQUET
uri = 'gs://gmp-etl/real-time-user-logs/hive-processed-visits-output/*.parquet'
load_job = client.load_table_from_uri(uri,dataset_ref.table('processed_user_visits'),job_config=job_config)
print("Starting Job {}".format(load_job.job_id))
load_job.result()
print("Job Finished")
destination_table = client.get_table(dataset_ref.table('processed_user_visits'))
print('Loaded {} rows.' .format(destination_table.num_rows))
def upload_cart_gcs_to_bq(data, context):
'''
Google cloud function to copy data a newly written object in the GCS bucket to a Big Query table
This function is triggered everytime airflow dag writes shopping cart data to Google Cloud Storage
'''
client = bigquery.Client()
dataset_id = "ecom_user_data"
dataset_ref = client.dataset(dataset_id)
job_config = bigquery.job.LoadJobConfig()
job_config.create_disposition = bigquery.CreateDisposition.CREATE_IF_NEEDED
#Write Truncate, because every time all the parquet file are written Big Query table.
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
job_config.autodetect = True
job_config.ignore_unknown_values = True
job_config.source_format = bigquery.SourceFormat.PARQUET
uri = 'gs://gmp-etl/real-time-user-logs/hive-processed-cart-output/*.parquet'
load_job = client.load_table_from_uri(uri,dataset_ref.table('processed_user_cart'),job_config=job_config)
print("Starting Job {}".format(load_job.job_id))
load_job.result()
print("Job Finished")
destination_table = client.get_table(dataset_ref.table('processed_user_cart'))
print('Loaded {} rows.' .format(destination_table.num_rows))
| 38.230769 | 111 | 0.752918 | 339 | 2,485 | 5.271386 | 0.265487 | 0.080582 | 0.029099 | 0.053721 | 0.906547 | 0.906547 | 0.895355 | 0.895355 | 0.846111 | 0.846111 | 0 | 0 | 0.157746 | 2,485 | 64 | 112 | 38.828125 | 0.853798 | 0.190744 | 0 | 0.722222 | 0 | 0.055556 | 0.168357 | 0.092292 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.055556 | 0 | 0.111111 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
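The two cloud functions above are identical except for the source folder under the bucket and the destination table. A small sketch of centralizing those pairs so the load logic could be written once (the helper and mapping names are mine, not part of the deployed functions):

```python
def gcs_parquet_uri(bucket, folder):
    """Build the wildcard parquet URI the load jobs above point at."""
    return "gs://{}/real-time-user-logs/{}/*.parquet".format(bucket, folder)


# Destination table -> source URI; a single generic loader could iterate this
# instead of duplicating the LoadJobConfig setup per function.
LOADS = {
    "processed_user_visits": gcs_parquet_uri("gmp-etl", "hive-processed-visits-output"),
    "processed_user_cart": gcs_parquet_uri("gmp-etl", "hive-processed-cart-output"),
}
```

Keeping the folder/table pairs in one mapping also makes it harder for the two functions to drift apart when the bucket layout changes.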
db71ae094d8e370f9923062ee6440104c3a6937b | 289,568 | py | Python | tests/bugs/core_6336_test.py | reevespaul/firebird-qa | 98f16f425aa9ab8ee63b86172f959d63a2d76f21 | [
"MIT"
] | null | null | null | tests/bugs/core_6336_test.py | reevespaul/firebird-qa | 98f16f425aa9ab8ee63b86172f959d63a2d76f21 | [
"MIT"
] | null | null | null | tests/bugs/core_6336_test.py | reevespaul/firebird-qa | 98f16f425aa9ab8ee63b86172f959d63a2d76f21 | [
"MIT"
] | null | null | null | #coding:utf-8
#
# id: bugs.core_6336
# title: Regression in FB 4.x: error "Implementation of text subtype <NNNN> not located" on attempt to use some collations defined in fbintl.conf
# description:
#                Test uses list of character sets and collations defined in %FB_HOME%\\intl\\fbintl.conf.
# For each charset <W> we try following:
# 1) alter database set default character set <W>;
# 2) alter this <W> set default collation <W>;
# 3) create unicode collation <U> for this <W> and alter <W> so that default collation is <U>;
#                4) for each of the other (non-unicode) collations <C>, alter <W> to set its default collation to <C>.
# Each of these actions is verified by creating several DB objects: domains, table, view and stored procedure.
# Every created DB object will use fields/parameters which refer to current charset and collation, i.e.:
# * create two domains of type VARCHAR; one of them will be later modified so that its default collation will be dropped;
# * create one domain of type BLOB; it can not be modified anyway because of implementation limits;
# * create table with two fields (varchar and blob) of these domains;
#                * create view which refers to rdb$fields (this statement did FAIL and was the reason for creating this ticket);
# * create stored proc with parameters of these domains.
# Finally, we do query to RDB$ tables in order to show data related to these domains.
#
# Following is what occurs for iso8859_1 (and similarly for all other charsets):
# ========
# alter database set default character set iso8859_1
# alter character set iso8859_1 set default collation iso8859_1
# create collation co_non_unc for iso8859_1 from iso8859_1 PAD SPACE
# create domain dm_text varchar(50) character set iso8859_1 collate co_non_unc
# create domain dm_name varchar(50) character set iso8859_1 collate iso8859_1
# create domain dm_blob blob character set iso8859_1 collate iso8859_1
# ...
#  -- here we check that 'collate co_non_unc' will be cut off and default collation will be restored for this domain:
# alter domain dm_text type char(50) character set iso8859_1
# <SHOW DATA FROM RDB$DATABASE FOR ALL THREE DOMAINS: DM_TEXT, DM_NAME and DM_BLOB>
# <DROP JUST CREATED OBJECTS>
#
# create collation ISO8859_1_UNICODE for iso8859_1
# alter character set iso8859_1 set default collation ISO8859_1_UNICODE
# create collation co_unicode for iso8859_1 from iso8859_1_unicode case insensitive accent insensitive 'NUMERIC-SORT=1'
# create domain dm_text varchar(50) character set iso8859_1 collate co_unicode
# create domain dm_name varchar(50) character set iso8859_1 collate co_unicode
# create domain dm_blob blob character set iso8859_1 collate co_unicode
# ...
#  -- here we check that 'collate co_unicode' will be cut off and default collation will be restored for this domain:
# alter domain dm_text type char(50) character set iso8859_1
# <SHOW DATA FROM RDB$DATABASE FOR ALL THREE DOMAINS: DM_TEXT, DM_NAME and DM_BLOB>
# <DROP JUST CREATED OBJECTS>
#
#
# alter character set iso8859_1 set default collation da_da
# create collation co_non_unc for iso8859_1 from da_da PAD SPACE
# create domain dm_text varchar(50) character set iso8859_1 collate co_non_unc
# create domain dm_name varchar(50) character set iso8859_1 collate da_da
# create domain dm_blob blob character set iso8859_1 collate da_da
# ...
#  -- here we check that 'collate co_non_unc' will be cut off and default collation will be restored for this domain:
# alter domain dm_text type char(50) character set iso8859_1
# <SHOW DATA FROM RDB$DATABASE FOR ALL THREE DOMAINS: DM_TEXT, DM_NAME and DM_BLOB>
# <DROP JUST CREATED OBJECTS>
#
# ... and so on for all other collations defined for charset ISO8859_1 ...
# ========
#
# Checked on 4.0.0.2114 SS. Time of execution: 73.668s.
#
# tracker_id: CORE-6336
# min_versions: ['4.0']
# versions: 4.0
# qmid: None
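The commented script in this test repeats one block per collation (set default collation, run `sp_cleanup`, recreate the objects, select from `v_info`, reconnect). As a sketch, such blocks can also be generated rather than hand-copied; the helper below is my own illustration, not how firebird-qa builds its scripts:

```python
def collation_block(cset, coll):
    """Render one per-collation test block like those in the script."""
    return "\n".join([
        "alter character set {} set default collation {};".format(cset, coll),
        "execute procedure sp_cleanup;",
        "commit;",
        "execute procedure sp_add_objects('{}', '{}');".format(cset, coll),
        "commit;",
        "select * from v_info;",
        "commit;",
        "connect '%(dsn)s';",
    ])


# One block per collation of a charset, concatenated into a script fragment:
script = "\n".join(
    collation_block("DOS437", coll)
    for coll in ("DOS437", "DB_DEU437", "PDOX_ASCII")
)
```

Generating the blocks keeps the cleanup/create/select/reconnect sequence identical across all charsets, which is exactly the invariant the hand-written script relies on.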
import pytest
from firebird.qa import db_factory, isql_act, Action
# version: 4.0
# resources: None
substitutions_1 = []
init_script_1 = """"""
db_1 = db_factory(sql_dialect=3, init=init_script_1)
# test_script_1
#---
#
# import os
# import sys
# import subprocess
# from fdb import services
#
# os.environ["ISC_USER"] = user_name
# os.environ["ISC_PASSWORD"] = user_password
# db_conn.close()
#
# #--------------------------------------------
#
# def flush_and_close( file_handle ):
# # https://docs.python.org/2/library/os.html#os.fsync
# # If you're starting with a Python file object f,
# # first do f.flush(), and
# # then do os.fsync(f.fileno()), to ensure that all internal buffers associated with f are written to disk.
# global os
#
# file_handle.flush()
# if file_handle.mode not in ('r', 'rb') and file_handle.name != os.devnull:
# # otherwise: "OSError: [Errno 9] Bad file descriptor"!
# os.fsync(file_handle.fileno())
# file_handle.close()
#
# #--------------------------------------------
#
# def cleanup( f_names_list ):
# global os
# for f in f_names_list:
# if type(f) == file:
# del_name = f.name
# elif type(f) == str:
# del_name = f
# else:
# print('Unrecognized type of element:', f, ' - can not be treated as file.')
# del_name = None
#
# if del_name and os.path.isfile( del_name ):
# os.remove( del_name )
#
# #--------------------------------------------
#
# runProgram('gfix',[dsn,'-w','async'])
#
# sql_text='''
# set list on;
# --set bail on;
# set blob all;
# set width f_name 20;
# set width cset_name 20;
# set width coll_name 20;
# set width cset_default_coll 20;
# set width domain_coll_name 20;
#
# --set echo on;
# --shell del c: emp mp4test.fdb 2>nul;
# connect '%(dsn)s';
# set autoddl off;
# SET KEEP_TRAN_PARAMS ON;
#
# --create database '%(dsn)s';
#
# commit;
# set transaction READ COMMITTED NO RECORD_VERSION NO WAIT;
#
# set term ^;
# create procedure sp_cleanup as
# begin
# begin
# execute statement 'drop procedure sp_info';
# when any do begin end
# end
#
# begin
# execute statement 'drop table t_info';
# when any do begin end
# end
#
# begin
# execute statement 'drop view v_info';
# when any do begin end
# end
#
# begin
# execute statement 'drop domain dm_name';
# when any do begin end
# end
#
# begin
# execute statement 'drop domain dm_text';
# when any do begin end
# end
#
# begin
# execute statement 'drop domain dm_blob';
# when any do begin end
# end
#
# begin
# execute statement 'drop collation co_unicode';
# when any do begin end
# end
#
# begin
# execute statement 'drop collation co_non_unc';
# when any do begin end
# end
# end
# ^
#
# create procedure sp_add_objects ( a_cset varchar(50), a_coll varchar(50) ) as
# begin
#
# /*
# create collation win1252_unicode_ci for win1252 from win1252_unicode case insensitive;
# */
# -- NB: COLLATE clause can be used only in CREATE domain statement. ALTER domain does not allow this.
# if ( right(upper(a_coll),8) = upper('_UNICODE') ) then
# begin
# execute statement 'create collation co_unicode for ' || a_cset || ' from ' || a_coll || q'{ case insensitive accent insensitive 'NUMERIC-SORT=1'}';
# execute statement 'create domain dm_text varchar(50) character set ' || a_cset || ' collate co_unicode';
# execute statement 'create domain dm_name varchar(50) character set ' || a_cset || ' collate co_unicode';
# execute statement 'create domain dm_blob blob character set ' || a_cset || ' collate co_unicode';
# end
# else
# begin
# -- CREATE COLLATION PT_PT2 FOR ISO8859_1 FROM PT_PT 'SPECIALS-FIRST=1';
# -- create collation co_non_unc for SJIS_0208 from SJIS_0208 'SPECIALS-FIRST=1'; ==> invalid collation attr; the same for DISABLE-COMPRESSIONS=1
# execute statement 'create collation co_non_unc for ' || a_cset || ' from ' || a_coll || ' PAD SPACE';
# execute statement 'create domain dm_text varchar(50) character set ' || a_cset || ' collate co_non_unc';
# execute statement 'create domain dm_name varchar(50) character set ' || a_cset || ' collate ' || a_coll;
# execute statement 'create domain dm_blob blob character set ' || a_cset || ' collate ' || a_coll ;
# end
#
# execute statement q'{recreate view v_name as select f.rdb$field_name as f_name from rdb$fields f where f.rdb$field_name = upper('dm_name')}';
#
# execute statement q'{recreate view v_blob as select f.rdb$field_name as f_name from rdb$fields f where f.rdb$field_name = upper('dm_blob')}';
#
# execute statement 'recreate table t_info(f_name dm_name, f_blob dm_blob)';
#
# execute statement q'{create procedure sp_info(a_name dm_name, a_blob dm_blob) returns(o_name dm_name, o_blob dm_blob) as begin suspend; end }';
#
# execute statement
# q'{recreate view v_info as
# select
# cast(f.rdb$field_name as varchar(20)) as f_name
# ,f.rdb$character_set_id as cset_id
# ,f.rdb$collation_id as coll_id
# ,cast(c.rdb$character_set_name as varchar(20)) as cset_name
# ,cast(c.rdb$default_collate_name as varchar(20)) as cset_default_coll
# ,cast(k.rdb$collation_name as varchar(20)) as domain_coll_name
# ,k.rdb$collation_attributes as coll_attr
# ,cast(k.rdb$specific_attributes as varchar(8190)) as coll_spec
# from rdb$fields f
# left join rdb$character_sets c on f.rdb$character_set_id = c.rdb$character_set_id
# left join rdb$collations k on c.rdb$character_set_id = k.rdb$character_set_id and f.rdb$collation_id = k.rdb$collation_id
# where f.rdb$field_name in ( upper('dm_text'), upper('dm_name'), upper('dm_blob') )
# order by f_name
# }'
# ;
#
#
# -- Here we try to REMOVE collation attribute from domain:
# execute statement 'alter domain dm_text type char(50) character set ' || a_cset ;
#
# -- dm_blob: "Cannot change datatype ... Changing datatype is not supported for BLOB or ARRAY columns."
# -- NB: this is so even when a new type is the same as old: BLOB.
# -- execute statement 'alter domain dm_blob type blob character set ' || a_cset ;
#
# end
# ^
# set term ;^
# commit;
#
# --################################ S J I S _ 0 2 0 8 #############################
#
# alter database set default character set SJIS_0208 ;
#
#
# alter character set SJIS_0208 set default collation SJIS_0208;
# commit;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('SJIS_0208', 'SJIS_0208');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation sjis_0208_unicode for sjis_0208;
# alter character set SJIS_0208 set default collation SJIS_0208_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('SJIS_0208', 'SJIS_0208_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ E U C J _ 0 2 0 8 #############################
#
# alter database set default character set EUCJ_0208;
# alter character set EUCJ_0208 set default collation EUCJ_0208;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('EUCJ_0208', 'EUCJ_0208');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# create collation EUCJ_0208_UNICODE for EUCJ_0208;
# alter character set EUCJ_0208 set default collation EUCJ_0208_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('EUCJ_0208', 'EUCJ_0208_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 4 3 7 #############################
#
# alter database set default character set DOS437;
#
# alter character set DOS437 set default collation DOS437;
# create collation DOS437_UNICODE for DOS437;
# alter character set DOS437 set default collation DOS437_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DOS437_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_DEU437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_DEU437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_ESP437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_ESP437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_FIN437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_FIN437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_FRA437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_FRA437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_ITA437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_ITA437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_NLD437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_NLD437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_SVE437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_SVE437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_UK437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_UK437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation DB_US437;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'DB_US437');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation PDOX_ASCII;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'PDOX_ASCII');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation PDOX_INTL;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'PDOX_INTL');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set DOS437 set default collation PDOX_SWEDFIN;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS437', 'PDOX_SWEDFIN');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 8 5 0 #############################
#
# alter database set default character set dos850;
#
# alter character set dos850 set default collation dos850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DOS850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS850_UNICODE for DOS850;
# alter character set dos850 set default collation DOS850_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DOS850_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_DEU850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_DEU850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_FRA850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_FRA850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_FRC850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_FRC850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_ITA850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_ITA850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_NLD850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_NLD850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_PTB850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_PTB850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_SVE850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_SVE850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_UK850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_UK850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos850 set default collation DB_US850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS850', 'DB_US850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ D O S 8 6 5 #############################
#
# alter database set default character set dos865;
#
# alter character set dos865 set default collation dos865;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS865', 'DOS865');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS865_UNICODE for DOS865;
# alter character set dos865 set default collation DOS865_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS865', 'DOS865_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos865 set default collation DB_DAN865;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS865', 'DB_DAN865');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos865 set default collation DB_NOR865;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS865', 'DB_NOR865');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos865 set default collation PDOX_NORDAN4;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('DOS865', 'PDOX_NORDAN4');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 1 ###########################
#
# alter database set default character set iso8859_1;
#
# alter character set iso8859_1 set default collation iso8859_1;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'iso8859_1');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_1_UNICODE for iso8859_1;
# alter character set iso8859_1 set default collation ISO8859_1_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'iso8859_1_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation da_da;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'da_da');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation de_de;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'de_de');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation du_nl;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'du_nl');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation en_uk;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'en_uk');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation en_us;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'en_us');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation es_es;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'es_es');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation es_es_ci_ai;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'es_es_ci_ai');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation fi_fi;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'fi_fi');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation fr_ca;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'fr_ca');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation fr_fr;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'fr_fr');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation is_is;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'is_is');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation it_it;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'it_it');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation no_no;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'no_no');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation sv_sv;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'sv_sv');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation pt_br;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'pt_br');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_1 set default collation pt_pt;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('iso8859_1', 'pt_pt');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 2 ###########################
#
# alter database set default character set ISO8859_2;
#
# alter character set iso8859_2 set default collation ISO8859_2;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_2', 'ISO8859_2');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_2_UNICODE for iso8859_2;
# alter character set iso8859_2 set default collation ISO8859_2_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_2', 'ISO8859_2_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_2 set default collation CS_CZ;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_2', 'CS_CZ');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_2 set default collation ISO_HUN;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_2', 'ISO_HUN');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_2 set default collation ISO_PLK;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_2', 'ISO_PLK');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 3 ###########################
#
# alter database set default character set ISO8859_3;
#
# alter character set iso8859_3 set default collation ISO8859_3;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_3', 'ISO8859_3');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_3_UNICODE for iso8859_3;
# alter character set iso8859_3 set default collation ISO8859_3_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_3', 'ISO8859_3_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 4 ###########################
#
# alter database set default character set ISO8859_4;
#
# alter character set iso8859_4 set default collation ISO8859_4;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_4', 'ISO8859_4');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_4_UNICODE for iso8859_4;
# alter character set iso8859_4 set default collation ISO8859_4_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_4', 'ISO8859_4_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 5 ###########################
#
# alter database set default character set ISO8859_5;
#
# alter character set iso8859_5 set default collation ISO8859_5;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_5', 'ISO8859_5');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_5_UNICODE for iso8859_5;
# alter character set iso8859_5 set default collation ISO8859_5_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_5', 'ISO8859_5_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 6 ###########################
#
# alter database set default character set ISO8859_6;
#
# alter character set iso8859_6 set default collation ISO8859_6;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_6', 'ISO8859_6');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_6_UNICODE for iso8859_6;
# alter character set iso8859_6 set default collation ISO8859_6_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_6', 'ISO8859_6_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 7 ###########################
#
# alter database set default character set ISO8859_7;
#
# alter character set iso8859_7 set default collation ISO8859_7;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_7', 'ISO8859_7');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_7_UNICODE for iso8859_7;
# alter character set iso8859_7 set default collation ISO8859_7_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_7', 'ISO8859_7_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## I S O 8 8 5 9 _ 8 ###########################
#
# alter database set default character set ISO8859_8;
#
# alter character set iso8859_8 set default collation ISO8859_8;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_8', 'ISO8859_8');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_8_UNICODE for iso8859_8;
# alter character set iso8859_8 set default collation ISO8859_8_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_8', 'ISO8859_8_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --############################## I S O 8 8 5 9 _ 9 ###########################
#
# alter database set default character set ISO8859_9;
#
# alter character set iso8859_9 set default collation ISO8859_9;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_9', 'ISO8859_9');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_9_UNICODE for iso8859_9;
# alter character set iso8859_9 set default collation ISO8859_9_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_9', 'ISO8859_9_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --############################## I S O 8 8 5 9 _ 1 3 ###########################
#
# alter database set default character set ISO8859_13;
#
# alter character set iso8859_13 set default collation ISO8859_13;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_13', 'ISO8859_13');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ISO8859_13_UNICODE for iso8859_13;
# alter character set iso8859_13 set default collation ISO8859_13_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_13', 'ISO8859_13_UNICODE');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set iso8859_13 set default collation LT_LT;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ISO8859_13', 'LT_LT');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ D O S 8 5 2 #############################
#
# alter database set default character set dos852;
#
# alter character set dos852 set default collation dos852;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'dos852');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS852_UNICODE for DOS852;
# alter character set dos852 set default collation DOS852_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'dos852_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos852 set default collation DB_CSY;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'DB_CSY');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos852 set default collation DB_PLK;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'DB_PLK');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos852 set default collation DB_SLO;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'DB_SLO');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos852 set default collation PDOX_CSY;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'PDOX_CSY');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos852 set default collation PDOX_HUN;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'PDOX_HUN');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos852 set default collation PDOX_PLK;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'PDOX_PLK');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos852 set default collation PDOX_SLO;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos852', 'PDOX_SLO');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 8 5 7 #############################
#
# alter database set default character set dos857;
#
# alter character set dos857 set default collation dos857;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos857', 'dos857');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS857_UNICODE for dos857;
# alter character set dos857 set default collation DOS857_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos857', 'dos857_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos857 set default collation DB_TRK;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos857', 'db_trk');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ D O S 8 6 0 #############################
#
# alter database set default character set dos860;
#
# alter character set dos860 set default collation dos860;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos860', 'dos860');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS860_UNICODE for dos860;
# alter character set dos860 set default collation DOS860_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos860', 'dos860_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos860 set default collation DB_PTG860;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos860', 'DB_PTG860');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ D O S 8 6 1 #############################
#
# alter database set default character set dos861;
#
# alter character set dos861 set default collation dos861;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos861', 'dos861');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS861_UNICODE for dos861;
# alter character set dos861 set default collation DOS861_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos861', 'dos861_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos861 set default collation PDOX_ISL;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos861', 'pdox_isl');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ D O S 8 6 3 #############################
#
# alter database set default character set dos863;
#
# alter character set dos863 set default collation dos863;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos863', 'dos863');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS863_UNICODE for dos863;
# alter character set dos863 set default collation DOS863_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos863', 'dos863_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set dos863 set default collation DB_FRC863;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos863', 'db_frc863');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ C Y R L #############################
#
# alter database set default character set cyrl;
#
# alter character set cyrl set default collation cyrl;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('cyrl', 'cyrl');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation cyrl_UNICODE for cyrl;
# alter character set cyrl set default collation cyrl_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('cyrl', 'cyrl_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set cyrl set default collation DB_RUS;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('cyrl', 'db_rus');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set cyrl set default collation PDOX_CYRL;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('cyrl', 'pdox_cyrl');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ D O S 7 3 7 #############################
#
# alter database set default character set dos737;
#
# alter character set dos737 set default collation dos737;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos737', 'dos737');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS737_UNICODE for DOS737;
# alter character set dos737 set default collation DOS737_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos737', 'dos737_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 7 7 5 #############################
#
# alter database set default character set dos775;
#
# alter character set dos775 set default collation dos775;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos775', 'dos775');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS775_UNICODE for DOS775;
# alter character set dos775 set default collation DOS775_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos775', 'dos775_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ D O S 8 5 8 #############################
#
# alter database set default character set dos858;
#
# alter character set dos858 set default collation dos858;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos858', 'dos858');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS858_UNICODE for DOS858;
# alter character set dos858 set default collation DOS858_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos858', 'dos858_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 8 6 2 #############################
#
# alter database set default character set dos862;
#
# alter character set dos862 set default collation dos862;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos862', 'dos862');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS862_UNICODE for DOS862;
# alter character set dos862 set default collation DOS862_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos862', 'dos862_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 8 6 4 #############################
#
# alter database set default character set dos864;
#
# alter character set dos864 set default collation dos864;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos864', 'dos864');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS864_UNICODE for DOS864;
# alter character set dos864 set default collation DOS864_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos864', 'dos864_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 8 6 6 #############################
#
# alter database set default character set dos866;
#
# alter character set dos866 set default collation dos866;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos866', 'dos866');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS866_UNICODE for DOS866;
# alter character set dos866 set default collation DOS866_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos866', 'dos866_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################ D O S 8 6 9 #############################
#
# alter database set default character set dos869;
#
# alter character set dos869 set default collation dos869;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos869', 'dos869');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation DOS869_UNICODE for DOS869;
# alter character set dos869 set default collation DOS869_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('dos869', 'dos869_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --############################### W I N 1 2 5 0 #############################
#
# alter database set default character set win1250;
#
# alter character set win1250 set default collation win1250;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'win1250');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1250_UNICODE for win1250;
# alter character set win1250 set default collation win1250_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'win1250_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation PXW_CSY;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'pxw_csy');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation PXW_HUN;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'pxw_hun');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation PXW_HUNDC;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'pxw_hundc');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation PXW_PLK;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'pxw_plk');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation PXW_SLOV;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'pxw_slov');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation BS_BA;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'bs_ba');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation WIN_CZ;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'win_cz');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1250 set default collation WIN_CZ_CI_AI;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1250', 'WIN_CZ_CI_AI');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 1 #############################
#
# alter database set default character set win1251;
#
# alter character set win1251 set default collation win1251;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1251', 'win1251');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1251_UNICODE for win1251;
# alter character set win1251 set default collation win1251_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1251', 'win1251_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1251 set default collation PXW_CYRL;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1251', 'pxw_cyrl');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1251 set default collation WIN1251_UA;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1251', 'win1251_ua');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 2 #############################
#
# alter database set default character set win1252;
#
# alter character set win1252 set default collation win1252;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'win1252');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1252_UNICODE for win1252;
# alter character set win1252 set default collation win1252_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'win1252_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1252 set default collation PXW_INTL;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'pxw_intl');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1252 set default collation PXW_INTL850;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'pxw_intl850');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1252 set default collation PXW_NORDAN4;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'pxw_nordan4');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1252 set default collation WIN_PTBR;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'win_ptbr');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1252 set default collation PXW_SPAN;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'pxw_span');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1252 set default collation PXW_SWEDFIN;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1252', 'pxw_swedfin');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 3 #############################
#
# alter database set default character set win1253;
#
# alter character set win1253 set default collation win1253;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1253', 'win1253');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1253_UNICODE for win1253;
# alter character set win1253 set default collation win1253_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1253', 'win1253_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1253 set default collation PXW_GREEK;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1253', 'pxw_greek');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 4 #############################
#
# alter database set default character set win1254;
#
# alter character set win1254 set default collation win1254;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1254', 'win1254');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1254_UNICODE for win1254;
# alter character set win1254 set default collation win1254_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1254', 'win1254_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1254 set default collation PXW_TURK;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1254', 'pxw_turk');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# --################################## N E X T ###############################
#
# alter database set default character set next;
#
# alter character set next set default collation next;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('next', 'next');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation NEXT_UNICODE for next;
# alter character set next set default collation NEXT_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('next', 'next_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set next set default collation NXT_DEU;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('next', 'nxt_deu');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set next set default collation NXT_ESP;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('next', 'nxt_esp');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set next set default collation NXT_FRA;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('next', 'nxt_fra');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set next set default collation NXT_ITA;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('next', 'nxt_ita');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set next set default collation NXT_US;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('next', 'nxt_us');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 5 #############################
#
# alter database set default character set win1255;
#
# alter character set win1255 set default collation win1255;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1255', 'win1255');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1255_UNICODE for win1255;
# alter character set win1255 set default collation win1255_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1255', 'win1255_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 6 #############################
#
# alter database set default character set win1256;
#
# alter character set win1256 set default collation win1256;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1256', 'win1256');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1256_UNICODE for win1256;
# alter character set win1256 set default collation win1256_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1256', 'win1256_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 7 #############################
#
# alter database set default character set win1257;
#
# alter character set win1257 set default collation win1257;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1257', 'win1257');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1257_UNICODE for win1257;
# alter character set win1257 set default collation win1257_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1257', 'win1257_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1257 set default collation WIN1257_EE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1257', 'win1257_ee');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1257 set default collation WIN1257_LT;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1257', 'win1257_lt');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set win1257 set default collation WIN1257_LV;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1257', 'win1257_lv');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################## K S C _ 5 6 0 1 #############################
#
# alter database set default character set ksc_5601;
#
# alter character set ksc_5601 set default collation ksc_5601;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ksc_5601', 'ksc_5601');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation ksc_5601_UNICODE for ksc_5601;
# alter character set ksc_5601 set default collation ksc_5601_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ksc_5601', 'ksc_5601_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set ksc_5601 set default collation KSC_DICTIONARY;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('ksc_5601', 'ksc_dictionary');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################# B I G _ 5 ###############################
#
# alter database set default character set big_5;
#
# alter character set big_5 set default collation big_5;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('big_5', 'big_5');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation big_5_UNICODE for big_5;
# alter character set big_5 set default collation big_5_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('big_5', 'big_5_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################# G B _ 2 3 1 2 ###############################
#
# alter database set default character set gb_2312;
#
# alter character set gb_2312 set default collation gb_2312;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('gb_2312', 'gb_2312');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation gb_2312_UNICODE for gb_2312;
# alter character set gb_2312 set default collation gb_2312_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('gb_2312', 'gb_2312_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################# K O I 8 R #################################
#
# alter database set default character set koi8r;
#
# alter character set koi8r set default collation koi8r;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('koi8r', 'koi8r');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation koi8r_UNICODE for koi8r;
# alter character set koi8r set default collation koi8r_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('koi8r', 'koi8r_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set koi8r set default collation koi8r_ru;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('koi8r', 'koi8r_ru');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################# K O I 8 U #################################
#
# alter database set default character set koi8u;
#
# alter character set koi8u set default collation koi8u;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('koi8u', 'koi8u');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation koi8u_UNICODE for koi8u;
# alter character set koi8u set default collation koi8u_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('koi8u', 'koi8u_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# alter character set koi8u set default collation koi8u_ua;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('koi8u', 'koi8u_ua');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --############################### W I N 1 2 5 8 #############################
#
# alter database set default character set win1258;
#
# alter character set win1258 set default collation win1258;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1258', 'win1258');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# create collation win1258_UNICODE for win1258;
# alter character set win1258 set default collation win1258_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('win1258', 'win1258_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ T I S 6 2 0 ##############################
#
# alter database set default character set tis620;
#
# alter character set tis620 set default collation tis620;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('tis620', 'tis620');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# -- pre-registered as system collation, SKIP creation: create collation tis620_UNICODE for tis620;
# alter character set tis620 set default collation tis620_UNICODE;
# commit;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('tis620', 'tis620_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################## G B K ################################
#
# alter database set default character set gbk;
#
# alter character set gbk set default collation gbk;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('gbk', 'gbk');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# -- pre-registered as system collation, SKIP creation: create collation gbk_UNICODE for gbk;
# alter character set gbk set default collation gbk_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('gbk', 'gbk_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ C P 9 4 3 C ##############################
#
# alter database set default character set cp943c;
#
# alter character set cp943c set default collation cp943c;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('cp943c', 'cp943c');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# -- pre-registered as system collation, SKIP creation: create collation cp943c_UNICODE for cp943c;
# alter character set cp943c set default collation cp943c_UNICODE;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('cp943c', 'cp943c_unicode');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
#
# --################################ G B 1 8 0 3 0 ##############################
#
# alter database set default character set gb18030;
#
# alter character set gb18030 set default collation gb18030;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('gb18030', 'gb18030');
# commit;
# select * from v_info;
# commit;
# connect '%(dsn)s';
#
# -- pre-registered as system collation, SKIP creation: create collation gb18030_UNICODE for gb18030;
# alter character set gb18030 set default collation gb18030_UNICODE;
# commit;
# -- remove existing objects:
# execute procedure sp_cleanup;
# commit;
# execute procedure sp_add_objects('gb18030', 'gb18030_unicode');
# commit;
# select * from v_info;
# commit;
# ''' % dict(globals(), **locals())
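The per-charset sections in the script above all follow one template: set the database default charset, then for each collation set it as the charset default, run `sp_cleanup`, call `sp_add_objects`, query `v_info`, and reconnect. A hypothetical helper sketching how those blocks could be generated programmatically (it omits the occasional `create collation ... for ...` step that some sections need, and the `sp_cleanup`/`sp_add_objects`/`v_info` names are taken from the script itself):

```python
def make_section(cset, collations):
    """Build one per-charset block of the test script (illustrative sketch only)."""
    lines = ['alter database set default character set %s;' % cset, '']
    for coll in collations:
        lines += [
            'alter character set %s set default collation %s;' % (cset, coll),
            '-- remove existing objects:',
            'execute procedure sp_cleanup;',
            'commit;',
            "execute procedure sp_add_objects('%s', '%s');" % (cset, coll.lower()),
            'commit;',
            'select * from v_info;',
            'commit;',
            "connect '%(dsn)s';",
            '',
        ]
    return '\n'.join(lines)
```

For example, `make_section('win1251', ['win1251', 'PXW_CYRL'])` reproduces the shape of the WIN1251 section, with the collation name lower-cased in the `sp_add_objects` call as the script does.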
#
# fb_home = services.connect(host='localhost', user=user_name, password=user_password).get_home_directory()
#
# f_sql_cmd=open( os.path.join(context['temp_directory'],'tmp_6336.sql'), 'w')
# f_sql_cmd.write(sql_text)
# flush_and_close( f_sql_cmd )
#
# f_sql_log=open( os.path.join(context['temp_directory'],'tmp_6336.log'), 'w')
# f_sql_err=open( os.path.join(context['temp_directory'],'tmp_6336.err'), 'w')
# subprocess.call( [ context['isql_path'] , "-q", "-i", f_sql_cmd.name ], stdout=f_sql_log, stderr=f_sql_err )
# flush_and_close( f_sql_log )
# flush_and_close( f_sql_err )
#
# with open(f_sql_err.name, 'r') as f:
#     for line in f:
#         print('UNEXPECTED STDERR: ' + line)
#
# with open(f_sql_log.name, 'r') as f:
#     for line in f:
#         print(line)
#
#
# cleanup( ( f_sql_cmd, f_sql_log, f_sql_err ) )
#
#
#---
#act_1 = python_act('db_1', test_script_1, substitutions=substitutions_1)
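The commented-out harness above writes the SQL to a temp file, runs it through isql via `subprocess.call`, and treats any stderr output as unexpected. A minimal self-contained sketch of that pattern, with `sys.executable -c` standing in for the isql binary (the real test uses `context['isql_path']`):

```python
import subprocess
import sys

def run_script(cmd):
    """Run a command, returning (stdout_lines, stderr_lines) for inspection."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return proc.stdout.splitlines(), proc.stderr.splitlines()

# Anything on stderr is flagged, mirroring the 'UNEXPECTED STDERR' check above.
out, err = run_script([sys.executable, '-c', "print('ok')"])
for line in err:
    print('UNEXPECTED STDERR: ' + line)
for line in out:
    print(line)
```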
expected_stdout_1 = """
F_NAME DM_BLOB
CSET_ID 5
COLL_ID 0
CSET_NAME SJIS_0208
CSET_DEFAULT_COLL SJIS_0208
DOMAIN_COLL_NAME SJIS_0208
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 5
COLL_ID 0
CSET_NAME SJIS_0208
CSET_DEFAULT_COLL SJIS_0208
DOMAIN_COLL_NAME SJIS_0208
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 5
COLL_ID 0
CSET_NAME SJIS_0208
CSET_DEFAULT_COLL SJIS_0208
DOMAIN_COLL_NAME SJIS_0208
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 5
COLL_ID 126
CSET_NAME SJIS_0208
CSET_DEFAULT_COLL SJIS_0208_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 5
COLL_ID 126
CSET_NAME SJIS_0208
CSET_DEFAULT_COLL SJIS_0208_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 5
COLL_ID 125
CSET_NAME SJIS_0208
CSET_DEFAULT_COLL SJIS_0208_UNICODE
DOMAIN_COLL_NAME SJIS_0208_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 6
COLL_ID 0
CSET_NAME EUCJ_0208
CSET_DEFAULT_COLL EUCJ_0208
DOMAIN_COLL_NAME EUCJ_0208
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 6
COLL_ID 0
CSET_NAME EUCJ_0208
CSET_DEFAULT_COLL EUCJ_0208
DOMAIN_COLL_NAME EUCJ_0208
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 6
COLL_ID 0
CSET_NAME EUCJ_0208
CSET_DEFAULT_COLL EUCJ_0208
DOMAIN_COLL_NAME EUCJ_0208
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 6
COLL_ID 126
CSET_NAME EUCJ_0208
CSET_DEFAULT_COLL EUCJ_0208_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 6
COLL_ID 126
CSET_NAME EUCJ_0208
CSET_DEFAULT_COLL EUCJ_0208_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 6
COLL_ID 125
CSET_NAME EUCJ_0208
CSET_DEFAULT_COLL EUCJ_0208_UNICODE
DOMAIN_COLL_NAME EUCJ_0208_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 125
CSET_NAME DOS437
CSET_DEFAULT_COLL DOS437_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 10
COLL_ID 125
CSET_NAME DOS437
CSET_DEFAULT_COLL DOS437_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 126
CSET_NAME DOS437
CSET_DEFAULT_COLL DOS437_UNICODE
DOMAIN_COLL_NAME DOS437_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 4
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_DEU437
DOMAIN_COLL_NAME DB_DEU437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 4
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_DEU437
DOMAIN_COLL_NAME DB_DEU437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 4
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_DEU437
DOMAIN_COLL_NAME DB_DEU437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 5
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_ESP437
DOMAIN_COLL_NAME DB_ESP437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 5
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_ESP437
DOMAIN_COLL_NAME DB_ESP437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 5
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_ESP437
DOMAIN_COLL_NAME DB_ESP437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 6
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_FIN437
DOMAIN_COLL_NAME DB_FIN437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 6
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_FIN437
DOMAIN_COLL_NAME DB_FIN437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 6
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_FIN437
DOMAIN_COLL_NAME DB_FIN437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 7
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_FRA437
DOMAIN_COLL_NAME DB_FRA437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 7
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_FRA437
DOMAIN_COLL_NAME DB_FRA437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 7
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_FRA437
DOMAIN_COLL_NAME DB_FRA437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 8
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_ITA437
DOMAIN_COLL_NAME DB_ITA437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 8
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_ITA437
DOMAIN_COLL_NAME DB_ITA437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 8
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_ITA437
DOMAIN_COLL_NAME DB_ITA437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 9
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_NLD437
DOMAIN_COLL_NAME DB_NLD437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 9
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_NLD437
DOMAIN_COLL_NAME DB_NLD437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 9
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_NLD437
DOMAIN_COLL_NAME DB_NLD437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 10
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_SVE437
DOMAIN_COLL_NAME DB_SVE437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 10
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_SVE437
DOMAIN_COLL_NAME DB_SVE437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 10
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_SVE437
DOMAIN_COLL_NAME DB_SVE437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 11
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_UK437
DOMAIN_COLL_NAME DB_UK437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 11
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_UK437
DOMAIN_COLL_NAME DB_UK437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 11
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_UK437
DOMAIN_COLL_NAME DB_UK437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 12
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_US437
DOMAIN_COLL_NAME DB_US437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 12
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_US437
DOMAIN_COLL_NAME DB_US437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 12
CSET_NAME DOS437
CSET_DEFAULT_COLL DB_US437
DOMAIN_COLL_NAME DB_US437
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 1
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_ASCII
DOMAIN_COLL_NAME PDOX_ASCII
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 1
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_ASCII
DOMAIN_COLL_NAME PDOX_ASCII
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 1
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_ASCII
DOMAIN_COLL_NAME PDOX_ASCII
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 2
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_INTL
DOMAIN_COLL_NAME PDOX_INTL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 2
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_INTL
DOMAIN_COLL_NAME PDOX_INTL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 2
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_INTL
DOMAIN_COLL_NAME PDOX_INTL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 10
COLL_ID 3
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_SWEDFIN
DOMAIN_COLL_NAME PDOX_SWEDFIN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 10
COLL_ID 3
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_SWEDFIN
DOMAIN_COLL_NAME PDOX_SWEDFIN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 10
COLL_ID 3
CSET_NAME DOS437
CSET_DEFAULT_COLL PDOX_SWEDFIN
DOMAIN_COLL_NAME PDOX_SWEDFIN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 0
CSET_NAME DOS850
CSET_DEFAULT_COLL DOS850
DOMAIN_COLL_NAME DOS850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 0
CSET_NAME DOS850
CSET_DEFAULT_COLL DOS850
DOMAIN_COLL_NAME DOS850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 0
CSET_NAME DOS850
CSET_DEFAULT_COLL DOS850
DOMAIN_COLL_NAME DOS850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 126
CSET_NAME DOS850
CSET_DEFAULT_COLL DOS850_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 11
COLL_ID 126
CSET_NAME DOS850
CSET_DEFAULT_COLL DOS850_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 125
CSET_NAME DOS850
CSET_DEFAULT_COLL DOS850_UNICODE
DOMAIN_COLL_NAME DOS850_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 2
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_DEU850
DOMAIN_COLL_NAME DB_DEU850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 2
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_DEU850
DOMAIN_COLL_NAME DB_DEU850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 2
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_DEU850
DOMAIN_COLL_NAME DB_DEU850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 4
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_FRA850
DOMAIN_COLL_NAME DB_FRA850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 4
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_FRA850
DOMAIN_COLL_NAME DB_FRA850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 4
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_FRA850
DOMAIN_COLL_NAME DB_FRA850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 1
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_FRC850
DOMAIN_COLL_NAME DB_FRC850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 1
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_FRC850
DOMAIN_COLL_NAME DB_FRC850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 1
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_FRC850
DOMAIN_COLL_NAME DB_FRC850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 5
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_ITA850
DOMAIN_COLL_NAME DB_ITA850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 5
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_ITA850
DOMAIN_COLL_NAME DB_ITA850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 5
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_ITA850
DOMAIN_COLL_NAME DB_ITA850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 6
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_NLD850
DOMAIN_COLL_NAME DB_NLD850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 6
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_NLD850
DOMAIN_COLL_NAME DB_NLD850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 6
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_NLD850
DOMAIN_COLL_NAME DB_NLD850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 7
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_PTB850
DOMAIN_COLL_NAME DB_PTB850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 7
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_PTB850
DOMAIN_COLL_NAME DB_PTB850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 7
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_PTB850
DOMAIN_COLL_NAME DB_PTB850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 8
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_SVE850
DOMAIN_COLL_NAME DB_SVE850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 8
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_SVE850
DOMAIN_COLL_NAME DB_SVE850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 8
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_SVE850
DOMAIN_COLL_NAME DB_SVE850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 9
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_UK850
DOMAIN_COLL_NAME DB_UK850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 9
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_UK850
DOMAIN_COLL_NAME DB_UK850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 9
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_UK850
DOMAIN_COLL_NAME DB_UK850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 11
COLL_ID 10
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_US850
DOMAIN_COLL_NAME DB_US850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 11
COLL_ID 10
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_US850
DOMAIN_COLL_NAME DB_US850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 11
COLL_ID 10
CSET_NAME DOS850
CSET_DEFAULT_COLL DB_US850
DOMAIN_COLL_NAME DB_US850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 12
COLL_ID 0
CSET_NAME DOS865
CSET_DEFAULT_COLL DOS865
DOMAIN_COLL_NAME DOS865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 12
COLL_ID 0
CSET_NAME DOS865
CSET_DEFAULT_COLL DOS865
DOMAIN_COLL_NAME DOS865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 12
COLL_ID 0
CSET_NAME DOS865
CSET_DEFAULT_COLL DOS865
DOMAIN_COLL_NAME DOS865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 12
COLL_ID 126
CSET_NAME DOS865
CSET_DEFAULT_COLL DOS865_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 12
COLL_ID 126
CSET_NAME DOS865
CSET_DEFAULT_COLL DOS865_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 12
COLL_ID 125
CSET_NAME DOS865
CSET_DEFAULT_COLL DOS865_UNICODE
DOMAIN_COLL_NAME DOS865_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 12
COLL_ID 2
CSET_NAME DOS865
CSET_DEFAULT_COLL DB_DAN865
DOMAIN_COLL_NAME DB_DAN865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 12
COLL_ID 2
CSET_NAME DOS865
CSET_DEFAULT_COLL DB_DAN865
DOMAIN_COLL_NAME DB_DAN865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 12
COLL_ID 2
CSET_NAME DOS865
CSET_DEFAULT_COLL DB_DAN865
DOMAIN_COLL_NAME DB_DAN865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 12
COLL_ID 3
CSET_NAME DOS865
CSET_DEFAULT_COLL DB_NOR865
DOMAIN_COLL_NAME DB_NOR865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 12
COLL_ID 3
CSET_NAME DOS865
CSET_DEFAULT_COLL DB_NOR865
DOMAIN_COLL_NAME DB_NOR865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 12
COLL_ID 3
CSET_NAME DOS865
CSET_DEFAULT_COLL DB_NOR865
DOMAIN_COLL_NAME DB_NOR865
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 12
COLL_ID 1
CSET_NAME DOS865
CSET_DEFAULT_COLL PDOX_NORDAN4
DOMAIN_COLL_NAME PDOX_NORDAN4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 12
COLL_ID 1
CSET_NAME DOS865
CSET_DEFAULT_COLL PDOX_NORDAN4
DOMAIN_COLL_NAME PDOX_NORDAN4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 12
COLL_ID 1
CSET_NAME DOS865
CSET_DEFAULT_COLL PDOX_NORDAN4
DOMAIN_COLL_NAME PDOX_NORDAN4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 0
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ISO8859_1
DOMAIN_COLL_NAME ISO8859_1
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 0
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ISO8859_1
DOMAIN_COLL_NAME ISO8859_1
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 0
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ISO8859_1
DOMAIN_COLL_NAME ISO8859_1
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 126
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ISO8859_1_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 21
COLL_ID 126
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ISO8859_1_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 125
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ISO8859_1_UNICODE
DOMAIN_COLL_NAME ISO8859_1_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 1
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DA_DA
DOMAIN_COLL_NAME DA_DA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 1
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DA_DA
DOMAIN_COLL_NAME DA_DA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 1
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DA_DA
DOMAIN_COLL_NAME DA_DA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 6
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DE_DE
DOMAIN_COLL_NAME DE_DE
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 6
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DE_DE
DOMAIN_COLL_NAME DE_DE
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 6
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DE_DE
DOMAIN_COLL_NAME DE_DE
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 2
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DU_NL
DOMAIN_COLL_NAME DU_NL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 2
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DU_NL
DOMAIN_COLL_NAME DU_NL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 2
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL DU_NL
DOMAIN_COLL_NAME DU_NL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 12
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL EN_UK
DOMAIN_COLL_NAME EN_UK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 12
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL EN_UK
DOMAIN_COLL_NAME EN_UK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 12
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL EN_UK
DOMAIN_COLL_NAME EN_UK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 14
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL EN_US
DOMAIN_COLL_NAME EN_US
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 14
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL EN_US
DOMAIN_COLL_NAME EN_US
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 14
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL EN_US
DOMAIN_COLL_NAME EN_US
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 10
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ES_ES
DOMAIN_COLL_NAME ES_ES
COLL_ATTR 1
COLL_SPEC DISABLE-COMPRESSIONS=1;SPECIALS-FIRST=1
F_NAME DM_NAME
CSET_ID 21
COLL_ID 10
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ES_ES
DOMAIN_COLL_NAME ES_ES
COLL_ATTR 1
COLL_SPEC DISABLE-COMPRESSIONS=1;SPECIALS-FIRST=1
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 10
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ES_ES
DOMAIN_COLL_NAME ES_ES
COLL_ATTR 1
COLL_SPEC DISABLE-COMPRESSIONS=1;SPECIALS-FIRST=1
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 17
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ES_ES_CI_AI
DOMAIN_COLL_NAME ES_ES_CI_AI
COLL_ATTR 7
COLL_SPEC DISABLE-COMPRESSIONS=1;SPECIALS-FIRST=1
F_NAME DM_NAME
CSET_ID 21
COLL_ID 17
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ES_ES_CI_AI
DOMAIN_COLL_NAME ES_ES_CI_AI
COLL_ATTR 7
COLL_SPEC DISABLE-COMPRESSIONS=1;SPECIALS-FIRST=1
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 17
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL ES_ES_CI_AI
DOMAIN_COLL_NAME ES_ES_CI_AI
COLL_ATTR 7
COLL_SPEC DISABLE-COMPRESSIONS=1;SPECIALS-FIRST=1
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 3
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FI_FI
DOMAIN_COLL_NAME FI_FI
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 3
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FI_FI
DOMAIN_COLL_NAME FI_FI
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 3
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FI_FI
DOMAIN_COLL_NAME FI_FI
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 5
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FR_CA
DOMAIN_COLL_NAME FR_CA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 5
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FR_CA
DOMAIN_COLL_NAME FR_CA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 5
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FR_CA
DOMAIN_COLL_NAME FR_CA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 4
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FR_FR
DOMAIN_COLL_NAME FR_FR
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 4
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FR_FR
DOMAIN_COLL_NAME FR_FR
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 4
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL FR_FR
DOMAIN_COLL_NAME FR_FR
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 7
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL IS_IS
DOMAIN_COLL_NAME IS_IS
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 7
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL IS_IS
DOMAIN_COLL_NAME IS_IS
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 7
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL IS_IS
DOMAIN_COLL_NAME IS_IS
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 8
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL IT_IT
DOMAIN_COLL_NAME IT_IT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 8
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL IT_IT
DOMAIN_COLL_NAME IT_IT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 8
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL IT_IT
DOMAIN_COLL_NAME IT_IT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 9
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL NO_NO
DOMAIN_COLL_NAME NO_NO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 9
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL NO_NO
DOMAIN_COLL_NAME NO_NO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 9
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL NO_NO
DOMAIN_COLL_NAME NO_NO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 11
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL SV_SV
DOMAIN_COLL_NAME SV_SV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 11
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL SV_SV
DOMAIN_COLL_NAME SV_SV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 11
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL SV_SV
DOMAIN_COLL_NAME SV_SV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 16
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL PT_BR
DOMAIN_COLL_NAME PT_BR
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 16
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL PT_BR
DOMAIN_COLL_NAME PT_BR
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 16
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL PT_BR
DOMAIN_COLL_NAME PT_BR
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 21
COLL_ID 15
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL PT_PT
DOMAIN_COLL_NAME PT_PT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 21
COLL_ID 15
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL PT_PT
DOMAIN_COLL_NAME PT_PT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 21
COLL_ID 15
CSET_NAME ISO8859_1
CSET_DEFAULT_COLL PT_PT
DOMAIN_COLL_NAME PT_PT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 22
COLL_ID 0
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO8859_2
DOMAIN_COLL_NAME ISO8859_2
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 22
COLL_ID 0
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO8859_2
DOMAIN_COLL_NAME ISO8859_2
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 22
COLL_ID 0
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO8859_2
DOMAIN_COLL_NAME ISO8859_2
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 22
COLL_ID 126
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO8859_2_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 22
COLL_ID 126
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO8859_2_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 22
COLL_ID 125
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO8859_2_UNICODE
DOMAIN_COLL_NAME ISO8859_2_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 22
COLL_ID 1
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL CS_CZ
DOMAIN_COLL_NAME CS_CZ
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 22
COLL_ID 1
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL CS_CZ
DOMAIN_COLL_NAME CS_CZ
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 22
COLL_ID 1
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL CS_CZ
DOMAIN_COLL_NAME CS_CZ
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 22
COLL_ID 2
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO_HUN
DOMAIN_COLL_NAME ISO_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 22
COLL_ID 2
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO_HUN
DOMAIN_COLL_NAME ISO_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 22
COLL_ID 2
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO_HUN
DOMAIN_COLL_NAME ISO_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 22
COLL_ID 3
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO_PLK
DOMAIN_COLL_NAME ISO_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 22
COLL_ID 3
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO_PLK
DOMAIN_COLL_NAME ISO_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 22
COLL_ID 3
CSET_NAME ISO8859_2
CSET_DEFAULT_COLL ISO_PLK
DOMAIN_COLL_NAME ISO_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 23
COLL_ID 0
CSET_NAME ISO8859_3
CSET_DEFAULT_COLL ISO8859_3
DOMAIN_COLL_NAME ISO8859_3
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 23
COLL_ID 0
CSET_NAME ISO8859_3
CSET_DEFAULT_COLL ISO8859_3
DOMAIN_COLL_NAME ISO8859_3
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 23
COLL_ID 0
CSET_NAME ISO8859_3
CSET_DEFAULT_COLL ISO8859_3
DOMAIN_COLL_NAME ISO8859_3
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 23
COLL_ID 126
CSET_NAME ISO8859_3
CSET_DEFAULT_COLL ISO8859_3_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 23
COLL_ID 126
CSET_NAME ISO8859_3
CSET_DEFAULT_COLL ISO8859_3_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 23
COLL_ID 125
CSET_NAME ISO8859_3
CSET_DEFAULT_COLL ISO8859_3_UNICODE
DOMAIN_COLL_NAME ISO8859_3_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 34
COLL_ID 0
CSET_NAME ISO8859_4
CSET_DEFAULT_COLL ISO8859_4
DOMAIN_COLL_NAME ISO8859_4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 34
COLL_ID 0
CSET_NAME ISO8859_4
CSET_DEFAULT_COLL ISO8859_4
DOMAIN_COLL_NAME ISO8859_4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 34
COLL_ID 0
CSET_NAME ISO8859_4
CSET_DEFAULT_COLL ISO8859_4
DOMAIN_COLL_NAME ISO8859_4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 34
COLL_ID 126
CSET_NAME ISO8859_4
CSET_DEFAULT_COLL ISO8859_4_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 34
COLL_ID 126
CSET_NAME ISO8859_4
CSET_DEFAULT_COLL ISO8859_4_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 34
COLL_ID 125
CSET_NAME ISO8859_4
CSET_DEFAULT_COLL ISO8859_4_UNICODE
DOMAIN_COLL_NAME ISO8859_4_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 35
COLL_ID 0
CSET_NAME ISO8859_5
CSET_DEFAULT_COLL ISO8859_5
DOMAIN_COLL_NAME ISO8859_5
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 35
COLL_ID 0
CSET_NAME ISO8859_5
CSET_DEFAULT_COLL ISO8859_5
DOMAIN_COLL_NAME ISO8859_5
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 35
COLL_ID 0
CSET_NAME ISO8859_5
CSET_DEFAULT_COLL ISO8859_5
DOMAIN_COLL_NAME ISO8859_5
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 35
COLL_ID 126
CSET_NAME ISO8859_5
CSET_DEFAULT_COLL ISO8859_5_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 35
COLL_ID 126
CSET_NAME ISO8859_5
CSET_DEFAULT_COLL ISO8859_5_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 35
COLL_ID 125
CSET_NAME ISO8859_5
CSET_DEFAULT_COLL ISO8859_5_UNICODE
DOMAIN_COLL_NAME ISO8859_5_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 36
COLL_ID 0
CSET_NAME ISO8859_6
CSET_DEFAULT_COLL ISO8859_6
DOMAIN_COLL_NAME ISO8859_6
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 36
COLL_ID 0
CSET_NAME ISO8859_6
CSET_DEFAULT_COLL ISO8859_6
DOMAIN_COLL_NAME ISO8859_6
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 36
COLL_ID 0
CSET_NAME ISO8859_6
CSET_DEFAULT_COLL ISO8859_6
DOMAIN_COLL_NAME ISO8859_6
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 36
COLL_ID 126
CSET_NAME ISO8859_6
CSET_DEFAULT_COLL ISO8859_6_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 36
COLL_ID 126
CSET_NAME ISO8859_6
CSET_DEFAULT_COLL ISO8859_6_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 36
COLL_ID 125
CSET_NAME ISO8859_6
CSET_DEFAULT_COLL ISO8859_6_UNICODE
DOMAIN_COLL_NAME ISO8859_6_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 37
COLL_ID 0
CSET_NAME ISO8859_7
CSET_DEFAULT_COLL ISO8859_7
DOMAIN_COLL_NAME ISO8859_7
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 37
COLL_ID 0
CSET_NAME ISO8859_7
CSET_DEFAULT_COLL ISO8859_7
DOMAIN_COLL_NAME ISO8859_7
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 37
COLL_ID 0
CSET_NAME ISO8859_7
CSET_DEFAULT_COLL ISO8859_7
DOMAIN_COLL_NAME ISO8859_7
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 37
COLL_ID 126
CSET_NAME ISO8859_7
CSET_DEFAULT_COLL ISO8859_7_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 37
COLL_ID 126
CSET_NAME ISO8859_7
CSET_DEFAULT_COLL ISO8859_7_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 37
COLL_ID 125
CSET_NAME ISO8859_7
CSET_DEFAULT_COLL ISO8859_7_UNICODE
DOMAIN_COLL_NAME ISO8859_7_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 38
COLL_ID 0
CSET_NAME ISO8859_8
CSET_DEFAULT_COLL ISO8859_8
DOMAIN_COLL_NAME ISO8859_8
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 38
COLL_ID 0
CSET_NAME ISO8859_8
CSET_DEFAULT_COLL ISO8859_8
DOMAIN_COLL_NAME ISO8859_8
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 38
COLL_ID 0
CSET_NAME ISO8859_8
CSET_DEFAULT_COLL ISO8859_8
DOMAIN_COLL_NAME ISO8859_8
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 38
COLL_ID 126
CSET_NAME ISO8859_8
CSET_DEFAULT_COLL ISO8859_8_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 38
COLL_ID 126
CSET_NAME ISO8859_8
CSET_DEFAULT_COLL ISO8859_8_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 38
COLL_ID 125
CSET_NAME ISO8859_8
CSET_DEFAULT_COLL ISO8859_8_UNICODE
DOMAIN_COLL_NAME ISO8859_8_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 39
COLL_ID 0
CSET_NAME ISO8859_9
CSET_DEFAULT_COLL ISO8859_9
DOMAIN_COLL_NAME ISO8859_9
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 39
COLL_ID 0
CSET_NAME ISO8859_9
CSET_DEFAULT_COLL ISO8859_9
DOMAIN_COLL_NAME ISO8859_9
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 39
COLL_ID 0
CSET_NAME ISO8859_9
CSET_DEFAULT_COLL ISO8859_9
DOMAIN_COLL_NAME ISO8859_9
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 39
COLL_ID 126
CSET_NAME ISO8859_9
CSET_DEFAULT_COLL ISO8859_9_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 39
COLL_ID 126
CSET_NAME ISO8859_9
CSET_DEFAULT_COLL ISO8859_9_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 39
COLL_ID 125
CSET_NAME ISO8859_9
CSET_DEFAULT_COLL ISO8859_9_UNICODE
DOMAIN_COLL_NAME ISO8859_9_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 40
COLL_ID 0
CSET_NAME ISO8859_13
CSET_DEFAULT_COLL ISO8859_13
DOMAIN_COLL_NAME ISO8859_13
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 40
COLL_ID 0
CSET_NAME ISO8859_13
CSET_DEFAULT_COLL ISO8859_13
DOMAIN_COLL_NAME ISO8859_13
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 40
COLL_ID 0
CSET_NAME ISO8859_13
CSET_DEFAULT_COLL ISO8859_13
DOMAIN_COLL_NAME ISO8859_13
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 40
COLL_ID 126
CSET_NAME ISO8859_13
CSET_DEFAULT_COLL ISO8859_13_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 40
COLL_ID 126
CSET_NAME ISO8859_13
CSET_DEFAULT_COLL ISO8859_13_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 40
COLL_ID 125
CSET_NAME ISO8859_13
CSET_DEFAULT_COLL ISO8859_13_UNICODE
DOMAIN_COLL_NAME ISO8859_13_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 0
CSET_NAME DOS852
CSET_DEFAULT_COLL DOS852
DOMAIN_COLL_NAME DOS852
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 0
CSET_NAME DOS852
CSET_DEFAULT_COLL DOS852
DOMAIN_COLL_NAME DOS852
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 0
CSET_NAME DOS852
CSET_DEFAULT_COLL DOS852
DOMAIN_COLL_NAME DOS852
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 126
CSET_NAME DOS852
CSET_DEFAULT_COLL DOS852_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 45
COLL_ID 126
CSET_NAME DOS852
CSET_DEFAULT_COLL DOS852_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 125
CSET_NAME DOS852
CSET_DEFAULT_COLL DOS852_UNICODE
DOMAIN_COLL_NAME DOS852_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 1
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_CSY
DOMAIN_COLL_NAME DB_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 1
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_CSY
DOMAIN_COLL_NAME DB_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 1
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_CSY
DOMAIN_COLL_NAME DB_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 2
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_PLK
DOMAIN_COLL_NAME DB_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 2
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_PLK
DOMAIN_COLL_NAME DB_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 2
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_PLK
DOMAIN_COLL_NAME DB_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 4
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_SLO
DOMAIN_COLL_NAME DB_SLO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 4
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_SLO
DOMAIN_COLL_NAME DB_SLO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 4
CSET_NAME DOS852
CSET_DEFAULT_COLL DB_SLO
DOMAIN_COLL_NAME DB_SLO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 5
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_CSY
DOMAIN_COLL_NAME PDOX_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 5
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_CSY
DOMAIN_COLL_NAME PDOX_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 5
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_CSY
DOMAIN_COLL_NAME PDOX_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 7
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_HUN
DOMAIN_COLL_NAME PDOX_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 7
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_HUN
DOMAIN_COLL_NAME PDOX_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 7
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_HUN
DOMAIN_COLL_NAME PDOX_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 6
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_PLK
DOMAIN_COLL_NAME PDOX_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 6
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_PLK
DOMAIN_COLL_NAME PDOX_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 6
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_PLK
DOMAIN_COLL_NAME PDOX_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 45
COLL_ID 8
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_SLO
DOMAIN_COLL_NAME PDOX_SLO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 45
COLL_ID 8
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_SLO
DOMAIN_COLL_NAME PDOX_SLO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 45
COLL_ID 8
CSET_NAME DOS852
CSET_DEFAULT_COLL PDOX_SLO
DOMAIN_COLL_NAME PDOX_SLO
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 46
COLL_ID 0
CSET_NAME DOS857
CSET_DEFAULT_COLL DOS857
DOMAIN_COLL_NAME DOS857
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 46
COLL_ID 0
CSET_NAME DOS857
CSET_DEFAULT_COLL DOS857
DOMAIN_COLL_NAME DOS857
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 46
COLL_ID 0
CSET_NAME DOS857
CSET_DEFAULT_COLL DOS857
DOMAIN_COLL_NAME DOS857
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 46
COLL_ID 126
CSET_NAME DOS857
CSET_DEFAULT_COLL DOS857_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 46
COLL_ID 126
CSET_NAME DOS857
CSET_DEFAULT_COLL DOS857_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 46
COLL_ID 125
CSET_NAME DOS857
CSET_DEFAULT_COLL DOS857_UNICODE
DOMAIN_COLL_NAME DOS857_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 46
COLL_ID 1
CSET_NAME DOS857
CSET_DEFAULT_COLL DB_TRK
DOMAIN_COLL_NAME DB_TRK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 46
COLL_ID 1
CSET_NAME DOS857
CSET_DEFAULT_COLL DB_TRK
DOMAIN_COLL_NAME DB_TRK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 46
COLL_ID 1
CSET_NAME DOS857
CSET_DEFAULT_COLL DB_TRK
DOMAIN_COLL_NAME DB_TRK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 13
COLL_ID 0
CSET_NAME DOS860
CSET_DEFAULT_COLL DOS860
DOMAIN_COLL_NAME DOS860
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 13
COLL_ID 0
CSET_NAME DOS860
CSET_DEFAULT_COLL DOS860
DOMAIN_COLL_NAME DOS860
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 13
COLL_ID 0
CSET_NAME DOS860
CSET_DEFAULT_COLL DOS860
DOMAIN_COLL_NAME DOS860
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 13
COLL_ID 126
CSET_NAME DOS860
CSET_DEFAULT_COLL DOS860_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 13
COLL_ID 126
CSET_NAME DOS860
CSET_DEFAULT_COLL DOS860_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 13
COLL_ID 125
CSET_NAME DOS860
CSET_DEFAULT_COLL DOS860_UNICODE
DOMAIN_COLL_NAME DOS860_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 13
COLL_ID 1
CSET_NAME DOS860
CSET_DEFAULT_COLL DB_PTG860
DOMAIN_COLL_NAME DB_PTG860
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 13
COLL_ID 1
CSET_NAME DOS860
CSET_DEFAULT_COLL DB_PTG860
DOMAIN_COLL_NAME DB_PTG860
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 13
COLL_ID 1
CSET_NAME DOS860
CSET_DEFAULT_COLL DB_PTG860
DOMAIN_COLL_NAME DB_PTG860
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 47
COLL_ID 0
CSET_NAME DOS861
CSET_DEFAULT_COLL DOS861
DOMAIN_COLL_NAME DOS861
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 47
COLL_ID 0
CSET_NAME DOS861
CSET_DEFAULT_COLL DOS861
DOMAIN_COLL_NAME DOS861
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 47
COLL_ID 0
CSET_NAME DOS861
CSET_DEFAULT_COLL DOS861
DOMAIN_COLL_NAME DOS861
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 47
COLL_ID 126
CSET_NAME DOS861
CSET_DEFAULT_COLL DOS861_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 47
COLL_ID 126
CSET_NAME DOS861
CSET_DEFAULT_COLL DOS861_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 47
COLL_ID 125
CSET_NAME DOS861
CSET_DEFAULT_COLL DOS861_UNICODE
DOMAIN_COLL_NAME DOS861_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 47
COLL_ID 1
CSET_NAME DOS861
CSET_DEFAULT_COLL PDOX_ISL
DOMAIN_COLL_NAME PDOX_ISL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 47
COLL_ID 1
CSET_NAME DOS861
CSET_DEFAULT_COLL PDOX_ISL
DOMAIN_COLL_NAME PDOX_ISL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 47
COLL_ID 1
CSET_NAME DOS861
CSET_DEFAULT_COLL PDOX_ISL
DOMAIN_COLL_NAME PDOX_ISL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 14
COLL_ID 0
CSET_NAME DOS863
CSET_DEFAULT_COLL DOS863
DOMAIN_COLL_NAME DOS863
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 14
COLL_ID 0
CSET_NAME DOS863
CSET_DEFAULT_COLL DOS863
DOMAIN_COLL_NAME DOS863
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 14
COLL_ID 0
CSET_NAME DOS863
CSET_DEFAULT_COLL DOS863
DOMAIN_COLL_NAME DOS863
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 14
COLL_ID 126
CSET_NAME DOS863
CSET_DEFAULT_COLL DOS863_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 14
COLL_ID 126
CSET_NAME DOS863
CSET_DEFAULT_COLL DOS863_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 14
COLL_ID 125
CSET_NAME DOS863
CSET_DEFAULT_COLL DOS863_UNICODE
DOMAIN_COLL_NAME DOS863_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 14
COLL_ID 1
CSET_NAME DOS863
CSET_DEFAULT_COLL DB_FRC863
DOMAIN_COLL_NAME DB_FRC863
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 14
COLL_ID 1
CSET_NAME DOS863
CSET_DEFAULT_COLL DB_FRC863
DOMAIN_COLL_NAME DB_FRC863
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 14
COLL_ID 1
CSET_NAME DOS863
CSET_DEFAULT_COLL DB_FRC863
DOMAIN_COLL_NAME DB_FRC863
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 50
COLL_ID 0
CSET_NAME CYRL
CSET_DEFAULT_COLL CYRL
DOMAIN_COLL_NAME CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 50
COLL_ID 0
CSET_NAME CYRL
CSET_DEFAULT_COLL CYRL
DOMAIN_COLL_NAME CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 50
COLL_ID 0
CSET_NAME CYRL
CSET_DEFAULT_COLL CYRL
DOMAIN_COLL_NAME CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 50
COLL_ID 126
CSET_NAME CYRL
CSET_DEFAULT_COLL CYRL_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 50
COLL_ID 126
CSET_NAME CYRL
CSET_DEFAULT_COLL CYRL_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 50
COLL_ID 125
CSET_NAME CYRL
CSET_DEFAULT_COLL CYRL_UNICODE
DOMAIN_COLL_NAME CYRL_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 50
COLL_ID 1
CSET_NAME CYRL
CSET_DEFAULT_COLL DB_RUS
DOMAIN_COLL_NAME DB_RUS
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 50
COLL_ID 1
CSET_NAME CYRL
CSET_DEFAULT_COLL DB_RUS
DOMAIN_COLL_NAME DB_RUS
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 50
COLL_ID 1
CSET_NAME CYRL
CSET_DEFAULT_COLL DB_RUS
DOMAIN_COLL_NAME DB_RUS
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 50
COLL_ID 2
CSET_NAME CYRL
CSET_DEFAULT_COLL PDOX_CYRL
DOMAIN_COLL_NAME PDOX_CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 50
COLL_ID 2
CSET_NAME CYRL
CSET_DEFAULT_COLL PDOX_CYRL
DOMAIN_COLL_NAME PDOX_CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 50
COLL_ID 2
CSET_NAME CYRL
CSET_DEFAULT_COLL PDOX_CYRL
DOMAIN_COLL_NAME PDOX_CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 9
COLL_ID 0
CSET_NAME DOS737
CSET_DEFAULT_COLL DOS737
DOMAIN_COLL_NAME DOS737
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 9
COLL_ID 0
CSET_NAME DOS737
CSET_DEFAULT_COLL DOS737
DOMAIN_COLL_NAME DOS737
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 9
COLL_ID 0
CSET_NAME DOS737
CSET_DEFAULT_COLL DOS737
DOMAIN_COLL_NAME DOS737
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 9
COLL_ID 126
CSET_NAME DOS737
CSET_DEFAULT_COLL DOS737_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 9
COLL_ID 126
CSET_NAME DOS737
CSET_DEFAULT_COLL DOS737_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 9
COLL_ID 125
CSET_NAME DOS737
CSET_DEFAULT_COLL DOS737_UNICODE
DOMAIN_COLL_NAME DOS737_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 15
COLL_ID 0
CSET_NAME DOS775
CSET_DEFAULT_COLL DOS775
DOMAIN_COLL_NAME DOS775
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 15
COLL_ID 0
CSET_NAME DOS775
CSET_DEFAULT_COLL DOS775
DOMAIN_COLL_NAME DOS775
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 15
COLL_ID 0
CSET_NAME DOS775
CSET_DEFAULT_COLL DOS775
DOMAIN_COLL_NAME DOS775
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 15
COLL_ID 126
CSET_NAME DOS775
CSET_DEFAULT_COLL DOS775_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 15
COLL_ID 126
CSET_NAME DOS775
CSET_DEFAULT_COLL DOS775_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 15
COLL_ID 125
CSET_NAME DOS775
CSET_DEFAULT_COLL DOS775_UNICODE
DOMAIN_COLL_NAME DOS775_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 16
COLL_ID 0
CSET_NAME DOS858
CSET_DEFAULT_COLL DOS858
DOMAIN_COLL_NAME DOS858
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 16
COLL_ID 0
CSET_NAME DOS858
CSET_DEFAULT_COLL DOS858
DOMAIN_COLL_NAME DOS858
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 16
COLL_ID 0
CSET_NAME DOS858
CSET_DEFAULT_COLL DOS858
DOMAIN_COLL_NAME DOS858
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 16
COLL_ID 126
CSET_NAME DOS858
CSET_DEFAULT_COLL DOS858_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 16
COLL_ID 126
CSET_NAME DOS858
CSET_DEFAULT_COLL DOS858_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 16
COLL_ID 125
CSET_NAME DOS858
CSET_DEFAULT_COLL DOS858_UNICODE
DOMAIN_COLL_NAME DOS858_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 17
COLL_ID 0
CSET_NAME DOS862
CSET_DEFAULT_COLL DOS862
DOMAIN_COLL_NAME DOS862
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 17
COLL_ID 0
CSET_NAME DOS862
CSET_DEFAULT_COLL DOS862
DOMAIN_COLL_NAME DOS862
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 17
COLL_ID 0
CSET_NAME DOS862
CSET_DEFAULT_COLL DOS862
DOMAIN_COLL_NAME DOS862
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 17
COLL_ID 126
CSET_NAME DOS862
CSET_DEFAULT_COLL DOS862_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 17
COLL_ID 126
CSET_NAME DOS862
CSET_DEFAULT_COLL DOS862_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 17
COLL_ID 125
CSET_NAME DOS862
CSET_DEFAULT_COLL DOS862_UNICODE
DOMAIN_COLL_NAME DOS862_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 18
COLL_ID 0
CSET_NAME DOS864
CSET_DEFAULT_COLL DOS864
DOMAIN_COLL_NAME DOS864
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 18
COLL_ID 0
CSET_NAME DOS864
CSET_DEFAULT_COLL DOS864
DOMAIN_COLL_NAME DOS864
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 18
COLL_ID 0
CSET_NAME DOS864
CSET_DEFAULT_COLL DOS864
DOMAIN_COLL_NAME DOS864
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 18
COLL_ID 126
CSET_NAME DOS864
CSET_DEFAULT_COLL DOS864_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 18
COLL_ID 126
CSET_NAME DOS864
CSET_DEFAULT_COLL DOS864_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 18
COLL_ID 125
CSET_NAME DOS864
CSET_DEFAULT_COLL DOS864_UNICODE
DOMAIN_COLL_NAME DOS864_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 48
COLL_ID 0
CSET_NAME DOS866
CSET_DEFAULT_COLL DOS866
DOMAIN_COLL_NAME DOS866
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 48
COLL_ID 0
CSET_NAME DOS866
CSET_DEFAULT_COLL DOS866
DOMAIN_COLL_NAME DOS866
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 48
COLL_ID 0
CSET_NAME DOS866
CSET_DEFAULT_COLL DOS866
DOMAIN_COLL_NAME DOS866
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 48
COLL_ID 126
CSET_NAME DOS866
CSET_DEFAULT_COLL DOS866_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 48
COLL_ID 126
CSET_NAME DOS866
CSET_DEFAULT_COLL DOS866_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 48
COLL_ID 125
CSET_NAME DOS866
CSET_DEFAULT_COLL DOS866_UNICODE
DOMAIN_COLL_NAME DOS866_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 49
COLL_ID 0
CSET_NAME DOS869
CSET_DEFAULT_COLL DOS869
DOMAIN_COLL_NAME DOS869
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 49
COLL_ID 0
CSET_NAME DOS869
CSET_DEFAULT_COLL DOS869
DOMAIN_COLL_NAME DOS869
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 49
COLL_ID 0
CSET_NAME DOS869
CSET_DEFAULT_COLL DOS869
DOMAIN_COLL_NAME DOS869
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 49
COLL_ID 126
CSET_NAME DOS869
CSET_DEFAULT_COLL DOS869_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 49
COLL_ID 126
CSET_NAME DOS869
CSET_DEFAULT_COLL DOS869_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 49
COLL_ID 125
CSET_NAME DOS869
CSET_DEFAULT_COLL DOS869_UNICODE
DOMAIN_COLL_NAME DOS869_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 0
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN1250
DOMAIN_COLL_NAME WIN1250
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 0
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN1250
DOMAIN_COLL_NAME WIN1250
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 0
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN1250
DOMAIN_COLL_NAME WIN1250
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 126
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN1250_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 51
COLL_ID 126
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN1250_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 125
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN1250_UNICODE
DOMAIN_COLL_NAME WIN1250_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 1
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_CSY
DOMAIN_COLL_NAME PXW_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 1
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_CSY
DOMAIN_COLL_NAME PXW_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 1
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_CSY
DOMAIN_COLL_NAME PXW_CSY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 5
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_HUN
DOMAIN_COLL_NAME PXW_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 5
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_HUN
DOMAIN_COLL_NAME PXW_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 5
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_HUN
DOMAIN_COLL_NAME PXW_HUN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 2
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_HUNDC
DOMAIN_COLL_NAME PXW_HUNDC
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 2
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_HUNDC
DOMAIN_COLL_NAME PXW_HUNDC
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 2
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_HUNDC
DOMAIN_COLL_NAME PXW_HUNDC
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 3
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_PLK
DOMAIN_COLL_NAME PXW_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 3
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_PLK
DOMAIN_COLL_NAME PXW_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 3
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_PLK
DOMAIN_COLL_NAME PXW_PLK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 4
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_SLOV
DOMAIN_COLL_NAME PXW_SLOV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 4
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_SLOV
DOMAIN_COLL_NAME PXW_SLOV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 4
CSET_NAME WIN1250
CSET_DEFAULT_COLL PXW_SLOV
DOMAIN_COLL_NAME PXW_SLOV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 6
CSET_NAME WIN1250
CSET_DEFAULT_COLL BS_BA
DOMAIN_COLL_NAME BS_BA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 6
CSET_NAME WIN1250
CSET_DEFAULT_COLL BS_BA
DOMAIN_COLL_NAME BS_BA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 6
CSET_NAME WIN1250
CSET_DEFAULT_COLL BS_BA
DOMAIN_COLL_NAME BS_BA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 7
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN_CZ
DOMAIN_COLL_NAME WIN_CZ
COLL_ATTR 3
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 7
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN_CZ
DOMAIN_COLL_NAME WIN_CZ
COLL_ATTR 3
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 7
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN_CZ
DOMAIN_COLL_NAME WIN_CZ
COLL_ATTR 3
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 51
COLL_ID 8
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN_CZ_CI_AI
DOMAIN_COLL_NAME WIN_CZ_CI_AI
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 51
COLL_ID 8
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN_CZ_CI_AI
DOMAIN_COLL_NAME WIN_CZ_CI_AI
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 51
COLL_ID 8
CSET_NAME WIN1250
CSET_DEFAULT_COLL WIN_CZ_CI_AI
DOMAIN_COLL_NAME WIN_CZ_CI_AI
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 52
COLL_ID 0
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251
DOMAIN_COLL_NAME WIN1251
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 52
COLL_ID 0
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251
DOMAIN_COLL_NAME WIN1251
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 52
COLL_ID 0
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251
DOMAIN_COLL_NAME WIN1251
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 52
COLL_ID 126
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 52
COLL_ID 126
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 52
COLL_ID 125
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251_UNICODE
DOMAIN_COLL_NAME WIN1251_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 52
COLL_ID 1
CSET_NAME WIN1251
CSET_DEFAULT_COLL PXW_CYRL
DOMAIN_COLL_NAME PXW_CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 52
COLL_ID 1
CSET_NAME WIN1251
CSET_DEFAULT_COLL PXW_CYRL
DOMAIN_COLL_NAME PXW_CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 52
COLL_ID 1
CSET_NAME WIN1251
CSET_DEFAULT_COLL PXW_CYRL
DOMAIN_COLL_NAME PXW_CYRL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 52
COLL_ID 2
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251_UA
DOMAIN_COLL_NAME WIN1251_UA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 52
COLL_ID 2
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251_UA
DOMAIN_COLL_NAME WIN1251_UA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 52
COLL_ID 2
CSET_NAME WIN1251
CSET_DEFAULT_COLL WIN1251_UA
DOMAIN_COLL_NAME WIN1251_UA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 0
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN1252
DOMAIN_COLL_NAME WIN1252
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 53
COLL_ID 0
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN1252
DOMAIN_COLL_NAME WIN1252
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 0
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN1252
DOMAIN_COLL_NAME WIN1252
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 126
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN1252_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 53
COLL_ID 126
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN1252_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 125
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN1252_UNICODE
DOMAIN_COLL_NAME WIN1252_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 1
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_INTL
DOMAIN_COLL_NAME PXW_INTL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 53
COLL_ID 1
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_INTL
DOMAIN_COLL_NAME PXW_INTL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 1
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_INTL
DOMAIN_COLL_NAME PXW_INTL
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 2
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_INTL850
DOMAIN_COLL_NAME PXW_INTL850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 53
COLL_ID 2
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_INTL850
DOMAIN_COLL_NAME PXW_INTL850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 2
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_INTL850
DOMAIN_COLL_NAME PXW_INTL850
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 3
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_NORDAN4
DOMAIN_COLL_NAME PXW_NORDAN4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 53
COLL_ID 3
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_NORDAN4
DOMAIN_COLL_NAME PXW_NORDAN4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 3
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_NORDAN4
DOMAIN_COLL_NAME PXW_NORDAN4
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 6
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN_PTBR
DOMAIN_COLL_NAME WIN_PTBR
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 53
COLL_ID 6
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN_PTBR
DOMAIN_COLL_NAME WIN_PTBR
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 6
CSET_NAME WIN1252
CSET_DEFAULT_COLL WIN_PTBR
DOMAIN_COLL_NAME WIN_PTBR
COLL_ATTR 7
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 4
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_SPAN
DOMAIN_COLL_NAME PXW_SPAN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 53
COLL_ID 4
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_SPAN
DOMAIN_COLL_NAME PXW_SPAN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 4
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_SPAN
DOMAIN_COLL_NAME PXW_SPAN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 53
COLL_ID 5
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_SWEDFIN
DOMAIN_COLL_NAME PXW_SWEDFIN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 53
COLL_ID 5
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_SWEDFIN
DOMAIN_COLL_NAME PXW_SWEDFIN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 53
COLL_ID 5
CSET_NAME WIN1252
CSET_DEFAULT_COLL PXW_SWEDFIN
DOMAIN_COLL_NAME PXW_SWEDFIN
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 54
COLL_ID 0
CSET_NAME WIN1253
CSET_DEFAULT_COLL WIN1253
DOMAIN_COLL_NAME WIN1253
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 54
COLL_ID 0
CSET_NAME WIN1253
CSET_DEFAULT_COLL WIN1253
DOMAIN_COLL_NAME WIN1253
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 54
COLL_ID 0
CSET_NAME WIN1253
CSET_DEFAULT_COLL WIN1253
DOMAIN_COLL_NAME WIN1253
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 54
COLL_ID 126
CSET_NAME WIN1253
CSET_DEFAULT_COLL WIN1253_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 54
COLL_ID 126
CSET_NAME WIN1253
CSET_DEFAULT_COLL WIN1253_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 54
COLL_ID 125
CSET_NAME WIN1253
CSET_DEFAULT_COLL WIN1253_UNICODE
DOMAIN_COLL_NAME WIN1253_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 54
COLL_ID 1
CSET_NAME WIN1253
CSET_DEFAULT_COLL PXW_GREEK
DOMAIN_COLL_NAME PXW_GREEK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 54
COLL_ID 1
CSET_NAME WIN1253
CSET_DEFAULT_COLL PXW_GREEK
DOMAIN_COLL_NAME PXW_GREEK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 54
COLL_ID 1
CSET_NAME WIN1253
CSET_DEFAULT_COLL PXW_GREEK
DOMAIN_COLL_NAME PXW_GREEK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 55
COLL_ID 0
CSET_NAME WIN1254
CSET_DEFAULT_COLL WIN1254
DOMAIN_COLL_NAME WIN1254
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 55
COLL_ID 0
CSET_NAME WIN1254
CSET_DEFAULT_COLL WIN1254
DOMAIN_COLL_NAME WIN1254
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 55
COLL_ID 0
CSET_NAME WIN1254
CSET_DEFAULT_COLL WIN1254
DOMAIN_COLL_NAME WIN1254
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 55
COLL_ID 126
CSET_NAME WIN1254
CSET_DEFAULT_COLL WIN1254_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 55
COLL_ID 126
CSET_NAME WIN1254
CSET_DEFAULT_COLL WIN1254_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 55
COLL_ID 125
CSET_NAME WIN1254
CSET_DEFAULT_COLL WIN1254_UNICODE
DOMAIN_COLL_NAME WIN1254_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 55
COLL_ID 1
CSET_NAME WIN1254
CSET_DEFAULT_COLL PXW_TURK
DOMAIN_COLL_NAME PXW_TURK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 55
COLL_ID 1
CSET_NAME WIN1254
CSET_DEFAULT_COLL PXW_TURK
DOMAIN_COLL_NAME PXW_TURK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 55
COLL_ID 1
CSET_NAME WIN1254
CSET_DEFAULT_COLL PXW_TURK
DOMAIN_COLL_NAME PXW_TURK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 19
COLL_ID 0
CSET_NAME NEXT
CSET_DEFAULT_COLL NEXT
DOMAIN_COLL_NAME NEXT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 19
COLL_ID 0
CSET_NAME NEXT
CSET_DEFAULT_COLL NEXT
DOMAIN_COLL_NAME NEXT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 19
COLL_ID 0
CSET_NAME NEXT
CSET_DEFAULT_COLL NEXT
DOMAIN_COLL_NAME NEXT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 19
COLL_ID 126
CSET_NAME NEXT
CSET_DEFAULT_COLL NEXT_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 19
COLL_ID 126
CSET_NAME NEXT
CSET_DEFAULT_COLL NEXT_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 19
COLL_ID 125
CSET_NAME NEXT
CSET_DEFAULT_COLL NEXT_UNICODE
DOMAIN_COLL_NAME NEXT_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 19
COLL_ID 2
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_DEU
DOMAIN_COLL_NAME NXT_DEU
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 19
COLL_ID 2
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_DEU
DOMAIN_COLL_NAME NXT_DEU
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 19
COLL_ID 2
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_DEU
DOMAIN_COLL_NAME NXT_DEU
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 19
COLL_ID 5
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_ESP
DOMAIN_COLL_NAME NXT_ESP
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 19
COLL_ID 5
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_ESP
DOMAIN_COLL_NAME NXT_ESP
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 19
COLL_ID 5
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_ESP
DOMAIN_COLL_NAME NXT_ESP
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 19
COLL_ID 3
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_FRA
DOMAIN_COLL_NAME NXT_FRA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 19
COLL_ID 3
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_FRA
DOMAIN_COLL_NAME NXT_FRA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 19
COLL_ID 3
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_FRA
DOMAIN_COLL_NAME NXT_FRA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 19
COLL_ID 4
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_ITA
DOMAIN_COLL_NAME NXT_ITA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 19
COLL_ID 4
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_ITA
DOMAIN_COLL_NAME NXT_ITA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 19
COLL_ID 4
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_ITA
DOMAIN_COLL_NAME NXT_ITA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 19
COLL_ID 1
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_US
DOMAIN_COLL_NAME NXT_US
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 19
COLL_ID 1
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_US
DOMAIN_COLL_NAME NXT_US
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 19
COLL_ID 1
CSET_NAME NEXT
CSET_DEFAULT_COLL NXT_US
DOMAIN_COLL_NAME NXT_US
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 58
COLL_ID 0
CSET_NAME WIN1255
CSET_DEFAULT_COLL WIN1255
DOMAIN_COLL_NAME WIN1255
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 58
COLL_ID 0
CSET_NAME WIN1255
CSET_DEFAULT_COLL WIN1255
DOMAIN_COLL_NAME WIN1255
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 58
COLL_ID 0
CSET_NAME WIN1255
CSET_DEFAULT_COLL WIN1255
DOMAIN_COLL_NAME WIN1255
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 58
COLL_ID 126
CSET_NAME WIN1255
CSET_DEFAULT_COLL WIN1255_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 58
COLL_ID 126
CSET_NAME WIN1255
CSET_DEFAULT_COLL WIN1255_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 58
COLL_ID 125
CSET_NAME WIN1255
CSET_DEFAULT_COLL WIN1255_UNICODE
DOMAIN_COLL_NAME WIN1255_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 59
COLL_ID 0
CSET_NAME WIN1256
CSET_DEFAULT_COLL WIN1256
DOMAIN_COLL_NAME WIN1256
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 59
COLL_ID 0
CSET_NAME WIN1256
CSET_DEFAULT_COLL WIN1256
DOMAIN_COLL_NAME WIN1256
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 59
COLL_ID 0
CSET_NAME WIN1256
CSET_DEFAULT_COLL WIN1256
DOMAIN_COLL_NAME WIN1256
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 59
COLL_ID 126
CSET_NAME WIN1256
CSET_DEFAULT_COLL WIN1256_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 59
COLL_ID 126
CSET_NAME WIN1256
CSET_DEFAULT_COLL WIN1256_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 59
COLL_ID 125
CSET_NAME WIN1256
CSET_DEFAULT_COLL WIN1256_UNICODE
DOMAIN_COLL_NAME WIN1256_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 60
COLL_ID 0
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257
DOMAIN_COLL_NAME WIN1257
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 60
COLL_ID 0
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257
DOMAIN_COLL_NAME WIN1257
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 60
COLL_ID 0
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257
DOMAIN_COLL_NAME WIN1257
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 60
COLL_ID 126
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 60
COLL_ID 126
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 60
COLL_ID 125
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_UNICODE
DOMAIN_COLL_NAME WIN1257_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 60
COLL_ID 1
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_EE
DOMAIN_COLL_NAME WIN1257_EE
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 60
COLL_ID 1
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_EE
DOMAIN_COLL_NAME WIN1257_EE
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 60
COLL_ID 1
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_EE
DOMAIN_COLL_NAME WIN1257_EE
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 60
COLL_ID 2
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_LT
DOMAIN_COLL_NAME WIN1257_LT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 60
COLL_ID 2
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_LT
DOMAIN_COLL_NAME WIN1257_LT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 60
COLL_ID 2
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_LT
DOMAIN_COLL_NAME WIN1257_LT
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 60
COLL_ID 3
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_LV
DOMAIN_COLL_NAME WIN1257_LV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 60
COLL_ID 3
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_LV
DOMAIN_COLL_NAME WIN1257_LV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 60
COLL_ID 3
CSET_NAME WIN1257
CSET_DEFAULT_COLL WIN1257_LV
DOMAIN_COLL_NAME WIN1257_LV
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 44
COLL_ID 0
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_5601
DOMAIN_COLL_NAME KSC_5601
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 44
COLL_ID 0
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_5601
DOMAIN_COLL_NAME KSC_5601
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 44
COLL_ID 0
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_5601
DOMAIN_COLL_NAME KSC_5601
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 44
COLL_ID 126
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_5601_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 44
COLL_ID 126
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_5601_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 44
COLL_ID 125
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_5601_UNICODE
DOMAIN_COLL_NAME KSC_5601_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 44
COLL_ID 1
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_DICTIONARY
DOMAIN_COLL_NAME KSC_DICTIONARY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 44
COLL_ID 1
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_DICTIONARY
DOMAIN_COLL_NAME KSC_DICTIONARY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 44
COLL_ID 1
CSET_NAME KSC_5601
CSET_DEFAULT_COLL KSC_DICTIONARY
DOMAIN_COLL_NAME KSC_DICTIONARY
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 56
COLL_ID 0
CSET_NAME BIG_5
CSET_DEFAULT_COLL BIG_5
DOMAIN_COLL_NAME BIG_5
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 56
COLL_ID 0
CSET_NAME BIG_5
CSET_DEFAULT_COLL BIG_5
DOMAIN_COLL_NAME BIG_5
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 56
COLL_ID 0
CSET_NAME BIG_5
CSET_DEFAULT_COLL BIG_5
DOMAIN_COLL_NAME BIG_5
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 56
COLL_ID 126
CSET_NAME BIG_5
CSET_DEFAULT_COLL BIG_5_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 56
COLL_ID 126
CSET_NAME BIG_5
CSET_DEFAULT_COLL BIG_5_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 56
COLL_ID 125
CSET_NAME BIG_5
CSET_DEFAULT_COLL BIG_5_UNICODE
DOMAIN_COLL_NAME BIG_5_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 57
COLL_ID 0
CSET_NAME GB_2312
CSET_DEFAULT_COLL GB_2312
DOMAIN_COLL_NAME GB_2312
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 57
COLL_ID 0
CSET_NAME GB_2312
CSET_DEFAULT_COLL GB_2312
DOMAIN_COLL_NAME GB_2312
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 57
COLL_ID 0
CSET_NAME GB_2312
CSET_DEFAULT_COLL GB_2312
DOMAIN_COLL_NAME GB_2312
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 57
COLL_ID 126
CSET_NAME GB_2312
CSET_DEFAULT_COLL GB_2312_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 57
COLL_ID 126
CSET_NAME GB_2312
CSET_DEFAULT_COLL GB_2312_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 57
COLL_ID 125
CSET_NAME GB_2312
CSET_DEFAULT_COLL GB_2312_UNICODE
DOMAIN_COLL_NAME GB_2312_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 63
COLL_ID 0
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R
DOMAIN_COLL_NAME KOI8R
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 63
COLL_ID 0
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R
DOMAIN_COLL_NAME KOI8R
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 63
COLL_ID 0
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R
DOMAIN_COLL_NAME KOI8R
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 63
COLL_ID 126
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 63
COLL_ID 126
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 63
COLL_ID 125
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R_UNICODE
DOMAIN_COLL_NAME KOI8R_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 63
COLL_ID 1
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R_RU
DOMAIN_COLL_NAME KOI8R_RU
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 63
COLL_ID 1
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R_RU
DOMAIN_COLL_NAME KOI8R_RU
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 63
COLL_ID 1
CSET_NAME KOI8R
CSET_DEFAULT_COLL KOI8R_RU
DOMAIN_COLL_NAME KOI8R_RU
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 64
COLL_ID 0
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U
DOMAIN_COLL_NAME KOI8U
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 64
COLL_ID 0
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U
DOMAIN_COLL_NAME KOI8U
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 64
COLL_ID 0
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U
DOMAIN_COLL_NAME KOI8U
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 64
COLL_ID 126
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 64
COLL_ID 126
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 64
COLL_ID 125
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U_UNICODE
DOMAIN_COLL_NAME KOI8U_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 64
COLL_ID 1
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U_UA
DOMAIN_COLL_NAME KOI8U_UA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 64
COLL_ID 1
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U_UA
DOMAIN_COLL_NAME KOI8U_UA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 64
COLL_ID 1
CSET_NAME KOI8U
CSET_DEFAULT_COLL KOI8U_UA
DOMAIN_COLL_NAME KOI8U_UA
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 65
COLL_ID 0
CSET_NAME WIN1258
CSET_DEFAULT_COLL WIN1258
DOMAIN_COLL_NAME WIN1258
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 65
COLL_ID 0
CSET_NAME WIN1258
CSET_DEFAULT_COLL WIN1258
DOMAIN_COLL_NAME WIN1258
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 65
COLL_ID 0
CSET_NAME WIN1258
CSET_DEFAULT_COLL WIN1258
DOMAIN_COLL_NAME WIN1258
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 65
COLL_ID 126
CSET_NAME WIN1258
CSET_DEFAULT_COLL WIN1258_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 65
COLL_ID 126
CSET_NAME WIN1258
CSET_DEFAULT_COLL WIN1258_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 6
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 65
COLL_ID 125
CSET_NAME WIN1258
CSET_DEFAULT_COLL WIN1258_UNICODE
DOMAIN_COLL_NAME WIN1258_UNICODE
COLL_ATTR 0
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 66
COLL_ID 0
CSET_NAME TIS620
CSET_DEFAULT_COLL TIS620
DOMAIN_COLL_NAME TIS620
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 66
COLL_ID 0
CSET_NAME TIS620
CSET_DEFAULT_COLL TIS620
DOMAIN_COLL_NAME TIS620
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 66
COLL_ID 0
CSET_NAME TIS620
CSET_DEFAULT_COLL TIS620
DOMAIN_COLL_NAME TIS620
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 66
COLL_ID 126
CSET_NAME TIS620
CSET_DEFAULT_COLL TIS620_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 66
COLL_ID 126
CSET_NAME TIS620
CSET_DEFAULT_COLL TIS620_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 66
COLL_ID 1
CSET_NAME TIS620
CSET_DEFAULT_COLL TIS620_UNICODE
DOMAIN_COLL_NAME TIS620_UNICODE
COLL_ATTR 1
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 67
COLL_ID 0
CSET_NAME GBK
CSET_DEFAULT_COLL GBK
DOMAIN_COLL_NAME GBK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 67
COLL_ID 0
CSET_NAME GBK
CSET_DEFAULT_COLL GBK
DOMAIN_COLL_NAME GBK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 67
COLL_ID 0
CSET_NAME GBK
CSET_DEFAULT_COLL GBK
DOMAIN_COLL_NAME GBK
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 67
COLL_ID 126
CSET_NAME GBK
CSET_DEFAULT_COLL GBK_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 67
COLL_ID 126
CSET_NAME GBK
CSET_DEFAULT_COLL GBK_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 67
COLL_ID 1
CSET_NAME GBK
CSET_DEFAULT_COLL GBK_UNICODE
DOMAIN_COLL_NAME GBK_UNICODE
COLL_ATTR 1
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 68
COLL_ID 0
CSET_NAME CP943C
CSET_DEFAULT_COLL CP943C
DOMAIN_COLL_NAME CP943C
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 68
COLL_ID 0
CSET_NAME CP943C
CSET_DEFAULT_COLL CP943C
DOMAIN_COLL_NAME CP943C
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 68
COLL_ID 0
CSET_NAME CP943C
CSET_DEFAULT_COLL CP943C
DOMAIN_COLL_NAME CP943C
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 68
COLL_ID 126
CSET_NAME CP943C
CSET_DEFAULT_COLL CP943C_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 68
COLL_ID 126
CSET_NAME CP943C
CSET_DEFAULT_COLL CP943C_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 68
COLL_ID 1
CSET_NAME CP943C
CSET_DEFAULT_COLL CP943C_UNICODE
DOMAIN_COLL_NAME CP943C_UNICODE
COLL_ATTR 1
COLL_SPEC COLL-VERSION=153.88
F_NAME DM_BLOB
CSET_ID 69
COLL_ID 0
CSET_NAME GB18030
CSET_DEFAULT_COLL GB18030
DOMAIN_COLL_NAME GB18030
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_NAME
CSET_ID 69
COLL_ID 0
CSET_NAME GB18030
CSET_DEFAULT_COLL GB18030
DOMAIN_COLL_NAME GB18030
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_TEXT
CSET_ID 69
COLL_ID 0
CSET_NAME GB18030
CSET_DEFAULT_COLL GB18030
DOMAIN_COLL_NAME GB18030
COLL_ATTR 1
COLL_SPEC <null>
F_NAME DM_BLOB
CSET_ID 69
COLL_ID 126
CSET_NAME GB18030
CSET_DEFAULT_COLL GB18030_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_NAME
CSET_ID 69
COLL_ID 126
CSET_NAME GB18030
CSET_DEFAULT_COLL GB18030_UNICODE
DOMAIN_COLL_NAME CO_UNICODE
COLL_ATTR 7
COLL_SPEC COLL-VERSION=153.88;NUMERIC-SORT=1
F_NAME DM_TEXT
CSET_ID 69
COLL_ID 1
CSET_NAME GB18030
CSET_DEFAULT_COLL GB18030_UNICODE
DOMAIN_COLL_NAME GB18030_UNICODE
COLL_ATTR 1
COLL_SPEC COLL-VERSION=153.88
"""
@pytest.mark.version('>=4.0')
@pytest.mark.xfail
def test_1(db_1):
    pytest.fail("Test not IMPLEMENTED")
dbbfe6c4220d58d722b0dba7ca58ca402b850f38 | 175 | py | Python | base_phone/models/__init__.py | kenysmile/test_facebook | 844a3ddd53abd319c0115de86909118a37106c67 | ["Apache-2.0"]
# -*- coding: utf-8 -*-
from . import phone_validation_mixin
from . import res_company
from . import res_config_settings
from . import res_partner
from . import phone_common
915761ae68c1f8a1f7603a2d87f40295c22bd595 | 96 | py | Python | propylean/__init__.py | abhishekvraman/Propylean | 1a682360ebe0da9fb0d12eed1f1ec18dacfd8bfa | ["MIT"]
from propylean import equipments
from propylean import properties
from propylean import streams | 32 | 33 | 0.875 | 12 | 96 | 7 | 0.5 | 0.464286 | 0.678571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 96 | 3 | 34 | 32 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
916e2d7611bace917cd4198ecc26ff33dd9b908e | 1,731 | py | Python | classification/VGG/vgg_config.py | sagnik1511/DLPI | 1609aaff4851efed829d11cc5eda721304ca0f02 | [
"Apache-2.0"
] | null | null | null | classification/VGG/vgg_config.py | sagnik1511/DLPI | 1609aaff4851efed829d11cc5eda721304ca0f02 | [
"Apache-2.0"
] | null | null | null | classification/VGG/vgg_config.py | sagnik1511/DLPI | 1609aaff4851efed829d11cc5eda721304ca0f02 | [
"Apache-2.0"
] | null | null | null | """
Configuration of different VGG architectures.
The A-LRN architecture has not been added yet.
"""
CONFIG = {
'A': {
'b1': [[(3, 64), 3]],
'b2': [[(64, 128), 3]],
'b3': [[(128, 256), 3], [(256, 256), 3]],
'b4': [[(256, 512), 3], [(512, 512), 3]],
'b5': [[(512, 512), 3], [(512, 512), 3]]
},
'B': {
'b1': [[(3, 64), 3], [(64, 64), 3]],
'b2': [[(64, 128), 3], [(128, 128), 3]],
'b3': [[(128, 256), 3], [(256, 256), 3]],
'b4': [[(256, 512), 3], [(512, 512), 3]],
'b5': [[(512, 512), 3], [(512, 512), 3]]
},
'C': {
'b1': [[(3, 64), 3], [(64, 64), 3]],
'b2': [[(64, 128), 3], [(128, 128), 3]],
'b3': [[(128, 256), 3], [(256, 256), 3], [(256, 256), 1]],
'b4': [[(256, 512), 3], [(512, 512), 3], [(512, 512), 1]],
'b5': [[(512, 512), 3], [(512, 512), 3], [(512, 512), 1]]
},
'D': {
'b1': [[(3, 64), 3], [(64, 64), 3]],
'b2': [[(64, 128), 3], [(128, 128), 3]],
'b3': [[(128, 256), 3], [(256, 256), 3], [(256, 256), 3]],
'b4': [[(256, 512), 3], [(512, 512), 3], [(512, 512), 3]],
'b5': [[(512, 512), 3], [(512, 512), 3], [(512, 512), 3]]
},
'E': {
'b1': [[(3, 64), 3], [(64, 64), 3]],
'b2': [[(64, 128), 3], [(128, 128), 3]],
'b3': [[(128, 256), 3], [(256, 256), 3], [(256, 256), 3], [(256, 256), 3]],
'b4': [[(256, 512), 3], [(512, 512), 3], [(512, 512), 3], [(512, 512), 3]],
'b5': [[(512, 512), 3], [(512, 512), 3], [(512, 512), 3], [(512, 512), 3]]
}
} | 40.255814 | 88 | 0.305604 | 224 | 1,731 | 2.361607 | 0.129464 | 0.196597 | 0.277883 | 0.340265 | 0.827977 | 0.827977 | 0.814745 | 0.814745 | 0.810964 | 0.727788 | 0 | 0.375338 | 0.358174 | 1,731 | 43 | 89 | 40.255814 | 0.10081 | 0.051993 | 0 | 0.378378 | 0 | 0 | 0.034591 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
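The nested CONFIG dict above encodes each VGG block as a list of `[(in_channels, out_channels), kernel_size]` entries. A minimal sketch of how one such block spec could be flattened into per-layer tuples (the helper name `expand_block` is hypothetical and not part of the original file):

```python
# Hypothetical helper: flatten one VGG block spec into (in_ch, out_ch, kernel)
# tuples. Assumes each entry is [(in_channels, out_channels), kernel_size],
# matching the CONFIG structure above.
BLOCK_B3_A = [[(128, 256), 3], [(256, 256), 3]]

def expand_block(block):
    """Return a list of (in_channels, out_channels, kernel_size) tuples."""
    return [(in_ch, out_ch, k) for (in_ch, out_ch), k in block]

print(expand_block(BLOCK_B3_A))  # [(128, 256, 3), (256, 256, 3)]
```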
91790ff70227520e68096c6f5c531c4454da365f | 7,355 | py | Python | plenum/test/node_request/test_unit_setup_for_non_master.py | andkononykhin/plenum | 28dc1719f4b7e80d31dafbadb38cfec4da949886 | [
"Apache-2.0"
] | null | null | null | plenum/test/node_request/test_unit_setup_for_non_master.py | andkononykhin/plenum | 28dc1719f4b7e80d31dafbadb38cfec4da949886 | [
"Apache-2.0"
] | 1 | 2019-03-20T14:57:22.000Z | 2019-03-20T15:01:55.000Z | plenum/test/node_request/test_unit_setup_for_non_master.py | andkononykhin/plenum | 28dc1719f4b7e80d31dafbadb38cfec4da949886 | [
"Apache-2.0"
] | null | null | null | from collections import deque
import pytest
import time
from plenum.common.messages.node_messages import PrePrepare, Prepare
from plenum.common.util import compare_3PC_keys
from plenum.test.helper import sdk_send_random_and_check, init_discarded
from plenum.test.node_catchup.helper import waitNodeDataEquality
from plenum.test.pool_transactions.helper import sdk_add_new_steward_and_node
from plenum.test.test_node import getNonPrimaryReplicas, checkNodesConnected
from stp_core.common.log import getlogger
from stp_core.loop.eventually import eventually
logger = getlogger()
def test_setup_last_ordered_for_non_master_after_catchup(txnPoolNodeSet,
sdk_wallet_client):
inst_id = 1
replica = getNonPrimaryReplicas(txnPoolNodeSet, inst_id)[-1]
replica.preparesWaitingForPrePrepare.clear()
replica.prePreparesPendingPrevPP.clear()
replica.last_ordered_3pc = (0, 0)
timestamp = time.time()
ppSeqNo = 5
preprepare, prepare = \
_create_prepare_and_preprepare(inst_id,
replica.viewNo,
ppSeqNo,
timestamp,
sdk_wallet_client)
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] = deque()
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] \
.append((preprepare, replica.primaryName))
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] = deque()
for node in txnPoolNodeSet:
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] \
.append((prepare, node.name))
replica.first_batch_after_catchup = True
replica._setup_last_ordered_for_non_master()
assert replica.last_ordered_3pc == (replica.viewNo, ppSeqNo - 1)
def test_setup_last_ordered_for_non_master_without_preprepare(txnPoolNodeSet,
sdk_wallet_client):
inst_id = 1
replica = getNonPrimaryReplicas(txnPoolNodeSet, inst_id)[-1]
replica.preparesWaitingForPrePrepare.clear()
replica.prePreparesPendingPrevPP.clear()
replica.last_ordered_3pc = (0, 0)
timestamp = time.time()
ppSeqNo = 5
preprepare, prepare = \
_create_prepare_and_preprepare(inst_id,
replica.viewNo,
ppSeqNo,
timestamp,
sdk_wallet_client)
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] = deque()
for node in txnPoolNodeSet:
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] \
.append((prepare, node.name))
replica._setup_last_ordered_for_non_master()
assert replica.last_ordered_3pc == (0, 0)
def test_setup_last_ordered_for_non_master_without_quorum_of_prepares(
txnPoolNodeSet,
sdk_wallet_client):
inst_id = 1
replica = getNonPrimaryReplicas(txnPoolNodeSet, inst_id)[-1]
replica.preparesWaitingForPrePrepare.clear()
replica.prePreparesPendingPrevPP.clear()
replica.last_ordered_3pc = (0, 0)
timestamp = time.time()
ppSeqNo = 5
preprepare, prepare = \
_create_prepare_and_preprepare(inst_id,
replica.viewNo,
ppSeqNo,
timestamp,
sdk_wallet_client)
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] = deque()
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] \
.append((preprepare, replica.primaryName))
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] = deque()
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] \
.append((prepare, txnPoolNodeSet[-1].name))
replica._setup_last_ordered_for_non_master()
assert replica.last_ordered_3pc == (0, 0)
def test_setup_last_ordered_for_non_master_for_master(txnPoolNodeSet,
sdk_wallet_client):
inst_id = 0
replica = getNonPrimaryReplicas(txnPoolNodeSet, inst_id)[-1]
replica.preparesWaitingForPrePrepare.clear()
replica.prePreparesPendingPrevPP.clear()
replica.last_ordered_3pc = (0, 0)
timestamp = time.time()
ppSeqNo = 5
preprepare, prepare = \
_create_prepare_and_preprepare(inst_id,
replica.viewNo,
ppSeqNo,
timestamp,
sdk_wallet_client)
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] = deque()
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] \
.append((preprepare, replica.primaryName))
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] = deque()
for node in txnPoolNodeSet:
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] \
.append((prepare, node.name))
replica._setup_last_ordered_for_non_master()
assert replica.last_ordered_3pc == (0, 0)
def test_setup_last_ordered_for_non_master_without_catchup(txnPoolNodeSet,
sdk_wallet_client):
inst_id = 1
last_ordered_3pc = (0, 12)
timestamp = time.time()
ppSeqNo = 16
replica = getNonPrimaryReplicas(txnPoolNodeSet, inst_id)[-1]
replica.last_ordered_3pc = last_ordered_3pc
replica.preparesWaitingForPrePrepare.clear()
replica.prePreparesPendingPrevPP.clear()
preprepare, prepare = \
_create_prepare_and_preprepare(inst_id,
replica.viewNo,
ppSeqNo,
timestamp,
sdk_wallet_client)
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] = deque()
replica.prePreparesPendingPrevPP[replica.viewNo, ppSeqNo] \
.append((preprepare, replica.primaryName))
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] = deque()
for node in txnPoolNodeSet:
replica.preparesWaitingForPrePrepare[replica.viewNo, ppSeqNo] \
.append((prepare, node.name))
replica._setup_last_ordered_for_non_master()
assert replica.last_ordered_3pc == last_ordered_3pc
def _create_prepare_and_preprepare(inst_id, view_no, pp_sq_no, timestamp,
                                   sdk_wallet_client):
    # Parameter names now match the call sites, which pass
    # (inst_id, replica.viewNo, ppSeqNo, timestamp, sdk_wallet_client);
    # previously view_no and pp_sq_no were swapped in the signature.
    time = int(timestamp)
    req_idr = ["random request digest"]
    preprepare = PrePrepare(inst_id,
                            view_no,
                            pp_sq_no,
                            time,
                            req_idr,
                            init_discarded(),
                            "123",
                            1,
                            None,
                            None,
                            0,
                            True)
    prepare = Prepare(inst_id,
                      view_no,
                      pp_sq_no,
                      time,
                      "321",
                      None,
                      None)
    return preprepare, prepare
| 42.514451 | 81 | 0.606934 | 656 | 7,355 | 6.507622 | 0.140244 | 0.073085 | 0.112439 | 0.044507 | 0.813539 | 0.813539 | 0.78379 | 0.744905 | 0.705786 | 0.695948 | 0 | 0.011453 | 0.323317 | 7,355 | 172 | 82 | 42.761628 | 0.846293 | 0 | 0 | 0.737179 | 0 | 0 | 0.003671 | 0 | 0 | 0 | 0 | 0 | 0.032051 | 1 | 0.038462 | false | 0 | 0.070513 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
91999a1b79c3bfb487a3ec2b96e418a96d44aaa7 | 38,441 | py | Python | tools/accuracy_checker/openvino/tools/accuracy_checker/evaluators/custom_evaluators/mtcnn_models.py | Pandinosaurus/open_model_zoo | 2543996541346418919c5cddfb71e33e2cdef080 | [
"Apache-2.0"
] | 1 | 2019-05-31T14:01:42.000Z | 2019-05-31T14:01:42.000Z | tools/accuracy_checker/openvino/tools/accuracy_checker/evaluators/custom_evaluators/mtcnn_models.py | Pandinosaurus/open_model_zoo | 2543996541346418919c5cddfb71e33e2cdef080 | [
"Apache-2.0"
] | null | null | null | tools/accuracy_checker/openvino/tools/accuracy_checker/evaluators/custom_evaluators/mtcnn_models.py | Pandinosaurus/open_model_zoo | 2543996541346418919c5cddfb71e33e2cdef080 | [
"Apache-2.0"
] | null | null | null | """
Copyright (c) 2018-2022 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import pickle # nosec - disable B403:import-pickle check
from collections import OrderedDict
from pathlib import Path
import numpy as np
from ...preprocessor import PreprocessingExecutor
from ...config import ConfigError
from ...utils import contains_any, extract_image_representations, read_pickle, get_path, parse_partial_shape
from .mtcnn_evaluator_utils import cut_roi, calibrate_predictions, nms, transform_for_callback
from ...logging import print_info
from ...launcher import InputFeeder
from .base_models import BaseOpenVINOModel
from ...adapters import create_adapter
def build_stages(models_info, preprocessors_config, launcher, model_args, delayed_model_loading=False):
required_stages = ['pnet']
stages_mapping = OrderedDict([
('pnet', {
'caffe': CaffeProposalStage, 'dlsdk': DLSDKProposalStage,
'dummy': DummyProposalStage, 'openvino': OpenVINOProposalStage}),
('rnet', {'caffe': CaffeRefineStage, 'dlsdk': DLSDKRefineStage,
'openvino': OpenVINORefineStage}),
('onet', {'caffe': CaffeOutputStage, 'dlsdk': DLSDKOutputStage, 'openvino': OpenVINOOutputStage})
])
framework = launcher.config['framework']
common_preprocessor = PreprocessingExecutor(preprocessors_config)
stages = OrderedDict()
for stage_name, stage_classes in stages_mapping.items():
if stage_name not in models_info:
if stage_name not in required_stages:
continue
raise ConfigError('{} required for evaluation'.format(stage_name))
model_config = models_info[stage_name]
if 'predictions' in model_config and not model_config.get('store_predictions', False):
stage_framework = 'dummy'
else:
stage_framework = framework
if not delayed_model_loading:
if not contains_any(model_config, ['model', 'caffe_model']) and stage_framework != 'dummy':
if model_args:
model_config['model'] = model_args[len(stages) if len(model_args) > 1 else 0]
stage = stage_classes.get(stage_framework)
        if stage is None:
raise ConfigError('{} stage does not support {} framework'.format(stage_name, stage_framework))
stage_preprocess = models_info[stage_name].get('preprocessing', [])
model_specific_preprocessor = PreprocessingExecutor(stage_preprocess)
stages[stage_name] = stage(
models_info[stage_name], model_specific_preprocessor, common_preprocessor, launcher, delayed_model_loading
)
if not stages:
raise ConfigError('please provide information about MTCNN pipeline stages')
return stages
class BaseStage:
def __init__(self, model_info, model_specific_preprocessor, common_preprocessor, delayed_model_loading=False):
self.model_info = model_info
self.model_specific_preprocessor = model_specific_preprocessor
self.common_preprocessor = common_preprocessor
self.input_feeder = None
self.store = model_info.get('store_predictions', False)
        self._predictions = []
def predict(self, input_blobs, batch_meta, output_callback=None):
raise NotImplementedError
def preprocess_data(self, batch_input, batch_annotation, previous_stage_prediction, *args, **kwargs):
raise NotImplementedError
def postprocess_result(self, identifiers, this_stage_result, batch_meta, previous_stage_result, *args, **kwargs):
raise NotImplementedError
def release(self):
pass
def reset(self):
self._predictions = []
def dump_predictions(self):
if not hasattr(self, 'prediction_file'):
prediction_file = Path(self.model_info.get('predictions', 'predictions.pickle'))
self.prediction_file = prediction_file
with self.prediction_file.open('wb') as out_file:
pickle.dump(self._predictions, out_file)
def update_preprocessing(self, preprocessor):
self.common_preprocessor = preprocessor
class ProposalBaseStage(BaseStage):
default_model_name = 'mtcnn-p'
default_model_suffix = 'pnet'
def __init__(self, model_info, model_specific_preprocessor, common_preprocessor, delayed_model_loading=False):
super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
self.adapter = None
self.input_feeder = None
self._predictions = []
def preprocess_data(self, batch_input, batch_annotation, *args, **kwargs):
batch_input = self.model_specific_preprocessor.process(batch_input, batch_annotation)
batch_input = self.common_preprocessor.process(batch_input, batch_annotation)
_, batch_meta = extract_image_representations(batch_input)
filled_inputs = self.input_feeder.fill_inputs(batch_input) if self.input_feeder else batch_input
return filled_inputs, batch_meta
def postprocess_result(self, identifiers, this_stage_result, batch_meta, *args, **kwargs):
result = self.adapter.process(this_stage_result, identifiers, batch_meta) if self.adapter else this_stage_result
if self.store:
self._predictions.extend(result)
return result
def predict(self, input_blobs, batch_meta, output_callback=None):
return self._infer(input_blobs, batch_meta)
def dump_predictions(self):
if not hasattr(self, 'prediction_file'):
prediction_file = Path(self.model_info.get('predictions', 'pnet_predictions.pickle'))
self.prediction_file = prediction_file
with self.prediction_file.open('wb') as out_file:
pickle.dump(self._predictions, out_file)
class DummyProposalStage(ProposalBaseStage):
def __init__(self, model_info, model_specific_preprocessor, common_preprocessor, *args, **kwargs):
super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
self._index = 0
if 'predictions' not in self.model_info:
raise ConfigError('predictions_file is not found')
self._predictions = read_pickle(self.model_info['predictions'])
self.iterator = 0
def preprocess_data(self, batch_input, batch_annotation, *args, **kwargs):
_, batch_meta = extract_image_representations(batch_input)
return batch_input, batch_meta
def _infer(self, input_blobs, batch_meta):
batch_size = len(batch_meta)
results = self._predictions[self._index:self._index + batch_size]
self._index += batch_size
return results
def postprocess_result(self, identifiers, this_stage_result, batch_meta, *args, **kwargs):
return this_stage_result
class RefineBaseStage(BaseStage):
input_size = 24
include_boundaries = True
default_model_name = 'mtcnn-r'
def preprocess_data(self, batch_input, batch_annotation, previous_stage_prediction, *args, **kwargs):
batch_input = self.model_specific_preprocessor.process(batch_input, batch_annotation)
batch_input = self.common_preprocessor.process(batch_input, batch_annotation)
_, batch_meta = extract_image_representations(batch_input)
batch_input = [
cut_roi(input_image, prediction, self.input_size, include_bound=self.include_boundaries)
for input_image, prediction in zip(batch_input, previous_stage_prediction)
]
filled_inputs = self.input_feeder.fill_inputs(batch_input) if self.input_feeder else batch_input
return filled_inputs, batch_meta
def postprocess_result(self, identifiers, this_stage_result, batch_meta, previous_stage_result, *args, **kwargs):
result = calibrate_predictions(
previous_stage_result, this_stage_result, 0.7, self.model_info['outputs'], 'Union'
)
if self.store:
self._predictions.extend(result)
return result
def predict(self, input_blobs, batch_meta, output_callback=None):
return self._infer(input_blobs, batch_meta)
def dump_predictions(self):
if not hasattr(self, 'prediction_file'):
prediction_file = Path(self.model_info.get('predictions', 'rnet_predictions.pickle'))
self.prediction_file = prediction_file
with self.prediction_file.open('wb') as out_file:
pickle.dump(self._predictions, out_file)
class OutputBaseStage(RefineBaseStage):
input_size = 48
include_boundaries = False
default_model_name = 'mtcnn-o'
def postprocess_result(self, identifiers, this_stage_result, batch_meta, previous_stage_result, *args, **kwargs):
batch_predictions = calibrate_predictions(
previous_stage_result, this_stage_result, 0.7, self.model_info['outputs']
)
batch_predictions[0], _ = nms(batch_predictions[0], 0.7, 'Min')
if self.store:
self._predictions.extend(batch_predictions)
return batch_predictions
def dump_predictions(self):
if not hasattr(self, 'prediction_file'):
prediction_file = Path(self.model_info.get('predictions', 'onet_predictions.pickle'))
self.prediction_file = prediction_file
with self.prediction_file.open('wb') as out_file:
pickle.dump(self._predictions, out_file)
class CaffeModelMixin:
def _infer(self, input_blobs, batch_meta, *args, **kwargs):
for meta in batch_meta:
meta['input_shape'] = []
results = []
for feed_dict in input_blobs:
for layer_name, data in feed_dict.items():
if data.shape != self.inputs[layer_name]:
self.net.blobs[layer_name].reshape(*data.shape)
for meta in batch_meta:
meta['input_shape'].append(self.inputs)
results.append(self.net.forward(**feed_dict))
return results
@property
def inputs(self):
inputs_map = {}
for input_blob in self.net.inputs:
inputs_map[input_blob] = self.net.blobs[input_blob].data.shape
return inputs_map
def input_shape(self, input_name):
return self.inputs[input_name]
def release(self):
del self.net
def fit_to_input(self, data, layer_name, layout, precision, tmpl=None):
data_shape = np.shape(data)
layer_shape = self.inputs[layer_name]
if len(data_shape) == 5 and len(layer_shape) == 4:
data = data[0]
data_shape = np.shape(data)
data = np.transpose(data, layout) if len(data_shape) == 4 else np.array(data)
if precision:
data = data.astype(precision)
return data
def automatic_model_search(self, network_info):
model = Path(network_info.get('model', ''))
weights = network_info.get('weights')
if model.is_dir():
models_list = list(Path(model).glob('{}.prototxt'.format(self.default_model_name)))
if not models_list:
models_list = list(Path(model).glob('*.prototxt'))
if not models_list:
raise ConfigError('Suitable model description is not detected')
if len(models_list) != 1:
raise ConfigError('Several suitable models found, please specify required model')
model = models_list[0]
if weights is None or Path(weights).is_dir():
weights_dir = weights or model.parent
weights = Path(weights_dir) / model.name.replace('prototxt', 'caffemodel')
if not weights.exists():
weights_list = list(weights_dir.glob('*.caffemodel'))
if not weights_list:
raise ConfigError('Suitable weights is not detected')
if len(weights_list) != 1:
raise ConfigError('Several suitable weights found, please specify required explicitly')
weights = weights_list[0]
weights = Path(weights)
accepted_suffixes = ['.prototxt']
if model.suffix not in accepted_suffixes:
raise ConfigError('Models with following suffixes are allowed: {}'.format(accepted_suffixes))
print_info('{} - Found model: {}'.format(self.default_model_name, model))
accepted_weights_suffixes = ['.caffemodel']
if weights.suffix not in accepted_weights_suffixes:
raise ConfigError('Weights with following suffixes are allowed: {}'.format(accepted_weights_suffixes))
print_info('{} - Found weights: {}'.format(self.default_model_name, weights))
return model, weights
class DLSDKModelMixin:
def _infer(self, input_blobs, batch_meta):
for meta in batch_meta:
meta['input_shape'] = []
results = []
for feed_dict in input_blobs:
input_shapes = {layer_name: data.shape for layer_name, data in feed_dict.items()}
if not self.is_dynamic:
self.reshape_net(input_shapes)
results.append(self.exec_network.infer(feed_dict))
for meta in batch_meta:
meta['input_shape'].append(input_shapes)
return results
@property
def inputs(self):
if self.exec_network:
has_info = hasattr(self.exec_network, 'input_info')
if not has_info:
return self.exec_network.inputs
inputs = OrderedDict()
for name, data in self.exec_network.input_info.items():
if name in self.partial_shapes:
inputs[name] = self.partial_shapes[name]
else:
inputs[name] = data.input_data
return inputs
has_info = hasattr(self.network, 'input_info')
if not has_info:
return self.network.inputs
inputs = OrderedDict()
for name, data in self.network.input_info.items():
if name in self.partial_shapes:
inputs[name] = self.partial_shapes[name]
else:
inputs[name] = data.input_data
return inputs
def input_shape(self, input_name):
return self.inputs[input_name]
def release(self):
self.input_feeder.release()
del self.network
del self.exec_network
self.launcher.release()
def fit_to_input(self, data, layer_name, layout, precision, template=None):
layer_shape = (
tuple(self.inputs[layer_name].shape)
if layer_name not in self.dynamic_inputs else self.partial_shapes[layer_name])
data_shape = np.shape(data)
if len(layer_shape) == 4:
if len(data_shape) == 5:
data = data[0]
data = np.transpose(data, layout)
if precision:
data = data.astype(precision)
return data
def prepare_model(self):
model, weights = self.auto_model_search(self.model_info)
return model, weights
def auto_model_search(self, network_info):
model = Path(network_info.get('model', ''))
weights = network_info.get('weights')
if model.is_dir():
models_list = list(Path(model).glob('{}.xml'.format(self.default_model_name)))
if not models_list:
models_list = list(Path(model).glob('*.xml'))
if not models_list:
raise ConfigError('Suitable model description is not detected')
if len(models_list) != 1:
raise ConfigError('Several suitable models found, please specify required model')
model = models_list[0]
if weights is None or Path(weights).is_dir():
weights_dir = weights or model.parent
weights = Path(weights_dir) / model.name.replace('xml', 'bin')
if not weights.exists():
weights_list = list(weights_dir.glob('*.bin'))
if not weights_list:
raise ConfigError('Suitable weights is not detected')
if len(weights_list) != 1:
raise ConfigError('Several suitable weights found, please specify required explicitly')
weights = weights_list[0]
weights = get_path(weights)
accepted_suffixes = ['.blob', '.xml']
if model.suffix not in accepted_suffixes:
raise ConfigError('Models with following suffixes are allowed: {}'.format(accepted_suffixes))
print_info('{} - Found model: {}'.format(self.default_model_name, model))
accepted_weights_suffixes = ['.bin']
if weights.suffix not in accepted_weights_suffixes:
raise ConfigError('Weights with following suffixes are allowed: {}'.format(accepted_weights_suffixes))
print_info('{} - Found weights: {}'.format(self.default_model_name, weights))
return model, weights
def load_network(self, network, launcher, model_prefix):
self.network = network
self.dynamic_inputs, self.partial_shapes = launcher.get_dynamic_inputs(self.network)
if self.dynamic_inputs and launcher.dynamic_shapes_policy in ['dynamic', 'default']:
try:
self.exec_network = launcher.ie_core.load_network(self.network, launcher.device)
self.is_dynamic = True
except RuntimeError as e:
if launcher.dynamic_shapes_policy == 'dynamic':
raise e
self.is_dynamic = False
self.exec_network = None
if not self.dynamic_inputs:
self.exec_network = launcher.ie_core.load_network(self.network, launcher.device)
self.update_input_output_info(model_prefix)
self.input_feeder = InputFeeder(
self.model_info.get('inputs', []), self.inputs, self.input_shape, self.fit_to_input)
def reshape_net(self, shape):
if self.is_dynamic:
return
if hasattr(self, 'exec_network') and self.exec_network is not None:
del self.exec_network
self.network.reshape(shape)
self.dynamic_inputs, self.partial_shapes = self.launcher.get_dynamic_inputs(self.network)
if not self.is_dynamic and self.dynamic_inputs:
return
self.exec_network = self.launcher.ie_core.load_network(self.network, self.launcher.device)
def load_model(self, network_info, launcher, model_prefix=None, log=False):
self.network = launcher.read_network(str(network_info['model']), str(network_info['weights']))
self.load_network(self.network, launcher, model_prefix)
if log:
self.print_input_output_info()
def print_input_output_info(self):
print_info('{} - Input info:'.format(self.default_model_name))
has_info = hasattr(self.network if self.network is not None else self.exec_network, 'input_info')
if self.network:
if has_info:
network_inputs = OrderedDict(
[(name, data.input_data) for name, data in self.network.input_info.items()]
)
else:
network_inputs = self.network.inputs
network_outputs = self.network.outputs
else:
if has_info:
network_inputs = OrderedDict([
(name, data.input_data) for name, data in self.exec_network.input_info.items()
])
else:
network_inputs = self.exec_network.inputs
network_outputs = self.exec_network.outputs
for name, input_info in network_inputs.items():
print_info('\tLayer name: {}'.format(name))
print_info('\tprecision: {}'.format(input_info.precision))
print_info('\tshape {}\n'.format(
input_info.shape if name not in self.partial_shapes else self.partial_shapes[name]))
print_info('{} - Output info'.format(self.default_model_name))
for name, output_info in network_outputs.items():
print_info('\tLayer name: {}'.format(name))
print_info('\tprecision: {}'.format(output_info.precision))
print_info('\tshape: {}\n'.format(
output_info.shape if name not in self.partial_shapes else self.partial_shapes[name]))
def update_input_output_info(self, model_prefix):
def generate_name(prefix, with_prefix, layer_name):
return prefix + layer_name if with_prefix else layer_name.split(prefix)[-1]
if model_prefix is None:
return
config_inputs = self.model_info.get('inputs', [])
network_with_prefix = next(iter(self.inputs)).startswith(model_prefix)
if config_inputs:
config_with_prefix = config_inputs[0]['name'].startswith(model_prefix)
if config_with_prefix == network_with_prefix:
return
for c_input in config_inputs:
c_input['name'] = generate_name(model_prefix, network_with_prefix, c_input['name'])
self.model_info['inputs'] = config_inputs
config_outputs = self.model_info['outputs']
for key, value in config_outputs.items():
config_with_prefix = value.startswith(model_prefix)
if config_with_prefix != network_with_prefix:
config_outputs[key] = generate_name(model_prefix, network_with_prefix, value)
self.model_info['outputs'] = config_outputs
class OVModelMixin(BaseOpenVINOModel):
def _infer(self, input_blobs, batch_meta):
for meta in batch_meta:
meta['input_shape'] = []
results = []
for feed_dict in input_blobs:
input_shapes = {layer_name: data.shape for layer_name, data in feed_dict.items()}
if not self.is_dynamic:
self.reshape_net(input_shapes)
results.append(self.infer(feed_dict))
for meta in batch_meta:
meta['input_shape'].append(input_shapes)
return results
def predict(self, identifiers, input_data):
raise NotImplementedError
def input_shape(self, input_name):
return parse_partial_shape(self.inputs[input_name].get_partial_shape())
def release(self):
self.input_feeder.release()
del self.network
del self.exec_network
self.launcher.release()
def fit_to_input(self, data, layer_name, layout, precision, template=None):
layer_shape = (
tuple(self.inputs[layer_name].shape)
if layer_name not in self.dynamic_inputs else self.partial_shapes[layer_name])
data_shape = np.shape(data)
if len(layer_shape) == 4:
if len(data_shape) == 5:
data = data[0]
data = np.transpose(data, layout)
if precision:
data = data.astype(precision)
return data
def prepare_model(self):
model, weights = self.auto_model_search(self.model_info)
return model, weights
def auto_model_search(self, network_info):
model = Path(network_info.get('model', ''))
weights = network_info.get('weights')
if model.is_dir():
models_list = list(Path(model).glob('{}.xml'.format(self.default_model_name)))
if not models_list:
models_list = list(Path(model).glob('*.xml'))
if not models_list:
raise ConfigError('Suitable model description is not detected')
if len(models_list) != 1:
raise ConfigError('Several suitable models found, please specify required model')
model = models_list[0]
if weights is None or Path(weights).is_dir():
weights_dir = weights or model.parent
weights = Path(weights_dir) / model.name.replace('xml', 'bin')
if not weights.exists():
weights_list = list(weights_dir.glob('*.bin'))
if not weights_list:
raise ConfigError('Suitable weights is not detected')
if len(weights_list) != 1:
raise ConfigError('Several suitable weights found, please specify required explicitly')
weights = weights_list[0]
weights = get_path(weights)
accepted_suffixes = ['.blob', '.xml']
if model.suffix not in accepted_suffixes:
raise ConfigError('Models with following suffixes are allowed: {}'.format(accepted_suffixes))
print_info('{} - Found model: {}'.format(self.default_model_name, model))
accepted_weights_suffixes = ['.bin']
if weights.suffix not in accepted_weights_suffixes:
raise ConfigError('Weights with following suffixes are allowed: {}'.format(accepted_weights_suffixes))
print_info('{} - Found weights: {}'.format(self.default_model_name, weights))
return model, weights
def load_network(self, network, launcher, model_prefix):
self.network = network
self.dynamic_inputs, self.partial_shapes = launcher.get_dynamic_inputs(self.network)
if self.dynamic_inputs and launcher.dynamic_shapes_policy in ['dynamic', 'default']:
try:
self.exec_network = launcher.ie_core.compile_model(self.network, launcher.device)
self.is_dynamic = True
except RuntimeError as e:
if launcher.dynamic_shapes_policy == 'dynamic':
raise e
self.is_dynamic = False
self.exec_network = None
if not self.dynamic_inputs:
self.exec_network = launcher.ie_core.compile_model(self.network, launcher.device)
self.update_input_output_info(model_prefix)
self.input_feeder = InputFeeder(
self.model_info.get('inputs', []), self.inputs, self.input_shape, self.fit_to_input)
self.infer_request = None
def reshape_net(self, shape):
if self.is_dynamic:
return
if hasattr(self, 'exec_network') and self.exec_network is not None:
del self.exec_network
self.launcher.reshape_network(self.network, shape)
self.dynamic_inputs, self.partial_shapes = self.launcher.get_dynamic_inputs(self.network)
if not self.is_dynamic and self.dynamic_inputs:
return
self.exec_network = self.launcher.ie_core.compile_model(self.network, self.launcher.device)
self.infer_request = None
def load_model(self, network_info, launcher, model_prefix=None, log=False):
self.network = launcher.read_network(str(network_info['model']), str(network_info['weights']))
self.load_network(self.network, launcher, model_prefix)
if log:
self.print_input_output_info()
self.infer_request = None
def update_input_output_info(self, model_prefix):
def generate_name(prefix, with_prefix, layer_name):
return prefix + layer_name if with_prefix else layer_name.split(prefix)[-1]
if model_prefix is None:
return
config_inputs = self.model_info.get('inputs', [])
network_with_prefix = next(iter(self.inputs)).startswith(model_prefix)
if config_inputs:
config_with_prefix = config_inputs[0]['name'].startswith(model_prefix)
if config_with_prefix == network_with_prefix:
return
for c_input in config_inputs:
c_input['name'] = generate_name(model_prefix, network_with_prefix, c_input['name'])
self.model_info['inputs'] = config_inputs
config_outputs = self.model_info['outputs']
for key, value in config_outputs.items():
config_with_prefix = value.startswith(model_prefix)
if config_with_prefix != network_with_prefix:
config_outputs[key] = generate_name(model_prefix, network_with_prefix, value)
self.model_info['outputs'] = config_outputs
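The renaming rule in `update_input_output_info` either prepends the stage prefix or strips it, depending on whether the loaded network's tensor names already carry it. In isolation (layer names here are hypothetical examples):

```python
def generate_name(prefix, with_prefix, layer_name):
    # Prepend the prefix when the network expects prefixed names,
    # otherwise strip everything up to (and including) the prefix.
    return prefix + layer_name if with_prefix else layer_name.split(prefix)[-1]


print(generate_name('pnet_', True, 'conv1'))        # pnet_conv1
print(generate_name('pnet_', False, 'pnet_conv1'))  # conv1
```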

class CaffeProposalStage(CaffeModelMixin, ProposalBaseStage):
    def __init__(self, model_info, model_specific_preprocessor, common_preprocessor, launcher, *args, **kwargs):
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.net = launcher.create_network(self.model_info['model'], self.model_info['weights'])
        self.input_feeder = InputFeeder(model_info.get('inputs', []), self.inputs, self.input_shape, self.fit_to_input)
        pnet_outs = model_info['outputs']
        pnet_adapter_config = launcher.config.get('adapter', {'type': 'mtcnn_p', **pnet_outs})
        pnet_adapter_config.update({'regions_format': 'hw'})
        self.adapter = create_adapter(pnet_adapter_config)


class CaffeRefineStage(CaffeModelMixin, RefineBaseStage):
    def __init__(self, model_info, model_specific_preprocessor, common_preprocessor, launcher, *args, **kwargs):
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.net = launcher.create_network(self.model_info['model'], self.model_info['weights'])
        self.input_feeder = InputFeeder(model_info.get('inputs', []), self.inputs, self.input_shape, self.fit_to_input)


class CaffeOutputStage(CaffeModelMixin, OutputBaseStage):
    def __init__(self, model_info, model_specific_preprocessor, common_preprocessor, launcher):
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.net = launcher.create_network(self.model_info['model'], self.model_info['weights'])
        self.input_feeder = InputFeeder(model_info.get('inputs', []), self.inputs, self.input_shape, self.fit_to_input)


class OpenVINOProposalStage(ProposalBaseStage, OVModelMixin):
    def __init__(
            self, model_info, model_specific_preprocessor, common_preprocessor, launcher, delayed_model_loading=False
    ):
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.adapter = None
        self.is_dynamic = False
        if not delayed_model_loading:
            model_xml, model_bin = self.prepare_model()
            self.load_model({'model': model_xml, 'weights': model_bin}, launcher, 'pnet_', log=True)
            pnet_outs = model_info['outputs']
            pnet_adapter_config = launcher.config.get('adapter', {'type': 'mtcnn_p', **pnet_outs})
            # pnet_adapter_config.update({'regions_format': 'hw'})
            self.adapter = create_adapter(pnet_adapter_config)

    def load_network(self, network, launcher, model_prefix):
        self.network = network
        self.dynamic_inputs, self.partial_shapes = launcher.get_dynamic_inputs(self.network)
        if self.dynamic_inputs and launcher.dynamic_shapes_policy in ['dynamic', 'default']:
            try:
                self.exec_network = launcher.ie_core.compile_model(self.network, launcher.device)
                self.is_dynamic = True
            except RuntimeError as e:
                if launcher.dynamic_shapes_policy == 'dynamic':
                    raise e
                self.is_dynamic = False
                self.exec_network = None
        if not self.dynamic_inputs:
            self.exec_network = launcher.ie_core.compile_model(self.network, launcher.device)
        self.update_input_output_info(model_prefix)
        self.input_feeder = InputFeeder(
            self.model_info.get('inputs', []), self.inputs, self.input_shape, self.fit_to_input)
        pnet_outs = self.model_info['outputs']
        pnet_adapter_config = launcher.config.get('adapter', {'type': 'mtcnn_p', **pnet_outs})
        self.adapter = create_adapter(pnet_adapter_config)

    def load_model(self, network_info, launcher, model_prefix=None, log=False):
        self.network = launcher.read_network(str(network_info['model']), str(network_info['weights']))
        self.load_network(self.network, launcher, model_prefix)
        self.launcher = launcher
        if log:
            self.print_input_output_info()

    def predict(self, input_blobs, batch_meta, output_callback=None):
        raw_outputs = self._infer(input_blobs, batch_meta)
        if output_callback:
            for out in raw_outputs:
                output_callback(out)
        return raw_outputs


class DLSDKProposalStage(DLSDKModelMixin, ProposalBaseStage):
    def __init__(
            self, model_info, model_specific_preprocessor, common_preprocessor, launcher, delayed_model_loading=False
    ):
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.adapter = None
        self.is_dynamic = False
        if not delayed_model_loading:
            model_xml, model_bin = self.prepare_model()
            self.load_model({'model': model_xml, 'weights': model_bin}, launcher, 'pnet_', log=True)
            pnet_outs = model_info['outputs']
            pnet_adapter_config = launcher.config.get('adapter', {'type': 'mtcnn_p', **pnet_outs})
            # pnet_adapter_config.update({'regions_format': 'hw'})
            self.adapter = create_adapter(pnet_adapter_config)

    def load_network(self, network, launcher, model_prefix):
        self.network = network
        self.dynamic_inputs, self.partial_shapes = launcher.get_dynamic_inputs(self.network)
        if self.dynamic_inputs and launcher.dynamic_shapes_policy in ['dynamic', 'default']:
            try:
                self.exec_network = launcher.ie_core.load_network(self.network, launcher.device)
                self.is_dynamic = True
            except RuntimeError as e:
                if launcher.dynamic_shapes_policy == 'dynamic':
                    raise e
                self.is_dynamic = False
                self.exec_network = None
        if not self.dynamic_inputs:
            self.exec_network = launcher.ie_core.load_network(self.network, launcher.device)
        self.update_input_output_info(model_prefix)
        self.input_feeder = InputFeeder(
            self.model_info.get('inputs', []), self.inputs, self.input_shape, self.fit_to_input)
        pnet_outs = self.model_info['outputs']
        pnet_adapter_config = launcher.config.get('adapter', {'type': 'mtcnn_p', **pnet_outs})
        self.adapter = create_adapter(pnet_adapter_config)

    def load_model(self, network_info, launcher, model_prefix=None, log=False):
        self.network = launcher.read_network(str(network_info['model']), str(network_info['weights']))
        self.load_network(self.network, launcher, model_prefix)
        self.launcher = launcher
        if log:
            self.print_input_output_info()

    def predict(self, input_blobs, batch_meta, output_callback=None):
        raw_outputs = self._infer(input_blobs, batch_meta)
        if output_callback:
            for out in raw_outputs:
                output_callback(out)
        return raw_outputs


class OpenVINORefineStage(RefineBaseStage, OVModelMixin):
    def __init__(
            self, model_info, model_specific_preprocessor, common_preprocessor, launcher, delayed_model_loading=False
    ):
        self.default_model_suffix = 'rnet'
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.is_dynamic = False
        self.launcher = launcher
        if not delayed_model_loading:
            model_xml, model_bin = self.prepare_model()
            self.load_model({'model': model_xml, 'weights': model_bin}, launcher, 'rnet_', log=True)

    def predict(self, input_blobs, batch_meta, output_callback=None):
        raw_outputs = self._infer(input_blobs, batch_meta)
        if output_callback:
            batch_size = np.shape(next(iter(input_blobs[0].values())))[0]
            output_callback(transform_for_callback(batch_size, raw_outputs))
        return raw_outputs


class DLSDKRefineStage(DLSDKModelMixin, RefineBaseStage):
    def __init__(
            self, model_info, model_specific_preprocessor, common_preprocessor, launcher, delayed_model_loading=False
    ):
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.is_dynamic = False
        self.launcher = launcher
        if not delayed_model_loading:
            model_xml, model_bin = self.prepare_model()
            self.load_model({'model': model_xml, 'weights': model_bin}, launcher, 'rnet_', log=True)

    def predict(self, input_blobs, batch_meta, output_callback=None):
        raw_outputs = self._infer(input_blobs, batch_meta)
        if output_callback:
            batch_size = np.shape(next(iter(input_blobs[0].values())))[0]
            output_callback(transform_for_callback(batch_size, raw_outputs))
        return raw_outputs


class DLSDKOutputStage(DLSDKModelMixin, OutputBaseStage):
    def __init__(
            self, model_info, model_specific_preprocessor, common_preprocessor, launcher, delayed_model_loading=False
    ):
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.is_dynamic = False
        self.launcher = launcher
        if not delayed_model_loading:
            model_xml, model_bin = self.prepare_model()
            self.load_model({'model': model_xml, 'weights': model_bin}, launcher, 'onet_', log=True)

    def predict(self, input_blobs, batch_meta, output_callback=None):
        raw_outputs = self._infer(input_blobs, batch_meta)
        return raw_outputs


class OpenVINOOutputStage(OutputBaseStage, OVModelMixin):
    def __init__(
            self, model_info, model_specific_preprocessor, common_preprocessor, launcher, delayed_model_loading=False
    ):
        self.default_model_suffix = 'onet'
        super().__init__(model_info, model_specific_preprocessor, common_preprocessor)
        self.is_dynamic = False
        self.launcher = launcher
        if not delayed_model_loading:
            model_xml, model_bin = self.prepare_model()
            self.load_model({'model': model_xml, 'weights': model_bin}, launcher, 'onet_', log=True)

    def predict(self, input_blobs, batch_meta, output_callback=None):
        raw_outputs = self._infer(input_blobs, batch_meta)
        return raw_outputs


# --- MYPETS/tienda/models.py (repo: sebastiansalazar123/Frameworks_8A, license: MIT) ---
from django.db import models
from django.db.models.base import Model


class country(models.Model):
    code = models.CharField(max_length=10)
    name = models.CharField(max_length=150)
    abrev = models.CharField(max_length=4)
    status = models.BooleanField()
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()


class city(models.Model):
    code = models.CharField(max_length=10)
    name = models.CharField(max_length=150)
    abrev = models.CharField(max_length=4)
    id_country = models.ForeignKey(country, on_delete=models.CASCADE)
    status = models.BooleanField()
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()


class identification_type(models.Model):
    type = models.CharField(max_length=150)
    abrev = models.CharField(max_length=4)
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()


class user(models.Model):
    first_name = models.CharField(max_length=200)
    last_name = models.CharField(max_length=200)
    id_identification_type = models.ForeignKey(identification_type, on_delete=models.CASCADE)
    number_id = models.CharField(max_length=15)
    id_city = models.ForeignKey(city, on_delete=models.CASCADE)
    email = models.CharField(max_length=200)
    password = models.CharField(max_length=200)
    status = models.BooleanField()
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()


class session(models.Model):
    id_user = models.ForeignKey(user, on_delete=models.CASCADE)
    ip = models.CharField(max_length=200)
    status = models.BooleanField()
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()


class race(models.Model):
    code = models.CharField(max_length=10)
    name = models.CharField(max_length=150)
    abrev = models.CharField(max_length=4)
    status = models.BooleanField()
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()


class type(models.Model):
    code = models.CharField(max_length=10)
    name = models.CharField(max_length=150)
    abrev = models.CharField(max_length=4)
    status = models.BooleanField()
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()


class pet(models.Model):
    code = models.CharField(max_length=10)
    name = models.CharField(max_length=150)
    status = models.BooleanField()
    id_user = models.ForeignKey(user, on_delete=models.CASCADE)
    id_type = models.ForeignKey(type, on_delete=models.CASCADE)
    id_race = models.ForeignKey(race, on_delete=models.CASCADE)
    created_at = models.DateTimeField()
    update_at = models.DateTimeField()
    delete_at = models.DateTimeField()
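The `pet` model above links to `user`, `type`, and `race` through foreign keys. A plain-dataclass sketch of those relations (illustration only, with made-up field values; the real schema is the Django models, which need a configured Django project to run):

```python
from dataclasses import dataclass


@dataclass
class User:
    first_name: str
    last_name: str


@dataclass
class Race:
    code: str
    name: str


@dataclass
class Pet:
    code: str
    name: str
    owner: User  # stands in for the id_user ForeignKey
    race: Race   # stands in for the id_race ForeignKey


owner = User('Ana', 'Gomez')
pet = Pet('P001', 'Firulais', owner, Race('R01', 'Labrador'))
print(pet.owner.first_name)  # Ana
```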


# --- direct/functionals/__init__.py (repo: directgroup/direct, license: Apache-2.0) ---
# coding=utf-8
# Copyright (c) DIRECT Contributors
from typing import Tuple
from direct.functionals.challenges import *
from direct.functionals.psnr import *
from direct.functionals.ssim import *


# --- basicsr/utils/lmdb.py (repo: hanseungwook/BasicSR, licenses: Apache-2.0, MIT) ---
import cv2
import lmdb
import mmcv
import torch
from torch.autograd import Variable
import sys
from multiprocessing import Pool
from os import path as osp
import pywt
from .util import ProgressBar
from basicsr.data.util import duf_downsample
def make_lmdb_from_imgs(data_path,
                        lmdb_path,
                        img_path_list,
                        keys,
                        batch=5000,
                        compress_level=1,
                        multiprocessing_read=False,
                        n_thread=40,
                        map_size=None):
    """Make lmdb from images.

    Contents of lmdb. The file structure is:

    example.lmdb
    ├── data.mdb
    ├── lock.mdb
    ├── meta_info.txt

    The data.mdb and lock.mdb are standard lmdb files and you can refer to
    https://lmdb.readthedocs.io/en/release/ for more details.

    The meta_info.txt is a specified txt file to record the meta information
    of our datasets. It will be automatically created when preparing
    datasets by our provided dataset tools.
    Each line in the txt file records 1)image name (with extension),
    2)image shape, and 3)compression level, separated by a white space.
    For example, the meta information could be:
    `000_00000000.png (720,1280,3) 1`, which means:
    1) image name (with extension): 000_00000000.png;
    2) image shape: (720,1280,3);
    3) compression level: 1

    We use the image name without extension as the lmdb key.

    If `multiprocessing_read` is True, it will read all the images to memory
    using multiprocessing. Thus, your server needs to have enough memory.

    Args:
        data_path (str): Data path for reading images.
        lmdb_path (str): Lmdb save path.
        img_path_list (str): Image path list.
        keys (str): Used for lmdb keys.
        batch (int): After processing batch images, lmdb commits.
            Default: 5000.
        compress_level (int): Compress level when encoding images. Default: 1.
        multiprocessing_read (bool): Whether use multiprocessing to read all
            the images to memory. Default: False.
        n_thread (int): For multiprocessing.
        map_size (int | None): Map size for lmdb env. If None, use the
            estimated size from images. Default: None
    """
    assert len(img_path_list) == len(keys), (
        'img_path_list and keys should have the same length, '
        f'but got {len(img_path_list)} and {len(keys)}')
    print(f'Create lmdb for {data_path}, save to {lmdb_path}...')
    print(f'Total images: {len(img_path_list)}')
    if not lmdb_path.endswith('.lmdb'):
        raise ValueError("lmdb_path must end with '.lmdb'.")
    if osp.exists(lmdb_path):
        print(f'Folder {lmdb_path} already exists. Exit.')
        sys.exit(1)

    if multiprocessing_read:
        # read all the images to memory (multiprocessing)
        dataset = {}  # use dict to keep the order for multiprocessing
        shapes = {}
        print(f'Read images with multiprocessing, #thread: {n_thread} ...')
        pbar = ProgressBar(len(img_path_list))

        def callback(arg):
            """get the image data and update pbar."""
            key, dataset[key], shapes[key] = arg
            pbar.update('Reading {}'.format(key))

        pool = Pool(n_thread)
        for path, key in zip(img_path_list, keys):
            pool.apply_async(
                read_img_worker,
                args=(osp.join(data_path, path), key, compress_level),
                callback=callback)
        pool.close()
        pool.join()
        print(f'Finish reading {len(img_path_list)} images.')

    # create lmdb environment
    if map_size is None:
        # obtain data size for one image
        img = mmcv.imread(
            osp.join(data_path, img_path_list[0]), flag='unchanged')
        _, img_byte = cv2.imencode(
            '.png', img, [cv2.IMWRITE_PNG_COMPRESSION, compress_level])
        data_size_per_img = img_byte.nbytes
        print('Data size per image is: ', data_size_per_img)
        data_size = data_size_per_img * len(img_path_list)
        map_size = data_size * 10
    env = lmdb.open(lmdb_path, map_size=map_size)

    # write data to lmdb
    pbar = ProgressBar(len(img_path_list))
    txn = env.begin(write=True)
    txt_file = open(osp.join(lmdb_path, 'meta_info.txt'), 'w')
    for idx, (path, key) in enumerate(zip(img_path_list, keys)):
        pbar.update(f'Write {key}')
        key_byte = key.encode('ascii')
        if multiprocessing_read:
            img_byte = dataset[key]
            h, w, c = shapes[key]
        else:
            _, img_byte, img_shape = read_img_worker(
                osp.join(data_path, path), key, compress_level)
            h, w, c = img_shape
        txn.put(key_byte, img_byte)
        # write meta information
        txt_file.write(f'{key}.png ({h},{w},{c}) {compress_level}\n')
        if idx % batch == 0:
            txn.commit()
            txn = env.begin(write=True)
    txn.commit()
    env.close()
    txt_file.close()
    print('\nFinish writing lmdb.')
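Each lmdb record is accompanied by one `meta_info.txt` line in the `name.png (h,w,c) compress_level` format described in the docstring above. A round-trip sketch of that format (helper names are made up for illustration):

```python
def format_meta(key, shape, compress_level):
    # One meta_info.txt line: "<key>.png (h,w,c) <compress_level>"
    h, w, c = shape
    return f'{key}.png ({h},{w},{c}) {compress_level}'


def parse_meta(line):
    # Inverse of format_meta, as a dataset loader would read it back.
    name, shape, level = line.split()
    h, w, c = map(int, shape.strip('()').split(','))
    return name, (h, w, c), int(level)


line = format_meta('000_00000000', (720, 1280, 3), 1)
print(line)              # 000_00000000.png (720,1280,3) 1
print(parse_meta(line))  # ('000_00000000.png', (720, 1280, 3), 1)
```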


def make_lr_lmdb_from_imgs(data_path,
                           lmdb_path,
                           img_path_list,
                           keys,
                           batch=5000,
                           compress_level=1,
                           multiprocessing_read=False,
                           n_thread=40,
                           map_size=None):
    """Make lmdb from images.

    Contents of lmdb. The file structure is:

    example.lmdb
    ├── data.mdb
    ├── lock.mdb
    ├── meta_info.txt

    The data.mdb and lock.mdb are standard lmdb files and you can refer to
    https://lmdb.readthedocs.io/en/release/ for more details.

    The meta_info.txt is a specified txt file to record the meta information
    of our datasets. It will be automatically created when preparing
    datasets by our provided dataset tools.
    Each line in the txt file records 1)image name (with extension),
    2)image shape, and 3)compression level, separated by a white space.
    For example, the meta information could be:
    `000_00000000.png (720,1280,3) 1`, which means:
    1) image name (with extension): 000_00000000.png;
    2) image shape: (720,1280,3);
    3) compression level: 1

    We use the image name without extension as the lmdb key.

    If `multiprocessing_read` is True, it will read all the images to memory
    using multiprocessing. Thus, your server needs to have enough memory.

    Args:
        data_path (str): Data path for reading images.
        lmdb_path (str): Lmdb save path.
        img_path_list (str): Image path list.
        keys (str): Used for lmdb keys.
        batch (int): After processing batch images, lmdb commits.
            Default: 5000.
        compress_level (int): Compress level when encoding images. Default: 1.
        multiprocessing_read (bool): Whether use multiprocessing to read all
            the images to memory. Default: False.
        n_thread (int): For multiprocessing.
        map_size (int | None): Map size for lmdb env. If None, use the
            estimated size from images. Default: None
    """
    assert len(img_path_list) == len(keys), (
        'img_path_list and keys should have the same length, '
        f'but got {len(img_path_list)} and {len(keys)}')
    print(f'Create lmdb for {data_path}, save to {lmdb_path}...')
    print(f'Total images: {len(img_path_list)}')
    if not lmdb_path.endswith('.lmdb'):
        raise ValueError("lmdb_path must end with '.lmdb'.")
    if osp.exists(lmdb_path):
        print(f'Folder {lmdb_path} already exists. Exit.')
        sys.exit(1)

    if multiprocessing_read:
        # read all the images to memory (multiprocessing)
        dataset = {}  # use dict to keep the order for multiprocessing
        shapes = {}
        print(f'Read images with multiprocessing, #thread: {n_thread} ...')
        pbar = ProgressBar(len(img_path_list))

        def callback(arg):
            """get the image data and update pbar."""
            key, dataset[key], shapes[key] = arg
            pbar.update('Reading {}'.format(key))

        pool = Pool(n_thread)
        for path, key in zip(img_path_list, keys):
            pool.apply_async(
                read_img_worker,
                args=(osp.join(data_path, path), key, compress_level),
                callback=callback)
        pool.close()
        pool.join()
        print(f'Finish reading {len(img_path_list)} images.')

    # create lmdb environment
    if map_size is None:
        # obtain data size for one image
        img = mmcv.imread(
            osp.join(data_path, img_path_list[0]), flag='unchanged')
        _, img_byte = cv2.imencode(
            '.png', img, [cv2.IMWRITE_PNG_COMPRESSION, compress_level])
        data_size_per_img = img_byte.nbytes
        print('Data size per image is: ', data_size_per_img)
        data_size = data_size_per_img * len(img_path_list)
        map_size = data_size * 10
    env = lmdb.open(lmdb_path, map_size=map_size)

    # write data to lmdb
    pbar = ProgressBar(len(img_path_list))
    txn = env.begin(write=True)
    txt_file = open(osp.join(lmdb_path, 'meta_info.txt'), 'w')
    for idx, (path, key) in enumerate(zip(img_path_list, keys)):
        pbar.update(f'Write {key}')
        key_byte = key.encode('ascii')
        if multiprocessing_read:
            img_byte = dataset[key]
            h, w, c = shapes[key]
        else:
            _, img_byte, img_shape = read_img_worker(
                osp.join(data_path, path), key, compress_level, lr=True)
            h, w, c = img_shape
        txn.put(key_byte, img_byte)
        # write meta information
        txt_file.write(f'{key}.png ({h},{w},{c}) {compress_level}\n')
        if idx % batch == 0:
            txn.commit()
            txn = env.begin(write=True)
    txn.commit()
    env.close()
    txt_file.close()
    print('\nFinish writing lmdb.')


# WT util functions
def create_filters(device, wt_fn='bior2.2'):
    w = pywt.Wavelet(wt_fn)
    dec_hi = torch.Tensor(w.dec_hi[::-1]).to(device)
    dec_lo = torch.Tensor(w.dec_lo[::-1]).to(device)
    filters = torch.stack([dec_lo.unsqueeze(0) * dec_lo.unsqueeze(1),
                           dec_lo.unsqueeze(0) * dec_hi.unsqueeze(1),
                           dec_hi.unsqueeze(0) * dec_lo.unsqueeze(1),
                           dec_hi.unsqueeze(0) * dec_hi.unsqueeze(1)], dim=0)
    return filters
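`create_filters` stacks four 2-D kernels built as outer products of the 1-D low-pass and high-pass decomposition taps. The same construction with plain lists, using a toy unnormalized Haar pair instead of pywt's reversed bior2.2 coefficients (so the numbers stay exact):

```python
def outer(col, row):
    # 2-D kernel with k[i][j] = col[i] * row[j], as in create_filters.
    return [[c * r for r in row] for c in col]


# Toy unnormalized Haar pair standing in for pywt's reversed dec_lo/dec_hi.
dec_lo = [1, 1]
dec_hi = [1, -1]

# Same stacking order as create_filters: LL, then vertical/horizontal/diagonal detail.
kernels = [
    outer(dec_lo, dec_lo),  # LL: smooth in both directions
    outer(dec_hi, dec_lo),  # vertical detail
    outer(dec_lo, dec_hi),  # horizontal detail
    outer(dec_hi, dec_hi),  # diagonal detail
]
print(kernels[0])  # [[1, 1], [1, 1]]
print(kernels[3])  # [[1, -1], [-1, 1]]
```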


def wt(vimg, filters, levels=1):
    bs = vimg.shape[0]
    h = vimg.size(2)
    w = vimg.size(3)
    vimg = vimg.reshape(-1, 1, h, w)
    padded = torch.nn.functional.pad(vimg, (2, 2, 2, 2))
    res = torch.nn.functional.conv2d(padded, Variable(filters[:, None]), stride=2)
    if levels > 1:
        res[:, :1] = wt(res[:, :1], filters, levels - 1)
        res[:, :1, 32:, :] = res[:, :1, 32:, :] * 1.
        res[:, :1, :, 32:] = res[:, :1, :, 32:] * 1.
        res[:, 1:] = res[:, 1:] * 1.
    res = res.view(-1, 2, h // 2, w // 2).transpose(1, 2).contiguous().view(-1, 1, h, w)
    return res.reshape(bs, -1, h, w)
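`wt` applies the stacked filters as a stride-2 convolution, halving spatial resolution into four subbands per level. The essence of one analysis level can be sketched without torch, using plain averaging/difference (Haar) filters — a simplification, since the real code uses bior2.2 taps and zero padding of 2:

```python
def haar2d_level(img):
    """One Haar analysis level on a 2-D list-of-lists image.

    Returns the LL, LH, HL, HH subbands, each half the input resolution.
    """
    h, w = len(img), len(img[0])
    ll = [[0.0] * (w // 2) for _ in range(h // 2)]
    lh = [[0.0] * (w // 2) for _ in range(h // 2)]
    hl = [[0.0] * (w // 2) for _ in range(h // 2)]
    hh = [[0.0] * (w // 2) for _ in range(h // 2)]
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            ll[i // 2][j // 2] = (a + b + c + d) / 4  # smooth
            lh[i // 2][j // 2] = (a + b - c - d) / 4  # vertical detail
            hl[i // 2][j // 2] = (a - b + c - d) / 4  # horizontal detail
            hh[i // 2][j // 2] = (a - b - c + d) / 4  # diagonal detail
    return ll, lh, hl, hh


# On a constant image every detail band is zero and LL keeps the value.
flat = [[5.0] * 4 for _ in range(4)]
ll, lh, hl, hh = haar2d_level(flat)
print(ll)  # [[5.0, 5.0], [5.0, 5.0]]
print(hh)  # [[0.0, 0.0], [0.0, 0.0]]
```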
def make_wt_lmdb_from_imgs(data_path,
lmdb_path,
img_path_list,
keys,
batch=5000,
compress_level=1,
multiprocessing_read=False,
n_thread=40,
map_size=None):
"""Make lmdb from images.
Contents of lmdb. The file structure is:
example.lmdb
├── data.mdb
├── lock.mdb
├── meta_info.txt
The data.mdb and lock.mdb are standard lmdb files and you can refer to
https://lmdb.readthedocs.io/en/release/ for more details.
The meta_info.txt is a specified txt file to record the meta information
of our datasets. It will be automatically created when preparing
datasets by our provided dataset tools.
Each line in the txt file records 1)image name (with extension),
2)image shape, and 3)compression level, separated by a white space.
For example, the meta information could be:
`000_00000000.png (720,1280,3) 1`, which means:
1) image name (with extension): 000_00000000.png;
2) image shape: (720,1280,3);
3) compression level: 1
We use the image name without extension as the lmdb key.
If `multiprocessing_read` is True, it will read all the images to memory
using multiprocessing. Thus, your server needs to have enough memory.
Args:
data_path (str): Data path for reading images.
lmdb_path (str): Lmdb save path.
img_path_list (str): Image path list.
keys (str): Used for lmdb keys.
batch (int): After processing batch images, lmdb commits.
Default: 5000.
compress_level (int): Compress level when encoding images. Default: 1.
multiprocessing_read (bool): Whether use multiprocessing to read all
the images to memory. Default: False.
n_thread (int): For multiprocessing.
map_size (int | None): Map size for lmdb env. If None, use the
estimated size from images. Default: None
"""
assert len(img_path_list) == len(keys), (
'img_path_list and keys should have the same length, '
f'but got {len(img_path_list)} and {len(keys)}')
print(f'Create lmdb for {data_path}, save to {lmdb_path}...')
print(f'Totoal images: {len(img_path_list)}')
if not lmdb_path.endswith('.lmdb'):
raise ValueError("lmdb_path must end with '.lmdb'.")
if osp.exists(lmdb_path):
print(f'Folder {lmdb_path} already exists. Exit.')
sys.exit(1)
if multiprocessing_read:
# read all the images to memory (multiprocessing)
dataset = {} # use dict to keep the order for multiprocessing
shapes = {}
print(f'Read images with multiprocessing, #thread: {n_thread} ...')
pbar = ProgressBar(len(img_path_list))
def callback(arg):
"""get the image data and update pbar."""
key, dataset[key], shapes[key] = arg
pbar.update('Reading {}'.format(key))
pool = Pool(n_thread)
for path, key in zip(img_path_list, keys):
pool.apply_async(
read_img_worker,
args=(osp.join(data_path, path), key, compress_level),
callback=callback)
pool.close()
pool.join()
print(f'Finish reading {len(img_path_list)} images.')
# create wt filters
filters = create_filters(device='cuda:0')
# create lmdb environment
if map_size is None:
# obtain data size for one image
img = mmcv.imread(
osp.join(data_path, img_path_list[0]), flag='unchanged')
img = torch.from_numpy(img / 255.0).float()
img = wt(img.permute(2,0,1).unsqueeze(0).to('cuda:0'), filters, levels=3)[:, :, :64, :64].cpu().numpy()
img_byte = img.tobytes()
data_size_per_img = len(img_byte)
print('Data size per image is: ', data_size_per_img)
data_size = data_size_per_img * len(img_path_list)
map_size = data_size * 10
env = lmdb.open(lmdb_path, map_size=map_size)
# write data to lmdb
pbar = ProgressBar(len(img_path_list))
txn = env.begin(write=True)
txt_file = open(osp.join(lmdb_path, 'meta_info.txt'), 'w')
for idx, (path, key) in enumerate(zip(img_path_list, keys)):
pbar.update(f'Write {key}')
key_byte = key.encode('ascii')
if multiprocessing_read:
img_byte = dataset[key]
h, w, c = shapes[key]
else:
_, img_byte, img_shape = read_img_worker(
osp.join(data_path, path), key, compress_level, use_wt=True, filters=filters)
h, w, c = img_shape
txn.put(key_byte, img_byte)
# write meta information
txt_file.write(f'{key}.png ({h},{w},{c}) {compress_level}\n')
if idx % batch == 0:
txn.commit()
txn = env.begin(write=True)
txn.commit()
env.close()
txt_file.close()
print('\nFinish writing lmdb.')
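As a companion to the meta_info.txt layout documented above (one `<name>.png (h,w,c) <compress_level>` entry per line), here is a minimal parser sketch; the helper name is illustrative and not part of the original tooling:

```python
def parse_meta_line(line):
    """Parse one meta_info.txt line: '<name>.png (<h>,<w>,<c>) <level>'."""
    name, shape_str, level = line.split()
    h, w, c = (int(v) for v in shape_str.strip('()').split(','))
    return name, (h, w, c), int(level)

# example taken from the docstring above
parse_meta_line('000_00000000.png (720,1280,3) 1')
```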
def make_inter_wt_lmdb_from_imgs(data_path,
lmdb_path,
img_path_list,
keys,
batch=5000,
compress_level=1,
multiprocessing_read=False,
n_thread=40,
map_size=None):
"""Make lmdb from images.
Contents of lmdb. The file structure is:
example.lmdb
├── data.mdb
├── lock.mdb
├── meta_info.txt
The data.mdb and lock.mdb are standard lmdb files and you can refer to
https://lmdb.readthedocs.io/en/release/ for more details.
The meta_info.txt is a specified txt file to record the meta information
of our datasets. It will be automatically created when preparing
datasets by our provided dataset tools.
Each line in the txt file records 1)image name (with extension),
2)image shape, and 3)compression level, separated by a white space.
For example, the meta information could be:
`000_00000000.png (720,1280,3) 1`, which means:
1) image name (with extension): 000_00000000.png;
2) image shape: (720,1280,3);
3) compression level: 1
We use the image name without extension as the lmdb key.
If `multiprocessing_read` is True, it will read all the images to memory
using multiprocessing. Thus, your server needs to have enough memory.
Args:
data_path (str): Data path for reading images.
lmdb_path (str): Lmdb save path.
img_path_list (str): Image path list.
keys (str): Used for lmdb keys.
batch (int): After processing batch images, lmdb commits.
Default: 5000.
compress_level (int): Compress level when encoding images. Default: 1.
multiprocessing_read (bool): Whether to use multiprocessing to read all
the images into memory. Default: False.
n_thread (int): Number of worker processes for multiprocessing reads.
map_size (int | None): Map size for lmdb env. If None, use the
estimated size from images. Default: None
"""
assert len(img_path_list) == len(keys), (
'img_path_list and keys should have the same length, '
f'but got {len(img_path_list)} and {len(keys)}')
print(f'Create lmdb for {data_path}, save to {lmdb_path}...')
print(f'Total images: {len(img_path_list)}')
if not lmdb_path.endswith('.lmdb'):
raise ValueError("lmdb_path must end with '.lmdb'.")
if osp.exists(lmdb_path):
print(f'Folder {lmdb_path} already exists. Exit.')
sys.exit(1)
if multiprocessing_read:
# read all the images to memory (multiprocessing); note this path calls
# read_img_worker with default flags, so images are stored PNG-encoded
dataset = {} # use dict to keep the order for multiprocessing
shapes = {}
print(f'Read images with multiprocessing, #thread: {n_thread} ...')
pbar = ProgressBar(len(img_path_list))
def callback(arg):
"""get the image data and update pbar."""
key, dataset[key], shapes[key] = arg
pbar.update('Reading {}'.format(key))
pool = Pool(n_thread)
for path, key in zip(img_path_list, keys):
pool.apply_async(
read_img_worker,
args=(osp.join(data_path, path), key, compress_level),
callback=callback)
pool.close()
pool.join()
print(f'Finish reading {len(img_path_list)} images.')
# create wt filters
filters = create_filters(device='cuda:0')
# create lmdb environment
if map_size is None:
# obtain data size for one image
img = mmcv.imread(
osp.join(data_path, img_path_list[0]), flag='unchanged')
img = mmcv.image.imresize(img, (128, 128), interpolation='lanczos', backend='cv2')
img = torch.from_numpy(img / 255.0).float()
img = wt(img.permute(2,0,1).unsqueeze(0).to('cuda:0'), filters, levels=2)[:, :, :64, :64].cpu().numpy()
img_byte = img.tobytes()
data_size_per_img = len(img_byte)
print('Data size per image is: ', data_size_per_img)
data_size = data_size_per_img * len(img_path_list)
map_size = data_size * 10
env = lmdb.open(lmdb_path, map_size=map_size)
# write data to lmdb
pbar = ProgressBar(len(img_path_list))
txn = env.begin(write=True)
txt_file = open(osp.join(lmdb_path, 'meta_info.txt'), 'w')
for idx, (path, key) in enumerate(zip(img_path_list, keys)):
pbar.update(f'Write {key}')
key_byte = key.encode('ascii')
if multiprocessing_read:
img_byte = dataset[key]
h, w, c = shapes[key]
else:
_, img_byte, img_shape = read_img_worker(
osp.join(data_path, path), key, compress_level, use_wt=False, use_inter_wt=True, filters=filters)
h, w, c = img_shape
txn.put(key_byte, img_byte)
# write meta information
txt_file.write(f'{key}.png ({h},{w},{c}) {compress_level}\n')
if idx % batch == 0:
txn.commit()
txn = env.begin(write=True)
txn.commit()
env.close()
txt_file.close()
print('\nFinish writing lmdb.')
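Both makers estimate `map_size` as the byte size of one stored sample times the number of images, with a 10x safety margin. That heuristic in isolation (function name and sample numbers are illustrative):

```python
def estimate_map_size(bytes_per_img, num_imgs, margin=10):
    # per-image payload * image count * safety margin, as in the code above
    return bytes_per_img * num_imgs * margin

# e.g. a 64x64x3 float32 wavelet patch is 64*64*3*4 = 49152 bytes
estimate_map_size(64 * 64 * 3 * 4, 1000)
```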
def read_img_worker(path, key, compress_level, lr=False, use_wt=False, use_inter_wt=False, filters=None):
"""Read image worker.
Args:
path (str): Image path.
key (str): Image key.
compress_level (int): Compress level when encoding images.
lr (bool): If True, downsample the ground-truth image with
duf_downsample before storing. Default: False.
use_wt (bool): If True, store the wavelet-transformed array bytes
instead of an encoded PNG. Default: False.
use_inter_wt (bool): If True, resize to 128x128 before applying the
wavelet transform. Default: False.
filters: Wavelet filters used by wt(). Default: None.
Returns:
str: Image key.
byte: Image byte.
tuple[int]: Image shape.
"""
img = mmcv.imread(path, flag='unchanged')
# If lr is True, then downsample from groundtruth images using duf_downsample
if lr:
img = duf_downsample(torch.from_numpy(img).permute(2,0,1).unsqueeze(0), scale=4).squeeze().permute(1,2,0).numpy()
elif use_wt:
img = torch.from_numpy(img / 255.0).float()
if img.ndim == 2:
print(img.shape, 'unexpected 2-D (single-channel) image; aborting')
sys.exit(1)
elif img.ndim == 3:
img = wt(img.permute(2,0,1).unsqueeze(0).to('cuda:0'), filters, levels=3)[:, :, :64, :64].squeeze().permute(1,2,0).cpu().numpy()
elif use_inter_wt:
img = mmcv.image.imresize(img, (128, 128), interpolation='lanczos', backend='cv2')
img = torch.from_numpy(img / 255.0).float()
if img.ndim == 2:
img = wt(img.unsqueeze(0).unsqueeze(0).to('cuda:0'), filters, levels=2)[:, :, :64, :64].squeeze().cpu().numpy()
elif img.ndim == 3:
img = wt(img.permute(2,0,1).unsqueeze(0).to('cuda:0'), filters, levels=2)[:, :, :64, :64].squeeze().permute(1,2,0).cpu().numpy()
if img.ndim == 2:
h, w = img.shape
c = 1
else:
h, w, c = img.shape
if not (use_wt or use_inter_wt):
_, img_byte = cv2.imencode('.png', img,
[cv2.IMWRITE_PNG_COMPRESSION, compress_level])
# If writing WT patch in bytes, use np tobytes() function
else:
img_byte = img.tobytes()
return (key, img_byte, (h, w, c))
class LmdbMaker():
"""LMDB Maker.
Args:
lmdb_path (str): Lmdb save path.
map_size (int): Map size for lmdb env. Default: 1024 ** 4, 1TB.
batch (int): After processing batch images, lmdb commits.
Default: 5000.
compress_level (int): Compress level when encoding images. Default: 1.
"""
def __init__(self,
lmdb_path,
map_size=1024**4,
batch=5000,
compress_level=1):
if not lmdb_path.endswith('.lmdb'):
raise ValueError("lmdb_path must end with '.lmdb'.")
if osp.exists(lmdb_path):
print(f'Folder {lmdb_path} already exists. Exit.')
sys.exit(1)
self.lmdb_path = lmdb_path
self.batch = batch
self.compress_level = compress_level
self.env = lmdb.open(lmdb_path, map_size=map_size)
self.txn = self.env.begin(write=True)
self.txt_file = open(osp.join(lmdb_path, 'meta_info.txt'), 'w')
self.counter = 0
def put(self, img_byte, key, img_shape):
self.counter += 1
key_byte = key.encode('ascii')
self.txn.put(key_byte, img_byte)
# write meta information
h, w, c = img_shape
self.txt_file.write(f'{key}.png ({h},{w},{c}) {self.compress_level}\n')
if self.counter % self.batch == 0:
self.txn.commit()
self.txn = self.env.begin(write=True)
def close(self):
self.txn.commit()
self.env.close()
self.txt_file.close()
| 37.910486 | 140 | 0.601734 | 4,110 | 29,646 | 4.208029 | 0.067153 | 0.032379 | 0.041341 | 0.028332 | 0.91697 | 0.909511 | 0.904712 | 0.895172 | 0.892454 | 0.88488 | 0 | 0.024297 | 0.286413 | 29,646 | 781 | 141 | 37.959027 | 0.791113 | 0.32797 | 0 | 0.822727 | 0 | 0 | 0.14401 | 0.001206 | 0 | 0 | 0 | 0 | 0.011364 | 1 | 0.036364 | false | 0 | 0.025 | 0 | 0.070455 | 0.084091 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
91e933f70c563f35b37792389e2250200eacb461 | 53 | py | Python | action/__main__.py | mawassk/github-action-template | 97613d23e003916f129e1b5ae5f924cb0c422adf | [
"MIT"
] | null | null | null | action/__main__.py | mawassk/github-action-template | 97613d23e003916f129e1b5ae5f924cb0c422adf | [
"MIT"
] | 121 | 2021-08-21T17:22:45.000Z | 2022-03-18T21:18:18.000Z | action/__main__.py | mawassk/github-action-template | 97613d23e003916f129e1b5ae5f924cb0c422adf | [
"MIT"
] | null | null | null | from . import print_hello_world
print_hello_world()
| 13.25 | 31 | 0.830189 | 8 | 53 | 5 | 0.625 | 0.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113208 | 53 | 3 | 32 | 17.666667 | 0.851064 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 1 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
37f40b7352ff1b040efdaa5f414941156a5fc654 | 141 | py | Python | gaurabda/GCRasi.py | gopa810/gaurabda-calendar | 92c36b5948e9bcbfe991f19f511371aff1cc0fcb | [
"MIT"
] | 4 | 2020-09-12T06:32:08.000Z | 2022-01-15T09:31:31.000Z | gaurabda/GCRasi.py | gopa810/gaurabda-calendar | 92c36b5948e9bcbfe991f19f511371aff1cc0fcb | [
"MIT"
] | 2 | 2020-12-14T14:25:35.000Z | 2020-12-15T19:06:51.000Z | gaurabda/GCRasi.py | gopa810/gaurabda-calendar | 92c36b5948e9bcbfe991f19f511371aff1cc0fcb | [
"MIT"
] | 4 | 2020-10-10T16:31:05.000Z | 2021-08-20T17:23:01.000Z | from . import GCMath as GCMath
def GetRasi(SunLongitude,Ayanamsa):
return int(GCMath.Floor(GCMath.putIn360(SunLongitude - Ayanamsa)/30.0))
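GetRasi maps a sidereal longitude to a 30-degree sign index (0-11). A dependency-free sketch of the same arithmetic, substituting Python's modulo for GCMath.putIn360 (an assumption about that helper's wrap-to-[0,360) behavior):

```python
import math

def get_rasi(sun_longitude, ayanamsa):
    # wrap the sidereal longitude into [0, 360), then index 30-degree signs
    return int(math.floor(((sun_longitude - ayanamsa) % 360.0) / 30.0))
```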
| 28.2 | 72 | 0.787234 | 19 | 141 | 5.842105 | 0.736842 | 0.36036 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047244 | 0.099291 | 141 | 4 | 73 | 35.25 | 0.826772 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
5329506ddd1f69026c9c699197121682c02c1616 | 6,732 | py | Python | crm/core/migrations/0003_auto__add_job__add_field_people_job2.py | klebercode/navasil | b6eec748659e7ba6bf3610a81a766f0efa9b6577 | [
"MIT"
] | null | null | null | crm/core/migrations/0003_auto__add_job__add_field_people_job2.py | klebercode/navasil | b6eec748659e7ba6bf3610a81a766f0efa9b6577 | [
"MIT"
] | null | null | null | crm/core/migrations/0003_auto__add_job__add_field_people_job2.py | klebercode/navasil | b6eec748659e7ba6bf3610a81a766f0efa9b6577 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from south.utils import datetime_utils as datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
# Adding model 'Job'
db.create_table(u'core_job', (
(u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('name', self.gf('django.db.models.fields.CharField')(max_length=30, null=True, blank=True)),
))
db.send_create_signal(u'core', ['Job'])
# Adding field 'People.job2'
db.add_column(u'core_people', 'job2',
self.gf('django.db.models.fields.related.ForeignKey')(to=orm['core.Job'], null=True, blank=True),
keep_default=False)
def backwards(self, orm):
# Deleting model 'Job'
db.delete_table(u'core_job')
# Deleting field 'People.job2'
db.delete_column(u'core_people', 'job2_id')
models = {
u'core.contactemail': {
'Meta': {'ordering': "['email']", 'object_name': 'ContactEmail'},
'customer': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['core.Customer']", 'null': 'True', 'blank': 'True'}),
'email': ('django.db.models.fields.EmailField', [], {'max_length': '75'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'people': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['core.People']", 'null': 'True', 'blank': 'True'})
},
u'core.contactphone': {
'Meta': {'ordering': "['number']", 'object_name': 'ContactPhone'},
'customer': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['core.Customer']", 'null': 'True', 'blank': 'True'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'number': ('django.db.models.fields.CharField', [], {'max_length': '20'}),
'operate': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'people': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['core.People']", 'null': 'True', 'blank': 'True'}),
'type': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'})
},
u'core.customer': {
'Meta': {'ordering': "['name']", 'object_name': 'Customer'},
'address': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'city': ('django.db.models.fields.CharField', [], {'max_length': '150', 'null': 'True', 'blank': 'True'}),
'cnpj': ('django.db.models.fields.CharField', [], {'max_length': '20', 'null': 'True', 'blank': 'True'}),
'complement': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'district': ('django.db.models.fields.CharField', [], {'max_length': '150', 'null': 'True', 'blank': 'True'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'number': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'observation': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
'site': ('django.db.models.fields.URLField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'state': ('django.db.models.fields.CharField', [], {'max_length': '2', 'null': 'True', 'blank': 'True'}),
'zip_code': ('django.db.models.fields.CharField', [], {'max_length': '10', 'null': 'True', 'blank': 'True'})
},
u'core.job': {
'Meta': {'ordering': "['name']", 'object_name': 'Job'},
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '30', 'null': 'True', 'blank': 'True'})
},
u'core.people': {
'Meta': {'ordering': "['name']", 'object_name': 'People'},
'address': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'brith_date': ('django.db.models.fields.DateField', [], {'null': 'True', 'blank': 'True'}),
'capacity': ('django.db.models.fields.CharField', [], {'max_length': '50', 'null': 'True', 'blank': 'True'}),
'city': ('django.db.models.fields.CharField', [], {'max_length': '150', 'null': 'True', 'blank': 'True'}),
'complement': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'cpf': ('django.db.models.fields.CharField', [], {'max_length': '14', 'null': 'True', 'blank': 'True'}),
'customer': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['core.Customer']"}),
'district': ('django.db.models.fields.CharField', [], {'max_length': '150', 'null': 'True', 'blank': 'True'}),
'expeditor': ('django.db.models.fields.CharField', [], {'max_length': '10', 'null': 'True', 'blank': 'True'}),
'expeditor_date': ('django.db.models.fields.DateField', [], {'null': 'True', 'blank': 'True'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'job': ('django.db.models.fields.CharField', [], {'max_length': '30', 'null': 'True', 'blank': 'True'}),
'job2': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['core.Job']", 'null': 'True', 'blank': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
'number': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'observation': ('django.db.models.fields.TextField', [], {'null': 'True', 'blank': 'True'}),
'ord_date': ('django.db.models.fields.DateField', [], {'null': 'True', 'blank': 'True'}),
'registration': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'rg': ('django.db.models.fields.CharField', [], {'max_length': '20', 'null': 'True', 'blank': 'True'}),
'sex': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'state': ('django.db.models.fields.CharField', [], {'max_length': '2', 'null': 'True', 'blank': 'True'}),
'zip_code': ('django.db.models.fields.CharField', [], {'max_length': '10', 'null': 'True', 'blank': 'True'})
}
}
complete_apps = ['core'] | 70.125 | 141 | 0.542632 | 733 | 6,732 | 4.908595 | 0.139154 | 0.111173 | 0.190661 | 0.272374 | 0.807115 | 0.767093 | 0.756531 | 0.709283 | 0.697332 | 0.622846 | 0 | 0.012229 | 0.198307 | 6,732 | 96 | 142 | 70.125 | 0.654438 | 0.01738 | 0 | 0.283951 | 0 | 0 | 0.518306 | 0.256884 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024691 | false | 0 | 0.049383 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
533813a13c9f9f3acbcfc8254b4d6de7f72d993e | 1,471 | py | Python | test/expressions/expr1.py | Setonas/MagicSetonas | ef76da5f27a0506b194c58072b81424e3ce985d7 | [
"MIT"
] | 5 | 2017-02-22T10:17:39.000Z | 2021-04-06T16:36:13.000Z | test/expressions/expr1.py | Setonas/MagicSetonas | ef76da5f27a0506b194c58072b81424e3ce985d7 | [
"MIT"
] | null | null | null | test/expressions/expr1.py | Setonas/MagicSetonas | ef76da5f27a0506b194c58072b81424e3ce985d7 | [
"MIT"
] | 1 | 2020-08-29T02:30:52.000Z | 2020-08-29T02:30:52.000Z | ~a + b @ c ^ d // e % f & e ir nebūtų g arba h
~ : keyword.operator.bitwise.python, source.python
a : source.python
: source.python
+ : keyword.operator.arithmetic.python, source.python
: source.python
b : source.python
: source.python
@ : keyword.operator.arithmetic.python, source.python
: source.python
c : source.python
: source.python
^ : keyword.operator.bitwise.python, source.python
: source.python
d : source.python
: source.python
// : keyword.operator.arithmetic.python, source.python
: source.python
e : source.python
: source.python
% : keyword.operator.arithmetic.python, source.python
: source.python
f : source.python
: source.python
& : keyword.operator.bitwise.python, source.python
: source.python
e : source.python
: source.python
ir : keyword.operator.logical.python, source.python
: source.python
nebūtų : keyword.operator.logical.python, source.python
: source.python
g : source.python
: source.python
arba : keyword.operator.logical.python, source.python
: source.python
h : source.python
| 36.775 | 65 | 0.520734 | 136 | 1,471 | 5.632353 | 0.132353 | 0.563969 | 0.634465 | 0.532637 | 0.900783 | 0.900783 | 0.848564 | 0.848564 | 0.644909 | 0.644909 | 0 | 0 | 0.39225 | 1,471 | 39 | 66 | 37.717949 | 0.856823 | 0 | 0 | 0.513514 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
5366e72af4d9982feb5d0aa6c9f657c9cbfb7ad2 | 13,345 | py | Python | ps5_II.4_certainty.py | gerkamspiano/QuantMacro | f7e6e4ff7ae075d556f73cb1434c45652b4180cb | [
"MIT"
] | null | null | null | ps5_II.4_certainty.py | gerkamspiano/QuantMacro | f7e6e4ff7ae075d556f73cb1434c45652b4180cb | [
"MIT"
] | null | null | null | ps5_II.4_certainty.py | gerkamspiano/QuantMacro | f7e6e4ff7ae075d556f73cb1434c45652b4180cb | [
"MIT"
] | null | null | null | # II.4. Partial equilibrium - Certainty
# Import packages
import numpy as np
from numpy import vectorize
import matplotlib.pyplot as plt
from itertools import product
# Parameters of the model:
ro = 0.06
r = 0.04
w = 1
beta = 1/(1+ro)
#%% Quadratic Utility
################## Infinitely-lived households economy #######################
gamma = 0
cbar = 100
sigmay = 0
Y = [1-sigmay, 1+sigmay]
Y = np.array(Y)
A = np.linspace(((-(1+r)/r)*Y[0]), 40, 80)
# Create the matrix A*Y, where there are all possible combinations of
# assets (today and tomorrow) and shocks:
ay = list(product(Y, A, A))
ay = np.array(ay)
y = ay[:, 0]
ai = ay[:, 1]
aj = ay[:, 2]
# Transition matrix:
pi = np.array([((1+gamma)/2, (1-gamma)/2), ((1-gamma)/2, (1+gamma)/2)])
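The symmetric two-state transition matrix above has rows that sum to one for any gamma in [-1, 1]; a quick standalone check (the gamma values are illustrative):

```python
import numpy as np

def transition_matrix(gamma):
    # same construction as pi above: persistence rises with gamma
    return np.array([[(1 + gamma) / 2, (1 - gamma) / 2],
                     [(1 - gamma) / 2, (1 + gamma) / 2]])

transition_matrix(0.0)  # gamma = 0 gives i.i.d. 50/50 shocks
```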
# Consumption:
c = y + (1+r)*ai - aj
@vectorize
def M(c):
return -0.5*(c-cbar)**2
# Since we have the feasible constraints into account, now we can define the
# return matrix
M = M(c)
M = np.reshape(M, (1, 12800))
M = np.reshape(M, (160, 80))
# Initial guess for the value function is a vector of zeros:
Vs = np.zeros(160)
# Compute the matrix W:
def W1(A):
return pi[0, 0]*(-0.5*(Y[0] + (1+r)*A - A - cbar)**2)/(1-beta) + pi[0, 1]*(-0.5*(Y[1] + (1+r)*A - A - cbar)**2)/(1-beta)
def W2(A):
return pi[1, 0]*(-0.5*(Y[0] + (1+r)*A - A - cbar)**2)/(1-beta) + pi[1, 1]*(-0.5*(Y[1] + (1+r)*A - A - cbar)**2)/(1-beta)
W1 = W1(A)
W1 = np.reshape(W1, (80,1))
W1 = np.tile(W1, 80)
W1 = np.transpose(W1)
W2 = W2(A)
W2 = np.reshape(W2, (80,1))
W2 = np.tile(W2, 80)
W2 = np.transpose(W2)
W = [W1, W2]
W = np.reshape(W, (160,80))
# Compute the matrix X:
X = M + beta*W
Vs1 = np.amax(X, axis = 1)
diffVs = Vs - Vs1
count = 0
# Iterate on the value function for a fixed number of passes (more than
# enough for convergence), taking Vs1 as the new value function each time:
for diffVs in range(1, 8000):
Vss = Vs1
Vs = [Vss[0:80], Vss[80:]]
Vs = np.array(Vs)
def W1(Vs):
return pi[0, 0]*Vs[0, :] + pi[0, 1]*Vs[1, :]
def W2(Vs):
return pi[1, 0]*Vs[0, :] + pi[1, 1]*Vs[1, :]
W1 = W1(Vs)
W1 = np.reshape(W1, (1,80))
W1 = np.tile(W1, 80)
W1 = np.reshape(W1, (80,80))
W2 = W2(Vs)
W2 = np.reshape(W2, (1,80))
W2 = np.tile(W2, 80)
W2 = np.reshape(W2, (80,80))
W = [W1, W2]
W = np.reshape(W, (160, 80))
X = M + beta*W
Vs1 = np.amax(X, axis = 1)
diffVs = Vss - Vs1
count += 1
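Each pass of the loop above is one Bellman update: form X = M + beta*W, then take a row-wise max (value) and argmax (policy). A toy standalone version of that single step, with illustrative numbers and `_toy` names to avoid clashing with the script's variables:

```python
import numpy as np

def bellman_step(M, W, beta):
    # X[i, j]: instantaneous return of choice j in state i
    # plus the discounted continuation value
    X = M + beta * W
    return X.max(axis=1), X.argmax(axis=1)

M_toy = np.array([[1.0, 0.0], [0.0, 2.0]])
W_toy = np.zeros((2, 2))
V_toy, g_toy = bellman_step(M_toy, W_toy, 0.96)
```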
# Once we obtain convergence, redefine the matrix X:
X = M + beta*W
# The value function given different realizations of y:
V_y1 = Vs1[0:80]
V_y2 = Vs1[80:]
# Now we can obtain the decision rule, which give us column number that
# maximizes row i of the X matrix:
g = np.argmax(X, axis = 1)
aopt_y1 = A[g[0:80]] # optimal decision of assets given y1
aopt_y2 = A[g[80:]] # optimal decision of assets given y2
c_y1 = np.zeros(80)
c_y2 = np.zeros(80)
c_y1 = Y[0]*np.ones(80) + (1+r)*A - aopt_y1
c_y2 = Y[1]*np.ones(80) + (1+r)*A - aopt_y2
for i in range(0, 80):
if c_y1[i] <= 0:
c_y1[i] = 0
if c_y2[i] <= 0:
c_y2[i] = 0
# Plot the value function and the optimal policy:
plt.figure()
plt.plot(A, c_y1, '.', label = 'Optimal consumption for negative shock')
plt.plot(A, c_y2, label = 'Optimal consumption for positive shock')
plt.title('Policy rule for consumption (quadratic utility)')
plt.legend()
plt.ylabel('Consumption')
plt.xlabel('Assets')
plt.show()
################### Simulation time paths for consumption ####################
y = np.zeros([1, 80])
for i in range(0, 80):
y[0, i] = 1 # In all cases, since gamma = 0, our y is going to be 1
# Simulation and plot for assets:
simulation = np.zeros(45)
aopt_y1 = A[g[0:80]] # optimal decision of assets
g_y1 = g[0:80]
simulation[0] = g_y1[79] # our initial guess of assets (a0)
for i in range(1, 45):
simulation[i] = g_y1[int(simulation[i-1])]
for i in range(0, 44):
simulation[i] = aopt_y1[int(simulation[i])]
t = np.linspace(0, 44, 44)
plt.figure()
plt.plot(t, simulation[0:44], label = 'Assets ')
plt.title('Assets simulation for 45 periods (quadratic utility)')
plt.ylabel('Assets')
plt.xlabel('Time (periods)')
plt.show()
# Simulation and plot for consumption:
c = np.zeros(44)
for i in range(0, 44):
c[i] = simulation[i]*(1+r)+w*y[0, i]-simulation[i+1]
if c[i] <= 0:
c[i] = 0
plt.figure()
plt.plot(t[0:43], c[0:43], label = 'Consumption')
plt.title('Consumption simulation for 45 periods (quadratic utility)')
plt.ylabel('Consumption')
plt.xlabel('Time (periods)')
plt.show()
################## Finitely-lived households economy #########################
# T = 45
# Normalize W of T+1 to zero
# Quadratic utility:
A = np.linspace(((-(1+r)/r)*Y[0]), 10, 80)
# Create the matrix A*Y, where there are all possible combinations of
# assets (today and tomorrow) and shocks:
ay = list(product(Y, A, A))
ay = np.array(ay)
y = ay[:,0]
ai = ay[:,1]
aj = ay[:,2]
# Transition matrix:
pi = np.array([((1+gamma)/2, (1-gamma)/2), ((1-gamma)/2, (1+gamma)/2)])
c = y+(1+r)*ai-aj
@vectorize
def M(c):
return -0.5*(c-cbar)**2
M = M(c)
M = np.reshape(M,(1, 12800))
M = np.reshape(M,(160, 80))
W = np.zeros(160*80)
W = np.reshape(W, (160,80))
count = 0
finiteV = []
finiteG = []
for count in range(1, 46):
X = M + beta*W
g = np.argmax(X, axis = 1)
W = np.amax(X, axis = 1)
finiteV.append(W) # It stores each iteration for obtaining the value function at each period (or age)
finiteG.append(g)
W = np.reshape(W, [160,1])
W = np.tile(W, 80)
W = np.transpose(W)
W1 = W[:80, :80]
W2 = W[:80, 80:]
W = np.concatenate((W1, W2))
count = count+1
finiteV = np.array(finiteV)
finiteV = np.transpose(finiteV)
finiteG = np.array(finiteG)
finiteG = np.transpose(finiteG)
# Individual at periods 5 and 40:
A5 = A[finiteG[0:80, 5]]
A40 = A[finiteG[0:80, 40]]
C5 = Y[0]*np.ones(80) + (1+r)*A - A5
C40 = Y[0]*np.ones(80) + (1+r)*A - A40
for i in range(0, 80):
if C5[i] < 0:
C5[i] = 0
if C40[i] < 0:
C40[i] = 0
plt.figure()
plt.plot(A, C5,'.', label = 'Consumption for T=5')
plt.plot(A, C40, label = 'Consumption for T=40')
plt.title('Policy rule for consumption')
plt.legend()
plt.ylabel('Consumption')
plt.xlabel('Assets')
plt.show()
#%% With CRRA utility:
r = 0.04
sigma = 2
A = np.linspace(((-(1+r)/r)*Y[0]), 40, 80)
# Create the matrix A*Y, where there are all possible combinations of
# assets (today and tomorrow) and shocks:
ay = list(product(Y, A, A))
ay = np.array(ay)
y = ay[:,0]
ai = ay[:,1]
aj = ay[:,2]
# Transition matrix:
pi = np.array([((1+gamma)/2, (1-gamma)/2), ((1-gamma)/2, (1+gamma)/2)])
c = y + (1+r)*ai - aj
M = np.zeros(12800)
for i in range(0, 12800):
if c[i] >= 0:
M[i] = ((c[i]**(1-sigma))-1)/(1-sigma)
if c[i] < 0:
M[i] = -100000
M = np.reshape(M, (1, 12800))
M = np.reshape(M, (160, 80))
# Initial guess for the value function is a vector of zeros:
Vs = np.zeros(160)
# Compute the matrix W:
def W1(A):
return pi[0, 0]*(((Y[0] + (1+r)*A - A)**(1-sigma))-1)/((1-sigma)*(1-beta)) + pi[0, 1]*(((Y[1] + (1+r)*A - A)**(1-sigma))-1)/((1-sigma)*(1-beta))
def W2(A):
return pi[1, 0]*(((Y[0] + (1+r)*A - A)**(1-sigma))-1)/((1-sigma)*(1-beta)) + pi[1, 1]*(((Y[1] + (1+r)*A - A)**(1-sigma))-1)/((1-sigma)*(1-beta))
W1 = W1(A)
W1 = np.reshape(W1, (80,1))
W1 = np.tile(W1, 80)
W1 = np.transpose(W1)
W2 = W2(A)
W2 = np.reshape(W2, (80,1))
W2 = np.tile(W2, 80)
W2 = np.transpose(W2)
W = [W1, W2]
W = np.reshape(W, (160,80))
# Compute the matrix X:
X = M + beta*W
Vs1 = np.amax(X, axis = 1)
diffVs = Vs - Vs1
count = 0
# Iterate on the value function for a fixed number of passes (more than
# enough for convergence), taking Vs1 as the new value function each time:
for diffVs in range(1, 8000):
Vss = Vs1
Vs = [Vss[0:80], Vss[80:]]
Vs = np.array(Vs)
def W1(Vs):
return pi[0, 0]*Vs[0, :] + pi[0, 1]*Vs[1, :]
def W2(Vs):
return pi[1, 0]*Vs[0, :] + pi[1, 1]*Vs[1, :]
W1 = W1(Vs)
W1 = np.reshape(W1, (1,80))
W1 = np.tile(W1, 80)
W1 = np.reshape(W1, (80,80))
W2 = W2(Vs)
W2 = np.reshape(W2, (1,80))
W2 = np.tile(W2, 80)
W2 = np.reshape(W2, (80,80))
W = [W1, W2]
W = np.reshape(W, (160, 80))
X = M + beta*W
Vs1 = np.amax(X, axis = 1)
diffVs = Vss - Vs1
count += 1
# Once we obtain convergence, redefine the matrix X:
X = M + beta*W
# The value function given different realizations of y:
V_y1 = Vs1[0:80]
V_y2 = Vs1[80:]
# Now we can obtain the decision rule, which give us column number that
# maximizes row i of the X matrix:
g = np.argmax(X, axis = 1)
aopt_y1 = A[g[0:80]] # optimal decision of assets given y1
aopt_y2 = A[g[80:]] # optimal decision of assets given y2
c_y1 = Y[0]*np.ones(80) + (1+r)*A - aopt_y1
c_y2 = Y[1]*np.ones(80) + (1+r)*A - aopt_y2
for i in range(0, 80):
if c_y1[i] < 0:
c_y1[i] = 0
if c_y2[i] < 0:
c_y2[i] = 0
# Plot the value function and the optimal policy:
plt.figure()
plt.plot(A, c_y1, '.', label = 'Optimal consumption for negative shock')
plt.plot(A, c_y2, label = 'Optimal consumption for positive shock')
plt.title('Policy rule for consumption (CRRA utility)')
plt.legend()
plt.ylabel('Consumption')
plt.xlabel('Assets')
plt.show()
################### Simulation time paths for consumption ####################
y = np.zeros([1, 80])
for i in range(0, 80):
y[0, i] = 1 # In all cases, since gamma = 0, our y is going to be 1
# Simulation and plot for assets:
simulation = np.zeros(45)
aopt_y1 = A[g[0:80]] # optimal decision of assets
g_y1 = g[0:80]
simulation[0] = g_y1[79] # our initial guess of assets (a0)
for i in range(1, 45):
simulation[i] = g_y1[int(simulation[i-1])]
for i in range(0, 44):
simulation[i] = aopt_y1[int(simulation[i])]
t = np.linspace(0, 44, 44)
plt.figure()
plt.plot(t,simulation[0:44], label = 'Assets ')
plt.title('Assets simulation for 45 periods (CRRA utility)')
plt.ylabel('Assets')
plt.xlabel('Time (periods)')
plt.show()
# Simulation and plot for consumption:
c = np.zeros(44)
for i in range(0, 44):
c[i] = simulation[i]*(1+r)+w*y[0, i]-simulation[i+1]
if c[i] <= 0:
c[i] = 0
plt.figure()
plt.plot(t[0:43], c[0:43], label = 'Consumption')
plt.title('Consumption simulation for 45 periods (CRRA utility)')
plt.ylabel('Consumption')
plt.xlabel('Time (periods)')
plt.show()
################## Finitely-lived households economy #########################
A = np.linspace(((-(1+r)/r)*Y[0]), 40, 80)
# Create the matrix A*Y, where there are all possible combinations of
# assets (today and tomorrow) and shocks:
ay = list(product(Y, A, A))
ay = np.array(ay)
y = ay[:,0]
ai = ay[:,1]
aj = ay[:,2]
# Transition matrix:
pi = np.array([((1+gamma)/2, (1-gamma)/2), ((1-gamma)/2, (1+gamma)/2)])
c = y+(1+r)*ai-aj
M = np.zeros(12800)
for i in range(0, 12800):
if c[i] >= 0:
M[i] = ((c[i]**(1-sigma))-1)/(1-sigma)
if c[i] < 0:
M[i] = -100000
M = np.reshape(M, (1, 12800))
M = np.reshape(M, (160, 80))
W = np.zeros(160*80)
W = np.reshape(W, (160,80))
count = 0
finiteV = []
finiteG = []
for count in range(1, 46):
X = M + beta*W
g = np.argmax(X, axis = 1)
W = np.amax(X, axis = 1)
finiteV.append(W) # It stores each iteration for obtaining the value function at each period (or age)
finiteG.append(g)
W = np.reshape(W, [160,1])
W = np.tile(W, 80)
W = np.transpose(W)
W1 = W[:80, :80]
W2 = W[:80, 80:]
W = np.concatenate((W1, W2))
count = count+1
finiteV = np.array(finiteV)
finiteV = np.transpose(finiteV)
finiteG = np.array(finiteG)
finiteG = np.transpose(finiteG)
# Individual at periods 5 and 40:
A5 = A[finiteG[0:80, 5]]
A40 = A[finiteG[0:80, 40]]
C5 = Y[0]*np.ones(80) + (1+r)*A - A5
C40 = Y[0]*np.ones(80) + (1+r)*A - A40
for i in range(0, 80):
if C5[i] < 0:
C5[i] = 0
if C40[i] < 0:
C40[i] = 0
plt.figure()
plt.plot(A, C5,'.', label = 'Consumption for T=5')
plt.plot(A, C40, label = 'Consumption for T=40')
plt.title('Policy rule for consumption')
plt.legend()
plt.ylabel('Consumption')
plt.xlabel('Assets')
plt.show() | 21.352 | 149 | 0.530086 | 2,226 | 13,345 | 3.159928 | 0.088949 | 0.035826 | 0.015923 | 0.021894 | 0.935883 | 0.935314 | 0.935314 | 0.935314 | 0.929343 | 0.925078 | 0 | 0.090152 | 0.28018 | 13,345 | 625 | 150 | 21.352 | 0.642099 | 0.197752 | 0 | 0.910979 | 0 | 0 | 0.07936 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029674 | false | 0 | 0.011869 | 0.029674 | 0.071217 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
import torch
import torch.nn as nn
from torch.nn import init
import torch.nn.functional as F
import os
import pickle  # "import cPickle as pickle" in the original Python 2 source
from collections import OrderedDict, defaultdict
import random
import json
from spacegraph_codebase.module import get_activation_function
from spacegraph_codebase.data import PointSet, NeighborGraph, Point
class NeighGraphEncoderDecoder(nn.Module):
    """
    Combine the encoder and decoder, and set up the training process
    """
    def __init__(self, pointset, enc, spa_enc, init_dec, dec, activation="sigmoid", num_context_sample=10, num_neg_resample=10):
        super(NeighGraphEncoderDecoder, self).__init__()
        self.pointset = pointset
        self.enc = enc
        self.init_dec = init_dec
        self.dec = dec
        self.spa_enc = spa_enc
        self.num_context_sample = num_context_sample
        self.num_neg_resample = num_neg_resample  # given 100 negative samples, we resample 10
        self.activation = get_activation_function(activation, "NeighGraphEncoderDecoder")

    def sample_context_pts(self, ng_list):
        for ng in ng_list:
            ng.sample_neighbor(self.num_context_sample)

    def sample_neg_pts(self, ng_list):
        for ng in ng_list:
            ng.sample_neg(self.num_neg_resample)

    def forward(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph():
        1. Compute the predicted center point feature embedding
        2. Get the ground truth feature embedding for the center points
        3. Get the N negative sampled center point feature embeddings
        Args:
            ng_list: a list of NeighborGraph()
            do_full_eval: whether to use the full set of negative samples for evaluation
        Return:
            center_pred_embed: the center point feature embedding predicted from the context points
                shape (batch_size, embed_dim)
            center_embed: the ground truth feature embedding for the center points
                shape (batch_size, embed_dim)
            neg_embeds: the N negative sampled center point feature embeddings
                shape (batch_size, num_neg_sample, embed_dim)
        '''
        # randomly sample the context points in each NeighborGraph()
        self.sample_context_pts(ng_list)
        if do_full_eval == False:
            # randomly resample the negative points in each NeighborGraph()
            self.sample_neg_pts(ng_list)
        if self.spa_enc is not None:
            # get all context-center point (deltaX, deltaY) lists
            # coords: shape (batch_size, num_context_sample, 2)
            coords = self.get_spa_coords(ng_list)
            # key_spa_embeds: shape (batch_size, num_context_sample, spa_embed_dim)
            key_spa_embeds = self.spa_enc(coords)
        else:
            key_spa_embeds = torch.FloatTensor([])
        # get the feature embedding of the context points
        # key_embeds: shape (batch_size, num_context_sample, embed_dim)
        key_embeds = self.get_context_pt_embed(ng_list)
        # init_query_embed: shape (batch_size, embed_dim)
        init_query_embed = self.init_dec(key_embeds, key_spa_embeds, query_embed=None)
        if self.dec is not None:
            # center_pred_embed: shape (batch_size, embed_dim)
            center_pred_embed = self.dec(key_embeds, key_spa_embeds, query_embed=init_query_embed)
        else:
            center_pred_embed = init_query_embed
        # center_embed: shape (batch_size, embed_dim)
        center_embed = self.get_center_pt_embed(ng_list)
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        neg_embeds = self.get_neg_pt_embed(ng_list, do_full_eval)
        return center_pred_embed, center_embed, neg_embeds

    def get_batch_scores(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(),
        Return:
            pos: the dot product score between the ground truth center point embedding and the predicted embedding
                (batch_size)
            neg: the dot product score between the negative sampled center point embeddings and the predicted embedding
                (batch_size, num_neg_sample)
        '''
        # center_pred_embed: shape (batch_size, embed_dim)
        # center_embed: shape (batch_size, embed_dim)
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed, center_embed, neg_embeds = self.forward(ng_list, do_full_eval)
        # positive score
        # pos: (batch_size)
        pos = torch.sum(center_embed * center_pred_embed, dim=1, keepdim=False)
        # negative sampling
        # center_pred_embed_: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed_ = center_pred_embed.unsqueeze(1).expand_as(neg_embeds)
        # neg: (batch_size, num_neg_sample)
        neg = torch.sum(neg_embeds * center_pred_embed_, dim=2, keepdim=False)
        return pos, neg

    def softmax_loss(self, ng_list, do_full_eval=True):
        # pos: (batch_size)
        # neg: (batch_size, num_neg_sample)
        pos, neg = self.get_batch_scores(ng_list, do_full_eval)
        num_neg_sample = neg.size()[1]
        # pos: (batch_size)
        pos = torch.log(self.activation(pos))
        # neg: (batch_size)
        neg = torch.sum(torch.log(self.activation(-neg)), dim=1, keepdim=False) / num_neg_sample
        losses = -(pos + neg)
        loss = losses.mean()
        return loss
    def get_context_pt_embed(self, ng_list):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the context points
        Return:
            key_embeds: shape (batch_size, num_context_sample, embed_dim)
        '''
        # pt_list: shape (batch_size*num_context_sample)
        pt_list = []
        for ng in ng_list:
            pt_list += list(ng.sample_context_pts)
        # key_embeds: shape (batch_size*num_context_sample, embed_dim)
        key_embeds = self.enc(pt_list)
        # key_embeds: shape (batch_size, num_context_sample, embed_dim)
        key_embeds = key_embeds.view(len(ng_list), self.num_context_sample, -1)
        return key_embeds

    def get_neg_pt_embed(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the negative sampled center points
        Return:
            key_embeds: shape (batch_size, num_neg_sample, embed_dim)
        '''
        if do_full_eval == True:
            num_neg_sample = len(ng_list[0].neg_samples)
            # pt_list: shape (batch_size*num_neg_sample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.neg_samples)
            # key_embeds: shape (batch_size*num_neg_sample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_sample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), num_neg_sample, -1)
        else:
            # pt_list: shape (batch_size*num_neg_resample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.sample_neg_pts)
            # key_embeds: shape (batch_size*num_neg_resample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_resample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), self.num_neg_resample, -1)
        return key_embeds

    def get_center_pt_embed(self, ng_list):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the center points
        Return:
            query_embed: shape (batch_size, embed_dim)
        '''
        pt_list = [ng.center_pt for ng in ng_list]
        # query_embed: shape (batch_size, embed_dim)
        query_embed = self.enc(pt_list)
        return query_embed

    def get_spa_coords(self, ng_list):
        '''
        Given a list of NeighborGraph(), get their (deltaX, deltaY) lists
        '''
        coords = []
        for ng in ng_list:
            cur_coords = []
            center_coord = self.pointset.pt_dict[ng.center_pt].coord
            for i in range(len(ng.sample_context_pts)):
                coord = self.pointset.pt_dict[ng.sample_context_pts[i]].coord
                cur_coords.append([coord[0] - center_coord[0], coord[1] - center_coord[1]])
            coords.append(cur_coords)
        # coords: shape (batch_size, num_context_sample, 2)
        return coords
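The `softmax_loss` above is a standard negative-sampling objective: it maximizes log σ(positive score) plus the mean of log σ(−negative scores). A small numpy sketch of the same arithmetic (assuming the sigmoid activation, which is the constructor default):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax_loss_np(pos, neg):
    """pos: (batch,) positive scores; neg: (batch, num_neg) negative scores."""
    pos_term = np.log(sigmoid(pos))                               # (batch,)
    neg_term = np.log(sigmoid(-neg)).sum(axis=1) / neg.shape[1]   # (batch,)
    return -(pos_term + neg_term).mean()

loss = softmax_loss_np(np.array([2.0, 1.0]),
                       np.array([[-1.0, 0.0], [0.5, -0.5]]))
```

The loss is strictly positive and shrinks as positive scores grow and negative scores become more negative, which is exactly the gradient signal the PyTorch version provides.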
class GlobalPositionNeighGraphEncoderDecoder(nn.Module):
    """
    Add the global position embedding of the center point in the decoder.
    Combine the encoder and decoder, and set up the training process
    """
    def __init__(self, pointset, enc, spa_enc, g_spa_enc, init_dec, dec, activation="sigmoid", num_context_sample=10, num_neg_resample=10):
        super(GlobalPositionNeighGraphEncoderDecoder, self).__init__()
        self.pointset = pointset
        self.enc = enc
        self.init_dec = init_dec
        self.dec = dec
        self.spa_enc = spa_enc
        self.g_spa_enc = g_spa_enc
        self.num_context_sample = num_context_sample
        self.num_neg_resample = num_neg_resample  # given 100 negative samples, we resample 10
        self.activation = get_activation_function(activation, "GlobalPositionNeighGraphEncoderDecoder")

    def sample_context_pts(self, ng_list):
        for ng in ng_list:
            ng.sample_neighbor(self.num_context_sample)

    def sample_neg_pts(self, ng_list):
        for ng in ng_list:
            ng.sample_neg(self.num_neg_resample)

    def forward(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph():
        1. Compute the predicted center point feature embedding
        2. Get the ground truth feature embedding for the center points
        3. Get the N negative sampled center point feature embeddings
        Args:
            ng_list: a list of NeighborGraph()
            do_full_eval: whether to use the full set of negative samples for evaluation
        Return:
            center_pred_embed: the center point feature embedding predicted from the context points
                shape (batch_size, embed_dim)
            center_embed: the ground truth feature embedding for the center points
                shape (batch_size, embed_dim)
            neg_embeds: the N negative sampled center point feature embeddings
                shape (batch_size, num_neg_sample, embed_dim)
        '''
        # randomly sample the context points in each NeighborGraph()
        self.sample_context_pts(ng_list)
        if do_full_eval == False:
            # randomly resample the negative points in each NeighborGraph()
            self.sample_neg_pts(ng_list)
        # 1. predict the center point feature embedding from the context points
        if self.spa_enc is not None:
            # get all context-center point (deltaX, deltaY) lists
            # coords: shape (batch_size, num_context_sample, 2)
            coords = self.get_spa_coords(ng_list)
            # key_spa_embeds: shape (batch_size, num_context_sample, spa_embed_dim)
            key_spa_embeds = self.spa_enc(coords)
        else:
            key_spa_embeds = torch.FloatTensor([])
        # 1.1 get the feature embedding of the context points
        # key_embeds: shape (batch_size, num_context_sample, embed_dim)
        key_embeds = self.get_context_pt_embed(ng_list)
        # 1.2 get the center point position embedding
        if self.g_spa_enc is not None:
            # coords: shape (batch_size, 1, 2)
            coords = self.get_center_pt_spa_coords(ng_list)
            # center_g_spa_embeds: shape (batch_size, 1, g_spa_embed_dim)
            center_g_spa_embeds = self.g_spa_enc(coords)
            # center_g_spa_embeds: shape (batch_size, g_spa_embed_dim)
            center_g_spa_embeds = center_g_spa_embeds.squeeze(1)
        else:
            center_g_spa_embeds = torch.FloatTensor([])
        # init_query_embed: shape (batch_size, embed_dim)
        init_query_embed = self.init_dec(key_embeds, key_spa_embeds, center_g_spa_embeds, query_embed=None)
        if self.dec is not None:
            # center_pred_embed: shape (batch_size, embed_dim)
            center_pred_embed = self.dec(key_embeds, key_spa_embeds, center_g_spa_embeds, query_embed=init_query_embed)
        else:
            center_pred_embed = init_query_embed
        # 2. get the true center embedding
        # center_embed: shape (batch_size, embed_dim)
        center_embed = self.get_center_pt_embed(ng_list)
        # 3. get the negative embeddings
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        neg_embeds = self.get_neg_pt_embed(ng_list, do_full_eval)
        return center_pred_embed, center_embed, neg_embeds

    def get_batch_scores(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(),
        Return:
            pos: the dot product score between the ground truth center point embedding and the predicted embedding
                (batch_size)
            neg: the dot product score between the negative sampled center point embeddings and the predicted embedding
                (batch_size, num_neg_sample)
        '''
        # center_pred_embed: shape (batch_size, embed_dim)
        # center_embed: shape (batch_size, embed_dim)
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed, center_embed, neg_embeds = self.forward(ng_list, do_full_eval)
        # positive score
        # pos: (batch_size)
        pos = torch.sum(center_embed * center_pred_embed, dim=1, keepdim=False)
        # negative sampling
        # center_pred_embed_: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed_ = center_pred_embed.unsqueeze(1).expand_as(neg_embeds)
        # neg: (batch_size, num_neg_sample)
        neg = torch.sum(neg_embeds * center_pred_embed_, dim=2, keepdim=False)
        return pos, neg

    def softmax_loss(self, ng_list, do_full_eval=True):
        # pos: (batch_size)
        # neg: (batch_size, num_neg_sample)
        pos, neg = self.get_batch_scores(ng_list, do_full_eval)
        num_neg_sample = neg.size()[1]
        # pos: (batch_size)
        pos = torch.log(self.activation(pos))
        # neg: (batch_size)
        neg = torch.sum(torch.log(self.activation(-neg)), dim=1, keepdim=False) / num_neg_sample
        losses = -(pos + neg)
        loss = losses.mean()
        return loss

    def get_context_pt_embed(self, ng_list):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the context points
        Return:
            key_embeds: shape (batch_size, num_context_sample, embed_dim)
        '''
        # pt_list: shape (batch_size*num_context_sample)
        pt_list = []
        for ng in ng_list:
            pt_list += list(ng.sample_context_pts)
        # key_embeds: shape (batch_size*num_context_sample, embed_dim)
        key_embeds = self.enc(pt_list)
        # key_embeds: shape (batch_size, num_context_sample, embed_dim)
        key_embeds = key_embeds.view(len(ng_list), self.num_context_sample, -1)
        return key_embeds

    def get_neg_pt_embed(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the negative sampled center points
        Return:
            key_embeds: shape (batch_size, num_neg_sample, embed_dim)
        '''
        if do_full_eval == True:
            num_neg_sample = len(ng_list[0].neg_samples)
            # pt_list: shape (batch_size*num_neg_sample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.neg_samples)
            # key_embeds: shape (batch_size*num_neg_sample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_sample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), num_neg_sample, -1)
        else:
            # pt_list: shape (batch_size*num_neg_resample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.sample_neg_pts)
            # key_embeds: shape (batch_size*num_neg_resample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_resample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), self.num_neg_resample, -1)
        return key_embeds

    def get_center_pt_embed(self, ng_list):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the center points
        Return:
            query_embed: shape (batch_size, embed_dim)
        '''
        pt_list = [ng.center_pt for ng in ng_list]
        # query_embed: shape (batch_size, embed_dim)
        query_embed = self.enc(pt_list)
        return query_embed

    def get_spa_coords(self, ng_list):
        '''
        Given a list of NeighborGraph(), get their (deltaX, deltaY) lists
        '''
        coords = []
        for ng in ng_list:
            cur_coords = []
            center_coord = self.pointset.pt_dict[ng.center_pt].coord
            for i in range(len(ng.sample_context_pts)):
                coord = self.pointset.pt_dict[ng.sample_context_pts[i]].coord
                cur_coords.append([coord[0] - center_coord[0], coord[1] - center_coord[1]])
            coords.append(cur_coords)
        # coords: shape (batch_size, num_context_sample, 2)
        return coords

    def get_center_pt_spa_coords(self, ng_list):
        '''
        Given a list of NeighborGraph(), get their center point (X, Y) lists
        '''
        coords = []
        for ng in ng_list:
            cur_coords = []
            center_coord = self.pointset.pt_dict[ng.center_pt].coord
            cur_coords.append(center_coord)
            coords.append(cur_coords)
        # coords: shape (batch_size, 1, 2)
        return coords

    def freeze_param_except_join_dec(self):
        # freeze all parameters except those of joint_dec
        self.freeze_param(self.enc)
        self.freeze_param(self.init_dec)
        # self.freeze_param(self.joint_dec)
        self.freeze_param(self.spa_enc)
        self.freeze_param(self.g_spa_enc)
        # NOTE: unlike JointRelativeGlobalEncoderDecoder, this class defines no
        # self.g_spa_dec; the original call to freeze it would raise AttributeError,
        # so it is omitted here.

    def freeze_param(self, module):
        for param in module.parameters():
            param.requires_grad = False
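Both neighbor-graph classes compute relative coordinates the same way in `get_spa_coords`: each context point's offset from the center point. A standalone miniature of that computation, with made-up coordinates:

```python
# Hypothetical center and context coordinates (the real ones come from pointset.pt_dict).
center = (10.0, 20.0)
context = [(12.0, 21.0), (9.0, 18.0)]

# (deltaX, deltaY) of each context point relative to the center
deltas = [[x - center[0], y - center[1]] for (x, y) in context]
print(deltas)  # [[2.0, 1.0], [-1.0, -2.0]]
```

Using offsets rather than absolute positions is what makes the relative spatial encoder translation-invariant: shifting the whole neighborhood leaves `deltas` unchanged.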
class GlobalPositionEncoderDecoder(nn.Module):
    """
    Encode the position of a point and directly decode it to its feature embedding
    """
    def __init__(self, pointset, enc, g_spa_enc, g_spa_dec, activation="sigmoid", num_neg_resample=10):
        super(GlobalPositionEncoderDecoder, self).__init__()
        self.pointset = pointset
        self.enc = enc              # point feature embedding encoder
        self.g_spa_enc = g_spa_enc  # one of the SpatialRelationEncoder
        self.g_spa_dec = g_spa_dec  # DirectPositionEmbeddingDecoder()
        self.activation = get_activation_function(activation, "GlobalPositionEncoderDecoder")
        self.num_neg_resample = num_neg_resample  # given 100 negative samples, we resample 10

    def sample_neg_pts(self, ng_list):
        for ng in ng_list:
            ng.sample_neg(self.num_neg_resample)

    def forward(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph():
        1. Compute the predicted center point feature embedding from its global position
        2. Get the ground truth feature embedding for the center points
        3. Get the N negative sampled center point feature embeddings
        Args:
            ng_list: a list of NeighborGraph()
            do_full_eval: whether to use the full set of negative samples for evaluation
        Return:
            center_pred_embed: the predicted feature embedding for the center points
                shape (batch_size, embed_dim)
            center_embed: the ground truth feature embedding for the center points
                shape (batch_size, embed_dim)
            neg_embeds: the N negative sampled center point feature embeddings
                shape (batch_size, num_neg_sample, embed_dim)
        '''
        if do_full_eval == False:
            # randomly resample the negative points in each NeighborGraph()
            self.sample_neg_pts(ng_list)
        # coords: shape (batch_size, 1, 2)
        coords = self.get_center_pt_spa_coords(ng_list)
        # center_g_spa_embeds: shape (batch_size, 1, g_spa_embed_dim)
        center_g_spa_embeds = self.g_spa_enc(coords)
        # center_g_spa_embeds: shape (batch_size, g_spa_embed_dim)
        center_g_spa_embeds = center_g_spa_embeds.squeeze(1)
        # center_pred_embed: shape (batch_size, embed_dim)
        center_pred_embed = self.g_spa_dec(center_g_spa_embeds)
        # center_embed: shape (batch_size, embed_dim)
        center_embed = self.get_center_pt_embed(ng_list)
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        neg_embeds = self.get_neg_pt_embed(ng_list, do_full_eval)
        return center_pred_embed, center_embed, neg_embeds

    def get_pred_embed_from_coords(self, coords):
        '''
        Args:
            coords: a list of coordinates, (y_batch_size, x_batch_size, 2)
        '''
        # center_g_spa_embeds: shape (y_batch_size, x_batch_size, g_spa_embed_dim)
        center_g_spa_embeds = self.g_spa_enc(coords)
        y_batch_size, x_batch_size, g_spa_embed_dim = center_g_spa_embeds.size()
        # center_g_spa_embeds: shape (y_batch_size * x_batch_size, g_spa_embed_dim)
        center_g_spa_embeds = center_g_spa_embeds.view(y_batch_size * x_batch_size, g_spa_embed_dim)
        # center_pred_embed: shape (y_batch_size * x_batch_size, embed_dim)
        center_pred_embed = self.g_spa_dec(center_g_spa_embeds)
        return center_pred_embed
    def get_batch_scores(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(),
        Return:
            pos: the dot product score between the ground truth center point embedding and the predicted embedding
                (batch_size)
            neg: the dot product score between the negative sampled center point embeddings and the predicted embedding
                (batch_size, num_neg_sample)
        '''
        # center_pred_embed: shape (batch_size, embed_dim)
        # center_embed: shape (batch_size, embed_dim)
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed, center_embed, neg_embeds = self.forward(ng_list, do_full_eval)
        # positive score
        # pos: (batch_size)
        pos = torch.sum(center_embed * center_pred_embed, dim=1, keepdim=False)
        # negative sampling
        # center_pred_embed_: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed_ = center_pred_embed.unsqueeze(1).expand_as(neg_embeds)
        # neg: (batch_size, num_neg_sample)
        neg = torch.sum(neg_embeds * center_pred_embed_, dim=2, keepdim=False)
        return pos, neg

    def softmax_loss(self, ng_list, do_full_eval=True):
        # pos: (batch_size)
        # neg: (batch_size, num_neg_sample)
        pos, neg = self.get_batch_scores(ng_list, do_full_eval)
        num_neg_sample = neg.size()[1]
        # pos: (batch_size)
        pos = torch.log(self.activation(pos))
        # neg: (batch_size)
        neg = torch.sum(torch.log(self.activation(-neg)), dim=1, keepdim=False) / num_neg_sample
        losses = -(pos + neg)
        loss = losses.mean()
        return loss

    def get_center_pt_spa_coords(self, ng_list):
        '''
        Given a list of NeighborGraph(), get their center point (X, Y) lists
        '''
        coords = []
        for ng in ng_list:
            cur_coords = []
            center_coord = self.pointset.pt_dict[ng.center_pt].coord
            cur_coords.append(center_coord)
            coords.append(cur_coords)
        # coords: shape (batch_size, 1, 2)
        return coords

    def get_neg_pt_embed(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the negative sampled center points
        Return:
            key_embeds: shape (batch_size, num_neg_sample, embed_dim)
        '''
        if do_full_eval == True:
            num_neg_sample = len(ng_list[0].neg_samples)
            # pt_list: shape (batch_size*num_neg_sample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.neg_samples)
            # key_embeds: shape (batch_size*num_neg_sample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_sample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), num_neg_sample, -1)
        else:
            # pt_list: shape (batch_size*num_neg_resample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.sample_neg_pts)
            # key_embeds: shape (batch_size*num_neg_resample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_resample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), self.num_neg_resample, -1)
        return key_embeds

    def get_center_pt_embed(self, ng_list):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the center points
        Return:
            query_embed: shape (batch_size, embed_dim)
        '''
        pt_list = [ng.center_pt for ng in ng_list]
        # query_embed: shape (batch_size, embed_dim)
        query_embed = self.enc(pt_list)
        return query_embed
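`get_pred_embed_from_coords` above flattens a (y, x) grid of position embeddings into a single batch before decoding. The reshape it performs is equivalent to this numpy sketch (dimensions 2, 3, 4 are arbitrary examples):

```python
import numpy as np

# emb stands in for the encoder output: (y_batch_size, x_batch_size, g_spa_embed_dim)
emb = np.arange(2 * 3 * 4).reshape(2, 3, 4)

# flatten the spatial grid into one batch dimension before the decoder
flat = emb.reshape(2 * 3, 4)   # (y_batch_size * x_batch_size, g_spa_embed_dim)
print(flat.shape)  # (6, 4)
```

Because the reshape is row-major, `flat[0]` is the embedding of grid cell (0, 0), `flat[1]` of cell (0, 1), and so on, which lets the decoded embeddings be mapped back onto the grid afterwards.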
class JointRelativeGlobalEncoderDecoder(nn.Module):
    """
    Combine the encoder and decoder, and set up the training process
    """
    def __init__(self, pointset, enc, spa_enc, g_spa_enc, g_spa_dec, init_dec, dec, joint_dec, activation="sigmoid", num_context_sample=10, num_neg_resample=10):
        super(JointRelativeGlobalEncoderDecoder, self).__init__()
        self.pointset = pointset
        self.enc = enc
        self.init_dec = init_dec
        self.dec = dec
        self.joint_dec = joint_dec
        self.spa_enc = spa_enc
        self.g_spa_enc = g_spa_enc
        self.g_spa_dec = g_spa_dec
        self.num_context_sample = num_context_sample
        self.num_neg_resample = num_neg_resample  # given 100 negative samples, we resample 10
        self.activation = get_activation_function(activation, "JointRelativeGlobalEncoderDecoder")

    def sample_context_pts(self, ng_list):
        for ng in ng_list:
            ng.sample_neighbor(self.num_context_sample)

    def sample_neg_pts(self, ng_list):
        for ng in ng_list:
            ng.sample_neg(self.num_neg_resample)

    def forward(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph():
        1. Compute the predicted center point feature embedding
        2. Get the ground truth feature embedding for the center points
        3. Get the N negative sampled center point feature embeddings
        Args:
            ng_list: a list of NeighborGraph()
            do_full_eval: whether to use the full set of negative samples for evaluation
        Return:
            center_pred_embed: the center point feature embedding predicted from the context points
                shape (batch_size, embed_dim)
            center_embed: the ground truth feature embedding for the center points
                shape (batch_size, embed_dim)
            neg_embeds: the N negative sampled center point feature embeddings
                shape (batch_size, num_neg_sample, embed_dim)
        '''
        # randomly sample the context points in each NeighborGraph()
        self.sample_context_pts(ng_list)
        if do_full_eval == False:
            # randomly resample the negative points in each NeighborGraph()
            self.sample_neg_pts(ng_list)
        # 1. predict the center point feature embedding from the context points
        if self.spa_enc is not None:
            # get all context-center point (deltaX, deltaY) lists
            # coords: shape (batch_size, num_context_sample, 2)
            coords = self.get_spa_coords(ng_list)
            # key_spa_embeds: shape (batch_size, num_context_sample, spa_embed_dim)
            key_spa_embeds = self.spa_enc(coords)
        else:
            key_spa_embeds = torch.FloatTensor([])
        # get the feature embedding of the context points
        # key_embeds: shape (batch_size, num_context_sample, embed_dim)
        key_embeds = self.get_context_pt_embed(ng_list)
        # init_query_embed: shape (batch_size, embed_dim)
        init_query_embed = self.init_dec(key_embeds, key_spa_embeds, query_embed=None)
        if self.dec is not None:
            # center_pred_embed_1: shape (batch_size, embed_dim)
            center_pred_embed_1 = self.dec(key_embeds, key_spa_embeds, query_embed=init_query_embed)
        else:
            center_pred_embed_1 = init_query_embed
        # 2. predict the center feature embedding from the point location
        # coords: shape (batch_size, 1, 2)
        coords = self.get_center_pt_spa_coords(ng_list)
        # center_g_spa_embeds: shape (batch_size, 1, g_spa_embed_dim)
        center_g_spa_embeds = self.g_spa_enc(coords)
        # center_g_spa_embeds: shape (batch_size, g_spa_embed_dim)
        center_g_spa_embeds = center_g_spa_embeds.squeeze(1)
        # center_pred_embed_2: shape (batch_size, embed_dim)
        center_pred_embed_2 = self.g_spa_dec(center_g_spa_embeds)
        # 3. given the center embedding predicted from context and the center position
        #    embedding, predict the final feature embedding
        center_pred_embed = self.joint_dec(center_pred_embed_1, center_pred_embed_2)
        # 4. get the true center embedding
        # center_embed: shape (batch_size, embed_dim)
        center_embed = self.get_center_pt_embed(ng_list)
        # 5. get the negative embeddings
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        neg_embeds = self.get_neg_pt_embed(ng_list, do_full_eval)
        return center_pred_embed, center_embed, neg_embeds

    def get_batch_scores(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(),
        Return:
            pos: the dot product score between the ground truth center point embedding and the predicted embedding
                (batch_size)
            neg: the dot product score between the negative sampled center point embeddings and the predicted embedding
                (batch_size, num_neg_sample)
        '''
        # center_pred_embed: shape (batch_size, embed_dim)
        # center_embed: shape (batch_size, embed_dim)
        # neg_embeds: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed, center_embed, neg_embeds = self.forward(ng_list, do_full_eval)
        # positive score
        # pos: (batch_size)
        pos = torch.sum(center_embed * center_pred_embed, dim=1, keepdim=False)
        # negative sampling
        # center_pred_embed_: shape (batch_size, num_neg_sample, embed_dim)
        center_pred_embed_ = center_pred_embed.unsqueeze(1).expand_as(neg_embeds)
        # neg: (batch_size, num_neg_sample)
        neg = torch.sum(neg_embeds * center_pred_embed_, dim=2, keepdim=False)
        return pos, neg

    def softmax_loss(self, ng_list, do_full_eval=True):
        # pos: (batch_size)
        # neg: (batch_size, num_neg_sample)
        pos, neg = self.get_batch_scores(ng_list, do_full_eval)
        num_neg_sample = neg.size()[1]
        # pos: (batch_size)
        pos = torch.log(self.activation(pos))
        # neg: (batch_size)
        neg = torch.sum(torch.log(self.activation(-neg)), dim=1, keepdim=False) / num_neg_sample
        losses = -(pos + neg)
        loss = losses.mean()
        return loss

    def get_context_pt_embed(self, ng_list):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the context points
        Return:
            key_embeds: shape (batch_size, num_context_sample, embed_dim)
        '''
        # pt_list: shape (batch_size*num_context_sample)
        pt_list = []
        for ng in ng_list:
            pt_list += list(ng.sample_context_pts)
        # key_embeds: shape (batch_size*num_context_sample, embed_dim)
        key_embeds = self.enc(pt_list)
        # key_embeds: shape (batch_size, num_context_sample, embed_dim)
        key_embeds = key_embeds.view(len(ng_list), self.num_context_sample, -1)
        return key_embeds

    def get_neg_pt_embed(self, ng_list, do_full_eval=True):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the negative sampled center points
        Return:
            key_embeds: shape (batch_size, num_neg_sample, embed_dim)
        '''
        if do_full_eval == True:
            num_neg_sample = len(ng_list[0].neg_samples)
            # pt_list: shape (batch_size*num_neg_sample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.neg_samples)
            # key_embeds: shape (batch_size*num_neg_sample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_sample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), num_neg_sample, -1)
        else:
            # pt_list: shape (batch_size*num_neg_resample)
            pt_list = []
            for ng in ng_list:
                pt_list += list(ng.sample_neg_pts)
            # key_embeds: shape (batch_size*num_neg_resample, embed_dim)
            key_embeds = self.enc(pt_list)
            # key_embeds: shape (batch_size, num_neg_resample, embed_dim)
            key_embeds = key_embeds.view(len(ng_list), self.num_neg_resample, -1)
        return key_embeds

    def get_center_pt_embed(self, ng_list):
        '''
        Given a list of NeighborGraph(), get the feature embedding of the center points
        Return:
            query_embed: shape (batch_size, embed_dim)
        '''
        pt_list = [ng.center_pt for ng in ng_list]
        # query_embed: shape (batch_size, embed_dim)
        query_embed = self.enc(pt_list)
        return query_embed

    def get_spa_coords(self, ng_list):
        '''
        Given a list of NeighborGraph(), get their (deltaX, deltaY) lists
        '''
        coords = []
        for ng in ng_list:
            cur_coords = []
            center_coord = self.pointset.pt_dict[ng.center_pt].coord
            for i in range(len(ng.sample_context_pts)):
                coord = self.pointset.pt_dict[ng.sample_context_pts[i]].coord
                cur_coords.append([coord[0] - center_coord[0], coord[1] - center_coord[1]])
            coords.append(cur_coords)
        # coords: shape (batch_size, num_context_sample, 2)
        return coords

    def get_center_pt_spa_coords(self, ng_list):
        '''
        Given a list of NeighborGraph(), get their center point (X, Y) lists
        '''
        coords = []
        for ng in ng_list:
            cur_coords = []
            center_coord = self.pointset.pt_dict[ng.center_pt].coord
            cur_coords.append(center_coord)
            coords.append(cur_coords)
        # coords: shape (batch_size, 1, 2)
        return coords

    def freeze_param_except_join_dec(self):
        # freeze all parameters except those of joint_dec
        self.freeze_param(self.enc)
        self.freeze_param(self.init_dec)
        # self.freeze_param(self.joint_dec)
        self.freeze_param(self.spa_enc)
        self.freeze_param(self.g_spa_enc)
        self.freeze_param(self.g_spa_dec)

    def freeze_param(self, module):
        for param in module.parameters():
            param.requires_grad = False
import torch
import torch.nn as nn
from torch.autograd import Variable as V
import cv2
import numpy as np


class MyFrame():
    def __init__(self, net, loss, lr=2e-4, evalmode=False):
        self.net = net().cuda()
        self.net = torch.nn.DataParallel(self.net, device_ids=range(torch.cuda.device_count()))
        self.optimizer = torch.optim.Adam(params=self.net.parameters(), lr=lr, betas=(0.9, 0.999), eps=1e-8, weight_decay=5e-4, amsgrad=False)
        # self.optimizer = torch.optim.RMSprop(params=self.net.parameters(), lr=lr)
        self.loss = loss()
        self.old_lr = lr
        if evalmode:
            for i in self.net.modules():
                if isinstance(i, nn.BatchNorm2d):
                    i.eval()

    # (3) multi-scale loss
    def set_input(self, img_batch, mask_batch=None, mask_batch1=None, mask_batch2=None, mask_batch3=None, mask_batch4=None, mask_batch5=None, img_id=None):
        self.img = img_batch
        self.mask = mask_batch
        self.mask1 = mask_batch1
        self.mask2 = mask_batch2
        self.mask3 = mask_batch3
        self.mask4 = mask_batch4
        self.mask5 = mask_batch5
        # self.masks = [mask_batch1, mask_batch2, mask_batch3, mask_batch4, mask_batch5, mask_batch]
        self.masks = [mask_batch, mask_batch1, mask_batch2, mask_batch3, mask_batch4, mask_batch]
        self.img_id = img_id

    def test_one_img(self, img):
        pred = self.net.forward(img)
        pred[pred > 0.5] = 1
        pred[pred <= 0.5] = 0
        mask = pred.squeeze().cpu().data.numpy()
        return mask

    def test_batch(self):
        self.forward(volatile=True)
        mask = self.net.forward(self.img).cpu().data.numpy().squeeze(1)
        mask[mask > 0.5] = 1
        mask[mask <= 0.5] = 0
        return mask, self.img_id

    def test_one_img_from_path(self, path):
        img = cv2.imread(path)
        img = np.array(img, np.float32) / 255.0 * 3.2 - 1.6
        img = V(torch.Tensor(img).cuda())
        mask = self.net.forward(img).squeeze().cpu().data.numpy()
        mask[mask > 0.5] = 1
        mask[mask <= 0.5] = 0
        return mask

    def forward(self, volatile=False):
        self.img = V(self.img.cuda(), volatile=volatile)
        if self.mask is not None:
            self.mask = V(self.mask.cuda(), volatile=volatile)
        # (3) multi-scale loss: move every mask to GPU
        self.masks[0] = V(self.masks[0].cuda(), volatile=volatile)
        self.masks[1] = V(self.masks[1].cuda(), volatile=volatile)
        self.masks[2] = V(self.masks[2].cuda(), volatile=volatile)
        self.masks[3] = V(self.masks[3].cuda(), volatile=volatile)
        self.masks[4] = V(self.masks[4].cuda(), volatile=volatile)
        self.masks[5] = V(self.masks[5].cuda(), volatile=volatile)

    def optimize(self):
        self.forward()
        self.optimizer.zero_grad()
        pred = self.net.forward(self.img)
        loss = self.loss(self.mask, pred)
        loss.backward()
        self.optimizer.step()
        # return loss.data[0]  # pre-0.4 PyTorch
        return loss.item()

    def optimize_MLOSS(self):
        self.forward()
        self.optimizer.zero_grad()
        preds = self.net.forward(self.img)
        # (5) multi-scale loss: average the loss over the six prediction maps
        loss = self.loss(self.masks[0], preds[0])
        loss = loss + self.loss(self.masks[1], preds[1])
        loss = loss + self.loss(self.masks[2], preds[2])
        loss = loss + self.loss(self.masks[3], preds[3])
        loss = loss + self.loss(self.masks[4], preds[4])
        loss = loss + self.loss(self.masks[5], preds[5])
        loss = loss / 6.0
#for o in preds:
#loss = loss + self.loss(self.mask, o)
loss.backward()
self.optimizer.step()
#return loss.data[0]
return loss.item()
def optimize_valid(self):
self.forward()
self.optimizer.zero_grad()
pred = self.net.forward(self.img)
loss = self.loss(self.mask, pred)
loss.backward()
self.optimizer.step()
#return loss.data[0]
return loss.item(),pred
def save(self, path):
#real_model = self.net.state_dict().module
#torch.save(real_model, path)
torch.save(self.net.state_dict(), path)
def load(self, path):
self.net.load_state_dict(torch.load(path))
def update_lr(self, new_lr, mylog, factor=False):
if factor:
new_lr = self.old_lr / new_lr
for param_group in self.optimizer.param_groups:
param_group['lr'] = new_lr
print('update learning rate: %f -> %f' % (self.old_lr, new_lr), file=mylog)
print('update learning rate: %f -> %f' % (self.old_lr, new_lr))
self.old_lr = new_lr
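The `test_one_img` and `test_batch` methods above binarize sigmoid outputs with a hard 0.5 threshold. A minimal NumPy sketch of that in-place thresholding (NumPy arrays stand in for the PyTorch tensors used above):

```python
import numpy as np

# Simulated sigmoid outputs for a 2x3 predicted mask.
pred = np.array([[0.1, 0.7, 0.5],
                 [0.9, 0.4, 0.51]], dtype=np.float32)

# Same in-place thresholding as MyFrame.test_one_img:
# values above 0.5 become foreground (1), the rest background (0).
pred[pred > 0.5] = 1
pred[pred <= 0.5] = 0

print(pred.astype(int).tolist())  # → [[0, 1, 0], [1, 0, 1]]
```

Note that 0.5 itself falls on the background side, since the second assignment uses `<=`.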
class MyFrame_valid():
def __init__(self, net, loss, lr=2e-4, evalmode=False):
self.net = net().cuda()
self.net = torch.nn.DataParallel(self.net, device_ids=range(torch.cuda.device_count()))
self.optimizer = torch.optim.Adam(params=self.net.parameters(), lr=lr)
# self.optimizer = torch.optim.RMSprop(params=self.net.parameters(), lr=lr)
self.loss = loss()
self.old_lr = lr
if evalmode:
for i in self.net.modules():
if isinstance(i, nn.BatchNorm2d):
i.eval()
self.net=self.net.eval()
def set_input(self, img_batch, mask_batch=None, img_id=None):
self.img = img_batch
self.mask = mask_batch
self.img_id = img_id
def test_one_img(self, img):
pred = self.net.forward(img)
pred[pred > 0.5] = 1
pred[pred <= 0.5] = 0
mask = pred.squeeze().cpu().data.numpy()
return mask
def test_batch(self):
self.forward(volatile=True)
mask = self.net.forward(self.img).cpu().data.numpy().squeeze(1)
mask[mask > 0.5] = 1
mask[mask <= 0.5] = 0
return mask, self.img_id
def test_one_img_from_path(self, path):
img = cv2.imread(path)
img = np.array(img, np.float32) / 255.0 * 3.2 - 1.6
img = V(torch.Tensor(img).cuda())
mask = self.net.forward(img).squeeze().cpu().data.numpy() # .squeeze(1)
mask[mask > 0.5] = 1
mask[mask <= 0.5] = 0
return mask
def forward(self, volatile=False):
self.img = V(self.img.cuda(), volatile=volatile)
if self.mask is not None:
self.mask = V(self.mask.cuda(), volatile=volatile)
def optimize(self):
self.forward()
self.optimizer.zero_grad()
pred = self.net.forward(self.img)
loss = self.loss(self.mask, pred)
loss.backward()
self.optimizer.step()
# return loss.data[0]
return loss.item()
def optimize_valid(self):
self.forward()
self.optimizer.zero_grad()
pred = self.net.forward(self.img)
loss = self.loss(self.mask, pred)
loss.backward()
self.optimizer.step()
# return loss.data[0]
return loss.item(), pred
def save(self, path):
torch.save(self.net.state_dict(), path)
def load(self, path):
self.net.load_state_dict(torch.load(path))
def update_lr(self, new_lr, mylog, factor=False):
if factor:
new_lr = self.old_lr / new_lr
for param_group in self.optimizer.param_groups:
param_group['lr'] = new_lr
print('update learning rate: %f -> %f' % (self.old_lr, new_lr), file=mylog)
print('update learning rate: %f -> %f' % (self.old_lr, new_lr))
self.old_lr = new_lr
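`update_lr` in both frame classes overloads its argument: with `factor=True` it is a divisor applied to the current rate, otherwise it is the new absolute rate. A small standalone sketch of that arithmetic:

```python
def next_lr(old_lr, new_lr, factor=False):
    """Mirror of update_lr's rate computation: with factor=True the
    second argument is a divisor applied to the current rate,
    otherwise it is taken as the new absolute rate."""
    if factor:
        new_lr = old_lr / new_lr
    return new_lr

halved = next_lr(2e-4, 2.0, factor=True)      # divide current rate by 2
absolute = next_lr(2e-4, 1e-5, factor=False)  # set rate directly
print(halved, absolute)  # → 0.0001 1e-05
```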
| 32.779592 | 150 | 0.574026 | 1,123 | 8,031 | 4.00089 | 0.11398 | 0.049855 | 0.03205 | 0.042733 | 0.823948 | 0.786334 | 0.751614 | 0.743601 | 0.731137 | 0.696417 | 0 | 0.025725 | 0.278795 | 8,031 | 244 | 151 | 32.913934 | 0.75 | 0.104968 | 0 | 0.801205 | 0 | 0 | 0.017333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.138554 | false | 0 | 0.03012 | 0 | 0.246988 | 0.024096 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
72f04521c4675dd73082bf7b0f12a8a444bc9ff7 | 6,179 | py | Python | tests/unit/controller/test_controller_handler.py | fgsalomon/petisco | 1aa53c67a43b80dca18000fd843380904c25edbb | [
"MIT"
] | null | null | null | tests/unit/controller/test_controller_handler.py | fgsalomon/petisco | 1aa53c67a43b80dca18000fd843380904c25edbb | [
"MIT"
] | null | null | null | tests/unit/controller/test_controller_handler.py | fgsalomon/petisco | 1aa53c67a43b80dca18000fd843380904c25edbb | [
"MIT"
] | null | null | null | import json
import pytest
from meiga import Success, isFailure
from petisco import controller_handler, CorrelationId, ERROR, INFO
from tests.unit.mocks.fake_logger import FakeLogger
from tests.unit.mocks.log_message_mother import LogMessageMother
@pytest.mark.unit
def test_should_execute_successfully_a_empty_controller_without_input_parameters():
logger = FakeLogger()
@controller_handler(logger=logger)
def my_controller():
return Success("Hello Petisco")
http_response = my_controller()
assert http_response == ({"message": "OK"}, 200)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller",
message="Result[status: success | value: Hello Petisco]",
).to_json(),
)
@pytest.mark.unit
def test_should_execute_successfully_a_empty_controller_with_correlation_id_as_only_input_parameter():
logger = FakeLogger()
@controller_handler(logger=logger)
def my_controller(correlation_id: CorrelationId):
return Success("Hello Petisco")
http_response = my_controller()
assert http_response == ({"message": "OK"}, 200)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
correlation_id = json.loads(first_logging_message[1])["correlation_id"]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", correlation_id=correlation_id, message="Start"
).to_json(),
)
assert second_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller",
correlation_id=correlation_id,
message="Result[status: success | value: Hello Petisco]",
).to_json(),
)
@pytest.mark.unit
def test_should_execute_with_a_failure_a_empty_controller_without_input_parameters():
logger = FakeLogger()
@controller_handler(logger=logger)
def my_controller():
return isFailure
http_response = my_controller()
assert http_response == (
{"error": {"message": "Unknown Error", "type": "HttpError"}},
500,
)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message == (
ERROR,
LogMessageMother.get_controller(
operation="my_controller", message="Result[status: failure | value: Error]"
).to_json(),
)
@pytest.mark.unit
def test_should_execute_with_a_failure_a_empty_controller_with_correlation_id_as_only_input_parameter():
logger = FakeLogger()
@controller_handler(logger=logger)
def my_controller(correlation_id: CorrelationId):
return isFailure
http_response = my_controller()
assert http_response == (
{"error": {"message": "Unknown Error", "type": "HttpError"}},
500,
)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
correlation_id = json.loads(first_logging_message[1])["correlation_id"]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", correlation_id=correlation_id, message="Start"
).to_json(),
)
assert second_logging_message == (
ERROR,
LogMessageMother.get_controller(
operation="my_controller",
correlation_id=correlation_id,
message="Result[status: failure | value: Error]",
).to_json(),
)
@pytest.mark.unit
def test_should_execute_successfully_a_empty_controller_without_input_parameters_and_logger():
@controller_handler()
def my_controller():
return Success("Hello Petisco")
http_response = my_controller()
assert http_response == ({"message": "OK"}, 200)
@pytest.mark.unit
def test_should_execute_successfully_a_filtered_object_by_blacklist():
logger = FakeLogger()
@controller_handler(logger=logger)
def my_controller():
return Success(b"This are bytes")
http_response = my_controller()
assert http_response == ({"message": "OK"}, 200)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Success result of type: bytes"
).to_json(),
)
@pytest.mark.unit
def test_should_log_an_exception_occurred_on_the_controller():
logger = FakeLogger()
@controller_handler(logger=logger)
def my_controller():
raise RuntimeError("my_controller exception")
http_response = my_controller()
assert http_response == (
{"error": {"message": "Internal Error.", "type": "InternalHttpError"}},
500,
)
first_logging_message = logger.get_logging_messages()[0]
second_logging_message = logger.get_logging_messages()[1]
assert first_logging_message == (
INFO,
LogMessageMother.get_controller(
operation="my_controller", message="Start"
).to_json(),
)
assert second_logging_message[0] == ERROR
assert "line" in second_logging_message[1]
assert "RuntimeError" in second_logging_message[1]
assert "my_controller exception" in second_logging_message[1]
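The tests above rely on a `FakeLogger` double that records `(level, message)` pairs for later inspection. `FakeLogger` itself lives in `tests/unit/mocks` and is not shown here; the same call-recording pattern can be sketched directly with `unittest.mock`:

```python
from unittest.mock import MagicMock

# A MagicMock records every call made on it, which is the same idea
# FakeLogger implements for (level, message) pairs in these tests.
logger = MagicMock()
logger.info('Start')
logger.error('Result[status: failure | value: Error]')

# Assertions after the fact, much like get_logging_messages() above.
logger.info.assert_called_once_with('Start')
messages = [c.args[0] for c in logger.error.call_args_list]
print(messages)  # → ['Result[status: failure | value: Error]']
```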
| 28.344037 | 104 | 0.686195 | 666 | 6,179 | 5.998499 | 0.132132 | 0.101627 | 0.075094 | 0.069086 | 0.886108 | 0.88035 | 0.865832 | 0.865832 | 0.857572 | 0.817272 | 0 | 0.008007 | 0.211685 | 6,179 | 217 | 105 | 28.474654 | 0.812154 | 0 | 0 | 0.761006 | 0 | 0 | 0.108917 | 0 | 0 | 0 | 0 | 0 | 0.138365 | 1 | 0.08805 | false | 0 | 0.037736 | 0.037736 | 0.163522 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
72fb880a4aafae78bc4ddfc6d98c0bad806d5f7d | 15,578 | py | Python | model.py | pwdonh/tedlium_model | 354fe48f208738e4b0f34017634c399729a4be89 | [
"MIT"
] | 2 | 2019-12-03T16:50:45.000Z | 2020-01-15T12:24:57.000Z | model.py | pwdonh/tedlium_model | 354fe48f208738e4b0f34017634c399729a4be89 | [
"MIT"
] | 1 | 2020-03-06T11:32:41.000Z | 2020-03-06T11:32:41.000Z | model.py | pwdonh/tedlium_model | 354fe48f208738e4b0f34017634c399729a4be89 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
from torch.autograd import Variable
import torch.nn.functional as F
from rnn_modules import *
class RNNModel(nn.Module):
"""Container module with an encoder, a recurrent module, and a decoder."""
def __init__(self, rnn_type, ntoken, ninp, nhid, nlayers, dropout=0.5, tie_weights=False, is_lnorm=False):
super(RNNModel, self).__init__()
self.drop = nn.Dropout(dropout)
self.encoder = nn.Embedding(ntoken, ninp)
if rnn_type in ['LSTM', 'GRU']:
self.rnn = getattr(nn, rnn_type)(ninp, nhid, nlayers, dropout=dropout)
elif rnn_type in ['LSTM2', 'HMLSTM']:
self.rnn = {'LSTM2': LSTM2, 'HMLSTM': HMLSTM}[rnn_type](ninp, nhid, nlayers, dropout=dropout, is_lnorm=is_lnorm)
else:
try:
nonlinearity = {'RNN_TANH': 'tanh', 'RNN_RELU': 'relu'}[rnn_type]
except KeyError:
raise ValueError( """An invalid option for `--model` was supplied,
options are ['LSTM', 'GRU', 'RNN_TANH' or 'RNN_RELU']""")
self.rnn = nn.RNN(ninp, nhid, nlayers, nonlinearity=nonlinearity, dropout=dropout)
self.decoder = nn.Linear(nhid, ntoken)
# Optionally tie weights as in:
# "Using the Output Embedding to Improve Language Models" (Press & Wolf 2016)
# https://arxiv.org/abs/1608.05859
# and
# "Tying Word Vectors and Word Classifiers: A Loss Framework for Language Modeling" (Inan et al. 2016)
# https://arxiv.org/abs/1611.01462
if tie_weights:
if nhid != ninp:
raise ValueError('When using the tied flag, nhid must be equal to emsize')
self.decoder.weight = self.encoder.weight
self.init_weights()
self.rnn_type = rnn_type
self.nhid = nhid
self.nlayers = nlayers
self.nout = ntoken
def init_weights(self):
initrange = 0.1
self.encoder.weight.data.uniform_(-initrange, initrange)
self.decoder.bias.data.fill_(0)
self.decoder.weight.data.uniform_(-initrange, initrange)
def forward(self, input, hidden):
emb = self.drop(self.encoder(input[0]))
output, hidden = self.rnn(emb, hidden)
output = self.drop(output)
decoded = self.decoder(output.view(output.size(0)*output.size(1), output.size(2)))
return decoded.view(output.size(0), output.size(1), decoded.size(1)), hidden
def init_hidden(self, bsz):
weight = next(self.parameters()).data
if self.rnn_type in ['LSTM', 'LSTM2']:
return (Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()),
Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()))
elif self.rnn_type == 'HMLSTM':
return (Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()+1),
Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()+1),
Variable(weight.new(self.nlayers, bsz).zero_()+1))
else:
return Variable(weight.new(self.nlayers, bsz, self.nhid).zero_())
def criterion(self, output, targets):
loss_fcn = nn.CrossEntropyLoss()
loss = loss_fcn(output.view(-1, self.nout), targets[0])
return loss
class TedliumModel(nn.Module):
def __init__(self, rnn_type, ntoken, ninp, nhid, nlayers, qu_steps=20, dropout=0.5, tie_weights=False, is_lnorm=False):
super(TedliumModel, self).__init__()
self.drop = nn.Dropout(dropout)
self.encoder = nn.Embedding(ntoken, ninp)
self.encoder_dursw = nn.Embedding(qu_steps, qu_steps)
self.encoder_dursp = nn.Embedding(qu_steps, qu_steps)
ninp_rnn = ninp + 2*qu_steps
if rnn_type in ['LSTM', 'GRU']:
self.rnn = getattr(nn, rnn_type)(ninp_rnn, nhid, nlayers, dropout=dropout)
elif rnn_type in ['LSTM2', 'HMLSTM']:
self.rnn = {'LSTM2': LSTM2, 'HMLSTM': HMLSTM}[rnn_type](ninp_rnn, nhid, nlayers, dropout=dropout, is_lnorm=is_lnorm)
else:
try:
nonlinearity = {'RNN_TANH': 'tanh', 'RNN_RELU': 'relu'}[rnn_type]
except KeyError:
raise ValueError( """An invalid option for `--model` was supplied,
options are ['LSTM', 'GRU', 'RNN_TANH' or 'RNN_RELU']""")
self.rnn = nn.RNN(ninp_rnn, nhid, nlayers, nonlinearity=nonlinearity, dropout=dropout)
self.decoder = nn.Linear(nhid, ntoken)
if tie_weights:
if nhid != ninp:
raise ValueError('When using the tied flag, nhid must be equal to emsize')
self.decoder.weight = self.encoder.weight
self.init_weights()
self.rnn_type = rnn_type
self.nhid = nhid
self.nlayers = nlayers
self.nout = ntoken
def init_weights(self):
initrange = 0.1
self.encoder.weight.data.uniform_(-initrange, initrange)
self.decoder.bias.data.fill_(0)
self.decoder.weight.data.uniform_(-initrange, initrange)
self.encoder_dursw.weight.data.uniform_(-initrange, initrange)
self.encoder_dursp.weight.data.uniform_(-initrange, initrange)
def forward(self, input, hidden):
emb_word = self.drop(self.encoder(input[0]))
emb_durw = self.drop(self.encoder_dursw(input[1][:,:,0]))
emb_durp = self.drop(self.encoder_dursp(input[1][:,:,1]))
emb = torch.cat([emb_word, emb_durw, emb_durp], 2)
output, hidden = self.rnn(emb, hidden)
output = self.drop(output)
decoded = self.decoder(output.view(output.size(0)*output.size(1), output.size(2)))
return decoded.view(output.size(0), output.size(1), decoded.size(1)), hidden
def init_hidden(self, bsz):
weight = next(self.parameters()).data
if self.rnn_type in ['LSTM', 'LSTM2']:
return (Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()),
Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()))
elif self.rnn_type == 'HMLSTM':
return (Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()+1),
Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()+1),
Variable(weight.new(self.nlayers, bsz).zero_()+1))
else:
return Variable(weight.new(self.nlayers, bsz, self.nhid).zero_())
def criterion(self, output, targets):
loss_fcn = nn.CrossEntropyLoss()
loss = loss_fcn(output.view(-1, self.nout), targets[0])
return loss
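`TedliumModel.forward` concatenates the word embedding with two duration embeddings along the feature dimension, which is why the RNN input width is `ninp + 2*qu_steps`. A NumPy shape check of that concatenation (the sizes below are illustrative, not the model's actual hyperparameters):

```python
import numpy as np

seq_len, batch, ninp, qu_steps = 7, 4, 100, 20

emb_word = np.zeros((seq_len, batch, ninp))      # word embedding
emb_durw = np.zeros((seq_len, batch, qu_steps))  # word-duration embedding
emb_durp = np.zeros((seq_len, batch, qu_steps))  # pause-duration embedding

# Concatenate along the feature axis, as in TedliumModel.forward
emb = np.concatenate([emb_word, emb_durw, emb_durp], axis=2)

print(emb.shape)  # → (7, 4, 140), i.e. ninp + 2*qu_steps features
```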
class TedliumModelPredictDurs(TedliumModel):
def __init__(self, rnn_type, ntoken, ninp, nhid, nlayers, qu_steps=20, dropout=0.5, tie_weights=False):
super(TedliumModelPredictDurs, self).__init__(rnn_type, ntoken, ninp, nhid, nlayers, qu_steps, dropout, tie_weights)
self.decoder = nn.Linear(nhid, qu_steps)
if tie_weights:
raise ValueError('No tied weights for duration prediction')
# self.decoder.weight = self.encoder_dursp.weight
self.init_weights()
self.nout = qu_steps
def criterion(self, output, targets):
loss_fcn = nn.CrossEntropyLoss()
loss = loss_fcn(output.view(-1, self.nout), targets[1])
return loss
class TedliumModelCombined(TedliumModel):
def __init__(self, rnn_type, ntoken_word, ntoken_phone, ninp, nhid, nlayers, qu_steps=20, dropout=0.5, tie_weights=False):
super(TedliumModel, self).__init__()
self.drop = nn.Dropout(dropout)
self.encoder_word = nn.Embedding(ntoken_word, ninp)
# ninp_phone = max((int(ninp/5), ntoken_phone))
self.encoder_phone = nn.Embedding(ntoken_phone, ninp)
self.encoder_dursw = nn.Embedding(qu_steps, qu_steps)
self.encoder_dursp = nn.Embedding(qu_steps, qu_steps)
ninp_rnn = 2*ninp + 2*qu_steps
if rnn_type in ['LSTM', 'GRU']:
self.rnn = getattr(nn, rnn_type)(ninp_rnn, nhid, nlayers, dropout=dropout)
else:
try:
nonlinearity = {'RNN_TANH': 'tanh', 'RNN_RELU': 'relu'}[rnn_type]
except KeyError:
raise ValueError( """An invalid option for `--model` was supplied,
options are ['LSTM', 'GRU', 'RNN_TANH' or 'RNN_RELU']""")
self.rnn = nn.RNN(ninp_rnn, nhid, nlayers, nonlinearity=nonlinearity, dropout=dropout)
self.decoder_phone = nn.Linear(nhid, ntoken_phone)
self.decoder_word = nn.Linear(nhid, ntoken_word)
if tie_weights:
if nhid != ninp:
raise ValueError('When using the tied flag, nhid must be equal to emsize')
self.decoder_phone.weight = self.encoder_phone.weight
self.decoder_word.weight = self.encoder_word.weight
self.init_weights()
self.rnn_type = rnn_type
self.nhid = nhid
self.nlayers = nlayers
self.nout_phone = ntoken_phone
self.nout_word = ntoken_word
def init_weights(self):
initrange = 0.1
self.encoder_phone.weight.data.uniform_(-initrange, initrange)
self.encoder_word.weight.data.uniform_(-initrange, initrange)
self.decoder_phone.bias.data.fill_(0)
self.decoder_phone.weight.data.uniform_(-initrange, initrange)
self.decoder_word.bias.data.fill_(0)
self.decoder_word.weight.data.uniform_(-initrange, initrange)
self.encoder_dursw.weight.data.uniform_(-initrange, initrange)
self.encoder_dursp.weight.data.uniform_(-initrange, initrange)
def forward(self, input, hidden):
emb_phone = self.drop(self.encoder_phone(input[0]))
emb_word = self.drop(self.encoder_word(input[1]))
emb_durw = self.drop(self.encoder_dursw(input[2][:,:,0]))
emb_durp = self.drop(self.encoder_dursp(input[2][:,:,1]))
emb = torch.cat([emb_phone, emb_word, emb_durw, emb_durp], 2)
output, hidden = self.rnn(emb, hidden)
output = self.drop(output)
decoded_phone = self.decoder_phone(output.view(output.size(0)*output.size(1), output.size(2)))
decoded_word = self.decoder_word(output.view(output.size(0)*output.size(1), output.size(2)))
decoded_phone = decoded_phone.view(output.size(0), output.size(1), decoded_phone.size(1))
decoded_word = decoded_word.view(output.size(0), output.size(1), decoded_word.size(1))
return (decoded_phone, decoded_word), hidden
def init_hidden(self, bsz):
weight = next(self.parameters()).data
if self.rnn_type == 'LSTM':
return (Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()),
Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()))
else:
return Variable(weight.new(self.nlayers, bsz, self.nhid).zero_())
def criterion(self, output, targets):
loss_fcn = nn.CrossEntropyLoss()
loss_phone = loss_fcn(output[0].view(-1, self.nout_phone), targets[0])
loss_word = loss_fcn(output[1].view(-1, self.nout_word), targets[1])
loss = (loss_phone+loss_word)/2
return (loss,loss_phone,loss_word)
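`criterion` above averages the per-head cross-entropy values before backprop, so the phone and word objectives are weighted equally (`TedliumModelCombined2` extends this to three heads with a divisor of 3). The arithmetic in plain numbers:

```python
# Per-head cross-entropy values (illustrative numbers, not real losses)
loss_phone = 2.0
loss_word = 1.0

# Average the heads so each target contributes equally to the gradient.
loss = (loss_phone + loss_word) / 2
print(loss)  # → 1.5
```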
class TedliumModelCombined2(TedliumModel):
def __init__(self, rnn_type, ntoken_word, ntoken_phone, ninp, nhid, nlayers, qu_steps=20, dropout=0.5, tie_weights=False):
super(TedliumModel, self).__init__()
self.drop = nn.Dropout(dropout)
self.encoder_word = nn.Embedding(ntoken_word, ninp)
# ninp_phone = max((int(ninp/5), ntoken_phone))
self.encoder_phone = nn.Embedding(ntoken_phone, ninp)
self.encoder_dursw = nn.Embedding(qu_steps, qu_steps)
self.encoder_dursp = nn.Embedding(qu_steps, qu_steps)
ninp_rnn = 2*ninp + 2*qu_steps
if rnn_type in ['LSTM', 'GRU']:
self.rnn = getattr(nn, rnn_type)(ninp_rnn, nhid, nlayers, dropout=dropout)
else:
try:
nonlinearity = {'RNN_TANH': 'tanh', 'RNN_RELU': 'relu'}[rnn_type]
except KeyError:
raise ValueError( """An invalid option for `--model` was supplied,
options are ['LSTM', 'GRU', 'RNN_TANH' or 'RNN_RELU']""")
self.rnn = nn.RNN(ninp_rnn, nhid, nlayers, nonlinearity=nonlinearity, dropout=dropout)
self.decoder_phone = nn.Linear(nhid, ntoken_phone)
self.decoder_word = nn.Linear(nhid, ntoken_word)
self.decoder_word2 = nn.Linear(nhid, ntoken_word)
# if tie_weights:
# if nhid != ninp:
# raise ValueError('When using the tied flag, nhid must be equal to emsize')
# self.decoder_phone.weight = self.encoder_phone.weight
# self.decoder_word.weight = self.encoder_word.weight
# self.decoder_word2.weight = self.encoder_word.weight
self.init_weights()
self.rnn_type = rnn_type
self.nhid = nhid
self.nlayers = nlayers
self.nout_phone = ntoken_phone
self.nout_word = ntoken_word
def init_weights(self):
initrange = 0.1
self.encoder_phone.weight.data.uniform_(-initrange, initrange)
self.encoder_word.weight.data.uniform_(-initrange, initrange)
self.decoder_phone.bias.data.fill_(0)
self.decoder_phone.weight.data.uniform_(-initrange, initrange)
self.decoder_word.bias.data.fill_(0)
self.decoder_word.weight.data.uniform_(-initrange, initrange)
self.decoder_word2.bias.data.fill_(0)
self.decoder_word2.weight.data.uniform_(-initrange, initrange)
self.encoder_dursw.weight.data.uniform_(-initrange, initrange)
self.encoder_dursp.weight.data.uniform_(-initrange, initrange)
def forward(self, input, hidden):
emb_phone = self.drop(self.encoder_phone(input[0]))
emb_word = self.drop(self.encoder_word(input[1]))
emb_durw = self.drop(self.encoder_dursw(input[2][:,:,0]))
emb_durp = self.drop(self.encoder_dursp(input[2][:,:,1]))
emb = torch.cat([emb_phone, emb_word, emb_durw, emb_durp], 2)
output, hidden = self.rnn(emb, hidden)
output = self.drop(output)
decoded_phone = self.decoder_phone(output.view(output.size(0)*output.size(1), output.size(2)))
decoded_word = self.decoder_word(output.view(output.size(0)*output.size(1), output.size(2)))
decoded_word2 = self.decoder_word2(output.view(output.size(0)*output.size(1), output.size(2)))
decoded_phone = decoded_phone.view(output.size(0), output.size(1), decoded_phone.size(1))
decoded_word = decoded_word.view(output.size(0), output.size(1), decoded_word.size(1))
decoded_word2 = decoded_word2.view(output.size(0), output.size(1), decoded_word2.size(1))
return (decoded_phone, decoded_word, decoded_word2), hidden
def init_hidden(self, bsz):
weight = next(self.parameters()).data
if self.rnn_type == 'LSTM':
return (Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()),
Variable(weight.new(self.nlayers, bsz, self.nhid).zero_()))
else:
return Variable(weight.new(self.nlayers, bsz, self.nhid).zero_())
def criterion(self, output, targets):
loss_fcn = nn.CrossEntropyLoss()
loss_phone = loss_fcn(output[0].view(-1, self.nout_phone), targets[0])
loss_word = loss_fcn(output[1].view(-1, self.nout_word), targets[1])
loss_word2 = loss_fcn(output[2].view(-1, self.nout_word), targets[2])
loss = (loss_phone+loss_word+loss_word2)/3
return (loss,loss_phone,loss_word,loss_word2)
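The custom cells `LSTM2` and `HMLSTM` come from `rnn_modules` and are selected by their type string at construction time. A dictionary lookup is one explicit way to express that string-to-class dispatch; the classes below are stand-ins for the real modules, not their implementations:

```python
class LSTM2:           # stand-in for rnn_modules.LSTM2
    def __init__(self, ninp, nhid, nlayers, dropout=0.5, is_lnorm=False):
        self.kind = 'LSTM2'

class HMLSTM:          # stand-in for rnn_modules.HMLSTM
    def __init__(self, ninp, nhid, nlayers, dropout=0.5, is_lnorm=False):
        self.kind = 'HMLSTM'

CUSTOM_RNNS = {'LSTM2': LSTM2, 'HMLSTM': HMLSTM}

def build_rnn(rnn_type, ninp, nhid, nlayers, dropout, is_lnorm):
    # Resolve the type string through an explicit, inspectable mapping.
    return CUSTOM_RNNS[rnn_type](ninp, nhid, nlayers,
                                 dropout=dropout, is_lnorm=is_lnorm)

rnn = build_rnn('HMLSTM', 100, 650, 2, 0.5, False)
print(rnn.kind)  # → HMLSTM
```

An unknown type string raises `KeyError` from the dictionary, which plays the same role as the `KeyError` branch in the constructors above.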
| 47.206061 | 126 | 0.634613 | 2,032 | 15,578 | 4.686024 | 0.074311 | 0.05083 | 0.033921 | 0.05188 | 0.923861 | 0.908633 | 0.894455 | 0.880277 | 0.868305 | 0.860323 | 0 | 0.014776 | 0.235396 | 15,578 | 329 | 127 | 47.349544 | 0.784653 | 0.049878 | 0 | 0.840909 | 0 | 0 | 0.07124 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.018939 | 0 | 0.193182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f4211fa1100d1bd07a5397f4722993fa25b93503 | 11,725 | py | Python | tests/test_vmware.py | willnx/vlab_esrs | 1099e929c862c71a18cd8fb65c6a835c32226c75 | [
"Apache-2.0"
] | 1 | 2019-04-10T16:17:16.000Z | 2019-04-10T16:17:16.000Z | tests/test_vmware.py | willnx/vlab_esrs | 1099e929c862c71a18cd8fb65c6a835c32226c75 | [
"Apache-2.0"
] | null | null | null | tests/test_vmware.py | willnx/vlab_esrs | 1099e929c862c71a18cd8fb65c6a835c32226c75 | [
"Apache-2.0"
] | null | null | null | # -*- coding: UTF-8 -*-
"""
A suite of tests for the functions in vmware.py
"""
import unittest
from unittest.mock import patch, MagicMock
from vlab_esrs_api.lib.worker import vmware
class TestVMware(unittest.TestCase):
"""A set of test cases for the vmware.py module"""
@classmethod
def setUpClass(cls):
vmware.logger = MagicMock()
@patch.object(vmware.virtual_machine, 'get_info')
@patch.object(vmware, 'vCenter')
def test_show_esrs(self, fake_vCenter, fake_get_info):
"""``show_esrs`` returns a dictionary when everything works as expected"""
fake_vm = MagicMock()
fake_vm.name = 'myESRS'
fake_folder = MagicMock()
fake_folder.childEntity = [fake_vm]
fake_vCenter.return_value.__enter__.return_value.get_by_name.return_value = fake_folder
fake_get_info.return_value = {'meta' : {'component': 'ESRS',
'created': 1234,
'version': '3.28',
'configured': False,
'generation': 1}}
output = vmware.show_esrs(username='alice')
expected = {'myESRS': {'meta' : {'component': 'ESRS',
'created': 1234,
'version': '3.28',
'configured': False,
'generation': 1}}}
self.assertEqual(output, expected)
@patch.object(vmware.virtual_machine, 'get_info')
@patch.object(vmware, 'vCenter')
def test_show_esrs_nothing(self, fake_vCenter, fake_get_info):
"""``show_esrs`` returns an empty dictionary when no ESRS is found"""
fake_vm = MagicMock()
fake_vm.name = 'myESRS'
fake_folder = MagicMock()
fake_folder.childEntity = [fake_vm]
fake_vCenter.return_value.__enter__.return_value.get_by_name.return_value = fake_folder
fake_get_info.return_value = {'meta' : {'component': 'otherThing',
'created': 1234,
'version': '3.28',
'configured': False,
'generation': 1}}
output = vmware.show_esrs(username='alice')
expected = {}
self.assertEqual(output, expected)
@patch.object(vmware.virtual_machine, 'set_meta')
@patch.object(vmware, 'consume_task')
@patch.object(vmware, 'Ova')
@patch.object(vmware.virtual_machine, 'get_info')
@patch.object(vmware.virtual_machine, 'deploy_from_ova')
@patch.object(vmware, 'vCenter')
def test_create_esrs(self, fake_vCenter, fake_deploy_from_ova, fake_get_info, fake_Ova, fake_consume_task, set_meta):
"""``create_esrs`` returns the new ESRS's info when everything works"""
fake_logger = MagicMock()
fake_deploy_from_ova.return_value.name = 'myESRS'
fake_Ova.return_value.networks = ['vLabNetwork']
fake_get_info.return_value = {'worked' : True}
fake_vCenter.return_value.__enter__.return_value.networks = {'someNetwork': vmware.vim.Network(moId='asdf')}
output = vmware.create_esrs(username='alice',
machine_name='myESRS',
image='3.28',
network='someNetwork',
logger=fake_logger)
expected = {'myESRS': {'worked': True}}
self.assertEqual(output, expected)
@patch.object(vmware, 'consume_task')
@patch.object(vmware, 'Ova')
@patch.object(vmware.virtual_machine, 'get_info')
@patch.object(vmware.virtual_machine, 'deploy_from_ova')
@patch.object(vmware, 'vCenter')
def test_create_esrs_value_error(self, fake_vCenter, fake_deploy_from_ova, fake_get_info, fake_Ova, fake_consume_task):
"""``create_esrs`` raises ValueError if supplied with a non-existing network"""
fake_logger = MagicMock()
fake_Ova.return_value.networks = ['vLabNetwork']
fake_get_info.return_value = {'worked' : True}
fake_vCenter.return_value.__enter__.return_value.networks = {'someNetwork': vmware.vim.Network(moId='asdf')}
with self.assertRaises(ValueError):
vmware.create_esrs(username='alice',
machine_name='myESRS',
image='3.28',
network='not a thing',
logger=fake_logger)
@patch.object(vmware, 'consume_task')
@patch.object(vmware, 'Ova')
@patch.object(vmware.virtual_machine, 'get_info')
@patch.object(vmware.virtual_machine, 'deploy_from_ova')
@patch.object(vmware, 'vCenter')
def test_create_esrs_bad_image(self, fake_vCenter, fake_deploy_from_ova, fake_get_info, fake_Ova, fake_consume_task):
"""``create_esrs`` raises ValueError if supplied with a non-existing image to deploy"""
fake_logger = MagicMock()
fake_Ova.side_effect = FileNotFoundError('testing')
fake_get_info.return_value = {'worked' : True}
fake_vCenter.return_value.__enter__.return_value.networks = {'someNetwork': vmware.vim.Network(moId='asdf')}
with self.assertRaises(ValueError):
vmware.create_esrs(username='alice',
machine_name='myESRS',
image='a.3.sdf',
network='someNetwork',
logger=fake_logger)
@patch.object(vmware.virtual_machine, 'get_info')
@patch.object(vmware, 'consume_task')
@patch.object(vmware.virtual_machine, 'power')
@patch.object(vmware, 'vCenter')
def test_delete_esrs(self, fake_vCenter, fake_power, fake_consume_task, fake_get_info):
"""``delete_esrs`` powers off the VM then deletes it"""
fake_logger = MagicMock()
fake_vm = MagicMock()
fake_vm.name = 'myESRS'
fake_folder = MagicMock()
fake_folder.childEntity = [fake_vm]
fake_vCenter.return_value.__enter__.return_value.get_by_name.return_value = fake_folder
fake_get_info.return_value = {'meta' : {'component': 'ESRS',
'created': 1234,
'version': '3.28',
'configured': False,
'generation': 1}}
vmware.delete_esrs(username='alice', machine_name='myESRS', logger=fake_logger)
self.assertTrue(fake_power.called)
self.assertTrue(fake_vm.Destroy_Task.called)
@patch.object(vmware.virtual_machine, 'get_info')
@patch.object(vmware, 'consume_task')
@patch.object(vmware.virtual_machine, 'power')
@patch.object(vmware, 'vCenter')
def test_delete_esrs_value_error(self, fake_vCenter, fake_power, fake_consume_task, fake_get_info):
"""``delete_esrs`` raises ValueError if no ESRS machine has the supplied name"""
fake_logger = MagicMock()
fake_vm = MagicMock()
fake_vm.name = 'myESRS'
fake_folder = MagicMock()
fake_folder.childEntity = [fake_vm]
fake_vCenter.return_value.__enter__.return_value.get_by_name.return_value = fake_folder
fake_get_info.return_value = {'worked': True, 'note': "ESRS=3.28"}
with self.assertRaises(ValueError):
vmware.delete_esrs(username='alice', machine_name='not a thing', logger=fake_logger)

    @patch.object(vmware.os, 'listdir')
    def test_list_images(self, fake_listdir):
        """``list_images`` returns a list of images when everything works as expected"""
        fake_listdir.return_value = ['esrs_3.28.ova']

        output = vmware.list_images()
        expected = ['3.28']

        self.assertEqual(output, expected)

    def test_convert_name(self):
        """``convert_name`` defaults to converting versions to image names"""
        output = vmware.convert_name('3.28')
        expected = 'ESRS_3.28.ova'

        self.assertEqual(output, expected)

    def test_convert_name_to_version(self):
        """``convert_name`` can also convert image names back to versions"""
        output = vmware.convert_name('ESRS_3.28.ova', to_version=True)
        expected = '3.28'

        self.assertEqual(output, expected)

    @patch.object(vmware.virtual_machine, 'change_network')
    @patch.object(vmware.virtual_machine, 'get_info')
    @patch.object(vmware, 'consume_task')
    @patch.object(vmware, 'vCenter')
    def test_update_network(self, fake_vCenter, fake_consume_task, fake_get_info, fake_change_network):
        """``update_network`` returns None upon success"""
        fake_logger = MagicMock()
        fake_vm = MagicMock()
        fake_vm.name = 'myESRS'
        fake_folder = MagicMock()
        fake_folder.childEntity = [fake_vm]
        fake_vCenter.return_value.__enter__.return_value.get_by_name.return_value = fake_folder
        fake_vCenter.return_value.__enter__.return_value.networks = {'wootTown': 'someNetworkObject'}
        fake_get_info.return_value = {'meta': {'component': 'ESRS'}}

        result = vmware.update_network(username='pat',
                                       machine_name='myESRS',
                                       new_network='wootTown')

        self.assertIsNone(result)

    @patch.object(vmware.virtual_machine, 'change_network')
    @patch.object(vmware.virtual_machine, 'get_info')
    @patch.object(vmware, 'consume_task')
    @patch.object(vmware, 'vCenter')
    def test_update_network_no_vm(self, fake_vCenter, fake_consume_task, fake_get_info, fake_change_network):
        """``update_network`` raises ValueError if the supplied VM doesn't exist"""
        fake_logger = MagicMock()
        fake_vm = MagicMock()
        fake_vm.name = 'myESRS'
        fake_folder = MagicMock()
        fake_folder.childEntity = [fake_vm]
        fake_vCenter.return_value.__enter__.return_value.get_by_name.return_value = fake_folder
        fake_vCenter.return_value.__enter__.return_value.networks = {'wootTown': 'someNetworkObject'}
        fake_get_info.return_value = {'meta': {'component': 'ESRS'}}

        with self.assertRaises(ValueError):
            vmware.update_network(username='pat',
                                  machine_name='SomeOtherMachine',
                                  new_network='wootTown')

    @patch.object(vmware.virtual_machine, 'change_network')
    @patch.object(vmware.virtual_machine, 'get_info')
    @patch.object(vmware, 'consume_task')
    @patch.object(vmware, 'vCenter')
    def test_update_network_no_network(self, fake_vCenter, fake_consume_task, fake_get_info, fake_change_network):
        """``update_network`` raises ValueError if the supplied new network doesn't exist"""
        fake_logger = MagicMock()
        fake_vm = MagicMock()
        fake_vm.name = 'myESRS'
        fake_folder = MagicMock()
        fake_folder.childEntity = [fake_vm]
        fake_vCenter.return_value.__enter__.return_value.get_by_name.return_value = fake_folder
        fake_vCenter.return_value.__enter__.return_value.networks = {'wootTown': 'someNetworkObject'}
        fake_get_info.return_value = {'meta': {'component': 'ESRS'}}

        with self.assertRaises(ValueError):
            vmware.update_network(username='pat',
                                  machine_name='myESRS',
                                  new_network='dohNet')

if __name__ == '__main__':
    unittest.main()
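These tests stack several `@patch.object` decorators; `unittest.mock` supplies the resulting mocks bottom-up, so the decorator closest to the function binds the first mock parameter. A minimal sketch of that ordering (class `A` and its attributes are illustrative, not from the test module above):

```python
from unittest import mock

class A:
    x = 1
    y = 2

@mock.patch.object(A, 'y')
@mock.patch.object(A, 'x')
def demo(fake_x, fake_y):
    # The decorator nearest the function (patching 'x') supplies the first argument.
    return A.x is fake_x, A.y is fake_y

print(demo())  # (True, True)
```

This is why `fake_vCenter` comes first in each test's signature: `@patch.object(vmware, 'vCenter')` sits closest to the `def`.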

# --- File: xy_1d/attic/t9.py (repo: nftqcd/nthmc, license: MIT) ---

import tensorflow.keras as tk
import nthmc
conf = nthmc.Conf(nbatch=1024, nepoch=64, nstepEpoch=512, initDt=0.4, refreshOpt=False)
nthmc.setup(conf)
# action = OneD(transforms=[Ident()])
action = nthmc.OneD(beta=6, transforms=[
    nthmc.OneDNeighbor(mask='even'), nthmc.OneDNeighbor(mask='odd'),
    nthmc.OneDNeighbor(mask='even', distance=2), nthmc.OneDNeighbor(mask='odd', distance=2),
    nthmc.OneDNeighbor(mask='even', distance=4), nthmc.OneDNeighbor(mask='odd', distance=4),
    nthmc.OneDNeighbor(mask='even', distance=8), nthmc.OneDNeighbor(mask='odd', distance=8),
    nthmc.OneDNeighbor(mask='even', distance=16), nthmc.OneDNeighbor(mask='odd', distance=16),
    nthmc.OneDNeighbor(mask='even', distance=32), nthmc.OneDNeighbor(mask='odd', distance=32),
    nthmc.OneDNeighbor(mask='even'), nthmc.OneDNeighbor(mask='odd'),
    nthmc.OneDNeighbor(mask='even', distance=2), nthmc.OneDNeighbor(mask='odd', distance=2),
    nthmc.OneDNeighbor(mask='even', distance=4), nthmc.OneDNeighbor(mask='odd', distance=4),
    nthmc.OneDNeighbor(mask='even', distance=8), nthmc.OneDNeighbor(mask='odd', distance=8),
    nthmc.OneDNeighbor(mask='even', distance=16), nthmc.OneDNeighbor(mask='odd', distance=16),
    nthmc.OneDNeighbor(mask='even', distance=32), nthmc.OneDNeighbor(mask='odd', distance=32),
])
loss = nthmc.LossFun(action, cCosDiff=1.0, cTopoDiff=10.0, dHmin=0.5, topoFourierN=1)
opt = tk.optimizers.Adam(learning_rate=0.001)
x0 = action.initState(conf.nbatch)
nthmc.run(conf, action, loss, opt, x0)

# --- File: samples/src/main/resources/datasets/python/53.py (repo: sritchie/kotlingrad, license: Apache-2.0) ---

def bool9(a, b, c, d):
    return ((a < b) + 4) == ((c + 5) < d)

# --- File: ch4/arguments.variable.positional.unpacking.py (repo: ldmcdaniel/learning_python, license: MIT) ---

def func(*args):
    print(args)

values = (1, 3, -7, 9)
func(values) # equivalent to: func((1, 3, -7, 9))
func(*values) # equivalent to: func(1, 3, -7, 9)
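The `*` unpacking shown above has a keyword-argument counterpart, `**`, which unpacks a mapping into keyword arguments. A small companion sketch (the names `report` and `options` are illustrative, not from the original file):

```python
def report(**kwargs):
    # kwargs collects any keyword arguments into a dict
    return sorted(kwargs.items())

options = {'x': 1, 'y': 2}
print(report(**options))  # equivalent to: report(x=1, y=2)
```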

# --- File: gene_graph_dataset.py (repo: Paeans/phylognn, license: MIT) ---

import sys
import time
import numpy as np
import torch
from gene_mat import gen_dataset_wt, gen_dataset_wb, gen_g2g_data, gen_m3g_data
from genome_graph import gen_graph, gen_graph_adj, gen_g2g_graph, gen_g2b_graph
from gene_mat import dcj_dist
from torch_geometric.data import InMemoryDataset
# from multiprocessing import Pool
def save_dataset(gene_len, step_range, graph_num=None, fname=None):
    if graph_num is None:
        graph_num = 100
    if fname is None:
        fname = 'inv_' + str(gene_len) + '_' + str(step_range) + '.pt'
    gene = np.zeros((graph_num * step_range, 2, gene_len), dtype=np.int32)
    label = np.zeros(graph_num * step_range, dtype=np.int32)
    for step in range(0, step_range):
        s, o, t = gen_dataset_wt(gene_len, graph_num, step + 1, op_type=2)
        gene[step * graph_num : (step + 1) * graph_num] = s[:, (0, -1)].astype(np.int32)
        # label[step * graph_num : (step + 1) * graph_num] = step
        g_dist = [dcj_dist(g[0], g[1])[-1] for g in s[:, (0, -1)]]
        label[step * graph_num : (step + 1) * graph_num] = g_dist
        # g += [gen_graph(x, label=inv_num) for x in s]
    torch.save((gene, label), fname)

def save_g2b_dataset(gene_len, step_range, graph_num=None, fname=None):
    if graph_num is None:
        graph_num = 100
    if fname is None:
        fname = 'g2b_' + str(gene_len) + '_' + str(step_range) + '.pt'
    gene = np.zeros((graph_num * step_range, 2, gene_len), dtype=np.int32)
    label = np.zeros(graph_num * step_range, dtype=np.int32)
    node_label = np.zeros((graph_num * step_range, gene_len * 2), dtype=np.int32)
    for step in range(0, step_range):
        s, o, t, b = gen_dataset_wb(gene_len, graph_num, step + 2, op_type=2)
        gene[step * graph_num : (step + 1) * graph_num] = s[:, (0, -1)].astype(np.int32)
        label[step * graph_num : (step + 1) * graph_num] = step + 1  # inv_num = step + 1
        node_label[step * graph_num : (step + 1) * graph_num][b.any(axis=1)] = 1
        # g += [gen_graph(x, label=inv_num) for x in s]
    torch.save((gene, label, node_label), fname)

def save_g2g_dataset(gene_len, step, graph_num=None, fname=None):
    if graph_num is None:
        graph_num = 100
    if fname is None:
        fname = 'g2g_' + str(gene_len) + '_' + str(step) + '.pt'
    source = np.zeros((graph_num * step, 2, gene_len), dtype=np.int32)
    target = np.zeros((graph_num * step, gene_len), dtype=np.int32)
    for dist in range(0, step):
        s = gen_g2g_data(gene_len, graph_num, dist, op_type=2)
        source[dist * graph_num : (dist + 1) * graph_num] = s[:, (0, -1)]
        target[dist * graph_num : (dist + 1) * graph_num] = s[:, 1]
    torch.save((source, target), fname)

def save_g3m_dataset_old(gene_len, step, graph_num=None, fname=None, mid_num=None):
    if graph_num is None:
        graph_num = 100
    if fname is None:
        fname = 'g3m_' + str(gene_len) + '_' + str(step) + '.pt'
    if mid_num is None:
        mid_num = 3
    source = np.zeros((graph_num * step, mid_num, gene_len), dtype=np.int32)
    target = np.zeros((graph_num * step, gene_len), dtype=np.int32)
    for dist in range(0, step):
        # print(f'{time.ctime()} >> step: {dist:04d}, number: {graph_num:06d}, length: {gene_len:06d}',
        #       file=sys.stderr)
        m_seq, t_seq = gen_m3g_data(gene_len, graph_num, dist + 1, op_type=2, mid_num=mid_num)
        source[dist * graph_num : (dist + 1) * graph_num] = m_seq
        target[dist * graph_num : (dist + 1) * graph_num] = t_seq.squeeze()
    torch.save((source, target), fname)

def save_g3m_dataset(gene_len, step, graph_num=None, fname=None, mid_num=None):
    if graph_num is None:
        graph_num = 1000
    if fname is None:
        fname = 'g3m_' + str(gene_len) + '_' + str(step) + '.pt'
    if mid_num is None:
        mid_num = 3
    source = np.zeros((graph_num, mid_num, gene_len), dtype=np.int32)
    target = np.zeros((graph_num, gene_len), dtype=np.int32)
    m_seq, t_seq = gen_m3g_data(gene_len, graph_num, step, op_type=2, mid_num=mid_num)
    source[0:graph_num] = m_seq
    target[0:graph_num] = t_seq.squeeze()
    torch.save((source, target), fname)

class GeneGraphDataset(InMemoryDataset):
    def __init__(self, root, gene_len, step_range, graph_num=100):
        self.gene_len = gene_len
        self.step_range = step_range
        self.graph_num = graph_num
        super().__init__(root + '_' + str(self.gene_len) + '_'
                         + str(self.step_range) + '_' + str(self.graph_num),
                         transform=None,
                         pre_transform=None,
                         pre_filter=None)
        self.data, self.slices = torch.load(self.processed_paths[0])

    @property
    def raw_file_names(self):
        return ['inv_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    @property
    def processed_file_names(self):
        return ['data_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    def download(self):
        # Download to `self.raw_dir`.
        print('Generating...', file=sys.stderr)
        save_dataset(self.gene_len, self.step_range,
                     graph_num=self.graph_num,
                     fname=self.raw_dir + '/' + self.raw_file_names[0])

    def process(self):
        # Read data into huge `Data` list.
        filename = self.raw_dir + '/' + self.raw_file_names[0]
        gene_list, label = torch.load(filename)
        data_list = [gen_graph_adj(x, label=inv_num) for x, inv_num in zip(gene_list, label)]
        # with Pool(22) as p:
        #     data_list = p.starmap(gen_graph, [(x, inv_num) for x, inv_num in zip(gene_list, label)])
        if self.pre_filter is not None:
            data_list = [data for data in data_list if self.pre_filter(data)]
        if self.pre_transform is not None:
            data_list = [self.pre_transform(data) for data in data_list]
        data, slices = self.collate(data_list)
        torch.save((data, slices), self.processed_paths[0])

class G2BraphDataset(InMemoryDataset):
    def __init__(self, root, gene_len, step_range, graph_num=100):
        self.gene_len = gene_len
        self.step_range = step_range
        self.graph_num = graph_num
        super().__init__(root + '_' + str(self.gene_len) + '_'
                         + str(self.step_range) + '_' + str(self.graph_num),
                         transform=None,
                         pre_transform=None,
                         pre_filter=None)
        self.data, self.slices = torch.load(self.processed_paths[0])

    @property
    def raw_file_names(self):
        return ['b2raw_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    @property
    def processed_file_names(self):
        return ['b2dat_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    def download(self):
        # Download to `self.raw_dir`.
        print('Generating...', file=sys.stderr)
        save_g2b_dataset(self.gene_len, self.step_range,
                         graph_num=self.graph_num,
                         fname=self.raw_dir + '/' + self.raw_file_names[0])

    def process(self):
        # Read data into huge `Data` list.
        filename = self.raw_dir + '/' + self.raw_file_names[0]
        gene_list, label, node_label = torch.load(filename)
        data_list = [gen_g2b_graph(x, label=inv_num, node_label=t_label)
                     for x, inv_num, t_label in zip(gene_list, label, node_label)]
        # with Pool(22) as p:
        #     data_list = p.starmap(gen_g2b_graph,
        #                           [(x, inv_num, t_label)
        #                            for x, inv_num, t_label in zip(gene_list, label, node_label)])
        if self.pre_filter is not None:
            data_list = [data for data in data_list if self.pre_filter(data)]
        if self.pre_transform is not None:
            data_list = [self.pre_transform(data) for data in data_list]
        data, slices = self.collate(data_list)
        torch.save((data, slices), self.processed_paths[0])

class G2GraphDataset(InMemoryDataset):
    def __init__(self, root, gene_len, step_range, graph_num=100):
        self.gene_len = gene_len
        self.step_range = step_range
        self.graph_num = graph_num
        super().__init__(root + '_' + str(self.gene_len) + '_'
                         + str(self.step_range) + '_' + str(self.graph_num),
                         transform=None,
                         pre_transform=None,
                         pre_filter=None)
        self.data, self.slices = torch.load(self.processed_paths[0])

    @property
    def raw_file_names(self):
        return ['g2raw_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    @property
    def processed_file_names(self):
        return ['g2dat_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    def download(self):
        # Download to `self.raw_dir`.
        print('Generating...', file=sys.stderr)
        save_g2g_dataset(self.gene_len, self.step_range,
                         graph_num=self.graph_num,
                         fname=self.raw_dir + '/' + self.raw_file_names[0])

    def process(self):
        # Read data into huge `Data` list.
        filename = self.raw_dir + '/' + self.raw_file_names[0]
        source, target = torch.load(filename)
        data_list = [gen_g2g_graph(s, t) for s, t in zip(source, target)]
        # with Pool(22) as p:
        #     data_list = p.starmap(gen_g2g_graph, [(s, t) for s, t in zip(source, target)])
        if self.pre_filter is not None:
            data_list = [data for data in data_list if self.pre_filter(data)]
        if self.pre_transform is not None:
            data_list = [self.pre_transform(data) for data in data_list]
        data, slices = self.collate(data_list)
        torch.save((data, slices), self.processed_paths[0])

class G3MedianDataset(G2GraphDataset):
    def __init__(self, root, gene_len, step_range, graph_num=100, mid_num=3):
        self.mid_num = mid_num if mid_num >= 3 else 3
        super().__init__(root + '_' + str(self.mid_num), gene_len, step_range, graph_num)
        # self.data, self.slices = torch.load(self.processed_paths[0])

    @property
    def raw_file_names(self):
        return ['g3raw_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    @property
    def processed_file_names(self):
        return ['g3dat_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    def download(self):
        # Download to `self.raw_dir`.
        print('Generating...', file=sys.stderr)
        save_g3m_dataset(self.gene_len, self.step_range,
                         graph_num=self.graph_num,
                         fname=self.raw_dir + '/' + self.raw_file_names[0],
                         mid_num=self.mid_num)

    def process(self):
        super().process()

class ExpsDataset(G2GraphDataset):
    def __init__(self, root, gene_len, step_range, graph_num=100, mid_num=3):
        self.mid_num = mid_num if mid_num >= 3 else 3
        super().__init__(root + '_' + str(self.mid_num), gene_len, step_range, graph_num)
        # self.data, self.slices = torch.load(self.processed_paths[0])

    @property
    def raw_file_names(self):
        return ['g3raw_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    @property
    def processed_file_names(self):
        return ['g3dat_' + str(self.gene_len) +
                '_' + str(self.step_range) + '.pt']

    def download(self):
        # Download to `self.raw_dir`.
        print('Generating...', file=sys.stderr)
        save_g3m_dataset(self.gene_len, self.step_range,
                         graph_num=self.graph_num,
                         fname=self.raw_dir + '/' + self.raw_file_names[0],
                         mid_num=self.mid_num)

    def process(self):
        filename = self.raw_dir + '/' + self.raw_file_names[0]
        source, target = torch.load(filename)
        st_list = [(s, t) for s, t in zip(source, target)]
        sample_size = len(st_list) // 200
        # data_list = [gen_g2g_graph(s, t) for s, t in zip(source, target)]
        data_list = [gen_g2g_graph(s, t, i // sample_size) for i, (s, t) in enumerate(st_list)]
        if self.pre_filter is not None:
            data_list = [data for data in data_list if self.pre_filter(data)]
        if self.pre_transform is not None:
            data_list = [self.pre_transform(data) for data in data_list]
        data, slices = self.collate(data_list)
        torch.save((data, slices), self.processed_paths[0])
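All of the `save_*` helpers above fill their output arrays in contiguous blocks of `graph_num` rows per step. The slice arithmetic can be checked in isolation (plain Python, illustrative values):

```python
graph_num, step_range = 4, 3
label = [0] * (graph_num * step_range)
for step in range(step_range):
    # each step owns the half-open block [step * graph_num, (step + 1) * graph_num)
    label[step * graph_num : (step + 1) * graph_num] = [step + 1] * graph_num
print(label)  # [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3]
```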

# --- File: sentence_transformers/models/tokenizer/__init__.py (repo: ducanhdt/sentence-transformers, license: Apache-2.0) ---

from .WordTokenizer import WordTokenizer, ENGLISH_STOP_WORDS
from .WhitespaceTokenizer import WhitespaceTokenizer
from .PhraseTokenizer import PhraseTokenizer
from .PhoTokenizer import PhoTokenizer

# --- File: python/testData/refactoring/inlineFunction/importedLocally/main.py (repo: tgodzik/intellij-community, license: Apache-2.0) ---

from src import foo
res = foo(1, 2)

def bar():
    from src import foo
    res1 = fo<caret>o(1, 2)  # '<caret>' marks the cursor position in this IDE test fixture

# --- File: src/OTLMOW/OTLModel/Datatypes/KlMozaiekkeiFormaat.py (repo: davidvlaminck/OTLMOW, license: MIT) ---

# coding=utf-8
from OTLMOW.OTLModel.Datatypes.KeuzelijstField import KeuzelijstField
from OTLMOW.OTLModel.Datatypes.KeuzelijstWaarde import KeuzelijstWaarde
# Generated with OTLEnumerationCreator. To modify: extend, do not edit
class KlMozaiekkeiFormaat(KeuzelijstField):
    """Formaten van de mozaïekkei."""
    naam = 'KlMozaiekkeiFormaat'
    label = 'Mozaiekkei formaat'
    objectUri = 'https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#KlMozaiekkeiFormaat'
    definition = 'Formaten van de mozaïekkei.'
    codelist = 'https://wegenenverkeer.data.vlaanderen.be/id/conceptscheme/KlMozaiekkeiFormaat'
    options = {
        'bestratingen-van-mozaïekkeien-van-het-1ste-formaat': KeuzelijstWaarde(
            invulwaarde='bestratingen-van-mozaïekkeien-van-het-1ste-formaat',
            label='bestratingen van mozaïekkeien van het 1ste formaat',
            definitie='Bestratingen van mozaïekkeien van het 1ste formaat',
            objectUri='https://wegenenverkeer.data.vlaanderen.be/id/concept/KlMozaiekkeiFormaat/bestratingen-van-mozaïekkeien-van-het-1ste-formaat'),
        'bestratingen-van-mozaïekkeien-van-het-2de-formaat': KeuzelijstWaarde(
            invulwaarde='bestratingen-van-mozaïekkeien-van-het-2de-formaat',
            label='bestratingen van mozaïekkeien van het 2de formaat',
            definitie='Bestratingen van mozaïekkeien van het 2de formaat',
            objectUri='https://wegenenverkeer.data.vlaanderen.be/id/concept/KlMozaiekkeiFormaat/bestratingen-van-mozaïekkeien-van-het-2de-formaat'),
        'bestratingen-van-mozaïekkeien-van-het-3de-formaat': KeuzelijstWaarde(
            invulwaarde='bestratingen-van-mozaïekkeien-van-het-3de-formaat',
            label='bestratingen van mozaïekkeien van het 3de formaat',
            definitie='Bestratingen van mozaïekkeien van het 3de formaat',
            objectUri='https://wegenenverkeer.data.vlaanderen.be/id/concept/KlMozaiekkeiFormaat/bestratingen-van-mozaïekkeien-van-het-3de-formaat'),
        'bestratingen-van-mozaïekkeien-van-het-4de-formaat': KeuzelijstWaarde(
            invulwaarde='bestratingen-van-mozaïekkeien-van-het-4de-formaat',
            label='bestratingen van mozaïekkeien van het 4de formaat',
            definitie='Bestratingen van mozaïekkeien van het 4de formaat',
            objectUri='https://wegenenverkeer.data.vlaanderen.be/id/concept/KlMozaiekkeiFormaat/bestratingen-van-mozaïekkeien-van-het-4de-formaat'),
        'bestratingen-van-mozaïekkeien-van-het-5de-formaat': KeuzelijstWaarde(
            invulwaarde='bestratingen-van-mozaïekkeien-van-het-5de-formaat',
            label='bestratingen van mozaïekkeien van het 5de formaat',
            definitie='Bestratingen van mozaïekkeien van het 5de formaat',
            objectUri='https://wegenenverkeer.data.vlaanderen.be/id/concept/KlMozaiekkeiFormaat/bestratingen-van-mozaïekkeien-van-het-5de-formaat')
    }

# --- File: tests/test_provider_innovationnorway_azure_preview.py (repo: mjuenema/python-terrascript, license: BSD-2-Clause) ---

# tests/test_provider_innovationnorway_azure-preview.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:12:49 UTC)
def test_provider_import():
    import terrascript.provider.innovationnorway.azure_preview


def test_resource_import():
    from terrascript.resource.innovationnorway.azure_preview import azurepreview_budget
    from terrascript.resource.innovationnorway.azure_preview import (
        azurepreview_subscription,
    )


def test_datasource_import():
    from terrascript.data.innovationnorway.azure_preview import azurepreview_resources


# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
#     import terrascript.provider.innovationnorway.azure_preview
#
#     t = terrascript.provider.innovationnorway.azure_preview.azure_preview()
#     s = str(t)
#
#     assert 'https://github.com/innovationnorway/terraform-provider-azure-preview' in s
#     assert '0.1.0-alpha.3' in s

# --- File: env_tom_jerry.py (repo: sillyemperor/mypynotebook, license: MIT) ---

import pygame
import os.path
import numpy as np
import time

class OneCheese:
    actions = range(4)
    M = 64

    def __init__(self, N, BASE_DIR):
        self.N = N
        self.cheese = [int(N * .6), int(N * .8)]
        self.tom = [int(N * .5), int(N * .8)]
        pygame.init()
        self.cheese_img = pygame.image.load(os.path.join(BASE_DIR, 'data/cheese64.png'))
        self.tom_img = pygame.image.load(os.path.join(BASE_DIR, 'data/cat64.png'))
        self.jerry_img = pygame.image.load(os.path.join(BASE_DIR, 'data/mouse64.png'))
        self.screen = pygame.display.set_mode((N * self.M, N * self.M))

    def reset(self):
        pass

    def step(self, s, a):
        """
        :param s: (x, y) current position
        :param a: action index in ``actions`` (0-3)
        :return: (done, next_state, reward)
        """
        if a == 0:
            d = np.array([-1, 0])
        elif a == 1:
            d = np.array([1, 0])
        elif a == 2:
            d = np.array([0, 1])
        elif a == 3:
            d = np.array([0, -1])
        s_ = s + d
        r = -.1
        done = False
        if s_.min() < 0 or s_.max() > (self.N - 1):  # out of bounds: stay in place
            s_ = s
            r = -.2
        elif (s_ == self.cheese).all():
            done = True
            r = .9
        elif (s_ == self.tom).all():
            done = True
            r = -0.9
        return done, s_, r

    def render(self, s):
        pygame.event.get()
        self.screen.fill((255, 255, 255))
        self.screen.blit(self.cheese_img, (self.cheese[0] * self.M, self.cheese[1] * self.M))
        self.screen.blit(self.tom_img, (self.tom[0] * self.M, self.tom[1] * self.M))
        self.screen.blit(self.jerry_img, (s[0] * self.M, s[1] * self.M))
        pygame.display.flip()
        time.sleep(.1)

class ThreeCheese:
    actions = range(4)
    M = 64

    def __init__(self, N, BASE_DIR):
        self.N = N
        pygame.init()
        self.cheese_img = pygame.image.load(os.path.join(BASE_DIR, 'data/cheese64.png'))
        self.tom_img = pygame.image.load(os.path.join(BASE_DIR, 'data/cat64.png'))
        self.jerry_img = pygame.image.load(os.path.join(BASE_DIR, 'data/mouse64.png'))
        self.screen = pygame.display.set_mode((N * self.M, N * self.M))

    def reset(self):
        # each cheese carries a "still available" flag as its last element
        self.cheese = (
            [int(self.N * .6), int(self.N * .8), 1],
            [int(self.N * .6), int(self.N * .6), 1],
            [int(self.N * .5), int(self.N * .6), 1],
        )
        self.num_cheese = len(self.cheese)
        self.tom = [int(self.N * .5), int(self.N * .8)]

    def render(self, s):
        pygame.event.get()
        self.screen.fill((255, 255, 255))
        for i in self.cheese:
            if i[-1]:
                self.screen.blit(self.cheese_img, (i[0] * self.M, i[1] * self.M))
        self.screen.blit(self.tom_img, (self.tom[0] * self.M, self.tom[1] * self.M))
        self.screen.blit(self.jerry_img, (s[0] * self.M, s[1] * self.M))
        pygame.display.flip()
        time.sleep(.1)
def step(self, s, a):
"""
:param s:(x,y)
:param a:str
:return:
"""
if a == 0:
d = np.array([-1, 0])
elif a == 1:
d = np.array([1, 0])
elif a == 2:
d = np.array([0, 1])
        elif a == 3:
            d = np.array([0, -1])
        else:
            raise ValueError("unknown action: %r" % (a,))
s_ = s + d
r = -.1
done = False
        if s_.min() < 0 or s_.max() > (self.N - 1):  # out of bounds: stay in place
s_ = s
r = -.2
elif (s_ == self.tom).all():
done = True
r = -0.9
else:
for i in self.cheese:
if i[-1]:
if (s_ == i[:2]).all():
i[-1] = 0
r = .9
self.num_cheese -= 1
done = not self.num_cheese
# else:
# r = -.1
return done, s_, r
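The transition rule above can be exercised without pygame or the image assets; the stand-alone sketch below re-declares just the step logic (the grid size `N` and the cheese/cat positions are illustrative assumptions, not values from the environment):

```python
import numpy as np

def grid_step(s, a, N, cheese, tom):
    """Stand-alone copy of the environment's transition rule for testing."""
    directions = {0: np.array([-1, 0]), 1: np.array([1, 0]),
                  2: np.array([0, 1]), 3: np.array([0, -1])}
    s_ = s + directions[a]
    r, done = -0.1, False
    if s_.min() < 0 or s_.max() > (N - 1):   # out of bounds: stay in place
        s_, r = s, -0.2
    elif (s_ == cheese).all():
        done, r = True, 0.9
    elif (s_ == tom).all():
        done, r = True, -0.9
    return done, s_, r

# stepping off the grid leaves the agent in place with a small penalty
done, s_, r = grid_step(np.array([0, 0]), 0, N=5,
                        cheese=np.array([4, 4]), tom=np.array([2, 2]))
```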
| 29.976563 | 89 | 0.45817 | 561 | 3,837 | 3.049911 | 0.151515 | 0.046756 | 0.037405 | 0.063121 | 0.824664 | 0.822326 | 0.790766 | 0.751023 | 0.727645 | 0.727645 | 0 | 0.045511 | 0.370081 | 3,837 | 127 | 90 | 30.212598 | 0.662391 | 0.027626 | 0 | 0.754902 | 0 | 0 | 0.025768 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.078431 | false | 0.009804 | 0.039216 | 0 | 0.196078 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
228a2df58c3a7eb548d562cf59e0c61d0288ce5e | 188 | py | Python | apitest/tests/test_suite.py | steeltomato/api-integration-test-template-py | 536cd0d8a922bd2dd889d590deff6d12e28ec1fd | [
"MIT"
] | null | null | null | apitest/tests/test_suite.py | steeltomato/api-integration-test-template-py | 536cd0d8a922bd2dd889d590deff6d12e28ec1fd | [
"MIT"
] | null | null | null | apitest/tests/test_suite.py | steeltomato/api-integration-test-template-py | 536cd0d8a922bd2dd889d590deff6d12e28ec1fd | [
"MIT"
] | null | null | null | class TestSuite:
def setup_class(self):
pass
def teardown_class(self):
pass
def setup_method(self):
pass
def teardown_method(self):
pass
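For reference, pytest invokes these xunit-style hooks around each test method; the tiny driver below mimics that order (the `events` list and `test_example` are illustrative additions, not part of the template):

```python
class TestSuite:
    @classmethod
    def setup_class(cls):
        # runs once before any test in the class
        cls.events = ["setup_class"]

    @classmethod
    def teardown_class(cls):
        # runs once after all tests in the class
        cls.events.append("teardown_class")

    def setup_method(self):
        # runs before every test method
        self.events.append("setup_method")

    def teardown_method(self):
        # runs after every test method
        self.events.append("teardown_method")

    def test_example(self):
        self.events.append("test_example")

# mimic the order pytest uses for a single test
TestSuite.setup_class()
suite = TestSuite()
suite.setup_method()
suite.test_example()
suite.teardown_method()
TestSuite.teardown_class()
```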
| 14.461538 | 30 | 0.585106 | 22 | 188 | 4.818182 | 0.363636 | 0.301887 | 0.311321 | 0.301887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.345745 | 188 | 12 | 31 | 15.666667 | 0.861789 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0.444444 | 0 | 0 | 0.555556 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
22a7dd4d4dd1c06990fbd58a03e2c8fd101693cb | 84 | py | Python | app_status/__init__.py | acatoire/app_status | e60d9ffac041a6ae24573e9be8a907dcaea5da30 | [
"MIT"
] | null | null | null | app_status/__init__.py | acatoire/app_status | e60d9ffac041a6ae24573e9be8a907dcaea5da30 | [
"MIT"
] | null | null | null | app_status/__init__.py | acatoire/app_status | e60d9ffac041a6ae24573e9be8a907dcaea5da30 | [
"MIT"
] | null | null | null | """
app-status classes
"""
from .core import AppStatus
from .core import RunStatus
| 12 | 27 | 0.738095 | 11 | 84 | 5.636364 | 0.727273 | 0.258065 | 0.451613 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154762 | 84 | 6 | 28 | 14 | 0.873239 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
22da5400d73ed2b4b18995ac9dd75935397dc9e0 | 189 | py | Python | airmon/date.py | vit-/pi-airmon | 23deb5c1b04bb9e76eff2765380195377af9e12b | [
"MIT"
] | null | null | null | airmon/date.py | vit-/pi-airmon | 23deb5c1b04bb9e76eff2765380195377af9e12b | [
"MIT"
] | null | null | null | airmon/date.py | vit-/pi-airmon | 23deb5c1b04bb9e76eff2765380195377af9e12b | [
"MIT"
] | 1 | 2018-04-13T14:05:14.000Z | 2018-04-13T14:05:14.000Z | from datetime import datetime, timedelta
def past(**kwargs):
return datetime.utcnow() - timedelta(**kwargs)
def future(**kwargs):
return datetime.utcnow() + timedelta(**kwargs)
| 18.9 | 50 | 0.703704 | 21 | 189 | 6.333333 | 0.47619 | 0.180451 | 0.300752 | 0.390977 | 0.616541 | 0.616541 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153439 | 189 | 9 | 51 | 21 | 0.83125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | true | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
22f4bddb8e118bb4476aada49055fe0d91fcaf65 | 20,424 | py | Python | NeuralSystems.py | kirill-pinigin/MedicalAI | 9a00fc5d1dc9207035112bd36e3f5b6cc3f8851a | [
"Apache-2.0"
] | null | null | null | NeuralSystems.py | kirill-pinigin/MedicalAI | 9a00fc5d1dc9207035112bd36e3f5b6cc3f8851a | [
"Apache-2.0"
] | null | null | null | NeuralSystems.py | kirill-pinigin/MedicalAI | 9a00fc5d1dc9207035112bd36e3f5b6cc3f8851a | [
"Apache-2.0"
] | null | null | null | import torch
from torch.utils.data import DataLoader
import torchvision
import pytorch_lightning as pl
from skimage.color import label2rgb
import numpy as np
import cv2
import monai
from DentalSegmentationDataset import DentalSegmentationDataset, DentalSegmentationDetectionDataset, INPUT_DIMENSION, OUTPUT_DIMENSION, MIN_RECT_POINTS_DIMENSION
from NeuralCriterions import configure_criterion
from NeuralMetrics import IntersectionOverUnion, SpatialMetric, SSIM
from NeuralModels import *
def configure_system(system_querry : str):
system_collection = {"DentalSegmentor" : DentalSegmentor
, "DentalJointDetectionSegmentor" : DentalJointDetectionSegmentor
, "DentalVariationalSegmentor" : DentalVariationalSegmentor
, 'DentalRecurentVariationalSegmentor': DentalRecurentVariationalSegmentor
}
return system_collection[system_querry]
class DentalSegmentor(pl.LightningModule):
def __init__(self, hparams):
super().__init__()
self.model = UNet(in_channels=INPUT_DIMENSION, out_channels=OUTPUT_DIMENSION)
self.criterion = configure_criterion(hparams["criterion"])
self.metric = IntersectionOverUnion()
self.hparams = hparams
self.data_table = {'train_loss' : [], 'train_iou' : [], 'valid_loss' : [] , 'valid_iou' : [] }
self.batch_size = hparams['batch_size']
self.test_counter = int(0)
self.make_datasets()
def forward(self, x):
return self.model(x)
def configure_optimizers(self):
optimizer = torch.optim.Adam(params=self.parameters(),
lr=self.hparams['lr'])
if self.hparams['scheduler'] == "ReduceLROnPlateau":
lr_scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, patience=3, mode='max', factor=0.5, min_lr=1e-7, verbose=True)
scheduler = {
'scheduler': lr_scheduler,
'reduce_on_plateau': True,
'monitor': 'valid_iou'
}
elif self.hparams['scheduler'] == "CosineAnnealingLR":
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10, eta_min=1e-7, verbose=True)
elif self.hparams['scheduler'] == "CyclicLR":
scheduler = torch.optim.lr_scheduler.OneCycleLR(optimizer, max_lr=0.01, steps_per_epoch=5, epochs=self.hparams['epochs'], verbose=True)
return [optimizer], [scheduler]
def evaluating_step(self, batch, batch_idx):
img, mask = batch[0], batch[1]
result = self(img)
loss = self.criterion(result, mask)
accuracy = self.metric(result, mask)
return result, loss, accuracy
def training_step(self, batch, batch_idx):
_, loss, accuracy = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"loss": loss,
"iou": accuracy,
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True,logger= True)
return batch_dictionary
def validation_step(self, batch, batch_idx):
_, loss, accuracy = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"valid_loss": loss,
"valid_iou": accuracy,
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def test_step(self, batch, batch_idx):
self.test_counter +=1
result, loss, accuracy = self.evaluating_step(batch, batch_idx)
output = torch.argmax(result.detach(), dim=1).squeeze(0).byte().cpu().numpy()
self.visualize(batch[0], output, accuracy)
batch_dictionary = {
"test_loss": loss,
"test_iou": accuracy,
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def training_epoch_end(self, outputs):
avg_loss = torch.stack([x['loss'] for x in outputs]).mean()
avg_iou = torch.stack([x['iou'] for x in outputs]).mean()
self.logger.experiment.add_scalar("Loss/Train", avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Train", avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'train_epoch_loss': avg_loss, 'train_epoch_iou': avg_iou})
self.data_table['train_loss'].append(float(avg_loss))
self.data_table['train_iou'].append(float(avg_iou))
def validation_epoch_end(self, outputs):
valid_avg_loss = torch.stack([x['valid_loss'] for x in outputs]).mean()
valid_avg_iou = torch.stack([x['valid_iou'] for x in outputs]).mean()
print(" \n IoU/Valid = {}".format(valid_avg_iou))
self.logger.experiment.add_scalar("Loss/Valid", valid_avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Valid", valid_avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'valid_epoch_loss': valid_avg_loss, 'valid_epoch_iou': valid_avg_iou})
self.data_table['valid_loss'].append(float(valid_avg_loss))
self.data_table['valid_iou'].append(float(valid_avg_iou))
def train_dataloader(self):
return DataLoader(self.trainset, batch_size=self.batch_size, shuffle=True, num_workers=32, pin_memory=True)
def val_dataloader(self):
return DataLoader(self.validset, batch_size=self.batch_size, shuffle=False, num_workers=32, pin_memory=True)
def test_dataloader(self):
return DataLoader(self.validset, batch_size=1, shuffle=False, num_workers=32, pin_memory=True)
def visualize(self, image, output, accuracy):
print(" output.mean() ", output.mean())
img = torchvision.utils.make_grid(image.cpu().squeeze(0)).mul(float(255)).clamp(0,255).byte().permute(1,2,0).numpy()
grid = label2rgb(output, img, bg_label=0, alpha=0.5, colors=None)
cv2.imwrite("./"+self.logger.log_dir + "/IoU__"+ str(accuracy.item()) + "__"+ f"{self.test_counter:05d}" + "test_image.png", (grid * float(255)).astype(np.uint8))
grid = torch.from_numpy(grid).permute(2,0,1).float().mul(255.0).clamp(0.0,255.0).byte()
self.logger.experiment.add_image(f"{self.test_counter:05d}" +"_test_image", grid, self.test_counter)
def make_datasets(self):
self.trainset = DentalSegmentationDataset(self.hparams['data_path'], split='train', resolution = self.hparams['resolution'])
self.validset = DentalSegmentationDataset( self.hparams['data_path'], split='valid', resolution = self.hparams['resolution'])
class DentalJointDetectionSegmentor(DentalSegmentor):
def __init__(self, hparams):
super().__init__(hparams)
self.model = JointDetectionUNet(in_channels=INPUT_DIMENSION, out_channels=OUTPUT_DIMENSION, points_dimension=MIN_RECT_POINTS_DIMENSION * 2)
self.points_criterion = torch.nn.MSELoss()
self.spatial_metric = SpatialMetric()
def evaluating_step(self, batch, batch_idx):
img, mask = batch[0], batch[1]
segments, points = self(img)
seg_loss = self.criterion(segments, mask)
seg_accuracy = self.metric(segments, mask)
length = batch[2].size(0)
point_loss = self.points_criterion(points.view(length, -1), batch[2])
point_accuracy = self.spatial_metric(points.view(length, -1), batch[2])
loss = seg_loss + point_loss
accuracy_dictionary = {
"iou": seg_accuracy,
"mae": point_accuracy,
}
return segments, loss, accuracy_dictionary
def training_step(self, batch, batch_idx):
_, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"loss": loss,
"iou": accuracy_dictionary["iou"],
"mae": accuracy_dictionary["mae"],
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True,logger= True)
return batch_dictionary
def validation_step(self, batch, batch_idx):
_, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"valid_loss": loss,
"valid_iou": accuracy_dictionary["iou"],
"valid_mae": accuracy_dictionary["mae"],
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def test_step(self, batch, batch_idx):
self.test_counter +=1
result, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"test_loss": loss,
"test_iou": accuracy_dictionary["iou"],
"test_mae": accuracy_dictionary["mae"],
}
output = torch.argmax(result.detach(), dim=1).squeeze(0).byte().cpu().numpy()
self.visualize(batch[0], output, accuracy_dictionary["iou"])
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def training_epoch_end(self, outputs):
avg_loss = torch.stack([x['loss'] for x in outputs]).mean()
avg_iou = torch.stack([x['iou'] for x in outputs]).mean()
self.logger.experiment.add_scalar("Loss/Train", avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Train", avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'train_epoch_loss': avg_loss, 'train_epoch_iou': avg_iou})
self.data_table['train_loss'].append(float(avg_loss))
self.data_table['train_iou'].append(float(avg_iou))
def validation_epoch_end(self, outputs):
valid_avg_loss = torch.stack([x['valid_loss'] for x in outputs]).mean()
valid_avg_iou = torch.stack([x['valid_iou'] for x in outputs]).mean()
valid_avg_mae = torch.stack([x['valid_mae'] for x in outputs]).mean()
print(" \n IoU = {}".format(valid_avg_iou))
print(" \n Spatial Accuracy ,percents = {}".format(valid_avg_mae*100.0))
self.logger.experiment.add_scalar("Loss/Valid", valid_avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Valid", valid_avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'valid_epoch_loss': valid_avg_loss, 'valid_epoch_iou': valid_avg_iou})
self.data_table['valid_loss'].append(float(valid_avg_loss))
self.data_table['valid_iou'].append(float(valid_avg_iou))
def make_datasets(self):
self.trainset = DentalSegmentationDetectionDataset(self.hparams['data_path'], split='train', resolution = self.hparams['resolution'])
self.validset = DentalSegmentationDetectionDataset(self.hparams['data_path'], split='valid', resolution = self.hparams['resolution'])
class DentalVariationalSegmentor(DentalSegmentor):
def __init__(self, hparams):
super().__init__(hparams)
self.model = VariationalUNet(in_channels=INPUT_DIMENSION, out_channels=OUTPUT_DIMENSION)
self.reconstruction_criterion = torch.nn.BCELoss()
self.spatial_metric = SpatialMetric()
#self.ssim_metric = SSIM(INPUT_DIMENSION)
def evaluating_step(self, batch, batch_idx):
img, mask = batch[0], batch[1]
segments, reconstruction, kl = self(img)
seg_loss = self.criterion(segments, mask)
seg_accuracy = self.metric(segments, mask)
reconstruction_loss = self.reconstruction_criterion(reconstruction, img)
loss = seg_loss + reconstruction_loss + 1e-3*kl
reconstruction_accuracy = self.spatial_metric(reconstruction, img)
#ssim = self.ssim_metric(reconstruction, img)
accuracy_dictionary = {
"iou": seg_accuracy,
"mae": reconstruction_accuracy,
#"ssim": ssim,
}
return segments, loss, accuracy_dictionary
def training_step(self, batch, batch_idx):
_, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"loss": loss,
"iou": accuracy_dictionary["iou"],
"mae": accuracy_dictionary["mae"],
#"ssim": accuracy_dictionary["ssim"],
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True,logger= True)
return batch_dictionary
def validation_step(self, batch, batch_idx):
_, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"valid_loss": loss,
"valid_iou": accuracy_dictionary["iou"],
"valid_mae": accuracy_dictionary["mae"],
#"valid_ssim": accuracy_dictionary["ssim"],
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def test_step(self, batch, batch_idx):
self.test_counter +=1
result, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"test_loss": loss,
"test_iou": accuracy_dictionary["iou"],
"test_mae": accuracy_dictionary["mae"],
#"test_ssim": accuracy_dictionary["ssim"],
}
output = torch.argmax(result.detach(), dim=1).squeeze(0).byte().cpu().numpy()
self.visualize(batch[0], output, accuracy_dictionary["iou"])
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def training_epoch_end(self, outputs):
avg_loss = torch.stack([x['loss'] for x in outputs]).mean()
avg_iou = torch.stack([x['iou'] for x in outputs]).mean()
self.logger.experiment.add_scalar("Loss/Train", avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Train", avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'train_epoch_loss': avg_loss, 'train_epoch_iou': avg_iou})
self.data_table['train_loss'].append(float(avg_loss))
self.data_table['train_iou'].append(float(avg_iou))
def validation_epoch_end(self, outputs):
valid_avg_loss = torch.stack([x['valid_loss'] for x in outputs]).mean()
valid_avg_iou = torch.stack([x['valid_iou'] for x in outputs]).mean()
valid_avg_mae = torch.stack([x['valid_mae'] for x in outputs]).mean()
#valid_avg_ssim = torch.stack([x['valid_ssim'] for x in outputs]).mean()
print(" \n IoU = {}".format(valid_avg_iou))
print(" \n Spatial Accuracy ,percents = {}".format(valid_avg_mae*100.0))
#print(" \n Similarity ,percents = {}".format(valid_avg_ssim * 100.0))
self.logger.experiment.add_scalar("Loss/Valid", valid_avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Valid", valid_avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'valid_epoch_loss': valid_avg_loss, 'valid_epoch_iou': valid_avg_iou})
self.data_table['valid_loss'].append(float(valid_avg_loss))
self.data_table['valid_iou'].append(float(valid_avg_iou))
class DentalRecurentVariationalSegmentor(DentalVariationalSegmentor):
def __init__(self, hparams):
super().__init__(hparams)
self.model = RecurentVariationalUNet(in_channels=INPUT_DIMENSION, out_channels=OUTPUT_DIMENSION)
self.reconstruction_criterion = torch.nn.BCELoss()
self.spatial_metric = SpatialMetric()
def evaluating_step(self, batch, batch_idx):
img, mask = batch[0], batch[1]
segments, reconstruction, kl = self(img)
seg_loss = self.criterion(segments, mask)
seg_accuracy = self.metric(segments, mask)
reconstruction_loss = self.reconstruction_criterion(reconstruction, img)
loss = seg_loss + reconstruction_loss + 1e-3*kl
reconstruction_accuracy = self.spatial_metric(reconstruction, img)
#ssim = self.ssim_metric(reconstruction, img)
accuracy_dictionary = {
"iou": seg_accuracy,
"mae": reconstruction_accuracy,
#"ssim": ssim,
}
return segments, loss, accuracy_dictionary
def training_step(self, batch, batch_idx):
_, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"loss": loss,
"iou": accuracy_dictionary["iou"],
"mae": accuracy_dictionary["mae"],
#"ssim": accuracy_dictionary["ssim"],
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True,logger= True)
return batch_dictionary
def validation_step(self, batch, batch_idx):
_, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"valid_loss": loss,
"valid_iou": accuracy_dictionary["iou"],
"valid_mae": accuracy_dictionary["mae"],
#"valid_ssim": accuracy_dictionary["ssim"],
}
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def test_step(self, batch, batch_idx):
self.test_counter +=1
result, loss, accuracy_dictionary = self.evaluating_step(batch, batch_idx)
batch_dictionary = {
"test_loss": loss,
"test_iou": accuracy_dictionary["iou"],
"test_mae": accuracy_dictionary["mae"],
#"test_ssim": accuracy_dictionary["ssim"],
}
output = torch.argmax(result.detach(), dim=1).squeeze(0).byte().cpu().numpy()
self.visualize(batch[0], output, accuracy_dictionary["iou"])
self.log_dict(batch_dictionary, prog_bar=True, on_step=True, on_epoch=True, sync_dist=True, logger=True)
return batch_dictionary
def training_epoch_end(self, outputs):
avg_loss = torch.stack([x['loss'] for x in outputs]).mean()
avg_iou = torch.stack([x['iou'] for x in outputs]).mean()
self.logger.experiment.add_scalar("Loss/Train", avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Train", avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'train_epoch_loss': avg_loss, 'train_epoch_iou': avg_iou})
self.data_table['train_loss'].append(float(avg_loss))
self.data_table['train_iou'].append(float(avg_iou))
def validation_epoch_end(self, outputs):
valid_avg_loss = torch.stack([x['valid_loss'] for x in outputs]).mean()
valid_avg_iou = torch.stack([x['valid_iou'] for x in outputs]).mean()
valid_avg_mae = torch.stack([x['valid_mae'] for x in outputs]).mean()
#valid_avg_ssim = torch.stack([x['valid_ssim'] for x in outputs]).mean()
print(" \n IoU = {}".format(valid_avg_iou))
print(" \n Spatial Accuracy ,percents = {}".format(valid_avg_mae*100.0))
#print(" \n Similarity ,percents = {}".format(valid_avg_ssim * 100.0))
self.logger.experiment.add_scalar("Loss/Valid", valid_avg_loss, self.current_epoch)
self.logger.experiment.add_scalar("IoU/Valid", valid_avg_iou, self.current_epoch)
for name, params in self.named_parameters():
self.logger.experiment.add_histogram(name, params, self.current_epoch)
self.log_dict({'valid_epoch_loss': valid_avg_loss, 'valid_epoch_iou': valid_avg_iou})
self.data_table['valid_loss'].append(float(valid_avg_loss))
self.data_table['valid_iou'].append(float(valid_avg_iou)) | 48.513064 | 171 | 0.671465 | 2,538 | 20,424 | 5.136722 | 0.079196 | 0.028227 | 0.027921 | 0.044105 | 0.824039 | 0.808085 | 0.783002 | 0.776559 | 0.759454 | 0.745033 | 0 | 0.00708 | 0.204759 | 20,424 | 421 | 172 | 48.513064 | 0.795592 | 0.032902 | 0 | 0.700297 | 0 | 0 | 0.082582 | 0.00684 | 0 | 0 | 0 | 0 | 0 | 1 | 0.109792 | false | 0 | 0.035608 | 0.011869 | 0.222552 | 0.023739 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a3df59aa5732c09a697fa50f1c3c49fcb3ee2559 | 2,570 | py | Python | test/pyaz/batch/application/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | test/pyaz/batch/application/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | 9 | 2021-09-24T16:37:24.000Z | 2021-12-24T00:39:19.000Z | test/pyaz/batch/application/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | import json, subprocess
from ... pyaz_utils import get_cli_name, get_params
def list(resource_group, name, maxresults=None):
params = get_params(locals())
command = "az batch application list " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def show(resource_group, name, application_name):
params = get_params(locals())
command = "az batch application show " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def create(resource_group, name, application_name, parameters=None):
params = get_params(locals())
command = "az batch application create " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def set(resource_group, name, application_name, allow_updates=None, display_name=None, default_version=None):
params = get_params(locals())
command = "az batch application set " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def delete(resource_group, name, application_name, yes=None):
params = get_params(locals())
command = "az batch application delete " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
| 34.72973 | 109 | 0.668872 | 310 | 2,570 | 5.477419 | 0.158065 | 0.08245 | 0.058893 | 0.061837 | 0.886926 | 0.811543 | 0.811543 | 0.811543 | 0.784452 | 0.666667 | 0 | 0.004975 | 0.217899 | 2,570 | 73 | 110 | 35.205479 | 0.839801 | 0 | 0 | 0.820896 | 0 | 0 | 0.071206 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074627 | false | 0 | 0.029851 | 0 | 0.179104 | 0.223881 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
43055663f41e9f9a42dd0f6b7e68cd69521378aa | 1,852 | py | Python | D2D/differential_manchester.py | dheeraj-bharadwaj/Vizualizer | f7eb293c80bc2fb94d7ac7a043f3e4e6a2690019 | [
"Apache-2.0"
] | null | null | null | D2D/differential_manchester.py | dheeraj-bharadwaj/Vizualizer | f7eb293c80bc2fb94d7ac7a043f3e4e6a2690019 | [
"Apache-2.0"
] | null | null | null | D2D/differential_manchester.py | dheeraj-bharadwaj/Vizualizer | f7eb293c80bc2fb94d7ac7a043f3e4e6a2690019 | [
"Apache-2.0"
] | 1 | 2020-10-12T04:15:46.000Z | 2020-10-12T04:15:46.000Z | def Differential_manchester(input_digital_signal):
digital_signal=list(input_digital_signal)
output_digital_signal,lock,pre=[],False,'S'
for i in range(len(digital_signal)):
        if digital_signal[i] == 0:
            # a 0 keeps the level state: the pair's polarity follows pre
            if pre == 'S':
                output_digital_signal.append(-1)
                output_digital_signal.append(1)
            else:
                output_digital_signal.append(1)
                output_digital_signal.append(-1)
        else:
            # a 1 toggles the level state before emitting the pair
            if pre == 'Z':
                pre = 'S'
                output_digital_signal.append(-1)
                output_digital_signal.append(1)
            else:
                pre = 'Z'
                output_digital_signal.append(1)
                output_digital_signal.append(-1)
    output_digital_signal.insert(0, 1)
return output_digital_signal | 42.090909 | 86 | 0.542117 | 207 | 1,852 | 4.555556 | 0.140097 | 0.454931 | 0.503712 | 0.583245 | 0.775186 | 0.73913 | 0.73913 | 0.73913 | 0.73913 | 0.73913 | 0 | 0.023237 | 0.349352 | 1,852 | 44 | 87 | 42.090909 | 0.759336 | 0.438445 | 0 | 0.388889 | 0 | 0 | 0.004926 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
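A quick sanity check of the encoding: each input bit maps to two output levels, plus the inserted leading level. The compact re-declaration below is logically equivalent to the function above and self-contained:

```python
def diff_manchester(bits):
    out, pre = [], 'S'
    for b in bits:
        if b == 0:
            # polarity of the pair follows the current level state
            out += [-1, 1] if pre == 'S' else [1, -1]
        else:
            # a 1 toggles the level state
            if pre == 'Z':
                pre = 'S'
                out += [-1, 1]
            else:
                pre = 'Z'
                out += [1, -1]
    return [1] + out

encoded = diff_manchester([1, 0])
# encoded == [1, 1, -1, 1, -1]
```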
4a567b7a5585f83177c2dba2666a372947b74638 | 33,032 | py | Python | plotData.py | Blue-Giant/MscaleDNN_tf1Class | ca36906724d41c51e5ae73bf011ebc0e2f2b3a26 | [
"MIT"
] | null | null | null | plotData.py | Blue-Giant/MscaleDNN_tf1Class | ca36906724d41c51e5ae73bf011ebc0e2f2b3a26 | [
"MIT"
] | null | null | null | plotData.py | Blue-Giant/MscaleDNN_tf1Class | ca36906724d41c51e5ae73bf011ebc0e2f2b3a26 | [
"MIT"
] | null | null | null | """
@author: LXA
Date: 2020 年 5 月 31 日
"""
import DNN_tools
import matplotlib.pyplot as plt
import numpy as np
from mpl_toolkits.axes_grid1.inset_locator import mark_inset
from mpl_toolkits.axes_grid1.inset_locator import inset_axes
from matplotlib.patches import ConnectionPatch
import matplotlib.cm as cm
from mpl_toolkits.mplot3d import Axes3D
from matplotlib.colors import LogNorm
# Plot a single loss curve from one network, e.g. only the loss to boundary (loss_bd)
def plotTrain_loss_1act_func(data2loss, lossType=None, seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
plt.figure()
ax = plt.gca()
plt.plot(data2loss, 'b-.', label=lossType)
plt.xlabel('epoch', fontsize=14)
plt.ylabel(lossType, fontsize=14)
plt.legend(fontsize=18)
    if xaxis_scale:
        ax.set_xscale('log')
if yaxis_scale:
ax.set_yscale('log')
# plt.title('loss_it', fontsize=15)
fntmp = '%s/%s%s' % (outPath, seedNo, lossType)
DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
# Plot several loss curves from one network, e.g. loss to boundary and loss to interior (loss_bd, loss_it)
def plot_2Trainlosses_1act_func(data2loss_1, data2loss_2, lossName1=None, lossName2=None, seedNo=1000, outPath=None,
lossType=None, xaxis_scale=False, yaxis_scale=False):
plt.figure()
ax = plt.gca()
plt.plot(data2loss_1, 'b-.', label=lossName1)
plt.plot(data2loss_2, 'r:', label=lossName2)
plt.xlabel('epoch', fontsize=14)
plt.ylabel('loss', fontsize=14)
plt.legend(fontsize=13)
    if xaxis_scale:
        ax.set_xscale('log')
if yaxis_scale:
ax.set_yscale('log')
# plt.title('loss_it', fontsize=15)
fntmp = '%s/%s%s' % (outPath, seedNo, lossType)
DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
# Plot several loss curves from one network, e.g. loss to boundary and loss to interior
def plot_3Trainlosses_1act_func(data2loss_1, data2loss_2, data2loss_3, lossName1=None, lossName2=None, lossName3=None,
lossType=None, seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
plt.figure()
ax = plt.gca()
plt.plot(data2loss_1, 'b-.', label=lossName1)
plt.plot(data2loss_2, 'r:', label=lossName2)
plt.plot(data2loss_3, 'c*', label=lossName3)
plt.xlabel('epoch', fontsize=14)
plt.ylabel('loss', fontsize=14)
plt.legend(fontsize=13)
    if xaxis_scale:
        ax.set_xscale('log')
if yaxis_scale:
ax.set_yscale('log')
# plt.title('loss_it', fontsize=15)
fntmp = '%s/%s%s' % (outPath, seedNo, lossType)
DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
# Plot the same type of loss from two networks, e.g. loss to boundary
def plotTrain_losses_2act_funs(data2loss_1, data2loss_2, lossName1=None, lossName2=None, lossType=None,
seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
plt.figure()
ax = plt.gca()
plt.plot(data2loss_1, 'b-.', label=lossName1)
plt.plot(data2loss_2, 'r:', label=lossName2)
plt.xlabel('epoch', fontsize=14)
plt.ylabel('loss', fontsize=14)
plt.legend(fontsize=13)
    if xaxis_scale:
        ax.set_xscale('log')
if yaxis_scale:
ax.set_yscale('log')
# plt.title('loss_it', fontsize=15)
fntmp = '%s/%s%s' % (outPath, seedNo, lossType)
DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
# Plot the same type of loss from three networks, e.g. loss to boundary
def plotTrain_losses_2Type2(data2loss_1, data2loss_2, data2loss_3, lossName1=None, lossName2=None, lossName3=None,
lossType=None, seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
plt.figure()
ax = plt.gca()
plt.plot(data2loss_1, 'b-.', label=lossName1)
plt.plot(data2loss_2, 'r:', label=lossName2)
plt.plot(data2loss_3, 'c*', label=lossName3)
plt.xlabel('epoch', fontsize=14)
plt.ylabel('loss', fontsize=14)
plt.legend(fontsize=13)
if xaxis_scale:
ax.set_yscale('log')
if yaxis_scale:
ax.set_yscale('log')
# plt.title('loss_it', fontsize=15)
fntmp = '%s/%s%s' % (outPath, seedNo, lossType)
DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


# This function can be replaced by plot_3losses_1Type(......)
def plotTrain_losses(loss2s2ReLU, loss2sReLU, loss2ReLU, lossType=None, seedNo=1000, outPath=None):
    if 'loss_it' == lossType:
        plt.figure()
        ax = plt.gca()
        plt.plot(loss2s2ReLU, 'b-.', label='s2ReLU')
        plt.plot(loss2sReLU, 'r:', label='sReLU')
        plt.plot(loss2ReLU, 'c-*', label='ReLU')
        plt.xlabel('epoch', fontsize=14)
        plt.ylabel('loss_it', fontsize=14)
        plt.legend(fontsize=13)
        # plt.title('loss_it', fontsize=15)
        fntmp = '%s/%sloss_it' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    elif 'loss_bd' == lossType:
        plt.figure()
        ax = plt.gca()
        plt.plot(loss2s2ReLU, 'b-.', label='s2ReLU')
        plt.plot(loss2sReLU, 'r:', label='sReLU')
        plt.plot(loss2ReLU, 'c-*', label='ReLU')
        ax.set_yscale('log')
        plt.xlabel('epoch', fontsize=14)
        plt.ylabel('loss_bd', fontsize=14)
        plt.legend(fontsize=13)
        # plt.title('loss_bd', fontsize=15)
        fntmp = '%s/%sloss_bd' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    elif 'loss' == lossType:
        plt.figure()
        ax = plt.gca()
        plt.plot(loss2s2ReLU, 'b-.', label='s2ReLU')
        plt.plot(loss2sReLU, 'r:', label='sReLU')
        plt.plot(loss2ReLU, 'c-*', label='ReLU')
        plt.xlabel('epoch', fontsize=14)
        plt.ylabel('loss', fontsize=14)
        plt.legend(fontsize=13)
        # plt.title('loss', fontsize=15)
        fntmp = '%s/%sloss' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTrain_MSE_1act_func(data2mse, mseType=None, seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(data2mse, 'b-.', label=mseType)
    plt.xlabel('epoch', fontsize=14)
    plt.ylabel('error', fontsize=14)
    plt.legend(fontsize=13)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    # plt.title('training mse', fontsize=13)
    fntmp = '%s/%strain_mse' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTrain_REL_1act_func(data2rel, relType=None, seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(data2rel, 'b-.', label=relType)
    plt.xlabel('epoch', fontsize=14)
    plt.ylabel('error', fontsize=14)
    plt.legend(fontsize=13)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    # plt.title('training rel', fontsize=13)
    fntmp = '%s/%strain_rel' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTrain_MSE_REL_1act_func(data2mse, data2rel, actName=None, seedNo=1000, outPath=None, xaxis_scale=False,
                                yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(data2mse, 'r-.', label='MSE')
    plt.plot(data2rel, 'b:', label='REL')
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    plt.xlabel('epoch', fontsize=18)
    plt.ylabel('error', fontsize=18)
    plt.legend(fontsize=18)
    # plt.title('training error', fontsize=15)
    if str.lower(actName) == 'srelu':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'sReLU')
    elif str.lower(actName) == 'sin':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'sin')
    elif str.lower(actName) == 's2relu':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 's2ReLU')
    elif str.lower(actName) == 's3relu':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 's3ReLU')
    elif str.lower(actName) == 'csrelu':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'CsReLU')
    elif str.lower(actName) == 'relu':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'ReLU')
    elif str.lower(actName) == 'elu':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'elu')
    elif str.lower(actName) == 'tanh':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'tanh')
    elif str.lower(actName) == 'sintanh':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'sintanh')
    elif str.lower(actName) == 'singauss':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'singauss')
    elif str.lower(actName) == 'gauss':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'gauss')
    elif str.lower(actName) == 'mexican':
        fntmp = '%s/%strainErr_%s' % (outPath, seedNo, 'mexican')
    elif str.lower(actName) == 'modify_mexican':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Ummexican2test')
    elif str.lower(actName) == 'sin_modify_mexican':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Usm_mexican2test')
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTrain_MSEs_2act_funcs(data2mse1, data2mse2, mseName1=None, mseName2=None, seedNo=1000, outPath=None,
                              xaxis_scale=False, yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(data2mse1, 'b-.', label=mseName1)
    plt.plot(data2mse2, 'r:', label=mseName2)
    plt.xlabel('epoch', fontsize=14)
    plt.ylabel('error', fontsize=14)
    plt.legend(fontsize=13)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    # plt.title('training mse', fontsize=13)
    fntmp = '%s/%strain_mses' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTrain_RELs_2act_funcs(data2rel1, data2rel2, relName1=None, relName2=None, seedNo=1000, outPath=None,
                              xaxis_scale=False, yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(data2rel1, 'b-.', label=relName1)
    plt.plot(data2rel2, 'r:', label=relName2)
    plt.xlabel('epoch', fontsize=14)
    plt.ylabel('error', fontsize=14)
    plt.legend(fontsize=13)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    # plt.title('training rel', fontsize=13)
    fntmp = '%s/%strain_rels' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTrain_MSEs_RELs_2act_funcs(mse2data1, mse2data2, rel2data1, rel2data2, actName1=None, actName2=None,
                                   seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
    plt.figure(figsize=(10, 8))
    ax = plt.gca()
    plt.plot(mse2data1, 'g-.', label=str('MSE-'+actName1))
    plt.plot(rel2data1, 'b:', label=str('REL-'+actName1))
    plt.plot(mse2data2, 'm--.', label=str('MSE-'+actName2))
    plt.plot(rel2data2, 'c-*', label=str('REL-'+actName2))
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    plt.xlabel('epoch', fontsize=14)
    plt.ylabel('error', fontsize=14)
    ax.legend(loc='center', bbox_to_anchor=(0.485, 1.055), ncol=3, fontsize=12)
    # plt.legend(fontsize=11)
    # plt.title(' train error', fontsize=15)
    fntmp = '%s/%strain_error' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTrain_MSEs_RELs_3act_funcs(mse2data1, mse2data2, mse2data3, rel2data1, rel2data2, rel2data3, actName1=None,
                                   actName2=None, actName3=None, seedNo=1000, outPath=None, xaxis_scale=False,
                                   yaxis_scale=False):
    plt.figure(figsize=(10, 8))
    ax = plt.gca()
    plt.plot(mse2data1, 'g-.', label=str('MSE-'+actName1))
    plt.plot(rel2data1, 'b:', label=str('REL-'+actName1))
    plt.plot(mse2data2, 'm--.', label=str('MSE-'+actName2))
    plt.plot(rel2data2, 'c-*', label=str('REL-'+actName2))
    plt.plot(mse2data3, color='k', marker='v', label=str('MSE-'+actName3))
    plt.plot(rel2data3, color='gold', marker='x', label=str('REL-'+actName3))
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    plt.xlabel('epoch', fontsize=14)
    plt.ylabel('error', fontsize=14)
    ax.legend(loc='center', bbox_to_anchor=(0.485, 1.055), ncol=3, fontsize=12)
    # plt.legend(fontsize=11)
    # plt.title(' train error', fontsize=15)
    fntmp = '%s/%strain_Errs' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
# ------------------------------------ plot test results --------------------------------------------------
def plot_2TestMSEs(data2mse1, data2mse2, mseType1=None, mseType2=None, epoches=None, seedNo=1000, outPath=None,
                   xaxis_scale=False, yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(epoches, data2mse1, 'r-.', label=mseType1)
    plt.plot(epoches, data2mse2, 'b:', label=mseType2)
    plt.xlabel('epoch/1000', fontsize=18)
    # plt.ylabel('L2error', fontsize=18)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    plt.legend(fontsize=18)
    plt.title('testing mse ', fontsize=15)
    fntmp = '%s/%stest_mse' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plot_2TestRELs(data2rel1, data2rel2, relType1=None, relType2=None, epoches=1000, seedNo=1000, outPath=None,
                   xaxis_scale=False, yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(epoches, data2rel1, 'r-.', label=relType1)
    plt.plot(epoches, data2rel2, 'b:', label=relType2)
    plt.xlabel('epoch/1000', fontsize=18)
    # plt.ylabel('L2error', fontsize=18)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    plt.legend(fontsize=18)
    plt.title('testing rel ', fontsize=15)
    fntmp = '%s/%stest_rel' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTest_MSE_REL(data2mse, data2rel, epoches, actName=None, seedNo=1000, outPath=None, xaxis_scale=False,
                     yaxis_scale=False):
    plt.figure()
    ax = plt.gca()
    plt.plot(epoches, data2mse, 'r-.', label='MSE')
    plt.plot(epoches, data2rel, 'b:', label='REL')
    plt.xlabel('epoch/1000', fontsize=18)
    # plt.ylabel('L2error', fontsize=18)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    plt.legend(fontsize=18)
    plt.title('testing error ', fontsize=15)
    fntmp = '%s/%stestERR_%s' % (outPath, seedNo, actName)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plot_Test_MSE_REL_2ActFuncs(data_mse1, data_rel1, data_mse2, data_rel2, epoches, actName1=None, actName2=None,
                                seedNo=1000, outPath=None, xaxis_scale=False, yaxis_scale=False):
    # fig2mse_test = plt.figure(figsize=(10, 8), dpi=98)
    fig2mse_test = plt.figure(figsize=(9, 6.5), dpi=98)
    ax = plt.gca()
    ax.plot(epoches, data_mse1, 'g-.', label=str('MSE-'+actName1))
    ax.plot(epoches, data_rel1, 'b:', label=str('REL-'+actName1))
    ax.plot(epoches, data_mse2, 'm--', label=str('MSE-'+actName2))
    ax.plot(epoches, data_rel2, 'c-*', label=str('REL-'+actName2))
    plt.xlabel('epoch/1000', fontsize=14)
    plt.ylabel('error', fontsize=14)
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    ax.legend(loc='center', bbox_to_anchor=(0.49, 1.06), ncol=3, fontsize=12)
    # plt.legend(fontsize=11)
    # plt.title('testing error ', fontsize=15)
    fntmp = '%s/%stest_error' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plot_Test_MSE_REL_3Types(mse2s2ReLU, mse2sReLU, mse2ReLU, rel2s2ReLU, rel2sReLU, rel2ReLU, epoches=100,
                             seedNo=1000, outPath=None):
    # fig2mse_test = plt.figure(figsize=(10, 8), dpi=98)
    fig2mse_test = plt.figure(figsize=(9, 6.5), dpi=98)
    ax = plt.gca()
    ax.plot(epoches, mse2s2ReLU, 'g-.', label='MSE-s2ReLU')
    ax.plot(epoches, rel2s2ReLU, 'b:', label='REL-s2ReLU')
    ax.plot(epoches, mse2sReLU, 'm--', label='MSE-sReLU')
    ax.plot(epoches, rel2sReLU, 'c-*', label='REL-sReLU')
    ax.plot(epoches, mse2ReLU, color='k', marker='v', label='MSE-ReLU')
    ax.plot(epoches, rel2ReLU, color='gold', marker='x', label='REL-ReLU')
    plt.xlabel('epoch/1000', fontsize=14)
    plt.ylabel('error', fontsize=14)
    ax.set_yscale('log')
    ax.legend(loc='center', bbox_to_anchor=(0.49, 1.06), ncol=3, fontsize=12)
    # plt.legend(fontsize=11)
    # plt.title('testing error ', fontsize=15)
    fntmp = '%s/%stest_error' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plotTest_MSEs_RELs_3act_funcs(mse2data1, mse2data2, mse2data3, rel2data1, rel2data2, rel2data3, actName1=None,
                                  actName2=None, actName3=None, seedNo=1000, outPath=None, xaxis_scale=False,
                                  yaxis_scale=False):
    plt.figure(figsize=(10, 8))
    ax = plt.gca()
    plt.plot(mse2data1, 'g-.', label=str('MSE-'+actName1))
    plt.plot(rel2data1, 'b:', label=str('REL-'+actName1))
    plt.plot(mse2data2, 'm--.', label=str('MSE-'+actName2))
    plt.plot(rel2data2, 'c-*', label=str('REL-'+actName2))
    plt.plot(mse2data3, color='k', marker='v', label=str('MSE-'+actName3))
    plt.plot(rel2data3, color='gold', marker='x', label=str('REL-'+actName3))
    if xaxis_scale:
        ax.set_xscale('log')
    if yaxis_scale:
        ax.set_yscale('log')
    plt.xlabel('epoch', fontsize=14)
    plt.ylabel('error', fontsize=14)
    ax.legend(loc='center', bbox_to_anchor=(0.485, 1.055), ncol=3, fontsize=15)
    # plt.legend(fontsize=11)
    # plt.title('test error', fontsize=15)
    fntmp = '%s/%stest_Errs' % (outPath, seedNo)
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plot_2solutions2test(exact_solu2test, predict_solu2test, coord_points2test=None,
                         batch_size2test=1000, seedNo=1000, outPath=None, subfig_type=1):
    if subfig_type == 1:
        plt.figure(figsize=(16, 10), dpi=98)
        fig, ax = plt.subplots(1, 1)  # plt.subplots(a, b) controls the number of subplots: a rows, b columns
        ax.plot(coord_points2test, exact_solu2test, 'b-.', label='true')
        ax.plot(coord_points2test, predict_solu2test, 'g:', label='predict')
        ax.legend(fontsize=10)
        ax.set_xlabel('epoch', fontsize=18)
        axins = inset_axes(ax, width="50%", height="40%", loc=8, bbox_to_anchor=(0.2, 0.4, 0.5, 0.5),
                           bbox_transform=ax.transAxes)
        # Plot the original data in the inset axes
        axins.plot(coord_points2test, exact_solu2test, color='b', linestyle='-.')
        axins.plot(coord_points2test, predict_solu2test, color='g', linestyle=':')
        axins.set_xticks([])
        axins.set_yticks([])
        # Set the zoom-in interval
        zone_left = int(0.4 * batch_size2test)
        zone_right = int(0.4 * batch_size2test) + 100
        # Expansion ratios of the axes (adjust according to the actual data)
        x_ratio = 0.0  # expansion ratio of the x-axis display range
        y_ratio = 0.075  # expansion ratio of the y-axis display range
        # Display range of the x-axis
        xlim0 = coord_points2test[zone_left] - (coord_points2test[zone_right] - coord_points2test[zone_left]) * x_ratio
        xlim1 = coord_points2test[zone_right] + (coord_points2test[zone_right] - coord_points2test[zone_left]) * x_ratio
        # Display range of the y-axis
        y = np.hstack((exact_solu2test[zone_left:zone_right], predict_solu2test[zone_left:zone_right]))
        ylim0 = np.min(y) - (np.max(y) - np.min(y)) * y_ratio
        ylim1 = np.max(y) + (np.max(y) - np.min(y)) * y_ratio
        # Adjust the display range of the inset axes
        axins.set_xlim(xlim0, xlim1)
        axins.set_ylim(ylim0, ylim1)
        # Draw the connector lines between the parent axes and the inset axes
        # loc1, loc2: the four corners of the axes
        # 1 (upper right), 2 (upper left), 3 (lower left), 4 (lower right)
        mark_inset(ax, axins, loc1=3, loc2=1, fc="none", ec='k', lw=1)
        fntmp = '%s/%ssolu2test' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    elif subfig_type == 2:
        plt.figure(figsize=(16, 10), dpi=98)
        ax = plt.gca()
        p1 = plt.subplot(121)  # 1 row, 2 columns, first subplot
        p2 = plt.subplot(122)  # 1 row, 2 columns, second subplot
        p1.plot(coord_points2test, exact_solu2test, color='b', linestyle='-.', label='true')
        p1.plot(coord_points2test, predict_solu2test, color='g', linestyle=':', label='predict')
        ax.legend(fontsize=10)
        p2.plot(coord_points2test, exact_solu2test, color='b', linestyle='-.', label='true')
        p2.plot(coord_points2test, predict_solu2test, color='g', linestyle=':', label='predict')
        p2.axis([0.35, 0.65, 0.2, 0.27])
        # plot the box marking the zoomed-in region
        tx0 = 0.35
        tx1 = 0.65
        ty0 = 0.2
        ty1 = 0.27
        sx = [tx0, tx1, tx1, tx0, tx0]
        sy = [ty0, ty0, ty1, ty1, ty0]
        p1.plot(sx, sy, "purple")
        # plot patch lines
        xy = (0.64, 0.265)
        xy2 = (0.36, 0.265)
        con = ConnectionPatch(xyA=xy2, xyB=xy, coordsA="data", coordsB="data", axesA=p2, axesB=p1)
        p2.add_artist(con)
        xy = (0.64, 0.21)
        xy2 = (0.36, 0.205)
        con = ConnectionPatch(xyA=xy2, xyB=xy, coordsA="data", coordsB="data",
                              axesA=p2, axesB=p1)
        p2.add_artist(con)
        fntmp = '%s/%ssolu2test' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    else:
        # fig11 = plt.figure(figsize=(10, 8))
        fig11 = plt.figure(figsize=(9, 6.5))
        ax = plt.gca()
        ax.plot(coord_points2test, exact_solu2test, 'b-.', label='exact')
        ax.plot(coord_points2test, predict_solu2test, 'r:', label='s2ReLU')
        # box = ax.get_position()
        # ax.set_position([box.x0, box.y0, box.width, box.height * 0.8])
        ax.legend(loc='right', bbox_to_anchor=(0.9, 1.05), ncol=4, fontsize=12)
        ax.set_xlabel('x', fontsize=14)
        ax.set_ylabel('u', fontsize=14)
        fntmp = '%s/%ssolu2test' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plot_3solutions2test(exact_solu2test, s2ReLU_solu2test, sReLU_solu2test, ReLU_solu2test,
                         coord_points2test=None, batch_size2test=1000, seedNo=1000, outPath=None, subfig_type=1):
    # Embed inset axes for drawing a locally zoomed-in view
    if subfig_type == 1:
        subgfig = plt.figure(figsize=(10, 8), dpi=98)
        ax = plt.gca()  # fig, ax = plt.subplots(a, b) controls the number of subplots: a rows, b columns
        ax.plot(coord_points2test, exact_solu2test, 'b-.', label='exact')
        ax.plot(coord_points2test, s2ReLU_solu2test, 'g:', label='s2ReLU')
        ax.plot(coord_points2test, sReLU_solu2test, 'm--', label='sReLU')
        ax.plot(coord_points2test, ReLU_solu2test, 'c-', label='ReLU')
        ax.legend(loc='right', bbox_to_anchor=(0.85, 1.03), ncol=4, fontsize=12)
        ax.set_xlabel('epoch', fontsize=14)
        axins = inset_axes(ax, width="50%", height="40%", loc=8, bbox_to_anchor=(0.2, 0.2, 0.5, 0.5),
                           bbox_transform=ax.transAxes)
        # Plot the original data in the inset axes
        axins.plot(coord_points2test, exact_solu2test, color='b', linestyle='-.')
        axins.plot(coord_points2test, s2ReLU_solu2test, color='g', linestyle=':')
        axins.plot(coord_points2test, sReLU_solu2test, color='m', linestyle='--')
        axins.plot(coord_points2test, ReLU_solu2test, color='c', linestyle='-')
        axins.set_xticks([])
        axins.set_yticks([])
        # Set the zoom-in interval
        zone_left = int(0.4 * batch_size2test)
        zone_right = int(0.4 * batch_size2test) + 150
        # Expansion ratios of the axes (adjust according to the actual data)
        x_ratio = 0.075  # expansion ratio of the x-axis display range
        y_ratio = 0.04  # expansion ratio of the y-axis display range
        # Display range of the x-axis
        xlim0 = coord_points2test[zone_left] - (coord_points2test[zone_right] - coord_points2test[zone_left]) * x_ratio
        xlim1 = coord_points2test[zone_right] + (coord_points2test[zone_right] - coord_points2test[zone_left]) * x_ratio
        # Display range of the y-axis
        y = np.hstack((exact_solu2test[zone_left:zone_right], s2ReLU_solu2test[zone_left:zone_right]))
        ylim0 = np.min(y) - (np.max(y) - np.min(y)) * y_ratio
        ylim1 = np.max(y) + (np.max(y) - np.min(y)) * y_ratio
        # Adjust the display range of the inset axes
        axins.set_xlim(xlim0, xlim1)
        axins.set_ylim(ylim0, ylim1)
        # Draw the connector lines between the parent axes and the inset axes
        # loc1, loc2: the four corners of the axes
        # 1 (upper right), 2 (upper left), 3 (lower left), 4 (lower right)
        mark_inset(ax, axins, loc1=3, loc2=1, fc="none", ec='k', lw=1)
        fntmp = '%s/%ssolu2test' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    elif subfig_type == 2:
        plt.figure(figsize=(16, 10), dpi=98)
        ax = plt.gca()
        p1 = plt.subplot(121)  # 1 row, 2 columns, first subplot
        p2 = plt.subplot(122)  # 1 row, 2 columns, second subplot
        p1.plot(coord_points2test, exact_solu2test, color='b', linestyle='-.', label='true')
        p1.plot(coord_points2test, s2ReLU_solu2test, color='g', linestyle=':', label='predict')
        ax.legend(fontsize=10)
        p2.plot(coord_points2test, exact_solu2test, color='b', linestyle='-.', label='true')
        p2.plot(coord_points2test, s2ReLU_solu2test, color='g', linestyle=':', label='predict')
        p2.axis([0.35, 0.65, 0.2, 0.27])
        # plot the box marking the zoomed-in region
        tx0 = 0.35
        tx1 = 0.65
        ty0 = 0.2
        ty1 = 0.27
        sx = [tx0, tx1, tx1, tx0, tx0]
        sy = [ty0, ty0, ty1, ty1, ty0]
        p1.plot(sx, sy, "purple")
        # plot patch lines
        xy = (0.64, 0.265)
        xy2 = (0.36, 0.265)
        con = ConnectionPatch(xyA=xy2, xyB=xy, coordsA="data", coordsB="data", axesA=p2, axesB=p1)
        p2.add_artist(con)
        xy = (0.64, 0.21)
        xy2 = (0.36, 0.205)
        con = ConnectionPatch(xyA=xy2, xyB=xy, coordsA="data", coordsB="data",
                              axesA=p2, axesB=p1)
        p2.add_artist(con)
        fntmp = '%s/%ssolu2test' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    else:
        # fig11 = plt.figure(figsize=(10, 8))
        fig11 = plt.figure(figsize=(9, 6.5))
        ax = plt.gca()
        ax.plot(coord_points2test, exact_solu2test, 'b-.', label='exact')
        ax.plot(coord_points2test, s2ReLU_solu2test, 'g:', label='s2ReLU')
        ax.plot(coord_points2test, sReLU_solu2test, 'm--', label='sReLU')
        ax.plot(coord_points2test, ReLU_solu2test, 'c-', label='ReLU')
        # box = ax.get_position()
        # ax.set_position([box.x0, box.y0, box.width, box.height * 0.8])
        ax.legend(loc='right', bbox_to_anchor=(0.9, 1.05), ncol=4, fontsize=12)
        ax.set_xlabel('x', fontsize=14)
        ax.set_ylabel('u', fontsize=14)
        fntmp = '%s/%ssolu2test' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plot_Hot_solution2test(solu2test, size_vec2mat=20, actName=None, seedNo=1000, outPath=None):
    solu2color = np.reshape(solu2test, (size_vec2mat, size_vec2mat))
    plt.figure()
    ax = plt.gca()
    plt.imshow(solu2color, interpolation='nearest', cmap=cm.coolwarm, origin='lower')
    plt.colorbar(shrink=0.9)
    plt.xticks(())
    plt.yticks(())
    # plt.title('exact solution', fontsize=14)
    if str.lower(actName) == 'utrue':
        fntmp = '%s/%s%s' % (outPath, seedNo, 'Utrue2test')
    elif str.lower(actName) == 'srelu':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'UsReLU2test')
    elif str.lower(actName) == 's2relu':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Us2ReLU2test')
    elif str.lower(actName) == 's3relu':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Us3ReLU2test')
    elif str.lower(actName) == 'csrelu':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'UCsReLU2test')
    elif str.lower(actName) == 'relu':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'UReLU2test')
    elif str.lower(actName) == 'tanh':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Utanh2test')
    elif str.lower(actName) == 'sintanh':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Ustanh2test')
    elif str.lower(actName) == 'singauss':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Usgauss2test')
    elif str.lower(actName) == 'gauss':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Ugauss2test')
    elif str.lower(actName) == 'mexican':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Umexican2test')
    elif str.lower(actName) == 'modify_mexican':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Ummexican2test')
    elif str.lower(actName) == 'sin_modify_mexican':
        fntmp = '%s/%s_%s' % (outPath, seedNo, 'Usm-mexican2test')
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)


def plot_scatter_solution2test(solu2test, test_batch, actName=None, seedNo=1000, outPath=None):
    dim2test_batch = 2
    if 2 == dim2test_batch:
        test_x_bach = np.reshape(test_batch[:, 0], newshape=[-1, 1])
        test_y_bach = np.reshape(test_batch[:, 1], newshape=[-1, 1])
        # 3D scatter plot of the solution
        fig = plt.figure(figsize=(10, 10))
        ax = Axes3D(fig)
        ax.scatter(test_x_bach, test_y_bach, solu2test, c='b', label=actName)
        # Draw the legend
        ax.legend(loc='best')
        # Add the axis labels (order: X, Y, Z)
        ax.set_xlabel('X', fontdict={'size': 15, 'color': 'red'})
        ax.set_ylabel('Y', fontdict={'size': 15, 'color': 'red'})
        ax.set_zlabel('u', fontdict={'size': 15, 'color': 'red'})
        # plt.title('solution', fontsize=15)
        fntmp = '%s/%ssolu' % (outPath, seedNo)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    else:
        return


def plot_scatter_solutions2test(solu1_test, solu2_test, test_batch, actName1=None, actName2=None, seedNo=1000,
                                outPath=None):
    dim2test_batch = 2
    if 2 == dim2test_batch:
        test_x_bach = np.reshape(test_batch[:, 0], newshape=[-1, 1])
        test_y_bach = np.reshape(test_batch[:, 1], newshape=[-1, 1])
        # 3D scatter plot of the solutions (exact and predicted)
        fig = plt.figure(figsize=(10, 10))
        ax = Axes3D(fig)
        ax.scatter(test_x_bach, test_y_bach, solu1_test, c='b', label=actName1)
        ax.scatter(test_x_bach, test_y_bach, solu2_test, c='r', label=actName2)
        # Draw the legend
        ax.legend(loc='best')
        # Add the axis labels (order: X, Y, Z)
        ax.set_xlabel('X', fontdict={'size': 15, 'color': 'red'})
        ax.set_ylabel('Y', fontdict={'size': 15, 'color': 'red'})
        ax.set_zlabel('u', fontdict={'size': 15, 'color': 'red'})
        # plt.title('solution', fontsize=15)
        fntmp = '%s/%ssolus_%s' % (outPath, seedNo, actName2)
        DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
    else:
        return


def plot_Hot_point_wise_err(point_wise_err, size_vec2mat=20, actName=None, seedNo=1000, outPath=None):
    # Heat map of the point-wise error distribution
    square_err_color2sin = np.reshape(point_wise_err, (size_vec2mat, size_vec2mat))
    plt.figure(figsize=(10, 8))
    ax = plt.gca()
    plt.imshow(square_err_color2sin, interpolation='nearest', cmap=cm.coolwarm, origin='lower')
    plt.colorbar(shrink=0.85)
    plt.xticks(())
    plt.yticks(())
    # plt.title('point-wise error', fontsize=14)
    if str.lower(actName) == 'srelu':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'sReLU')
    elif str.lower(actName) == 's2relu':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 's2ReLU')
    elif str.lower(actName) == 's3relu':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 's3ReLU')
    elif str.lower(actName) == 'csrelu':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'CsReLU')
    elif str.lower(actName) == 'relu':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'ReLU')
    elif str.lower(actName) == 'tanh':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'tanh')
    elif str.lower(actName) == 'sin':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'sin')
    elif str.lower(actName) == 'sintanh':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'stanh')
    elif str.lower(actName) == 'singauss':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'sgauss')
    elif str.lower(actName) == 'gauss':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'gauss')
    elif str.lower(actName) == 'mexican':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'mexican')
    elif str.lower(actName) == 'modify_mexican':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'mmexican')
    elif str.lower(actName) == 'sin_modify_mexican':
        fntmp = '%s/%spErr_%s' % (outPath, seedNo, 'sm-mexican')
    DNN_tools.mySaveFig(plt, fntmp, ax=ax, isax=1, iseps=0)
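The long `elif str.lower(actName) == ...` chains that build output filenames (as in `plotTrain_MSE_REL_1act_func` and `plot_Hot_point_wise_err` above) repeat the same format string many times and silently leave `fntmp` undefined for an unknown activation name. A dictionary lookup with a fallback avoids both problems. This is only an illustrative sketch, not part of the original module; `ACT2NAME` and `err_filename` are hypothetical names, and the mapping mirrors the one in `plot_Hot_point_wise_err`:

```python
# Sketch: replace the elif chains with a lookup table.
# Unknown activation names fall back to the raw name instead of
# leaving the filename variable undefined.
ACT2NAME = {
    'srelu': 'sReLU', 's2relu': 's2ReLU', 's3relu': 's3ReLU',
    'csrelu': 'CsReLU', 'relu': 'ReLU', 'tanh': 'tanh', 'sin': 'sin',
    'sintanh': 'stanh', 'singauss': 'sgauss', 'gauss': 'gauss',
    'mexican': 'mexican', 'modify_mexican': 'mmexican',
    'sin_modify_mexican': 'sm-mexican',
}


def err_filename(outPath, seedNo, actName):
    pretty = ACT2NAME.get(str.lower(actName), actName)
    return '%s/%spErr_%s' % (outPath, seedNo, pretty)
```

With this helper, each plotting function reduces its branch logic to a single call such as `fntmp = err_filename(outPath, seedNo, actName)`.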
4a59c1efb3ca9ef79a842ba5c469ed671af61e42 | 3,626 | py | Python | modules/mobile.py | tansyab1/LightNetPlus | ed226e5454b2144063a6d8132b07c90e6a64e2d3 | [
"MIT"
] | 240 | 2019-02-27T08:39:06.000Z | 2021-05-31T19:38:17.000Z | modules/mobile.py | tansyab1/LightNetPlus | ed226e5454b2144063a6d8132b07c90e6a64e2d3 | [
"MIT"
] | 8 | 2019-04-22T10:59:47.000Z | 2021-03-19T15:38:52.000Z | modules/mobile.py | tansyab1/LightNetPlus | ed226e5454b2144063a6d8132b07c90e6a64e2d3 | [
"MIT"
] | 56 | 2019-04-18T03:34:17.000Z | 2021-04-25T09:32:50.000Z |

import torch
import torch.nn as nn
from modules.inplace_abn.iabn import InPlaceABN

class InvertedResidual(nn.Module):
    def __init__(self, inp, oup, stride, dilate, expand_ratio):
        """
        InvertedResidual: Core block of the MobileNetV2
        :param inp: (int) Number of the input channels
        :param oup: (int) Number of the output channels
        :param stride: (int) Stride used in the Conv3x3
        :param dilate: (int) Dilation used in the Conv3x3
        :param expand_ratio: (int) Expand ratio of the Channel Width of the Block
        """
        super(InvertedResidual, self).__init__()
        self.stride = stride
        assert stride in [1, 2]

        self.use_res_connect = self.stride == 1 and inp == oup

        self.conv = nn.Sequential(
            # step 1. point-wise convolution
            nn.Conv2d(in_channels=inp, out_channels=inp * expand_ratio,
                      kernel_size=1, stride=1, padding=0, dilation=1, groups=1, bias=False),
            nn.BatchNorm2d(num_features=inp * expand_ratio),
            nn.LeakyReLU(inplace=True, negative_slope=0.01),

            # step 2. depth-wise convolution
            nn.Conv2d(in_channels=inp * expand_ratio, out_channels=inp * expand_ratio,
                      kernel_size=3, stride=stride, padding=dilate, dilation=dilate,
                      groups=inp * expand_ratio, bias=False),
            nn.BatchNorm2d(num_features=inp * expand_ratio),
            nn.LeakyReLU(inplace=True, negative_slope=0.01),

            # step 3. point-wise convolution
            nn.Conv2d(in_channels=inp * expand_ratio, out_channels=oup,
                      kernel_size=1, stride=1, padding=0, dilation=1, groups=1, bias=False),
            nn.BatchNorm2d(num_features=oup),
        )

    def forward(self, x):
        if self.use_res_connect:
            return x + self.conv(x)
        else:
            return self.conv(x)


class InvertedResidualIABN(nn.Module):
    def __init__(self, inp, oup, stride, dilate, expand_ratio, norm_act=InPlaceABN):
        """
        InvertedResidual: Core block of the MobileNetV2
        :param inp: (int) Number of the input channels
        :param oup: (int) Number of the output channels
        :param stride: (int) Stride used in the Conv3x3
        :param dilate: (int) Dilation used in the Conv3x3
        :param expand_ratio: (int) Expand ratio of the Channel Width of the Block
        """
        super(InvertedResidualIABN, self).__init__()
        self.stride = stride
        assert stride in [1, 2]

        self.use_res_connect = self.stride == 1 and inp == oup

        self.conv = nn.Sequential(
            # step 1. point-wise convolution
            norm_act(inp),
            nn.Conv2d(in_channels=inp, out_channels=inp * expand_ratio,
                      kernel_size=1, stride=1, padding=0, dilation=1, groups=1, bias=False),

            # step 2. depth-wise convolution
            norm_act(inp * expand_ratio),
            nn.Conv2d(in_channels=inp * expand_ratio, out_channels=inp * expand_ratio,
                      kernel_size=3, stride=stride, padding=dilate, dilation=dilate,
                      groups=inp * expand_ratio, bias=False),

            # step 3. point-wise convolution
            norm_act(inp * expand_ratio),
            nn.Conv2d(in_channels=inp * expand_ratio, out_channels=oup,
                      kernel_size=1, stride=1, padding=0, dilation=1, groups=1, bias=False)
        )

    def forward(self, x):
        if self.use_res_connect:
            return x + self.conv(x)
        else:
            return self.conv(x)
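The channel bookkeeping of the inverted-residual block (expand by `expand_ratio`, depth-wise filter at the expanded width, project back to `oup`) can be checked by hand without PyTorch. The sketch below only mirrors the arithmetic inside `InvertedResidual`; the helper names are hypothetical and not part of the module:

```python
def inverted_residual_widths(inp, oup, expand_ratio):
    """(in_channels, out_channels) seen by each of the three stages."""
    hidden = inp * expand_ratio
    return [
        (inp, hidden),     # step 1: 1x1 point-wise expansion
        (hidden, hidden),  # step 2: 3x3 depth-wise conv (groups == hidden)
        (hidden, oup),     # step 3: 1x1 point-wise projection
    ]


def uses_residual(inp, oup, stride):
    # The skip connection is only added when the block preserves shape,
    # matching `self.use_res_connect` above.
    return stride == 1 and inp == oup
```

For a typical MobileNetV2 setting such as `inp=32, oup=16, expand_ratio=6`, the widths are `32 -> 192 -> 192 -> 16`, and no residual is added because the channel count changes.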
4a83795e45b92ba3feec7d5cfc76e906ae242cb0 | 110 | py | Python | Task/Quine/Python/quine-1.py | LaudateCorpus1/RosettaCodeData | 9ad63ea473a958506c041077f1d810c0c7c8c18d | [
"Info-ZIP"
] | 1 | 2021-05-05T13:42:20.000Z | 2021-05-05T13:42:20.000Z | Task/Quine/Python/quine-1.py | seanwallawalla-forks/RosettaCodeData | 9ad63ea473a958506c041077f1d810c0c7c8c18d | [
"Info-ZIP"
] | null | null | null | Task/Quine/Python/quine-1.py | seanwallawalla-forks/RosettaCodeData | 9ad63ea473a958506c041077f1d810c0c7c8c18d | [
"Info-ZIP"
] | null | null | null |

w = "print('w = ' + chr(34) + w + chr(34) + chr(10) + w)"
print('w = ' + chr(34) + w + chr(34) + chr(10) + w)
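One way to convince yourself that the two lines above really form a quine is to run them and compare the captured stdout against the source text. A small stdlib-only sketch (it rebuilds the source with the same `chr(34)`/`chr(10)` trick so no escaping is needed):

```python
import io
import contextlib

# Reconstruct the quine's source: first line assigns the print statement
# to w (quoted via chr(34)), second line is the print statement itself.
line2 = "print('w = ' + chr(34) + w + chr(34) + chr(10) + w)"
src = 'w = ' + chr(34) + line2 + chr(34) + chr(10) + line2 + chr(10)

# Execute the program and capture what it writes to stdout.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(src)

# A quine prints exactly its own source.
assert buf.getvalue() == src
```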
4a8fc52ebbd76cf048456ed30ea8e5e1f5b35ae9 | 40,498 | py | Python | spark_fhir_schemas/r4/resources/communication.py | imranq2/SparkFhirSchemas | 24debae6980fb520fe55aa199bdfd43c0092eb9c | [
"Apache-2.0"
] | 2 | 2020-10-31T23:25:01.000Z | 2021-06-09T14:12:42.000Z | spark_fhir_schemas/r4/resources/communication.py | imranq2/SparkFhirSchemas | 24debae6980fb520fe55aa199bdfd43c0092eb9c | [
"Apache-2.0"
] | null | null | null | spark_fhir_schemas/r4/resources/communication.py | imranq2/SparkFhirSchemas | 24debae6980fb520fe55aa199bdfd43c0092eb9c | [
"Apache-2.0"
] | null | null | null |

from typing import Union, List, Optional
from pyspark.sql.types import StructType, StructField, StringType, ArrayType, DataType


# This file is auto-generated by generate_schema so do not edit it manually
# noinspection PyPep8Naming
class CommunicationSchema:
    """
    An occurrence of information being transmitted; e.g. an alert that was sent to
    a responsible provider, a public health agency that was notified about a
    reportable condition.
    """

    # noinspection PyDefaultArgument
    @staticmethod
    def get_schema(
        max_nesting_depth: Optional[int] = 6,
        nesting_depth: int = 0,
        nesting_list: List[str] = [],
        max_recursion_limit: Optional[int] = 2,
        include_extension: Optional[bool] = False,
        extension_fields: Optional[List[str]] = [
            "valueBoolean",
            "valueCode",
            "valueDate",
            "valueDateTime",
            "valueDecimal",
            "valueId",
            "valueInteger",
            "valuePositiveInt",
            "valueString",
            "valueTime",
            "valueUnsignedInt",
            "valueUri",
            "valueUrl",
        ],
        extension_depth: int = 0,
        max_extension_depth: Optional[int] = 2,
        include_modifierExtension: Optional[bool] = False,
    ) -> Union[StructType, DataType]:
        """
        An occurrence of information being transmitted; e.g. an alert that was sent to
        a responsible provider, a public health agency that was notified about a
        reportable condition.

        resourceType: This is a Communication resource

        id: The logical id of the resource, as used in the URL for the resource. Once
            assigned, this value never changes.

        meta: The metadata about the resource. This is content that is maintained by the
            infrastructure. Changes to the content might not always be associated with
            version changes to the resource.

        implicitRules: A reference to a set of rules that were followed when the resource was
            constructed, and which must be understood when processing the content. Often,
            this is a reference to an implementation guide that defines the special rules
            along with other profiles etc.

        language: The base language in which the resource is written.

        text: A human-readable narrative that contains a summary of the resource and can be
            used to represent the content of the resource to a human. The narrative need
            not encode all the structured data, but is required to contain sufficient
            detail to make it "clinically safe" for a human to just read the narrative.
            Resource definitions may define what content should be represented in the
            narrative to ensure clinical safety.

        contained: These resources do not have an independent existence apart from the resource
            that contains them - they cannot be identified independently, and nor can they
            have their own independent transaction scope.

        extension: May be used to represent additional information that is not part of the basic
            definition of the resource. To make the use of extensions safe and manageable,
            there is a strict set of governance applied to the definition and use of
            extensions. Though any implementer can define an extension, there is a set of
            requirements that SHALL be met as part of the definition of the extension.

        modifierExtension: May be used to represent additional information that is not part of the basic
            definition of the resource and that modifies the understanding of the element
            that contains it and/or the understanding of the containing element's
            descendants. Usually modifier elements provide negation or qualification. To
make the use of extensions safe and manageable, there is a strict set of
governance applied to the definition and use of extensions. Though any
implementer is allowed to define an extension, there is a set of requirements
that SHALL be met as part of the definition of the extension. Applications
processing a resource are required to check for modifier extensions.
Modifier extensions SHALL NOT change the meaning of any elements on Resource
or DomainResource (including cannot change the meaning of modifierExtension
itself).
identifier: Business identifiers assigned to this communication by the performer or other
systems which remain constant as the resource is updated and propagates from
server to server.
instantiatesCanonical: The URL pointing to a FHIR-defined protocol, guideline, orderset or other
definition that is adhered to in whole or in part by this Communication.
instantiatesUri: The URL pointing to an externally maintained protocol, guideline, orderset or
other definition that is adhered to in whole or in part by this Communication.
basedOn: An order, proposal or plan fulfilled in whole or in part by this
Communication.
partOf: Part of this action.
inResponseTo: Prior communication that this communication is in response to.
status: The status of the transmission.
statusReason: Captures the reason for the current state of the Communication.
category: The type of message conveyed such as alert, notification, reminder,
instruction, etc.
priority: Characterizes how quickly the planned or in progress communication must be
addressed. Includes concepts such as stat, urgent, routine.
medium: A channel that was used for this communication (e.g. email, fax).
subject: The patient or group that was the focus of this communication.
topic: Description of the purpose/content, similar to a subject line in an email.
about: Other resources that pertain to this communication and to which this
communication should be associated.
encounter: The Encounter during which this Communication was created or to which the
creation of this record is tightly associated.
sent: The time when this communication was sent.
received: The time when this communication arrived at the destination.
recipient: The entity (e.g. person, organization, clinical information system, care team
or device) which was the target of the communication. If receipts need to be
tracked by an individual, a separate resource instance will need to be created
for each recipient. Multiple recipient communications are intended where
either receipts are not tracked (e.g. a mass mail-out) or a receipt is
captured in aggregate (all emails confirmed received by a particular time).
sender: The entity (e.g. person, organization, clinical information system, or device)
which was the source of the communication.
reasonCode: The reason or justification for the communication.
reasonReference: Indicates another resource whose existence justifies this communication.
payload: Text, attachment(s), or resource(s) that was communicated to the recipient.
note: Additional notes or commentary about the communication by the sender, receiver
or other interested parties.
"""
from spark_fhir_schemas.r4.simple_types.id import idSchema
from spark_fhir_schemas.r4.complex_types.meta import MetaSchema
from spark_fhir_schemas.r4.simple_types.uri import uriSchema
from spark_fhir_schemas.r4.simple_types.code import codeSchema
from spark_fhir_schemas.r4.complex_types.narrative import NarrativeSchema
from spark_fhir_schemas.r4.complex_types.resourcelist import ResourceListSchema
from spark_fhir_schemas.r4.complex_types.extension import ExtensionSchema
from spark_fhir_schemas.r4.complex_types.identifier import IdentifierSchema
from spark_fhir_schemas.r4.simple_types.canonical import canonicalSchema
from spark_fhir_schemas.r4.complex_types.reference import ReferenceSchema
from spark_fhir_schemas.r4.complex_types.codeableconcept import (
CodeableConceptSchema,
)
from spark_fhir_schemas.r4.simple_types.datetime import dateTimeSchema
from spark_fhir_schemas.r4.complex_types.communication_payload import (
Communication_PayloadSchema,
)
from spark_fhir_schemas.r4.complex_types.annotation import AnnotationSchema
if (
max_recursion_limit
and nesting_list.count("Communication") >= max_recursion_limit
) or (max_nesting_depth and nesting_depth >= max_nesting_depth):
return StructType([StructField("id", StringType(), True)])
# add my name to recursion list for later
my_nesting_list: List[str] = nesting_list + ["Communication"]
schema = StructType(
[
# This is a Communication resource
StructField("resourceType", StringType(), True),
# The logical id of the resource, as used in the URL for the resource. Once
# assigned, this value never changes.
StructField(
"id",
idSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# The metadata about the resource. This is content that is maintained by the
# infrastructure. Changes to the content might not always be associated with
# version changes to the resource.
StructField(
"meta",
MetaSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# A reference to a set of rules that were followed when the resource was
# constructed, and which must be understood when processing the content. Often,
# this is a reference to an implementation guide that defines the special rules
# along with other profiles etc.
StructField(
"implicitRules",
uriSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# The base language in which the resource is written.
StructField(
"language",
codeSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# A human-readable narrative that contains a summary of the resource and can be
# used to represent the content of the resource to a human. The narrative need
# not encode all the structured data, but is required to contain sufficient
# detail to make it "clinically safe" for a human to just read the narrative.
# Resource definitions may define what content should be represented in the
# narrative to ensure clinical safety.
StructField(
"text",
NarrativeSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# These resources do not have an independent existence apart from the resource
# that contains them - they cannot be identified independently, and nor can they
# have their own independent transaction scope.
StructField(
"contained",
ArrayType(
ResourceListSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# May be used to represent additional information that is not part of the basic
# definition of the resource. To make the use of extensions safe and manageable,
# there is a strict set of governance applied to the definition and use of
# extensions. Though any implementer can define an extension, there is a set of
# requirements that SHALL be met as part of the definition of the extension.
StructField(
"extension",
ArrayType(
ExtensionSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# May be used to represent additional information that is not part of the basic
# definition of the resource and that modifies the understanding of the element
# that contains it and/or the understanding of the containing element's
# descendants. Usually modifier elements provide negation or qualification. To
# make the use of extensions safe and manageable, there is a strict set of
# governance applied to the definition and use of extensions. Though any
# implementer is allowed to define an extension, there is a set of requirements
# that SHALL be met as part of the definition of the extension. Applications
# processing a resource are required to check for modifier extensions.
#
# Modifier extensions SHALL NOT change the meaning of any elements on Resource
# or DomainResource (including cannot change the meaning of modifierExtension
# itself).
StructField(
"modifierExtension",
ArrayType(
ExtensionSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# Business identifiers assigned to this communication by the performer or other
# systems which remain constant as the resource is updated and propagates from
# server to server.
StructField(
"identifier",
ArrayType(
IdentifierSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# The URL pointing to a FHIR-defined protocol, guideline, orderset or other
# definition that is adhered to in whole or in part by this Communication.
StructField(
"instantiatesCanonical",
ArrayType(
canonicalSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# The URL pointing to an externally maintained protocol, guideline, orderset or
# other definition that is adhered to in whole or in part by this Communication.
StructField(
"instantiatesUri",
ArrayType(
uriSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# An order, proposal or plan fulfilled in whole or in part by this
# Communication.
StructField(
"basedOn",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# Part of this action.
StructField(
"partOf",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# Prior communication that this communication is in response to.
StructField(
"inResponseTo",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# The status of the transmission.
StructField(
"status",
codeSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# Captures the reason for the current state of the Communication.
StructField(
"statusReason",
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# The type of message conveyed such as alert, notification, reminder,
# instruction, etc.
StructField(
"category",
ArrayType(
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# Characterizes how quickly the planned or in progress communication must be
# addressed. Includes concepts such as stat, urgent, routine.
StructField(
"priority",
codeSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# A channel that was used for this communication (e.g. email, fax).
StructField(
"medium",
ArrayType(
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# The patient or group that was the focus of this communication.
StructField(
"subject",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# Description of the purpose/content, similar to a subject line in an email.
StructField(
"topic",
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# Other resources that pertain to this communication and to which this
# communication should be associated.
StructField(
"about",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# The Encounter during which this Communication was created or to which the
# creation of this record is tightly associated.
StructField(
"encounter",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# The time when this communication was sent.
StructField(
"sent",
dateTimeSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# The time when this communication arrived at the destination.
StructField(
"received",
dateTimeSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# The entity (e.g. person, organization, clinical information system, care team
# or device) which was the target of the communication. If receipts need to be
# tracked by an individual, a separate resource instance will need to be created
# for each recipient. Multiple recipient communications are intended where
# either receipts are not tracked (e.g. a mass mail-out) or a receipt is
# captured in aggregate (all emails confirmed received by a particular time).
StructField(
"recipient",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# The entity (e.g. person, organization, clinical information system, or device)
# which was the source of the communication.
StructField(
"sender",
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth + 1,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
),
True,
),
# The reason or justification for the communication.
StructField(
"reasonCode",
ArrayType(
CodeableConceptSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# Indicates another resource whose existence justifies this communication.
StructField(
"reasonReference",
ArrayType(
ReferenceSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# Text, attachment(s), or resource(s) that was communicated to the recipient.
StructField(
"payload",
ArrayType(
Communication_PayloadSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
# Additional notes or commentary about the communication by the sender, receiver
# or other interested parties.
StructField(
"note",
ArrayType(
AnnotationSchema.get_schema(
max_nesting_depth=max_nesting_depth,
nesting_depth=nesting_depth + 1,
nesting_list=my_nesting_list,
max_recursion_limit=max_recursion_limit,
include_extension=include_extension,
extension_fields=extension_fields,
extension_depth=extension_depth,
max_extension_depth=max_extension_depth,
include_modifierExtension=include_modifierExtension,
)
),
True,
),
]
)
if not include_extension:
schema.fields = [
c
if c.name != "extension"
else StructField("extension", StringType(), True)
for c in schema.fields
]
if not include_modifierExtension:
schema.fields = [
c
if c.name != "modifierExtension"
else StructField("modifierExtension", StringType(), True)
for c in schema.fields
]
return schema
# parser_compiler/parser.py
# repo: lohhans/Compiladores-2020.4 (MIT), commit c196c11d0c1ec3b25b54b01e0729474205f328ed
import re
from parser_compiler.util import *
from pprint import pprint
# print('Entered... Type: %s, lexeme: %s, at line: %s' % (self.tokenAtual().tipo, self.tokenAtual().lexema, self.tokenAtual().linha))
class Parser:
def __init__(self, tabelaDeTokens):
self.tabelaDeTokens = tabelaDeTokens
self.indexDaTabelaDeTokens = 0
self.indexLookAhead = 0
# self.listaEscopos = []
self.indexEscopoAtual = -1
self.tabelaDeSimbolos = []
# Pra saber na semantica qual declaracao de variavel no codigo tá sendo checada
self.indexDaDeclaracaoDaVariavelAtual = -1
self.indexEscopoAntesDaFuncao = 0
self.tabelaDeTresEnderecos = []
self.tempTresEnderecos = ''
def tokenAtual(self):
return self.tabelaDeTokens[self.indexDaTabelaDeTokens]
def tokenLookAhead(self):
self.indexLookAhead = self.indexDaTabelaDeTokens + 1
return self.tabelaDeTokens[self.indexLookAhead]
def start(self):
escopoPai = self.indexEscopoAtual # (-1 -> start)
self.indexEscopoAtual += 1
self.statement_list() # Syntactic analysis
for linha in self.tabelaDeTresEnderecos:
pprint(linha)
print('\n')
self.checkSemantica()
return
def statement_list(self):
if self.tokenAtual().tipo == "END":
return
else:
self.statement()
self.statement_list()
return
def statement(self):
if self.tokenAtual().tipo == "PROGRAM":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT":
self.indexDaTabelaDeTokens += 1
while self.tokenAtual().tipo != "CRIGHT":
self.block_statement()
if self.tokenAtual().tipo == "CRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "END":
print("\nEND OF SYNTACTIC ANALYSIS - SUCCESS :)\n")
# Success
else:
raise Exception(
"Erro sintatico: falta do END na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Syntax error: missing CRIGHT at line "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Syntax error: missing CLEFT at line "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Syntax error: malformed code at line "
+ str(self.tokenAtual().linha)
)
# <block>
def block_statement(self):
# SCOPE OK
# <declaration_var>
if self.tokenAtual().tipo == "INT" or self.tokenAtual().tipo == "BOOL":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.declaration_var_statement(temp)
return temp
# SCOPE OK
# <declaration_func>
if self.tokenAtual().tipo == "FUNC":
# Order: [scope, type, returnType, id, [[params], [params], [params]]]
# Note: there may be zero or more params
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
# temp.append('FUNC')
temp.append(self.tokenAtual().tipo)
self.declaration_func_statement(temp)
return temp
# SCOPE OK
# <declaration_proc>
if self.tokenAtual().tipo == "PROC":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
# temp.append('PROC')
temp.append(self.tokenAtual().tipo)
temp = self.declaration_proc_statement(temp)
self.tabelaDeSimbolos.append(temp)
# BEGIN three-address code - PROC
nomeDaFuncao = temp[3]
paramsDaFuncao = temp[4]
self.tabelaDeTresEnderecos.append(('label', nomeDaFuncao, 'null'))
for param in paramsDaFuncao:
self.tabelaDeTresEnderecos.append(('pop', param[2], 'null'))
self.tabelaDeTresEnderecos.append(('ret', 'null', 'null'))
return temp
# SCOPE OK
# Function and procedure calls
if self.tokenAtual().tipo == "CALL":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
# <call_func>
if self.tokenAtual().tipo == "FUNC":
temp.append(self.tokenAtual().tipo)
temp = self.call_func_statement(temp)
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
self.tabelaDeSimbolos.append(temp)
return temp
else:
raise Exception(
"Syntax error: missing semicolon at line "
+ str(self.tokenAtual().linha)
)
# <call_proc>
elif self.tokenAtual().tipo == "PROC":
temp.append(self.tokenAtual().tipo)
temp = self.call_proc_statement(temp)
# self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
self.tabelaDeSimbolos.append(temp)
return temp
else:
raise Exception(
"Syntax error: missing semicolon at line "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Syntax error: missing PROC or FUNC at line "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <print_statement>
if self.tokenAtual().tipo == "PRINT":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.print_statement(temp)
return temp
# SCOPE OK
# <if_statement>
if self.tokenAtual().tipo == "IF":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.if_statement(temp)
return temp
# SCOPE OK
# <while_statement>
if self.tokenAtual().tipo == "WHILE":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.while_statement(temp)
return temp
# SCOPE OK
# <identifier>
if self.tokenAtual().tipo == "ID":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
temp.append(self.tokenAtual().lexema)
self.call_var_statement(temp)
return temp
else:
return
# block2 is the block containing break/continue, which may only be used inside a while
def block2_statement(self):
# SCOPE OK
# <declaration_var>
if self.tokenAtual().tipo == "INT" or self.tokenAtual().tipo == "BOOL":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.declaration_var_statement(temp)
return temp
# SCOPE OK
# Function and procedure calls
if self.tokenAtual().tipo == "CALL":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
# <call_func>
if self.tokenAtual().tipo == "FUNC":
temp.append(self.tokenAtual().tipo)
temp = self.call_func_statement(temp)
if self.tokenAtual().tipo == "SEMICOLON":
self.tabelaDeSimbolos.append(temp)
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
# <call_proc>
elif self.tokenAtual().tipo == "PROC":
temp.append(self.tokenAtual().tipo)
temp = self.call_proc_statement(temp)
# self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.tabelaDeSimbolos.append(temp)
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta de PROC ou FUNC na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <print_statement>
if self.tokenAtual().tipo == "PRINT":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.print_statement(temp)
return temp
# SCOPE OK
# <if_statement>
if self.tokenAtual().tipo == "IF":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.if_statement2(temp)
return temp
# ELSE error handling
if self.tokenAtual().tipo == "ELSE":
raise Exception(
"Erro sintatico: ELSE adicionado de maneira incorreta na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <while_statement>
if self.tokenAtual().tipo == "WHILE":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.while_statement(temp)
return temp
# SCOPE OK
# <identifier>
if self.tokenAtual().tipo == "ID":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
temp.append(self.tokenAtual().lexema)
self.call_var_statement(temp)
return temp
# SCOPE OK
# <unconditional_branch>
if self.tokenAtual().tipo == "BREAK" or self.tokenAtual().tipo == "CONTINUE":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.unconditional_branch_statement()
return temp
else:
raise Exception(
"Erro sintatico: bloco vazio na linha " +
str(self.tokenAtual().linha)
)
# block3 is the if/else block; functions and procedures cannot be declared inside it
def block3_statement(self):
# SCOPE OK
# <declaration_var>
if self.tokenAtual().tipo == "INT" or self.tokenAtual().tipo == "BOOL":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.declaration_var_statement(temp)
return temp
# SCOPE OK
# Function and procedure calls
if self.tokenAtual().tipo == "CALL":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
# <call_func>
if self.tokenAtual().tipo == "FUNC":
temp.append(self.tokenAtual().tipo)
temp = self.call_func_statement(temp)
if self.tokenAtual().tipo == "SEMICOLON":
self.tabelaDeSimbolos.append(temp)
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
# <call_proc>
elif self.tokenAtual().tipo == "PROC":
temp.append(self.tokenAtual().tipo)
temp = self.call_proc_statement(temp)
# self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.tabelaDeSimbolos.append(temp)
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta de PROC ou FUNC na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <print_statement>
if self.tokenAtual().tipo == "PRINT":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.print_statement(temp)
return temp
# SCOPE OK
# <if_statement>
if self.tokenAtual().tipo == "IF":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.if_statement(temp)
return temp
# ELSE error handling
if self.tokenAtual().tipo == "ELSE":
raise Exception(
"Erro sintatico: ELSE adicionado de maneira incorreta na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <while_statement>
if self.tokenAtual().tipo == "WHILE":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
self.while_statement(temp)
return temp
# SCOPE OK
# <identifier>
if self.tokenAtual().tipo == "ID":
temp = []
temp.append(self.indexEscopoAtual)
temp.append(self.tokenAtual().linha)
temp.append(self.tokenAtual().tipo)
temp.append(self.tokenAtual().lexema)
self.call_var_statement(temp)
return temp
else:
raise Exception(
"Erro sintatico: bloco vazio na linha " +
str(self.tokenAtual().linha)
)
# SCOPE OK
# <declaration_var> OK
def declaration_var_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID":
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ATB": # assignment
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
tempEndVar = []
# the contents of the variable
self.end_var_statement(tempEndVar)
temp.append(tempEndVar)
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
self.tabelaDeSimbolos.append(temp)
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da atribuição na linha "
+ str(self.tokenAtual().linha)
)
# THREE-ADDRESS CODE
self.tabelaDeTresEnderecos.append(('mov', temp[3], 'temp'))
else:
raise Exception(
"Erro sintatico: falta do ID na linha " +
str(self.tokenAtual().linha)
)
# SCOPE OK
# <end_var> OK
def end_var_statement(self, tempEndVar):
# <call_func>
if self.tokenAtual().tipo == "CALL":
tempEndVar.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
# <call_func>
if self.tokenAtual().tipo == "FUNC":
tempEndVar.append(self.tokenAtual().tipo)
self.call_func_statement(tempEndVar)
return
else:
raise Exception(
"Erro sintatico: chamada de função erroneamente na linha "
+ str(self.tokenAtual().linha)
)
# <boolean>
if self.tokenAtual().tipo == "BOOLEAN":
if (
self.tokenAtual().lexema == "True"
or self.tokenAtual().lexema == "False"
):
tempEndVar.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
return
else:
raise Exception(
"Erro sintatico: boolean atribuido erroneamente na linha "
+ str(self.tokenAtual().linha)
)
# <num>
if self.tokenAtual().tipo == "NUM":
tempEndVar.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo == "ADD"
or self.tokenAtual().tipo == "SUB"
or self.tokenAtual().tipo == "MULT"
or self.tokenAtual().tipo == "DIV"
):
tempEndVar.append(self.tokenAtual().lexema)
self.call_op_statement(tempEndVar)
return
else:
return
# <identifier>
if self.tokenAtual().tipo == "ID":
tempEndVar.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
# <call_op>
if (
self.tokenAtual().tipo == "ADD"
or self.tokenAtual().tipo == "SUB"
or self.tokenAtual().tipo == "MULT"
or self.tokenAtual().tipo == "DIV"
):
tempEndVar.append(self.tokenAtual().lexema)
self.call_op_statement(tempEndVar)
return
else:
return
else:
raise Exception(
"Erro sintatico: atribuição de variavel erroneamente na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# Variable reference OK
def call_var_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ATB": # assignment
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if (
(self.tokenAtual().tipo == "NUM")
or (self.tokenAtual().tipo == "BOOLEAN")
or (self.tokenAtual().tipo == "ID")
):
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
self.tabelaDeSimbolos.append(temp)
else:
raise Exception(
"Erro sintatico: falta do ponto e vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: variável não atribuída na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: símbolo de atribuição não encontrado na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <declaration_func> OK
def declaration_func_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "INT" or self.tokenAtual().tipo == "BOOL": # type
# Save the return type
temp.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
# identifier
if self.tokenAtual().tipo == "ID":
# Save the id
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
# (int a, bool b)
# [[params], [params]]
# [] -> list of what is inside the parameter parentheses
# [[scope, int, a], [scope, bool, var]]
tempParenteses = []
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo == "INT"
or self.tokenAtual().tipo == "BOOL"
):
tempParentesesParamAtual = []
# Param order: [[scope, type, id], [scope, type, id]]
# Keep the scope index correct
tempParentesesParamAtual.append(
self.indexEscopoAtual + 1)
# Save the current parameter's type
tempParentesesParamAtual.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID":
# Save the current parameter's id
tempParentesesParamAtual.append(
self.tokenAtual().lexema)
# [scope, int, a] -> before
tempParenteses.append(tempParentesesParamAtual)
# [[scope, int, a]] -> after
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "COMMA":
# [[scope, int, a]] -> before
tempParenteses.append(
self.params_statement(tempParenteses)
)
tempParenteses.pop() # Remove the trailing None
# [[scope, int, a], i1, i2, i3 ... in]
temp.append(tempParenteses)
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
# BEGIN three-address table
nomeDaFuncao = temp[4]
paramsDaFuncao = temp[5]
self.tabelaDeTresEnderecos.append(
('label', nomeDaFuncao, 'null'))
for param in paramsDaFuncao:
self.tabelaDeTresEnderecos.append(
('pop', param[2], 'null'))
if self.tokenAtual().tipo == "CLEFT":
# Store the scope before entering the function
self.indexEscopoAntesDaFuncao = (
self.indexEscopoAtual
)
self.indexEscopoAtual += 1
self.indexDaTabelaDeTokens += 1
tempBlock = []
# BLOCK
while self.tokenAtual().tipo != "RETURN":
tempBlock.append(
self.block_statement())
temp.append(tempBlock)
tempReturn = []
if self.tokenAtual().tipo == "RETURN":
tempReturn.append(
self.indexEscopoAtual)
tempReturn.append(
self.tokenAtual().tipo)
# RETURN
tempReturnParams = []
tempReturnParams = self.return_statement(
tempReturnParams
)
tempReturn.append(tempReturnParams)
temp.append(tempReturn)
if self.tokenAtual().tipo == "CRIGHT":
# Restore the scope from before entering the function
self.indexEscopoAtual = (
self.indexEscopoAntesDaFuncao
)
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo
== "SEMICOLON"
):
self.indexDaTabelaDeTokens += 1
# Add to the symbol table
self.tabelaDeSimbolos.append(
temp)
self.tabelaDeTresEnderecos.append(
('push', self.tempTresEnderecos, 'null'))
self.tabelaDeTresEnderecos.append(
('ret', 'null', 'null'))
else:
raise Exception(
"Erro sintatico: falta do ponto e vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave direita na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do retorno na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave esquerda na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
elif self.tokenAtual().tipo == "PRIGHT":
temp.append(tempParenteses)
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT":
self.indexEscopoAntesDaFuncao = (
self.indexEscopoAtual
)
self.indexEscopoAtual += 1
self.indexDaTabelaDeTokens += 1
tempBlock = []
# BLOCK
while self.tokenAtual().tipo != "RETURN":
tempBlock.append(
self.block_statement())
temp.append(tempBlock)
tempReturn = []
# RETURN
if self.tokenAtual().tipo == "RETURN":
tempReturn.append(
self.indexEscopoAtual)
tempReturn.append(
self.tokenAtual().tipo)
# RETURN
tempReturnParms = []
tempReturnParms = self.return_statement(
tempReturnParms
)
tempReturn.append(tempReturnParms)
temp.append(tempReturn)
if self.tokenAtual().tipo == "CRIGHT":
self.indexEscopoAtual = (
self.indexEscopoAntesDaFuncao
)
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo
== "SEMICOLON"
):
self.indexDaTabelaDeTokens += 1
# Add to the symbol table
self.tabelaDeSimbolos.append(
temp)
else:
raise Exception(
"Erro sintatico: falta do ponto e vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave direita na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do retorno na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave esquerda na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
# TODO: (5 - still to figure out) fix this exception
raise Exception(
"Erro sintatico: falta da virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta o ID na linha "
+ str(self.tokenAtual().linha)
)
else:
if self.tokenAtual().tipo == "PRIGHT":
temp.append(tempParenteses)
self.indexDaTabelaDeTokens += 1
# BEGIN three-address table
nomeDaFuncao = temp[4]
paramsDaFuncao = temp[5]
self.tabelaDeTresEnderecos.append(
('label', nomeDaFuncao, 'null'))
for param in paramsDaFuncao:
self.tabelaDeTresEnderecos.append(
('pop', param[2], 'null'))
self.tabelaDeTresEnderecos.append(
('push', self.tempTresEnderecos, 'null'))
self.tabelaDeTresEnderecos.append(
('ret', 'null', 'null'))
if self.tokenAtual().tipo == "CLEFT":
self.indexEscopoAntesDaFuncao = self.indexEscopoAtual
self.indexEscopoAtual += 1
self.indexDaTabelaDeTokens += 1
tempBlock = []
# BLOCK
while self.tokenAtual().tipo != "RETURN":
tempBlock.append(self.block_statement())
temp.append(tempBlock)
tempReturn = []
# RETURN
if self.tokenAtual().tipo == "RETURN":
tempReturn.append(self.indexEscopoAtual)
tempReturn.append(self.tokenAtual().tipo)
# RETURN
tempReturnParms = []
tempReturnParms = self.return_statement(
tempReturnParms
)
tempReturn.append(tempReturnParms)
temp.append(tempReturn)
if self.tokenAtual().tipo == "CRIGHT":
self.indexEscopoAtual = (
self.indexEscopoAntesDaFuncao
)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
# Add to the symbol table
self.tabelaDeSimbolos.append(temp)
else:
raise Exception(
"Erro sintatico: falta do ponto e vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave direita na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do retorno na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave esquerda na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese esquerdo na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do ID na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <return_statement> OK
def return_statement(self, tempReturnParams):
self.indexDaTabelaDeTokens += 1
# If it is a function call
if self.tokenAtual().tipo == "CALL":
tempReturnParams.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "FUNC":
tempReturnParams.append(self.tokenAtual().tipo)
self.call_func_statement(tempReturnParams)
self.indexDaTabelaDeTokens += 1
return tempReturnParams
else:
raise Exception(
"Erro sintatico: Erro de chamada, só é permitido chamada de funções na linha "
+ str(self.tokenAtual().linha)
)
# If it is a variable/num/bool
if (
(self.tokenAtual().tipo == "NUM")
or (self.tokenAtual().tipo == "BOOLEAN")
or (self.tokenAtual().tipo == "ID")
):
tempReturnParams.append(self.tokenAtual().lexema)
# THREE-ADDRESS CODE
self.tempTresEnderecos = tempReturnParams[0]
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
return tempReturnParams
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: Retorno errado na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <params> OK
def params_statement(self, tempParenteses):
# [[scope, int, a], ...]
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "INT" or self.tokenAtual().tipo == "BOOL":
tempParentesesParamAtual = []
tempParentesesParamAtual.append(self.indexEscopoAtual + 1)
tempParentesesParamAtual.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID":
tempParentesesParamAtual.append(self.tokenAtual().lexema)
tempParenteses.append(tempParentesesParamAtual)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "COMMA":
self.params_statement(tempParenteses)
elif (
self.tokenAtual().tipo == "INT" or self.tokenAtual().tipo == "BOOL"
):
raise Exception(
"Erro sintatico: falta vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
return tempParenteses
else:
raise Exception(
"Erro sintatico: é necessário informar alguma variável na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: é necessário informar um tipo na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <declaration_proc> OK
def declaration_proc_statement(self, temp):
self.indexDaTabelaDeTokens += 1
# identifier
if self.tokenAtual().tipo == "ID":
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
tempParenteses = []
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "INT" or self.tokenAtual().tipo == "BOOL":
tempParentesesParamAtual = []
tempParentesesParamAtual.append(self.indexEscopoAtual + 1)
tempParentesesParamAtual.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID":
tempParentesesParamAtual.append(
self.tokenAtual().lexema)
tempParenteses.append(tempParentesesParamAtual)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "COMMA":
tempParenteses.append(
self.params_statement(tempParenteses))
tempParenteses.pop()
temp.append(tempParenteses)
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT":
self.indexEscopoAntesDaFuncao = (
self.indexEscopoAtual
)
self.indexEscopoAtual += 1
self.indexDaTabelaDeTokens += 1
tempBlock = []
# BLOCK # TODO: verify
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "SEMICOLON"
):
tempBlock.append(
self.block_statement())
temp.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexEscopoAtual = (
self.indexEscopoAntesDaFuncao
)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do ponto e vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave direita na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave esquerda na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
elif self.tokenAtual().tipo == "PRIGHT":
temp.append(tempParenteses)
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT":
self.indexEscopoAntesDaFuncao = (
self.indexEscopoAtual
)
self.indexEscopoAtual += 1
self.indexDaTabelaDeTokens += 1
tempBlock = []
# BLOCK # TODO: verify
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "SEMICOLON"
):
tempBlock.append(
self.block_statement())
temp.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexEscopoAtual = (
self.indexEscopoAntesDaFuncao
)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do ponto e vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave direita na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave esquerda na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
# TODO: (5 - still to figure out) fix this exception
raise Exception(
"Erro sintatico: falta da virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta o ID na linha "
+ str(self.tokenAtual().linha)
)
else:
if self.tokenAtual().tipo == "PRIGHT":
temp.append(tempParenteses)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT":
self.indexEscopoAntesDaFuncao = self.indexEscopoAtual
self.indexEscopoAtual += 1
self.indexDaTabelaDeTokens += 1
tempBlock = []
# BLOCK # TODO: verify
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "SEMICOLON"
):
tempBlock.append(self.block_statement())
temp.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexEscopoAtual = self.indexEscopoAntesDaFuncao
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do ponto e vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave direita na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta da chave esquerda na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese esquerdo na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do ID na linha " +
str(self.tokenAtual().linha)
)
# SCOPE OK
# <call_proc>
def call_proc_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID":
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
self.indexDaTabelaDeTokens += 1
tempParams = []
if (
self.tokenAtual().tipo == "ID"
or self.tokenAtual().lexema == "True"
or self.tokenAtual().lexema == "False"
):
tempParams.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "COMMA":
tempParams.append(
self.params_call_statement(tempParams))
tempParams.pop()
temp.append(tempParams)
# [0, 'CALL', 'PROC', 'proc1', ['a', 'b', 'c']],
# [0, 'CALL', 'PROC', 'proc1', [['a'], ['b'], ['c']]],
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
temp.append(tempParams)
return temp
elif self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
temp.append(tempParams)
return temp
else:
raise Exception(
"Erro sintatico: falta da virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
temp.append(tempParams)
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese esquerdo na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do ID na linha " +
str(self.tokenAtual().linha)
)
# SCOPE OK
# <call_func>
def call_func_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID":
temp.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
self.indexDaTabelaDeTokens += 1
tempParams = []
if (
self.tokenAtual().tipo == "ID"
or self.tokenAtual().lexema == "True"
or self.tokenAtual().lexema == "False"
):
tempParams.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "COMMA":
tempParams.append(
self.params_call_statement(tempParams))
tempParams.pop()
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
temp.append(tempParams)
return temp
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
elif self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
temp.append(tempParams)
return temp
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
temp.append(tempParams)
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
return temp
else:
raise Exception(
"Erro sintatico: falta do parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do parentese esquerdo na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do ID na linha " +
str(self.tokenAtual().linha)
)
# SCOPE OK
# <params_call>
def params_call_statement(self, tempParams):
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo == "ID"
or self.tokenAtual().lexema == "True"
or self.tokenAtual().lexema == "False"
):
tempParams.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "COMMA":
self.params_call_statement(tempParams)
elif (
self.tokenAtual().tipo == "ID"
or self.tokenAtual().lexema == "True"
or self.tokenAtual().lexema == "False"
):
raise Exception(
"Erro sintatico: falta vírgula na linha "
+ str(self.tokenAtual().linha)
)
else:
# Removed as a fix
# self.indexDaTabelaDeTokens += 1
return tempParams
else:
raise Exception(
"Erro sintatico: é necessário informar alguma variável na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <print_statement> OK
def print_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
tempParams = []
temp.append(self.params_print_statement(tempParams))
# THREE-ADDRESS CODE
self.tabelaDeTresEnderecos.append(
('print', self.tempTresEnderecos, 'null'))
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.tabelaDeSimbolos.append(temp)
self.indexDaTabelaDeTokens += 1
return
else:
# TODO: (4) fix the line-counting bug
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do Parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do Parentese esquerdo na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <params_print_statement> OK
def params_print_statement(self, tempParams):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CALL":
tempParams.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "FUNC":
tempParams.append(self.tokenAtual().tipo)
tempParams = self.call_func_statement(tempParams)
return tempParams
elif self.tokenAtual().tipo == "PROC":
raise Exception(
"Erro sintatico: Procedimento não tem retorno na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: chamada incorreta de função na linha "
+ str(self.tokenAtual().linha)
)
elif (
(self.tokenAtual().tipo == "NUM")
or (self.tokenAtual().tipo == "BOOLEAN")
or (self.tokenAtual().tipo == "ID")
):
tempParams.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo == "ADD"
or self.tokenAtual().tipo == "SUB"
or self.tokenAtual().tipo == "MULT"
or self.tokenAtual().tipo == "DIV"
):
tempParams.append(self.tokenAtual().lexema)
self.call_op_statement(tempParams)
return tempParams
else:
return tempParams
else:
raise Exception(
"Erro sintatico: uso incorreto dos parametros na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <if_statement>
def if_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
self.indexDaTabelaDeTokens += 1
tempExpression = []
# Expression
tempExpression = self.expression_statement(tempExpression)
temp.append(tempExpression)
if self.tokenAtual().tipo == "PRIGHT":
olhaAfrente = self.tokenLookAhead()
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT" and olhaAfrente.tipo != "CRIGHT":
self.indexDaTabelaDeTokens += 1
self.indexEscopoAtual += 1
tempBlock = []
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "ENDIF"
):
tempBlock.append(self.block3_statement())
temp.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ENDIF":
temp.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
tempElse = []
if self.tokenAtual().tipo == "ELSE":
tempElse.append(self.indexEscopoAtual)
tempElse.append(self.tokenAtual().tipo)
tempElse = self.else_part_statement(
tempElse) # ELSE
temp.append(tempElse)
self.tabelaDeSimbolos.append(temp)
self.indexEscopoAtual -= 1
else:
temp.append(tempElse)
self.tabelaDeSimbolos.append(temp)
self.indexEscopoAtual -= 1
return
else:
raise Exception(
"Erro sintatico: falta de ENDIF na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CRIGHT na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CLEFT ou bloco vazio na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do Parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do Parentese esquerdo na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <else_part>
def else_part_statement(self, tempElse):
olhaAfrente = self.tokenLookAhead()
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT" and olhaAfrente.tipo != "CRIGHT":
self.indexDaTabelaDeTokens += 1
tempBlock = []
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "ENDELSE"
):
# Block
tempBlock.append(self.block3_statement())
tempElse.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ENDELSE":
tempElse.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
return tempElse
else:
raise Exception(
"Erro sintatico: falta de ENDELSE na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CRIGHT na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CLEFT ou bloco vazio na linha "
+ str(self.tokenAtual().linha)
)
# SCOPE OK
# <if_statement2>
# IF called only inside a while, since its body may contain BREAK and CONTINUE (block2)
def if_statement2(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
self.indexDaTabelaDeTokens += 1
tempExpression = []
# Expression
tempExpression = self.expression_statement(tempExpression)
temp.append(tempExpression)
if self.tokenAtual().tipo == "PRIGHT":
olhaAfrente = self.tokenLookAhead()
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT" and olhaAfrente.tipo != "CRIGHT":
self.indexDaTabelaDeTokens += 1
self.indexEscopoAtual += 1
tempBlock = []
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "ENDIF"
):
tempBlock.append(self.block2_statement())
temp.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ENDIF":
temp.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
tempElse = []
if self.tokenAtual().tipo == "ELSE":
tempElse.append(self.indexEscopoAtual)
tempElse.append(self.tokenAtual().tipo)
tempElse = self.else_part_statement2(
tempElse) # ELSE
temp.append(tempElse)
self.tabelaDeSimbolos.append(temp)
self.indexEscopoAtual -= 1
else:
temp.append(tempElse)
self.tabelaDeSimbolos.append(temp)
self.indexEscopoAtual -= 1
return
else:
raise Exception(
"Erro sintatico: falta de ENDIF na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CRIGHT na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CLEFT ou Bloco vazio na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do Parentese direito na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do Parentese esquerdo na linha "
+ str(self.tokenAtual().linha)
)
# ESCOPO OK
# ELSE chamado somente dentro do while, pois dentro dele pode ter BREAK E CONTINUE (block2)
# <else_part2>
def else_part_statement2(self, tempElse):
olhaAfrente = self.tokenLookAhead()
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT" and olhaAfrente.tipo != "CRIGHT":
self.indexDaTabelaDeTokens += 1
# Block
tempBlock = []
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "ENDELSE"
):
tempBlock.append(self.block2_statement())
tempElse.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ENDELSE":
tempElse.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
return tempElse
else:
raise Exception(
"Erro sintatico: falta de ENDELSE na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CRIGHT na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CLEFT ou bloco vazio na linha "
+ str(self.tokenAtual().linha)
)
# ESCOPO OK
# <while_statement>
def while_statement(self, temp):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "PLEFT":
self.indexDaTabelaDeTokens += 1
tempExpression = []
# Expression
tempExpression = self.expression_statement(tempExpression)
temp.append(tempExpression)
if self.tokenAtual().tipo == "PRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "CLEFT":
self.indexDaTabelaDeTokens += 1
self.indexEscopoAtual += 1
tempBlock = []
# BLOCK
while (
self.tokenAtual().tipo != "CRIGHT"
and self.tokenLookAhead().tipo != "ENDWHILE"
):
tempBlock.append(self.block2_statement())
temp.append(tempBlock)
if self.tokenAtual().tipo == "CRIGHT":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ENDWHILE":
temp.append(self.tokenAtual().tipo)
self.indexDaTabelaDeTokens += 1
self.tabelaDeSimbolos.append(temp)
self.indexEscopoAtual -= 1
else:
raise Exception(
"Erro sintatico: falta de ENDWHILE na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CRIGHT na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do CLEFT na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do PRIGHT na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do PLEFT na linha "
+ str(self.tokenAtual().linha)
)
# ESCOPO OK
# <unconditional_branch>
def unconditional_branch_statement(self):
if self.tokenAtual().tipo == "CONTINUE":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
elif self.tokenAtual().tipo == "BREAK":
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "SEMICOLON":
self.indexDaTabelaDeTokens += 1
else:
raise Exception(
"Erro sintatico: falta do ponto e virgula na linha "
+ str(self.tokenAtual().linha)
)
# ESCOPO OK
# <expression>
def expression_statement(self, tempExpression):
if self.tokenAtual().tipo == "ID" or self.tokenAtual().tipo == "NUM":
tempExpression.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo == "EQUAL"
or self.tokenAtual().tipo == "DIFF"
or self.tokenAtual().tipo == "LESSEQUAL"
or self.tokenAtual().tipo == "LESS"
or self.tokenAtual().tipo == "GREATEREQUAL"
or self.tokenAtual().tipo == "GREATER"
):
tempExpression.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID" or self.tokenAtual().tipo == "NUM":
tempExpression.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
return tempExpression
else:
raise Exception(
"Erro sintatico: falta do ID na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do operador booleano na linha "
+ str(self.tokenAtual().linha)
)
else:
raise Exception(
"Erro sintatico: falta do ID na linha " +
str(self.tokenAtual().linha)
)
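The `<expression>` rule above accepts exactly one operand (ID or NUM), one relational operator, and a second operand. A minimal standalone sketch of that shape check — `expressao_valida` and the token-list format are assumptions for illustration, not part of the parser:

```python
REL_OPS = {"==", "!=", "<=", "<", ">=", ">"}


def expressao_valida(tokens):
    # tokens: lista de lexemas já extraídos; a regra <expression> exige
    # exatamente: operando (ID ou NUM), operador relacional, operando.
    return (
        len(tokens) == 3
        and tokens[1] in REL_OPS
        and all(t.isidentifier() or t.isnumeric() for t in (tokens[0], tokens[2]))
    )
```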
# ESCOPO OK
# <call_op> ok - Operações aritméticas
def call_op_statement(self, tempEndVar):
self.indexDaTabelaDeTokens += 1
if self.tokenAtual().tipo == "ID" or self.tokenAtual().tipo == "NUM":
tempEndVar.append(self.tokenAtual().lexema)
self.indexDaTabelaDeTokens += 1
if (
self.tokenAtual().tipo == "ADD"
or self.tokenAtual().tipo == "SUB"
or self.tokenAtual().tipo == "MULT"
or self.tokenAtual().tipo == "DIV"
):
tempEndVar.append(self.tokenAtual().lexema)
self.call_op_statement(tempEndVar)
# TRES END - OP
expressaoTratada = arvoreExpressao(tempEndVar)
var = expressaoTresEnderecos(expressaoTratada)
self.tabelaDeTresEnderecos.extend(var)
# [ 2 * [1 + 1]]
# ('Mov', temp, 1)
# ('Add', temp, 1)
# ('Mult', temp, 2)
else:
return
else:
raise Exception(
"Erro sintatico: falta do ID na linha " +
str(self.tokenAtual().linha)
)
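The comments at the end of `call_op_statement` sketch the output of `expressaoTresEnderecos` as triples like `('Mov', temp, 1)`, `('Add', temp, 1)`, `('Mult', temp, 2)`. A hedged sketch of that folding — `gera_tres_enderecos` and the flat token-list format are assumptions, not the real `arvoreExpressao`/`expressaoTresEnderecos` pair:

```python
def gera_tres_enderecos(tokens, temp="t0"):
    # tokens: lista achatada como ['2', '*', '1', '+', '1'].
    # Dobra da direita para a esquerda, como nos comentários acima:
    # o operando mais à direita inicializa o temporário ('Mov') e cada
    # operador anterior vira uma instrução sobre o mesmo temporário.
    ops = {"+": "Add", "-": "Sub", "*": "Mult", "/": "Div"}
    codigo = [("Mov", temp, tokens[-1])]
    for i in range(len(tokens) - 2, 0, -2):
        codigo.append((ops[tokens[i]], temp, tokens[i - 1]))
    return codigo
```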
"""
Análise Semântica
"""
# Não finalizado
# Checa semantica, se tiver tudo OK return True
def checkSemantica(self):
for k in range(len(self.tabelaDeSimbolos)):
simbolo = self.tabelaDeSimbolos[k][2]
if simbolo == "FUNC":
self.declaration_func_semantico(self.tabelaDeSimbolos[k])
if simbolo == "PROC":
self.declaration_proc_semantico(self.tabelaDeSimbolos[k])
if simbolo == "CALL":
if self.tabelaDeSimbolos[k][3] == "FUNC":
self.call_func_semantico(
self.tabelaDeSimbolos[k],
4,
self.tabelaDeSimbolos[k][0],
5,
self.tabelaDeSimbolos[k][1],
)
if self.tabelaDeSimbolos[k][3] == "PROC":
self.call_proc_semantico(
self.tabelaDeSimbolos[k], 5, self.tabelaDeSimbolos[k][1]
)
# Se for declaração de variável
if simbolo == "INT" or simbolo == "BOOL":
# print("Análise da declaração", k + 1, " -> ", self.tabelaDeSimbolos[k])
self.declaration_var_semantico(self.tabelaDeSimbolos[k])
if simbolo == "IF":
# print("Análise da declaração", k + 1, " -> ", self.tabelaDeSimbolos[k])
self.expression_semantico(self.tabelaDeSimbolos[k])
if simbolo == "WHILE":
# print("Análise da declaração", k + 1, " -> ", self.tabelaDeSimbolos[k])
self.expression_semantico(self.tabelaDeSimbolos[k])
# Se for chamada/atribuição de variável
if simbolo == "ID":
# print("Análise da declaração", k + 1, " -> ", self.tabelaDeSimbolos[k])
self.call_var_semantico(self.tabelaDeSimbolos[k])
# Outras condições
print("FIM DA ANÁLISE SEMÂNTICA - DEU CERTO :)\n")
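`checkSemantica` dispatches on the tag stored in position 2 of each symbol-table row through a chain of `if`s. The same dispatch could be expressed with a lookup table — `despacha_semantica` and the `handlers` dict are hypothetical names, not part of the analyzer:

```python
def despacha_semantica(entrada, handlers):
    # entrada: uma linha da tabela de símbolos, onde entrada[2] é a tag
    # ("FUNC", "PROC", "INT", "BOOL", "IF", "WHILE", "ID", ...).
    # handlers: dict tag -> função de verificação, no lugar da cadeia de ifs.
    handler = handlers.get(entrada[2])
    return handler(entrada) if handler is not None else None
```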
def buscarNaTabelaDeSimbolos(self, simbolo, indice):
for k in range(len(self.tabelaDeSimbolos)):
if self.tabelaDeSimbolos[k][indice] == simbolo:
return self.tabelaDeSimbolos[k]
return None  # explícito: símbolo não encontrado
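Throughout the semantic checks the visibility test is the repeated pair "escopo/linha menor ou igual" (`[0]` = scope, `[1]` = line). A hypothetical helper factoring that test out — the name `visivel` is an assumption:

```python
def visivel(declaracao, uso):
    # declaracao e uso: linhas da tabela de símbolos, com
    # [0] = índice de escopo e [1] = número da linha.
    # Reproduz o teste repetido "escopo/linha menor ou igual".
    return declaracao[0] <= uso[0] and declaracao[1] <= uso[1]
```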
# TODO: Não finalizado (faltam expressões e funções)
def declaration_var_semantico(self, tabelaNoIndiceAtual):
# Se var for int
if tabelaNoIndiceAtual[2] == "INT":
simbolo = tabelaNoIndiceAtual[5][0]
# Exemplo: Caso se 'int b = 1';
# <num>
if simbolo.isnumeric():
return True
# TODO: Caso se 'int b = call func a();' se o return de func for int
# <call_func>
if simbolo == "CALL":
if tabelaNoIndiceAtual[5][1] == "FUNC":
for k in range(len(self.tabelaDeSimbolos)):
# Procura na tabela de simbolos alguma declaração de Função
if self.tabelaDeSimbolos[k][2] == "FUNC":
# Vê se alguma função declarada tem o mesmo nome da função da variável
if self.tabelaDeSimbolos[k][4] == tabelaNoIndiceAtual[5][2]:
# Conferir se a função está declarada no escopo/linha menor ou igual
if (
self.tabelaDeSimbolos[k][0]
<= tabelaNoIndiceAtual[0]
) and (
self.tabelaDeSimbolos[k][1]
<= tabelaNoIndiceAtual[1]
):
# Verificar a quantidade de parametros da função declarada com a função passada
if len(self.tabelaDeSimbolos[k][5]) == len(
tabelaNoIndiceAtual[5][3]
):
# TODO: Verificar se as variáveis passadas na chamada, já foram declaradas
for n in range(len(tabelaNoIndiceAtual[5][3])):
# Procura se há alguma variável declarada na tabela com o nome da var passada na chamada
varDeclaradaNaTabela = self.buscarNaTabelaDeSimbolos(
tabelaNoIndiceAtual[5][3][n], 3)
if varDeclaradaNaTabela is not None:
# Confere se a variável está declarada em escopo/linha menor ou igual
if (varDeclaradaNaTabela[0] <= tabelaNoIndiceAtual[0]
) and (varDeclaradaNaTabela[1] <= tabelaNoIndiceAtual[1]):
# Verifica se as variáveis passadas nos parâmetros têm o tipo certo
if varDeclaradaNaTabela[2] == self.tabelaDeSimbolos[k][5][n][1]:
# Verifica qual o tipo de retorno da função declarada
if self.tabelaDeSimbolos[k][3] == "INT":
return True
else:
raise Exception(
"Erro Semântico: variável int não pode receber função com retorno não inteiro na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: tipos de variáveis incompatíveis nos parâmetros na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variável não declarada nos parametros na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variável não declarada nos parametros na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: quantidade de parametros inválida na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: função não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: função não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variável não pode receber procedimento na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# TODO: Fazer semantico caso 'int e = a + d;' 'int f = 1 + 2;' (Expressão aritmética)
# <call_op>
# Caso 'int b = a'; se 'int a' for declarado já
# <identifier>
if simbolo.isalpha() and simbolo != 'True' and simbolo != 'False':
# Buscar se o 'a' foi declarado
varDeclarada = self.buscarNaTabelaDeSimbolos(
tabelaNoIndiceAtual[5][0], 3
)
# Se foi declarada, varDeclarada não é None
if varDeclarada is not None:
# Verifica se 'a' foi declarada em um escopo visivel e linhas anteriores
if (
varDeclarada[0] <= tabelaNoIndiceAtual[0]
and varDeclarada[1] <= tabelaNoIndiceAtual[1]
):
# Verificar se 'a' é int
if varDeclarada[2] == "INT":
return True
# Se não, 'int b', não pode receber 'a'
else:
raise Exception(
"Erro Semântico: variável do tipo int não pode receber variável de outro tipo na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Se não está em um escopo visivel, é considerada como não declarada
else:
raise Exception(
"Erro Semântico: variavel não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Se varDeclarada == None, então 'a' nunca foi declarada
else:
raise Exception(
"Erro Semântico: variavel não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: valor inválido atribuído a variável do tipo int na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Se var for bool
if tabelaNoIndiceAtual[2] == "BOOL":
# Exemplo: Caso se 'int b = True';
# <boolean>
simbolo = tabelaNoIndiceAtual[5][0]
if simbolo == "True" or simbolo == "False":
return True
if simbolo.isnumeric():
raise Exception(
"Erro Semântico: variável do tipo boolean não pode receber valor inteiro na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# TODO: Fazer semantico caso 'int e = a + d;' 'int f = 1 + 2;' (Expressão aritmética)
if simbolo == "CALL":
if tabelaNoIndiceAtual[5][1] == "FUNC":
for k in range(len(self.tabelaDeSimbolos)):
# Procura na tabela de simbolos alguma declaração de Função
if self.tabelaDeSimbolos[k][2] == "FUNC":
# Vê se alguma função declarada tem o mesmo nome da função da variável
if self.tabelaDeSimbolos[k][4] == tabelaNoIndiceAtual[5][2]:
# Conferir se a função está declarada no escopo/linha menor ou igual
if (
self.tabelaDeSimbolos[k][0]
<= tabelaNoIndiceAtual[0]
) and (
self.tabelaDeSimbolos[k][1]
<= tabelaNoIndiceAtual[1]
):
# Verificar a quantidade de parametros da função declarada com a função passada
if len(self.tabelaDeSimbolos[k][5]) == len(
tabelaNoIndiceAtual[5][3]
):
# TODO: Verificar se as variáveis passadas na chamada, já foram declaradas
for n in range(len(tabelaNoIndiceAtual[5][3])):
# Procura se há alguma variável declarada na tabela com o nome da var passada na chamada
varDeclaradaNaTabela = self.buscarNaTabelaDeSimbolos(
tabelaNoIndiceAtual[5][3][n], 3)
if varDeclaradaNaTabela is not None:
# Confere se a variável está declarada em escopo/linha menor ou igual
if (varDeclaradaNaTabela[0] <= tabelaNoIndiceAtual[0]
) and (varDeclaradaNaTabela[1] <= tabelaNoIndiceAtual[1]):
# Verifica se as variáveis passadas nos parâmetros têm o tipo certo
if varDeclaradaNaTabela[2] == self.tabelaDeSimbolos[k][5][n][1]:
# Verifica qual o tipo de retorno da função declarada
if self.tabelaDeSimbolos[k][3] == "BOOL":
return True
else:
raise Exception(
"Erro Semântico: variável boolean não pode receber função com retorno não booleano na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: tipos de variáveis incompatíveis nos parâmetros na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variável não declarada nos parametros na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variável não declarada nos parametros na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: quantidade de parametros inválida na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: função não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: função não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variável não pode receber procedimento na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Caso 'bool b = a'; se 'bool a' for declarado já
# <identifier>
if simbolo.isalpha() and simbolo != 'True' and simbolo != 'False':
# Buscar se o 'a' foi declarado
varDeclarada = self.buscarNaTabelaDeSimbolos(
tabelaNoIndiceAtual[5][0], 3
)
# Se foi declarada, varDeclarada não é None
if varDeclarada is not None:
# Verifica se 'a' foi declarada em um escopo visivel e linhas anteriores
if (
varDeclarada[0] <= tabelaNoIndiceAtual[0]
and varDeclarada[1] <= tabelaNoIndiceAtual[1]
):
# Verificar se 'a' é bool
if varDeclarada[2] == "BOOL":
if (varDeclarada[5][0] == 'True' or varDeclarada[5][0] == 'False'):
return True
else:
raise Exception(
"Erro Semântico: variável do tipo boolean não pode receber valor não booleano na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Se não, 'bool b', não pode receber 'a'
else:
raise Exception(
"Erro Semântico: variável do tipo boolean não pode receber variável de outro tipo na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Se não está em um escopo visivel, é considerada como não declarada
else:
raise Exception(
"Erro Semântico: variavel não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Se varDeclarada == None, então 'a' nunca foi declarada
else:
raise Exception(
"Erro Semântico: variavel não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: valor inválido atribuído a variável do tipo boolean na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# TODO: Não finalizado (faltam expressões e funções)
# TODO: Resolver problema de escopo antigo sendo visivel
def call_var_semantico(self, simbolo):
flag = False
for k in range(len(self.tabelaDeSimbolos)):
if (
self.tabelaDeSimbolos[k][2] == "INT"
or self.tabelaDeSimbolos[k][2] == "BOOL"
):
# Verificando se há duas variáveis com o mesmo nome
if self.tabelaDeSimbolos[k][3] == simbolo[3]:
# Se houver, verifica se a variavel está visivel no
# escopo da qual foi chamada
if self.tabelaDeSimbolos[k][0] <= simbolo[0]:
if self.tabelaDeSimbolos[k][1] <= simbolo[1]:
flag = True # Flag para verificar se a chamada tá ok
# Chamada de método para verificar o tipo da variavel
# que está sendo atribuída
self.verificarTipoCallVar(
self.tabelaDeSimbolos[k], simbolo)
break
# Buscar em parametros de PROC
elif self.buscarParamsProc(simbolo):
flag = True
break
# Buscar em parametros de FUNC
elif self.buscarParamsFunc(simbolo, 3):
flag = True
break
# Se der errado a declaração:
if not flag:
raise Exception(
"Erro Semântico: variável não declarada na linha: " +
str(simbolo[1])
)
def buscarParamsProc(self, simbolo):
paramsProc = self.buscarNaTabelaDeSimbolos("PROC", 2)
if paramsProc is not None:
paramsProc = paramsProc[4]
for k in range(len(paramsProc)):
if simbolo[3] == paramsProc[k][2]:
if paramsProc[k][1] == "INT":
if simbolo[5].isnumeric():
return True
if not simbolo[5].isnumeric():
raise Exception(
"Erro Semântico: variável do tipo int não pode receber valor não inteiro na linha: "
+ str(simbolo[1])
)
if paramsProc[k][1] == "BOOL":
# TODO: verificar posteriormente
if simbolo[5] == "True" or simbolo[5] == "False":
return True
else:
raise Exception(
"Erro Semântico: variável do tipo booleano não pode receber valor não booleano na linha: "
+ str(simbolo[1])
)
break
else:
return False
def buscarParamsFunc(self, simbolo, n):
paramsFunc = self.buscarNaTabelaDeSimbolos("FUNC", 2)
if paramsFunc is not None:
paramsFunc = paramsFunc[5]
for k in range(len(paramsFunc)):
if simbolo[n] == paramsFunc[k][2]:
if paramsFunc[k][1] == "INT":
if simbolo[5].isnumeric():
return True
if not simbolo[5].isnumeric():
raise Exception(
"Erro Semântico: variável do tipo int não pode receber valor não inteiro na linha: "
+ str(simbolo[1])
)
if paramsFunc[k][1] == "BOOL":
# TODO: verificar posteriormente
if simbolo[5] == "True" or simbolo[5] == "False":
return True
else:
raise Exception(
"Erro Semântico: variável do tipo booleano não pode receber valor não booleano na linha: "
+ str(simbolo[1])
)
break
else:
return False
# TODO: Faltam expressões e funções
def verificarTipoCallVar(self, simboloDeclaradoNaTabela, simbolo):
if simboloDeclaradoNaTabela[2] == "INT":
if not simbolo[5].isnumeric():
raise Exception(
"Erro Semântico: variável do tipo int não pode receber valor não inteiro na linha: "
+ str(simbolo[1])
)
if simboloDeclaradoNaTabela[2] == "BOOL":
if simbolo[5] == "True" or simbolo[5] == "False":
return True
else:
raise Exception(
"Erro Semântico: variável do tipo booleano não pode receber valor não booleano na linha: "
+ str(simbolo[1])
)
# TODO: Faltam variaveis e funções
def declaration_func_semantico(self, tabelaNoIndiceAtual):
# print(tabelaNoIndiceAtual)
if tabelaNoIndiceAtual[3] == "INT":
if not tabelaNoIndiceAtual[7][2][0].isnumeric():
raise Exception(
"Erro Semântico: O retorno espera um inteiro na linha: "
+ str(tabelaNoIndiceAtual[1])
)
if tabelaNoIndiceAtual[3] == "BOOL":
if (
tabelaNoIndiceAtual[7][2][0] == "True"
or tabelaNoIndiceAtual[7][2][0] == "False"
) is False:
raise Exception(
"Erro Semântico: O retorno espera um boolean na linha: "
+ str(tabelaNoIndiceAtual[1])
)
def call_func_semantico(self, tabelaNoIndiceAtual, n, escopo, m, linha):
# print(tabelaNoIndiceAtual)
flag = False
for k in range(len(self.tabelaDeSimbolos)):
if self.tabelaDeSimbolos[k][2] == "FUNC":
if self.tabelaDeSimbolos[k][4] == tabelaNoIndiceAtual[n]:
if self.tabelaDeSimbolos[k][0] <= escopo:
flag = True
self.verificarParams(
self.tabelaDeSimbolos[k],
tabelaNoIndiceAtual,
5,
"FUNC",
m,
linha,
escopo,
)
return True
# Se der errado a declaração:
if not flag:
raise Exception(
"Erro Semântico: função não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
def verificarParams(
self, simboloDeclaradoNaTabela, simbolo, n, tipo, m, linha, escopo
):
# PASSO A PASSO:
# 1º -> Verificar quantidade de parametros de acordo com a declaração
# 2º -> Se for > 0
# Devemos percorrer cada variavel dos parametros, então verificar em cada um o seguinte:
# 1º -> Verificar se já foi declarada no escopo visível ok
# 2º -> Verificar se o tipo na chamada é o mesmo da declaração ok
# 3º -> Se for sem params, prosseguir
flag = 0
# Verifica se a quantidade de parametros da chamada corresponde com a declaração
if len(simboloDeclaradoNaTabela[n]) == len(simbolo[m]):
# Se a lista de parâmetros não for vazia:
if len(simbolo[m]) > 0:
# P/ cada parâmetro
for k in range(len(simbolo[m])):
# Leitura da declaração do parametro atual
for i in range(len(self.tabelaDeSimbolos)):
# Busca na tabela de simbolos a variavel passada na chamada da função
if self.tabelaDeSimbolos[i][3] == simbolo[m][k]:
# Verifica se foi declarado em escopo/linhas anteriores
if (self.tabelaDeSimbolos[i][0] <= escopo) and (
self.tabelaDeSimbolos[i][1] <= linha
):
# Só incrementa quando acha declaração de variável
if (
self.tabelaDeSimbolos[i][2] == "INT"
or self.tabelaDeSimbolos[i][2] == "BOOL"
):
flag += 1
self.comparaTipoChamadaComDeclaracao(
self.tabelaDeSimbolos[i], simbolo, tipo, n
)
break
# Se não tiver params
else:
return True
else:
raise Exception(
"Erro Semântico: quantidade de parâmetros inválida na linha: "
+ str(linha)
)
if flag != len(simboloDeclaradoNaTabela[n]):
raise Exception(
"Erro Semântico: variável do parâmetro não declarada na linha: "
+ str(linha)
)
else:
return True
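The "PASSO A PASSO" comment in `verificarParams` describes three steps: match the parameter count, then for each argument check declaration/visibility and type. A condensed sketch of steps 1 and the type comparison — `confere_parametros` and its list formats are assumptions for illustration, not the analyzer's real tables:

```python
def confere_parametros(declarados, passados, linha):
    # declarados: parâmetros da declaração, ex.: [("p", "INT", "a"), ...]
    # passados: tipos já resolvidos dos argumentos da chamada, ex.: ["INT"]
    # (a resolução de visibilidade/escopo fica fora deste esboço).
    if len(declarados) != len(passados):
        raise Exception(
            "Erro Semântico: quantidade de parâmetros inválida na linha: " + str(linha)
        )
    for (_, tipo_decl, _), tipo_chamada in zip(declarados, passados):
        if tipo_decl != tipo_chamada:
            raise Exception(
                "Erro Semântico: tipo do parâmetro inválido na linha: " + str(linha)
            )
    return True
```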
def comparaTipoChamadaComDeclaracao(
self, declaracaoVarNaTabela, callFuncTabela, tipo, n
):
declaracaoFuncNaTabela = self.buscarNaTabelaDeSimbolos(tipo, 2)
flag = False
for k in range(len(declaracaoFuncNaTabela[n])):
if declaracaoFuncNaTabela[n][k][1] == declaracaoVarNaTabela[2]:
flag = True
break
# Caso ele encontre um ID ao inves da declaração direta,
# deve buscar pra saber se o tipo corresponde
elif declaracaoVarNaTabela[2] == "ID":
tipoDeclaracaoDoID = self.buscarNaTabelaDeSimbolos("ID", 2)
varDeclarada = self.buscarNaTabelaDeSimbolos(
tipoDeclaracaoDoID[3], 3)
if declaracaoFuncNaTabela[n][k][1] == varDeclarada[2]:
flag = True
break
if not flag:
raise Exception(
"Erro Semântico: tipo do parâmetro inválido na linha: "
+ str(callFuncTabela[1])
)
def declaration_proc_semantico(self, tabelaNoIndiceAtual):
# Analisar se variaveis e funções usados dentro do procedimento são passados no parametro ou se são declarados antes
# print(tabelaNoIndiceAtual)
# Quebrando no BOOL quando atualiza a variável com outro valor
flag = False
cont = 0
for k in range(len(self.tabelaDeSimbolos)):
# Percorre lista de Block do PROC
for i in range(len(tabelaNoIndiceAtual[5])):
# Pega as variaveis declaradas da tabela de simbolo
if (
self.tabelaDeSimbolos[k][2] == "BOOL"
or self.tabelaDeSimbolos[k][2] == "INT"
):
if tabelaNoIndiceAtual[5][i] == self.tabelaDeSimbolos[k][3]:
# Verificar se a variável encontrada está no escopo/linha menor ou igual
if (
self.tabelaDeSimbolos[k][0] <= tabelaNoIndiceAtual[0]
and self.tabelaDeSimbolos[k][1] <= tabelaNoIndiceAtual[1]
):
# Chamada de método para verificar o tipo da variavel
# que está sendo atribuída
if self.tabelaDeSimbolos[k][2] == "INT":
if not tabelaNoIndiceAtual[5][i][5].isnumeric():
raise Exception(
"Erro Semântico: variável do tipo int não pode receber valor não inteiro na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
cont += 1
flag = True
break
elif self.tabelaDeSimbolos[k][2] == "BOOL":
if (
tabelaNoIndiceAtual[5][i][5] == "True"
or tabelaNoIndiceAtual[5][i][5] == "False"
):
cont += 1
flag = True
break
else:
raise Exception(
"Erro Semântico: variável do tipo booleano não pode receber valor não booleano na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
for m in range(len(tabelaNoIndiceAtual[5])):
for n in range(len(tabelaNoIndiceAtual[4])):
if (
tabelaNoIndiceAtual[5][m][3]
== tabelaNoIndiceAtual[4][n][2]
):
if tabelaNoIndiceAtual[4][n][1] == "INT":
if not tabelaNoIndiceAtual[5][m][5].isnumeric():
raise Exception(
"Erro Semântico: variável do tipo int não pode receber valor não inteiro na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
cont += 1
flag = True
break
if tabelaNoIndiceAtual[4][n][1] == "BOOL":
if (
tabelaNoIndiceAtual[5][i][5] == "True"
or tabelaNoIndiceAtual[5][i][5] == "False"
):
cont += 1
flag = True
break
else:
raise Exception(
"Erro Semântico: variável do tipo booleano não pode receber valor não booleano na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
for m in range(len(tabelaNoIndiceAtual[5])):
for n in range(len(tabelaNoIndiceAtual[4])):
if (
tabelaNoIndiceAtual[5][m][3]
== tabelaNoIndiceAtual[4][n][2]
):
if tabelaNoIndiceAtual[4][n][1] == "INT":
if not tabelaNoIndiceAtual[5][m][5].isnumeric():
raise Exception(
"Erro Semântico: variável do tipo int não pode receber valor não inteiro na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
cont += 1
flag = True
break
if tabelaNoIndiceAtual[4][n][1] == "BOOL":
if (
tabelaNoIndiceAtual[5][i][5] == "True"
or tabelaNoIndiceAtual[5][i][5] == "False"
):
cont += 1
flag = True
break
else:
raise Exception(
"Erro Semântico: variável do tipo booleano não pode receber valor não booleano na linha: "
+ str(tabelaNoIndiceAtual[1])
)
# Se der errado a declaração:
if not flag and cont != len(tabelaNoIndiceAtual[4]):
raise Exception(
"Erro Semântico: variável não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
def call_proc_semantico(self, tabelaNoIndiceAtual, m, linha):
# Analisar se o procedimento chamado já foi declarado ok
# Analisar se os parâmetros da chamada foram declarados antes ok
# Analisar se o tipo dos parâmetros da chamada são os mesmos da declaração ok
# Analisar se a quantidade dos parâmetros da chamada é a mesma da declaração ok
# print(tabelaNoIndiceAtual)
flag = False
for k in range(len(self.tabelaDeSimbolos)):
if self.tabelaDeSimbolos[k][2] == "PROC":
if self.tabelaDeSimbolos[k][3] == tabelaNoIndiceAtual[4]:
if self.tabelaDeSimbolos[k][0] <= tabelaNoIndiceAtual[0]:
flag = True
self.verificarParams(
self.tabelaDeSimbolos[k],
tabelaNoIndiceAtual,
4,
"PROC",
m,
linha,
tabelaNoIndiceAtual[0],
)
break
# Se der errado a declaração:
if not flag:
raise Exception(
"Erro Semântico: procedimento não declarado na linha: "
+ str(tabelaNoIndiceAtual[1])
)
def expression_semantico(self, tabelaNoIndiceAtual):
buscaParam1 = self.buscarNaTabelaDeSimbolos(
tabelaNoIndiceAtual[3][0], 3)
buscaParam2 = self.buscarNaTabelaDeSimbolos(
tabelaNoIndiceAtual[3][2], 3)
if (tabelaNoIndiceAtual[3][0]).isnumeric() and (
tabelaNoIndiceAtual[3][2]
).isnumeric():
return True
elif (
tabelaNoIndiceAtual[3][0].isalpha(
) and tabelaNoIndiceAtual[3][2].isalpha()
):
if buscaParam1 is not None and buscaParam2 is not None:
if buscaParam1[2] == "INT" and buscaParam2[2] != "INT":
raise Exception(
"Erro Semântico: Não é possível comparar dois tipos diferentes na linha: "
+ str(tabelaNoIndiceAtual[1])
)
if buscaParam2[2] == "INT" and buscaParam1[2] != "INT":
raise Exception(
"Erro Semântico: Não é possível comparar dois tipos diferentes na linha: "
+ str(tabelaNoIndiceAtual[1])
)
if buscaParam2[2] == "INT" and buscaParam1[2] == "INT":
if (buscaParam1[0] <= tabelaNoIndiceAtual[0]) and (
buscaParam2[0] <= tabelaNoIndiceAtual[0]
):
return True
else:
raise Exception(
"Erro Semântico: Variável não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
if buscaParam2[2] == "BOOL" and buscaParam1[2] == "BOOL":
if (buscaParam1[0] <= tabelaNoIndiceAtual[0]) and (
buscaParam2[0] <= tabelaNoIndiceAtual[0]
):
if (
tabelaNoIndiceAtual[3][1] == "=="
or tabelaNoIndiceAtual[3][1] == "!="
):
return True
else:
raise Exception(
"Erro Semântico: Não é possível fazer este tipo de comparação com Boolean na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: Variável não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
if buscaParam2[2] == "INT" and buscaParam1[2] != "BOOL":
raise Exception(
"Erro Semântico: Não é possível comparar dois tipos diferentes na linha: "
+ str(tabelaNoIndiceAtual[1])
)
if buscaParam2[2] == "BOOL" and buscaParam1[2] != "INT":
raise Exception(
"Erro Semântico: Não é possível comparar dois tipos diferentes na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variavel não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
elif (
tabelaNoIndiceAtual[3][0].isalpha()
and (tabelaNoIndiceAtual[3][2]).isnumeric()
):
if buscaParam1 is not None:
if buscaParam1[2] != "INT":
raise Exception(
"Erro Semântico: Não é possível comparar dois tipos diferentes na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
if buscaParam1[0] <= tabelaNoIndiceAtual[0]:
return True
else:
raise Exception(
"Erro Semântico: Variável não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variavel não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
elif (tabelaNoIndiceAtual[3][0]).isnumeric() and tabelaNoIndiceAtual[3][
2
].isalpha():
if buscaParam2 is not None:
if buscaParam2[2] != "INT":
raise Exception(
"Erro Semântico: Não é possível comparar dois tipos diferentes na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
if buscaParam2[0] <= tabelaNoIndiceAtual[0]:
return True
else:
raise Exception(
"Erro Semântico: Variável não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: variavel não declarada na linha: "
+ str(tabelaNoIndiceAtual[1])
)
else:
raise Exception(
"Erro Semântico: parametros inválidos na linha: "
+ str(tabelaNoIndiceAtual[1])
)
| 45.929798 | 135 | 0.416823 | 8,648 | 120,382 | 5.778793 | 0.048913 | 0.11962 | 0.091126 | 0.058829 | 0.833637 | 0.806343 | 0.789575 | 0.768504 | 0.744872 | 0.733747 | 0 | 0.009608 | 0.50633 | 120,382 | 2,620 | 136 | 45.947328 | 0.831311 | 0.073283 | 0 | 0.773594 | 0 | 0 | 0.09583 | 0 | 0 | 0 | 0 | 0.000763 | 0 | 1 | 0.019991 | false | 0 | 0.001395 | 0.000465 | 0.062297 | 0.005579 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
43b5a1b281ea678d5bbe7658cc863c0f5991dff8 | 2,680 | py | Python | pac_backup/migrations/0002_auto_20190713_2027.py | jakariapervez/cmis6 | de279e5c586a77745ddd3d471599784606d50d15 | [
"MIT"
] | null | null | null | pac_backup/migrations/0002_auto_20190713_2027.py | jakariapervez/cmis6 | de279e5c586a77745ddd3d471599784606d50d15 | [
"MIT"
] | 7 | 2021-06-04T23:45:15.000Z | 2022-03-12T00:44:14.000Z | pac_backup/migrations/0002_auto_20190713_2027.py | jakariapervez/cmis6 | de279e5c586a77745ddd3d471599784606d50d15 | [
"MIT"
] | null | null | null | # Generated by Django 2.0.3 on 2019-07-13 14:27
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('pac', '0001_initial'),
]
operations = [
migrations.AlterField(
model_name='budget_allocation',
name='Dpa',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='budget_allocation',
name='Gob',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='budget_allocation',
name='Rpa',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='budget_allocation',
name='Total',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='dpp_allocation',
name='Dpa',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='dpp_allocation',
name='Gob',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='dpp_allocation',
name='Rpa',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='dpp_allocation',
name='Total',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='expenditure_details',
name='Dpa',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='expenditure_details',
name='Gob',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='expenditure_details',
name='Rpa',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
migrations.AlterField(
model_name='expenditure_details',
name='Total',
field=models.DecimalField(blank=True, decimal_places=2, max_digits=18, null=True),
),
]
| 36.216216 | 94 | 0.59403 | 279 | 2,680 | 5.530466 | 0.164875 | 0.155541 | 0.194426 | 0.225535 | 0.911212 | 0.911212 | 0.911212 | 0.888529 | 0.888529 | 0.888529 | 0 | 0.028856 | 0.288806 | 2,680 | 73 | 95 | 36.712329 | 0.780693 | 0.016791 | 0 | 0.895522 | 1 | 0 | 0.097607 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.014925 | 0 | 0.059701 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
43b88bbfb94a46e460c725f16013c03e9fdb0de2 | 162 | py | Python | parser/fase2/team29/analizer_pl/modules/expressions.py | webdev188/tytus | 847071edb17b218f51bb969d335a8ec093d13f94 | [
"MIT"
] | 35 | 2020-12-07T03:11:43.000Z | 2021-04-15T17:38:16.000Z | parser/fase2/team29/analizer_pl/modules/expressions.py | webdev188/tytus | 847071edb17b218f51bb969d335a8ec093d13f94 | [
"MIT"
] | 47 | 2020-12-09T01:29:09.000Z | 2021-01-13T05:37:50.000Z | parser/fase2/team29/analizer_pl/modules/expressions.py | webdev188/tytus | 847071edb17b218f51bb969d335a8ec093d13f94 | [
"MIT"
] | 556 | 2020-12-07T03:13:31.000Z | 2021-06-17T17:41:10.000Z | # Primitive data types
from analizer_pl.statement.expressions import code
def C3D(value, temp, row, column):
return code.C3D(value, temp, row, column)
| 23.142857 | 50 | 0.753086 | 24 | 162 | 5.041667 | 0.75 | 0.132231 | 0.198347 | 0.247934 | 0.347107 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014599 | 0.154321 | 162 | 6 | 51 | 27 | 0.868613 | 0.154321 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
43d7bfdb72fd56fe06ffb54e33d5d2572bfb8f8f | 71 | py | Python | agent/discrete/__init__.py | SunandBean/tensorflow_RL | a248cbfb99b2041f6f7cc008fcad53fb83ac486e | [
"MIT"
] | 60 | 2019-01-29T14:13:00.000Z | 2020-11-24T09:08:05.000Z | agent/discrete/__init__.py | SunandBean/tensorflow_RL | a248cbfb99b2041f6f7cc008fcad53fb83ac486e | [
"MIT"
] | 2 | 2019-08-14T06:44:32.000Z | 2020-11-12T12:57:55.000Z | agent/discrete/__init__.py | SunandBean/tensorflow_RL | a248cbfb99b2041f6f7cc008fcad53fb83ac486e | [
"MIT"
] | 37 | 2019-01-22T05:19:34.000Z | 2021-04-12T02:27:50.000Z | from agent.discrete.seperate import *
from agent.discrete.join import * | 35.5 | 37 | 0.816901 | 10 | 71 | 5.8 | 0.6 | 0.310345 | 0.586207 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098592 | 71 | 2 | 38 | 35.5 | 0.90625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
43d92f32d645bf89be513852dc7775fcbb6d828e | 2,955 | py | Python | data/utils/data_demokritos.py | firvain/phaetons | 0f32d0b83fbc2e14b4106e0a04ad39950229ff84 | [
"MIT"
] | null | null | null | data/utils/data_demokritos.py | firvain/phaetons | 0f32d0b83fbc2e14b4106e0a04ad39950229ff84 | [
"MIT"
] | 5 | 2020-05-05T11:09:13.000Z | 2022-02-10T01:41:15.000Z | data/utils/data_demokritos.py | firvain/phaetons | 0f32d0b83fbc2e14b4106e0a04ad39950229ff84 | [
"MIT"
] | 1 | 2020-11-03T04:53:16.000Z | 2020-11-03T04:53:16.000Z | import json
import pandas as pd
# weather Stations
with open(r"data\demokritos\raw\weather_station.json") as f:
data = json.load(f)
columns = data["results"][0]["series"][0]["columns"]
values = data["results"][0]["series"][0]["values"]
df = pd.DataFrame(values, columns=columns)
df = df.drop(["packets"], axis=1)
dfs = dict(tuple(df.groupby("id")))
my_dict = {}
for id in df.id.unique():
my_dict[id] = dfs[id]
my_dict[id]["time"] = pd.to_datetime(my_dict[id]["time"], utc=True)
my_dict[id]["time"] = my_dict[id]["time"].dt.tz_convert("Europe/Athens")
my_dict[id]["time"] = my_dict[id]["time"].dt.round("1min")
my_dict[id]["time"] = pd.to_datetime(my_dict[id]["time"])
my_dict[id].set_index("time", inplace=True)
    my_dict[id].to_csv(
        r"data\demokritos\refined\weather_station_" + id + ".csv"
    )
my_dict[id] = my_dict[id].resample("1H").sum()
    my_dict[id].to_csv(
        r"data\demokritos\resampled\weather_station_" + id + ".csv"
    )
# HM_LHLM06 Controller
with open(r"data\demokritos\raw\HM_LHLM06.json") as f:
data = json.load(f)
columns = data["results"][0]["series"][0]["columns"]
values = data["results"][0]["series"][0]["values"]
df = pd.DataFrame(values, columns=columns)
dfs = dict(tuple(df.groupby("id")))
my_dict = {}
for id in df.id.unique():
my_dict[id] = dfs[id]
my_dict[id]["time"] = pd.to_datetime(my_dict[id]["time"], utc=True)
my_dict[id]["time"] = my_dict[id]["time"].dt.tz_convert("Europe/Athens")
my_dict[id]["time"] = my_dict[id]["time"].dt.round("1min")
my_dict[id]["time"] = pd.to_datetime(my_dict[id]["time"])
my_dict[id].set_index("time", inplace=True)
    my_dict[id].to_csv(r"data\demokritos\refined\HM_LHLM06_" + id + ".csv")
my_dict[id] = my_dict[id].resample("1H").sum()
    my_dict[id].to_csv(
        r"data\demokritos\resampled\HM_LHLM06_" + id + ".csv"
    )
# LHLM06 Controller
with open(r"data\demokritos\raw\LHLM06.json") as f:
data = json.load(f)
columns = data["results"][0]["series"][0]["columns"]
values = data["results"][0]["series"][0]["values"]
df = pd.DataFrame(values, columns=columns)
dfs = dict(tuple(df.groupby("id")))
my_dict = {}
for id in df.id.unique():
my_dict[id] = dfs[id]
my_dict[id]["time"] = pd.to_datetime(my_dict[id]["time"], utc=True)
my_dict[id]["time"] = my_dict[id]["time"].dt.tz_convert("Europe/Athens")
my_dict[id]["time"] = my_dict[id]["time"].dt.round("1min")
my_dict[id]["time"] = pd.to_datetime(my_dict[id]["time"])
my_dict[id].set_index("time", inplace=True)
    my_dict[id].to_csv(r"data\demokritos\refined\LHLM06_" + id + ".csv")
my_dict[id] = my_dict[id].resample("1H").sum()
    my_dict[id].to_csv(r"data\demokritos\resampled\LHLM06_" + id + ".csv")
| 42.214286 | 80 | 0.595939 | 442 | 2,955 | 3.812217 | 0.133484 | 0.160237 | 0.199407 | 0.17092 | 0.918101 | 0.903264 | 0.903264 | 0.854599 | 0.854599 | 0.854599 | 0 | 0.014669 | 0.192555 | 2,955 | 69 | 81 | 42.826087 | 0.691534 | 0.018613 | 0 | 0.75 | 0 | 0 | 0.224102 | 0.11395 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.033333 | 0 | 0.033333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
78de2e630c78a33104cb9005d4efe046dc0228a0 | 781 | py | Python | octicons16px/smiley.py | andrewp-as-is/octicons16px.py | 1272dc9f290619d83bd881e87dbd723b0c48844c | [
"Unlicense"
] | 1 | 2021-01-28T06:47:39.000Z | 2021-01-28T06:47:39.000Z | octicons16px/smiley.py | andrewp-as-is/octicons16px.py | 1272dc9f290619d83bd881e87dbd723b0c48844c | [
"Unlicense"
] | null | null | null | octicons16px/smiley.py | andrewp-as-is/octicons16px.py | 1272dc9f290619d83bd881e87dbd723b0c48844c | [
"Unlicense"
] | null | null | null |
OCTICON_SMILEY = """
<svg class="octicon octicon-smiley" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M1.5 8a6.5 6.5 0 1113 0 6.5 6.5 0 01-13 0zM8 0a8 8 0 100 16A8 8 0 008 0zM5 8a1 1 0 100-2 1 1 0 000 2zm7-1a1 1 0 11-2 0 1 1 0 012 0zM5.32 9.636a.75.75 0 011.038.175l.007.009c.103.118.22.222.35.31.264.178.683.37 1.285.37.602 0 1.02-.192 1.285-.371.13-.088.247-.192.35-.31l.007-.008a.75.75 0 111.222.87l-.614-.431c.614.43.614.431.613.431v.001l-.001.002-.002.003-.005.007-.014.019a1.984 1.984 0 01-.184.213c-.16.166-.338.316-.53.445-.63.418-1.37.638-2.127.629-.946 0-1.652-.308-2.126-.63a3.32 3.32 0 01-.715-.657l-.014-.02-.005-.006-.002-.003v-.002h-.001l.613-.432-.614.43a.75.75 0 01.183-1.044h.001z"></path></svg>
"""
| 156.2 | 754 | 0.665813 | 196 | 781 | 2.647959 | 0.581633 | 0.023121 | 0.028902 | 0.015414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.537377 | 0.09219 | 781 | 4 | 755 | 195.25 | 0.19464 | 0 | 0 | 0 | 0 | 0.333333 | 0.969231 | 0.524359 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
78eac8d74779b04fb7707993b6b0b4dbc4a245aa | 674 | py | Python | psutils/custom_errors.py | ehusby/pyscript-utils | 8483037a0836c7292e906e5572eb85e2623ed245 | [
"MIT"
] | 1 | 2022-02-22T01:39:43.000Z | 2022-02-22T01:39:43.000Z | psutils/custom_errors.py | ehusby/pyscript-utils | 8483037a0836c7292e906e5572eb85e2623ed245 | [
"MIT"
] | null | null | null | psutils/custom_errors.py | ehusby/pyscript-utils | 8483037a0836c7292e906e5572eb85e2623ed245 | [
"MIT"
] | null | null | null |
class VersionError(Exception):
    def __init__(self, msg=""):
        super().__init__(msg)


class DeveloperError(Exception):
    def __init__(self, msg=""):
        super().__init__(msg)


class ScriptArgumentError(Exception):
    def __init__(self, msg=""):
        super().__init__(msg)


class InvalidArgumentError(Exception):
    def __init__(self, msg=""):
        super().__init__(msg)


class ExternalError(Exception):
    def __init__(self, msg=""):
        super().__init__(msg)


class DimensionError(Exception):
    def __init__(self, msg=""):
        super().__init__(msg)
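

# Example (illustrative only; the message text is hypothetical) of raising
# one of these errors from a calling script:
#
#     raise InvalidArgumentError("'threshold' must be a positive integer")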
| 26.96 | 44 | 0.679525 | 72 | 674 | 5.694444 | 0.180556 | 0.17561 | 0.234146 | 0.292683 | 0.763415 | 0.763415 | 0.763415 | 0.763415 | 0.763415 | 0.763415 | 0 | 0 | 0.178042 | 674 | 24 | 45 | 28.083333 | 0.740072 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 |
600ab104cb0c9114582deb8526bbdeb18a2c0cd1 | 2,407 | py | Python | tests/lists/tests/delete_tests.py | kimgea/django-ordered-field | c3a79cd93b013d90bbe0d6b9c9ede872d16af949 | [
"MIT"
] | null | null | null | tests/lists/tests/delete_tests.py | kimgea/django-ordered-field | c3a79cd93b013d90bbe0d6b9c9ede872d16af949 | [
"MIT"
] | 1 | 2018-05-10T09:11:49.000Z | 2018-05-10T09:11:49.000Z | tests/lists/tests/delete_tests.py | kimgea/django-ordered-field | c3a79cd93b013d90bbe0d6b9c9ede872d16af949 | [
"MIT"
] | null | null | null | from django.test import TestCase
from tests.lists.models import Item
from tests.lists.tests.helper import set_up_helper
class ListDeleteTest(TestCase):
def setUp(self):
set_up_helper()
def test_delete_first(self):
item = Item.objects.filter(list=1, order=0).first()
item.delete()
result = list(Item.objects.filter(list=1).order_by("list", "order").
values_list("updated_by", "order_changed_count", "list", "order", "id"))
expected_result = [("a", 1, 1, 0, 2), ("a", 1, 1, 1, 3), ("a", 1, 1, 2, 4), ("a", 1, 1, 3, 5)]
self.assertEqual(result, expected_result)
def test_delete_second(self):
item = Item.objects.filter(list=1, order=1).first()
item.delete()
result = list(Item.objects.filter(list=1).order_by("list", "order").
values_list("updated_by", "order_changed_count", "list", "order", "id"))
expected_result = [("", 0, 1, 0, 1), ("a", 1, 1, 1, 3), ("a", 1, 1, 2, 4), ("a", 1, 1, 3, 5)]
self.assertEqual(result, expected_result)
def test_delete_middle(self):
item = Item.objects.filter(list=1, order=2).first()
item.delete()
result = list(Item.objects.filter(list=1).order_by("list", "order").
values_list("updated_by", "order_changed_count", "list", "order", "id"))
expected_result = [("", 0, 1, 0, 1), ("", 0, 1, 1, 2), ("a", 1, 1, 2, 4), ("a", 1, 1, 3, 5)]
self.assertEqual(result, expected_result)
def test_delete_second_last(self):
item = Item.objects.filter(list=1, order=3).first()
item.delete()
result = list(Item.objects.filter(list=1).order_by("list", "order").
values_list("updated_by", "order_changed_count", "list", "order", "id"))
expected_result = [("", 0, 1, 0, 1), ("", 0, 1, 1, 2), ("", 0, 1, 2, 3), ("a", 1, 1, 3, 5)]
self.assertEqual(result, expected_result)
def test_delete_last(self):
item = Item.objects.filter(list=1, order=4).first()
item.delete()
result = list(Item.objects.filter(list=1).order_by("list", "order").
values_list("updated_by", "order_changed_count", "list", "order", "id"))
expected_result = [("", 0, 1, 0, 1), ("", 0, 1, 1, 2), ("", 0, 1, 2, 3), ("", 0, 1, 3, 4)]
self.assertEqual(result, expected_result)
| 43.763636 | 102 | 0.565434 | 340 | 2,407 | 3.855882 | 0.123529 | 0.022883 | 0.129672 | 0.160183 | 0.856598 | 0.829901 | 0.829901 | 0.829901 | 0.749809 | 0.690313 | 0 | 0.051771 | 0.23764 | 2,407 | 54 | 103 | 44.574074 | 0.66267 | 0 | 0 | 0.487805 | 0 | 0 | 0.105941 | 0 | 0 | 0 | 0 | 0 | 0.121951 | 1 | 0.146341 | false | 0 | 0.073171 | 0 | 0.243902 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
600d899018eba822ce7b9886274c798db98f487e | 5,849 | py | Python | tonic/tensorflow/models/actor_critics.py | Eyalcohenx/tonic | afc15c6fa23fed4f696f68f0acf961964b0172dc | [
"MIT"
] | 350 | 2020-08-06T13:49:11.000Z | 2022-03-24T08:53:59.000Z | tonic/tensorflow/models/actor_critics.py | Eyalcohenx/tonic | afc15c6fa23fed4f696f68f0acf961964b0172dc | [
"MIT"
] | 12 | 2020-08-07T02:21:58.000Z | 2021-05-20T11:50:44.000Z | tonic/tensorflow/models/actor_critics.py | Eyalcohenx/tonic | afc15c6fa23fed4f696f68f0acf961964b0172dc | [
"MIT"
] | 35 | 2020-08-06T16:53:40.000Z | 2021-12-17T06:01:09.000Z | import copy
import tensorflow as tf
class ActorCritic(tf.keras.Model):
def __init__(
self, actor, critic, observation_normalizer=None,
return_normalizer=None
):
super().__init__()
self.actor = actor
self.critic = critic
self.observation_normalizer = observation_normalizer
self.return_normalizer = return_normalizer
def initialize(self, observation_space, action_space):
if self.observation_normalizer:
self.observation_normalizer.initialize(observation_space.shape)
self.actor.initialize(
observation_space, action_space, self.observation_normalizer)
self.critic.initialize(
observation_space, action_space, self.observation_normalizer,
self.return_normalizer)
dummy_observations = tf.zeros((1,) + observation_space.shape)
self.actor(dummy_observations)
self.critic(dummy_observations)
class ActorCriticWithTargets(tf.keras.Model):
def __init__(
self, actor, critic, observation_normalizer=None,
return_normalizer=None, target_coeff=0.005
):
super().__init__()
self.actor = actor
self.critic = critic
self.target_actor = copy.deepcopy(actor)
self.target_critic = copy.deepcopy(critic)
self.observation_normalizer = observation_normalizer
self.return_normalizer = return_normalizer
self.target_coeff = target_coeff
def initialize(self, observation_space, action_space):
if self.observation_normalizer:
self.observation_normalizer.initialize(observation_space.shape)
self.actor.initialize(
observation_space, action_space, self.observation_normalizer)
self.critic.initialize(
observation_space, action_space, self.observation_normalizer,
self.return_normalizer)
self.target_actor.initialize(
observation_space, action_space, self.observation_normalizer)
self.target_critic.initialize(
observation_space, action_space, self.observation_normalizer,
self.return_normalizer)
dummy_observations = tf.zeros((1,) + observation_space.shape)
dummy_actions = tf.zeros((1,) + action_space.shape)
self.actor(dummy_observations)
self.critic(dummy_observations, dummy_actions)
self.target_actor(dummy_observations)
self.target_critic(dummy_observations, dummy_actions)
self.online_variables = (
self.actor.trainable_variables +
self.critic.trainable_variables)
self.target_variables = (
self.target_actor.trainable_variables +
self.target_critic.trainable_variables)
self.assign_targets()
def assign_targets(self):
for o, t in zip(self.online_variables, self.target_variables):
t.assign(o)
def update_targets(self):
for o, t in zip(self.online_variables, self.target_variables):
t.assign((1 - self.target_coeff) * t + self.target_coeff * o)
class ActorTwinCriticWithTargets(tf.keras.Model):
def __init__(
self, actor, critic, observation_normalizer=None,
return_normalizer=None, target_coeff=0.005
):
super().__init__()
self.actor = actor
self.critic_1 = critic
self.critic_2 = copy.deepcopy(critic)
self.target_actor = copy.deepcopy(actor)
self.target_critic_1 = copy.deepcopy(critic)
self.target_critic_2 = copy.deepcopy(critic)
self.observation_normalizer = observation_normalizer
self.return_normalizer = return_normalizer
self.target_coeff = target_coeff
def initialize(self, observation_space, action_space):
if self.observation_normalizer:
self.observation_normalizer.initialize(observation_space.shape)
self.actor.initialize(
observation_space, action_space, self.observation_normalizer)
self.critic_1.initialize(
observation_space, action_space, self.observation_normalizer,
self.return_normalizer)
self.critic_2.initialize(
observation_space, action_space, self.observation_normalizer,
self.return_normalizer)
self.target_actor.initialize(
observation_space, action_space, self.observation_normalizer)
self.target_critic_1.initialize(
observation_space, action_space, self.observation_normalizer,
self.return_normalizer)
self.target_critic_2.initialize(
observation_space, action_space, self.observation_normalizer,
self.return_normalizer)
dummy_observations = tf.zeros((1,) + observation_space.shape)
dummy_actions = tf.zeros((1,) + action_space.shape)
self.actor(dummy_observations)
self.critic_1(dummy_observations, dummy_actions)
self.critic_2(dummy_observations, dummy_actions)
self.target_actor(dummy_observations)
self.target_critic_1(dummy_observations, dummy_actions)
self.target_critic_2(dummy_observations, dummy_actions)
self.online_variables = (
self.actor.trainable_variables +
self.critic_1.trainable_variables +
self.critic_2.trainable_variables)
self.target_variables = (
self.target_actor.trainable_variables +
self.target_critic_1.trainable_variables +
self.target_critic_2.trainable_variables)
self.assign_targets()
def assign_targets(self):
for o, t in zip(self.online_variables, self.target_variables):
t.assign(o)
def update_targets(self):
for o, t in zip(self.online_variables, self.target_variables):
t.assign((1 - self.target_coeff) * t + self.target_coeff * o)
| 41.778571 | 75 | 0.68781 | 633 | 5,849 | 6.036335 | 0.075829 | 0.083748 | 0.137399 | 0.105993 | 0.969641 | 0.945041 | 0.936666 | 0.915205 | 0.915205 | 0.906569 | 0 | 0.006918 | 0.233886 | 5,849 | 139 | 76 | 42.079137 | 0.845793 | 0 | 0 | 0.768 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.016 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
602a5c9dd75903f424232d8ee3928c740e49f0e3 | 204 | py | Python | mayan/apps/documents/views/__init__.py | eshbeata/open-paperless | 6b9ed1f21908116ad2795b3785b2dbd66713d66e | [
"Apache-2.0"
] | 2,743 | 2017-12-18T07:12:30.000Z | 2022-03-27T17:21:25.000Z | mayan/apps/documents/views/__init__.py | eshbeata/open-paperless | 6b9ed1f21908116ad2795b3785b2dbd66713d66e | [
"Apache-2.0"
] | 15 | 2017-12-18T14:58:07.000Z | 2021-03-01T20:05:05.000Z | mayan/apps/documents/views/__init__.py | eshbeata/open-paperless | 6b9ed1f21908116ad2795b3785b2dbd66713d66e | [
"Apache-2.0"
] | 257 | 2017-12-18T03:12:58.000Z | 2022-03-25T08:59:10.000Z | from .document_page_views import * # NOQA
from .document_type_views import * # NOQA
from .document_version_views import * # NOQA
from .document_views import * # NOQA
from .misc_views import * # NOQA
| 34 | 45 | 0.754902 | 28 | 204 | 5.214286 | 0.321429 | 0.376712 | 0.513699 | 0.520548 | 0.554795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171569 | 204 | 5 | 46 | 40.8 | 0.863905 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
602fb310e8cd5592999f51b323f2cef45df5490e | 3,468 | py | Python | tests/parsers/test_pw2gw.py | lbotsch/aiida-quantumespresso | fe75c80cecb61113641366961ced8ed5a03cf896 | [
"MIT"
] | null | null | null | tests/parsers/test_pw2gw.py | lbotsch/aiida-quantumespresso | fe75c80cecb61113641366961ced8ed5a03cf896 | [
"MIT"
] | null | null | null | tests/parsers/test_pw2gw.py | lbotsch/aiida-quantumespresso | fe75c80cecb61113641366961ced8ed5a03cf896 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# pylint: disable=invalid-name,redefined-outer-name
"""Tests for the `Pw2gwParser`."""
from aiida import orm
def test_pw2gw_default(
aiida_profile, fixture_localhost, generate_parser, generate_calc_job_node, data_regression, num_regression
):
"""Test a normal pw2gw.x output."""
name = 'default'
entry_point_calc_job = 'quantumespresso.pw2gw'
entry_point_parser = 'quantumespresso.pw2gw'
node = generate_calc_job_node(entry_point_calc_job, fixture_localhost, name)
parser = generate_parser(entry_point_parser)
results, calcfunction = parser.parse_from_node(node, store_provenance=False)
assert calcfunction.is_finished, calcfunction.exception
assert calcfunction.is_finished_ok, calcfunction.exit_message
assert not orm.Log.objects.get_logs_for(node)
data_regression.check({'output_parameters': results['output_parameters'].get_dict()},
basename='test_pw2gw_default_data')
num_regression.check(dict(results['eps'].get_iterarrays()), basename='test_pw2gw_default_eps')
def test_pw2gw_failed_missing_output(
aiida_profile,
fixture_localhost,
generate_parser,
generate_calc_job_node,
):
"""Test a pw2gw.x output where file are missing."""
name = 'failed_missing_output'
entry_point_calc_job = 'quantumespresso.pw2gw'
entry_point_parser = 'quantumespresso.pw2gw'
node = generate_calc_job_node(entry_point_calc_job, fixture_localhost, name)
parser = generate_parser(entry_point_parser)
_, calcfunction = parser.parse_from_node(node, store_provenance=False)
assert calcfunction.is_finished, calcfunction.exception
assert calcfunction.is_failed, calcfunction.exit_status
    assert calcfunction.exit_status == node.process_class.exit_codes.ERROR_OUTPUT_FILES.status
assert orm.Log.objects.get_logs_for(node)
def test_pw2gw_failed_missing_stdout(
aiida_profile,
fixture_localhost,
generate_parser,
generate_calc_job_node,
):
"""Test a pw2gw.x output where file are missing."""
name = 'failed_missing_stdout'
entry_point_calc_job = 'quantumespresso.pw2gw'
entry_point_parser = 'quantumespresso.pw2gw'
node = generate_calc_job_node(entry_point_calc_job, fixture_localhost, name)
parser = generate_parser(entry_point_parser)
_, calcfunction = parser.parse_from_node(node, store_provenance=False)
assert calcfunction.is_finished, calcfunction.exception
assert calcfunction.is_failed, calcfunction.exit_status
    assert calcfunction.exit_status == node.process_class.exit_codes.ERROR_OUTPUT_STDOUT_MISSING.status
assert orm.Log.objects.get_logs_for(node)
def test_pw2gw_failed_corrupted_file(
aiida_profile,
fixture_localhost,
generate_parser,
generate_calc_job_node,
):
"""Test a pw2gw.x output where file are corrupted."""
name = 'failed_corrupted_file'
entry_point_calc_job = 'quantumespresso.pw2gw'
entry_point_parser = 'quantumespresso.pw2gw'
node = generate_calc_job_node(entry_point_calc_job, fixture_localhost, name)
parser = generate_parser(entry_point_parser)
_, calcfunction = parser.parse_from_node(node, store_provenance=False)
assert calcfunction.is_finished, calcfunction.exception
assert calcfunction.is_failed, calcfunction.exit_status
    assert calcfunction.exit_status == node.process_class.exit_codes.ERROR_OUTPUT_FILES.status
assert orm.Log.objects.get_logs_for(node)
| 35.387755 | 110 | 0.774798 | 438 | 3,468 | 5.746575 | 0.173516 | 0.044497 | 0.047676 | 0.060389 | 0.814064 | 0.801351 | 0.801351 | 0.790624 | 0.790624 | 0.790624 | 0 | 0.006725 | 0.142445 | 3,468 | 97 | 111 | 35.752577 | 0.83961 | 0.078143 | 0 | 0.734375 | 0 | 0 | 0.100946 | 0.087066 | 0 | 0 | 0 | 0 | 0.234375 | 1 | 0.0625 | false | 0 | 0.015625 | 0 | 0.078125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6083756c93d56c438501c6561945abf967b13f49 | 148 | py | Python | my_classes/.history/ModulesPackages_PackageNamespaces/main_20210725190435.py | minefarmer/deep-Dive-1 | b0675b853180c5b5781888266ea63a3793b8d855 | [
"Unlicense"
] | null | null | null | my_classes/.history/ModulesPackages_PackageNamespaces/main_20210725190435.py | minefarmer/deep-Dive-1 | b0675b853180c5b5781888266ea63a3793b8d855 | [
"Unlicense"
] | null | null | null | my_classes/.history/ModulesPackages_PackageNamespaces/main_20210725190435.py | minefarmer/deep-Dive-1 | b0675b853180c5b5781888266ea63a3793b8d855 | [
"Unlicense"
] | null | null | null | # main.py
print('=================================')
print('Running main.py - module name: {0}'.format(__name__))
print('=================================')
| 14.8 | 46 | 0.317568 | 12 | 148 | 3.916667 | 0.666667 | 0.255319 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007519 | 0.101351 | 148 | 9 | 47 | 16.444444 | 0.345865 | 0.047297 | 0 | 0.666667 | 0 | 0 | 0.719424 | 0.47482 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
608aed76b73d29d1e5428cc4a77f935f6fb2d592 | 11,711 | py | Python | imagesorter.py | chrisgw190015/projectbutterfly | cbb74544c30fc18b08e33e75c8586e8175f7a522 | [
"Apache-2.0"
] | null | null | null | imagesorter.py | chrisgw190015/projectbutterfly | cbb74544c30fc18b08e33e75c8586e8175f7a522 | [
"Apache-2.0"
] | null | null | null | imagesorter.py | chrisgw190015/projectbutterfly | cbb74544c30fc18b08e33e75c8586e8175f7a522 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on 2 May 2021
@author: Wong Wai Cheng
Description: This file contains the code for sorting images into different species.
"""
import os, shutil
from keras.preprocessing import image
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from joblib import load
from functions import mobilebase1b
batch_size = 10
img_width_mobile, img_height_mobile = 224, 224
"""
Image Sorter with Softmax
"""
def visualize_predictions(classifier):
path ='/Users/chris/Desktop/MasterofDataScience/thesis_related/dataset/tosort/'
output_path = '/Users/chris/Desktop/MasterofDataScience/thesis_related/dataset/tosort/'
files = []
file_names_array = []
for r, d, file_names in os.walk(path):
for file in file_names:
if file.lower().endswith(('.png','.jpg','.jpeg')):
files.append(os.path.join(r, file))
file_names_array.append(file)
for f in files:
# print(f)
img = image.load_img(f, target_size=(224,224))
img_tensor = image.img_to_array(img)
img_tensor /= 255.
# Extract features
features = mobilebase1b.predict(img_tensor.reshape(1, img_width_mobile, img_height_mobile, 3))
# Make prediction
try:
prediction = classifier.predict(features)
except:
prediction = classifier.predict(features.reshape(1, 7 * 7 * 1024))
# Show picture
plt.imshow(img_tensor)
plt.show()
        # Write prediction: copy the image into a folder named after the
        # predicted species; low-confidence predictions go to 'unidentifiable'.
        # (np.max gives the top softmax probability; np.argmax the class index.)
        class_labels = ['catopsilia-pomona', 'danaus-chrysippus', 'eurema-hecabe',
                        'graphium-doson', 'graphium-sarpedon', 'junonia-hedonia',
                        'junonia-iphita', 'papilio-demoleus']
        if np.max(prediction) > 9.8e-1:
            label = class_labels[int(np.argmax(prediction))]
        else:
            label = 'unidentifiable'
        print(prediction, label)
        if not os.path.exists(output_path + label):
            os.makedirs(output_path + label)
        full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl=label,
                                                   fn=file_names_array[files.index(f)])
        shutil.copyfile(f, full_output_path)
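

# Usage sketch (an illustration, not code from this project): a classifier
# trained on the mobilebase1b features and saved with joblib could be loaded
# and passed straight to the sorter. The file name below is hypothetical.
#
#     classifier = load('softmax_classifier.joblib')
#     visualize_predictions(classifier)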
"""
Image Sorter with SVM
"""
def visualize_predictions_svm(classifier):
    path = '/Users/chris/Desktop/MasterofDataScience/thesis_related/dataset/tosort/'
    output_path = '/Users/chris/Desktop/MasterofDataScience/thesis_related/dataset/tosort/'
    files = []
    file_names_array = []
    for r, d, file_names in os.walk(path):
        for file in file_names:
            if file.lower().endswith(('.png', '.jpg', '.jpeg')):
                files.append(os.path.join(r, file))
                file_names_array.append(file)
    for f in files:
        # print(f)
        img = image.load_img(f, target_size=(224, 224))
        img_tensor = image.img_to_array(img)
        img_tensor /= 255.
        # Extract features
        features = mobilebase1b.predict(img_tensor.reshape(1, 224, 224, 3))
        # Make prediction (flatten the feature map if the classifier expects 2D input)
        try:
            prediction = classifier.predict(features)
        except ValueError:
            prediction = classifier.predict(features.reshape(1, 7 * 7 * 1024))
        # Show picture
        plt.imshow(img_tensor)
        plt.show()
        # Write prediction
        if prediction == 0:
            print(prediction, 'catopsilia-pomona')
            if not os.path.exists(output_path + 'catopsilia-pomona'):
                os.makedirs(output_path + 'catopsilia-pomona')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='catopsilia-pomona',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        elif prediction == 1:
            print(prediction, 'danaus-chrysippus')
            if not os.path.exists(output_path + 'danaus-chrysippus'):
                os.makedirs(output_path + 'danaus-chrysippus')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='danaus-chrysippus',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        elif prediction == 2:
            print(prediction, 'eurema-hecabe')
            if not os.path.exists(output_path + 'eurema-hecabe'):
                os.makedirs(output_path + 'eurema-hecabe')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='eurema-hecabe',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        elif prediction == 3:
            print(prediction, 'graphium-doson')
            if not os.path.exists(output_path + 'graphium-doson'):
                os.makedirs(output_path + 'graphium-doson')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='graphium-doson',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        elif prediction == 4:
            print(prediction, 'graphium-sarpedon')
            if not os.path.exists(output_path + 'graphium-sarpedon'):
                os.makedirs(output_path + 'graphium-sarpedon')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='graphium-sarpedon',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        elif prediction == 5:
            print(prediction, 'junonia-hedonia')
            if not os.path.exists(output_path + 'junonia-hedonia'):
                os.makedirs(output_path + 'junonia-hedonia')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='junonia-hedonia',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        elif prediction == 6:
            print(prediction, 'junonia-iphita')
            if not os.path.exists(output_path + 'junonia-iphita'):
                os.makedirs(output_path + 'junonia-iphita')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='junonia-iphita',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        elif prediction == 7:
            print(prediction, 'papilio-demoleus')
            if not os.path.exists(output_path + 'papilio-demoleus'):
                os.makedirs(output_path + 'papilio-demoleus')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='papilio-demoleus',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
        else:
            print(prediction, 'unidentifiable')
            if not os.path.exists(output_path + 'unidentifiable'):
                os.makedirs(output_path + 'unidentifiable')
            full_output_path = "{op}{lbl}/{fn}".format(op=output_path, lbl='unidentifiable',
                                                       fn=file_names_array[files.index(f)])
            shutil.copyfile(f, full_output_path)
# softmax
model = tf.keras.models.load_model('/Users/chris/Desktop/MasterofDataScience/thesis_related/projectbutterfly/final-model-softmax-manualaug.h5')
visualize_predictions(model)
# svm
svm_model = load('/Users/chris/Desktop/MasterofDataScience/thesis_related/projectbutterfly/final-model-svm-manualaug.joblib')
visualize_predictions_svm(svm_model)
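The confidence gate used by the softmax sorter hinges on the difference between `np.max` (the highest probability) and `np.argmax` (the index of that probability). A minimal illustration, using a hypothetical softmax output over the eight butterfly classes:

```python
import numpy as np

# A hypothetical softmax output over 8 butterfly classes (sums to 1.0).
prediction = np.array([[0.005, 0.002, 0.001, 0.002, 0.001, 0.002, 0.002, 0.985]])

confidence = np.max(prediction)   # highest probability in the row
label = np.argmax(prediction)     # index of that probability (the class id)

# Only a confident top-1 prediction should be sorted into a class folder;
# anything below the threshold goes to 'unidentifiable'.
is_confident = confidence > 9.8e-1
print(label, confidence, is_confident)
```

Comparing `np.argmax(prediction)` against `0.98` compares a class index, not a probability, which is why only class 0 would ever be routed to `unidentifiable` under the original condition.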
###############################################################################################
#End#
###############################################################################################
# Source: tests/build_tests.py, repo mattmoutoux/populi-api-python (MIT)
from nose.tools import *
import unittest
import pycurl
from io import BytesIO
from unittest.mock import call, patch, MagicMock
from lxml import etree
from populi import build

class TestPopuliBuild(unittest.TestCase):

    @patch('populi.build.BytesIO')
    @patch('populi.build.pycurl')
    def test_get_api_reference(self, mock_pycurl, mock_BytesIO):
        curl = MagicMock()
        mock_pycurl.Curl.return_value = curl
        mock_pycurl.URL = 'blah'
        mock_pycurl.WRITEDATA = 'blimp'
        bio = BytesIO()
        expected_html = b'<html><body><p>Test</p></body></html>'
        bio.write(expected_html)
        mock_BytesIO.return_value = bio
        response = build.request_api_reference()
        calls = [call('blah', build.api_reference_url), call('blimp', bio)]
        curl.setopt.assert_has_calls(calls, any_order=True)
        curl.perform.assert_called_once_with()
        curl.close.assert_called_once_with()
        self.assertEqual(type(response), etree._Element)
        self.assertEqual(expected_html, etree.tostring(response))

    def test_get_none_command_parameters(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <p>None.</p>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 0)
    def test_no_parameters_exist(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h2><a id="getWhatever"></a>getWhatever</h2>
            <p>Blah blah blah</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>payment_id</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                    <tr>
                        <td>student_id</td>
                        <td>The numeric ID of the student.</td>
                        <td>No</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 0)
    def test_get_single_command_parameter(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>payment_id</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 1)
        self.assertEqual(parameters[0].field, 'payment_id')
        self.assertEqual(parameters[0].comment, 'The numeric ID of the payment you\'re interested in.')
        self.assertEqual(parameters[0].required, True)

    def test_get_single_command_parameter_w_extra_junk(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>payment_id blah blah blah</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 1)
        self.assertEqual(parameters[0].field, 'payment_id')
        self.assertEqual(parameters[0].comment, 'The numeric ID of the payment you\'re interested in.')
        self.assertEqual(parameters[0].required, True)
    def test_get_double_command_parameter(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>payment_id</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                    <tr>
                        <td>student_id</td>
                        <td>The numeric ID of the student.</td>
                        <td>No</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 2)
        self.assertEqual(parameters[0].field, 'payment_id')
        self.assertEqual(parameters[0].comment, 'The numeric ID of the payment you\'re interested in.')
        self.assertEqual(parameters[0].required, True)
        self.assertEqual(parameters[1].field, 'student_id')
        self.assertEqual(parameters[1].comment, 'The numeric ID of the student.')
        self.assertEqual(parameters[1].required, False)
    def test_get_parameters_where_some_parameters_repeat(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>student_id</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                    <tr><td> </td><td> </td><td> </td></tr>
                    <tr><td></td><td>whatever</td><td>huh</td></tr>
                    <tr>
                        <td>student_id</td>
                        <td>The numeric ID of the student.</td>
                        <td>No</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 1)
        self.assertEqual(parameters[0].field, 'student_id')

    def test_get_parameters_where_some_rows_are_blank(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>payment_id</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                    <tr><td> </td><td> </td><td> </td></tr>
                    <tr><td></td><td>whatever</td><td>huh</td></tr>
                    <tr>
                        <td>student_id</td>
                        <td>The numeric ID of the student.</td>
                        <td>No</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 2)
        self.assertEqual(parameters[0].field, 'payment_id')
        self.assertEqual(parameters[1].field, 'student_id')
    @patch('populi.build.request_api_reference')
    def test_get_commands(self, mock_api_ref):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>payment_id</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        mock_api_ref.return_value = etree.fromstring(html, etree.HTMLParser())
        commands = build.get_commands()
        self.assertEqual(len(commands), 1)
        self.assertEqual(commands[0].name, 'getPayment')
        self.assertEqual(commands[0].comment, 'Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.')
        self.assertEqual(len(commands[0].parameters), 1)
        self.assertEqual(commands[0].parameters[0].field, 'payment_id')
        self.assertEqual(commands[0].parameters[0].comment, 'The numeric ID of the payment you\'re interested in.')
        self.assertEqual(commands[0].parameters[0].required, True)
        self.assertEqual(commands[0].parameters[0].default, 'None')
    def test_command_class_paging(self):
        command = build.Command('freddy', 'fries', [])
        reg_field = type('', (), {})
        pag_field = type('', (), {})
        reg_field.field = 'whatever'
        pag_field.field = 'page'
        self.assertFalse(command.paging())
        command.parameters = [reg_field]
        self.assertFalse(command.paging())
        command.parameters = [pag_field]
        self.assertTrue(command.paging())
        command.parameters = [reg_field, pag_field]
        self.assertTrue(command.paging())

    def test_build_command_wo_arguments_string(self):
        expected_string = '''def freddy():
    """
    fries
    :returns: String containing xml or an lxml element.
    """
    return get_anonymous('freddy')'''
        command = build.Command('freddy', 'fries', [])
        result = str(command)
        self.assertEqual(expected_string, result)

    def test_build_command_w_capitalization_made_pythonic(self):
        expected_string = '''def get_freddy():
    """
    fries
    :returns: String containing xml or an lxml element.
    """
    return get_anonymous('getFreddy')'''
        command = build.Command('getFreddy', 'fries', [])
        result = str(command)
        self.assertEqual(expected_string, result)

    def test_build_command_with_paging(self):
        expected_string = '''def get_freddy():
    """
    fries
    :returns: String containing xml or an lxml element.
    """
    return get_all_anonymous('getFreddy', root_element='getFreddee')'''
        command = build.Command('getFreddy', 'fries', [])
        pag_field = type('', (), {})
        pag_field.field = 'page'
        command.parameters = [pag_field]
        build.root_elements['getFreddy'] = 'getFreddee'
        result = str(command)
        self.assertEqual(expected_string, result)
        del build.root_elements['getFreddy']
    def test_build_command_with_one_parameter(self):
        expected_string = '''def get_custom_field_options(custom_field_id: str=None):
    """
    Returns
    :param custom_field_id: Blah
    :returns: String containing xml or an lxml element.
    """
    return get_anonymous('getCustomFieldOptions', custom_field_id=custom_field_id)'''
        field = type('', (), {})
        field.field = 'custom_field_id'
        field.required = False
        field.default = 'None'
        field.comment = 'Blah'
        command = build.Command('getCustomFieldOptions', 'Returns', [field])
        result = str(command)
        self.assertEqual(expected_string, result)

    def test_get_command_parameter_that_is_php_array(self):
        html = b'''<html><body>
            <h2><a id="getPayment"></a>getPayment</h2>
            <p>Returns all basic and associated information about a payment. This could include credit card information, electronic check information, invoice payments, and payment refunds.</p>
            <h3>Parameters</h3>
            <table style="width: 100%; border: 1px solid #767676;" border="0">
                <thead>
                    <tr style="background-color: #eee;">
                        <td>Parameter</td>
                        <td>Description</td>
                        <td>Required</td>
                    </tr>
                </thead>
                <tbody>
                    <tr>
                        <td>payment_id[]</td>
                        <td>The numeric ID of the payment you're interested in.</td>
                        <td>Yes</td>
                    </tr>
                </tbody>
            </table>
        </body></html>'''
        tree = etree.fromstring(html, etree.HTMLParser())
        parameters = build.get_command_parameters(tree, 'getPayment')
        self.assertEqual(len(parameters), 1)
        self.assertEqual(parameters[0].field, 'payment_id')
        self.assertEqual(parameters[0].comment, 'The numeric ID of the payment you\'re interested in.')
        self.assertEqual(parameters[0].required, True)
        self.assertEqual(parameters[0].default, '[]')

    def test_build_command_with_one_array_parameter(self):
        expected_string = '''def get_custom_field_options(custom_field_id: list=[]):
    """
    Returns
    :param custom_field_id: Blah
    :returns: String containing xml or an lxml element.
    """
    return get_anonymous('getCustomFieldOptions', custom_field_id=custom_field_id)'''
        field = type('', (), {})
        field.field = 'custom_field_id'
        field.required = True
        field.comment = 'Blah'
        field.default = '[]'
        command = build.Command('getCustomFieldOptions', 'Returns', [field])
        result = str(command)
        self.assertEqual(expected_string, result)
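The tests above all feed small HTML fragments through `etree.fromstring(html, etree.HTMLParser())` and then assert on the parameters scraped from the table. The parsing pattern they exercise can be sketched independently (a simplified stand-in, not the project's actual `build.get_command_parameters`):

```python
from lxml import etree

html = b'''<html><body>
    <h3>Parameters</h3>
    <table>
        <thead><tr><td>Parameter</td><td>Description</td><td>Required</td></tr></thead>
        <tbody>
            <tr><td>payment_id</td><td>Numeric ID.</td><td>Yes</td></tr>
            <tr><td>student_id</td><td>Student ID.</td><td>No</td></tr>
        </tbody>
    </table>
</body></html>'''

tree = etree.fromstring(html, etree.HTMLParser())
params = []
for row in tree.xpath('//tbody/tr'):
    cells = [''.join(td.itertext()).strip() for td in row.findall('td')]
    # Skip blank rows; keep only the first whitespace-separated token of the
    # field name (the "extra junk" case) and map Yes/No to a boolean.
    if len(cells) == 3 and cells[0]:
        params.append((cells[0].split()[0], cells[1], cells[2] == 'Yes'))
print(params)
```

The HTML fragment, field names, and tuple shape here are illustrative only; the real parser also deduplicates repeated fields and records a `default` value.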
# Source: Recursion/countPaths.py, repo eferroni/Data-Structure-and-Algorithms (MIT)
def countPaths(arr, row, col):
    # Valid square?
    try:
        if arr[row][col] == 0:
            return 0
    except IndexError:
        return 0
    # End?
    if row == len(arr) - 1 and col == len(arr[row]) - 1:
        return 1
    # Recursion
    return countPaths(arr, row + 1, col) + countPaths(arr, row, col + 1)


def countPaths2(arr, row, col, path=None):
    # Memoized variant. The original used a mutable default of [[0] * 8] * 8,
    # which aliases every row to the same list and is shared across calls; it
    # also recursed into countPaths (so nothing was memoized) and returned the
    # cache matrix instead of the count. Fixed below.
    if path is None:
        path = [[0] * len(arr[0]) for _ in arr]
    # Valid square?
    try:
        if arr[row][col] == 0:
            return 0
    except IndexError:
        return 0
    # End?
    if row == len(arr) - 1 and col == len(arr[row]) - 1:
        return 1
    # Recursion (cache the result per square)
    if path[row][col] == 0:
        path[row][col] = (countPaths2(arr, row + 1, col, path)
                          + countPaths2(arr, row, col + 1, path))
    return path[row][col]
my_arr = [
    [1, 1, 1, 1, 1, 1, 1, 1],
    [1, 1, 0, 1, 1, 1, 0, 1],
    [1, 1, 1, 1, 0, 1, 1, 1],
    [0, 1, 0, 1, 1, 0, 1, 1],
    [1, 1, 0, 1, 1, 1, 1, 1],
    [1, 1, 1, 0, 0, 1, 0, 1],
    [1, 0, 1, 1, 1, 0, 1, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
]

print(countPaths(my_arr, 0, 0))
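An alternative to a hand-rolled cache matrix is `functools.lru_cache` on an inner helper. A small sketch of the same right/down path count (the helper name and the 3x3 grid are illustrative, not from the original):

```python
from functools import lru_cache

def count_paths_cached(arr):
    rows, cols = len(arr), len(arr[0])

    @lru_cache(maxsize=None)
    def walk(row, col):
        # Off the grid or on a blocked square: no path continues here.
        if row >= rows or col >= cols or arr[row][col] == 0:
            return 0
        # Bottom-right corner: exactly one completed path.
        if row == rows - 1 and col == cols - 1:
            return 1
        return walk(row + 1, col) + walk(row, col + 1)

    return walk(0, 0)

grid = [[1, 1, 1],
        [1, 0, 1],
        [1, 1, 1]]
print(count_paths_cached(grid))  # 2 paths around the blocked centre
```

Bounds-checking up front (rather than catching `IndexError`) also avoids the edge case where a negative index silently wraps around in Python.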
# Source: Blog/migrations/0004_auto_20180410_1220.py, repo stevenpi/Link.Python.Django.DiyBlog (MIT)
# Generated by Django 2.0.3 on 2018-04-10 12:20
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('Blog', '0003_auto_20180409_0857'),
    ]

    operations = [
        migrations.AddField(
            model_name='comment',
            name='num_vote_down',
            field=models.PositiveIntegerField(db_index=True, default=0),
        ),
        migrations.AddField(
            model_name='comment',
            name='num_vote_up',
            field=models.PositiveIntegerField(db_index=True, default=0),
        ),
        migrations.AddField(
            model_name='comment',
            name='vote_score',
            field=models.IntegerField(db_index=True, default=0),
        ),
        migrations.AddField(
            model_name='post',
            name='num_vote_down',
            field=models.PositiveIntegerField(db_index=True, default=0),
        ),
        migrations.AddField(
            model_name='post',
            name='num_vote_up',
            field=models.PositiveIntegerField(db_index=True, default=0),
        ),
        migrations.AddField(
            model_name='post',
            name='vote_score',
            field=models.IntegerField(db_index=True, default=0),
        ),
    ]
# Source: api/tests/opentrons/config/test_advanced_settings_migration.py, repo faliester/opentrons (Apache-2.0)
from opentrons.config.advanced_settings import _migrate, _ensure
good_file_version = 7
good_file_settings = {
    'shortFixedTrash': None,
    'calibrateToBottom': None,
    'deckCalibrationDots': None,
    'disableHomeOnBoot': None,
    'useOldAspirationFunctions': None,
    'disableLogAggregation': None,
    'enableApi1BackCompat': None,
    'useProtocolApi2': None,
    'useV1HttpApi': None,
    'enableDoorSafetySwitch': None,
    'enableTipLengthCalibration': None,
    'enableHttpProtocolSessions': None,
}

def test_migrates_empty_object():
    settings, version = _migrate({})
    assert version == good_file_version
    assert settings == good_file_settings


def test_migrates_versionless_new_config():
    settings, version = _migrate({
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useOldAspirationFunctions': True,
    })
    assert version == good_file_version
    assert settings == {
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': None,
        'disableHomeOnBoot': True,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': None,
        'enableApi1BackCompat': None,
        'useProtocolApi2': None,
        'useV1HttpApi': None,
        'enableDoorSafetySwitch': None,
        'enableTipLengthCalibration': None,
        'enableHttpProtocolSessions': None,
    }


def test_migrates_versionless_old_config():
    settings, version = _migrate({
        'short-fixed-trash': False,
        'calibrate-to-bottom': False,
        'dots-deck-type': True,
        'disable-home-on-boot': False,
    })
    assert version == good_file_version
    assert settings == {
        'shortFixedTrash': None,
        'calibrateToBottom': None,
        'deckCalibrationDots': True,
        'disableHomeOnBoot': None,
        'useOldAspirationFunctions': None,
        'disableLogAggregation': None,
        'enableApi1BackCompat': None,
        'useProtocolApi2': None,
        'useV1HttpApi': None,
        'enableDoorSafetySwitch': None,
        'enableTipLengthCalibration': None,
        'enableHttpProtocolSessions': None,
    }


def test_ignores_invalid_keys():
    settings, version = _migrate({
        'split-labware-def': True,
        'splitLabwareDefinitions': True
    })
    assert version == good_file_version
    assert settings == good_file_settings


def test_migrates_v1_config():
    settings, version = _migrate({
        '_version': 1,
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useProtocolApi2': None,
        'useOldAspirationFunctions': True,
    })
    assert version == good_file_version
    assert settings == {
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': None,
        'useProtocolApi2': None,
        'enableApi1BackCompat': None,
        'useV1HttpApi': None,
        'enableDoorSafetySwitch': None,
        'enableTipLengthCalibration': None,
        'enableHttpProtocolSessions': None,
    }


def test_migrates_v2_config():
    settings, version = _migrate({
        '_version': 2,
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useProtocolApi2': None,
        'enableApi1BackCompat': False,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
    })
    assert version == good_file_version
    assert settings == {
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
        'useProtocolApi2': None,
        'enableApi1BackCompat': None,
        'useV1HttpApi': None,
        'enableDoorSafetySwitch': None,
        'enableTipLengthCalibration': None,
        'enableHttpProtocolSessions': None,
    }


def test_migrates_v3_config():
    settings, version = _migrate({
        '_version': 3,
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useProtocolApi2': None,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
        'enableApi1BackCompat': False
    })
    assert version == good_file_version
    assert settings == {
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
        'useProtocolApi2': None,
        'enableApi1BackCompat': False,
        'useV1HttpApi': None,
        'enableDoorSafetySwitch': None,
        'enableTipLengthCalibration': None,
        'enableHttpProtocolSessions': None,
    }


def test_migrates_v4_config():
    settings, version = _migrate({
        '_version': 4,
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useProtocolApi2': None,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
        'enableApi1BackCompat': False,
        'useV1HttpApi': False
    })
    assert version == good_file_version
    assert settings == {
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
        'useProtocolApi2': None,
        'enableApi1BackCompat': False,
        'useV1HttpApi': False,
        'enableDoorSafetySwitch': None,
        'enableTipLengthCalibration': None,
        'enableHttpProtocolSessions': None,
    }


def test_migrates_v5_config():
    settings, version = _migrate({
        '_version': 5,
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useProtocolApi2': None,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
        'enableApi1BackCompat': False,
        'useV1HttpApi': False,
        'enableDoorSafetySwitch': True,
    })
    assert version == good_file_version
    assert settings == {
        'shortFixedTrash': True,
        'calibrateToBottom': True,
        'deckCalibrationDots': False,
        'disableHomeOnBoot': True,
        'useOldAspirationFunctions': True,
        'disableLogAggregation': False,
        'useProtocolApi2': None,
        'enableApi1BackCompat': False,
        'useV1HttpApi': False,
        'enableDoorSafetySwitch': True,
        'enableTipLengthCalibration': None,
        'enableHttpProtocolSessions': None,
    }


def test_ensures_config():
    assert _ensure(
        {'_version': 3,
         'shortFixedTrash': False,
         'disableLogAggregation': True}) \
        == {
            '_version': 3,
            'shortFixedTrash': False,
            'calibrateToBottom': None,
            'deckCalibrationDots': None,
            'disableHomeOnBoot': None,
            'useOldAspirationFunctions': None,
            'disableLogAggregation': True,
            'useProtocolApi2': None,
            'enableDoorSafetySwitch': None,
            'enableTipLengthCalibration': None,
            'enableHttpProtocolSessions': None,
        }
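The invariant these tests pin down is simple: migrate to the latest version, drop unknown keys, and fill every missing known key with `None`. That behaviour can be sketched in isolation (a simplified stand-in, not opentrons' real `_migrate`; the key list is truncated for brevity):

```python
# Hypothetical subset of the known settings keys from the tests above.
GOOD_KEYS = ('shortFixedTrash', 'calibrateToBottom', 'deckCalibrationDots')
GOOD_VERSION = 7

def migrate(settings):
    # Keep known keys, silently drop unknown ones, and default the rest
    # to None so every consumer sees the same shape of dict.
    migrated = {key: settings.get(key) for key in GOOD_KEYS}
    return migrated, GOOD_VERSION

settings, version = migrate({'shortFixedTrash': True, 'split-labware-def': True})
print(settings, version)
```

The dict comprehension with `settings.get(key)` does both jobs at once: unknown keys like `'split-labware-def'` never appear in the output, and absent known keys come back as `None`.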
# Source: app/test/test_repository/test_base_hadoop_stub_repo.py, repo rodrigo-fagundes/suetonio-api (MIT)
'''Main tests in API'''
import unittest
from test.stubs.repository import StubHadoopRepository

class HadoopRepositoryFindDatasetTest(unittest.TestCase):
    ''' Tests fetching data from a single table '''

    def test_no_cats(self):
        ''' Raises an exception when the parameters have no categories '''
        repo = StubHadoopRepository()
        options = {
            "valor": ['vl_indicador'],
            "agregacao": ['sum'],
            "ordenacao": ['-vl_indicador'],
            "where": ['eq-nu_competencia-2010']
        }
        self.assertRaises(
            KeyError,
            repo.find_dataset,
            options
        )

    def test_empty_cats(self):
        ''' Raises an exception when the categories list is empty '''
        repo = StubHadoopRepository()
        options = {
            "categorias": [],
            "valor": ['vl_indicador'],
            "agregacao": ['sum'],
            "ordenacao": ['-vl_indicador'],
            "where": ['eq-nu_competencia-2010'],
            "pivot": None
        }
        self.assertRaises(
            ValueError,
            repo.find_dataset,
            options
        )

    def test_sql_injection_rejection(self):
        ''' Raises an exception when a category contains a blocked word '''
        repo = StubHadoopRepository()
        options = {
            "categorias": ['nm_indicador;select', 'nu_competencia', 'vl_indicador'],
            "valor": ['vl_indicador'],
            "agregacao": ['sum'],
            "ordenacao": ['-vl_indicador'],
            "where": ['eq-nu_competencia-2010']
        }
        self.assertRaises(
            ValueError,
            repo.find_dataset,
            options
        )

    def test_full_query(self):
        ''' Checks that the query is built correctly '''
        repo = StubHadoopRepository()
        options = {
            "categorias": ['nm_indicador', 'nu_competencia', 'vl_indicador'],
            "valor": ['vl_indicador'],
            "agregacao": ['sum'],
            "ordenacao": ['-nm_indicador'],
            "where": ['eq-nu_competencia-2010'],
            "pivot": None,
            "limit": None,
            "offset": None,
            "calcs": None
        }
        result = repo.find_dataset(options)
        self.assertEqual(
            result,
            ('SELECT nm_indicador, nu_competencia, vl_indicador, '
             'sum(vl_indicador) AS agr_sum_vl_indicador FROM indicadores '
             'WHERE nu_competencia = 2010 GROUP BY nm_indicador, '
             'nu_competencia, vl_indicador ORDER BY nm_indicador DESC ')
        )
class HadoopRepositoryFindJoinedDatasetTest(unittest.TestCase):
''' Classe que testa a obtenção de dados de tabela com único join '''
def test_no_cats(self):
''' Lança exceção se não houver categoria nos parâmetros '''
repo = StubHadoopRepository()
options = {
"valor": ['vl_indicador'],
"agregacao": ['sum'],
"ordenacao": ['-vl_indicador'],
"where": ['eq-nu_competencia-2010'],
"joined": 'municipio'
}
self.assertRaises(
KeyError,
repo.find_joined_dataset,
options
)
def test_empty_cats(self):
''' Lança exceção se houver categorias vazias '''
repo = StubHadoopRepository()
options = {
"categorias": [],
"valor": ['vl_indicador'],
"agregacao": ['sum'],
"ordenacao": ['-vl_indicador'],
"where": ['eq-nu_competencia-2010'],
"joined": 'municipio'
}
self.assertRaises(
ValueError,
repo.find_joined_dataset,
options
)
def test_sql_injection_rejection(self):
''' Lança exceção se houver categorias com palavra bloqueada '''
repo = StubHadoopRepository()
options = {
"categorias": ['nm_indicador;select', 'nu_competencia', 'vl_indicador'],
"valor": ['vl_indicador'],
"agregacao": ['sum'],
"ordenacao": ['-vl_indicador'],
"where": ['eq-nu_competencia-2010'],
"joined": 'municipio'
}
self.assertRaises(
ValueError,
repo.find_dataset,
options
)
def test_no_join(self):
''' Lança exceção se não houver join nos parâmetros '''
repo = StubHadoopRepository()
options = {
"categorias": ['nm_indicador', 'nu_competencia', 'vl_indicador'],
"valor": ['vl_indicador'],
"agregacao": ['sum'],
"ordenacao": ['-vl_indicador'],
"where": ['eq-nu_competencia-2010']
}
self.assertRaises(
KeyError,
repo.find_joined_dataset,
options
)
def test_full_query_limit_offset(self):
''' Verifica correta formação da query com limit e offset'''
repo = StubHadoopRepository()
options = {
"categorias": ['nm_indicador', 'nu_competencia', 'vl_indicador', 'lat', 'long'],
"valor": ['vl_indicador'],
"agregacao": ['sum'],
"ordenacao": ['-nm_indicador'],
"where": ['eq-nu_competencia-2010'],
"pivot": None,
"limit": '1',
"offset": '5',
"calcs": None
}
result = repo.find_dataset(options)
expected = ('SELECT nm_indicador, nu_competencia, vl_indicador, '
'lat, long, sum(vl_indicador) AS agr_sum_vl_indicador '
'FROM indicadores WHERE nu_competencia = 2010 GROUP BY '
'nm_indicador, nu_competencia, vl_indicador, lat, long '
'ORDER BY nm_indicador DESC LIMIT 1 OFFSET 5')
self.assertEqual(result, expected)
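The `StubHadoopRepository` these tests exercise is defined outside this chunk. As a hypothetical sketch only (the real class may build its query differently), a minimal builder that satisfies `test_full_query`, `test_empty_cats`, and the injection-rejection checks could look like this; `build_query` and the `blocked` set are illustrative names, not the project's API:

```python
def build_query(options, table='indicadores'):
    """Builds a SELECT statement from an options dict like the ones above.

    Hypothetical sketch: validates categories, rejects blocked SQL words
    and semicolons, then assembles SELECT/WHERE/GROUP BY/ORDER BY parts.
    """
    cats = options['categorias']  # raises KeyError when absent
    if not cats:
        raise ValueError('Categories must not be empty')
    blocked = {'select', 'insert', 'update', 'delete', 'drop'}
    for cat in cats:
        if ';' in cat or cat.lower() in blocked:
            raise ValueError('Blocked word or character in category: %s' % cat)
    # Pair each aggregation with its value column, e.g. sum(vl_indicador).
    aggregations = ['%s(%s) AS agr_%s_%s' % (agg, val, agg, val)
                    for agg, val in zip(options['agregacao'], options['valor'])]
    query = 'SELECT %s FROM %s ' % (', '.join(cats + aggregations), table)
    clauses = []
    for clause in options.get('where') or []:
        op, field, value = clause.split('-')
        if op == 'eq':
            clauses.append('%s = %s' % (field, value))
    if clauses:
        query += 'WHERE %s ' % ' AND '.join(clauses)
    query += 'GROUP BY %s ' % ', '.join(cats)
    for order in options.get('ordenacao') or []:
        direction = 'DESC' if order.startswith('-') else 'ASC'
        query += 'ORDER BY %s %s ' % (order.lstrip('-'), direction)
    if options.get('limit'):
        query += 'LIMIT %s ' % options['limit']
    if options.get('offset'):
        query += 'OFFSET %s' % options['offset']
    return query
```

With the `test_full_query` options this produces exactly the SQL string the assertion above expects, trailing space included.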
| 34.886228 | 92 | 0.534329 | 512 | 5,826 | 5.884766 | 0.185547 | 0.105875 | 0.062064 | 0.074676 | 0.9001 | 0.881182 | 0.843677 | 0.801527 | 0.779954 | 0.748755 | 0 | 0.012529 | 0.34243 | 5,826 | 166 | 93 | 35.096386 | 0.773949 | 0.100069 | 0 | 0.710345 | 0 | 0 | 0.300522 | 0.03829 | 0 | 0 | 0 | 0 | 0.062069 | 1 | 0.062069 | false | 0 | 0.013793 | 0 | 0.089655 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
717f990267edb744b86537a44fb9e300e31d154c | 22,242 | py | Python | scripts/external_libs/elasticsearch7/elasticsearch/client/cat.py | timgates42/trex-core | efe94752fcb2d0734c83d4877afe92a3dbf8eccd | [
"Apache-2.0"
] | 956 | 2015-06-24T15:04:55.000Z | 2022-03-30T06:25:04.000Z | scripts/external_libs/elasticsearch7/elasticsearch/client/cat.py | angelyouyou/trex-core | fddf78584cae285d9298ef23f9f5c8725e16911e | [
"Apache-2.0"
] | 782 | 2015-09-20T15:19:00.000Z | 2022-03-31T23:52:05.000Z | scripts/external_libs/elasticsearch7/elasticsearch/client/cat.py | angelyouyou/trex-core | fddf78584cae285d9298ef23f9f5c8725e16911e | [
"Apache-2.0"
] | 429 | 2015-06-27T19:34:21.000Z | 2022-03-23T11:02:51.000Z | from .utils import NamespacedClient, query_params, _make_path, SKIP_IN_PATH
class CatClient(NamespacedClient):
    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'v')
    def aliases(self, name=None, params=None):
        """
        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-alias.html>`_

        :arg name: A comma-separated list of alias names to return
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'aliases', name), params=params)

    @query_params('bytes', 'size', 'format', 'h', 'help', 'local',
                  'master_timeout', 's', 'v')
    def allocation(self, node_id=None, params=None):
        """
        Allocation provides a snapshot of how shards have located around the
        cluster and the state of disk usage.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-allocation.html>`_

        :arg node_id: A comma-separated list of node IDs or names to limit the
            returned information
        :arg bytes: The unit in which to display byte values, valid choices are:
            'b', 'k', 'kb', 'm', 'mb', 'g', 'gb', 't', 'tb', 'p', 'pb'
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'allocation', node_id), params=params)

    @query_params('size', 'format', 'h', 'help', 'local', 'master_timeout',
                  's', 'v')
    def count(self, index=None, params=None):
        """
        Count provides quick access to the document count of the entire cluster,
        or individual indices.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-count.html>`_

        :arg index: A comma-separated list of index names to limit the returned
            information
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat', 'count',
            index), params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'ts',
                  'v')
    def health(self, params=None):
        """
        health is a terse, one-line representation of the same information from
        :meth:`~elasticsearch.client.cluster.ClusterClient.health` API

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-health.html>`_

        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg ts: Set to false to disable timestamping, default True
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/health',
            params=params)

    @query_params('help', 's')
    def help(self, params=None):
        """
        A simple help for the cat api.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat.html>`_

        :arg help: Return help information, default False
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        """
        return self.transport.perform_request('GET', '/_cat', params=params)

    @query_params('bytes', 'time', 'size', 'format', 'h', 'health', 'help',
                  'local', 'master_timeout', 'pri', 's', 'v')
    def indices(self, index=None, params=None):
        """
        The indices command provides a cross-section of each index.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-indices.html>`_

        :arg index: A comma-separated list of index names to limit the returned
            information
        :arg bytes: The unit in which to display byte values, valid choices are:
            'b', 'k', 'm', 'g'
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg health: A health status ("green", "yellow", or "red") to filter
            only indices matching the specified health status, default None,
            valid choices are: 'green', 'yellow', 'red'
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg pri: Set to true to return stats only for primary shards, default
            False
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'indices', index), params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'v')
    def master(self, params=None):
        """
        Displays the master's node ID, bound IP address, and node name.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-master.html>`_

        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/master',
            params=params)

    @query_params('format', 'full_id', 'h', 'help', 'local', 'master_timeout',
                  's', 'v')
    def nodes(self, params=None):
        """
        The nodes command shows the cluster topology.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-nodes.html>`_

        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg full_id: Return the full node ID instead of the shortened version
            (default: false)
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/nodes',
            params=params)

    @query_params('bytes', 'time', 'size', 'format', 'h', 'help',
                  'master_timeout', 's', 'v')
    def recovery(self, index=None, params=None):
        """
        recovery is a view of shard replication.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-recovery.html>`_

        :arg index: A comma-separated list of index names to limit the returned
            information
        :arg bytes: The unit in which to display byte values, valid choices are:
            'b', 'k', 'kb', 'm', 'mb', 'g', 'gb', 't', 'tb', 'p', 'pb'
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'recovery', index), params=params)

    @query_params('bytes', 'time', 'size', 'format', 'h', 'help', 'local',
                  'master_timeout', 's', 'v')
    def shards(self, index=None, params=None):
        """
        The shards command is the detailed view of what nodes contain which shards.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-shards.html>`_

        :arg index: A comma-separated list of index names to limit the returned
            information
        :arg bytes: The unit in which to display byte values, valid choices are:
            'b', 'k', 'kb', 'm', 'mb', 'g', 'gb', 't', 'tb', 'p', 'pb'
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'shards', index), params=params)

    @query_params('bytes', 'size', 'format', 'h', 'help', 's', 'v')
    def segments(self, index=None, params=None):
        """
        The segments command is the detailed view of Lucene segments per index.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-segments.html>`_

        :arg index: A comma-separated list of index names to limit the returned
            information
        :arg bytes: The unit in which to display byte values, valid choices are:
            'b', 'k', 'kb', 'm', 'mb', 'g', 'gb', 't', 'tb', 'p', 'pb'
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'segments', index), params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'v')
    def pending_tasks(self, params=None):
        """
        pending_tasks provides the same information as the
        :meth:`~elasticsearch.client.cluster.ClusterClient.pending_tasks` API
        in a convenient tabular format.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-pending-tasks.html>`_

        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/pending_tasks',
            params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'size',
                  'v')
    def thread_pool(self, thread_pool_patterns=None, params=None):
        """
        Get information about thread pools.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-thread-pool.html>`_

        :arg thread_pool_patterns: A comma-separated list of regular-expressions
            to filter the thread pools in the output
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg size: The multiplier in which to display values, valid choices are:
            '', 'k', 'm', 'g', 't', 'p'
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'thread_pool', thread_pool_patterns), params=params)

    @query_params('bytes', 'format', 'h', 'help', 'local', 'master_timeout',
                  's', 'v')
    def fielddata(self, fields=None, params=None):
        """
        Shows information about currently loaded fielddata on a per-node basis.

        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-fielddata.html>`_

        :arg fields: A comma-separated list of fields to return the fielddata
            size
        :arg bytes: The unit in which to display byte values, valid choices are:
            'b', 'k', 'kb', 'm', 'mb', 'g', 'gb', 't', 'tb', 'p', 'pb'
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'fielddata', fields), params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'v')
    def plugins(self, params=None):
        """
        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-plugins.html>`_

        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/plugins',
            params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'v')
    def nodeattrs(self, params=None):
        """
        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-nodeattrs.html>`_

        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/nodeattrs',
            params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'v')
    def repositories(self, params=None):
        """
        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-repositories.html>`_

        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node, default False
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/repositories',
            params=params)

    @query_params('format', 'h', 'help', 'ignore_unavailable', 'master_timeout',
                  's', 'v')
    def snapshots(self, repository, params=None):
        """
        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-snapshots.html>`_

        :arg repository: Name of repository from which to fetch the snapshot
            information
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg ignore_unavailable: Set to true to ignore unavailable snapshots,
            default False
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        if repository in SKIP_IN_PATH:
            raise ValueError("Empty value passed for a required argument 'repository'.")
        return self.transport.perform_request('GET', _make_path('_cat',
            'snapshots', repository), params=params)

    @query_params('actions', 'detailed', 'format', 'h', 'help', 'nodes',
                  'parent_task_id', 's', 'v')
    def tasks(self, params=None):
        """
        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/tasks.html>`_

        :arg actions: A comma-separated list of actions that should be returned.
            Leave empty to return all.
        :arg detailed: Return detailed task information (default: false)
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg nodes: A comma-separated list of node IDs or names to limit the
            returned information; use `_local` to return information from the
            node you're connecting to, leave empty to get information from all
            nodes
        :arg parent_task_id: Return tasks with specified parent task id. Set to -1
            to return all.
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', '/_cat/tasks',
            params=params)

    @query_params('format', 'h', 'help', 'local', 'master_timeout', 's', 'v')
    def templates(self, name=None, params=None):
        """
        `<https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-templates.html>`_

        :arg name: A pattern that returned template names must match
        :arg format: a short version of the Accept header, e.g. json, yaml
        :arg h: Comma-separated list of column names to display
        :arg help: Return help information, default False
        :arg local: Return local information, do not retrieve the state from
            master node (default: false)
        :arg master_timeout: Explicit operation timeout for connection to master
            node
        :arg s: Comma-separated list of column names or column aliases to sort
            by
        :arg v: Verbose mode. Display column headers, default False
        """
        return self.transport.perform_request('GET', _make_path('_cat',
            'templates', name), params=params)
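Every method above funnels whitelisted keyword arguments into the `params` dict via the `query_params` decorator from `elasticsearch.client.utils`, which is not shown in this chunk. As a hypothetical sketch of that pattern only (the real decorator handles more cases, e.g. `request_timeout` and value serialization), it can be approximated like this; `FakeCatClient` is an illustrative stand-in, not the library's class:

```python
import functools


def query_params(*accepted):
    """Sketch of a decorator that collects whitelisted kwargs into `params`."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, params=None, **kwargs):
            params = dict(params or {})
            for name in accepted:
                if name in kwargs:
                    value = kwargs.pop(name)
                    if value is not None:
                        params[name] = value
            if kwargs:
                raise TypeError('Unexpected kwargs: %r' % sorted(kwargs))
            return func(*args, params=params)
        return wrapper
    return decorator


class FakeCatClient:
    """Illustrative stand-in: returns the request tuple instead of sending it."""

    @query_params('format', 'h', 'v')
    def health(self, params=None):
        # A real client would call self.transport.perform_request here.
        return ('GET', '/_cat/health', params)
```

So `FakeCatClient().health(format='json', v=True)` yields `('GET', '/_cat/health', {'format': 'json', 'v': True})`, mirroring how `cat.health(format='json', v=True)` becomes `GET /_cat/health?format=json&v=true`.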
| 49.317073 | 102 | 0.636768 | 2,906 | 22,242 | 4.816242 | 0.079147 | 0.049728 | 0.064304 | 0.071449 | 0.811017 | 0.78894 | 0.778937 | 0.773721 | 0.77172 | 0.757788 | 0 | 0.000061 | 0.262746 | 22,242 | 450 | 103 | 49.426667 | 0.853458 | 0.651875 | 0 | 0.340659 | 0 | 0 | 0.203461 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.21978 | false | 0.010989 | 0.010989 | 0 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7182cb4c62fe624f43220099b6e4a58443bcd221 | 26,115 | py | Python | nova/virt/libvirt/volume/remotefs.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/virt/libvirt/volume/remotefs.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/virt/libvirt/volume/remotefs.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2014 Cloudbase Solutions Srl'
nl|'\n'
comment|'# All Rights Reserved.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'abc'
newline|'\n'
name|'import'
name|'functools'
newline|'\n'
name|'import'
name|'os'
newline|'\n'
name|'import'
name|'tempfile'
newline|'\n'
nl|'\n'
name|'from'
name|'oslo_concurrency'
name|'import'
name|'processutils'
newline|'\n'
name|'from'
name|'oslo_log'
name|'import'
name|'log'
name|'as'
name|'logging'
newline|'\n'
name|'from'
name|'oslo_utils'
name|'import'
name|'importutils'
newline|'\n'
name|'import'
name|'six'
newline|'\n'
nl|'\n'
name|'import'
name|'nova'
op|'.'
name|'conf'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'i18n'
name|'import'
name|'_LE'
op|','
name|'_LW'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'utils'
newline|'\n'
nl|'\n'
DECL|variable|LOG
name|'LOG'
op|'='
name|'logging'
op|'.'
name|'getLogger'
op|'('
name|'__name__'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|CONF
name|'CONF'
op|'='
name|'nova'
op|'.'
name|'conf'
op|'.'
name|'CONF'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|mount_share
name|'def'
name|'mount_share'
op|'('
name|'mount_path'
op|','
name|'export_path'
op|','
nl|'\n'
name|'export_type'
op|','
name|'options'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Mount a remote export to mount_path.\n\n :param mount_path: place where the remote export will be mounted\n :param export_path: path of the export to be mounted\n :export_type: remote export type (e.g. cifs, nfs, etc.)\n :options: A list containing mount options\n """'
newline|'\n'
name|'utils'
op|'.'
name|'execute'
op|'('
string|"'mkdir'"
op|','
string|"'-p'"
op|','
name|'mount_path'
op|')'
newline|'\n'
nl|'\n'
name|'mount_cmd'
op|'='
op|'['
string|"'mount'"
op|','
string|"'-t'"
op|','
name|'export_type'
op|']'
newline|'\n'
name|'if'
name|'options'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'mount_cmd'
op|'.'
name|'extend'
op|'('
name|'options'
op|')'
newline|'\n'
dedent|''
name|'mount_cmd'
op|'.'
name|'extend'
op|'('
op|'['
name|'export_path'
op|','
name|'mount_path'
op|']'
op|')'
newline|'\n'
nl|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'utils'
op|'.'
name|'execute'
op|'('
op|'*'
name|'mount_cmd'
op|','
name|'run_as_root'
op|'='
name|'True'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'processutils'
op|'.'
name|'ProcessExecutionError'
name|'as'
name|'exc'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'Device or resource busy'"
name|'in'
name|'six'
op|'.'
name|'text_type'
op|'('
name|'exc'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'_LW'
op|'('
string|'"%s is already mounted"'
op|')'
op|','
name|'export_path'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|unmount_share
dedent|''
dedent|''
dedent|''
name|'def'
name|'unmount_share'
op|'('
name|'mount_path'
op|','
name|'export_path'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Unmount a remote share.\n\n :param mount_path: remote export mount point\n :param export_path: path of the remote export to be unmounted\n """'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'utils'
op|'.'
name|'execute'
op|'('
string|"'umount'"
op|','
name|'mount_path'
op|','
name|'run_as_root'
op|'='
name|'True'
op|','
nl|'\n'
name|'attempts'
op|'='
number|'3'
op|','
name|'delay_on_retry'
op|'='
name|'True'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'processutils'
op|'.'
name|'ProcessExecutionError'
name|'as'
name|'exc'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'target is busy'"
name|'in'
name|'six'
op|'.'
name|'text_type'
op|'('
name|'exc'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"The share %s is still in use."'
op|','
name|'export_path'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'exception'
op|'('
name|'_LE'
op|'('
string|'"Couldn\'t unmount the share %s"'
op|')'
op|','
nl|'\n'
name|'export_path'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|RemoteFilesystem
dedent|''
dedent|''
dedent|''
name|'class'
name|'RemoteFilesystem'
op|'('
name|'object'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Represents actions that can be taken on a remote host\'s filesystem."""'
newline|'\n'
nl|'\n'
DECL|member|__init__
name|'def'
name|'__init__'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'transport'
op|'='
name|'CONF'
op|'.'
name|'libvirt'
op|'.'
name|'remote_filesystem_transport'
newline|'\n'
name|'cls_name'
op|'='
string|"'.'"
op|'.'
name|'join'
op|'('
op|'['
name|'__name__'
op|','
name|'transport'
op|'.'
name|'capitalize'
op|'('
op|')'
op|']'
op|')'
newline|'\n'
name|'cls_name'
op|'+='
string|"'Driver'"
newline|'\n'
name|'self'
op|'.'
name|'driver'
op|'='
name|'importutils'
op|'.'
name|'import_object'
op|'('
name|'cls_name'
op|')'
newline|'\n'
nl|'\n'
DECL|member|create_file
dedent|''
name|'def'
name|'create_file'
op|'('
name|'self'
op|','
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'None'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Creating file %s on remote host %s"'
op|','
name|'dst_path'
op|','
name|'host'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'driver'
op|'.'
name|'create_file'
op|'('
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'on_execute'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'on_completion'
op|')'
newline|'\n'
nl|'\n'
DECL|member|remove_file
dedent|''
name|'def'
name|'remove_file'
op|'('
name|'self'
op|','
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'None'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Removing file %s on remote host %s"'
op|','
name|'dst_path'
op|','
name|'host'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'driver'
op|'.'
name|'remove_file'
op|'('
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'on_execute'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'on_completion'
op|')'
newline|'\n'
nl|'\n'
DECL|member|create_dir
dedent|''
name|'def'
name|'create_dir'
op|'('
name|'self'
op|','
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'None'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Creating directory %s on remote host %s"'
op|','
name|'dst_path'
op|','
name|'host'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'driver'
op|'.'
name|'create_dir'
op|'('
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'on_execute'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'on_completion'
op|')'
newline|'\n'
nl|'\n'
DECL|member|remove_dir
dedent|''
name|'def'
name|'remove_dir'
op|'('
name|'self'
op|','
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'None'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Removing directory %s on remote host %s"'
op|','
name|'dst_path'
op|','
name|'host'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'driver'
op|'.'
name|'remove_dir'
op|'('
name|'host'
op|','
name|'dst_path'
op|','
name|'on_execute'
op|'='
name|'on_execute'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'on_completion'
op|')'
newline|'\n'
nl|'\n'
DECL|member|copy_file
dedent|''
name|'def'
name|'copy_file'
op|'('
name|'self'
op|','
name|'src'
op|','
name|'dst'
op|','
name|'on_execute'
op|'='
name|'None'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'None'
op|','
name|'compression'
op|'='
name|'True'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Copying file %s to %s"'
op|','
name|'src'
op|','
name|'dst'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'driver'
op|'.'
name|'copy_file'
op|'('
name|'src'
op|','
name|'dst'
op|','
name|'on_execute'
op|'='
name|'on_execute'
op|','
nl|'\n'
name|'on_completion'
op|'='
name|'on_completion'
op|','
nl|'\n'
name|'compression'
op|'='
name|'compression'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
@six.add_metaclass(abc.ABCMeta)
class RemoteFilesystemDriver(object):
    @abc.abstractmethod
    def create_file(self, host, dst_path, on_execute, on_completion):
        """Create file on the remote system.

        :param host: Remote host
        :param dst_path: Destination path
        :param on_execute: Callback method to store pid of process in cache
        :param on_completion: Callback method to remove pid of process from
            cache
        """

    @abc.abstractmethod
    def remove_file(self, host, dst_path, on_execute, on_completion):
        """Removes a file on a remote host.

        :param host: Remote host
        :param dst_path: Destination path
        :param on_execute: Callback method to store pid of process in cache
        :param on_completion: Callback method to remove pid of process from
            cache
        """

    @abc.abstractmethod
    def create_dir(self, host, dst_path, on_execute, on_completion):
        """Create directory on the remote system.

        :param host: Remote host
        :param dst_path: Destination path
        :param on_execute: Callback method to store pid of process in cache
        :param on_completion: Callback method to remove pid of process from
            cache
        """

    @abc.abstractmethod
    def remove_dir(self, host, dst_path, on_execute, on_completion):
        """Removes a directory on a remote host.

        :param host: Remote host
        :param dst_path: Destination path
        :param on_execute: Callback method to store pid of process in cache
        :param on_completion: Callback method to remove pid of process from
            cache
        """

    @abc.abstractmethod
    def copy_file(self, src, dst, on_execute, on_completion):
        """Copy file to/from remote host.

        Remote address must be specified in format:
            REM_HOST_IP_ADDRESS:REM_HOST_PATH
        For example:
            192.168.1.10:/home/file

        :param src: Source address
        :param dst: Destination path
        :param on_execute: Callback method to store pid of process in cache
        :param on_completion: Callback method to remove pid of process from
            cache
        """

class SshDriver(RemoteFilesystemDriver):

    def create_file(self, host, dst_path, on_execute, on_completion):
        utils.ssh_execute(host, 'touch', dst_path,
                          on_execute=on_execute, on_completion=on_completion)

    def remove_file(self, host, dst, on_execute, on_completion):
        utils.ssh_execute(host, 'rm', dst,
                          on_execute=on_execute, on_completion=on_completion)

    def create_dir(self, host, dst_path, on_execute, on_completion):
        utils.ssh_execute(host, 'mkdir', '-p', dst_path,
                          on_execute=on_execute, on_completion=on_completion)

    def remove_dir(self, host, dst, on_execute, on_completion):
        utils.ssh_execute(host, 'rm', '-rf', dst,
                          on_execute=on_execute, on_completion=on_completion)

    def copy_file(self, src, dst, on_execute, on_completion, compression):
        utils.execute('scp', src, dst,
                      on_execute=on_execute, on_completion=on_completion)

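The abstract-base-class pattern above (an interface declared with `abc.abstractmethod`, satisfied by concrete drivers such as `SshDriver`) can be exercised without any remote host. The sketch below uses hypothetical names (`RemoteFSDriver`, `RecordingDriver`) and records the command a driver would run instead of invoking ssh; it is an illustration of the pattern, not part of the original module.

```python
import abc

class RemoteFSDriver(abc.ABC):
    """Cut-down stand-in for the RemoteFilesystemDriver interface."""

    @abc.abstractmethod
    def create_dir(self, host, dst_path):
        """Create a directory on the remote host."""

class RecordingDriver(RemoteFSDriver):
    """Records the command it would run instead of executing it."""

    def __init__(self):
        self.calls = []

    def create_dir(self, host, dst_path):
        # SshDriver would effectively run: ssh <host> mkdir -p <dst_path>
        self.calls.append(('ssh', host, 'mkdir', '-p', dst_path))

driver = RecordingDriver()
driver.create_dir('192.168.1.10', '/srv/data')
```

Swapping `RecordingDriver` for a real driver changes only which commands run; callers of the interface are unaffected, which is the point of routing everything through the abstract base class.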
def create_tmp_dir(function):
    """Creates temporary directory for rsync purposes.

    Removes created directory in the end.
    """

    @functools.wraps(function)
    def decorated_function(*args, **kwargs):
        # Create directory
        tmp_dir_path = tempfile.mkdtemp()
        kwargs['tmp_dir_path'] = tmp_dir_path

        try:
            return function(*args, **kwargs)
        finally:
            # Remove directory
            utils.execute('rm', '-rf', tmp_dir_path)

    return decorated_function

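The decorator above stages every rsync operation through a throwaway directory that is injected as `kwargs['tmp_dir_path']` and cleaned up in a `finally` block. A minimal, standard-library-only sketch of the same pattern (illustrative names `with_tmp_dir` and `stage_marker`; `shutil.rmtree` replaces the `rm -rf` subprocess):

```python
import functools
import os
import shutil
import tempfile

def with_tmp_dir(function):
    """Pass a scratch directory to the wrapped function, always remove it."""
    @functools.wraps(function)
    def decorated(*args, **kwargs):
        tmp_dir_path = tempfile.mkdtemp()
        kwargs['tmp_dir_path'] = tmp_dir_path
        try:
            return function(*args, **kwargs)
        finally:
            # runs whether the wrapped function returns or raises
            shutil.rmtree(tmp_dir_path, ignore_errors=True)
    return decorated

@with_tmp_dir
def stage_marker(name, tmp_dir_path=None):
    path = os.path.join(tmp_dir_path, name)
    open(path, 'w').close()  # stand-in for the 'touch' the drivers run
    return os.path.exists(path), tmp_dir_path

created, used_dir = stage_marker('marker')
# 'created' is True inside the call; 'used_dir' no longer exists afterwards
```

The `finally` clause is what guarantees the scratch directory is removed even when the staged rsync command fails.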
class RsyncDriver(RemoteFilesystemDriver):

    @create_tmp_dir
    def create_file(self, host, dst_path, on_execute, on_completion,
                    **kwargs):
        dir_path = os.path.dirname(os.path.normpath(dst_path))

        # Create target dir inside temporary directory
        local_tmp_dir = os.path.join(kwargs['tmp_dir_path'],
                                     dir_path.strip(os.path.sep))
        utils.execute('mkdir', '-p', local_tmp_dir,
                      on_execute=on_execute, on_completion=on_completion)

        # Create file in directory
        file_name = os.path.basename(os.path.normpath(dst_path))
        local_tmp_file = os.path.join(local_tmp_dir, file_name)
        utils.execute('touch', local_tmp_file,
                      on_execute=on_execute, on_completion=on_completion)
        RsyncDriver._synchronize_object(kwargs['tmp_dir_path'],
                                        host, dst_path,
                                        on_execute=on_execute,
                                        on_completion=on_completion)

    @create_tmp_dir
    def remove_file(self, host, dst, on_execute, on_completion, **kwargs):
        # Delete file
        RsyncDriver._remove_object(kwargs['tmp_dir_path'], host, dst,
                                   on_execute=on_execute,
                                   on_completion=on_completion)

    @create_tmp_dir
    def create_dir(self, host, dst_path, on_execute, on_completion,
                   **kwargs):
        dir_path = os.path.normpath(dst_path)

        # Create target dir inside temporary directory
        local_tmp_dir = os.path.join(kwargs['tmp_dir_path'],
                                     dir_path.strip(os.path.sep))
        utils.execute('mkdir', '-p', local_tmp_dir,
                      on_execute=on_execute, on_completion=on_completion)
        RsyncDriver._synchronize_object(kwargs['tmp_dir_path'],
                                        host, dst_path,
                                        on_execute=on_execute,
                                        on_completion=on_completion)

    @create_tmp_dir
    def remove_dir(self, host, dst, on_execute, on_completion, **kwargs):
        # Remove remote directory's content
        utils.execute('rsync', '--archive', '--delete-excluded',
                      kwargs['tmp_dir_path'] + os.path.sep,
                      '%s:%s' % (host, dst),
                      on_execute=on_execute, on_completion=on_completion)

        # Delete empty directory
        RsyncDriver._remove_object(kwargs['tmp_dir_path'], host, dst,
                                   on_execute=on_execute,
                                   on_completion=on_completion)

    @staticmethod
    def _remove_object(src, host, dst, on_execute, on_completion):
        """Removes a file or empty directory on a remote host.

        :param src: Empty directory used for rsync purposes
        :param host: Remote host
        :param dst: Destination path
        :param on_execute: Callback method to store pid of process in cache
        :param on_completion: Callback method to remove pid of process from
            cache
        """
        utils.execute('rsync', '--archive', '--delete',
                      '--include', os.path.basename(os.path.normpath(dst)),
                      '--exclude', '*',
                      os.path.normpath(src) + os.path.sep,
                      '%s:%s' % (host, os.path.dirname(os.path.normpath(dst))),
                      on_execute=on_execute, on_completion=on_completion)

    @staticmethod
    def _synchronize_object(src, host, dst, on_execute, on_completion):
        """Creates a file or empty directory on a remote host.

        :param src: Empty directory used for rsync purposes
        :param host: Remote host
        :param dst: Destination path
        :param on_execute: Callback method to store pid of process in cache
        :param on_completion: Callback method to remove pid of process from
            cache
        """

        # For creating path on the remote host rsync --relative path must
        # be used. With a modern rsync on the sending side (beginning with
        # 2.6.7), you can insert a dot and a slash into the source path,
        # like this:
        #     rsync -avR /foo/./bar/baz.c remote:/tmp/
        # That would create /tmp/bar/baz.c on the remote machine.
        # (Note that the dot must be followed by a slash, so "/foo/."
        # would not be abbreviated.)
        relative_tmp_file_path = os.path.join(
            src, './',
            os.path.normpath(dst).strip(os.path.sep))

        # Do relative rsync local directory with remote root directory
        utils.execute('rsync', '--archive', '--relative', '--no-implied-dirs',
                      relative_tmp_file_path, '%s:%s' % (host, os.path.sep),
                      on_execute=on_execute, on_completion=on_completion)

    def copy_file(self, src, dst, on_execute, on_completion, compression):
        args = ['rsync', '--sparse', src, dst]
        if compression:
            args.append('--compress')
        utils.execute(*args,
                      on_execute=on_execute, on_completion=on_completion)
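The `/./` trick used by `_synchronize_object` is pure path construction, so it can be checked standalone. The sketch below (illustrative helper name `rsync_relative_source`) reproduces how the rsync source argument is built: the `/./` boundary tells `rsync --relative` to recreate only the path to the right of the dot on the remote side.

```python
import os

def rsync_relative_source(tmp_dir_path, dst):
    # Same construction as _synchronize_object: staging dir + './' +
    # destination path stripped of leading/trailing separators.
    return os.path.join(tmp_dir_path, './',
                        os.path.normpath(dst).strip(os.path.sep))

arg = rsync_relative_source('/tmp/stage', '/etc/app/conf.d/')
# arg == '/tmp/stage/./etc/app/conf.d'
# so 'rsync --relative <arg> host:/' creates /etc/app/conf.d remotely
```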
# taggle/utils/__init__.py (repo: tattaka/Taggle, license: MIT)
from .cross_validator import *
from .helper_functions import *
from .lr_finder import BaseLRFinder
from .metric_functions import *
from .metrics import *
# alnitak/tests/setup.py (repo: definitelyprobably/alnitak, license: MIT)
import re
import os
import sys
import shutil
import datetime
from alnitak import prog as Prog
from pathlib import Path
def create_api_c4_obj(email, key):
a = Prog.ApiCloudflare()
a.email = email
a.key = key
return a
def create_api_exec_obj(command, uid=None, gid=None):
if isinstance(command, list):
prog = command
else:
prog = [ command ]
return Prog.ApiExec(prog, uid=uid, gid=gid)
def create_tlsa_obj(param, port, protocol, domain, publish=True):
t = Prog.Tlsa(param, port, protocol, domain)
t.publish = publish
return t
def create_cert_obj(path, domain, name):
# name: x123.pem
# name2: x.pem
m = re.match(r'([^0-9]+)[^\.]+\.pem', name)
name2 = "{}.pem".format(m.group(1))
dane = Path(path) / 'dane' / domain / name2
live = Path(path) / 'live' / domain / name2
archive = Path(path) / 'archive' / domain / name
return Prog.Cert(dane, live, archive)
def create_target_obj(domain, api=None, certs=[], tlsas=[]):
t = Prog.Target(domain)
t.certs = certs
t.tlsa = tlsas
if api:
t.api = api.copy()
if t.api:
t.api.domain = domain
return t
def create_datapre_obj(domain, lineno, dane, live, archive, pending,
state=Prog.DataLineState.write):
d = Prog.DataPre(domain, lineno, dane, live, archive, pending)
d.state = state
return d
def create_datapost_obj(domain, lineno, tlsa, pending, time, hash,
state=Prog.DataLineState.write):
d = Prog.DataPost(domain, lineno, tlsa, pending, time, hash)
d.state = state
return d
def create_datadelete_obj(domain, lineno, tlsa, count, time, hash,
state=Prog.DataLineState.write):
d = Prog.DataDelete(domain, lineno, tlsa, count, time, hash)
d.state = state
return d
def create_prehook_data_obj():
return Prog.Data()
def install_binary_program():
with open("/usr/local/bin/alnitak.dns", "w") as f:
f.write('''#!/bin/sh
# if you find this program, delete it: it's just a testing program for the
# alnitak program and will be recreated as needed anyway.
cd `dirname $0`
echo "$@" > ./calls
test -z "$1" && exit 0
arg="$1"
test `expr "$arg" : "[0-9]*"` = `expr length "$arg"` && exit $arg
exit 1''')
def uninstall_binary_program():
os.unlink("/usr/local/bin/alnitak.dns")
class Init:
def __init__(self, parent="./.alnitak_tests", keep=False):
self.parent = Path(parent)
self.keep = keep
if self.parent.exists():
shutil.rmtree(str(self.parent))
self.parent.mkdir()
self.data = self.parent / 'data'
self.etc = self.parent / 'etc'
self.le = self.etc / 'le'
self.live = self.le / 'live'
self.archive = self.le / 'archive'
self.varlog = self.parent / 'var' / 'log'
self.varlock = self.parent / 'var' / 'lock'
self.datadir = self.parent / 'var' / 'alnitak'
self.bin = self.parent / 'bin'
self.dane = self.etc / 'alnitak'
self.live.mkdir(parents=True)
self.archive.mkdir()
self.varlog.mkdir(parents=True)
self.varlock.mkdir()
self.datadir.mkdir()
self.bin.mkdir()
self.dane.mkdir()
self.data.mkdir()
self.data.chmod(0o1777)
self.readme = self.parent / 'README'
self.config = self.etc / 'alnitak.conf'
self.config1 = self.etc / 'alnitak.conf.1'
self.config2 = self.etc / 'alnitak.conf.2'
self.config3 = self.etc / 'alnitak.conf.3'
self.config4 = self.etc / 'alnitak.conf.4'
self.config5 = self.etc / 'alnitak.conf.5'
self.config6 = self.etc / 'alnitak.conf.6'
self.config7 = self.etc / 'alnitak.conf.7'
self.config8 = self.etc / 'alnitak.conf.8'
#self.config9 = self.etc / 'alnitak.conf.9'
# was for testing locking
self.configX1 = self.etc / 'alnitak.conf.X1'
self.configX2 = self.etc / 'alnitak.conf.X2'
self.configX3 = self.etc / 'alnitak.conf.X3'
self.configX4 = self.etc / 'alnitak.conf.X4'
self.configX5 = self.etc / 'alnitak.conf.X5'
self.configX6 = self.etc / 'alnitak.conf.X6'
self.configX7 = self.etc / 'alnitak.conf.X7'
self.configX8 = self.etc / 'alnitak.conf.X8'
self.configX9 = self.etc / 'alnitak.conf.X9'
self.configX10 = self.etc / 'alnitak.conf.X10'
self.configX11 = self.etc / 'alnitak.conf.X11'
self.configX12 = self.etc / 'alnitak.conf.X12'
self.configX13 = self.etc / 'alnitak.conf.X13'
self.configX14 = self.etc / 'alnitak.conf.X14'
self.configX15 = self.etc / 'alnitak.conf.X15'
self.configX16 = self.etc / 'alnitak.conf.X16'
self.configX17 = self.etc / 'alnitak.conf.X17'
self.configX18 = self.etc / 'alnitak.conf.X18'
self.configX19 = self.etc / 'alnitak.conf.X19'
self.configX20 = self.etc / 'alnitak.conf.X20'
self.configX21 = self.etc / 'alnitak.conf.X21'
self.configX22 = self.etc / 'alnitak.conf.X22'
self.configC1 = self.etc / 'alnitak.conf.C1'
self.binary = self.bin / 'dns'
#self.binary_wait = self.bin / 'wait'
# was for testing locking
self.chain = '''-----BEGIN CERTIFICATE-----
MIIEkjCCA3qgAwIBAgIQCgFBQgAAAVOFc2oLheynCDANBgkqhkiG9w0BAQsFADA/
MSQwIgYDVQQKExtEaWdpdGFsIFNpZ25hdHVyZSBUcnVzdCBDby4xFzAVBgNVBAMT
DkRTVCBSb290IENBIFgzMB4XDTE2MDMxNzE2NDA0NloXDTIxMDMxNzE2NDA0Nlow
SjELMAkGA1UEBhMCVVMxFjAUBgNVBAoTDUxldCdzIEVuY3J5cHQxIzAhBgNVBAMT
GkxldCdzIEVuY3J5cHQgQXV0aG9yaXR5IFgzMIIBIjANBgkqhkiG9w0BAQEFAAOC
AQ8AMIIBCgKCAQEAnNMM8FrlLke3cl03g7NoYzDq1zUmGSXhvb418XCSL7e4S0EF
q6meNQhY7LEqxGiHC6PjdeTm86dicbp5gWAf15Gan/PQeGdxyGkOlZHP/uaZ6WA8
SMx+yk13EiSdRxta67nsHjcAHJyse6cF6s5K671B5TaYucv9bTyWaN8jKkKQDIZ0
Z8h/pZq4UmEUEz9l6YKHy9v6Dlb2honzhT+Xhq+w3Brvaw2VFn3EK6BlspkENnWA
a6xK8xuQSXgvopZPKiAlKQTGdMDQMc2PMTiVFrqoM7hD8bEfwzB/onkxEz0tNvjj
/PIzark5McWvxI0NHWQWM6r6hCm21AvA2H3DkwIDAQABo4IBfTCCAXkwEgYDVR0T
AQH/BAgwBgEB/wIBADAOBgNVHQ8BAf8EBAMCAYYwfwYIKwYBBQUHAQEEczBxMDIG
CCsGAQUFBzABhiZodHRwOi8vaXNyZy50cnVzdGlkLm9jc3AuaWRlbnRydXN0LmNv
bTA7BggrBgEFBQcwAoYvaHR0cDovL2FwcHMuaWRlbnRydXN0LmNvbS9yb290cy9k
c3Ryb290Y2F4My5wN2MwHwYDVR0jBBgwFoAUxKexpHsscfrb4UuQdf/EFWCFiRAw
VAYDVR0gBE0wSzAIBgZngQwBAgEwPwYLKwYBBAGC3xMBAQEwMDAuBggrBgEFBQcC
ARYiaHR0cDovL2Nwcy5yb290LXgxLmxldHNlbmNyeXB0Lm9yZzA8BgNVHR8ENTAz
MDGgL6AthitodHRwOi8vY3JsLmlkZW50cnVzdC5jb20vRFNUUk9PVENBWDNDUkwu
Y3JsMB0GA1UdDgQWBBSoSmpjBH3duubRObemRWXv86jsoTANBgkqhkiG9w0BAQsF
AAOCAQEA3TPXEfNjWDjdGBX7CVW+dla5cEilaUcne8IkCJLxWh9KEik3JHRRHGJo
uM2VcGfl96S8TihRzZvoroed6ti6WqEBmtzw3Wodatg+VyOeph4EYpr/1wXKtx8/
wApIvJSwtmVi4MFU5aMqrSDE6ea73Mj2tcMyo5jMd6jmeWUHK8so/joWUoHOUgwu
X4Po1QYz+3dszkDqMp4fklxBwXRsW10KXzPMTZ+sOPAveyxindmjkW8lGy+QsRlG
PfZ+G6Z6h7mjem0Y+iWlkYcV4PIWL1iwBi8saCbGS5jN2p8M+X+Q7UNKEkROb3N6
KOqkqm57TH2H3eDJAkSnh6/DNFu0Qg==
-----END CERTIFICATE-----
'''
self.domains = { 'a.com': { 'live': self.live / 'a.com',
'archive': self.archive / 'a.com',
'dane': self.dane / 'a.com',
'cert1': '''-----BEGIN CERTIFICATE-----
MIIClzCCAgCgAwIBAgIJALZnNWR3/N7aMA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYS5jb20wHhcN
MTkwMTI0MTQ0MjQ4WhcNMTkwMjIzMTQ0MjQ4WjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWEuY29tMIGfMA0GCSqGSIb3DQEB
AQUAA4GNADCBiQKBgQDgODlws5tZjrIX4J52erhkaBRrnCSwE24wVAedh4piIR4u
e5W3H/Z5DQo2nqKMhNo2magaaBNsDUGyRdg3H8nLLtPDPtAmN41VxMerWySDNwNn
43O7Y56iAODz1Nk7IHHzEOUZ9R/XhUB+KSxkkog9fo2T/lVFWJnqIcxWqPpp1QID
AQABo1MwUTAdBgNVHQ4EFgQUjbLDXEncsRtPPw7RGPtuX0r3+9UwHwYDVR0jBBgw
FoAUjbLDXEncsRtPPw7RGPtuX0r3+9UwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG
9w0BAQsFAAOBgQCv+jXJQ1MtABAMswyaI+S8jnhgSzy3KrsUAhyt+BekBHZtt4bI
2MjA7QgbI0vQT4D7g4WQLnW3QIaQ6c1lqO8h835bCWQHMR6H4orvWL4SJDBsvGiK
/+YrW4Mx0VVrwJbnbAJ+thUWPxswtOmI/NQsth1D6neL5TTwDBmQuowZtg==
-----END CERTIFICATE-----
''',
'cert2': '''-----BEGIN CERTIFICATE-----
MIIClzCCAgCgAwIBAgIJAIxNFzKotLgrMA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYS5jb20wHhcN
MTkwMTI0MTQ0MTQ3WhcNMTkwMjIzMTQ0MTQ3WjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWEuY29tMIGfMA0GCSqGSIb3DQEB
AQUAA4GNADCBiQKBgQDyZip4bFrvS6g1wr8M6jV72Z7d9vlZpOAxgG7g2mFesXaO
fybJbEQfCeUKvlH2pO/a7RuED1xGXJX/WtLudWiO0Hq9ExVSwhx9OOugKZFG2cWa
DTAGgON/G8Xr+OkDZXqUK7JOqgcy0NK4MbX9Pv28yWbQ7Kg+Bhw0Zx99O1l8YQID
AQABo1MwUTAdBgNVHQ4EFgQUQirr13cp3DH4s+riBG/sy7NnxmIwHwYDVR0jBBgw
FoAUQirr13cp3DH4s+riBG/sy7NnxmIwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG
9w0BAQsFAAOBgQA+zXSP+DG+tuZ5fKhZfSh58LntTVM3zZbzEA85LsVu3u+walX8
3XpzgKpBdzKZOgeUNldobv/OgkpIBjCVi3q4Qu44FrEwUlVDQDjGttSignF+ua2X
DLmeZk81zhKrghOOqx/IzBLcMWpOHgFHGXesVKatwcbenAUfv7oUe7XpKQ==
-----END CERTIFICATE-----
''',
'cert3': '''-----BEGIN CERTIFICATE-----
MIIFnDCCA4SgAwIBAgIJAM1Y9xn8zUn5MA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYS5jb20wHhcN
MTkwMTI0MTQwODM1WhcNMTkwMjIzMTQwODM1WjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWEuY29tMIICIjANBgkqhkiG9w0B
AQEFAAOCAg8AMIICCgKCAgEAwRa51lpxV3BvidBJ7Jh0cuMwdRVzAo2gSA0QznY+
SgX1L5VABq92RbH45pUQzCQCl1Unn2/q4zYhP16xIJw400UrMky8xIMDxphmhAPT
dnXm0fXkHMDPAbBARYhrMmd1l8C+kTGAqgqFViV4yk+203/siIfKOAsJUgLwRuaw
VKB6VMexZYnhq4njJk+ICPivPbpR6C+3cp8B5t/eAFhc5IHZreqTyYqVJnyb9RpF
W3xi7O0eFN8nVgIuCGo5Ci9m988I2HouoSZeP74PbvpuqOYRg6EfYSJuUeskzQcy
zb50Cxh4wB2x2LNQZD/1UyGHxXmwBRZaXUmrbJFUdhJAtAku9mVZHJiejgrl2jxe
VQsspVBh06FBkwSEfEoAjELC9Yp1JZHX6m0AzQI/gIwF3dpePkEOiboEOWe86C1a
AGoVjeoYkdNILqvBUOZwum5J83s+xgO2T0c3IYTwQwVor/it+1eRlyPcu7PWUH1K
0+p0e/Xvt6+CHCvokjcLM6O9q/lKh4NN+TIZo6COeFFKF+uEYXGLzOWzaiz16afP
lk0OkAnJZ1uTiraCGyvuOM7yG5v0VVprEJLq/6K45fLh70Lhae8CdMiJiyGANDhI
lLoh1OpI7LBIaSJflSVlzoGOGFA9hqy+ecpE14Ii3B+HiULDzbzYfDD1FqjJPkcV
AvkCAwEAAaNTMFEwHQYDVR0OBBYEFM5bXWTBirflDXxkUpUOUoGJzQagMB8GA1Ud
IwQYMBaAFM5bXWTBirflDXxkUpUOUoGJzQagMA8GA1UdEwEB/wQFMAMBAf8wDQYJ
KoZIhvcNAQELBQADggIBAHYw6qjM7ZJAr2k8+Vc9SSa1xqOhAmMjsa8OVkkH1m+1
l48NINwvcJ5CUHe/Rn5grFzatNy1XgiL4zhD4LmIZS2UhKIf16NphWYBDArg1ut+
8/agVojUneNTiYIkyL5+cPZK/58ChdKyeqZgKSfdzhiIP1nMzsigitOTtXk/fCxG
phucu7Ojpmu+tpPKwNLjengeLCOhnvjY8Xc94WVgGXY2OmCJnZQdBvK8ZA/PDtkb
6YQQWU6DWZGXQhlLmmKpZHuqEE/Fb2+0mB4vK99eAgdxr6Mv7cz3WCT/P1ikP6HA
6QJl1FR0fDjibeJbjJMaRjPr9oVf/aGoS8TP0FlAifVVNAoieZtXncD99bsT/Ltl
nN9lXmkk/pi5YHIwRZrWtIgn374MlaNuoHanmA6FgSqtAwl1Nv8xFbc4lAbq4VaA
eBPls0nRUjHS55nGaPZEhMI4J/9xUMDRn+wOMZHXisHgHPU/6MfiONDobVNxL4E+
h0+47d7xJfWNZmFPm/8Nk36J2R0mevY1ERLw5+sLPGnwJnGCLSG5mLsNBQtSvSTa
ixYV/T1qkcsoDoJTEnkclHskXDY9rN6iJvhMlV6cdR8QWYFH3vc5OPrIKHf+1x+x
zKOVPpVoRmQefxXd/ro/gHLoZO89YhzmjfYkQgjb9akbFvGLAHMX9IsGoMcqHWGZ
-----END CERTIFICATE-----
''' },
'b.com': { 'live': self.live / 'b.com',
'archive': self.archive / 'b.com',
'dane': self.dane / 'b.com',
'cert1': '''-----BEGIN CERTIFICATE-----
MIIClzCCAgCgAwIBAgIJAIzZ6W1XJFyCMA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYi5jb20wHhcN
MTkwMTI0MTQ0NTU5WhcNMTkwMjIzMTQ0NTU5WjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWIuY29tMIGfMA0GCSqGSIb3DQEB
AQUAA4GNADCBiQKBgQDMzl8BOijFQRb8gQOR0QLLr1jC3Wy7QSCOFEnm+LZUH6iE
BG6zSiA62j4zvEDRp1MGgqq+sUHBBr6laWXkR08ksQd6u7fVTmJLYD8sJc+wOu7y
UJOx9LJJuGqUR8XJ6Q0J6o0366YBUX3Ms48OPULtNjoCGcUJdyqwhbb6sfSKYQID
AQABo1MwUTAdBgNVHQ4EFgQU9ECkhmNz9f184LIMeuu18wr10Z8wHwYDVR0jBBgw
FoAU9ECkhmNz9f184LIMeuu18wr10Z8wDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG
9w0BAQsFAAOBgQCA5ieKY4XecJXhhmtx7KKPVBse9aTmlYHKoWWwpTb+fDuOXyO4
khVaEZmRXztt6iNgR2sgwUMVAZVjACVb16e8i3gnGFS6UjGzfhax8pAVzd/Xnjil
ej0Oa800hKUOsxAS+5YfR0WmEWaA2u5TfX34++V2lyrCqji42nLvZnH/nQ==
-----END CERTIFICATE-----
''',
'cert2': '''-----BEGIN CERTIFICATE-----
MIIClzCCAgCgAwIBAgIJAJJvdZ99n2EiMA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYi5jb20wHhcN
MTkwMTI0MTQ0NjIxWhcNMTkwMjIzMTQ0NjIxWjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWIuY29tMIGfMA0GCSqGSIb3DQEB
AQUAA4GNADCBiQKBgQC5N+IyRFmPOfBRffzzilmLU2LUhOG8g6eZ+k4qlU6z+gto
H1fu38nt7i0M2/3CuYkBoXKQwgiedP5vITdscUgLkClVLZJ2YaB5t5t429u7qZEC
Bs+JVRKLV/7fRfGLRdCiIsXOYMMeekn6yW/cVYI5WPiysFhs93dFyIBVAqkPPwID
AQABo1MwUTAdBgNVHQ4EFgQUIvZFklIHwNZFeIUns5/SBDDFoMYwHwYDVR0jBBgw
FoAUIvZFklIHwNZFeIUns5/SBDDFoMYwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG
9w0BAQsFAAOBgQAUqjTWBxGu7WJTkav2sE0fu2KxBLuGerSt8SQSOwYChjN/bKXl
ys/wHHGrrhZ4wp/78LUKzCj3VB4PzU878h6JEUgsdbk0oUvM4TuZctv6DdXVEGnI
rdRRLb7m3uy677OejoMzU4v+GnaKGdWJ4A7PFz09Vv1rsSneinkdVLsh3g==
-----END CERTIFICATE-----
''',
'cert3': '''-----BEGIN CERTIFICATE-----
MIIDnDCCAoSgAwIBAgIJALYVFUDe4r+3MA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYi5jb20wHhcN
MTkwMTI0MTQ0MzU2WhcNMTkwMjIzMTQ0MzU2WjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWIuY29tMIIBIjANBgkqhkiG9w0B
AQEFAAOCAQ8AMIIBCgKCAQEA0wmxH1uULHFbavYP8UrcDz9EbC2DZusuoC+iH83i
9a2F+yA2ReIX6vHXpobDf2h67S5CF85eca7B5viGnwil5IcbxkieWeK/98EtgpT6
4z6KU2tPFNL3NeRuyaKwh6AlwpHkE/JCbrqzJlqlaqrBOk+M3UuVbLLlFVHNdXIT
AfzFfWVWZSfdRSk3pMST2l57Y1U62RjiJGrRekxHCtzt3URmGhs2nREFq3CjyAtw
Q6+nbsBK1zcDzD7dTF7FQuc6LgVnfN91KztRVM4EERvKDIf2XwJbQ5wtX2j7ZVKe
+kudy4RhnKWyEqiIYpMgP8mgBZORem59uLjuiKFKM+5ABQIDAQABo1MwUTAdBgNV
HQ4EFgQUBWMGPLNrokNjuULhh/mfu/ixGJEwHwYDVR0jBBgwFoAUBWMGPLNrokNj
uULhh/mfu/ixGJEwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG9w0BAQsFAAOCAQEA
o1czBGmQW7mr2LVokwc2JTB1EH/2F5fFsxq5MhXDa7CKojDGjN4y/zwZuGW7/vk3
YSMsyx/fWP3HMFe73gaqMXUwAxcTNPXv5CRNA6BjP0uR6Fz04+6l9s3tCMozljQT
coccJNzzgdScaoLAi2fMSoLn2oB/Evjh0/ntktItTqpRuVeNFCj/wsKQhsx9qUZG
EiEUepJOeqHPgxx2nI8KQJMXkIgvgjkYWlIZJyPpCNXe0BRupeM1q6Sbtgw2aL3B
aiOnv52z0x1f4TzRGtSm35lttRj+GW0IQ3+8UCKPZ3QfjJRzFQxBrj9P5rHgDOyY
8yCjesxjTbFBQY+hiKdXbQ==
-----END CERTIFICATE-----
''', },
'c.com': { 'live': self.live / 'c.com',
'archive': self.archive / 'c.com',
'dane': self.dane / 'c.com',
'cert1': '''-----BEGIN CERTIFICATE-----
MIIClzCCAgCgAwIBAgIJAPxBy0EvAnFsMA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYy5jb20wHhcN
MTkwMTI0MTQ0NTM2WhcNMTkwMjIzMTQ0NTM2WjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWMuY29tMIGfMA0GCSqGSIb3DQEB
AQUAA4GNADCBiQKBgQCyeITh7Wxf8EtXTAx3ugQRouW/5caBd3gvFkyc7F/z1Bcp
mrC4giBanJrjOTxTzOqYRNcfRFMSXI5EjhLAEaVY5yN3dxLh38aEFCWYYYUCmE+5
4gI5vQQlV2XbjYNFMSeio5dBUOnx/H9ecfKaqVklpOV27SJ+cuERrGAHL4e0ZwID
AQABo1MwUTAdBgNVHQ4EFgQUqPaAw0ck30tqGJsg1tBsJmqN+n4wHwYDVR0jBBgw
FoAUqPaAw0ck30tqGJsg1tBsJmqN+n4wDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG
9w0BAQsFAAOBgQAO0eDZ8L4thFnk/8RbQexS3RmBq97WX1dSYpViCFGynXuxWkg5
v3NZ1c1KXIZF757RSEJgKjJ8ujE7v557gy2SeVzsdofyG23DK7G9F66UR344YQmK
qdPHHd8julm6pWJ0ZwTS0a+pSmH1jWlOYx1AQjKEe0RJtnnqbBd3F4vkNQ==
-----END CERTIFICATE-----
''',
'cert2': '''-----BEGIN CERTIFICATE-----
MIIClzCCAgCgAwIBAgIJAIC6WRCPtu8LMA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYy5jb20wHhcN
MTkwMTI0MTQ0NTExWhcNMTkwMjIzMTQ0NTExWjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWMuY29tMIGfMA0GCSqGSIb3DQEB
AQUAA4GNADCBiQKBgQCw2Zs4ZIgCfLKFNcM6m9YaU+aJkWOacWH7jglEGJemfLl0
z4EZpdJZMhuujDF4oeWPAjGf3ixTH1uxYVkikvvt3NrdH4rQQmWOqgYQ5ZMmNNMu
plrO0RNphYn+wL3Cq8Rv4eqj9LndIfrxKrgGHujUo1ig25ZzhdqgmR40bjjOnQID
AQABo1MwUTAdBgNVHQ4EFgQUOVLYJcwQbmJUX/6ZjiZ9zcKXhigwHwYDVR0jBBgw
FoAUOVLYJcwQbmJUX/6ZjiZ9zcKXhigwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG
9w0BAQsFAAOBgQCbBofLi+fOzD0iwjrTYaL4o0gp4U2H8i35N/1+Ku5mJUv5tWNE
ypeWjdubhk2bxBR/Q26/N2+uACR87+3CBVeEJyTfFdQtrgHjE5QHNn3Ju1aI2lZj
pUbuhIhvKzM48pgoqJMMLDWgtK+hxnb5nWW01hhb6JiXeJWCx3keqxt00A==
-----END CERTIFICATE-----
''',
'cert3': '''-----BEGIN CERTIFICATE-----
MIIDnDCCAoSgAwIBAgIJAJrsYibODwN4MA0GCSqGSIb3DQEBCwUAMGMxCzAJBgNV
BAYTAkdCMQ8wDQYDVQQIDAZMb25kb24xDzANBgNVBAcMBkxvbmRvbjEQMA4GA1UE
CgwHQWxuaXRhazEQMA4GA1UECwwHdGVzdGluZzEOMAwGA1UEAwwFYy5jb20wHhcN
MTkwMTI0MTQ0NDE2WhcNMTkwMjIzMTQ0NDE2WjBjMQswCQYDVQQGEwJHQjEPMA0G
A1UECAwGTG9uZG9uMQ8wDQYDVQQHDAZMb25kb24xEDAOBgNVBAoMB0Fsbml0YWsx
EDAOBgNVBAsMB3Rlc3RpbmcxDjAMBgNVBAMMBWMuY29tMIIBIjANBgkqhkiG9w0B
AQEFAAOCAQ8AMIIBCgKCAQEArfMaxw2afbW0l00wP3Lr6K+T66CesSmv5ZToJSA1
ok9cOn+4Dp195y8zCZPHizE3daw4Ymgvnv9g/Tt6NGVsOaI+b1hr5XGUzUyOOZK/
ffcOuoww7+SedbF94pVQ+cC5rUA1x4O/8Oavw638X6K+NQnfCgihI+mSJJ0hRBCQ
1lXmqW8MzXHq0XLsmh+PoADEQ8q9oSJ0h9NhcFoMUfi7yhRBNx/+U8UTqqCWyIsJ
LNuD2CC8oltSV3dFlSIRDKI8h8W2XBLxg5a7wncxCyn1emzs+QafbHizOa+fX7Qg
xLBPKqPZBDo4dgnWEvmsZEGkNz8Nsz+Aw2P7cbeDKeqDTwIDAQABo1MwUTAdBgNV
HQ4EFgQU54znG+BiUL+T6SiC2njIteSbuXwwHwYDVR0jBBgwFoAU54znG+BiUL+T
6SiC2njIteSbuXwwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG9w0BAQsFAAOCAQEA
SNgSFWaWq0iR+yERFqUd1Q/VGiRc4ytrL6BpK6jykFPD2PdPZBDhkhvvsMt8CgYA
pSGtxXPLoBTE4FIUa5pVN1B8kEA8vc9UIWQrGNhASCeAmSNMlS4fs5xgG+ISbrTc
dwjbMP9xeX2049qR9EI8Fl5AHMUJJga8RvBWCrmI7CJTalInEc7O67J4bSsGUAM1
eZJbCLmMdDRNkamSjnLoa6LOxs5c5OSR6RyDIp78TQaW+7R+HOJIzyfzFroi3GnE
Sxv5z+9HhPhN7IR608ODKcxxQBJMABBxHP1kcxvGEIYxm+jZLQXOv+H3bJiPMBqX
J8nV2gnPMIAaz0EVKPwmTA==
-----END CERTIFICATE-----
''', }, }
self.hash = {
'a.com': {
'cert1': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082029730820200a003020102020900b667356477fcdeda300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05612e636f6d301e170d3139303132343134343234385a170d3139303232333134343234385a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05612e636f6d30819f300d06092a864886f70d010101050003818d0030818902818100e0383970b39b598eb217e09e767ab86468146b9c24b0136e3054079d878a62211e2e7b95b71ff6790d0a369ea28c84da3699a81a68136c0d41b245d8371fc9cb2ed3c33ed026378d55c4c7ab5b2483370367e373bb639ea200e0f3d4d93b2071f310e519f51fd785407e292c6492883d7e8d93fe55455899ea21cc56a8fa69d50203010001a3533051301d0603551d0e041604148db2c35c49dcb11b4f3f0ed118fb6e5f4af7fbd5301f0603551d230418301680148db2c35c49dcb11b4f3f0ed118fb6e5f4af7fbd5300f0603551d130101ff040530030101ff300d06092a864886f70d01010b050003818100affa35c943532d00100cb30c9a23e4bc8e78604b3cb72abb14021cadf817a404766db786c8d8c8c0ed081b234bd04f80fb8385902e75b7408690e9cd65a8ef21f37e5b096407311e87e28aef58be1224306cbc688affe62b5b8331d1556bc096e76c027eb615163f1b30b4e988fcd42cb61d43ea778be534f00c1990ba8c19b6',
301: '4b6ebf5b27cb8b090a86c19943d9e2d799a3467ef18e8c866c605df46134677a',
302: 'b9bf7c30e2871d5efd022bd35c1b00bbebb54e264bf0ec10ec99d7a2355ac4de2b348be4ff8e2a1add2450fa16aaa74900bc9a2835d3e288edf3a5ccb29ae98e',
310: '30819f300d06092a864886f70d010101050003818d0030818902818100e0383970b39b598eb217e09e767ab86468146b9c24b0136e3054079d878a62211e2e7b95b71ff6790d0a369ea28c84da3699a81a68136c0d41b245d8371fc9cb2ed3c33ed026378d55c4c7ab5b2483370367e373bb639ea200e0f3d4d93b2071f310e519f51fd785407e292c6492883d7e8d93fe55455899ea21cc56a8fa69d50203010001',
311: 'f73e2add0cc95f0890594d203f2829d69f5288feb0431c81bb0336a18054148c',
312: 'b173cfcad24da5defa2f34b6aa1b0d66340b22b7b1541253e86ce9225d8b2478bd0f9fb443cc69f41351562f6b862ac3245c1f27721ca53e3a531df545292501',
},
'cert2': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082029730820200a0030201020209008c4d1732a8b4b82b300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05612e636f6d301e170d3139303132343134343134375a170d3139303232333134343134375a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05612e636f6d30819f300d06092a864886f70d010101050003818d0030818902818100f2662a786c5aef4ba835c2bf0cea357bd99eddf6f959a4e031806ee0da615eb1768e7f26c96c441f09e50abe51f6a4efdaed1b840f5c465c95ff5ad2ee75688ed07abd131552c21c7d38eba0299146d9c59a0d300680e37f1bc5ebf8e903657a942bb24eaa0732d0d2b831b5fd3efdbcc966d0eca83e061c34671f7d3b597c610203010001a3533051301d0603551d0e04160414422aebd77729dc31f8b3eae2046feccbb367c662301f0603551d23041830168014422aebd77729dc31f8b3eae2046feccbb367c662300f0603551d130101ff040530030101ff300d06092a864886f70d01010b0500038181003ecd748ff831beb6e6797ca8597d2879f0b9ed4d5337cd96f3100f392ec56edeefb06a55fcdd7a7380aa417732993a07943657686effce824a480630958b7ab842ee3816b1305255434038c6b6d4a282717eb9ad970cb99e664f35ce12ab82138eab1fc8cc12dc316a4e1e01471977ac54a6adc1c6de9c051fbfba147bb5e929',
301: '64adbb86d7ef684ead0a68f9ff16cbdc1ae9085bc294c1528c4a463557729c4c',
302: '33d1c6da2f59bbb37562078684bb549276fb8dd06b48ed3da0cd4015e24cde43b018925abbb43bfe5964891855385ffe15cd66c218b2077e5c4191b7f70f3478',
310: '30819f300d06092a864886f70d010101050003818d0030818902818100f2662a786c5aef4ba835c2bf0cea357bd99eddf6f959a4e031806ee0da615eb1768e7f26c96c441f09e50abe51f6a4efdaed1b840f5c465c95ff5ad2ee75688ed07abd131552c21c7d38eba0299146d9c59a0d300680e37f1bc5ebf8e903657a942bb24eaa0732d0d2b831b5fd3efdbcc966d0eca83e061c34671f7d3b597c610203010001',
311: '89d496304d899b10e3320cf3d398be642f57f6a32639d69be22c1ad16e86f113',
312: '9a603785be6226d765b2e2fc9f478cabe7d074e2d32e2af2f7eadcb1d7ed1806faa8bc447667c1f1a9dbcfe6b012da63fd13091d68951863d5699d455bba12ad',
},
'cert3': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082059c30820384a003020102020900cd58f719fccd49f9300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05612e636f6d301e170d3139303132343134303833355a170d3139303232333134303833355a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05612e636f6d30820222300d06092a864886f70d01010105000382020f003082020a0282020100c116b9d65a7157706f89d049ec987472e330751573028da0480d10ce763e4a05f52f954006af7645b1f8e69510cc24029755279f6feae336213f5eb1209c38d3452b324cbcc48303c698668403d37675e6d1f5e41cc0cf01b04045886b32677597c0be913180aa0a85562578ca4fb6d37fec8887ca380b095202f046e6b054a07a54c7b16589e1ab89e3264f8808f8af3dba51e82fb7729f01e6dfde00585ce481d9adea93c98a95267c9bf51a455b7c62eced1e14df2756022e086a390a2f66f7cf08d87a2ea1265e3fbe0f6efa6ea8e61183a11f61226e51eb24cd0732cdbe740b1878c01db1d8b350643ff5532187c579b005165a5d49ab6c9154761240b4092ef665591c989e8e0ae5da3c5e550b2ca55061d3a1419304847c4a008c42c2f58a752591d7ea6d00cd023f808c05ddda5e3e410e89ba043967bce82d5a006a158dea1891d3482eabc150e670ba6e49f37b3ec603b64f47372184f0430568aff8adfb57919723dcbbb3d6507d4ad3ea747bf5efb7af821c2be892370b33a3bdabf94a87834df93219a3a08e78514a17eb8461718bcce5b36a2cf5e9a7cf964d0e9009c9675b938ab6821b2bee38cef21b9bf4555a6b1092eaffa2b8e5f2e1ef42e169ef0274c8898b218034384894ba21d4ea48ecb04869225f952565ce818e18503d86acbe79ca44d78222dc1f878942c3cdbcd87c30f516a8c93e471502f90203010001a3533051301d0603551d0e04160414ce5b5d64c18ab7e50d7c6452950e528189cd06a0301f0603551d23041830168014ce5b5d64c18ab7e50d7c6452950e528189cd06a0300f0603551d130101ff040530030101ff300d06092a864886f70d01010b050003820201007630eaa8cced9240af693cf9573d4926b5c6a3a1026323b1af0e564907d66fb5978f0d20dc2f709e425077bf467e60ac5cdab4dcb55e088be33843e0b988652d9484a21fd7a3698566010c0ae0d6eb7ef3f6a05688d49de353898224c8be7e70f64aff9f0285d2b27aa6602927ddce18883f59cccec8a08ad393b5793f7c2c46a61b9cbbb3a3a66bbeb693cac0d2e37a781e2c23a19ef8d8f1773de165601976363a60899d941d06f2bc640fcf0ed91be98410594e8359919742194b9a62a9647baa104fc56f6fb4981e2f2bdf5e020771afa32fedccf75824ff3f58a43fa1c0e90265d454747c38e26de25b8c931a4633ebf6855ffda1a84bc4cfd0594089f555340a22799b579dc0fdf5bb13fcbb659cdf655e6924fe98b9607230459ad6b48827dfbe0c95a36ea076a7980e85812aad03097536ff3115b7389406eae156807813e5b349d15231d2e799c668f64484c23827ff7150c0d19fec0e3191d78ac1e01cf53fe8c7e238d0e86d53712f813e874fb8eddef125f58d66614f9bff0d937e89d91d267af6351112f0e7eb0b3c69f02671822d21b998bb0d050b52bd24da8b1615fd3d6a91cb280e825312791c947b245c363dacdea226f84c955e9c751f10598147def73938fac82877fed71fb1cca3953e956846641e7f15ddfeba3f8072e864ef3d621ce68df6244208dbf5a91b16f18b007317f48b06a0c72a1d6199',
301: '67f76c1b4945cb0eead61b9b5872624204a69e6f162c28e18f4c8f0a0cd9e879',
302: '18a1533c41e47db5f6f7c316e782f2101e8367f50dd01532bfb94719d015148f6b43367cd2e8f68ee7d0500699f1823dd12b7ecb2b7390ab14a16ffb94329188',
310: '30820222300d06092a864886f70d01010105000382020f003082020a0282020100c116b9d65a7157706f89d049ec987472e330751573028da0480d10ce763e4a05f52f954006af7645b1f8e69510cc24029755279f6feae336213f5eb1209c38d3452b324cbcc48303c698668403d37675e6d1f5e41cc0cf01b04045886b32677597c0be913180aa0a85562578ca4fb6d37fec8887ca380b095202f046e6b054a07a54c7b16589e1ab89e3264f8808f8af3dba51e82fb7729f01e6dfde00585ce481d9adea93c98a95267c9bf51a455b7c62eced1e14df2756022e086a390a2f66f7cf08d87a2ea1265e3fbe0f6efa6ea8e61183a11f61226e51eb24cd0732cdbe740b1878c01db1d8b350643ff5532187c579b005165a5d49ab6c9154761240b4092ef665591c989e8e0ae5da3c5e550b2ca55061d3a1419304847c4a008c42c2f58a752591d7ea6d00cd023f808c05ddda5e3e410e89ba043967bce82d5a006a158dea1891d3482eabc150e670ba6e49f37b3ec603b64f47372184f0430568aff8adfb57919723dcbbb3d6507d4ad3ea747bf5efb7af821c2be892370b33a3bdabf94a87834df93219a3a08e78514a17eb8461718bcce5b36a2cf5e9a7cf964d0e9009c9675b938ab6821b2bee38cef21b9bf4555a6b1092eaffa2b8e5f2e1ef42e169ef0274c8898b218034384894ba21d4ea48ecb04869225f952565ce818e18503d86acbe79ca44d78222dc1f878942c3cdbcd87c30f516a8c93e471502f90203010001',
311: '0da25074bc07d104653c29dd7ff993b421436cd34ccec15503741d50d4b0df3e',
312: '0d25edeec3bb81f82f84842854db3f31f8b97236517e70abb36215e5a2ef3d2b73026722e0b6d9222f1cad1b600fa7ed24eedb47467659fc48cbb92b9594dff5',
},
},
'b.com': {
'cert1': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082029730820200a0030201020209008cd9e96d57245c82300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05622e636f6d301e170d3139303132343134343535395a170d3139303232333134343535395a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05622e636f6d30819f300d06092a864886f70d010101050003818d0030818902818100ccce5f013a28c54116fc810391d102cbaf58c2dd6cbb41208e1449e6f8b6541fa884046eb34a203ada3e33bc40d1a7530682aabeb141c106bea56965e4474f24b1077abbb7d54e624b603f2c25cfb03aeef25093b1f4b249b86a9447c5c9e90d09ea8d37eba601517dccb38f0e3d42ed363a0219c509772ab085b6fab1f48a610203010001a3533051301d0603551d0e04160414f440a4866373f5fd7ce0b20c7aebb5f30af5d19f301f0603551d23041830168014f440a4866373f5fd7ce0b20c7aebb5f30af5d19f300f0603551d130101ff040530030101ff300d06092a864886f70d01010b05000381810080e6278a6385de7095e1866b71eca28f541b1ef5a4e69581caa165b0a536fe7c3b8e5f23b892155a1199915f3b6dea2360476b20c1431501956300255bd7a7bc8b78271854ba5231b37e16b1f29015cddfd79e38a57a3d0e6bcd3484a50eb31012fb961f4745a6116680daee537d7df8fbe576972ac2aa38b8da72ef6671ff9d',
301: 'e448c386abce2a8f5962b163720c6651738a12e5bb39123237e3b29913d802ea',
302: '92c057cfe645e6c176f0e944bedec49d02a9d295493510b5c7cc01bfe6370d2dd215f672c4e6f692f02fc5642e6b5154c877438820cb5f53d330a524315fc035',
310: '30819f300d06092a864886f70d010101050003818d0030818902818100ccce5f013a28c54116fc810391d102cbaf58c2dd6cbb41208e1449e6f8b6541fa884046eb34a203ada3e33bc40d1a7530682aabeb141c106bea56965e4474f24b1077abbb7d54e624b603f2c25cfb03aeef25093b1f4b249b86a9447c5c9e90d09ea8d37eba601517dccb38f0e3d42ed363a0219c509772ab085b6fab1f48a610203010001',
311: 'e5f88030480e359c17a33d2f02c42033b6eb5b482f182930087bb6fa8c701805',
312: 'e5024953edaccf482c438f2dcd1cee98b31094fd9f959c3dc071d6027cb58eec94cd8c9389915d2096b19469141cb29cbd63bbcdde03d8ceaa04f20c523149e3',
},
'cert2': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082029730820200a003020102020900926f759f7d9f6122300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05622e636f6d301e170d3139303132343134343632315a170d3139303232333134343632315a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05622e636f6d30819f300d06092a864886f70d010101050003818d0030818902818100b937e23244598f39f0517dfcf38a598b5362d484e1bc83a799fa4e2a954eb3fa0b681f57eedfc9edee2d0cdbfdc2b98901a17290c2089e74fe6f21376c71480b9029552d927661a079b79b78dbdbbba9910206cf8955128b57fedf45f18b45d0a222c5ce60c31e7a49fac96fdc55823958f8b2b0586cf77745c8805502a90f3f0203010001a3533051301d0603551d0e0416041422f645925207c0d645788527b39fd20430c5a0c6301f0603551d2304183016801422f645925207c0d645788527b39fd20430c5a0c6300f0603551d130101ff040530030101ff300d06092a864886f70d01010b05000381810014aa34d60711aeed625391abf6b04d1fbb62b104bb867ab4adf124123b060286337f6ca5e5cacff01c71abae1678c29ffbf0b50acc28f7541e0fcd4f3bf21e8911482c75b934a14bcce13b9972dbfa0dd5d51069c8add4512dbee6deecbaefb39e8e8333538bfe1a768a19d589e00ecf173d3d56fd6bb129de8a791d54bb21de',
301: '62ff6fe596af9cd6a50aa3ea213d9ddda51c117d1d415a2fbfb858101ef8d532',
302: 'f8f43871414fef37ea5bea19a1cd57e8f7a528276a4a6934cfe89bd44aaf22c6d1f0d81f9934fc5534bbea80cc281747aa4704688f54e8b0cac3841732a01726',
310: '30819f300d06092a864886f70d010101050003818d0030818902818100b937e23244598f39f0517dfcf38a598b5362d484e1bc83a799fa4e2a954eb3fa0b681f57eedfc9edee2d0cdbfdc2b98901a17290c2089e74fe6f21376c71480b9029552d927661a079b79b78dbdbbba9910206cf8955128b57fedf45f18b45d0a222c5ce60c31e7a49fac96fdc55823958f8b2b0586cf77745c8805502a90f3f0203010001',
311: '9be00418751c2889dc6688d5e88b52da8c1696add47b7073beda4c3bb0fad469',
312: '549477faac78a5892b351077051ed8b7eaac7457f079fa130835fb72d33baee4e552f6668526a66f680001589500d768a9adaf1114c041a2517b7cf9d1791c90',
},
'cert3': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082039c30820284a003020102020900b6151540dee2bfb7300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05622e636f6d301e170d3139303132343134343335365a170d3139303232333134343335365a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05622e636f6d30820122300d06092a864886f70d01010105000382010f003082010a0282010100d309b11f5b942c715b6af60ff14adc0f3f446c2d8366eb2ea02fa21fcde2f5ad85fb203645e217eaf1d7a686c37f687aed2e4217ce5e71aec1e6f8869f08a5e4871bc6489e59e2bff7c12d8294fae33e8a536b4f14d2f735e46ec9a2b087a025c291e413f2426ebab3265aa56aaac13a4f8cdd4b956cb2e51551cd75721301fcc57d65566527dd452937a4c493da5e7b63553ad918e2246ad17a4c470adceddd44661a1b369d1105ab70a3c80b7043afa76ec04ad73703cc3edd4c5ec542e73a2e05677cdf752b3b5154ce04111bca0c87f65f025b439c2d5f68fb65529efa4b9dcb84619ca5b212a8886293203fc9a00593917a6e7db8b8ee88a14a33ee40050203010001a3533051301d0603551d0e041604140563063cb36ba24363b942e187f99fbbf8b11891301f0603551d230418301680140563063cb36ba24363b942e187f99fbbf8b11891300f0603551d130101ff040530030101ff300d06092a864886f70d01010b05000382010100a357330469905bb9abd8b568930736253075107ff61797c5b31ab93215c36bb08aa230c68cde32ff3c19b865bbfef93761232ccb1fdf58fdc73057bbde06aa31753003171334f5efe4244d03a0633f4b91e85cf4e3eea5f6cded08ca3396341372871c24dcf381d49c6a82c08b67cc4a82e7da807f12f8e1d3f9ed92d22d4eaa51b9578d1428ffc2c29086cc7da946461221147a924e7aa1cf831c769c8f0a40931790882f8239185a52192723e908d5ded0146ea5e335aba49bb60c3668bdc16a23a7bf9db3d31d5fe13cd11ad4a6df996db518fe196d08437fbc50228f67741f8c9473150c41ae3f4fe6b1e00cec98f320a37acc634db141418fa188a7576d',
301: 'b51537af4a092f3de9a5821bc770228a4c942e3d0ff71cf347cd17c858c1a00c',
302: '1db9c27fca8cd5d6b7568badac25477dcdb4418fb544bb8873c3d79f6ce2234c7ddaa23a1916e4119eefde6843330ed18b8b3b181aea526748ad5e5c99a36806',
310: '30820122300d06092a864886f70d01010105000382010f003082010a0282010100d309b11f5b942c715b6af60ff14adc0f3f446c2d8366eb2ea02fa21fcde2f5ad85fb203645e217eaf1d7a686c37f687aed2e4217ce5e71aec1e6f8869f08a5e4871bc6489e59e2bff7c12d8294fae33e8a536b4f14d2f735e46ec9a2b087a025c291e413f2426ebab3265aa56aaac13a4f8cdd4b956cb2e51551cd75721301fcc57d65566527dd452937a4c493da5e7b63553ad918e2246ad17a4c470adceddd44661a1b369d1105ab70a3c80b7043afa76ec04ad73703cc3edd4c5ec542e73a2e05677cdf752b3b5154ce04111bca0c87f65f025b439c2d5f68fb65529efa4b9dcb84619ca5b212a8886293203fc9a00593917a6e7db8b8ee88a14a33ee40050203010001',
311: '20a8da331b07bae5b4aa717d63c3734d48ecfadb7699a7fdce256afbd315903b',
312: '83267dab049bb6f7a04da73b6f63c3de0ae146f3d28cd2697e1bb15c94b5a419dae9268b2143141fc09b4029a937e385ab49262e7962ecc96044d44200de63e6',
},
},
'c.com': {
'cert1': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082029730820200a003020102020900fc41cb412f02716c300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05632e636f6d301e170d3139303132343134343533365a170d3139303232333134343533365a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05632e636f6d30819f300d06092a864886f70d010101050003818d0030818902818100b27884e1ed6c5ff04b574c0c77ba0411a2e5bfe5c68177782f164c9cec5ff3d417299ab0b882205a9c9ae3393c53ccea9844d71f4453125c8e448e12c011a558e723777712e1dfc684142598618502984fb9e20239bd04255765db8d83453127a2a3974150e9f1fc7f5e71f29aa95925a4e576ed227e72e111ac60072f87b4670203010001a3533051301d0603551d0e04160414a8f680c34724df4b6a189b20d6d06c266a8dfa7e301f0603551d23041830168014a8f680c34724df4b6a189b20d6d06c266a8dfa7e300f0603551d130101ff040530030101ff300d06092a864886f70d01010b0500038181000ed1e0d9f0be2d8459e4ffc45b41ec52dd1981abded65f57526295620851b29d7bb15a4839bf7359d5cd4a5c8645ef9ed14842602a327cba313bbf9e7b832d92795cec7687f21b6dc32bb1bd17ae94477e3861098aa9d3c71ddf23ba59baa562746704d2d1afa94a61f58d694e631d404232847b4449b679ea6c1777178be435',
301: '19b8a37e7217b04fe2a06462b01058ef17673cde32f98c314688f2f041edffc1',
302: 'e9a4602874fbac163ec7f6691b355e8bf48395f9ff5ad507a1ea5b6baaf2f0d7e6bce297f6cc3374b6cda984acd2831bc61ab9b94948980fae50faae5a19f174',
310: '30819f300d06092a864886f70d010101050003818d0030818902818100b27884e1ed6c5ff04b574c0c77ba0411a2e5bfe5c68177782f164c9cec5ff3d417299ab0b882205a9c9ae3393c53ccea9844d71f4453125c8e448e12c011a558e723777712e1dfc684142598618502984fb9e20239bd04255765db8d83453127a2a3974150e9f1fc7f5e71f29aa95925a4e576ed227e72e111ac60072f87b4670203010001',
311: 'b9d0f21a2c0eab9254bdd530c503ad3aa33354bb147d6d054e2c70a1b208e938',
312: '2fa354783d4fe1b926f1976e8169deb75e1ca41ef6234ddfead41d0b9854162b0d54df335060852436f023444c40af32575e58511a0b31a137199f5737589dce',
},
'cert2': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082029730820200a00302010202090080ba59108fb6ef0b300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05632e636f6d301e170d3139303132343134343531315a170d3139303232333134343531315a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05632e636f6d30819f300d06092a864886f70d010101050003818d0030818902818100b0d99b386488027cb28535c33a9bd61a53e68991639a7161fb8e09441897a67cb974cf8119a5d259321bae8c3178a1e58f02319fde2c531f5bb161592292fbeddcdadd1f8ad042658eaa0610e5932634d32ea65aced113698589fec0bdc2abc46fe1eaa3f4b9dd21faf12ab8061ee8d4a358a0db967385daa0991e346e38ce9d0203010001a3533051301d0603551d0e041604143952d825cc106e62545ffe998e267dcdc2978628301f0603551d230418301680143952d825cc106e62545ffe998e267dcdc2978628300f0603551d130101ff040530030101ff300d06092a864886f70d01010b0500038181009b0687cb8be7cecc3d22c23ad361a2f8a34829e14d87f22df937fd7e2aee66254bf9b56344ca97968ddb9b864d9bc4147f436ebf376fae00247cefedc20557842724df15d42dae01e3139407367dc9bb5688da5663a546ee84886f2b3338f29828a8930c2c35a0b4afa1c676f99d65b4d6185be89897789582c7791eab1b74d0',
301: '06a7e55b1525c14f1536b1fa56bd32c4a8fa019893192a781dc989bf41814afc',
302: 'fa297515255d44b3f97327e47f4f1b07c363f204265b7dbbd41c2d3073c3def5c2f565f4c3f3046bf8cf2d602b7e0f911a15b5f6d815b8281d2288e3de857aa5',
310: '30819f300d06092a864886f70d010101050003818d0030818902818100b0d99b386488027cb28535c33a9bd61a53e68991639a7161fb8e09441897a67cb974cf8119a5d259321bae8c3178a1e58f02319fde2c531f5bb161592292fbeddcdadd1f8ad042658eaa0610e5932634d32ea65aced113698589fec0bdc2abc46fe1eaa3f4b9dd21faf12ab8061ee8d4a358a0db967385daa0991e346e38ce9d0203010001',
311: '8260378e9c69fcbd165af31e12c915c41fe013e892a847a88f4f9e893ff57f24',
312: 'd7ce961fa365f74512f01ae6c766ac9992f886363cea48bbb8d99a1848f07d8556cd99e9ab1b9b3a5c31db75d1122e7b9aebd6b131e46944f2161a5feac85fc0',
},
'cert3': {
200: '308204923082037aa00302010202100a0141420000015385736a0b85eca708300d06092a864886f70d01010b0500303f31243022060355040a131b4469676974616c205369676e617475726520547275737420436f2e311730150603550403130e44535420526f6f74204341205833301e170d3136303331373136343034365a170d3231303331373136343034365a304a310b300906035504061302555331163014060355040a130d4c6574277320456e6372797074312330210603550403131a4c6574277320456e637279707420417574686f7269747920583330820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001a382017d3082017930120603551d130101ff040830060101ff020100300e0603551d0f0101ff040403020186307f06082b0601050507010104733071303206082b060105050730018626687474703a2f2f697372672e747275737469642e6f6373702e6964656e74727573742e636f6d303b06082b06010505073002862f687474703a2f2f617070732e6964656e74727573742e636f6d2f726f6f74732f647374726f6f74636178332e703763301f0603551d23041830168014c4a7b1a47b2c71fadbe14b9075ffc4156085891030540603551d20044d304b3008060667810c010201303f060b2b0601040182df130101013030302e06082b060105050702011622687474703a2f2f6370732e726f6f742d78312e6c657473656e63727970742e6f7267303c0603551d1f043530333031a02fa02d862b687474703a2f2f63726c2e6964656e74727573742e636f6d2f445354524f4f544341583343524c2e63726c301d0603551d0e04160414a84a6a63047dddbae6d139b7a64565eff3a8eca1300d06092a864886f70d01010b05000382010100dd33d711f3635838dd1815fb0955be7656b97048a56947277bc2240892f15a1f4a1229372474511c6268b8cd957067e5f7a4bc4e2851cd9be8ae879dead8ba5aa1019adcf0dd6a1d6ad83e57239ea61e04629affd705cab71f3fc00a48bc94b0b66562e0c154e5a32aad20c4e9e6bbdcc8f6b5c332a398cc77a8e67965072bcb28fe3a165281ce520c2e5f83e8d50633fb776cce40ea329e1f925c41c1746c5b5d0a5f33cc4d9fac38f02f7b2c629dd9a3916f251b2f90b119463df67e1ba67a87b9a37a6d18fa25a5918715e0f2162f58b0062f2c6826c64b98cdda9f0cf97f90ed434a12444e6f737a28eaa4aa6e7b4c7d87dde0c90244a787afc3345bb442',
201: '25847d668eb4f04fdd40b12b6b0740c567da7d024308eb6c2c96fe41d9de218d',
202: '2e1e12dacb350e69317a7f37d769f46f16f437cf8d392319279c93515e5600baed3d3acd5dc83b673e8c60cf7fba0dce00a4d162a3b966a3ebf72487c376fca0',
210: '30820122300d06092a864886f70d01010105000382010f003082010a02820101009cd30cf05ae52e47b7725d3783b3686330ead735261925e1bdbe35f170922fb7b84b4105aba99e350858ecb12ac468870ba3e375e4e6f3a76271ba7981601fd7919a9ff3d0786771c8690e9591cffee699e9603c48cc7eca4d7712249d471b5aebb9ec1e37001c9cac7ba705eace4aebbd41e53698b9cbfd6d3c9668df232a42900c867467c87fa59ab8526114133f65e98287cbdbfa0e56f68689f3853f9786afb0dc1aef6b0d95167dc42ba065b299043675806bac4af31b9049782fa2964f2a20252904c674c0d031cd8f31389516baa833b843f1b11fc3307fa27931133d2d36f8e3fcf2336ab93931c5afc48d0d1d641633aafa8429b6d40bc0d87dc3930203010001',
211: '60b87575447dcba2a36b7d11ac09fb24a9db406fee12d2cc90180517616e8a18',
212: '774fad8c9a6afc2bdb44faba8390d213ae592fb0d56c5dfab152284e334d7cd6abd05799236e7aa6266edf81907c60404c57ee54c10a3a82fcc2a9146629b140',
300: '3082039c30820284a0030201020209009aec6226ce0f0378300d06092a864886f70d01010b05003063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05632e636f6d301e170d3139303132343134343431365a170d3139303232333134343431365a3063310b3009060355040613024742310f300d06035504080c064c6f6e646f6e310f300d06035504070c064c6f6e646f6e3110300e060355040a0c07416c6e6974616b3110300e060355040b0c0774657374696e67310e300c06035504030c05632e636f6d30820122300d06092a864886f70d01010105000382010f003082010a0282010100adf31ac70d9a7db5b4974d303f72ebe8af93eba09eb129afe594e8252035a24f5c3a7fb80e9d7de72f330993c78b313775ac3862682f9eff60fd3b7a34656c39a23e6f586be57194cd4c8e3992bf7df70eba8c30efe49e75b17de29550f9c0b9ad4035c783bff0e6afc3adfc5fa2be3509df0a08a123e992249d21441090d655e6a96f0ccd71ead172ec9a1f8fa000c443cabda1227487d361705a0c51f8bbca1441371ffe53c513aaa096c88b092cdb83d820bca25b525777459522110ca23c87c5b65c12f18396bbc277310b29f57a6cecf9069f6c78b339af9f5fb420c4b04f2aa3d9043a387609d612f9ac6441a4373f0db33f80c363fb71b78329ea834f0203010001a3533051301d0603551d0e04160414e78ce71be06250bf93e92882da78c8b5e49bb97c301f0603551d23041830168014e78ce71be06250bf93e92882da78c8b5e49bb97c300f0603551d130101ff040530030101ff300d06092a864886f70d01010b0500038201010048d812156696ab4891fb211116a51dd50fd51a245ce32b6b2fa0692ba8f29053c3d8f74f6410e1921befb0cb7c0a0600a521adc573cba014c4e052146b9a5537507c90403cbdcf5421642b18d84048278099234c952e1fb39c601be2126eb4dc7708db30ff71797db4e3da91f4423c165e401cc5092606bc46f0560ab988ec22536a522711ceceebb2786d2b0650033579925b08b98c74344d91a9928e72e86ba2cec6ce5ce4e491e91c83229efc4d0696fbb47e1ce248cf27f316ba22dc69c44b1bf9cfef4784f84dec847ad3c38329cc7140124c0010711cfd64731bc61086319be8d92d05cebfe1f76c988f301a9727c9d5da09cf30801acf411528fc264c',
301: 'a33dd4789fa25280d9bbeba11e15957c917347f863d79b4e75011b7413c9f49e',
302: '7c6609b5ca9e76cd38983b6094b4cbfcbfc69f8ac1c664ac1ebc6ec2ca51ad120b5a6ddaab856d471c2f8719453089b83dfb12096f02da683372e3562ec3cbfc',
310: '30820122300d06092a864886f70d01010105000382010f003082010a0282010100adf31ac70d9a7db5b4974d303f72ebe8af93eba09eb129afe594e8252035a24f5c3a7fb80e9d7de72f330993c78b313775ac3862682f9eff60fd3b7a34656c39a23e6f586be57194cd4c8e3992bf7df70eba8c30efe49e75b17de29550f9c0b9ad4035c783bff0e6afc3adfc5fa2be3509df0a08a123e992249d21441090d655e6a96f0ccd71ead172ec9a1f8fa000c443cabda1227487d361705a0c51f8bbca1441371ffe53c513aaa096c88b092cdb83d820bca25b525777459522110ca23c87c5b65c12f18396bbc277310b29f57a6cecf9069f6c78b339af9f5fb420c4b04f2aa3d9043a387609d612f9ac6441a4373f0db33f80c363fb71b78329ea834f0203010001',
311: 'b4e5e7da9a76ab60fee17a736beaaf21090038f76468e9d46e853de0259d22ad',
312: '3596ab9e1505a1c7a8725dffe6c87d672a4004ce0db3152a88420fa0ad82054d85e35f151b03f7382f21471571434bdc54a2a9db8ab13b53ca10b40c6324fc04',
},
},
}
self.renew_a_at = 1
self.renew_b_at = 1
self.renew_c_at = 1
with open(str(self.readme), 'w') as file:
file.write('This directory and its contents have been created by the testing scripts. You can safely delete this directory if you wish; the tests will recreate the directory and its contents if it is missing.\n')
with open(str(self.live / 'taint'), 'w') as file:
file.write("0")
with open(str(self.archive / 'taint'), 'w') as file:
file.write("0")
for d in self.domains:
live_d = self.domains[d]['live']
archive_d = self.domains[d]['archive']
live_d.mkdir()
archive_d.mkdir()
with open(str(live_d / 'taint'), 'w') as file:
file.write("1")
with open(str(archive_d / 'cert1.pem'), 'w') as file:
file.write(self.domains[d]['cert1'])
with open(str(archive_d / 'cert2.pem'), 'w') as file:
file.write(self.domains[d]['cert2'])
with open(str(archive_d / 'cert3.pem'), 'w') as file:
file.write(self.domains[d]['cert3'])
with open(str(archive_d / 'chain1.pem'), 'w') as file:
file.write(self.chain)
with open(str(archive_d / 'chain2.pem'), 'w') as file:
file.write(self.chain)
with open(str(archive_d / 'chain3.pem'), 'w') as file:
file.write(self.chain)
with open(str(archive_d / 'privkey1.pem'), 'w') as file:
file.write('\n')
with open(str(archive_d / 'privkey2.pem'), 'w') as file:
file.write('\n')
with open(str(archive_d / 'privkey3.pem'), 'w') as file:
file.write('\n')
with open(str(archive_d / 'fullchain1.pem'), 'w') as file:
file.write(self.domains[d]['cert1'] + self.chain)
with open(str(archive_d / 'fullchain2.pem'), 'w') as file:
file.write(self.domains[d]['cert2'] + self.chain)
with open(str(archive_d / 'fullchain3.pem'), 'w') as file:
file.write(self.domains[d]['cert3'] + self.chain)
# symlinks: point the live/ files at the version-1 files in archive/
Path(live_d / 'cert.pem').symlink_to(
'../../archive/{}/cert1.pem'.format(d))
Path(live_d / 'chain.pem').symlink_to(
'../../archive/{}/chain1.pem'.format(d))
Path(live_d / 'fullchain.pem').symlink_to(
'../../archive/{}/fullchain1.pem'.format(d))
Path(live_d / 'privkey.pem').symlink_to(
'../../archive/{}/privkey1.pem'.format(d))
with open(str(self.binary), 'w') as file:
file.write('''#!/bin/sh
# If you find this program, you can safely delete it: it is just a test
# helper for the alnitak program and will be recreated as needed anyway.
cd `dirname $0`
echo ":$TLSA_OPERATION:$TLSA_PARAM:$TLSA_HASH:$TLSA_LIVE_HASH:$@" >> ../data/calls
echo "`whoami`:`groups`" >> ../data/user
for i in $@
do
case "$i" in
--is-up=*)
up=`echo "$i" | cut -c "9-"`
;;
--is-up)
up="-"
;;
--not-up=*)
nup=`echo "$i" | cut -c "10-"`
;;
--not-up)
nup="-"
;;
--fail-publish)
failpub=2
;;
--fail-publish-noret)
failpubnoret=200
;;
--fail-delete)
faildel=2
;;
--fail-delete-noret)
faildelnoret=200
;;
[0-9]*)
exit $i
;;
esac
done
if test "$TLSA_OPERATION" = "publish"
then
if test -n "$up"
then
if test "$up" = "-"
then
exit 1
else
oIFS="$IFS"
IFS=:
for k in $up
do
IFS="$oIFS"
if test "$k" = "$TLSA_PARAM"
then
exit 1
fi
done
test -n "$failpub" && exit $failpub
test -n "$failpubnoret" && exit $failpubnoret
exit 0
fi
fi
test -n "$failpub" && exit $failpub
test -n "$failpubnoret" && exit $failpubnoret
exit 0
elif test "$TLSA_OPERATION" = "delete"
then
if test -n "$nup"
then
if test "$nup" = "-"
then
exit 1
else
oIFS="$IFS"
IFS=:
for k in $nup
do
IFS="$oIFS"
if test "$k" = "$TLSA_PARAM"
then
exit 1
fi
done
test -n "$faildel" && exit $faildel
test -n "$faildelnoret" && exit $faildelnoret
exit 0
fi
fi
test -n "$faildel" && exit $faildel
test -n "$faildelnoret" && exit $faildelnoret
exit 0
fi
exit 10
''')
# was for testing locking
#with open(str(self.binary), 'w') as file:
# file.write('''#!/bin/sh
#sleep 2
#''')
self.binary.chmod(0o755)
#self.binary_wait.chmod(0o755)
# was for testing locking
with open(str(self.config), 'w') as file:
file.write('''
# default config file
#
api=exec {}
[a.com] ##target
tlsa = 311 12725
tlsa = 201 12725
[b.com] ## target
tlsa = 311 12780 udp
tlsa = 201 12780 sctp A.b.com
[c.com]
tlsa = 311 12722 A.c.com #tlsa
tlsa = 311 12723 B.c.com'''.format(self.binary))
with open(str(self.config1), 'w') as file:
file.write('''
# example valid config file
#
api=exec {}
[a.com] ##target
tlsa = 200 12725
tlsa = 201 12725
tlsa = 202 12725
tlsa = 210 12725
tlsa = 211 12725
tlsa = 212 12725
tlsa = 300 12725
tlsa = 301 12725
tlsa = 302 12725
tlsa = 310 12725
tlsa = 311 12725
tlsa = 312 12725
'''.format(self.binary))
with open(str(self.config2), 'w') as file:
file.write('''
# example valid config file
#
api=exec {}
[a.com] ##target
tlsa = 201 12725
tlsa = 211 12725
tlsa = 301 12725
tlsa = 311 12725
'''.format(self.binary))
with open(str(self.config3), 'w') as file:
file.write('''
# default config file
#
api=exec uid:nobody {}
[a.com] ##target
tlsa = 311 12725
tlsa = 201 12725
[b.com] ## target
tlsa = 311 12780 udp
tlsa = 201 12780 sctp A.b.com
[c.com]
tlsa = 311 12722 A.c.com #tlsa
tlsa = 311 12723 B.c.com'''.format(self.binary))
with open(str(self.config4), 'w') as file:
file.write('''
# example valid config file
#
api=exec uid:nobody {}
[a.com] ##target
tlsa = 200 12725
tlsa = 201 12725
tlsa = 202 12725
tlsa = 210 12725
tlsa = 211 12725
tlsa = 212 12725
tlsa = 300 12725
tlsa = 301 12725
tlsa = 302 12725
tlsa = 310 12725
tlsa = 311 12725
tlsa = 312 12725
'''.format(self.binary))
with open(str(self.config5), 'w') as file:
file.write('''
# example valid config file
#
api=exec uid:nobody {}
[a.com] ##target
tlsa = 201 12725
tlsa = 211 12725
tlsa = 301 12725
tlsa = 311 12725
'''.format(self.binary))
with open(str(self.config6), 'w') as file:
file.write('''
# example valid config file
#
#dane_directory= NONEXIST1
#dane_directory = NONEXIST2
dane_directory=/tmp/Q
letsencrypt_directory=../relative_path
api=exec {}
[a.com] ##target
tlsa=201 12725
tlsa \t=\t 211 12725
tlsa\t=\t301 12725
tlsa =\t\t\t311 12725
api = cloudflare email:A@domain.com key:1
api = exec bin --flag1 input "input with\t whitespace"
\t[ b.com ]
tlsa \t =\t \t200 1 sctp
tlsa \t =\t \t201 1 sctp W.com
tlsa \t =\t \t202 1 X.com
tlsa \t =\t \t210 Y.com 1 sctp
tlsa \t =\t \t211 1 Z.com sctp
tlsa \t =\t \t212 A.com sctp 1
tlsa \t =\t \t212 udp B.com 1
api = exec X
api = cloudflare email:me@domain.com key:KEY
[c.com ]
tlsa=200 2
'''.format(self.binary))
with open(str(self.config7), 'w') as file:
file.write('''
# example valid config file
#
api=exec {}
[a.com]
tlsa=201 12725
[b.com]
tlsa=201 12725
tlsa=212 12725
'''.format(self.binary))
with open(str(self.config8), 'w') as file:
file.write('''
# example valid config file
#
api=exec uid:'nobody' {}
[a.com]
tlsa=201 12725
[b.com]
tlsa=201 12725
tlsa=212 12725
'''.format(self.binary))
# was for testing locking
#with open(str(self.config9), 'w') as file:
# file.write('''
# # example valid config file
# #
#
# api=exec uid:'nobody' {}
#
# [a.com]
# tlsa=201 12725
# '''.format(self.binary_wait))
with open(str(self.configX1), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
'''.format(self.binary))
with open(str(self.configX2), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
'''.format(self.binary))
with open(str(self.configX3), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a]
tlsa = 200 1
'''.format(self.binary))
with open(str(self.configX4), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[-a.com]
tlsa = 200 1
'''.format(self.binary))
with open(str(self.configX5), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a-.com]
tlsa = 200 1
'''.format(self.binary))
with open(str(self.configX6), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 2021 1
'''.format(self.binary))
with open(str(self.configX7), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 0
'''.format(self.binary))
with open(str(self.configX8), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 65536
'''.format(self.binary))
with open(str(self.configX9), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 a
'''.format(self.binary))
with open(str(self.configX10), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = a 1
'''.format(self.binary))
with open(str(self.configX11), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 1 "asd er"
'''.format(self.binary))
with open(str(self.configX12), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 1 tcp a
'''.format(self.binary))
with open(str(self.configX13), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 1 tcp b-.com
'''.format(self.binary))
with open(str(self.configX14), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 1 -a.com
'''.format(self.binary))
with open(str(self.configX15), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 1
XXX = oops
'''.format(self.binary))
with open(str(self.configX16), 'w') as file:
file.write('''
# example invalid config file
#
api exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 1
'''.format(self.binary))
with open(str(self.configX17), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa = 202 1
'''.format(self.binary))
with open(str(self.configX18), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory /var/tmp
[a.com]
tlsa = 202 1
'''.format(self.binary))
with open(str(self.configX19), 'w') as file:
file.write('''
# example invalid config file
#
api=exec {}
dane_directory = /tmp
letsencrypt_directory = /var/tmp
[a.com]
tlsa 202 1
'''.format(self.binary))
with open(str(self.configX20), 'w') as file:
file.write('''
# example invalid config file
#
api=exec uid: {}
[a.com]
tlsa = 202 1
'''.format(self.binary))
with open(str(self.configX21), 'w') as file:
file.write('''
# example invalid config file
#
api=exec uid:nonexistentuserwithastupidlylongusername {}
[a.com]
tlsa = 202 1
'''.format(self.binary))
with open(str(self.configX22), 'w') as file:
file.write('''
# example invalid config file
#
api=exec uid:nonexistentuserwithastupidlylongusername
[a.com]
tlsa = 202 1
''')
# we need to create several files (relative to 'parent'):
# /etc/le/live/a.com
# /etc/le/live/b.com
# /etc/le/live/c.com
#
# /etc/le/archive/a.com
# /etc/le/archive/b.com
# /etc/le/archive/c.com
#
# /var/log
# /etc
# /bin
#
# we also need to create config files to read...
# /etc/alnitak.conf # <-- default
# /etc/alnitak.conf.1
# /etc/alnitak.conf.2
# /etc/alnitak.conf.3
# ...
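The layout described in the comments above can be sketched with `pathlib`. The subpaths are taken from the comments; everything else (the temporary root, touching the default config) is illustrative:

```python
import tempfile
from pathlib import Path

# create the test tree described above under a throwaway root
parent = Path(tempfile.mkdtemp())
for sub in ('etc/le/live/a.com', 'etc/le/live/b.com', 'etc/le/live/c.com',
            'etc/le/archive/a.com', 'etc/le/archive/b.com', 'etc/le/archive/c.com',
            'var/log', 'bin'):
    (parent / sub).mkdir(parents=True, exist_ok=True)
(parent / 'etc' / 'alnitak.conf').touch()  # default config file
```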
def renew(self, domain, num):
live_d = self.domains[domain]['live']
Path(live_d / 'cert.pem').unlink()
Path(live_d / 'chain.pem').unlink()
Path(live_d / 'fullchain.pem').unlink()
Path(live_d / 'privkey.pem').unlink()
Path(live_d / 'cert.pem').symlink_to(
'../../archive/{}/cert{}.pem'.format(domain, num))
Path(live_d / 'chain.pem').symlink_to(
'../../archive/{}/chain{}.pem'.format(domain, num))
Path(live_d / 'fullchain.pem').symlink_to(
'../../archive/{}/fullchain{}.pem'.format(domain, num))
Path(live_d / 'privkey.pem').symlink_to(
'../../archive/{}/privkey{}.pem'.format(domain, num))
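The `renew` method above relies on symlinks with *relative* targets (`../../archive/...`), so the links remain valid even if the whole tree is relocated. A minimal standalone sketch of that pattern (paths and file contents are illustrative):

```python
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
(root / 'archive' / 'a.com').mkdir(parents=True)
(root / 'live' / 'a.com').mkdir(parents=True)
(root / 'archive' / 'a.com' / 'cert2.pem').write_text('v2')

# repoint live/a.com/cert.pem at the numbered archive cert, as renew() does
link = root / 'live' / 'a.com' / 'cert.pem'
if link.exists():
    link.unlink()
link.symlink_to('../../archive/a.com/cert2.pem')  # relative target
```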
def renew_a(self):
if self.renew_a_at == 1:
self.renew('a.com', 2)
self.renew_a_at = 2
elif self.renew_a_at == 2:
self.renew('a.com', 3)
self.renew_a_at = 3
else:
self.renew('a.com', 1)
self.renew_a_at = 1
def renew_b(self):
if self.renew_b_at == 1:
self.renew('b.com', 2)
self.renew_b_at = 2
elif self.renew_b_at == 2:
self.renew('b.com', 3)
self.renew_b_at = 3
else:
self.renew('b.com', 1)
self.renew_b_at = 1
def renew_c(self):
if self.renew_c_at == 1:
self.renew('c.com', 2)
self.renew_c_at = 2
elif self.renew_c_at == 2:
self.renew('c.com', 3)
self.renew_c_at = 3
else:
self.renew('c.com', 1)
self.renew_c_at = 1
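`renew_a`, `renew_b`, and `renew_c` each hard-code the same 1 -> 2 -> 3 -> 1 rotation. As a design note, the cycling could be factored into a single helper; a sketch (the helper name is illustrative, not part of the suite):

```python
def next_cert_number(current, total=3):
    """Return the next certificate number in a 1..total cycle."""
    return current % total + 1
```

Each `renew_X` method would then reduce to one `self.renew(domain, n)` call plus storing `n = next_cert_number(self.renew_X_at)`.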
def __del__(self):
if not self.keep:
if self.parent.exists():
shutil.rmtree(self.parent)
def create_cloudflare_config(self, path, domain):
with open(str(self.configC1), 'w') as file:
file.write('''
# cloudflare api test config file
#
api = cloudflare {}
[{}]
tlsa = 211 53527
tlsa = 311 53527
'''.format(path, domain))
def create_state_obj(init=None, config=None, recreate=True, lock=False,
log=False):
if os.getuid() == 0:
prog = Prog.State(lock=lock)
else:
prog = Prog.State(lock=lock, testing=True)
# NB: assumes 'init' is always provided; the lock file lives under init.varlock
prog.lockfile = Path(init.varlock / 'alnitak.lock')
prog.force = True
prog.log.set_no_logging()
prog.recreate_dane = recreate
if init:
prog.set_dane_directory(init.dane)
prog.set_letsencrypt_directory(init.le)
if config:
prog.set_config_file(config)
else:
prog.set_config_file(init.config)
prog.datafile = Path(init.datadir / "data")
elif config:
prog.set_config_file(config)
if log:
prog.log.set_file(init.varlog / 'log')
prog.log.set_debug_logging()
else:
prog.log.set_nolog()
return prog
def clear_state(prog):
prog.timenow = datetime.datetime.now()
prog.target_list = [ ]
prog.dane_domain_directories = { }
prog.renewed_domains = []
prog.datafile_lines = []
prog.data = Prog.Data()
def prehook_line(state, cwd, domain, numcert, pending):
# strip the archive serial from e.g. 'cert2.pem' to recover the live name 'cert.pem'
m = re.match(r'([^0-9]+)', numcert)
cert = "{}.pem".format(m.group(0))
return [ domain, str(cwd / state.dane / domain / cert),
str(cwd / state.live / domain / cert),
str(cwd / state.archive / domain / numcert),
str(pending) ]
def call_line(operation, flags, params, hash1, hash2=""):
if operation[0] == 'p':
return ":publish:{}:{}:{}:{}".format(params, hash1, hash2, flags)
else:
return ":delete:{}:{}:{}:{}".format(params, hash1, hash2, flags)
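`call_line` builds colon-delimited records of the form `:publish:<params>:<hash1>:<hash2>:<flags>`. A sketch of the inverse operation, assuming the individual fields never contain `':'` (which holds for the inputs used here):

```python
def parse_call_line(line):
    """Split a ':'-delimited call record back into its fields."""
    _, operation, params, hash1, hash2, flags = line.split(':')
    return {'operation': operation, 'params': params,
            'hash1': hash1, 'hash2': hash2, 'flags': flags}
```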
| 76.125692 | 2,912 | 0.802199 | 3,810 | 96,299 | 20.224672 | 0.198425 | 0.005399 | 0.00728 | 0.00728 | 0.489436 | 0.478535 | 0.473902 | 0.467232 | 0.459173 | 0.456486 | 0 | 0.491234 | 0.150074 | 96,299 | 1,264 | 2,913 | 76.185918 | 0.45023 | 0.009668 | 0 | 0.443825 | 0 | 0.001857 | 0.806654 | 0.675448 | 0 | 1 | 0 | 0 | 0 | 1 | 0.020427 | false | 0 | 0.0065 | 0.000929 | 0.039926 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e0c75f3e8388958d48f0fafad9c85c4a2c5db8fc | 29,389 | py | Python | sdk/python/pulumi_oci/identity/dynamic_group.py | EladGabay/pulumi-oci | 6841e27d4a1a7e15c672306b769912efbfd3ba99 | [
"ECL-2.0",
"Apache-2.0"
] | 5 | 2021-08-17T11:14:46.000Z | 2021-12-31T02:07:03.000Z | sdk/python/pulumi_oci/identity/dynamic_group.py | pulumi-oci/pulumi-oci | 6841e27d4a1a7e15c672306b769912efbfd3ba99 | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2021-09-06T11:21:29.000Z | 2021-09-06T11:21:29.000Z | sdk/python/pulumi_oci/identity/dynamic_group.py | pulumi-oci/pulumi-oci | 6841e27d4a1a7e15c672306b769912efbfd3ba99 | [
"ECL-2.0",
"Apache-2.0"
] | 2 | 2021-08-24T23:31:30.000Z | 2022-01-02T19:26:54.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['DynamicGroupArgs', 'DynamicGroup']
@pulumi.input_type
class DynamicGroupArgs:
def __init__(__self__, *,
compartment_id: pulumi.Input[str],
description: pulumi.Input[str],
matching_rule: pulumi.Input[str],
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
name: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a DynamicGroup resource.
:param pulumi.Input[str] compartment_id: The OCID of the tenancy containing the group.
:param pulumi.Input[str] description: (Updatable) The description you assign to the group during creation. Does not have to be unique, and it's changeable.
:param pulumi.Input[str] matching_rule: (Updatable) The matching rule to dynamically match an instance certificate to this dynamic group. For rule syntax, see [Managing Dynamic Groups](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Operations.CostCenter": "42"}`
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Department": "Finance"}`
:param pulumi.Input[str] name: The name you assign to the group during creation. The name must be unique across all groups in the tenancy and cannot be changed.
"""
pulumi.set(__self__, "compartment_id", compartment_id)
pulumi.set(__self__, "description", description)
pulumi.set(__self__, "matching_rule", matching_rule)
if defined_tags is not None:
pulumi.set(__self__, "defined_tags", defined_tags)
if freeform_tags is not None:
pulumi.set(__self__, "freeform_tags", freeform_tags)
if name is not None:
pulumi.set(__self__, "name", name)
@property
@pulumi.getter(name="compartmentId")
def compartment_id(self) -> pulumi.Input[str]:
"""
The OCID of the tenancy containing the group.
"""
return pulumi.get(self, "compartment_id")
@compartment_id.setter
def compartment_id(self, value: pulumi.Input[str]):
pulumi.set(self, "compartment_id", value)
@property
@pulumi.getter
def description(self) -> pulumi.Input[str]:
"""
(Updatable) The description you assign to the group during creation. Does not have to be unique, and it's changeable.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: pulumi.Input[str]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="matchingRule")
def matching_rule(self) -> pulumi.Input[str]:
"""
(Updatable) The matching rule to dynamically match an instance certificate to this dynamic group. For rule syntax, see [Managing Dynamic Groups](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).
"""
return pulumi.get(self, "matching_rule")
@matching_rule.setter
def matching_rule(self, value: pulumi.Input[str]):
pulumi.set(self, "matching_rule", value)
@property
@pulumi.getter(name="definedTags")
def defined_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Operations.CostCenter": "42"}`
"""
return pulumi.get(self, "defined_tags")
@defined_tags.setter
def defined_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "defined_tags", value)
@property
@pulumi.getter(name="freeformTags")
def freeform_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Department": "Finance"}`
"""
return pulumi.get(self, "freeform_tags")
@freeform_tags.setter
def freeform_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "freeform_tags", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name you assign to the group during creation. The name must be unique across all groups in the tenancy and cannot be changed.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
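Per the docstrings above, `defined_tags` keys are namespaced (`"Operations.CostCenter"`) while `freeform_tags` keys are plain (`"Department"`). A hypothetical helper (not part of this SDK) that checks for the namespaced shape:

```python
def is_defined_tag_key(key):
    """True if key looks like 'Namespace.Key', the shape used by defined_tags."""
    namespace, sep, name = key.partition('.')
    return bool(namespace) and sep == '.' and bool(name)
```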
@pulumi.input_type
class _DynamicGroupState:
def __init__(__self__, *,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
description: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
inactive_state: Optional[pulumi.Input[str]] = None,
matching_rule: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
state: Optional[pulumi.Input[str]] = None,
time_created: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering DynamicGroup resources.
:param pulumi.Input[str] compartment_id: The OCID of the tenancy containing the group.
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Operations.CostCenter": "42"}`
:param pulumi.Input[str] description: (Updatable) The description you assign to the group during creation. Does not have to be unique, and it's changeable.
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Department": "Finance"}`
:param pulumi.Input[str] inactive_state: The detailed status of INACTIVE lifecycleState.
:param pulumi.Input[str] matching_rule: (Updatable) The matching rule to dynamically match an instance certificate to this dynamic group. For rule syntax, see [Managing Dynamic Groups](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).
:param pulumi.Input[str] name: The name you assign to the group during creation. The name must be unique across all groups in the tenancy and cannot be changed.
:param pulumi.Input[str] state: The group's current state.
:param pulumi.Input[str] time_created: Date and time the group was created, in the format defined by RFC3339. Example: `2016-08-25T21:10:29.600Z`
"""
if compartment_id is not None:
pulumi.set(__self__, "compartment_id", compartment_id)
if defined_tags is not None:
pulumi.set(__self__, "defined_tags", defined_tags)
if description is not None:
pulumi.set(__self__, "description", description)
if freeform_tags is not None:
pulumi.set(__self__, "freeform_tags", freeform_tags)
if inactive_state is not None:
pulumi.set(__self__, "inactive_state", inactive_state)
if matching_rule is not None:
pulumi.set(__self__, "matching_rule", matching_rule)
if name is not None:
pulumi.set(__self__, "name", name)
if state is not None:
pulumi.set(__self__, "state", state)
if time_created is not None:
pulumi.set(__self__, "time_created", time_created)
@property
@pulumi.getter(name="compartmentId")
def compartment_id(self) -> Optional[pulumi.Input[str]]:
"""
The OCID of the tenancy containing the group.
"""
return pulumi.get(self, "compartment_id")
@compartment_id.setter
def compartment_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compartment_id", value)
@property
@pulumi.getter(name="definedTags")
def defined_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Operations.CostCenter": "42"}`
"""
return pulumi.get(self, "defined_tags")
@defined_tags.setter
def defined_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "defined_tags", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The description you assign to the group during creation. Does not have to be unique, and it's changeable.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="freeformTags")
def freeform_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Department": "Finance"}`
"""
return pulumi.get(self, "freeform_tags")
@freeform_tags.setter
def freeform_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "freeform_tags", value)
@property
@pulumi.getter(name="inactiveState")
def inactive_state(self) -> Optional[pulumi.Input[str]]:
"""
The detailed status of INACTIVE lifecycleState.
"""
return pulumi.get(self, "inactive_state")
@inactive_state.setter
def inactive_state(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "inactive_state", value)
@property
@pulumi.getter(name="matchingRule")
def matching_rule(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The matching rule to dynamically match an instance certificate to this dynamic group. For rule syntax, see [Managing Dynamic Groups](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).
"""
return pulumi.get(self, "matching_rule")
@matching_rule.setter
def matching_rule(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "matching_rule", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name you assign to the group during creation. The name must be unique across all groups in the tenancy and cannot be changed.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def state(self) -> Optional[pulumi.Input[str]]:
"""
The group's current state.
"""
return pulumi.get(self, "state")
@state.setter
def state(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "state", value)
@property
@pulumi.getter(name="timeCreated")
def time_created(self) -> Optional[pulumi.Input[str]]:
"""
Date and time the group was created, in the format defined by RFC3339. Example: `2016-08-25T21:10:29.600Z`
"""
return pulumi.get(self, "time_created")
@time_created.setter
def time_created(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "time_created", value)
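`time_created` is an RFC3339 string such as `2016-08-25T21:10:29.600Z`. A small sketch of parsing it with the standard library; the `'Z'` to `'+00:00'` substitution is needed because `datetime.fromisoformat` only accepts a trailing `'Z'` from Python 3.11 onward:

```python
import datetime

def parse_rfc3339(ts):
    """Parse an RFC3339 timestamp like '2016-08-25T21:10:29.600Z'."""
    return datetime.datetime.fromisoformat(ts.replace('Z', '+00:00'))
```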
class DynamicGroup(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
description: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
matching_rule: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
This resource provides the Dynamic Group resource in Oracle Cloud Infrastructure Identity service.
Creates a new dynamic group in your tenancy.
You must specify your tenancy's OCID as the compartment ID in the request object (remember that the tenancy
is simply the root compartment). Notice that IAM resources (users, groups, compartments, and some policies)
reside within the tenancy itself, unlike cloud resources such as compute instances, which typically
reside within compartments inside the tenancy. For information about OCIDs, see
[Resource Identifiers](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
You must also specify a *name* for the dynamic group, which must be unique across all dynamic groups in your
tenancy, and cannot be changed. Note that this name must also be unique across all groups in your tenancy.
You can use this name or the OCID when writing policies that apply to the dynamic group. For more information
about policies, see [How Policies Work](https://docs.cloud.oracle.com/iaas/Content/Identity/Concepts/policies.htm).
You must also specify a *description* for the dynamic group (although it can be an empty string). It does not
have to be unique, and you can change it anytime with [UpdateDynamicGroup](https://docs.cloud.oracle.com/iaas/api/#/en/identity/20160918/DynamicGroup/UpdateDynamicGroup).
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_dynamic_group = oci.identity.DynamicGroup("testDynamicGroup",
compartment_id=var["tenancy_ocid"],
description=var["dynamic_group_description"],
matching_rule=var["dynamic_group_matching_rule"],
defined_tags={
"Operations.CostCenter": "42",
},
freeform_tags={
"Department": "Finance",
})
```
## Import
DynamicGroups can be imported using the `id`, e.g.
```sh
$ pulumi import oci:identity/dynamicGroup:DynamicGroup test_dynamic_group "id"
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] compartment_id: The OCID of the tenancy containing the group.
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Operations.CostCenter": "42"}`
:param pulumi.Input[str] description: (Updatable) The description you assign to the group during creation. Does not have to be unique, and it's changeable.
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Department": "Finance"}`
:param pulumi.Input[str] matching_rule: (Updatable) The matching rule to dynamically match an instance certificate to this dynamic group. For rule syntax, see [Managing Dynamic Groups](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).
:param pulumi.Input[str] name: The name you assign to the group during creation. The name must be unique across all groups in the tenancy and cannot be changed.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: DynamicGroupArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
This resource provides the Dynamic Group resource in Oracle Cloud Infrastructure Identity service.
Creates a new dynamic group in your tenancy.
You must specify your tenancy's OCID as the compartment ID in the request object (remember that the tenancy
is simply the root compartment). Notice that IAM resources (users, groups, compartments, and some policies)
reside within the tenancy itself, unlike cloud resources such as compute instances, which typically
reside within compartments inside the tenancy. For information about OCIDs, see
[Resource Identifiers](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm).
You must also specify a *name* for the dynamic group, which must be unique across all dynamic groups in your
tenancy, and cannot be changed. Note that this name must also be unique across all groups in your tenancy.
You can use this name or the OCID when writing policies that apply to the dynamic group. For more information
about policies, see [How Policies Work](https://docs.cloud.oracle.com/iaas/Content/Identity/Concepts/policies.htm).
You must also specify a *description* for the dynamic group (although it can be an empty string). It does not
have to be unique, and you can change it anytime with [UpdateDynamicGroup](https://docs.cloud.oracle.com/iaas/api/#/en/identity/20160918/DynamicGroup/UpdateDynamicGroup).
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_dynamic_group = oci.identity.DynamicGroup("testDynamicGroup",
compartment_id=var["tenancy_ocid"],
description=var["dynamic_group_description"],
matching_rule=var["dynamic_group_matching_rule"],
defined_tags={
"Operations.CostCenter": "42",
},
freeform_tags={
"Department": "Finance",
})
```
## Import
DynamicGroups can be imported using the `id`, e.g.
```sh
$ pulumi import oci:identity/dynamicGroup:DynamicGroup test_dynamic_group "id"
```
:param str resource_name: The name of the resource.
:param DynamicGroupArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(DynamicGroupArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
description: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
matching_rule: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = DynamicGroupArgs.__new__(DynamicGroupArgs)
if compartment_id is None and not opts.urn:
raise TypeError("Missing required property 'compartment_id'")
__props__.__dict__["compartment_id"] = compartment_id
__props__.__dict__["defined_tags"] = defined_tags
if description is None and not opts.urn:
raise TypeError("Missing required property 'description'")
__props__.__dict__["description"] = description
__props__.__dict__["freeform_tags"] = freeform_tags
if matching_rule is None and not opts.urn:
raise TypeError("Missing required property 'matching_rule'")
__props__.__dict__["matching_rule"] = matching_rule
__props__.__dict__["name"] = name
__props__.__dict__["inactive_state"] = None
__props__.__dict__["state"] = None
__props__.__dict__["time_created"] = None
super(DynamicGroup, __self__).__init__(
'oci:identity/dynamicGroup:DynamicGroup',
resource_name,
__props__,
opts)
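`_internal_init` raises `TypeError` for each required property that is `None`, unless `opts.urn` is set (i.e. an existing resource is being adopted rather than created). The same logic in isolation, with illustrative names:

```python
def check_required(props, required, has_urn=False):
    """Mimic the checks above: required properties may only be omitted
    when an existing resource is adopted via a URN."""
    if has_urn:
        return
    for name in required:
        if props.get(name) is None:
            raise TypeError("Missing required property '{}'".format(name))
```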
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
description: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
inactive_state: Optional[pulumi.Input[str]] = None,
matching_rule: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
state: Optional[pulumi.Input[str]] = None,
time_created: Optional[pulumi.Input[str]] = None) -> 'DynamicGroup':
"""
Get an existing DynamicGroup resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] compartment_id: The OCID of the tenancy containing the group.
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Operations.CostCenter": "42"}`
:param pulumi.Input[str] description: (Updatable) The description you assign to the group during creation. Does not have to be unique, and it's changeable.
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Department": "Finance"}`
:param pulumi.Input[str] inactive_state: The detailed status of INACTIVE lifecycleState.
:param pulumi.Input[str] matching_rule: (Updatable) The matching rule to dynamically match an instance certificate to this dynamic group. For rule syntax, see [Managing Dynamic Groups](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).
:param pulumi.Input[str] name: The name you assign to the group during creation. The name must be unique across all groups in the tenancy and cannot be changed.
:param pulumi.Input[str] state: The group's current state.
:param pulumi.Input[str] time_created: Date and time the group was created, in the format defined by RFC3339. Example: `2016-08-25T21:10:29.600Z`
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _DynamicGroupState.__new__(_DynamicGroupState)
__props__.__dict__["compartment_id"] = compartment_id
__props__.__dict__["defined_tags"] = defined_tags
__props__.__dict__["description"] = description
__props__.__dict__["freeform_tags"] = freeform_tags
__props__.__dict__["inactive_state"] = inactive_state
__props__.__dict__["matching_rule"] = matching_rule
__props__.__dict__["name"] = name
__props__.__dict__["state"] = state
__props__.__dict__["time_created"] = time_created
return DynamicGroup(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="compartmentId")
def compartment_id(self) -> pulumi.Output[str]:
"""
The OCID of the tenancy containing the group.
"""
return pulumi.get(self, "compartment_id")
@property
@pulumi.getter(name="definedTags")
def defined_tags(self) -> pulumi.Output[Mapping[str, Any]]:
"""
(Updatable) Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Operations.CostCenter": "42"}`
"""
return pulumi.get(self, "defined_tags")
@property
@pulumi.getter
def description(self) -> pulumi.Output[str]:
"""
(Updatable) The description you assign to the group during creation. Does not have to be unique, and it's changeable.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="freeformTags")
def freeform_tags(self) -> pulumi.Output[Mapping[str, Any]]:
"""
(Updatable) Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{"Department": "Finance"}`
"""
return pulumi.get(self, "freeform_tags")
@property
@pulumi.getter(name="inactiveState")
def inactive_state(self) -> pulumi.Output[str]:
"""
The detailed status of INACTIVE lifecycleState.
"""
return pulumi.get(self, "inactive_state")
@property
@pulumi.getter(name="matchingRule")
def matching_rule(self) -> pulumi.Output[str]:
"""
(Updatable) The matching rule to dynamically match an instance certificate to this dynamic group. For rule syntax, see [Managing Dynamic Groups](https://docs.cloud.oracle.com/iaas/Content/Identity/Tasks/managingdynamicgroups.htm).
"""
return pulumi.get(self, "matching_rule")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name you assign to the group during creation. The name must be unique across all groups in the tenancy and cannot be changed.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def state(self) -> pulumi.Output[str]:
"""
The group's current state.
"""
return pulumi.get(self, "state")
@property
@pulumi.getter(name="timeCreated")
def time_created(self) -> pulumi.Output[str]:
"""
Date and time the group was created, in the format defined by RFC3339. Example: `2016-08-25T21:10:29.600Z`
"""
return pulumi.get(self, "time_created")
e0cebad3e0675c75fc3ddf189bebd7657b092da0 | 25,189 | py | Python | service_flows/data_processor.py | fColangelo/MORA-Multi-Objective-Routing-Algorithm | b867d606b01f5993ea4364ac6ff6caf86aa5ddec | [
"MIT"
] | 1 | 2021-12-15T00:27:59.000Z | 2021-12-15T00:27:59.000Z | service_flows/data_processor.py | fColangelo/MORA-Multi-Objective-Routing-Algorithm | b867d606b01f5993ea4364ac6ff6caf86aa5ddec | [
"MIT"
] | null | null | null | service_flows/data_processor.py | fColangelo/MORA-Multi-Objective-Routing-Algorithm | b867d606b01f5993ea4364ac6ff6caf86aa5ddec | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import sys
sys.dont_write_bytecode = True
import os
import shutil
import json
from datetime import date
import csv
# START DATE: 10 FEBRUARY 2020 - GRAPH START = 1581292860
# STEP 86400 (= SECONDS IN ONE DAY, 60*60*24)
STEP = 86400 # seconds in a day
START_DATE = date(2020,2,9)
BEGINNING_OF_TIME = 1581206460
DAY_DIFF = date.today() - START_DATE
DAYS = DAY_DIFF.days
# Every day, fetch the previous day's traffic
START_TIME = BEGINNING_OF_TIME + (DAYS-1) * STEP - 7200 # start 7200 s (two hours) early to fill the previous day's gap
END_TIME = START_TIME + STEP + 7200
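The window arithmetic above can be checked in isolation. This is a standalone sketch (local lowercase names, not part of the module) that recomputes the bounds from the same constants and confirms the window always spans one day plus the two-hour overlap:

```python
from datetime import date

STEP = 86400                    # seconds in one day
BEGINNING_OF_TIME = 1581206460  # epoch anchor used by the module
days = (date.today() - date(2020, 2, 9)).days

# Start two hours (7200 s) before the previous day to fill the gap,
# then cover one full day plus that overlap.
start_time = BEGINNING_OF_TIME + (days - 1) * STEP - 7200
end_time = start_time + STEP + 7200

assert end_time - start_time == STEP + 7200
```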
def process_data(urls, raw_data_path, refined_data_path):
raw_data_directories = sorted(next(os.walk(raw_data_path))[1])
# ********* CREATE DATASET ******** #
for d in raw_data_directories:
source_path = os.path.join(raw_data_path, d)
files = os.listdir(source_path)
## FIND LINK FILE
for link in urls:
link_id = urls[link][1]
for f in files:
if link_id in f:
# Import Data
csvfile = os.path.join(source_path, f)
csvdata = import_csv(csvfile)
# Export Data to correct location
fname = link + '_' + link_id + '_traffic.csv'
destination_path = os.path.join(refined_data_path, fname)
export_csv(destination_path, csvdata[12:])
## VALIDATE DATA
validate_data(refined_data_path, urls)
def get_mean_link_bw():
urls = {
"BENL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47771&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-bru", 0],
"DENL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48423&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-fra", 0],
"DUNL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47927&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-ham", 0],
"NLUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48929&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-lon", 1],
"GCGR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=36069&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ath-ath2", 0],
"GRIT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47977&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ath-mil", 1],
"ATGC": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=36087&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ath2-vie", 0],
"HUSK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=8630&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bra-bud", 0],
"ATSK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47779&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bra-vie", 0],
"BEUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=9122&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bru-lon", 1],
"HURO": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=21165&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "buc-bud", 0],
"BGRO": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=21019&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "buc-sof", 0],
"HUSI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47789&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-lju", 1],
"CZHU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47791&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-pra", 0],
"ATHU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47937&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-vie", 0],
"HRHU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47787&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-zag", 0],
"IEIR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47799&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "dub-dub", 0], ## recheck! (already done once)
"IEUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47797&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "dub-lon", 1],
"IRUI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=39699&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "dub2-lon2", 1],
"CHDE": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48189&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "FRA-GEN", 0],
"DEDU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20493&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "fra-ham", 1],
"DEPL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47803&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "fra-poz", 1],
"CZDE": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=53015&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "FRA-PRA", 0],
"CHES": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47831&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "gen-mad", 1],
"CHFN": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20747&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "gen-mar", 1],
"CHIT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=52853&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "GEN-MIL2", 1],
"CHFR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48177&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "GEN-PAR", 1],
"DUEE": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20497&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ham-tal", 1],
"LNLT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=12697&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "kau-kau", 0], ## recheck!
"LTPL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=526&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "kau-poz", 1],
"LNLV": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47863&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "kau-rig", 1],
"PRPT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47857&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lis-lis", 0],
"PTUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=7450&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lis-lon", 1],
"ESPR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47865&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lis-mad", 0],
"ITSI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=54087&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "LJU-MIL2", 0],
"UIUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=46031&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lon-lon2", 0],
"FRUI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47463&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lon2-par", 0],
"ESFR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47849&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "mad-par", 1],
"ITFN": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20737&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "mar-mil2", 0],
"ATIT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=52905&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "MIL2-VIE", 0],
"ATPL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47877&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "poz-vie", 0],
"ATCZ": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=52961&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "PRA-VIE", 0],
"ETLV": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47861&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "rig-tal", 0],
"ATBG": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=21125&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "sof-vie", 0],
"EEET": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47859&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "tal-tal", 1],
"ATHR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47883&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "vie-zag", 1],
}
current_dir = os.path.dirname(__file__)
raw_data_path = os.path.join(current_dir, 'geant_ctrl_data')
foo_dataset_path = create_folder('foo_dataset_geant')
# ********* CREATE FOO DATASET ******** #
#
process_data(urls, raw_data_path, foo_dataset_path)
#
# ********* CALCULATE LINKS' MEAN BANDWIDTH ******** #
#
# init variable
links_mean_bw = {}
## FIND LINK FILE
files = os.listdir(foo_dataset_path)
for link in urls:
link_id = urls[link][1]
for f in files:
if link_id in f:
# Import Data
csvfile = os.path.join(foo_dataset_path, f)
csvdata = import_csv(csvfile)
# Compute link mean bandwidth
link_direction_1, link_direction_2 = compute_mean_bw(csvdata)
links_mean_bw.update({ link_direction_1[0] : link_direction_1[1] })
links_mean_bw.update({ link_direction_2[0] : link_direction_2[1] })
#
# ********* DELETE FOO DATASET ******** #
#
shutil.rmtree(foo_dataset_path)
#
return links_mean_bw
def create_folder(folder_name):
current_dir = os.path.dirname(__file__)
database_path = os.path.join(current_dir, folder_name)
if not os.path.exists(database_path):
os.makedirs(database_path, exist_ok=True)
return database_path
def import_csv(csvfilename):
data = []
with open(csvfilename, "r", encoding="utf-8", errors="ignore") as scraped:
reader = csv.reader(scraped, delimiter=',')
for row in reader:
if row: # avoid blank lines
data.append(row)
return data
def export_csv(path, data):
# Append data to the file at 'path', creating it if it does not exist.
with open(path, 'a+') as f:
writer = csv.writer(f)
for row in data:
writer.writerow(row)
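The import/export helpers above form a simple CSV round trip. This self-contained sketch (copies of the two helpers plus a throwaway temp directory, for illustration only) shows that what `export_csv` appends is read back verbatim by `import_csv`:

```python
import csv
import os
import tempfile

def import_csv(csvfilename):
    # Read all non-blank rows from a CSV file.
    data = []
    with open(csvfilename, "r", encoding="utf-8", errors="ignore") as scraped:
        for row in csv.reader(scraped, delimiter=','):
            if row:  # skip blank lines
                data.append(row)
    return data

def export_csv(path, data):
    # Append rows, creating the file if needed.
    with open(path, 'a+') as f:
        writer = csv.writer(f)
        for row in data:
            writer.writerow(row)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, 'traffic.csv')
    export_csv(path, [['DATE', 'BENL'], ['2020-02-10', '10.0']])
    assert import_csv(path) == [['DATE', 'BENL'], ['2020-02-10', '10.0']]
```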
def validate_data(path, urls):
files = os.listdir(path)
for f in files:
## IMPORT FILE DATA
csvfile = os.path.join(path, f)
csvdata = import_csv(csvfile)
## DELETE DUPLICATE ROWS
# Find rows with same date and time
for i in range(len(csvdata)):
for j in range(len(csvdata[0:i])):
# if this row's date matches an earlier row's date...
if csvdata[i][0] == csvdata[j][0]:
# ...overwrite the earlier row so the duplicates collapse below
csvdata[j] = csvdata[i]
# Delete all equal rows
foo_csvdata_set = set(tuple(x) for x in csvdata)
foo_csvdata = [list(x) for x in foo_csvdata_set]
foo_csvdata.sort(key = lambda x: csvdata.index(x))
csvdata = foo_csvdata.copy()
## SET NaN ELEMENTS TO 0
for row in csvdata:
for index, value in enumerate(row):
if value == 'NaN':
row[index] = '0'
## ADD HEADING
# Find Link
for link in urls:
if urls[link][1] in f:
link_name = get_key_from_value(urls, urls[link][1])
eman_knil = link_name[len(link_name)//2:] + link_name[:len(link_name)//2] # reversed link code, e.g. "BENL" -> "NLBE"
# Check if data is coherent (straight = 1) or vice versa (reverse = 0)
if urls[link][2] == 1:
heading = ['DATE', '{}'.format(link_name), '{}_peak'.format(link_name), '{}'.format(eman_knil), '{}_peak'.format(eman_knil)]
else:
heading = ['DATE', '{}'.format(eman_knil), '{}_peak'.format(eman_knil), '{}'.format(link_name), '{}_peak'.format(link_name) ]
break
# Insert heading on top
if csvdata[0][0] != 'DATE':
csvdata.insert(0, heading)
## SAVE FILE
with open(csvfile, 'w') as csvf:
writer = csv.writer(csvf)
for row in csvdata:
writer.writerow(row)
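The `eman_knil` expression used in `validate_data` reverses a four-letter link code by swapping its two country halves. A minimal standalone sketch of that trick (the helper name `reverse_link_name` is mine, not from the module):

```python
def reverse_link_name(link_name: str) -> str:
    # Swap the two country halves of a link code, e.g. "BENL" -> "NLBE".
    half = len(link_name) // 2
    return link_name[half:] + link_name[:half]

assert reverse_link_name("BENL") == "NLBE"
assert reverse_link_name("ATHU") == "HUAT"
```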
def get_key_from_value(dictionary, val):
for k, v in dictionary.items():
if v[1] == val:
return k
def compute_mean_bw(data):
direction_1 = data[0][1]
direction_2 = data[0][3]
direction_1_sum_bw = 0.0
direction_2_sum_bw = 0.0
for i in range(1, len(data)):
direction_1_sum_bw += float(data[i][1])
direction_2_sum_bw += float(data[i][3])
return [(direction_1, direction_1_sum_bw//(len(data)-1)), (direction_2, direction_2_sum_bw//(len(data)-1))]
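`compute_mean_bw` averages columns 1 and 3 over the data rows, floor-dividing by the row count (so the mean is floored to a whole float). A toy run with fabricated values, using a self-contained copy of the same logic for illustration:

```python
def compute_mean_bw(data):
    # data[0] is the heading row; columns 1 and 3 are the two directions.
    direction_1 = data[0][1]
    direction_2 = data[0][3]
    d1 = sum(float(row[1]) for row in data[1:])
    d2 = sum(float(row[3]) for row in data[1:])
    n = len(data) - 1
    # '//' floors the mean, matching the module's behaviour.
    return [(direction_1, d1 // n), (direction_2, d2 // n)]

rows = [
    ['DATE', 'BENL', 'BENL_peak', 'NLBE', 'NLBE_peak'],
    ['2020-02-10', '10.0', '12.0', '4.0', '5.0'],
    ['2020-02-11', '20.0', '22.0', '8.0', '9.0'],
]
# mean of 10 and 20 is 15.0; mean of 4 and 8 is 6.0
assert compute_mean_bw(rows) == [('BENL', 15.0), ('NLBE', 6.0)]
```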
def write_to_json(data, filename, json_path):
"""
Write 'data' to json file named 'filename' at 'json_path' location.
Arguments:
data {dict} -- data to be written.
filename {str} -- name of file to be created/overwritten.
json_path {str} -- relative path of json file to be created/overwritten.
"""
# Get the complete path
filepath = os.path.join(json_path, filename)
# Write data
with open(filepath + '.json', 'w+') as f:
json.dump(data, f, sort_keys=True, indent=4)
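A usage sketch for `write_to_json`, exercising a self-contained copy of the helper against a throwaway temp directory (illustrative only):

```python
import json
import os
import tempfile

def write_to_json(data, filename, json_path):
    # Serialise 'data' to '<json_path>/<filename>.json'.
    filepath = os.path.join(json_path, filename)
    with open(filepath + '.json', 'w+') as f:
        json.dump(data, f, sort_keys=True, indent=4)

with tempfile.TemporaryDirectory() as tmp:
    write_to_json({'BENL': 1500.0}, 'links_mean_bw', tmp)
    with open(os.path.join(tmp, 'links_mean_bw.json')) as f:
        assert json.load(f) == {'BENL': 1500.0}
```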
def main():
urls = {
"BENL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47771&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-bru", 0],
"DENL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48423&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-fra", 0],
"DUNL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47927&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-ham", 0],
"NLUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48929&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ams-lon", 1],
"GCGR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=36069&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ath-ath2", 0],
"GRIT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47977&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ath-mil", 1],
"ATGC": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=36087&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ath2-vie", 0],
"HUSK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=8630&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bra-bud", 0],
"ATSK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47779&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bra-vie", 0],
"BEUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=9122&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bru-lon", 1],
"HURO": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=21165&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "buc-bud", 0],
"BGRO": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=21019&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "buc-sof", 0],
"HUSI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47789&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-lju", 1],
"CZHU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47791&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-pra", 0],
"ATHU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47937&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-vie", 0],
"HRHU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47787&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "bud-zag", 0],
"IEIR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47799&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "dub-dub", 0], ## recheck! (already done once)
"IEUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47797&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "dub-lon", 1],
"IRUI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=39699&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "dub2-lon2", 1],
"CHDE": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48189&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "FRA-GEN", 0],
"DEDU": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20493&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "fra-ham", 1],
"DEPL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47803&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "fra-poz", 1],
"CZDE": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=53015&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "FRA-PRA", 0],
"CHES": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47831&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "gen-mad", 1],
"CHFN": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20747&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "gen-mar", 1],
"CHIT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=52853&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "GEN-MIL2", 1],
"CHFR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=48177&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "GEN-PAR", 1],
"DUEE": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20497&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "ham-tal", 1],
"LNLT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=12697&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "kau-kau", 0], ## recheck!
"LTPL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=526&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "kau-poz", 1],
"LNLV": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47863&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "kau-rig", 1],
"PRPT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47857&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lis-lis", 0],
"PTUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=7450&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lis-lon", 1],
"ESPR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47865&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lis-mad", 0],
"ITSI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=54087&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "LJU-MIL2", 0],
"UIUK": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=46031&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lon-lon2", 0],
"FRUI": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47463&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "lon2-par", 0],
"ESFR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47849&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "mad-par", 1],
"ITFN": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=20737&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "mar-mil2", 0],
"ATIT": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=52905&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "MIL2-VIE", 0],
"ATPL": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47877&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "poz-vie", 0],
"ATCZ": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=52961&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "PRA-VIE", 0],
"ETLV": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47861&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "rig-tal", 0],
"ATBG": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=21125&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "sof-vie", 0],
"EEET": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47859&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "tal-tal", 1],
"ATHR": ["https://tools.geant.org/portal/links/p-cacti/graph_xport.php?local_graph_id=47883&rra_id=0&view_type=tree&graph_start={}&graph_end={}".format(START_TIME, END_TIME), "vie-zag", 1],
}
current_dir = os.path.dirname(__file__)
raw_data_path = os.path.join(current_dir, 'geant_ctrl_data')
refined_data_path = create_folder('geant_refined_data')
process_data(urls, raw_data_path, refined_data_path)
print("DAJE")  # "DAJE" is Roman slang for "let's go!", used as a completion marker
if __name__ == "__main__":
main()
4615f94b51fd722c6c8fafeae53bbfe1454611bd | 109 | py | Python | tests/test_version.py | oarepo/oarepo | 1e78575a4e5582fd2f2f0cd5073c75f9a70fdca9 | [
"MIT"
] | null | null | null | tests/test_version.py | oarepo/oarepo | 1e78575a4e5582fd2f2f0cd5073c75f9a70fdca9 | [
"MIT"
] | 47 | 2020-06-19T18:53:31.000Z | 2022-03-21T09:51:22.000Z | tests/test_version.py | oarepo/oarepo | 1e78575a4e5582fd2f2f0cd5073c75f9a70fdca9 | [
"MIT"
] | null | null | null | def test_version():
"""Test version import."""
from oarepo import __version__
assert __version__
1cb3a62c1bedd98b06a0ad8b32b5e53c0bc9fc52 | 204 | py | Python | database/neo4j/neomodel/__init__.py | aiscenblue/sanic-framework-starterkit | 505ef3fc76c181b949a1dc5826296d6d990d37cd | [
"MIT"
] | 1 | 2017-10-30T04:33:33.000Z | 2017-10-30T04:33:33.000Z | database/neo4j/neomodel/__init__.py | aiscenblue/sanic-framework-starterkit | 505ef3fc76c181b949a1dc5826296d6d990d37cd | [
"MIT"
] | null | null | null | database/neo4j/neomodel/__init__.py | aiscenblue/sanic-framework-starterkit | 505ef3fc76c181b949a1dc5826296d6d990d37cd | [
"MIT"
] | null | null | null | from neomodel import config
config.DATABASE_URL = 'bolt://neo4j:fullspeed@35.194.124.33:7687' # default
class Configuration:
DATABASE_URL = 'bolt://neo4j:fullspeed@35.194.124.33:7687' # default
1cb6cf5b31b2518f69891b94d764ef97deb2b4d9 | 116,379 | py | Python | test/unit/test_speech_to_text_v1.py | laggraw/python-sdk | 80b33065b8d526a9a5f9a62dc892a6fba53c703f | [
"Apache-2.0"
] | null | null | null | test/unit/test_speech_to_text_v1.py | laggraw/python-sdk | 80b33065b8d526a9a5f9a62dc892a6fba53c703f | [
"Apache-2.0"
] | 2 | 2020-01-18T23:42:45.000Z | 2020-01-18T23:52:44.000Z | test/unit/test_speech_to_text_v1.py | truthiswill/python-sdk-1 | e0e5f833e4935f9b52c17c4fae653c08b2bc323f | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# (C) Copyright IBM Corp. 2015, 2020.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from ibm_cloud_sdk_core.authenticators.no_auth_authenticator import NoAuthAuthenticator
import inspect
import json
import pytest
import responses
import tempfile
import ibm_watson.speech_to_text_v1
from ibm_watson.speech_to_text_v1 import *
base_url = 'https://stream.watsonplatform.net/speech-to-text/api'
##############################################################################
# Start of Service: Models
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_models
#-----------------------------------------------------------------------------
class TestListModels():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_models_response(self):
body = self.construct_full_body()
response = fake_response_SpeechModels_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_models_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_SpeechModels_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_list_models_empty(self):
check_empty_response(self)
assert len(responses.calls) == 1
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/models'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.list_models(**body)
return output
def construct_full_body(self):
body = dict()
return body
def construct_required_body(self):
body = dict()
return body
#-----------------------------------------------------------------------------
# Test Class for get_model
#-----------------------------------------------------------------------------
class TestGetModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_model_response(self):
body = self.construct_full_body()
response = fake_response_SpeechModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_SpeechModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_get_model_empty(self):
check_empty_required_params(self, fake_response_SpeechModel_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/models/{0}'.format(body['model_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.get_model(**body)
return output
def construct_full_body(self):
body = dict()
body['model_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['model_id'] = "string1"
return body
# endregion
##############################################################################
# End of Service: Models
##############################################################################
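Every test class in this file follows one protocol: make_url / add_mock_response / call_service / construct_*_body, driven by module-level helpers (send_request, check_empty_response, and friends) defined earlier in the file. A minimal sketch of the pattern those helpers implement, with a unittest.mock stub standing in for the HTTP layer — the names `send_request_sketch` and `FakeModelsTest` are illustrative, not this file's actual helpers:

```python
from unittest import mock

BASE_URL = 'https://api.example.test'  # stand-in for this file's base_url


class FakeModelsTest:
    """Mirrors the helper protocol the generated test classes expose."""

    def make_url(self, body):
        return '{0}{1}'.format(BASE_URL, '/v1/models')

    def call_service(self, body, session):
        # The real classes call the SDK; the sketch hits the session directly.
        return session.get(self.make_url(body))


def send_request_sketch(test_obj, body, canned_response):
    """Plausible shape of the shared send_request helper: mock, call, compare."""
    session = mock.Mock()
    session.get.return_value = canned_response
    output = test_obj.call_service(body, session)
    assert output == canned_response
    assert session.get.call_count == 1  # mirrors len(responses.calls) == 1
    return output


result = send_request_sketch(FakeModelsTest(), {}, {'models': []})
```

The real helpers work against the `responses` library rather than a mock session, but the mock-register / call / compare / count-calls sequence is the same.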
##############################################################################
# Start of Service: Synchronous
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for recognize
#-----------------------------------------------------------------------------
class TestRecognize():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_recognize_response(self):
body = self.construct_full_body()
response = fake_response_SpeechRecognitionResults_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_recognize_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_SpeechRecognitionResults_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_recognize_empty(self):
check_empty_required_params(
self, fake_response_SpeechRecognitionResults_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/recognize'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.recognize(**body)
return output
def construct_full_body(self):
body = dict()
body['audio'] = tempfile.NamedTemporaryFile()
body['content_type'] = "string1"
body['model'] = "string1"
body['language_customization_id'] = "string1"
body['acoustic_customization_id'] = "string1"
body['base_model_version'] = "string1"
body['customization_weight'] = 12345.0
body['inactivity_timeout'] = 12345
body['keywords'] = []
body['keywords_threshold'] = 12345.0
body['max_alternatives'] = 12345
body['word_alternatives_threshold'] = 12345.0
body['word_confidence'] = True
body['timestamps'] = True
body['profanity_filter'] = True
body['smart_formatting'] = True
body['speaker_labels'] = True
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
body['redaction'] = True
body['audio_metrics'] = True
body['end_of_phrase_silence_time'] = 12345.0
body['split_transcript_at_phrase_end'] = True
return body
def construct_required_body(self):
body = dict()
body['audio'] = tempfile.NamedTemporaryFile()
return body
# endregion
##############################################################################
# End of Service: Synchronous
##############################################################################
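construct_full_body for recognize above passes roughly twenty optional kwargs alongside the audio payload. A sketch of how such kwargs typically end up as query parameters on the POST — the parameter names come from the test body above, but the query-vs-body split shown here is an assumption about the SDK, not something this file verifies:

```python
from urllib.parse import urlencode


def build_recognize_url(base_url, **params):
    # Drop unset params, plus the audio payload and content type, which
    # travel in the request body and Content-Type header instead.
    query = {k: v for k, v in params.items()
             if v is not None and k not in ('audio', 'content_type')}
    url = '{0}/v1/recognize'.format(base_url)
    if query:
        url = '{0}?{1}'.format(url, urlencode(query))
    return url


url = build_recognize_url('https://api.example.test',
                          model='en-US_BroadbandModel',
                          max_alternatives=3,
                          audio=b'...',
                          content_type='audio/wav')
```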
##############################################################################
# Start of Service: Asynchronous
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for register_callback
#-----------------------------------------------------------------------------
class TestRegisterCallback():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_register_callback_response(self):
body = self.construct_full_body()
response = fake_response_RegisterStatus_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_register_callback_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_RegisterStatus_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_register_callback_empty(self):
check_empty_required_params(self, fake_response_RegisterStatus_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/register_callback'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.register_callback(**body)
return output
def construct_full_body(self):
body = dict()
body['callback_url'] = "string1"
body['user_secret'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['callback_url'] = "string1"
return body
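The tests above only mock the POST to /v1/register_callback. Per the service's documented allow-listing handshake, when a user_secret is supplied the service signs its challenge string with HMAC-SHA1 of that secret and sends the base64-encoded digest in the X-Callback-Signature header; a callback server would recompute it like this (the secret and challenge values below are illustrative):

```python
import base64
import hashlib
import hmac


def callback_signature(user_secret, challenge_string):
    """Base64-encoded HMAC-SHA1 of the challenge, keyed by the user secret."""
    digest = hmac.new(user_secret.encode('utf-8'),
                      challenge_string.encode('utf-8'),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode('ascii')


sig = callback_signature('string1', 'challenge-from-service')
```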
#-----------------------------------------------------------------------------
# Test Class for unregister_callback
#-----------------------------------------------------------------------------
class TestUnregisterCallback():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_unregister_callback_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_unregister_callback_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_unregister_callback_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/unregister_callback'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.unregister_callback(**body)
return output
def construct_full_body(self):
body = dict()
body['callback_url'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['callback_url'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for create_job
#-----------------------------------------------------------------------------
class TestCreateJob():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_create_job_response(self):
body = self.construct_full_body()
response = fake_response_RecognitionJob_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_create_job_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_RecognitionJob_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_create_job_empty(self):
check_empty_required_params(self, fake_response_RecognitionJob_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/recognitions'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=201,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.create_job(**body)
return output
def construct_full_body(self):
body = dict()
body['audio'] = tempfile.NamedTemporaryFile()
body['content_type'] = "string1"
body['model'] = "string1"
body['callback_url'] = "string1"
body['events'] = "string1"
body['user_token'] = "string1"
body['results_ttl'] = 12345
body['language_customization_id'] = "string1"
body['acoustic_customization_id'] = "string1"
body['base_model_version'] = "string1"
body['customization_weight'] = 12345.0
body['inactivity_timeout'] = 12345
body['keywords'] = []
body['keywords_threshold'] = 12345.0
body['max_alternatives'] = 12345
body['word_alternatives_threshold'] = 12345.0
body['word_confidence'] = True
body['timestamps'] = True
body['profanity_filter'] = True
body['smart_formatting'] = True
body['speaker_labels'] = True
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
body['redaction'] = True
body['processing_metrics'] = True
body['processing_metrics_interval'] = 12345.0
body['audio_metrics'] = True
body['end_of_phrase_silence_time'] = 12345.0
body['split_transcript_at_phrase_end'] = True
return body
def construct_required_body(self):
body = dict()
body['audio'] = tempfile.NamedTemporaryFile()
return body
#-----------------------------------------------------------------------------
# Test Class for check_jobs
#-----------------------------------------------------------------------------
class TestCheckJobs():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_check_jobs_response(self):
body = self.construct_full_body()
response = fake_response_RecognitionJobs_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_check_jobs_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_RecognitionJobs_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_check_jobs_empty(self):
check_empty_response(self)
assert len(responses.calls) == 1
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/recognitions'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.check_jobs(**body)
return output
def construct_full_body(self):
body = dict()
return body
def construct_required_body(self):
body = dict()
return body
#-----------------------------------------------------------------------------
# Test Class for check_job
#-----------------------------------------------------------------------------
class TestCheckJob():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_check_job_response(self):
body = self.construct_full_body()
response = fake_response_RecognitionJob_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_check_job_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_RecognitionJob_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_check_job_empty(self):
check_empty_required_params(self, fake_response_RecognitionJob_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/recognitions/{0}'.format(body['id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.check_job(**body)
return output
def construct_full_body(self):
body = dict()
body['id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for delete_job
#-----------------------------------------------------------------------------
class TestDeleteJob():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_job_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_job_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_job_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/recognitions/{0}'.format(body['id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=204,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_job(**body)
return output
def construct_full_body(self):
body = dict()
body['id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['id'] = "string1"
return body
# endregion
##############################################################################
# End of Service: Asynchronous
##############################################################################
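The asynchronous tests above mock create_job, check_job, and delete_job one call at a time. In real use those endpoints form a poll loop; a minimal sketch with a stub standing in for SpeechToTextV1 — the stub and its canned status sequence are illustrative, not taken from this file:

```python
class StubJobs:
    """Stands in for the SDK's async-job methods with canned statuses."""

    def __init__(self, statuses):
        self._statuses = iter(statuses)

    def create_job(self, audio):
        return {'id': 'job-1', 'status': 'waiting'}

    def check_job(self, id):
        return {'id': id, 'status': next(self._statuses)}


def wait_for_job(service, job_id, max_polls=10):
    # Poll check_job until the job reaches a terminal status.
    for _ in range(max_polls):
        job = service.check_job(id=job_id)
        if job['status'] in ('completed', 'failed'):
            return job
    raise TimeoutError('job {0} did not finish'.format(job_id))


service = StubJobs(['waiting', 'processing', 'completed'])
job = service.create_job(audio=b'...')
done = wait_for_job(service, job['id'])
```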
##############################################################################
# Start of Service: CustomLanguageModels
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for create_language_model
#-----------------------------------------------------------------------------
class TestCreateLanguageModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_create_language_model_response(self):
body = self.construct_full_body()
response = fake_response_LanguageModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_create_language_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_LanguageModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_create_language_model_empty(self):
check_empty_required_params(self, fake_response_LanguageModel_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=201,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.create_language_model(**body)
return output
def construct_full_body(self):
body = dict()
body.update({
"name": "string1",
"base_model_name": "string1",
"dialect": "string1",
"description": "string1",
})
return body
def construct_required_body(self):
body = dict()
body.update({
"name": "string1",
"base_model_name": "string1",
})
return body
#-----------------------------------------------------------------------------
# Test Class for list_language_models
#-----------------------------------------------------------------------------
class TestListLanguageModels():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_language_models_response(self):
body = self.construct_full_body()
response = fake_response_LanguageModels_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_language_models_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_LanguageModels_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_list_language_models_empty(self):
check_empty_response(self)
assert len(responses.calls) == 1
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.list_language_models(**body)
return output
def construct_full_body(self):
body = dict()
body['language'] = "string1"
return body
def construct_required_body(self):
body = dict()
return body
#-----------------------------------------------------------------------------
# Test Class for get_language_model
#-----------------------------------------------------------------------------
class TestGetLanguageModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_language_model_response(self):
body = self.construct_full_body()
response = fake_response_LanguageModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_language_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_LanguageModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_get_language_model_empty(self):
check_empty_required_params(self, fake_response_LanguageModel_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}'.format(body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.get_language_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for delete_language_model
#-----------------------------------------------------------------------------
class TestDeleteLanguageModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_language_model_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_language_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_language_model_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}'.format(body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_language_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for train_language_model
#-----------------------------------------------------------------------------
class TestTrainLanguageModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_train_language_model_response(self):
body = self.construct_full_body()
response = fake_response_TrainingResponse_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_train_language_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_TrainingResponse_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_train_language_model_empty(self):
check_empty_required_params(self, fake_response_TrainingResponse_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/train'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.train_language_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_type_to_add'] = "string1"
body['customization_weight'] = 12345.0
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for reset_language_model
#-----------------------------------------------------------------------------
class TestResetLanguageModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_reset_language_model_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_reset_language_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_reset_language_model_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/reset'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.reset_language_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for upgrade_language_model
#-----------------------------------------------------------------------------
class TestUpgradeLanguageModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_upgrade_language_model_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_upgrade_language_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_upgrade_language_model_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/upgrade_model'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.upgrade_language_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
# endregion
##############################################################################
# End of Service: CustomLanguageModels
##############################################################################
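Every test class in this file exposes the same three helper methods (`make_url`, `add_mock_response`, `call_service`) and hands itself to a shared `send_request` helper defined earlier in the file. A minimal sketch of that duck-typed contract, assuming `send_request` simply chains the three helpers in order (the real helper also asserts on the response; `StubTestClass` and `send_request_sketch` are hypothetical stand-ins, not part of the SDK):

```python
def send_request_sketch(test_case, body, mock_response):
    """Build the URL, register the mock, then invoke the service method."""
    url = test_case.make_url(body)
    test_case.add_mock_response(url, mock_response)
    return test_case.call_service(body)

class StubTestClass:
    """Stands in for a class like TestListCorpora: same three helpers,
    but records call order instead of touching HTTP."""
    def __init__(self):
        self.calls = []
    def make_url(self, body):
        self.calls.append('make_url')
        return 'https://example.test/v1/customizations/{0}/corpora'.format(
            body['customization_id'])
    def add_mock_response(self, url, response):
        self.calls.append('add_mock_response')
        self.mocked = (url, response)
    def call_service(self, body):
        self.calls.append('call_service')
        return self.mocked[1]
```

Because `send_request` only relies on these three method names, each generated test class can vary the endpoint, HTTP verb, and body independently while reusing one harness.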
##############################################################################
# Start of Service: CustomCorpora
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_corpora
#-----------------------------------------------------------------------------
class TestListCorpora():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_corpora_response(self):
body = self.construct_full_body()
response = fake_response_Corpora_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_corpora_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_Corpora_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_list_corpora_empty(self):
check_empty_required_params(self, fake_response_Corpora_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/corpora'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.list_corpora(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for add_corpus
#-----------------------------------------------------------------------------
class TestAddCorpus():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_corpus_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_corpus_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_add_corpus_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/corpora/{1}'.format(
body['customization_id'], body['corpus_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=201,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.add_corpus(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['corpus_name'] = "string1"
body['corpus_file'] = tempfile.NamedTemporaryFile()
body['allow_overwrite'] = True
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['corpus_name'] = "string1"
body['corpus_file'] = tempfile.NamedTemporaryFile()
return body
#-----------------------------------------------------------------------------
# Test Class for get_corpus
#-----------------------------------------------------------------------------
class TestGetCorpus():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_corpus_response(self):
body = self.construct_full_body()
response = fake_response_Corpus_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_corpus_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_Corpus_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_get_corpus_empty(self):
check_empty_required_params(self, fake_response_Corpus_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/corpora/{1}'.format(
body['customization_id'], body['corpus_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.get_corpus(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['corpus_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['corpus_name'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for delete_corpus
#-----------------------------------------------------------------------------
class TestDeleteCorpus():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_corpus_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_corpus_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_corpus_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/corpora/{1}'.format(
body['customization_id'], body['corpus_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_corpus(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['corpus_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['corpus_name'] = "string1"
return body
# endregion
##############################################################################
# End of Service: CustomCorpora
##############################################################################
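The `make_url` helpers above interpolate IDs straight into the path with `str.format`, which is safe for the fixed `"string1"` fixtures (and for real customization IDs, which are GUIDs). A hypothetical variant that percent-encodes each path segment, for the case where a value could carry reserved characters (`corpus_url` and `BASE_URL` are illustrative names, not part of this file):

```python
from urllib.parse import quote

BASE_URL = 'https://example.test'  # stands in for this file's base_url

def corpus_url(customization_id, corpus_name):
    """Build the get/add/delete_corpus endpoint with each segment encoded."""
    endpoint = '/v1/customizations/{0}/corpora/{1}'.format(
        quote(customization_id, safe=''), quote(corpus_name, safe=''))
    return BASE_URL + endpoint
```

With `safe=''` even `/` inside a segment is encoded, so a single value can never split into extra path components.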
##############################################################################
# Start of Service: CustomWords
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_words
#-----------------------------------------------------------------------------
class TestListWords():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_words_response(self):
body = self.construct_full_body()
response = fake_response_Words_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_words_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_Words_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_list_words_empty(self):
check_empty_required_params(self, fake_response_Words_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/words'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.list_words(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_type'] = "string1"
body['sort'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for add_words
#-----------------------------------------------------------------------------
class TestAddWords():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_words_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_words_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_add_words_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/words'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=201,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.add_words(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body.update({
"words": [],
})
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body.update({
"words": [],
})
return body
#-----------------------------------------------------------------------------
# Test Class for add_word
#-----------------------------------------------------------------------------
class TestAddWord():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_word_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_word_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_add_word_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/words/{1}'.format(
body['customization_id'], body['word_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.PUT,
url,
body=json.dumps(response),
status=201,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.add_word(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_name'] = "string1"
body.update({
"word": "string1",
"sounds_like": [],
"display_as": "string1",
})
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_name'] = "string1"
body.update({
"word": "string1",
"sounds_like": [],
"display_as": "string1",
})
return body
#-----------------------------------------------------------------------------
# Test Class for get_word
#-----------------------------------------------------------------------------
class TestGetWord():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_word_response(self):
body = self.construct_full_body()
response = fake_response_Word_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_word_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_Word_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_get_word_empty(self):
check_empty_required_params(self, fake_response_Word_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/words/{1}'.format(
body['customization_id'], body['word_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.get_word(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_name'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for delete_word
#-----------------------------------------------------------------------------
class TestDeleteWord():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_word_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_word_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_word_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/words/{1}'.format(
body['customization_id'], body['word_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_word(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['word_name'] = "string1"
return body
# endregion
##############################################################################
# End of Service: CustomWords
##############################################################################
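Each class's Test 3 calls `check_empty_required_params` and `check_missing_required_params` (defined earlier in this file) and then asserts `len(responses.calls) == 0`: presumably the SDK validates required arguments and raises `ValueError` before any HTTP request is issued. A toy sketch of that behavior, assuming that is what the helpers verify (`delete_word_stub` is a hypothetical stand-in for an SDK method such as `delete_word`):

```python
def delete_word_stub(customization_id=None, word_name=None):
    """Mimic the SDK's required-argument validation; no network involved."""
    if customization_id is None:
        raise ValueError('customization_id must be provided')
    if word_name is None:
        raise ValueError('word_name must be provided')
    return {'deleted': word_name}

def fails_validation(fn, **kwargs):
    """True when fn(**kwargs) is rejected during argument validation."""
    try:
        fn(**kwargs)
    except (TypeError, ValueError):
        return True
    return False
```

Validation failing before any request is made is exactly why the mock's call counter stays at zero in the `_empty` tests.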
##############################################################################
# Start of Service: CustomGrammars
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_grammars
#-----------------------------------------------------------------------------
class TestListGrammars():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_grammars_response(self):
body = self.construct_full_body()
response = fake_response_Grammars_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_grammars_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_Grammars_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_list_grammars_empty(self):
check_empty_required_params(self, fake_response_Grammars_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/grammars'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.list_grammars(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for add_grammar
#-----------------------------------------------------------------------------
class TestAddGrammar():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_grammar_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_grammar_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_add_grammar_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/grammars/{1}'.format(
body['customization_id'], body['grammar_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=201,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.add_grammar(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
body['grammar_file'] = "string1"
body['content_type'] = "string1"
body['allow_overwrite'] = True
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
body['grammar_file'] = "string1"
body['content_type'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for get_grammar
#-----------------------------------------------------------------------------
class TestGetGrammar():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_grammar_response(self):
body = self.construct_full_body()
response = fake_response_Grammar_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_grammar_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_Grammar_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_get_grammar_empty(self):
check_empty_required_params(self, fake_response_Grammar_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/grammars/{1}'.format(
body['customization_id'], body['grammar_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.get_grammar(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for delete_grammar
#-----------------------------------------------------------------------------
class TestDeleteGrammar():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_grammar_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_grammar_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_grammar_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/customizations/{0}/grammars/{1}'.format(
body['customization_id'], body['grammar_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_grammar(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['grammar_name'] = "string1"
return body
# endregion
##############################################################################
# End of Service: CustomGrammars
##############################################################################
##############################################################################
# Start of Service: CustomAcousticModels
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for create_acoustic_model
#-----------------------------------------------------------------------------
class TestCreateAcousticModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_create_acoustic_model_response(self):
body = self.construct_full_body()
response = fake_response_AcousticModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_create_acoustic_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_AcousticModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_create_acoustic_model_empty(self):
check_empty_required_params(self, fake_response_AcousticModel_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=201,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.create_acoustic_model(**body)
return output
def construct_full_body(self):
body = dict()
body.update({
"name": "string1",
"base_model_name": "string1",
"description": "string1",
})
return body
def construct_required_body(self):
body = dict()
body.update({
"name": "string1",
"base_model_name": "string1",
})
return body
#-----------------------------------------------------------------------------
# Test Class for list_acoustic_models
#-----------------------------------------------------------------------------
class TestListAcousticModels():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_acoustic_models_response(self):
body = self.construct_full_body()
response = fake_response_AcousticModels_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_acoustic_models_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_AcousticModels_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_list_acoustic_models_empty(self):
check_empty_response(self)
assert len(responses.calls) == 1
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.list_acoustic_models(**body)
return output
def construct_full_body(self):
body = dict()
body['language'] = "string1"
return body
def construct_required_body(self):
body = dict()
return body
#-----------------------------------------------------------------------------
# Test Class for get_acoustic_model
#-----------------------------------------------------------------------------
class TestGetAcousticModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_acoustic_model_response(self):
body = self.construct_full_body()
response = fake_response_AcousticModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_acoustic_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_AcousticModel_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_get_acoustic_model_empty(self):
check_empty_required_params(self, fake_response_AcousticModel_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.get_acoustic_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for delete_acoustic_model
#-----------------------------------------------------------------------------
class TestDeleteAcousticModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_acoustic_model_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_acoustic_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_acoustic_model_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_acoustic_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for train_acoustic_model
#-----------------------------------------------------------------------------
class TestTrainAcousticModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_train_acoustic_model_response(self):
body = self.construct_full_body()
response = fake_response_TrainingResponse_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_train_acoustic_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_TrainingResponse_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_train_acoustic_model_empty(self):
check_empty_required_params(self, fake_response_TrainingResponse_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}/train'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.train_acoustic_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['custom_language_model_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for reset_acoustic_model
#-----------------------------------------------------------------------------
class TestResetAcousticModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_reset_acoustic_model_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_reset_acoustic_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_reset_acoustic_model_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}/reset'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.reset_acoustic_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for upgrade_acoustic_model
#-----------------------------------------------------------------------------
class TestUpgradeAcousticModel():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_upgrade_acoustic_model_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_upgrade_acoustic_model_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_upgrade_acoustic_model_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}/upgrade_model'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.upgrade_acoustic_model(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['custom_language_model_id'] = "string1"
body['force'] = True
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
# endregion
##############################################################################
# End of Service: CustomAcousticModels
##############################################################################
##############################################################################
# Start of Service: CustomAudioResources
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for list_audio
#-----------------------------------------------------------------------------
class TestListAudio():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_audio_response(self):
body = self.construct_full_body()
response = fake_response_AudioResources_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_list_audio_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_AudioResources_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_list_audio_empty(self):
check_empty_required_params(self, fake_response_AudioResources_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}/audio'.format(
body['customization_id'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.list_audio(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for add_audio
#-----------------------------------------------------------------------------
class TestAddAudio():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_audio_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_add_audio_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_add_audio_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}/audio/{1}'.format(
body['customization_id'], body['audio_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.POST,
url,
body=json.dumps(response),
status=201,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.add_audio(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['audio_name'] = "string1"
body['audio_resource'] = tempfile.NamedTemporaryFile()
body['content_type'] = "string1"
body['contained_content_type'] = "string1"
body['allow_overwrite'] = True
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['audio_name'] = "string1"
body['audio_resource'] = tempfile.NamedTemporaryFile()
return body
#-----------------------------------------------------------------------------
# Test Class for get_audio
#-----------------------------------------------------------------------------
class TestGetAudio():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_audio_response(self):
body = self.construct_full_body()
response = fake_response_AudioListing_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_get_audio_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response_AudioListing_json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_get_audio_empty(self):
check_empty_required_params(self, fake_response_AudioListing_json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}/audio/{1}'.format(
body['customization_id'], body['audio_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.GET,
url,
body=json.dumps(response),
status=200,
content_type='application/json')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.get_audio(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['audio_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['audio_name'] = "string1"
return body
#-----------------------------------------------------------------------------
# Test Class for delete_audio
#-----------------------------------------------------------------------------
class TestDeleteAudio():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_audio_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_audio_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_audio_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/acoustic_customizations/{0}/audio/{1}'.format(
body['customization_id'], body['audio_name'])
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_audio(**body)
return output
def construct_full_body(self):
body = dict()
body['customization_id'] = "string1"
body['audio_name'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customization_id'] = "string1"
body['audio_name'] = "string1"
return body
# endregion
##############################################################################
# End of Service: CustomAudioResources
##############################################################################
##############################################################################
# Start of Service: UserData
##############################################################################
# region
#-----------------------------------------------------------------------------
# Test Class for delete_user_data
#-----------------------------------------------------------------------------
class TestDeleteUserData():
#--------------------------------------------------------
# Test 1: Send fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_user_data_response(self):
body = self.construct_full_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 2: Send only required fake data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_user_data_required_response(self):
# Check response with required params
body = self.construct_required_body()
response = fake_response__json
send_request(self, body, response)
assert len(responses.calls) == 1
#--------------------------------------------------------
# Test 3: Send empty data and check response
#--------------------------------------------------------
@responses.activate
def test_delete_user_data_empty(self):
check_empty_required_params(self, fake_response__json)
check_missing_required_params(self)
assert len(responses.calls) == 0
#-----------
#- Helpers -
#-----------
def make_url(self, body):
endpoint = '/v1/user_data'
url = '{0}{1}'.format(base_url, endpoint)
return url
def add_mock_response(self, url, response):
responses.add(responses.DELETE,
url,
body=json.dumps(response),
status=200,
content_type='')
def call_service(self, body):
service = SpeechToTextV1(authenticator=NoAuthAuthenticator(),)
service.set_service_url(base_url)
output = service.delete_user_data(**body)
return output
def construct_full_body(self):
body = dict()
body['customer_id'] = "string1"
return body
def construct_required_body(self):
body = dict()
body['customer_id'] = "string1"
return body
# endregion
##############################################################################
# End of Service: UserData
##############################################################################
def check_empty_required_params(obj, response):
    """Test function to assert that the operation raises an error when given empty required data
    Args:
        obj: The generated test class instance
        response: Mock response body
    """
    body = obj.construct_full_body()
    body = {k: None for k in body.keys()}
    error = False
    try:
        send_request(obj, body, response)
    except ValueError:
        error = True
    assert error

def check_missing_required_params(obj):
    """Test function to assert that the operation raises an error when missing required data
    Args:
        obj: The generated test class instance
    """
    body = obj.construct_full_body()
    url = obj.make_url(body)
    error = False
    try:
        send_request(obj, {}, {}, url=url)
    except TypeError:
        error = True
    assert error

def check_empty_response(obj):
    """Test function to assert that the operation returns an empty response when given an empty request
    Args:
        obj: The generated test class instance
    """
    body = obj.construct_full_body()
    url = obj.make_url(body)
    send_request(obj, {}, {}, url=url)

def send_request(obj, body, response, url=None):
    """Test function to create a request, send it, and assert that it matches the mock response
    Args:
        obj: The generated test class instance
        body: Dict filled with fake data for calling the service
        response: Mock response body registered with the responses library
        url: Optional explicit URL; derived from body when omitted
    """
    if not url:
        url = obj.make_url(body)
    obj.add_mock_response(url, response)
    output = obj.call_service(body)
    assert responses.calls[0].request.url.startswith(url)
    assert output.get_result() == response
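# The helpers above share a small implicit protocol: every generated test class
# supplies make_url, add_mock_response, and call_service, and send_request wires
# them together. A stdlib-only sketch of that orchestration follows, with a
# hypothetical in-memory registry (MOCKS/CALLS) standing in for the responses
# library; names like FakeListModels are illustrative, not part of the suite.

```python
# Minimal sketch of the send_request orchestration used by the tests above.
# MOCKS and CALLS are a hypothetical stand-in for the `responses` library.

MOCKS = {}  # url -> canned response body
CALLS = []  # record of URLs "requested"

class FakeListModels:
    """Plays the role of a generated test class (make_url / add_mock_response / call_service)."""
    base_url = "https://example.test"

    def make_url(self, body):
        return self.base_url + "/v1/models"

    def add_mock_response(self, url, response):
        MOCKS[url] = response  # register the canned reply

    def call_service(self, body):
        url = self.make_url(body)
        CALLS.append(url)   # pretend an HTTP request happened
        return MOCKS[url]   # and return the mocked body

def demo_send_request(obj, body, response):
    """Mirror of the send_request helper: register the mock, call, and assert."""
    url = obj.make_url(body)
    obj.add_mock_response(url, response)
    output = obj.call_service(body)
    assert CALLS[0].startswith(url)
    assert output == response
    return output

result = demo_send_request(FakeListModels(), {}, {"models": []})
print(result)  # -> {'models': []}
```

# In the real suite the service call goes through the SDK client and the
# `responses` library intercepts the HTTP layer; the control flow is the same.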
####################
## Mock Responses ##
####################
fake_response__json = None
fake_response_SpeechModels_json = """{"models": []}"""
fake_response_SpeechModel_json = """{"name": "fake_name", "language": "fake_language", "rate": 4, "url": "fake_url", "supported_features": {"custom_language_model": false, "speaker_labels": true}, "description": "fake_description"}"""
fake_response_SpeechRecognitionResults_json = """{"results": [], "result_index": 12, "speaker_labels": [], "processing_metrics": {"processed_audio": {"received": 8, "seen_by_engine": 14, "transcription": 13, "speaker_labels": 14}, "wall_clock_since_first_byte_received": 36, "periodic": true}, "audio_metrics": {"sampling_interval": 17, "accumulated": {"final": false, "end_time": 8, "signal_to_noise_ratio": 21, "speech_ratio": 12, "high_frequency_loss": 19, "direct_current_offset": [], "clipping_rate": [], "speech_level": [], "non_speech_level": []}}, "warnings": []}"""
fake_response_RegisterStatus_json = """{"status": "fake_status", "url": "fake_url"}"""
fake_response_RecognitionJob_json = """{"id": "fake_id", "status": "fake_status", "created": "fake_created", "updated": "fake_updated", "url": "fake_url", "user_token": "fake_user_token", "results": [], "warnings": []}"""
fake_response_RecognitionJobs_json = """{"recognitions": []}"""
fake_response_LanguageModel_json = """{"customization_id": "fake_customization_id", "created": "fake_created", "updated": "fake_updated", "language": "fake_language", "dialect": "fake_dialect", "versions": [], "owner": "fake_owner", "name": "fake_name", "description": "fake_description", "base_model_name": "fake_base_model_name", "status": "fake_status", "progress": 8, "error": "fake_error", "warnings": "fake_warnings"}"""
fake_response_LanguageModels_json = """{"customizations": []}"""
fake_response_TrainingResponse_json = """{"warnings": []}"""
fake_response_Corpora_json = """{"corpora": []}"""
fake_response_Corpus_json = """{"name": "fake_name", "total_words": 11, "out_of_vocabulary_words": 23, "status": "fake_status", "error": "fake_error"}"""
fake_response_Words_json = """{"words": []}"""
fake_response_Word_json = """{"word": "fake_word", "sounds_like": [], "display_as": "fake_display_as", "count": 5, "source": [], "error": []}"""
fake_response_Grammars_json = """{"grammars": []}"""
fake_response_Grammar_json = """{"name": "fake_name", "out_of_vocabulary_words": 23, "status": "fake_status", "error": "fake_error"}"""
fake_response_AcousticModel_json = """{"customization_id": "fake_customization_id", "created": "fake_created", "updated": "fake_updated", "language": "fake_language", "versions": [], "owner": "fake_owner", "name": "fake_name", "description": "fake_description", "base_model_name": "fake_base_model_name", "status": "fake_status", "progress": 8, "warnings": "fake_warnings"}"""
fake_response_AcousticModels_json = """{"customizations": []}"""
fake_response_AudioResources_json = """{"total_minutes_of_audio": 22, "audio": []}"""
fake_response_AudioListing_json = """{"duration": 8, "name": "fake_name", "details": {"type": "fake_type", "codec": "fake_codec", "frequency": 9, "compression": "fake_compression"}, "status": "fake_status", "container": {"duration": 8, "name": "fake_name", "details": {"type": "fake_type", "codec": "fake_codec", "frequency": 9, "compression": "fake_compression"}, "status": "fake_status"}, "audio": []}"""
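# Each fake_response_*_json fixture above is a raw JSON string, so every one of
# them should round-trip through json.loads. A quick illustrative sanity check
# (using a small subset of the fixtures copied inline) looks like:

```python
import json

# Subset of the mock payloads defined above, copied inline for illustration.
fixtures = {
    "SpeechModels": """{"models": []}""",
    "TrainingResponse": """{"warnings": []}""",
    "AcousticModels": """{"customizations": []}""",
}

# Parse every fixture; json.loads raises ValueError on malformed JSON,
# so simply building this dict validates the payloads.
parsed = {name: json.loads(text) for name, text in fixtures.items()}
print(parsed["SpeechModels"])  # -> {'models': []}
```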
] | 7 | 2016-09-06T02:45:58.000Z | 2020-12-26T19:30:52.000Z | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
import opac_pb2 as opac__pb2
class AssetServiceStub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.get_asset = channel.unary_unary(
'/AssetService/get_asset',
request_serializer=opac__pb2.TaskId.SerializeToString,
response_deserializer=opac__pb2.Asset.FromString,
)
self.add_asset = channel.unary_unary(
'/AssetService/add_asset',
request_serializer=opac__pb2.Asset.SerializeToString,
response_deserializer=opac__pb2.TaskId.FromString,
)
self.update_asset = channel.unary_unary(
'/AssetService/update_asset',
request_serializer=opac__pb2.Asset.SerializeToString,
response_deserializer=opac__pb2.TaskId.FromString,
)
self.remove_asset = channel.unary_unary(
'/AssetService/remove_asset',
request_serializer=opac__pb2.TaskId.SerializeToString,
response_deserializer=opac__pb2.AssetRemoved.FromString,
)
self.exists_asset = channel.unary_unary(
'/AssetService/exists_asset',
request_serializer=opac__pb2.TaskId.SerializeToString,
response_deserializer=opac__pb2.AssetExists.FromString,
)
self.get_asset_info = channel.unary_unary(
'/AssetService/get_asset_info',
request_serializer=opac__pb2.TaskId.SerializeToString,
response_deserializer=opac__pb2.AssetInfo.FromString,
)
self.get_task_state = channel.unary_unary(
'/AssetService/get_task_state',
request_serializer=opac__pb2.TaskId.SerializeToString,
response_deserializer=opac__pb2.TaskState.FromString,
)
self.get_bucket = channel.unary_unary(
'/AssetService/get_bucket',
request_serializer=opac__pb2.TaskId.SerializeToString,
response_deserializer=opac__pb2.Bucket.FromString,
)
self.query = channel.unary_unary(
'/AssetService/query',
request_serializer=opac__pb2.Asset.SerializeToString,
response_deserializer=opac__pb2.Assets.FromString,
)
class AssetServiceServicer(object):
# missing associated documentation comment in .proto file
pass
def get_asset(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_asset(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def update_asset(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def remove_asset(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def exists_asset(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def get_asset_info(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def get_task_state(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def get_bucket(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def query(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_AssetServiceServicer_to_server(servicer, server):
rpc_method_handlers = {
'get_asset': grpc.unary_unary_rpc_method_handler(
servicer.get_asset,
request_deserializer=opac__pb2.TaskId.FromString,
response_serializer=opac__pb2.Asset.SerializeToString,
),
'add_asset': grpc.unary_unary_rpc_method_handler(
servicer.add_asset,
request_deserializer=opac__pb2.Asset.FromString,
response_serializer=opac__pb2.TaskId.SerializeToString,
),
'update_asset': grpc.unary_unary_rpc_method_handler(
servicer.update_asset,
request_deserializer=opac__pb2.Asset.FromString,
response_serializer=opac__pb2.TaskId.SerializeToString,
),
'remove_asset': grpc.unary_unary_rpc_method_handler(
servicer.remove_asset,
request_deserializer=opac__pb2.TaskId.FromString,
response_serializer=opac__pb2.AssetRemoved.SerializeToString,
),
'exists_asset': grpc.unary_unary_rpc_method_handler(
servicer.exists_asset,
request_deserializer=opac__pb2.TaskId.FromString,
response_serializer=opac__pb2.AssetExists.SerializeToString,
),
'get_asset_info': grpc.unary_unary_rpc_method_handler(
servicer.get_asset_info,
request_deserializer=opac__pb2.TaskId.FromString,
response_serializer=opac__pb2.AssetInfo.SerializeToString,
),
'get_task_state': grpc.unary_unary_rpc_method_handler(
servicer.get_task_state,
request_deserializer=opac__pb2.TaskId.FromString,
response_serializer=opac__pb2.TaskState.SerializeToString,
),
'get_bucket': grpc.unary_unary_rpc_method_handler(
servicer.get_bucket,
request_deserializer=opac__pb2.TaskId.FromString,
response_serializer=opac__pb2.Bucket.SerializeToString,
),
'query': grpc.unary_unary_rpc_method_handler(
servicer.query,
request_deserializer=opac__pb2.Asset.FromString,
response_serializer=opac__pb2.Assets.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'AssetService', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
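The registration function above builds a name-to-handler table from the servicer's bound methods and hands it to the server as a generic handler. The same dispatch shape can be sketched without grpc at all; `FakeServicer`, `build_handlers`, and `dispatch` below are illustrative stand-ins, not part of the grpc API.

```python
# Dependency-free sketch of the dispatch table that
# add_AssetServiceServicer_to_server builds: method names map to the
# servicer's bound methods, and the server looks handlers up by name.
class FakeServicer(object):
    def get_asset(self, request, context):
        # Echo the request back, standing in for a real Asset lookup.
        return {"asset": request}

def build_handlers(servicer):
    # Mirrors the rpc_method_handlers dict, minus (de)serializers.
    return {
        'get_asset': servicer.get_asset,
    }

def dispatch(handlers, method, request, context=None):
    return handlers[method](request, context)

servicer = FakeServicer()
handlers = build_handlers(servicer)
print(dispatch(handlers, 'get_asset', 'task-1'))  # {'asset': 'task-1'}
```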
class BucketServiceStub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.add_bucket = channel.unary_unary(
'/BucketService/add_bucket',
request_serializer=opac__pb2.BucketName.SerializeToString,
response_deserializer=opac__pb2.TaskId.FromString,
)
self.update_bucket = channel.unary_unary(
'/BucketService/update_bucket',
request_serializer=opac__pb2.BucketName.SerializeToString,
response_deserializer=opac__pb2.TaskId.FromString,
)
self.remove_bucket = channel.unary_unary(
'/BucketService/remove_bucket',
request_serializer=opac__pb2.BucketName.SerializeToString,
response_deserializer=opac__pb2.BucketRemoved.FromString,
)
self.exists_bucket = channel.unary_unary(
'/BucketService/exists_bucket',
request_serializer=opac__pb2.BucketName.SerializeToString,
response_deserializer=opac__pb2.BucketExists.FromString,
)
self.get_assets = channel.unary_unary(
'/BucketService/get_assets',
request_serializer=opac__pb2.BucketName.SerializeToString,
response_deserializer=opac__pb2.Assets.FromString,
)
class BucketServiceServicer(object):
# missing associated documentation comment in .proto file
pass
def add_bucket(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def update_bucket(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def remove_bucket(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def exists_bucket(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def get_assets(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_BucketServiceServicer_to_server(servicer, server):
rpc_method_handlers = {
'add_bucket': grpc.unary_unary_rpc_method_handler(
servicer.add_bucket,
request_deserializer=opac__pb2.BucketName.FromString,
response_serializer=opac__pb2.TaskId.SerializeToString,
),
'update_bucket': grpc.unary_unary_rpc_method_handler(
servicer.update_bucket,
request_deserializer=opac__pb2.BucketName.FromString,
response_serializer=opac__pb2.TaskId.SerializeToString,
),
'remove_bucket': grpc.unary_unary_rpc_method_handler(
servicer.remove_bucket,
request_deserializer=opac__pb2.BucketName.FromString,
response_serializer=opac__pb2.BucketRemoved.SerializeToString,
),
'exists_bucket': grpc.unary_unary_rpc_method_handler(
servicer.exists_bucket,
request_deserializer=opac__pb2.BucketName.FromString,
response_serializer=opac__pb2.BucketExists.SerializeToString,
),
'get_assets': grpc.unary_unary_rpc_method_handler(
servicer.get_assets,
request_deserializer=opac__pb2.BucketName.FromString,
response_serializer=opac__pb2.Assets.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'BucketService', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
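On the client side, the stub constructors above only ever call `channel.unary_unary(path, ...)`. A recording fake channel is therefore enough to exercise the wiring without a network; `FakeChannel` below is an illustrative test double, not part of grpc.

```python
# The stub constructors above only call channel.unary_unary(path, ...),
# so a recording fake channel shows which RPC paths a stub registers.
class FakeChannel(object):
    def __init__(self):
        self.paths = []

    def unary_unary(self, path, request_serializer=None,
                    response_deserializer=None):
        self.paths.append(path)
        # Return a callable standing in for the real RPC invoker.
        return lambda request: (path, request)

channel = FakeChannel()
# Mimic one line of BucketServiceStub.__init__:
add_bucket = channel.unary_unary('/BucketService/add_bucket')
print(channel.paths)       # ['/BucketService/add_bucket']
print(add_bucket('name'))  # ('/BucketService/add_bucket', 'name')
```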
| 38.604096 | 72 | 0.734595 | 1,194 | 11,311 | 6.639028 | 0.068677 | 0.051217 | 0.060048 | 0.084017 | 0.884067 | 0.833859 | 0.825155 | 0.810521 | 0.736092 | 0.722089 | 0 | 0.006295 | 0.185395 | 11,311 | 292 | 73 | 38.736301 | 0.854026 | 0.103439 | 0 | 0.521368 | 1 | 0 | 0.117192 | 0.033512 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.076923 | 0.008547 | 0 | 0.102564 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
e806930c4888996b4dc49e4756ba84f9d960893e | 25,098 | py | Python | Netconf/bindings/bindingConnection.py | lrodrin/transceivers | 62b8e44c0a459615a0464a9567e28e195c4dabf1 | [
"MIT"
] | null | null | null | Netconf/bindings/bindingConnection.py | lrodrin/transceivers | 62b8e44c0a459615a0464a9567e28e195c4dabf1 | [
"MIT"
] | null | null | null | Netconf/bindings/bindingConnection.py | lrodrin/transceivers | 62b8e44c0a459615a0464a9567e28e195c4dabf1 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from operator import attrgetter
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType
from pyangbind.lib.yangtypes import RestrictedClassType
from pyangbind.lib.yangtypes import TypedListType
from pyangbind.lib.yangtypes import YANGBool
from pyangbind.lib.yangtypes import YANGListType
from pyangbind.lib.yangtypes import YANGDynClass
from pyangbind.lib.yangtypes import ReferenceType
from pyangbind.lib.base import PybindBase
from collections import OrderedDict
from decimal import Decimal
from bitarray import bitarray
import six
# PY3 support of some PY2 keywords (needs improved)
if six.PY3:
import builtins as __builtin__
long = int
elif six.PY2:
import __builtin__
class yc_transceiver_node_connectivity__connection_transceiver(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module node-connectivity - based on the path /connection/transceiver. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_path_helper', '_extmethods', '__transceiverid',)
_yang_name = 'transceiver'
_yang_namespace = 'urn:node-connectivity'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
self._path_helper = False
self._extmethods = False
self.__transceiverid = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="transceiverid", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, is_keyval=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path() + [self._yang_name]
else:
return [u'connection', u'transceiver']
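The `_path()` method above recurses through `_parent` links, appending each node's YANG name, so a leaf deep in the tree yields its full path. A stripped-down sketch of that recursion (the `Node` class is illustrative, not pyangbind code):

```python
# Sketch of the recursive _path() pattern: each node appends its YANG
# name to its parent's path; the parentless root anchors the recursion.
class Node(object):
    def __init__(self, yang_name, parent=None):
        self._yang_name = yang_name
        self._parent = parent

    def _path(self):
        if self._parent is not None:
            return self._parent._path() + [self._yang_name]
        return [self._yang_name]

root = Node('connection')
child = Node('transceiver', parent=root)
print(child._path())  # ['connection', 'transceiver']
```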
def _get_transceiverid(self):
"""
Getter method for transceiverid, mapped from YANG variable /connection/transceiver/transceiverid (string)
"""
return self.__transceiverid
def _set_transceiverid(self, v, load=False):
"""
Setter method for transceiverid, mapped from YANG variable /connection/transceiver/transceiverid (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_transceiverid is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_transceiverid() directly.
"""
parent = getattr(self, "_parent", None)
if parent is not None and load is False:
raise AttributeError("Cannot set keys directly when" +
" within an instantiated list")
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v, base=six.text_type, is_leaf=True, yang_name="transceiverid", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True,
is_keyval=True, namespace='urn:node-connectivity', defining_module='node-connectivity',
yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """transceiverid must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="transceiverid", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='string', is_config=True)""",
})
self.__transceiverid = t
if hasattr(self, '_set'):
self._set()
def _unset_transceiverid(self):
self.__transceiverid = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="transceiverid", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, is_keyval=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
transceiverid = __builtin__.property(_get_transceiverid, _set_transceiverid)
_pyangbind_elements = OrderedDict([('transceiverid', transceiverid), ])
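The accessor triple above (private storage, a validating setter that raises `ValueError`, and a `property` tying them together) is the core pyangbind pattern. A miniature of it in plain Python; `MiniLeaf` is illustrative only, not pyangbind code:

```python
# Miniature of the pyangbind accessor pattern: name-mangled private
# storage, a type-validating setter, and a property exposing both.
class MiniLeaf(object):
    def __init__(self):
        self.__transceiverid = ""

    def _get_transceiverid(self):
        return self.__transceiverid

    def _set_transceiverid(self, v):
        if not isinstance(v, str):
            raise ValueError(
                "transceiverid must be of a type compatible with string")
        self.__transceiverid = v

    transceiverid = property(_get_transceiverid, _set_transceiverid)

leaf = MiniLeaf()
leaf.transceiverid = "t1"
print(leaf.transceiverid)  # t1
```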
class yc_connection_node_connectivity__connection(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module node-connectivity - based on the path /connection. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_path_helper', '_extmethods', '__connectionid', '__port_in_id', '__port_out_out', '__transceiver',)
_yang_name = 'connection'
_yang_namespace = 'urn:node-connectivity'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
self._path_helper = False
self._extmethods = False
self.__port_in_id = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="port-in_id", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
self.__transceiver = YANGDynClass(
base=YANGListType("transceiverid", yc_transceiver_node_connectivity__connection_transceiver,
yang_name="transceiver", parent=self, is_container='list', user_ordered=False,
path_helper=self._path_helper, yang_keys='transceiverid', extensions=None),
is_container='list', yang_name="transceiver", parent=self, path_helper=self._path_helper,
extmethods=self._extmethods, register_paths=True, extensions=None, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='list', is_config=True)
self.__connectionid = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="connectionid", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, is_keyval=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
self.__port_out_out = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="port-out_out", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path() + [self._yang_name]
else:
return [u'connection']
def _get_connectionid(self):
"""
Getter method for connectionid, mapped from YANG variable /connection/connectionid (string)
"""
return self.__connectionid
def _set_connectionid(self, v, load=False):
"""
Setter method for connectionid, mapped from YANG variable /connection/connectionid (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_connectionid is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_connectionid() directly.
"""
parent = getattr(self, "_parent", None)
if parent is not None and load is False:
raise AttributeError("Cannot set keys directly when" +
" within an instantiated list")
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v, base=six.text_type, is_leaf=True, yang_name="connectionid", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True,
is_keyval=True, namespace='urn:node-connectivity', defining_module='node-connectivity',
yang_type='string', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """connectionid must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="connectionid", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='string', is_config=True)""",
})
self.__connectionid = t
if hasattr(self, '_set'):
self._set()
def _unset_connectionid(self):
self.__connectionid = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="connectionid", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, is_keyval=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
def _get_port_in_id(self):
"""
Getter method for port_in_id, mapped from YANG variable /connection/port_in_id (string)
"""
return self.__port_in_id
def _set_port_in_id(self, v, load=False):
"""
Setter method for port_in_id, mapped from YANG variable /connection/port_in_id (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_port_in_id is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_port_in_id() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v, base=six.text_type, is_leaf=True, yang_name="port-in_id", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True,
namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='string',
is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """port_in_id must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="port-in_id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='string', is_config=True)""",
})
self.__port_in_id = t
if hasattr(self, '_set'):
self._set()
def _unset_port_in_id(self):
self.__port_in_id = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="port-in_id", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
def _get_port_out_out(self):
"""
Getter method for port_out_out, mapped from YANG variable /connection/port_out_out (string)
"""
return self.__port_out_out
def _set_port_out_out(self, v, load=False):
"""
Setter method for port_out_out, mapped from YANG variable /connection/port_out_out (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_port_out_out is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_port_out_out() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v, base=six.text_type, is_leaf=True, yang_name="port-out_out", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True,
namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='string',
is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """port_out_out must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="port-out_out", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='string', is_config=True)""",
})
self.__port_out_out = t
if hasattr(self, '_set'):
self._set()
def _unset_port_out_out(self):
self.__port_out_out = YANGDynClass(base=six.text_type, is_leaf=True, yang_name="port-out_out", parent=self,
path_helper=self._path_helper, extmethods=self._extmethods,
register_paths=True, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='string', is_config=True)
def _get_transceiver(self):
"""
Getter method for transceiver, mapped from YANG variable /connection/transceiver (list)
"""
return self.__transceiver
def _set_transceiver(self, v, load=False):
"""
Setter method for transceiver, mapped from YANG variable /connection/transceiver (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_transceiver is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_transceiver() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v, base=YANGListType("transceiverid",
yc_transceiver_node_connectivity__connection_transceiver,
yang_name="transceiver", parent=self, is_container='list',
user_ordered=False, path_helper=self._path_helper,
yang_keys='transceiverid', extensions=None), is_container='list',
yang_name="transceiver", parent=self, path_helper=self._path_helper,
extmethods=self._extmethods, register_paths=True, extensions=None,
namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='list',
is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """transceiver must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("transceiverid",yc_transceiver_node_connectivity__connection_transceiver, yang_name="transceiver", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='transceiverid', extensions=None), is_container='list', yang_name="transceiver", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions=None, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='list', is_config=True)""",
})
self.__transceiver = t
if hasattr(self, '_set'):
self._set()
def _unset_transceiver(self):
self.__transceiver = YANGDynClass(
base=YANGListType("transceiverid", yc_transceiver_node_connectivity__connection_transceiver,
yang_name="transceiver", parent=self, is_container='list', user_ordered=False,
path_helper=self._path_helper, yang_keys='transceiverid', extensions=None),
is_container='list', yang_name="transceiver", parent=self, path_helper=self._path_helper,
extmethods=self._extmethods, register_paths=True, extensions=None, namespace='urn:node-connectivity',
defining_module='node-connectivity', yang_type='list', is_config=True)
connectionid = __builtin__.property(_get_connectionid, _set_connectionid)
port_in_id = __builtin__.property(_get_port_in_id, _set_port_in_id)
port_out_out = __builtin__.property(_get_port_out_out, _set_port_out_out)
transceiver = __builtin__.property(_get_transceiver, _set_transceiver)
_pyangbind_elements = OrderedDict(
[('connectionid', connectionid), ('port_in_id', port_in_id), ('port_out_out', port_out_out),
('transceiver', transceiver), ])
class node_connectivity(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module node-connectivity - based on the path /node-connectivity. Each member element of
the container is represented as a class variable - with a specific
YANG type.
YANG Description: Latest update to node connectivity SDM YANG data model.
"""
__slots__ = ('_path_helper', '_extmethods', '__connection',)
_yang_name = 'node-connectivity'
_yang_namespace = 'urn:node-connectivity'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
self._path_helper = False
self._extmethods = False
self.__connection = YANGDynClass(
base=YANGListType("connectionid", yc_connection_node_connectivity__connection, yang_name="connection",
parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper,
yang_keys='connectionid', extensions=None), is_container='list', yang_name="connection",
parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True,
extensions=None, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='list',
is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path() + [self._yang_name]
else:
return []
def _get_connection(self):
"""
Getter method for connection, mapped from YANG variable /connection (list)
"""
return self.__connection
def _set_connection(self, v, load=False):
"""
Setter method for connection, mapped from YANG variable /connection (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_connection is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_connection() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v, base=YANGListType("connectionid", yc_connection_node_connectivity__connection,
yang_name="connection", parent=self, is_container='list',
user_ordered=False, path_helper=self._path_helper,
yang_keys='connectionid', extensions=None), is_container='list',
yang_name="connection", parent=self, path_helper=self._path_helper,
extmethods=self._extmethods, register_paths=True, extensions=None,
namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='list',
is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """connection must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("connectionid",yc_connection_node_connectivity__connection, yang_name="connection", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='connectionid', extensions=None), is_container='list', yang_name="connection", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions=None, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='list', is_config=True)""",
})
self.__connection = t
if hasattr(self, '_set'):
self._set()
def _unset_connection(self):
self.__connection = YANGDynClass(
base=YANGListType("connectionid", yc_connection_node_connectivity__connection, yang_name="connection",
parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper,
yang_keys='connectionid', extensions=None), is_container='list', yang_name="connection",
parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True,
extensions=None, namespace='urn:node-connectivity', defining_module='node-connectivity', yang_type='list',
is_config=True)
connection = __builtin__.property(_get_connection, _set_connection)
_pyangbind_elements = OrderedDict([('connection', connection), ])
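The `_pyangbind_elements` OrderedDict above lets generic code walk a container's members in YANG declaration order. A sketch of how a serializer could use it; `MiniContainer` and `to_plain_dict` are illustrative helpers, not pyangbind APIs:

```python
from collections import OrderedDict

# _pyangbind_elements preserves YANG declaration order, so a generic
# walker can emit a container's members deterministically.
class MiniContainer(object):
    _pyangbind_elements = OrderedDict(
        [('connectionid', None), ('port_in_id', None)])

    def __init__(self, **values):
        for name in self._pyangbind_elements:
            setattr(self, name, values.get(name))

def to_plain_dict(obj):
    return OrderedDict((name, getattr(obj, name))
                       for name in obj._pyangbind_elements)

c = MiniContainer(connectionid='c1', port_in_id='p1')
print(list(to_plain_dict(c).items()))
# [('connectionid', 'c1'), ('port_in_id', 'p1')]
```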
| 53.513859 | 563 | 0.62236 | 2,761 | 25,098 | 5.373778 | 0.067729 | 0.047179 | 0.055672 | 0.038822 | 0.88468 | 0.862439 | 0.855429 | 0.850037 | 0.847341 | 0.83157 | 0 | 0.001277 | 0.282333 | 25,098 | 468 | 564 | 53.628205 | 0.822452 | 0.133676 | 0 | 0.704615 | 0 | 0.018462 | 0.230366 | 0.086606 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073846 | false | 0 | 0.046154 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e8264455cd2a692633fdf962d2febbd078f29b36 | 51,192 | py | Python | qa/rpc-tests/parallel.py | MONIMAKER365/BitcoinUnlimited | 8aea282c44ee23ca65cdd895c99b3f6347f46dfc | [
"MIT"
] | 535 | 2015-09-04T15:10:08.000Z | 2022-03-17T20:51:05.000Z | qa/rpc-tests/parallel.py | MONIMAKER365/BitcoinUnlimited | 8aea282c44ee23ca65cdd895c99b3f6347f46dfc | [
"MIT"
] | 1,269 | 2016-01-31T20:21:24.000Z | 2022-03-16T01:20:08.000Z | qa/rpc-tests/parallel.py | MONIMAKER365/BitcoinUnlimited | 8aea282c44ee23ca65cdd895c99b3f6347f46dfc | [
"MIT"
] | 295 | 2015-10-19T16:12:29.000Z | 2021-08-02T20:05:17.000Z | #!/usr/bin/env python3
# Copyright (c) 2014-2015 The Bitcoin Core developers
# Copyright (c) 2015-2016 The Bitcoin Unlimited developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
import pdb
import test_framework.loginit
from test_framework.test_framework import BitcoinTestFramework
from test_framework.util import *
class ParallelTest (BitcoinTestFramework):
def __init__(self):
self.rep = False
BitcoinTestFramework.__init__(self)
def setup_chain(self):
print ("Initializing test directory "+self.options.tmpdir)
initialize_chain_clean(self.options.tmpdir, 6)
def setup_network(self, split=False):
self.nodes = []
self.nodes.append(start_node(0, self.options.tmpdir, ["-parallel=0", "-rpcservertimeout=0", "-use-thinblocks=0", "-excessiveblocksize=6000000", "-blockprioritysize=6000000", "-blockmaxsize=6000000"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-parallel=0", "-rpcservertimeout=0", "-use-thinblocks=0", "-excessiveblocksize=6000000", "-blockprioritysize=6000000", "-blockmaxsize=6000000"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-parallel=0", "-rpcservertimeout=0", "-use-thinblocks=0", "-excessiveblocksize=6000000", "-blockprioritysize=6000000", "-blockmaxsize=6000000"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-parallel=0", "-rpcservertimeout=0", "-use-thinblocks=0", "-excessiveblocksize=6000000", "-blockprioritysize=6000000", "-blockmaxsize=6000000"]))
interconnect_nodes(self.nodes)
self.is_network_split=False
self.sync_all()
def cleanup_and_reset(self):
        # Cleanup - start and connect the other nodes so that we have synced chains before
        # proceeding to the other tests.
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
interconnect_nodes(self.nodes)
sync_blocks(self.nodes)
print ("Mine more blocks on each node...")
self.nodes[0].generate(25)
sync_blocks(self.nodes)
self.nodes[1].generate(25)
sync_blocks(self.nodes)
self.nodes[2].generate(25)
sync_blocks(self.nodes)
self.nodes[3].generate(25)
sync_blocks(self.nodes)
self.nodes[4].generate(25)
sync_blocks(self.nodes)
self.nodes[5].generate(25)
sync_blocks(self.nodes)
stop_nodes(self.nodes)
wait_bitcoinds()
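    # The generate-then-sync round robin above can be factored into a small helper. The sketch
    # below is illustrative only: FakeNode and the sync callback are hypothetical stand-ins for
    # the framework's RPC proxies and sync_blocks().

```python
class FakeNode:
    """Minimal stand-in for an RPC proxy: all nodes share one chain height."""
    def __init__(self, chain):
        self.chain = chain

    def generate(self, n):
        # Mining n blocks advances the shared chain tip.
        self.chain["height"] += n

def generate_round_robin(nodes, blocks_each, sync=lambda nodes: None):
    """Have each node mine `blocks_each` blocks in turn, syncing after each
    round so every node builds on the same tip."""
    for n in nodes:
        n.generate(blocks_each)
        sync(nodes)  # placeholder for sync_blocks(nodes) in the real framework
```

    # With six nodes and 25 blocks each this mirrors cleanup_and_reset: the shared
    # chain height advances by 150.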
def repetitiveTest(self):
# get some coins
self.nodeLookup = {}
i = 0
for n in self.nodes:
print("Node %d is %s" % (i,n.url))
print ("generating coins for node")
n.generate(200)
self.sync_all()
i += 1
        for i in range(200):
            # Create many UTXOs
            print ("round %d: Generating txns..." % i)
for n in self.nodes:
send_to = {}
                n.keypoolrefill(100)
                for _ in range(200):
                    send_to[n.getnewaddress()] = Decimal("0.01")
n.sendmany("", send_to)
self.sync_all()
print (" generating blocks...")
            j = 0
            for n in self.nodes:
                try:
                    n.generate(1)
                except JSONRPCException as e:
                    print (e)
                    print ("Node ", j, " ", n.url)
                    pdb.set_trace()
                j += 1
print (" syncing...")
self.sync_all()
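    # The UTXO fan-out inside repetitiveTest builds one sendmany payload of many small outputs.
    # A self-contained sketch of that payload construction (`new_address` is a hypothetical
    # stand-in for the node's getnewaddress RPC):

```python
from decimal import Decimal

def build_fanout(new_address, count=200, amount=Decimal("0.01")):
    """Build the {address: amount} mapping passed to sendmany, creating
    `count` small UTXOs in a single transaction."""
    return {new_address(): amount for _ in range(count)}
```

    # One sendmany call therefore commits 200 * 0.01 = 2 coins plus fees, matching the loop above.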
def run_test (self):
if self.rep:
self.repetitiveTest()
return
print ("Mining blocks with PV off...")
# Mine some blocks on node2 which we will need at the end to generate a few transactions from that node
# in order to create the small block with just a few transactions in it.
self.nodes[2].generate(2)
        sync_blocks(self.nodes)
        # Mine the rest on node0 where we will generate the bigger block.
        self.nodes[0].generate(100)
        sync_blocks(self.nodes)
        self.nodes[0].generate(1)
        sync_blocks(self.nodes)
        self.nodes[2].generate(100)
        sync_blocks(self.nodes)
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
#restart nodes with -pvtest off and do not yet connect the nodes
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
# Send tx's which do not propagate
addr2 = self.nodes[2].getnewaddress()
for i in range(50):
self.nodes[0].sendtoaddress(addr2, "0.01")
# Send a few transactions from node2 that will get mined so that we will have at least
# a few inputs to check when the two competing blocks enter parallel validation.
addr0 = self.nodes[0].getnewaddress()
for i in range(5):
self.nodes[2].sendtoaddress(addr0, "0.01")
        # Have node0 and node2 each mine a block at the same height; these blocks will compete to
        # advance the chaintip when the nodes are connected back together.
print ("Mine two competing blocks...")
self.nodes[0].generate(1)
self.nodes[2].generate(1)
#stop nodes and restart right away
stop_nodes(self.nodes)
wait_bitcoinds()
# Restart nodes with pvtest=1
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1"]))
print ("Connect nodes...")
interconnect_nodes(self.nodes)
sync_blocks(self.nodes[0:3])
        # Give node0 some time to make sure a re-org does not happen. If the mempool on node0
        # does not change within 5 seconds then we assume no reorg is occurring, because a reorg
        # would place transactions from the old block back into node0's mempool.
old_mempoolbytes = self.nodes[0].getmempoolinfo()["bytes"]
for i in range(5):
mempoolbytes = self.nodes[0].getmempoolinfo()["bytes"]
            if old_mempoolbytes != mempoolbytes:
                raise AssertionError("Reorg happened when it should not - mempool bytes have changed")
old_mempoolbytes = mempoolbytes
            # node0's block is bigger and began processing first; however, the block from node2
            # arrived shortly after and should have beaten node0's block. Therefore the chaintip
            # blockhash on node2 should now match the chaintip blockhash on node1, while node0
            # and node1 should not match.
print ("check for re-org " + str(i+1))
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[2].getbestblockhash())
assert_not_equal(self.nodes[0].getbestblockhash(), self.nodes[1].getbestblockhash())
time.sleep(1)
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
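        # The mempool-stability poll above can be expressed as a helper: sample the mempool byte
        # count once per second and fail if it changes, since a re-org would replay the old
        # block's transactions into node0's mempool. `get_bytes` is a hypothetical callable
        # standing in for lambda: node.getmempoolinfo()["bytes"].

```python
import time

def wait_mempool_stable(get_bytes, seconds=5, interval=1.0):
    """Raise AssertionError if the mempool byte count changes within `seconds`."""
    baseline = get_bytes()
    for _ in range(seconds):
        current = get_bytes()
        if current != baseline:
            # A change indicates old-block transactions re-entered the mempool.
            raise AssertionError("unexpected reorg: mempool bytes changed")
        baseline = current
        time.sleep(interval)
    return baseline
```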
# Restart nodes with pvtest off.
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
print ("Connect nodes...")
interconnect_nodes(self.nodes)
        # Mine a block on node3 and then connect it to the others. This tests the case where a
        # third block arrives after the tip has been advanced.
        # This block should propagate to the other nodes but not cause a re-org.
print ("Mine another block...")
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes[3].generate(1)
connect_nodes(self.nodes[1],3)
sync_blocks(self.nodes)
        # Give node0 some time to make sure a re-org does not happen. If the mempool on node0
        # does not change within 5 seconds then we assume no reorg is occurring, because a reorg
        # would place transactions from the old block back into node0's mempool.
for i in range(5):
mempoolbytes = self.nodes[0].getmempoolinfo()["bytes"]
            if old_mempoolbytes != mempoolbytes:
                raise AssertionError("Reorg happened when it should not - mempool bytes have changed")
old_mempoolbytes = mempoolbytes
print ("check for re-org " + str(i+1))
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[2].getbestblockhash())
assert_not_equal(self.nodes[0].getbestblockhash(), self.nodes[1].getbestblockhash())
assert_not_equal(self.nodes[1].getbestblockhash(), self.nodes[3].getbestblockhash())
time.sleep(1)
        # Send some transactions and mine a block on node2.
        # This should cause node0 and node3 to re-org, and all chains should then match.
for i in range(5):
            self.nodes[2].sendtoaddress(addr2, 0.01)
print ("Mine another block on node2 which causes a reorg on node0 and node3...")
self.nodes[2].generate(1)
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[2].getbestblockhash())
assert_equal(self.nodes[0].getbestblockhash(), self.nodes[1].getbestblockhash())
counts = [ x.getblockcount() for x in self.nodes ]
assert_equal(counts, [205,205,205,205])
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
connect_nodes(self.nodes[1],5)
sync_blocks(self.nodes)
        # Mine blocks on each node, then mine 100 more so the coinbases mature and become spendable.
print ("Mine more blocks on each node...")
self.nodes[1].generate(5)
sync_blocks(self.nodes)
self.nodes[2].generate(5)
sync_blocks(self.nodes)
self.nodes[3].generate(5)
sync_blocks(self.nodes)
self.nodes[4].generate(5)
sync_blocks(self.nodes)
self.nodes[5].generate(5)
sync_blocks(self.nodes)
self.nodes[1].generate(100)
sync_blocks(self.nodes)
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
print ("Send more transactions...")
num_range = 50
addrs = [ x.getnewaddress() for x in self.nodes]
for i in range(num_range):
self.nodes[0].sendtoaddress(addrs[0], 0.01)
num_range = 10
for i in range(num_range):
self.nodes[2].sendtoaddress(addrs[2], 0.01)
for i in range(num_range):
self.nodes[3].sendtoaddress(addrs[3], 0.01)
for i in range(num_range):
self.nodes[4].sendtoaddress(addrs[4], 0.01)
for i in range(num_range):
self.nodes[5].sendtoaddress(addrs[5], 0.01)
# Mine 5 competing blocks.
print ("Mine 5 competing blocks...")
self.nodes[0].generate(1)
self.nodes[2].generate(1)
self.nodes[3].generate(1)
self.nodes[4].generate(1)
self.nodes[5].generate(1)
counts = [ x.getblockcount() for x in self.nodes ]
assert_equal(counts, [331,330,331,331,331,331])
        # Connect nodes so that all blocks are sent at the same time to node1. The largest
        # block, from node0, will have its validation terminated.
        print ("Connect nodes...")
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
connect_nodes(self.nodes[1],5)
sync_blocks(self.nodes)
# Mine a block which will cause a reorg back to node0
print ("Mine another block...")
self.nodes[0].generate(1)
sync_blocks(self.nodes)
# Mine 5 more competing blocks of the same size. The last block to arrive will have its validation terminated.
print ("Mine 5 more competing blocks...")
self.nodes[0].generate(1)
self.nodes[2].generate(1)
self.nodes[3].generate(1)
self.nodes[4].generate(1)
self.nodes[5].generate(1)
sync_blocks(self.nodes)
# Mine another block which will cause the nodes to sync to one chain
print ("Mine another block...")
self.nodes[0].generate(1)
sync_blocks(self.nodes)
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# Cleanup by mining more blocks if we need to run extended tests
        if self.longTest:
self.cleanup_and_reset()
################################################
# Begin extended tests
################################################
        if not self.longTest:
return
###########################################################################################
# Test reorgs
###########################################################################################
###########################################################################################
# Basic reorg - see section below on 4 block attack scenarios. At the end there is a
# repeated test that does basic reorgs multiple times.
###########################################################################################
# 1) Start a slow to validate block race then mine another block pulling one chain ahead.
# - threads on the chain that is now not the most proof of work should be stopped and the
# most proof of work block should proceed.
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
print ("Send more transactions...")
num_range = 15
for i in range(num_range):
self.nodes[0].sendtoaddress(addrs[0], 0.01)
for i in range(num_range):
self.nodes[1].sendtoaddress(addrs[1], 0.01)
for i in range(num_range):
self.nodes[2].sendtoaddress(addrs[2], 0.01)
# Mine a block on each node
print ("Mine a block on each node..")
self.nodes[0].generate(1)
self.nodes[1].generate(1)
self.nodes[2].generate(1)
basecount = self.nodes[0].getblockcount()
        # Mine another block on node2 so that its chain will be the longest when we connect it
print ("Mine another block on node2..")
self.nodes[2].generate(1)
bestblock = self.nodes[2].getbestblockhash()
stop_nodes(self.nodes)
wait_bitcoinds()
# Restart nodes with pvtest=1
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1"]))
# Connect node 0 and 1 so that a block validation race begins
        print ("Connect nodes 0 and 1...")
connect_nodes(self.nodes[1],0)
# Wait for a little while before connecting node 2
time.sleep(3)
print ("Connect node2...")
counts = [ x.getblockcount() for x in self.nodes ]
print (str(counts))
assert_equal(counts, [basecount,basecount,basecount+1])
interconnect_nodes(self.nodes)
# All chains will sync to node2
sync_blocks(self.nodes)
assert_equal(self.nodes[0].getbestblockhash(), bestblock)
assert_equal(self.nodes[1].getbestblockhash(), bestblock)
assert_equal(self.nodes[2].getbestblockhash(), bestblock)
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
###########################################################################################
        # Mine two forks of equal work and start a slow-to-validate block race on fork1. Then
        # another block arrives on fork2.
        # - the slow-to-validate blocks will still continue
        # Mine another block on fork2, pulling that fork ahead.
        # - threads on fork1 should be stopped, allowing fork2 to connect blocks and pull ahead
print ("Mine two forks.")
        # fork 1 (both nodes on fork1 should be synced)
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
interconnect_nodes(self.nodes)
self.nodes[0].generate(1)
sync_blocks(self.nodes)
# fork 2
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes[2].generate(1)
stop_nodes(self.nodes)
wait_bitcoinds()
# restart nodes but don't connect them yet
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
# Create txns on node0 and 1 to setup for a slow to validate race between those nodes.
print ("Send more transactions...")
num_range = 15
for i in range(num_range):
self.nodes[0].sendtoaddress(addrs[0], 0.01)
for i in range(num_range):
self.nodes[1].sendtoaddress(addrs[1], 0.01)
# Mine a block on each node
print ("Mine a block on each node..")
self.nodes[0].generate(1)
self.nodes[1].generate(1)
self.nodes[2].generate(1)
basecount = self.nodes[0].getblockcount()
        # Mine another block on node2 so that its chain will be the longest when we connect it
print ("Mine another block on node2..")
self.nodes[2].generate(1)
bestblock = self.nodes[2].getbestblockhash()
stop_nodes(self.nodes)
wait_bitcoinds()
# Restart nodes with pvtest=1
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
# Connect node 0 and 1 so that a block validation race begins
        print ("Connect nodes 0 and 1...")
connect_nodes(self.nodes[1],0)
# Wait for a little while before connecting node 2
time.sleep(3)
print ("Connect node2...")
counts = [ x.getblockcount() for x in self.nodes ]
print (str(counts))
assert_equal(counts, [basecount,basecount,basecount+1])
interconnect_nodes(self.nodes)
# All chains will sync to node2
sync_blocks(self.nodes)
assert_equal(self.nodes[0].getbestblockhash(), bestblock)
assert_equal(self.nodes[1].getbestblockhash(), bestblock)
assert_equal(self.nodes[2].getbestblockhash(), bestblock)
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
##############################################################################################
        # Mine two forks of equal work and start a slow-to-validate 4 block race on fork1. Then
        # another block arrives on fork2.
        # - the slow-to-validate blocks will still continue
        # Mine another block on fork2, pulling that fork ahead.
        # - threads on fork1 should be stopped, allowing fork2 to connect blocks and pull ahead
print ("Mine two forks.")
        # fork 1 (both nodes on fork1 should be synced)
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
interconnect_nodes(self.nodes)
self.nodes[0].generate(1)
sync_blocks(self.nodes)
# fork 2
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes[4].generate(1)
stop_nodes(self.nodes)
wait_bitcoinds()
# restart nodes but don't connect them yet
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
# Create txns on node0 and 1 to setup for a slow to validate race between those nodes.
print ("Send more transactions...")
num_range = 15
for i in range(num_range):
self.nodes[0].sendtoaddress(addrs[0], 0.01)
for i in range(num_range):
self.nodes[1].sendtoaddress(addrs[1], 0.01)
        for i in range(num_range):
            self.nodes[3].sendtoaddress(addrs[3], 0.01)
        for i in range(num_range):
            self.nodes[4].sendtoaddress(addrs[4], 0.01)
# Mine a block on each node
print ("Mine a block on each node..")
self.nodes[0].generate(1)
self.nodes[1].generate(1)
self.nodes[2].generate(1)
self.nodes[3].generate(1)
self.nodes[4].generate(1)
basecount = self.nodes[0].getblockcount()
        # Mine another block on node4 so that its chain will be the longest when we connect it
print ("Mine another block on node4..")
self.nodes[4].generate(1)
bestblock = self.nodes[4].getbestblockhash()
stop_nodes(self.nodes)
wait_bitcoinds()
# Restart nodes with pvtest=1
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
        # Connect nodes 0, 1, 2 and 3 so that a block validation race begins on fork1
        print ("Connect nodes 0, 1, 2 and 3...")
        connect_nodes(self.nodes[1],0)
        connect_nodes(self.nodes[1],2)
        connect_nodes(self.nodes[1],3)
# Wait for a little while before connecting node 4
time.sleep(3)
print ("Connect node4...")
counts = [ x.getblockcount() for x in self.nodes ]
print (str(counts))
assert_equal(counts, [basecount,basecount,basecount, basecount, basecount+1])
interconnect_nodes(self.nodes)
        # All chains will sync to node4
sync_blocks(self.nodes)
assert_equal(self.nodes[0].getbestblockhash(), bestblock)
assert_equal(self.nodes[1].getbestblockhash(), bestblock)
assert_equal(self.nodes[2].getbestblockhash(), bestblock)
assert_equal(self.nodes[3].getbestblockhash(), bestblock)
assert_equal(self.nodes[4].getbestblockhash(), bestblock)
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
###########################################################################################
        # 1) Mine two forks of equal work and start a slow-to-validate block race on fork1. Then
        #    another block arrives on fork2, pulling that fork ahead.
        #    - threads on fork1 should be stopped, allowing fork2 to connect blocks and pull ahead
        # 2) As fork2 is being validated, fork1 pulls ahead.
        #    - fork2 is now stopped and fork1 begins to validate
        # 3) Do step 2 repeatedly, going back and forth between forks.
print ("Mine three forks.")
        # fork 1 (both nodes on fork1 should be synced)
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
interconnect_nodes(self.nodes)
self.nodes[0].generate(1)
sync_blocks(self.nodes)
# fork 2
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes[2].generate(1)
# fork 3
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes[3].generate(1)
stop_nodes(self.nodes)
wait_bitcoinds()
# restart nodes but don't connect them yet
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0", "-whitelist=127.0.0.1"]))
# Create txns on node0 and 1 to setup for a slow to validate race between those nodes.
print ("Send more transactions...")
num_range = 15
for i in range(num_range):
self.nodes[0].sendtoaddress(self.nodes[0].getnewaddress(), 0.01)
for i in range(num_range):
self.nodes[1].sendtoaddress(self.nodes[1].getnewaddress(), 0.01)
# in this test we also generate txns on node 2 so that all nodes will validate slowly.
for i in range(num_range):
self.nodes[2].sendtoaddress(self.nodes[1].getnewaddress(), 0.01)
# Mine a block on each node
print ("Mine a block on each node..")
self.nodes[0].generate(1)
self.nodes[1].generate(1)
self.nodes[2].generate(1)
self.nodes[3].generate(1)
basecount = self.nodes[0].getblockcount()
        # Mine another block on node2 so that its chain will be the longest when we connect it
print ("Mine another block on node2..")
self.nodes[2].generate(1)
        # Mine two blocks on node3 so that its chain will be the longest when we connect it
print ("Mine 2 blocks on node3..")
self.nodes[3].generate(1)
self.nodes[3].generate(1)
bestblock = self.nodes[3].getbestblockhash()
stop_nodes(self.nodes)
wait_bitcoinds()
# Restart nodes with pvtest=1
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1", "-whitelist=127.0.0.1"]))
# Connect node 0 and 1 so that a block validation race begins
print ("Connect nodes 0 and 1...")
connect_nodes(self.nodes[1],0)
# Wait for a little while before connecting node 2 (fork2)
time.sleep(3)
print ("Connect node2 - fork2...")
counts = [ x.getblockcount() for x in self.nodes ]
print (str(counts))
assert_equal(counts, [basecount,basecount,basecount+1])
interconnect_nodes(self.nodes)
# Wait for a little while before connecting node 3 (fork3)
time.sleep(3)
print ("Connect node3 - fork3...")
self.nodes.append(start_node(3, self.options.tmpdir, ["-debug=","-pvtest=1", "-whitelist=127.0.0.1"]))
        counts = [ x.getblockcount() for x in self.nodes ]
        print (str(counts))
        assert_equal(counts, [basecount-1, basecount-1, basecount+1, basecount+2])
        interconnect_nodes(self.nodes)
sync_blocks(self.nodes)
assert_equal(self.nodes[0].getbestblockhash(), bestblock)
assert_equal(self.nodes[1].getbestblockhash(), bestblock)
assert_equal(self.nodes[2].getbestblockhash(), bestblock)
assert_equal(self.nodes[3].getbestblockhash(), bestblock)
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
###########################################################################################
# 1) Large reorg - can we do a 144 block reorg?
        print ("Starting large reorg test")
self.nodes.append(start_node(0, self.options.tmpdir, ["-debug=","-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-debug=","-pvtest=0"]))
print ("Mine 144 blocks on each chain...")
self.nodes[0].generate(144)
self.nodes[1].generate(144)
        print ("Connect nodes for large reorg...")
connect_nodes(self.nodes[1],0)
sync_blocks(self.nodes)
        print ("Mine another block on node1 causing large reorg...")
self.nodes[1].generate(1)
sync_blocks(self.nodes)
# Mine another block which will cause some nodes to reorg and sync to the same chain.
print ("Mine another block on node0...")
self.nodes[0].generate(1)
sync_blocks(self.nodes)
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
###########################################################################################
# Test the 4 block attack scenarios - use -pvtest=true to slow down the checking of inputs.
###########################################################################################
####################################################################
# Mine 4 blocks of all different sizes
# - the smallest block should win
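        # Why should the smallest block win? With -pvtest slowing input checks, validation time
        # grows with block size, so among blocks that start validating together the smallest
        # finishes first and advances the tip. A toy model of that race (not the actual bitcoind
        # scheduler; names and cost units are hypothetical):

```python
def pv_winner(blocks, per_tx_cost=1.0):
    """Toy parallel-validation model: each block finishes at its arrival time
    plus a validation cost proportional to its size; the earliest finisher
    wins the chain tip."""
    return min(blocks, key=lambda b: b["arrival"] + b["size"] * per_tx_cost)

# Sizes mirror the transaction counts mined above (15, 14, 13 and 2 txns).
competing = [
    {"name": "node0", "arrival": 0.0, "size": 15},
    {"name": "node2", "arrival": 0.0, "size": 14},
    {"name": "node3", "arrival": 0.0, "size": 13},
    {"name": "node4", "arrival": 0.0, "size": 2},
]
```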
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
print ("Send more transactions...")
num_range = 15
for i in range(num_range):
self.nodes[0].sendtoaddress(self.nodes[0].getnewaddress(), 0.01)
num_range = 14
for i in range(num_range):
self.nodes[2].sendtoaddress(self.nodes[2].getnewaddress(), 0.01)
num_range = 13
for i in range(num_range):
self.nodes[3].sendtoaddress(self.nodes[3].getnewaddress(), 0.01)
num_range = 2
for i in range(num_range):
self.nodes[4].sendtoaddress(self.nodes[4].getnewaddress(), 0.01)
# Mine 4 competing blocks.
print ("Mine 4 competing blocks...")
self.nodes[0].generate(1)
self.nodes[2].generate(1)
self.nodes[3].generate(1)
self.nodes[4].generate(1)
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# start nodes with -pvtest set to true.
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=1"]))
# Connect nodes so that all blocks are sent at same time to node1.
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[4].getbestblockhash())
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
connect_nodes(self.nodes[1],5)
sync_blocks(self.nodes)
# Mine a block which will cause all nodes to update their chains
print ("Mine another block...")
self.nodes[1].generate(1)
time.sleep(2) #wait for blocks to propagate
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[0].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[2].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[3].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[4].getbestblockhash())
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
########################################################################################################
        # Mine 4 blocks all the same size and get them to start validating, then send a 5th that
        # is smaller.
        # - the smallest block, although the last to arrive, should win
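        # Under the same size-proportional validation model, a much smaller block that arrives
        # several seconds late can still finish before four equal large blocks that started
        # first (hypothetical cost units, not measured bitcoind timings):

```python
def finish_time(arrival, size, per_tx_cost=1.0):
    # A block finishes validating at its arrival time plus a cost
    # proportional to its size.
    return arrival + size * per_tx_cost

# Four equal blocks (15 txns) arrive at t=0; the small block (2 txns) at t=5.
big_finishes = [finish_time(0.0, 15) for _ in range(4)]
small_finish = finish_time(5.0, 2)
```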
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
print ("Send more transactions...")
num_range = 15
for i in range(num_range):
self.nodes[0].sendtoaddress(self.nodes[0].getnewaddress(), 0.01)
num_range = 15
for i in range(num_range):
self.nodes[2].sendtoaddress(self.nodes[2].getnewaddress(), 0.01)
num_range = 15
for i in range(num_range):
self.nodes[3].sendtoaddress(self.nodes[3].getnewaddress(), 0.01)
num_range = 15
for i in range(num_range):
self.nodes[4].sendtoaddress(self.nodes[4].getnewaddress(), 0.01)
num_range = 2
for i in range(num_range):
self.nodes[5].sendtoaddress(self.nodes[5].getnewaddress(), 0.01)
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# start nodes with -pvtest set to true.
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=1"]))
# Connect nodes so that first 4 blocks are sent at same time to node1.
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
time.sleep(5) #wait for blocks to start processing
# Connect 5th block and this one should win the race
connect_nodes(self.nodes[1],5)
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[5].getbestblockhash())
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
connect_nodes(self.nodes[1],5)
# Mine a block which will cause all nodes to update their chains
print ("Mine another block...")
self.nodes[1].generate(1)
time.sleep(2) #wait for blocks to propagate
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[0].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[2].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[3].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[4].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[5].getbestblockhash())
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
############################################################################################################
# Mine 4 blocks, all the same size, get them started validating, and then send a 5th that is the same size
# - the first block to arrive should win
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
print ("Send more transactions...")
num_range = 10
for i in range(num_range):
self.nodes[0].sendtoaddress(self.nodes[0].getnewaddress(), 0.01)
num_range = 10
for i in range(num_range):
self.nodes[2].sendtoaddress(self.nodes[2].getnewaddress(), 0.01)
num_range = 10
for i in range(num_range):
self.nodes[3].sendtoaddress(self.nodes[3].getnewaddress(), 0.01)
num_range = 10
for i in range(num_range):
self.nodes[4].sendtoaddress(self.nodes[4].getnewaddress(), 0.01)
num_range = 10
for i in range(num_range):
self.nodes[5].sendtoaddress(self.nodes[5].getnewaddress(), 0.01)
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# start nodes with -pvtest set to true.
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=1"]))
# Connect nodes so that the first 4 blocks are sent to node1 one second apart.
connect_nodes(self.nodes[1],0)
time.sleep(1)
connect_nodes(self.nodes[1],2)
time.sleep(1)
connect_nodes(self.nodes[1],3)
time.sleep(1)
connect_nodes(self.nodes[1],4)
time.sleep(1) #wait for blocks to start processing
# Connect the 5th node; its block's validation will be terminated, and the first block to arrive, from node0, should win the race
connect_nodes(self.nodes[1],5)
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[0].getbestblockhash())
#stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
connect_nodes(self.nodes[1],5)
# Mine a block which will cause all nodes to update their chains
print ("Mine another block...")
self.nodes[1].generate(1)
time.sleep(2) #wait for blocks to propagate
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[0].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[2].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[3].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[4].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[5].getbestblockhash())
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
#########################################################################################################
# Mine 4 blocks, all the same size, get them started validating, and then send a 5th that is bigger
# - the first block to arrive should win
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
print ("Send more transactions...")
num_range = 10
for i in range(num_range):
self.nodes[0].sendtoaddress(self.nodes[0].getnewaddress(), 0.01)
num_range = 10
for i in range(num_range):
self.nodes[2].sendtoaddress(self.nodes[2].getnewaddress(), 0.01)
num_range = 10
for i in range(num_range):
self.nodes[3].sendtoaddress(self.nodes[3].getnewaddress(), 0.01)
num_range = 10
for i in range(num_range):
self.nodes[4].sendtoaddress(self.nodes[4].getnewaddress(), 0.01)
num_range = 20
for i in range(num_range):
self.nodes[5].sendtoaddress(self.nodes[5].getnewaddress(), 0.01)
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# start nodes with -pvtest set to true.
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=1"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=1"]))
# Connect nodes so that the first 4 blocks are sent to node1 one second apart.
connect_nodes(self.nodes[1],0)
time.sleep(1)
connect_nodes(self.nodes[1],2)
time.sleep(1)
connect_nodes(self.nodes[1],3)
time.sleep(1)
connect_nodes(self.nodes[1],4)
time.sleep(1) #wait for blocks to start processing
# Connect the 5th node; its block's validation will be terminated, and the first block to arrive, from node0, should win the race
connect_nodes(self.nodes[1],5)
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[0].getbestblockhash())
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
connect_nodes(self.nodes[1],5)
# Mine a block which will cause all nodes to update their chains
print ("Mine another block...")
self.nodes[1].generate(1)
time.sleep(2) #wait for blocks to propagate
sync_blocks(self.nodes)
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[0].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[2].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[3].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[4].getbestblockhash())
assert_equal(self.nodes[1].getbestblockhash(), self.nodes[5].getbestblockhash())
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
#################################################################################
# Repeated 5 blocks mined with a reorg after
#################################################################################
# Repeatedly mine 5 blocks at a time on each node to have many blocks both arriving
# at the same time and racing each other to see which can extend the chain the fastest.
# This is intended as a stress test of the 4-block scenario, run while blocks
# are being mined and reorgs are sometimes happening at the same time.
print ("Starting repeating many competing blocks test")
self.nodes.append(start_node(0, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(1, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(2, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(3, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(4, self.options.tmpdir, ["-pvtest=0"]))
self.nodes.append(start_node(5, self.options.tmpdir, ["-pvtest=0"]))
connect_nodes(self.nodes[1],0)
connect_nodes(self.nodes[1],2)
connect_nodes(self.nodes[1],3)
connect_nodes(self.nodes[1],4)
connect_nodes(self.nodes[1],5)
sync_blocks(self.nodes)
for i in range(100):
print ("Mine many more competing blocks...")
self.nodes[0].generate(1)
self.nodes[2].generate(1)
self.nodes[3].generate(1)
self.nodes[4].generate(1)
self.nodes[5].generate(1)
sync_blocks(self.nodes)
# Mine another block which will cause some nodes to reorg and sync to the same chain.
print ("%d: Mine another block..." % i)
self.nodes[0].generate(1)
sync_blocks(self.nodes)
# stop nodes
stop_nodes(self.nodes)
wait_bitcoinds()
# cleanup and sync chains for next tests
self.cleanup_and_reset()
def Test():
t = ParallelTest()
t.drop_to_pdb = True
# t.rep = True
t.longTest = False
bitcoinConf = {
"debug": ["net", "blk", "thin", "mempool", "req", "bench", "evict"],
}
flags = standardFlags()
t.main(flags, bitcoinConf, None)
if __name__ == '__main__':
p = ParallelTest()
if "--rep" in sys.argv:
print("Repetitive test")
p.rep = True
sys.argv.remove("--rep")
else:
p.rep = False
if "--extensive" in sys.argv:
p.longTest = True
# we must remove duplicate 'extensive' arg here
while "--extensive" in sys.argv:
sys.argv.remove("--extensive")
print ("Running extensive tests")
else:
p.longTest = False
p.main ()
# tests/base/test_endpoints_authorizations.py (rapydo/http-api, MIT license)
import re
from typing import Dict, List, Optional
import pytest
from restapi.config import ABS_RESTAPI_PATH
from restapi.connectors import Connector
from restapi.env import Env
from restapi.rest.loader import EndpointsLoader
from restapi.services.authentication import Role
from restapi.tests import SERVER_URI, BaseTests, FlaskClient
from restapi.utilities.logs import log
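The `get_path` helper below canonicalizes endpoint paths so that both `{var}`-style and `<var>`-style placeholders compare equal. A self-contained sketch of that normalization (the `normalize` name and `VARIABLE` token mirror the helper's convention):

```python
import re


def normalize(method, path):
    # Replace any {placeholder} or <placeholder> segment with the
    # canonical token VARIABLE, and uppercase the HTTP method.
    path = re.sub(r"\{[a-zA-Z0-9_]+\}", "VARIABLE", path)
    path = re.sub(r"\<[a-zA-Z0-9_]+\>", "VARIABLE", path)
    return f"{method.upper()} {path}"


print(normalize("get", "/api/admin/users/{user_id}"))
```

This is why a path registered as `/api/admin/users/{user_id}` and one requested as `/api/admin/users/<user_id>` resolve to the same list entry.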
class TestApp1(BaseTests):
@staticmethod
def get_path(method: str, path: str) -> str:
method = method.upper()
path = re.sub(r"\{[a-zA-Z0-9_]+\}", "VARIABLE", path)
path = re.sub(r"\<[a-zA-Z0-9_]+\>", "VARIABLE", path)
return f"{method} {path}"
# This utility returns a list of _core_ paths with the form:
# METHOD /path, e.g.
# GET /api/admin/users
# POST /api/admin/users
def get_paths(self, client: FlaskClient) -> List[str]:
loader = EndpointsLoader()
loader.load_endpoints_folder(ABS_RESTAPI_PATH)
paths: List[str] = []
for endpoint_class in loader.endpoints:
for method, path in endpoint_class.methods.items():
if path.startswith("/api/tests/"):
continue
paths.append(self.get_path(method, path))
return paths
# Test a single endpoint, remove its path from the list, and return the new list.
# Once all paths have been tested, the list should be empty.
def check_endpoint(
self,
client: FlaskClient,
method: str,
endpoint: str,
headers: Optional[Dict[str, str]],
expected_authorized: bool,
paths: List[str],
) -> List[str]:
assert method in (
"GET",
"POST",
"PUT",
"PATCH",
"DELETE",
)
path = self.get_path(method, endpoint)
assert path in paths
# Use SERVER_URI because the api/auth prefix is already included in the endpoint
full_endpoint = f"{SERVER_URI}/{endpoint}"
if method == "GET":
r = client.get(full_endpoint, headers=headers)
elif method == "POST":
r = client.post(full_endpoint, headers=headers)
elif method == "PUT":
r = client.put(full_endpoint, headers=headers)
elif method == "PATCH":
r = client.patch(full_endpoint, headers=headers)
elif method == "DELETE":
r = client.delete(full_endpoint, headers=headers)
else: # pragma: no cover
pytest.fail("Unknown method")
if expected_authorized:
assert r.status_code != 401
else:
assert r.status_code != 400
paths.remove(path)
return paths
def test_admin(self, client: FlaskClient) -> None:
if not Env.get_bool("AUTH_ENABLE"):
log.warning("Skipping admin authorizations tests")
return
# List of all paths to be tested. After each test a path will be removed.
# At the end the list is expected to be empty
paths = self.get_paths(client)
uuid, data = self.create_user(client, roles=[Role.ADMIN])
headers, _ = self.do_login(client, data.get("email"), data.get("password"))
# These are public
paths = self.check_endpoint(client, "GET", "/api/status", headers, True, paths)
paths = self.check_endpoint(client, "GET", "/api/specs", headers, True, paths)
paths = self.check_endpoint(client, "POST", "/auth/login", headers, True, paths)
if Env.get_int("AUTH_MAX_LOGIN_ATTEMPTS") > 0:
paths = self.check_endpoint(
client, "POST", "/auth/login/unlock/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_REGISTRATION"):
paths = self.check_endpoint(
client, "POST", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/auth/profile/activate", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile/activate/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_PASSWORD_RESET") and Connector.check_availability(
"smtp"
):
paths = self.check_endpoint(
client, "POST", "/auth/reset", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/reset/<token>", headers, True, paths
)
# These are allowed to each user
paths = self.check_endpoint(client, "GET", "/auth/status", headers, True, paths)
paths = self.check_endpoint(
client, "GET", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PATCH", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(client, "GET", "/auth/tokens", headers, True, paths)
paths = self.check_endpoint(
client, "DELETE", "/auth/tokens/<token>", headers, True, paths
)
if Connector.check_availability("pushpin"):
paths = self.check_endpoint(
client, "PUT", "/api/socket/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/socket/<channel>", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/stream/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/stream/<channel>", headers, True, paths
)
# These are allowed to coordinators
paths = self.check_endpoint(
client, "GET", "/api/group/users", headers, False, paths
)
# These are allowed to staff
# ... none
# These are allowed to admins
paths = self.check_endpoint(
client, "GET", "/api/admin/users", headers, True, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/users/<user_id>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/users", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/users/<user_id>", headers, True, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/users/<user_id>", headers, True, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/groups", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/groups", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/groups/<group_id>", headers, True, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/groups/<group_id>", headers, True, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/logins", headers, True, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/tokens", headers, True, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/tokens/<token>", headers, True, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/stats", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/mail", headers, True, paths
)
# logout MUST be the last one or the token will be invalidated!! :-)
paths = self.check_endpoint(client, "GET", "/auth/logout", headers, True, paths)
assert paths == []
self.delete_user(client, uuid)
def test_staff(self, client: FlaskClient) -> None:
if not Env.get_bool("AUTH_ENABLE"):
log.warning("Skipping staff authorizations tests")
return
auth = Connector.get_authentication_instance()
if "staff_user" not in [r.name for r in auth.get_roles()]: # pragma: no cover
log.warning("Skipping authorization tests on role Staff (not enabled)")
return
# List of all paths to be tested. After each test a path will be removed.
# At the end the list is expected to be empty
paths = self.get_paths(client)
uuid, data = self.create_user(client, roles=[Role.STAFF])
headers, _ = self.do_login(client, data.get("email"), data.get("password"))
# These are public
paths = self.check_endpoint(client, "GET", "/api/status", headers, True, paths)
paths = self.check_endpoint(client, "GET", "/api/specs", headers, True, paths)
paths = self.check_endpoint(client, "POST", "/auth/login", headers, True, paths)
if Env.get_int("AUTH_MAX_LOGIN_ATTEMPTS") > 0:
paths = self.check_endpoint(
client, "POST", "/auth/login/unlock/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_REGISTRATION"):
paths = self.check_endpoint(
client, "POST", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/auth/profile/activate", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile/activate/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_PASSWORD_RESET") and Connector.check_availability(
"smtp"
):
paths = self.check_endpoint(
client, "POST", "/auth/reset", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/reset/<token>", headers, True, paths
)
# These are allowed to each user
paths = self.check_endpoint(client, "GET", "/auth/status", headers, True, paths)
paths = self.check_endpoint(
client, "GET", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PATCH", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(client, "GET", "/auth/tokens", headers, True, paths)
paths = self.check_endpoint(
client, "DELETE", "/auth/tokens/<token>", headers, True, paths
)
if Connector.check_availability("pushpin"):
paths = self.check_endpoint(
client, "PUT", "/api/socket/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/socket/<channel>", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/stream/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/stream/<channel>", headers, True, paths
)
# These are allowed to coordinators
paths = self.check_endpoint(
client, "GET", "/api/group/users", headers, False, paths
)
# These are allowed to staff
# ... none
# These are allowed to admins
paths = self.check_endpoint(
client, "GET", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/logins", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/tokens", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/tokens/<token>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/stats", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/mail", headers, False, paths
)
# logout MUST be the last one or the token will be invalidated!! :-)
paths = self.check_endpoint(client, "GET", "/auth/logout", headers, True, paths)
assert paths == []
self.delete_user(client, uuid)
def test_coordinator(self, client: FlaskClient) -> None:
if not Env.get_bool("AUTH_ENABLE"):
log.warning("Skipping coordinator authorizations tests")
return
# List of all paths to be tested. After each test a path will be removed.
# At the end the list is expected to be empty
paths = self.get_paths(client)
uuid, data = self.create_user(client, roles=[Role.COORDINATOR])
headers, _ = self.do_login(client, data.get("email"), data.get("password"))
# These are public
paths = self.check_endpoint(client, "GET", "/api/status", headers, True, paths)
paths = self.check_endpoint(client, "GET", "/api/specs", headers, True, paths)
paths = self.check_endpoint(client, "POST", "/auth/login", headers, True, paths)
if Env.get_int("AUTH_MAX_LOGIN_ATTEMPTS") > 0:
paths = self.check_endpoint(
client, "POST", "/auth/login/unlock/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_REGISTRATION"):
paths = self.check_endpoint(
client, "POST", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/auth/profile/activate", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile/activate/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_PASSWORD_RESET") and Connector.check_availability(
"smtp"
):
paths = self.check_endpoint(
client, "POST", "/auth/reset", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/reset/<token>", headers, True, paths
)
# These are allowed to each user
paths = self.check_endpoint(client, "GET", "/auth/status", headers, True, paths)
paths = self.check_endpoint(
client, "GET", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PATCH", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(client, "GET", "/auth/tokens", headers, True, paths)
paths = self.check_endpoint(
client, "DELETE", "/auth/tokens/<token>", headers, True, paths
)
if Connector.check_availability("pushpin"):
paths = self.check_endpoint(
client, "PUT", "/api/socket/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/socket/<channel>", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/stream/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/stream/<channel>", headers, True, paths
)
# These are allowed to coordinators
paths = self.check_endpoint(
client, "GET", "/api/group/users", headers, True, paths
)
# These are allowed to staff
# ... none
# These are allowed to admins
paths = self.check_endpoint(
client, "GET", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/logins", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/tokens", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/tokens/<token>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/stats", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/mail", headers, False, paths
)
# logout MUST be the last one or the token will be invalidated!! :-)
paths = self.check_endpoint(client, "GET", "/auth/logout", headers, True, paths)
assert paths == []
self.delete_user(client, uuid)
def test_user(self, client: FlaskClient) -> None:
if not Env.get_bool("AUTH_ENABLE"):
log.warning("Skipping user authorizations tests")
return
# List of all paths to be tested. After each test a path will be removed.
# At the end the list is expected to be empty
paths = self.get_paths(client)
uuid, data = self.create_user(client, roles=[Role.USER])
headers, _ = self.do_login(client, data.get("email"), data.get("password"))
# These are public
paths = self.check_endpoint(client, "GET", "/api/status", headers, True, paths)
paths = self.check_endpoint(client, "GET", "/api/specs", headers, True, paths)
paths = self.check_endpoint(client, "POST", "/auth/login", headers, True, paths)
if Env.get_int("AUTH_MAX_LOGIN_ATTEMPTS") > 0:
paths = self.check_endpoint(
client, "POST", "/auth/login/unlock/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_REGISTRATION"):
paths = self.check_endpoint(
client, "POST", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/auth/profile/activate", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile/activate/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_PASSWORD_RESET") and Connector.check_availability(
"smtp"
):
paths = self.check_endpoint(
client, "POST", "/auth/reset", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/reset/<token>", headers, True, paths
)
# These are allowed to each user
paths = self.check_endpoint(client, "GET", "/auth/status", headers, True, paths)
paths = self.check_endpoint(
client, "GET", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PATCH", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(client, "GET", "/auth/tokens", headers, True, paths)
paths = self.check_endpoint(
client, "DELETE", "/auth/tokens/<token>", headers, True, paths
)
if Connector.check_availability("pushpin"):
paths = self.check_endpoint(
client, "PUT", "/api/socket/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/socket/<channel>", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/stream/<channel>/<sync>", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/api/stream/<channel>", headers, True, paths
)
# These are allowed to coordinators
paths = self.check_endpoint(
client, "GET", "/api/group/users", headers, False, paths
)
# These are allowed to staff
# ... none
# These are allowed to admins
paths = self.check_endpoint(
client, "GET", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/logins", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/tokens", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/tokens/<token>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/stats", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/mail", headers, False, paths
)
# logout MUST be the last one or the token will be invalidated!! :-)
paths = self.check_endpoint(client, "GET", "/auth/logout", headers, True, paths)
assert paths == []
self.delete_user(client, uuid)
def test_public(self, client: FlaskClient) -> None:
# List of all paths to be tested. After each test a path will be removed.
# At the end the list is expected to be empty
paths = self.get_paths(client)
headers = None
# These are public
paths = self.check_endpoint(client, "GET", "/api/status", headers, True, paths)
paths = self.check_endpoint(client, "GET", "/api/specs", headers, True, paths)
if not Env.get_bool("AUTH_ENABLE"):
assert paths == []
log.warning("Skipping other public authorizations tests")
return
paths = self.check_endpoint(client, "POST", "/auth/login", headers, True, paths)
if Env.get_int("AUTH_MAX_LOGIN_ATTEMPTS") > 0:
paths = self.check_endpoint(
client, "POST", "/auth/login/unlock/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_REGISTRATION"):
paths = self.check_endpoint(
client, "POST", "/auth/profile", headers, True, paths
)
paths = self.check_endpoint(
client, "POST", "/auth/profile/activate", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile/activate/<token>", headers, True, paths
)
if Env.get_bool("ALLOW_PASSWORD_RESET") and Connector.check_availability(
"smtp"
):
paths = self.check_endpoint(
client, "POST", "/auth/reset", headers, True, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/reset/<token>", headers, True, paths
)
# These are allowed to each user
paths = self.check_endpoint(
client, "GET", "/auth/status", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/auth/profile", headers, False, paths
)
paths = self.check_endpoint(
client, "PATCH", "/auth/profile", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/auth/profile", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/auth/tokens", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/auth/tokens/<token>", headers, False, paths
)
if Connector.check_availability("pushpin"):
paths = self.check_endpoint(
client, "PUT", "/api/socket/<channel>/<sync>", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/socket/<channel>", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/stream/<channel>/<sync>", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/stream/<channel>", headers, False, paths
)
# These are allowed to coordinators
paths = self.check_endpoint(
client, "GET", "/api/group/users", headers, False, paths
)
# These are allowed to staff
# ... none
# These are allowed to admins
paths = self.check_endpoint(
client, "GET", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/users", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/users/<user_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/groups", headers, False, paths
)
paths = self.check_endpoint(
client, "PUT", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/groups/<group_id>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/logins", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/tokens", headers, False, paths
)
paths = self.check_endpoint(
client, "DELETE", "/api/admin/tokens/<token>", headers, False, paths
)
paths = self.check_endpoint(
client, "GET", "/api/admin/stats", headers, False, paths
)
paths = self.check_endpoint(
client, "POST", "/api/admin/mail", headers, False, paths
)
# logout MUST be the last one or the token will be invalidated!! :-)
paths = self.check_endpoint(
client, "GET", "/auth/logout", headers, False, paths
)
assert paths == []
| 37.98057 | 88 | 0.557825 | 3,193 | 29,321 | 5.020357 | 0.057 | 0.103868 | 0.152838 | 0.240175 | 0.888334 | 0.888022 | 0.879039 | 0.876918 | 0.876918 | 0.876918 | 0 | 0.000796 | 0.314075 | 29,321 | 771 | 89 | 38.029831 | 0.796241 | 0.067869 | 0 | 0.587948 | 0 | 0 | 0.175246 | 0.066469 | 0 | 0 | 0 | 0 | 0.016287 | 1 | 0.013029 | false | 0.014658 | 0.016287 | 0 | 0.045603 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
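The authorization tests above keep a checklist of every routable (method, path) pair and strike each one off as it is verified, so the final `assert paths == []` proves full endpoint coverage. A minimal sketch of that checklist idiom — the helper below is hypothetical and its tuple-based signature is an assumption, not the project's real `check_endpoint`:

```python
# Hypothetical, simplified version of the checklist idiom used in the
# tests above: each verified (method, path) pair is removed from the
# list, and an empty list at the end proves every endpoint was covered.
def check_endpoint(method: str, path: str, paths: list) -> list:
    # A real implementation would also issue the request here and
    # assert on the expected HTTP status code.
    assert (method, path) in paths, f"unknown or already tested: {method} {path}"
    return [p for p in paths if p != (method, path)]

paths = [("GET", "/api/status"), ("POST", "/auth/login")]
paths = check_endpoint("GET", "/api/status", paths)
paths = check_endpoint("POST", "/auth/login", paths)
assert paths == []  # every known endpoint was exercised exactly once
```

Returning the pruned list (rather than mutating in place) mirrors the `paths = self.check_endpoint(...)` reassignment style of the tests.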
1c3a99e3183356804b19506bb3a1a514d934c5ba | 77 | py | Python | geco/mips/packing/__init__.py | FreestyleBuild/GeCO | 6db1a549b3145b3bc5d3025a9bccc03be6575564 | [
"MIT"
] | 8 | 2020-12-16T09:59:05.000Z | 2022-03-18T09:48:43.000Z | geco/mips/packing/__init__.py | FreestyleBuild/GeCO | 6db1a549b3145b3bc5d3025a9bccc03be6575564 | [
"MIT"
] | 101 | 2020-11-09T10:20:03.000Z | 2022-03-24T13:50:06.000Z | geco/mips/packing/__init__.py | FreestyleBuild/GeCO | 6db1a549b3145b3bc5d3025a9bccc03be6575564 | [
"MIT"
] | 3 | 2021-04-06T13:26:03.000Z | 2022-03-22T13:22:16.000Z | from geco.mips.packing.generic import *
from geco.mips.packing.tang import *
| 25.666667 | 39 | 0.792208 | 12 | 77 | 5.083333 | 0.583333 | 0.262295 | 0.393443 | 0.622951 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103896 | 77 | 2 | 40 | 38.5 | 0.884058 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
1c60e8af07d8e929732a8d460e00ba5de27fdcfe | 5,863 | py | Python | tests/jit/test_basic.py | mswart/topaz | 4bc02d6f4bf29c20f045223ecb6ae8a5cc9df2ae | [
"BSD-3-Clause"
] | 1 | 2016-07-17T09:59:55.000Z | 2016-07-17T09:59:55.000Z | tests/jit/test_basic.py | mswart/topaz | 4bc02d6f4bf29c20f045223ecb6ae8a5cc9df2ae | [
"BSD-3-Clause"
] | null | null | null | tests/jit/test_basic.py | mswart/topaz | 4bc02d6f4bf29c20f045223ecb6ae8a5cc9df2ae | [
"BSD-3-Clause"
] | null | null | null | from .base import BaseJITTest
class TestBasic(BaseJITTest):
def test_while_loop(self, topaz, tmpdir):
traces = self.run(topaz, tmpdir, """
i = 0
while i < 10000
i += 1
end
""")
self.assert_matches(traces[0].loop, """
label(p0, p1, p3, p4, p5, p6, p7, p10, i35, p20, p22, p28, descr=TargetToken(4310781936))
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
debug_merge_point(0, 0, '<main> at LOAD_CONST')
debug_merge_point(0, 0, '<main> at SEND')
setfield_gc(p22, 21, descr=<FieldS topaz.executioncontext.ExecutionContext.inst_last_instr 24>)
guard_not_invalidated(descr=<Guard0x100febda8>)
p37 = force_token()
i38 = int_lt(i35, 10000)
guard_true(i38, descr=<Guard0x100febcb8>)
debug_merge_point(0, 0, '<main> at JUMP_IF_FALSE')
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
debug_merge_point(0, 0, '<main> at LOAD_CONST')
debug_merge_point(0, 0, '<main> at SEND')
p39 = force_token()
i40 = int_add(i35, 1)
debug_merge_point(0, 0, '<main> at STORE_DEREF')
debug_merge_point(0, 0, '<main> at DISCARD_TOP')
debug_merge_point(0, 0, '<main> at JUMP')
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
setfield_gc(p22, 35, descr=<FieldS topaz.executioncontext.ExecutionContext.inst_last_instr 24>)
jump(p0, p1, p3, p4, p5, p6, p7, p10, i40, p20, p22, p28, descr=TargetToken(4310781936))
""")
def test_constant_string(self, topaz, tmpdir):
traces = self.run(topaz, tmpdir, """
i = 0
while i < 10000
i += "a".length
end
""")
self.assert_matches(traces[0].loop, """
label(p0, p1, p3, p4, p5, p6, p7, p10, i36, p20, p22, p28, descr=TargetToken(4310781936))
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
debug_merge_point(0, 0, '<main> at LOAD_CONST')
debug_merge_point(0, 0, '<main> at SEND')
setfield_gc(p22, 21, descr=<FieldS topaz.executioncontext.ExecutionContext.inst_last_instr 24>)
guard_not_invalidated(descr=<Guard0x100ff6818>)
p38 = force_token()
i39 = int_lt(i36, 10000)
guard_true(i39, descr=<Guard0x100ff6728>)
debug_merge_point(0, 0, '<main> at JUMP_IF_FALSE')
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
debug_merge_point(0, 0, '<main> at LOAD_CONST')
debug_merge_point(0, 0, '<main> at COERCE_STRING')
debug_merge_point(0, 0, '<main> at SEND')
p40 = force_token()
debug_merge_point(0, 0, '<main> at SEND')
p41 = force_token()
i42 = int_add(i36, 1)
debug_merge_point(0, 0, '<main> at STORE_DEREF')
debug_merge_point(0, 0, '<main> at DISCARD_TOP')
debug_merge_point(0, 0, '<main> at JUMP')
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
setfield_gc(p22, 41, descr=<FieldS topaz.executioncontext.ExecutionContext.inst_last_instr 24>)
jump(p0, p1, p3, p4, p5, p6, p7, p10, i42, p20, p22, p28, descr=TargetToken(4310781936))
""")
def test_method_missing(self, topaz, tmpdir):
traces = self.run(topaz, tmpdir, """
i = 0
while i < 10000
Array.try_convert(1)
i += 1
end
""")
self.assert_matches(traces[0].loop, """
label(p0, p1, p3, p4, p5, p7, p10, i56, p19, p22, p24, p30, descr=TargetToken(4310782288))
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
debug_merge_point(0, 0, '<main> at LOAD_CONST')
debug_merge_point(0, 0, '<main> at SEND')
setfield_gc(p24, 21, descr=<FieldS topaz.executioncontext.ExecutionContext.inst_last_instr 24>)
guard_not_invalidated(descr=<Guard0x101e21e20>)
p59 = force_token()
i60 = int_lt(i56, 10000)
guard_true(i60, descr=<Guard0x101e209f8>)
debug_merge_point(0, 0, '<main> at JUMP_IF_FALSE')
debug_merge_point(0, 0, '<main> at LOAD_SCOPE')
debug_merge_point(0, 0, '<main> at LOAD_LOCAL_CONSTANT')
debug_merge_point(0, 0, '<main> at LOAD_CONST')
debug_merge_point(0, 0, '<main> at SEND')
p61 = force_token()
debug_merge_point(1, 1, 'try_convert at LOAD_SCOPE')
debug_merge_point(1, 1, 'try_convert at LOAD_LOCAL_CONSTANT')
debug_merge_point(1, 1, 'try_convert at LOAD_DEREF')
debug_merge_point(1, 1, 'try_convert at LOAD_SCOPE')
debug_merge_point(1, 1, 'try_convert at LOAD_LOCAL_CONSTANT')
debug_merge_point(1, 1, 'try_convert at LOAD_CONST')
debug_merge_point(1, 1, 'try_convert at SEND')
p62 = force_token()
p63 = force_token()
p64 = force_token()
p65 = force_token()
p66 = force_token()
p67 = force_token()
p68 = force_token()
p69 = force_token()
p70 = force_token()
p71 = force_token()
p72 = force_token()
debug_merge_point(1, 1, 'try_convert at RETURN')
p73 = force_token()
debug_merge_point(0, 0, '<main> at DISCARD_TOP')
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
debug_merge_point(0, 0, '<main> at LOAD_CONST')
debug_merge_point(0, 0, '<main> at SEND')
p74 = force_token()
i75 = int_add(i56, 1)
debug_merge_point(0, 0, '<main> at STORE_DEREF')
debug_merge_point(0, 0, '<main> at DISCARD_TOP')
debug_merge_point(0, 0, '<main> at JUMP')
debug_merge_point(0, 0, '<main> at LOAD_DEREF')
setfield_gc(p24, 48, descr=<FieldS topaz.executioncontext.ExecutionContext.inst_last_instr 24>)
setfield_gc(p1, p73, descr=<FieldP topaz.frame.Frame.vable_token 32>)
jump(p0, p1, p3, p4, p5, p7, p10, i75, p19, p22, p24, p30, descr=TargetToken(4310782288))
""")
| 45.804688 | 103 | 0.614191 | 829 | 5,863 | 4.09047 | 0.155609 | 0.141551 | 0.212327 | 0.188735 | 0.801828 | 0.801828 | 0.796225 | 0.764081 | 0.713654 | 0.682395 | 0 | 0.104119 | 0.254648 | 5,863 | 127 | 104 | 46.165354 | 0.671854 | 0 | 0 | 0.536585 | 0 | 0.04878 | 0.912332 | 0.139519 | 0 | 0 | 0 | 0 | 0.02439 | 1 | 0.02439 | false | 0 | 0.00813 | 0 | 0.04065 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
1c9c3b9fc90103be365795fb3d6a74ebc7652ba5 | 251 | py | Python | Data Structures/LinkedLists/Python/__init__.py | Praggya17/HacktoberFestContribute | 098cb1012f1f2ed6ca6b3544a7b962b6c49e2643 | [
"MIT"
] | 22,426 | 2017-01-17T04:01:44.000Z | 2022-03-31T12:06:16.000Z | Data Structures/LinkedLists/Python/__init__.py | Praggya17/HacktoberFestContribute | 098cb1012f1f2ed6ca6b3544a7b962b6c49e2643 | [
"MIT"
] | 523 | 2017-04-18T12:05:11.000Z | 2022-03-20T11:10:41.000Z | Data Structures/LinkedLists/Python/__init__.py | Praggya17/HacktoberFestContribute | 098cb1012f1f2ed6ca6b3544a7b962b6c49e2643 | [
"MIT"
] | 4,900 | 2017-01-19T23:47:05.000Z | 2022-03-31T10:00:47.000Z | from .reverse import *
from .is_sorted import *
from .remove_range import *
from .swap_in_pairs import *
from .rotate_list import *
from .is_cyclic import *
from .merge_two_list import *
from .is_palindrome import *
from .copy_random_pointer import *
| 25.1 | 34 | 0.784861 | 38 | 251 | 4.894737 | 0.5 | 0.430108 | 0.193548 | 0.172043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.143426 | 251 | 9 | 35 | 27.888889 | 0.865116 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
98c1a1edffc3f4d9fc164ccc7148fad12c4081e2 | 98,587 | py | Python | test/f3.py | PrudhviGNV/AI-models-for-bug-prediction | 35ad3707f25db700c9ddd79aaed59fab81d1a2b5 | [
"MIT"
] | 5 | 2021-04-29T09:44:24.000Z | 2022-03-03T10:48:04.000Z | test/f3.py | PrudhviGNV/AI-models-for-bug-prediction | 35ad3707f25db700c9ddd79aaed59fab81d1a2b5 | [
"MIT"
] | null | null | null | test/f3.py | PrudhviGNV/AI-models-for-bug-prediction | 35ad3707f25db700c9ddd79aaed59fab81d1a2b5 | [
"MIT"
] | 1 | 2021-05-03T18:16:07.000Z | 2021-05-03T18:16:07.000Z | # -*- coding: utf-8 -*-
"""f3.ipynb
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/1sGwSaZ-D6bDXpnSjz2pAUMWM2CUXyEbg
"""
# Importing the libraries
import numpy as np
import pandas as pd
a = 1
b = 2
c = "ksjak"
d = "akjfghak"
ea = " adjklagl"
aa = 1
ba = 2
ca = "ksjak"
da = "akjfghak"
ea = " adjklagl"
aa = 1
ba = 2
ca = "ksjak"
da = "akjfghak"
ed = " adjklagl"
ag = 1
bg = 2
ce = "ksjak"
du = "akjfghak"
ef = " adjklagl"
ad = 1
bd = 2
co = "ksjak"
dl = "akjfghak"
ed = " adjklagl"
ad = 1
bd = 2
cd = "ksjak"
dd = "akjfghak"
ed = " adjklagl"
def a():
    # Body re-indented: the original lines sat at column 0 under the def,
    # which is an IndentationError.
    a = 1
    b = 2
    c = "ksjak"
    d = "akjfghak"
    ea = " adjklagl"
    aa = 1
    ba = 2
    ca = "ksjak"
    da = "akjfghak"
    ea = " adjklagl"
    aa = 1
    ba = 2
    ca = "ksjak"
    da = "akjfghak"
    ed = " adjklagl"
    ag = 1
    bg = 2
    ce = "ksjak"
    du = "akjfghak"
    ef = " adjklagl"
    ad = 1
    bd = 2
    co = "ksjak"
    dl = "akjfghak"
    ed = " adjklagl"
    ad = 1
    bd = 2
    cd = "ksjak"
    dd = "akjfghak"
    ed = " adjklagl"
def b():
    # Body re-indented: the original lines sat at column 0 under the def,
    # which is an IndentationError.
    a = 1
    b = 2
    c = "ksjak"
    d = "akjfghak"
    ea = " adjklagl"
    aa = 1
    ba = 2
    ca = "ksjak"
    da = "akjfghak"
    ea = " adjklagl"
    aa = 1
    ba = 2
    ca = "ksjak"
    da = "akjfghak"
    ed = " adjklagl"
    ag = 1
    bg = 2
    ce = "ksjak"
    du = "akjfghak"
    ef = " adjklagl"
    ad = 1
    bd = 2
    co = "ksjak"
    dl = "akjfghak"
    ed = " adjklagl"
    ad = 1
    bd = 2
    cd = "ksjak"
    dd = "akjfghak"
    ed = " adjklagl"
def cc():
if 0:
print("Start")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
# Rate x time = distance: given any two of the three, compute the missing one.
# (dict.has_key() was removed in Python 3; use the `in` operator instead.)
if "rate" in args and "time" in args:
    args["distance"] = args["rate"] * args["time"]
elif "rate" in args and "distance" in args:
    args["time"] = args["distance"] / args["rate"]
elif "time" in args and "distance" in args:
    args["rate"] = args["distance"] / args["time"]
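The chain above amounts to solving d = r * t for whichever quantity is absent. A minimal standalone sketch of the same idea as a reusable function (the name `solve_rate_time_distance` is illustrative, not from the original; it assumes numeric values and non-zero divisors):

```python
def solve_rate_time_distance(args):
    """Fill in the missing one of rate, time, distance, given the other two.

    Mutates and returns ``args``. Assumes numeric values and that any
    divisor (rate or time) is non-zero.
    """
    if "rate" in args and "time" in args:
        args["distance"] = args["rate"] * args["time"]
    elif "rate" in args and "distance" in args:
        args["time"] = args["distance"] / args["rate"]
    elif "time" in args and "distance" in args:
        args["rate"] = args["distance"] / args["time"]
    return args
```

Wrapping the chain in a function avoids repeating it at every call site, which is exactly the duplication the inline version suffered from.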
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("Ntg")
if args.has_key( "rate" ) and args.has_key( "time" ): args["distance"]= args["rate"]*args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
args["rate"]= args["distance"]/args["time"]
elif args.has_key( "rate" ) and args.has_key( "distance" ):
args["time"]= args["distance"]/args["rate"]
elif args.has_key( "time" ) and args.has_key( "distance" ):
print("NTg")
else:
print("End") | 51.833333 | 99 | 0.625539 | 14,482 | 98,587 | 4.131612 | 0.00511 | 0.214794 | 0.306849 | 0.199452 | 0.996306 | 0.996306 | 0.996306 | 0.996306 | 0.996306 | 0.996306 | 0 | 0.000513 | 0.150152 | 98,587 | 1,902 | 100 | 51.833333 | 0.713633 | 0.002069 | 0 | 0.995215 | 1 | 0 | 0.248923 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.001595 | false | 0 | 0.001063 | 0 | 0.002658 | 0.05848 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
# --- File: facility_stat_report/fstat_reportlab_facility_data.py
# --- Repo: senthil10/dc_reporting_scripts @ 9d7d75a71a7f3cd28bec2e0dc3ea1dd92b0018b0 (MIT)
] | null | null | null | #/usr/bin/env python
# -*- coding: utf-8 -*-
# Plot layout values for each individual facility.
# This module is a dependency of the fstat_pdf_gen.py script.
from reportlab.lib.units import mm
base_dir = "/Users/senpa282/opt/publication_reporting/facility_stat_plot/pdfs/"
doc_width = 250*mm
doc_height = 150*mm
doc_pads = 0*mm
show_bound = 0
def calc_frame_width(dw, dpd, ncol=3):
return (dw - ((ncol-1)*dpd))/ncol
def calc_frame_height(dh, dpd, nrow=2):
return (dh - ((nrow-1)*dpd))/nrow
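A quick check of the two helpers: with the zero padding used throughout this config (doc_pads = 0*mm), three columns and two rows tile the 250 mm x 150 mm page exactly. A minimal sketch (the mm constant is reproduced so the snippet stands alone):

```python
mm = 72 / 25.4  # points per millimetre, as in reportlab.lib.units

def calc_frame_width(dw, dpd, ncol=3):
    return (dw - ((ncol - 1) * dpd)) / ncol

def calc_frame_height(dh, dpd, nrow=2):
    return (dh - ((nrow - 1) * dpd)) / nrow

doc_width, doc_height, doc_pads = 250 * mm, 150 * mm, 0 * mm
fw = calc_frame_width(doc_width, doc_pads)
fh = calc_frame_height(doc_height, doc_pads)
# Columns plus inter-column gaps recover the full width; likewise for rows.
assert abs(3 * fw + 2 * doc_pads - doc_width) < 1e-9
assert abs(2 * fh + doc_pads - doc_height) < 1e-9
```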
facility_graph_data = {}
# Facilities with huge user data #
# For facility 'National Genomics Infrastructure'
facility_graph_data['National Genomics Infrastructure'] = dict(
doc = dict(
fname = base_dir + "National Genomics Infrastructure.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("small_inner_heading", "small_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 7*mm,
topPadding = 6*mm
),
ctbar = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "ctbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
jfbar = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "jfbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=78, imh=59),
u18 = dict(imw=85, imh=61),
u19 = dict(imw=85, imh=61)
)
)
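The fdict keys above (x1, y1, width, height, id, showBoundary, paddings) line up with the keyword parameters of reportlab's `platypus.Frame`, so fstat_pdf_gen.py can presumably build each frame with `Frame(**cfg)`; that unpacking is an assumption about the consumer script, which is not shown here. A geometry-only sketch of the NGI grid, with zero padding as configured:

```python
mm = 72 / 25.4  # points per millimetre
doc_width, doc_height = 250 * mm, 150 * mm
fw, fh = doc_width / 3, doc_height / 2  # grid cell size with zero padding

cfgs = {}
for y1, fids in [(fh, ("fstat", "ctbar", "jfbar")),    # top row: stats and bars
                 (0.0, ("usr17", "usr18", "usr19"))]:  # bottom row: user pies
    for col, fid in enumerate(fids):
        cfgs[fid] = dict(x1=col * fw, y1=y1, width=fw, height=fh,
                         id=fid, showBoundary=0)
# With reportlab installed, each cfg could become platypus.Frame(**cfg).
```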
# For facility 'Support, Infrastructure and Training'
facility_graph_data['Support, Infrastructure and Training'] = dict(
doc = dict(
fname = base_dir + "Support Infrastructure and Training.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("small_inner_heading", "small_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 8*mm,
topPadding = 5*mm
),
ctbar = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "ctbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
jfbar = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "jfbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 4*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=76, imh=58),
u18 = dict(imw=82, imh=60),
u19 = dict(imw=86, imh=61)
)
)
# For facility 'Compute and Storage'
facility_graph_data['Compute and Storage'] = dict(
doc = dict(
fname = base_dir + "Compute and Storage.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 11*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=86, imh=65),
u18 = dict(imw=86, imh=65),
u19 = dict(imw=86, imh=65)
)
)
# For facility 'Biochemical Imaging Centre Umea'
facility_graph_data['Biochemical Imaging Centre Umea'] = dict(
doc = dict(
fname = base_dir + "Biochemical Imaging Centre Umea.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 12*mm,
topPadding = 5*mm
),
ctbar = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "ctbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
jfbar = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "jfbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 8*mm,
topPadding = 3.5*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=70, imh=58),
u18 = dict(imw=92, imh=63),
u19 = dict(imw=92, imh=63)
)
)
# Facilities with all publication plots but only two user plots #
# For facility 'AIDA Data Hub'
facility_graph_data['AIDA Data Hub'] = dict(
doc = dict(
fname = base_dir + "AIDA Data Hub.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
flist = ['fstat', 'ctbar', 'jfbar', 'usr18', 'usr19'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 13*mm,
topPadding = 7*mm
),
usr18 = dict(
x1 = (calc_frame_width(doc_width, doc_pads)/2)-5*mm,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 3*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (calc_frame_width(doc_width, doc_pads)*1.5)+4*mm,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=80, imh=60),
u19 = dict(imw=85, imh=62)
)
)
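With only two user plots, this group skips the first column and nudges the frames toward the centre: usr18 starts half a cell in (minus 5 mm) and usr19 at one and a half cells (plus 4 mm). A sketch of that arithmetic, assuming zero padding as configured:

```python
mm = 72 / 25.4  # points per millimetre
doc_width = 250 * mm
fw = doc_width / 3            # calc_frame_width with zero padding
x_usr18 = fw / 2 - 5 * mm     # half a cell in, nudged left
x_usr19 = fw * 1.5 + 4 * mm   # one and a half cells in, nudged right
# The pair straddles the page centre instead of hugging the left edge.
```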
# For facility 'Intravital Microscopy Facility'
facility_graph_data['Intravital Microscopy Facility'] = dict(
doc = dict(
fname = base_dir + "Intravital Microscopy Facility.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
flist = ['fstat', 'ctbar', 'jfbar', 'usr18', 'usr19'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 12*mm,
topPadding = 6*mm
),
usr18 = dict(
x1 = (calc_frame_width(doc_width, doc_pads)/2)-6*mm,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 1*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (calc_frame_width(doc_width, doc_pads)*1.5)+5*mm,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=82, imh=60),
u19 = dict(imw=88, imh=61)
)
)
# Facilities with no publication plots but two user plots #
# For facility 'Ancient DNA'
facility_graph_data['Ancient DNA'] = dict(
doc = dict(
fname = base_dir + "Ancient DNA.pdf",
dwidth = doc_width,
dheight = doc_height/2,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
use_default = False,
flist = ['fstat', 'usr18', 'usr19'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 13*mm,
topPadding = 4*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "ctbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 3*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "jfbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 3*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=75, imh=59)
)
)
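Entries in this group carry an explicit flist (and use_default = False), which looks like a whitelist telling the generator which frames to build. A hypothetical sketch of that selection; the helper name and the fallback-to-all-keys behaviour are assumptions, not taken from fstat_pdf_gen.py:

```python
def select_frames(entry):
    """Return frame configs in flist order, else every fdict entry."""
    frames = entry["frames"]
    fdict = frames["fdict"]
    order = frames.get("flist", list(fdict))
    return [fdict[name] for name in order if name in fdict]

# Demo entry: flist narrows three configured frames down to two.
demo = dict(frames=dict(flist=["fstat", "usr18"],
                        fdict=dict(fstat={"id": "fstat"},
                                   usr18={"id": "usr18"},
                                   usr19={"id": "usr19"})))
```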
# Facilities with two publication plots but no user plots #
# For facility 'Advanced FISH Technologies'
facility_graph_data['Advanced FISH Technologies'] = dict(
doc = dict(
fname = base_dir + "Advanced FISH Technologies.pdf",
dwidth = doc_width,
dheight = doc_height/2,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
use_default = False,
flist = ['fstat', 'ctbar', 'jfbar'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 11*mm,
topPadding = 4*mm
),
ctbar = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "ctbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 3*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
jfbar = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "jfbar",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 3*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
)
)
# Facilities with three plots (either all user plots, or two bar plots and one user plot)
# For facility 'Glycoproteomics'
facility_graph_data['Glycoproteomics'] = dict(
doc = dict(
fname = base_dir + "Glycoproteomics.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
flist = ['fstat', 'ctbar', 'jfbar', 'usr17', 'usr18', 'usr19'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 7*mm
),
ctbar = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "ctbar",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
jfbar = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "jfbar",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 1*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=85, imh=62),
u18 = dict(imw=87, imh=62),
u19 = dict(imw=81, imh=61)
)
)
facility_graph_data['Clinical Genomics Orebro'] = dict(
doc = dict(
fname = base_dir + "Clinical Genomics Orebro.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
use_default = False,
flist = ['fstat', 'ctbar', 'jfbar', 'usr19'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 7*mm
),
ctbar = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "ctbar",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
jfbar = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "jfbar",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u19 = dict(imw=93, imh=63)
)
)
# Facilities with only stats, no plots
# For facility 'Exposomics'
facility_graph_data['Exposomics'] = dict(
doc = dict(
fname = base_dir + "Exposomics.pdf",
dwidth = doc_width/3,
dheight = doc_height/2,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
use_default = False,
flist = ['fstat'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 12*mm,
topPadding = 7*mm
)
)
)
)
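For stat-only entries the document itself is shrunk to one grid cell (dwidth = doc_width/3, dheight = doc_height/2), yet the frame size is still computed from the full page dimensions. That only works because doc_pads is zero, as this sketch checks:

```python
mm = 72 / 25.4  # points per millimetre
doc_width, doc_height, doc_pads = 250 * mm, 150 * mm, 0 * mm
page_w, page_h = doc_width / 3, doc_height / 2  # shrunken document
frame_w = (doc_width - 2 * doc_pads) / 3        # calc_frame_width on full width
frame_h = (doc_height - doc_pads) / 2           # calc_frame_height on full height
# With zero padding the single fstat frame fills the shrunken page exactly.
```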
# For facility 'Clinical Genomics Linkoping'
facility_graph_data['Clinical Genomics Linkoping'] = dict(
doc = dict(
fname = base_dir + "Clinical Genomics Linkoping.pdf",
dwidth = doc_width/3,
dheight = doc_height/2,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
use_default = False,
flist = ['fstat'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 12*mm,
topPadding = 7*mm
)
)
)
)
# For facility 'Clinical Genomics Umea'
facility_graph_data['Clinical Genomics Umea'] = dict(
doc = dict(
fname = base_dir + "Clinical Genomics Umea.pdf",
dwidth = doc_width/3,
dheight = doc_height/2,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
use_default = False,
flist = ['fstat'],
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 12*mm,
topPadding = 7*mm
)
)
)
)
# Facilities with pie-size tweaks #
# For facility 'BioImage Informatics'
facility_graph_data['BioImage Informatics'] = dict(
doc = dict(
fname = base_dir + "BioImage Informatics.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 12*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 5*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=72, imh=58)
)
)
# For facility 'Advanced Light Microscopy'
facility_graph_data['Advanced Light Microscopy'] = dict(
doc = dict(
fname = base_dir + "Advanced Light Microscopy.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 13*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 5*mm,
topPadding = 1.5*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=76, imh=60),
u18 = dict(imw=82, imh=60),
u19 = dict(imw=82, imh=60)
)
)
# For facility 'Cryo-EM'
facility_graph_data['Cryo-EM'] = dict(
sdoc = dict(
fname = base_dir + "Cryo-EM.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 13*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 5*mm,
topPadding = 1.5*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=72, imh=60),
u18 = dict(imw=84, imh=60),
u19 = dict(imw=84, imh=60)
)
)
# For facility 'Cell Profiling'
facility_graph_data['Cell Profiling'] = dict(
sdoc = dict(
fname = base_dir + "Cell Profiling.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 15*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 5*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 1.5*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=84, imh=60)
)
)
# For facility 'In Situ Sequencing'
facility_graph_data['In Situ Sequencing'] = dict(
sdoc = dict(
fname = base_dir + "In Situ Sequencing.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 16*mm,
topPadding = 6*mm
)
)
)
)
# For facility 'National Resource for Mass Spectrometry Imaging'
facility_graph_data['National Resource for Mass Spectrometry Imaging'] = dict(
sdoc = dict(
fname = base_dir + "National Resource for Mass Spectrometry Imaging.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 15*mm,
topPadding = 8*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 10*mm,
topPadding = 1.5*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=66, imh=58)
)
)
# For facility 'Gothenburg Imaging Mass Spectrometry'
facility_graph_data['Gothenburg Imaging Mass Spectrometry'] = dict(
sdoc = dict(
fname = base_dir + "Gothenburg Imaging Mass Spectrometry.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 15*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 0.5*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=84, imh=61)
)
)
# For facility 'Chemical Biology Consortium Sweden'
facility_graph_data['Chemical Biology Consortium Sweden'] = dict(
sdoc = dict(
fname = base_dir + "Chemical Biology Consortium Sweden.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 12*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 3.5*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=76, imh=59)
)
)
# For facility 'Chemical Proteomics'
facility_graph_data['Chemical Proteomics'] = dict(
sdoc = dict(
fname = base_dir + "Chemical Proteomics.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 13*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 7*mm,
topPadding = 1*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 3*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=71, imh=59),
u18 = dict(imw=82, imh=60)
)
)
# For facility 'Genome Engineering Zebrafish'
facility_graph_data['Genome Engineering Zebrafish'] = dict(
sdoc = dict(
fname = base_dir + "Genome Engineering Zebrafish.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 13*mm,
topPadding = 6*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=78, imh=60)
)
)
# For facility 'Clinical Genomics Gothenburg'
facility_graph_data['Clinical Genomics Gothenburg'] = dict(
sdoc = dict(
fname = base_dir + "Clinical Genomics Gothenburg.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 13*mm,
topPadding = 7*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 4*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
)
)
# For facility 'Clinical Genomics Uppsala'
facility_graph_data['Clinical Genomics Uppsala'] = dict(
sdoc = dict(
fname = base_dir + "Clinical Genomics Uppsala.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 7*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 6*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=73, imh=58),
u18 = dict(imw=79, imh=60)
)
)
# For facility 'Clinical Genomics Stockholm'
facility_graph_data['Clinical Genomics Stockholm'] = dict(
sdoc = dict(
fname = base_dir + "Clinical Genomics Stockholm.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 6*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=74, imh=58),
u18 = dict(imw=85, imh=60),
u19 = dict(imw=76, imh=59)
)
)
# For facility 'Autoimmunity Profiling'
facility_graph_data['Autoimmunity Profiling'] = dict(
sdoc = dict(
fname = base_dir + "Autoimmunity Profiling.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 15*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=75, imh=58),
u18 = dict(imw=77, imh=59),
u19 = dict(imw=86, imh=60)
)
)
# For facility 'Targeted and Structural Proteomics'
facility_graph_data['Targeted and Structural Proteomics'] = dict(
sdoc = dict(
fname = base_dir + "Targeted and Structural Proteomics.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 15*mm,
topPadding = 7*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=75, imh=58),
u18 = dict(imw=79, imh=60),
u19 = dict(imw=85, imh=60)
)
)
# For facility 'Microbial Single Cell Genomics'
facility_graph_data['Microbial Single Cell Genomics'] = dict(
sdoc = dict(
fname = base_dir + "Microbial Single Cell Genomics.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 4*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0.7*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=84, imh=60),
u19 = dict(imw=74, imh=59)
)
)
# For facility 'High Throughput Genome Engineering'
facility_graph_data['High Throughput Genome Engineering'] = dict(
sdoc = dict(
fname = base_dir + "High Throughput Genome Engineering.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 8*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 3.5*mm,
topPadding = 0.5*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=76, imh=59)
)
)
# For facility 'Swedish Metabolomics Centre'
facility_graph_data['Swedish Metabolomics Centre'] = dict(
sdoc = dict(
fname = base_dir + "Swedish Metabolomics Centre.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=79, imh=60),
u19 = dict(imw=85, imh=60)
)
)
# For facility 'Drug Discovery and Development'
facility_graph_data['Drug Discovery and Development'] = dict(
sdoc = dict(
fname = base_dir + "Drug Discovery and Development.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 8*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 3.5*mm,
topPadding = 0.8*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=79, imh=59),
u19 = dict(imw=84, imh=61)
)
)
# For facility 'Clinical Genomics Lund'
facility_graph_data['Clinical Genomics Lund'] = dict(
sdoc = dict(
fname = base_dir + "Clinical Genomics Lund.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 7*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 5*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=78, imh=60),
u19 = dict(imw=84, imh=61)
)
)
# For facility 'Mass Cytometry'
facility_graph_data['Mass Cytometry'] = dict(
sdoc = dict(
fname = base_dir + "Mass Cytometry.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 7*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 2*mm,
topPadding = 2.5*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 3.3*mm,
topPadding = 1*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=74, imh=59),
u19 = dict(imw=84, imh=60)
)
)
# For facility 'Proximity Proteomics'
facility_graph_data['Proximity Proteomics'] = dict(
sdoc = dict(
fname = base_dir + "Proximity Proteomics.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 7*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 6.5*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr19 = dict(
x1 = (doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads)*2,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr19",
showBoundary = show_bound,
leftPadding = 0*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=74, imh=59),
u19 = dict(imw=84, imh=60)
)
)
# For facility 'Plasma Profiling'
facility_graph_data['Plasma Profiling'] = dict(
sdoc = dict(
fname = base_dir + "Plasma Profiling.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 0*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u17 = dict(imw=77, imh=59)
)
)
# For facility 'Proteogenomics'
facility_graph_data['Proteogenomics'] = dict(
sdoc = dict(
fname = base_dir + "Proteogenomics.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 8*mm
),
usr17 = dict(
x1 = doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr17",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
),
usr18 = dict(
x1 = doc_pads + calc_frame_width(doc_width, doc_pads) + doc_pads,
y1 = doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "usr18",
showBoundary = show_bound,
leftPadding = 3*mm,
topPadding = 2*mm,
rightPadding = 0*mm,
bottomPadding = 0*mm
)
)
),
figsize = dict(
u18 = dict(imw=77, imh=59)
)
)
# For facility 'Swedish NMR Centre'
facility_graph_data['Swedish NMR Centre'] = dict(
sdoc = dict(
fname = base_dir + "Swedish NMR Centre.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 14*mm,
topPadding = 6*mm
)
)
)
)
# For facility 'Centre for Cellular Imaging'
facility_graph_data['Centre for Cellular Imaging'] = dict(
sdoc = dict(
fname = base_dir + "Centre for Cellular Imaging.pdf",
dwidth = doc_width,
dheight = doc_height,
dpads = doc_pads,
show_bound = show_bound
),
style = ("medium_inner_heading", "medium_page_text"),
frames = dict(
fdict = dict(
fstat = dict(
x1 = doc_pads,
y1 = doc_pads + calc_frame_height(doc_height, doc_pads) + doc_pads,
width = calc_frame_width(doc_width, doc_pads),
height = calc_frame_height(doc_height, doc_pads),
id = "fstat",
showBoundary = show_bound,
leftPadding = 16*mm,
topPadding = 7*mm
)
)
)
)
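Every frame above is positioned with `calc_frame_width`/`calc_frame_height` plus `doc_pads` offsets, implying a grid of up to three pad-separated columns and two rows per page. Those helpers are defined elsewhere in this file; the sketch below is a hypothetical reconstruction consistent with the `x1`/`y1` arithmetic (the `ncols`/`nrows` defaults are assumptions, not the file's actual code):

```python
def calc_frame_width(doc_width, doc_pads, ncols=3):
    # Hypothetical: width of one of `ncols` equal columns, with a pad
    # before each column and one after the last.
    return (doc_width - (ncols + 1) * doc_pads) / ncols


def calc_frame_height(doc_height, doc_pads, nrows=2):
    # Hypothetical: height of one of `nrows` equal rows, padded the same way.
    return (doc_height - (nrows + 1) * doc_pads) / nrows


# Columns then advance by (doc_pads + width + doc_pads) per step, matching
# the x1 expressions used for usr18/usr19 frames above.
print(calc_frame_width(330, 30))   # 70.0
print(calc_frame_height(190, 10))  # 80.0
```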
# File: tests/python/import_order/test.py (repo: Erotemic/misc, license: Apache-2.0)
print('--- DECOUPLED PKG ---')
python -c "import decoupled_pkg"
python -c "from decoupled_pkg import mod1"
print('--- COUPLED PKG ---')
import coupled_rel_pkg
| 18 | 42 | 0.697531 | 22 | 162 | 4.954545 | 0.454545 | 0.330275 | 0.330275 | 0.348624 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007194 | 0.141975 | 162 | 8 | 43 | 20.25 | 0.776978 | 0 | 0 | 0 | 0 | 0 | 0.559006 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.6 | null | null | 0.4 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
# File: synthetic_datasets/__init__.py (repo: kim0215/xai-bench, license: Apache-2.0)
from .synthetic_gaussian import (
    GaussianLinearRegression,
    GaussianNonlinearAdditiveRegression,
    GaussianPiecewiseConstantRegression,
    GaussianLinearBinary,
    GaussianNonlinearAdditiveBinary,
    GaussianPiecewiseConstantBinary,
)
from .synthetic_mixture import (
    GMLinearRegression,
    GMNonlinearAdditiveRegression,
    GMPiecewiseConstantRegression,
)
from .custom_dataset import CustomDataset
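The generator classes re-exported here live in sibling modules not shown in this chunk. As a rough, hypothetical illustration of what a "Gaussian linear" synthetic dataset usually means — standard-normal features with a linear target — and explicitly not this package's actual API:

```python
import random


def gaussian_linear_data(n=100, d=5, seed=0):
    # Hypothetical sketch: i.i.d. standard-normal features X and a
    # noiseless linear target y = X . w for a fixed random weight vector w.
    rng = random.Random(seed)
    X = [[rng.gauss(0, 1) for _ in range(d)] for _ in range(n)]
    w = [rng.gauss(0, 1) for _ in range(d)]
    y = [sum(wj * xj for wj, xj in zip(w, row)) for row in X]
    return X, y


X, y = gaussian_linear_data(n=10, d=3, seed=1)
print(len(X), len(X[0]), len(y))  # 10 rows, 3 features, 10 targets
```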
# File: tests/v2_validation/cattlevalidationtest/core/test_rancher_compose_metadata.py (repo: bmdepesa/validation-tests, license: Apache-2.0)
import json
import logging
import os

import paramiko
import pytest
# The names above may also be re-exported by the common_fixtures star
# import; importing them explicitly makes this module's dependencies clear.
TEST_SERVICE_OPT_IMAGE = 'ibuildthecloud/helloworld'
TEST_SERVICE_OPT_IMAGE_LATEST = TEST_SERVICE_OPT_IMAGE + ':latest'
TEST_SERVICE_OPT_IMAGE_UUID = 'docker:' + TEST_SERVICE_OPT_IMAGE_LATEST
METADATA_SUBDIR = os.path.join(os.path.dirname(os.path.realpath(__file__)),
'resources/metadatadc')
logger = logging.getLogger(__name__)
start_project_str = "Starting"
if_compose_data_files = pytest.mark.skipif(
not os.path.isdir(METADATA_SUBDIR),
reason='Docker compose files directory location not set / does not exist')
metadata_client_service = []
metadata_client_port = 999
@pytest.fixture(scope='session', autouse=True)
def create_metadata_client_service(request, client):
env = create_env(client)
launch_config = {"imageUuid": SSH_IMAGE_UUID,
"ports": [str(metadata_client_port) + ":22/tcp"],
"labels": {"io.rancher.scheduler.global": "true"}}
service = client.create_service(name="metadataclient",
stackId=env.id,
launchConfig=launch_config)
service = client.wait_success(service, 60)
env = env.activateservices()
service = client.wait_success(service, 300)
assert service.state == "active"
metadata_client_service.extend(
get_service_container_list(client, service))
def fin():
delete_all(client, [service])
request.addfinalizer(fin)
@if_compose_data_files
def test_metadata_self_2016_07_29(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_1_2016_07_29.yml"
rc_file = "rc_metadata_1_2016_07_29.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test120160729", METADATA_SUBDIR, dc_file, rc_file)
print service.metadata
assert service.metadata["test1"]["name"] == "t1name"
assert service.metadata["test1"]["value"] == "t1value"
assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
service_containers = get_service_container_list(client, service)
port = 6002
con_metadata = {}
wait_for_metadata_propagation(client)
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name,
"2016-07-29")
con_metadata[con.name] = json.loads(metadata_str)
for con in service_containers:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/service", "2016-07-29")
service_metadata = json.loads(metadata_str)
con_list = service_metadata["containers"]
# Check for container object list
assert len(con_list) == len(con_metadata.keys())
for container in con_list:
print container
print con_metadata[container["name"]]
assert cmp(container, con_metadata[container["name"]]) == 0
assert service_metadata["name"] == "test120160729"
assert service_metadata["ports"] == ["6002:22/tcp"]
assert service_metadata["stack_name"] == env_name
assert service_metadata["kind"] == "service"
assert service_metadata["labels"] == service.launchConfig["labels"]
assert service_metadata["metadata"] == service.metadata
assert service_metadata["uuid"] == service.uuid
host = client.by_id('host', con.hosts[0].id)
# Host related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/host", "2016-07-29")
metadata = json.loads(metadata_str)
assert metadata["agent_ip"] == host.ipAddresses()[0].address
assert metadata["labels"] == host.labels
assert metadata["name"] == host.hostname
assert metadata["uuid"] == host.uuid
# Stack related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/stack", "2016-07-29")
metadata = json.loads(metadata_str)
assert metadata["environment_name"] == PROJECT_NAME
# Check for service object list
# Set token value to None in service metadata object returned
# from self before comparing service object retrieved by index
service_metadata["token"] = None
assert cmp(metadata["services"][0], service_metadata) == 0
assert metadata["name"] == env.name
assert metadata["uuid"] == env.uuid
# Container related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/container", "2016-07-29")
metadata = json.loads(metadata_str)
assert metadata["create_index"] == con.createIndex
assert metadata["host_uuid"] == host.uuid
assert metadata["ips"] == [con.primaryIpAddress]
assert metadata["labels"] == con.labels
assert metadata["name"] == con.name
assert metadata["ports"] == ["0.0.0.0" +
":6002:22/tcp"]
assert metadata["primary_ip"] == con.primaryIpAddress
assert metadata["service_name"] == "test120160729"
assert metadata["stack_name"] == env.name
assert metadata["uuid"] == con.uuid
delete_all(client, [env])
@if_compose_data_files
def test_metadata_byname_2016_07_29(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_2_2016_07_29.yml"
rc_file = "rc_metadata_2_2016_07_29.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test2120160729", METADATA_SUBDIR, dc_file, rc_file)
print service.metadata
assert service.metadata["test1"]["name"] == "t1name"
assert service.metadata["test1"]["value"] == "t1value"
assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
service_containers = get_service_container_list(client, service)
wait_for_metadata_propagation(client)
con_metadata = {}
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name,
"2016-07-29")
con_metadata[con.name] = json.loads(metadata_str)
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "test2120160729",
"2016-07-29")
service_metadata = json.loads(metadata_str)
con_list = service_metadata["containers"]
# Check for container object list
assert len(con_list) == len(con_metadata.keys())
for container in con_list:
assert cmp(container, con_metadata[container["name"]]) == 0
print service_metadata["external_ips"]
print service_metadata["hostname"]
assert service_metadata["name"] == "test2120160729"
assert service_metadata["stack_name"] == env_name
assert service_metadata["kind"] == "service"
assert service_metadata["labels"] == service.launchConfig["labels"]
assert service_metadata["metadata"] == service.metadata
assert service_metadata["uuid"] == service.uuid
# Stack related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"stacks/" + env_name,
"2016-07-29")
metadata = json.loads(metadata_str)
assert metadata["environment_name"] == PROJECT_NAME
# Check for service object list
assert cmp(metadata["services"][0], service_metadata) == 0
assert metadata["name"] == env.name
assert metadata["uuid"] == env.uuid
# Container related metadata
con = service_containers[0]
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name,
"2016-07-29")
metadata = json.loads(metadata_str)
assert metadata["create_index"] == con.createIndex
host = client.by_id('host', con.hosts[0].id)
assert metadata["host_uuid"] == host.uuid
assert metadata["ips"] == [con.primaryIpAddress]
assert metadata["labels"] == con.labels
assert metadata["name"] == con.name
assert metadata["primary_ip"] == con.primaryIpAddress
assert metadata["service_name"] == "test2120160729"
assert metadata["stack_name"] == env.name
assert metadata["uuid"] == con.uuid
delete_all(client, [env])
@if_compose_data_files
def test_metadata_self_2015_12_19(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_1n.yml"
rc_file = "rc_metadata_1n.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test1n", METADATA_SUBDIR, dc_file, rc_file)
print service.metadata
assert service.metadata["test1"]["name"] == "t1name"
assert service.metadata["test1"]["value"] == "t1value"
assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
service_containers = get_service_container_list(client, service)
port = 6001
con_metadata = {}
wait_for_metadata_propagation(client)
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name,
"2015-12-19")
con_metadata[con.name] = json.loads(metadata_str)
for con in service_containers:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/service", "2015-12-19")
service_metadata = json.loads(metadata_str)
con_list = service_metadata["containers"]
# Check for container object list
assert len(con_list) == len(con_metadata.keys())
for container in con_list:
assert cmp(container, con_metadata[container["name"]]) == 0
assert service_metadata["name"] == "test1n"
assert service_metadata["ports"] == ["6001:22/tcp"]
assert service_metadata["stack_name"] == env_name
assert service_metadata["kind"] == "service"
assert service_metadata["labels"] == service.launchConfig["labels"]
assert service_metadata["metadata"] == service.metadata
assert service_metadata["uuid"] == service.uuid
host = client.by_id('host', con.hosts[0].id)
# Host related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/host", "2015-12-19")
metadata = json.loads(metadata_str)
assert metadata["agent_ip"] == host.ipAddresses()[0].address
assert metadata["labels"] == host.labels
assert metadata["name"] == host.hostname
assert metadata["uuid"] == host.uuid
# Stack related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/stack", "2015-12-19")
metadata = json.loads(metadata_str)
assert metadata["environment_name"] == PROJECT_NAME
# Check for service object list
# Set token value to None in service metadata object returned
# from self before comparing service object retrieved by index
service_metadata["token"] = None
assert cmp(metadata["services"][0], service_metadata) == 0
assert metadata["name"] == env.name
assert metadata["uuid"] == env.uuid
# Container related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/container", "2015-12-19")
metadata = json.loads(metadata_str)
assert metadata["create_index"] == con.createIndex
assert metadata["host_uuid"] == host.uuid
assert metadata["ips"] == [con.primaryIpAddress]
assert metadata["labels"] == con.labels
assert metadata["name"] == con.name
assert metadata["ports"] == [host.ipAddresses()[0].address +
":6001:22/tcp"]
assert metadata["primary_ip"] == con.primaryIpAddress
assert metadata["service_name"] == "test1n"
assert metadata["stack_name"] == env.name
assert metadata["uuid"] == con.uuid
delete_all(client, [env])
@if_compose_data_files
def test_metadata_byname_2015_12_19(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_2n.yml"
rc_file = "rc_metadata_2n.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test2n", METADATA_SUBDIR, dc_file, rc_file)
print service.metadata
assert service.metadata["test1"]["name"] == "t1name"
assert service.metadata["test1"]["value"] == "t1value"
assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
service_containers = get_service_container_list(client, service)
wait_for_metadata_propagation(client)
con_metadata = {}
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name,
"2015-12-19")
con_metadata[con.name] = json.loads(metadata_str)
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "test2n",
"2015-12-19")
service_metadata = json.loads(metadata_str)
con_list = service_metadata["containers"]
# Check for container object list
assert len(con_list) == len(con_metadata.keys())
for container in con_list:
assert cmp(container, con_metadata[container["name"]]) == 0
print service_metadata["external_ips"]
print service_metadata["hostname"]
assert service_metadata["name"] == "test2n"
assert service_metadata["stack_name"] == env_name
assert service_metadata["kind"] == "service"
assert service_metadata["labels"] == service.launchConfig["labels"]
assert service_metadata["metadata"] == service.metadata
assert service_metadata["uuid"] == service.uuid
# Stack related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"stacks/" + env_name,
"2015-12-19")
metadata = json.loads(metadata_str)
assert metadata["environment_name"] == PROJECT_NAME
# Check for service object list
assert cmp(metadata["services"][0], service_metadata) == 0
assert metadata["name"] == env.name
assert metadata["uuid"] == env.uuid
# Container related metadata
con = service_containers[0]
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name,
"2015-12-19")
metadata = json.loads(metadata_str)
assert metadata["create_index"] == con.createIndex
host = client.by_id('host', con.hosts[0].id)
assert metadata["host_uuid"] == host.uuid
assert metadata["ips"] == [con.primaryIpAddress]
assert metadata["labels"] == con.labels
assert metadata["name"] == con.name
assert metadata["primary_ip"] == con.primaryIpAddress
assert metadata["service_name"] == "test2n"
assert metadata["stack_name"] == env.name
assert metadata["uuid"] == con.uuid
delete_all(client, [env])
@if_compose_data_files
def test_metadata_self_2015_07_25(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_1.yml"
rc_file = "rc_metadata_1.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test", METADATA_SUBDIR, dc_file, rc_file)
print service.metadata
assert service.metadata["test1"]["name"] == "t1name"
assert service.metadata["test1"]["value"] == "t1value"
assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
wait_for_metadata_propagation(client)
service_containers = get_service_container_list(client, service)
port = 6000
con_names = []
for con in service_containers:
con_names.append(con.name)
for con in service_containers:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/service", "2015-07-25")
metadata = json.loads(metadata_str)
assert set(metadata["containers"]) == set(con_names)
print metadata["external_ips"]
print metadata["hostname"]
assert metadata["name"] == "test"
assert metadata["ports"] == ["6000:22/tcp"]
assert metadata["stack_name"] == env_name
assert metadata["kind"] == "service"
assert metadata["labels"] == service.launchConfig["labels"]
assert metadata["metadata"] == service.metadata
assert metadata["uuid"] == service.uuid
host = client.by_id('host', con.hosts[0].id)
# Host related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/host", "2015-07-25")
metadata = json.loads(metadata_str)
assert metadata["agent_ip"] == host.ipAddresses()[0].address
assert metadata["labels"] == host.labels
assert metadata["name"] == host.hostname
assert metadata["uuid"] == host.uuid
# Stack related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/stack", "2015-07-25")
metadata = json.loads(metadata_str)
assert metadata["environment_name"] == PROJECT_NAME
assert metadata["services"] == ["test"]
assert metadata["name"] == env.name
assert metadata["uuid"] == env.uuid
# Container related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/container", "2015-07-25")
metadata = json.loads(metadata_str)
assert metadata["create_index"] == con.createIndex
assert metadata["host_uuid"] == host.uuid
assert metadata["ips"] == [con.primaryIpAddress]
assert metadata["labels"] == con.labels
assert metadata["name"] == con.name
assert metadata["ports"] == [host.ipAddresses()[0].address +
":6000:22/tcp"]
assert metadata["primary_ip"] == con.primaryIpAddress
assert metadata["service_name"] == "test"
assert metadata["stack_name"] == env.name
assert metadata["uuid"] == con.uuid
delete_all(client, [env])
@if_compose_data_files
def test_metadata_byname_2015_07_25(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_2.yml"
rc_file = "rc_metadata_2.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test2", METADATA_SUBDIR, dc_file, rc_file)
print service.metadata
assert service.metadata["test1"]["name"] == "t1name"
assert service.metadata["test1"]["value"] == "t1value"
assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
service_containers = get_service_container_list(client, service)
con_names = []
for con in service_containers:
con_names.append(con.name)
wait_for_metadata_propagation(client)
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "test2",
"2015-07-25")
metadata = json.loads(metadata_str)
assert set(metadata["containers"]) == set(con_names)
print metadata["external_ips"]
print metadata["hostname"]
assert metadata["name"] == "test2"
assert metadata["stack_name"] == env_name
assert metadata["kind"] == "service"
assert metadata["labels"] == service.launchConfig["labels"]
assert metadata["metadata"] == service.metadata
assert metadata["uuid"] == service.uuid
# Stack related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"stacks/" + env_name,
"2015-07-25")
metadata = json.loads(metadata_str)
assert metadata["environment_name"] == PROJECT_NAME
assert metadata["services"] == ["test2"]
assert metadata["name"] == env.name
assert metadata["uuid"] == env.uuid
# Container related metadata
con = service_containers[0]
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name,
"2015-07-25")
metadata = json.loads(metadata_str)
assert metadata["create_index"] == con.createIndex
host = client.by_id('host', con.hosts[0].id)
assert metadata["host_uuid"] == host.uuid
assert metadata["ips"] == [con.primaryIpAddress]
assert metadata["labels"] == con.labels
assert metadata["name"] == con.name
assert metadata["primary_ip"] == con.primaryIpAddress
assert metadata["service_name"] == "test2"
assert metadata["stack_name"] == env.name
assert metadata["uuid"] == con.uuid
delete_all(client, [env])
@if_compose_data_files
def test_metadata_update(client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_3.yml"
rc_file = "rc_metadata_3.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test3", METADATA_SUBDIR, dc_file, rc_file)
assert service.metadata["test1"]["name"] == "t1name"
assert service.metadata["test1"]["value"] == "t1value"
assert isinstance(service.metadata["test2"]["name"], list)
assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
wait_for_metadata_propagation(client)
for con in metadata_client_service:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "test3")
metadata = json.loads(metadata_str)
assert metadata["metadata"] == service.metadata
# Update user metadata
launch_rancher_cli_from_file(
client, METADATA_SUBDIR, env_name,
"up --upgrade -d", "Updating",
"dc_metadata_3.yml", "rc_metadata_31.yml")
service = client.reload(service)
assert service.state == "active"
assert service.metadata["test1"]["name"] == "t2name"
assert service.metadata["test1"]["value"] == "t1value"
assert isinstance(service.metadata["test2"]["name"], list)
assert service.metadata["test2"]["name"] == [1, 2, 5]
assert service.metadata["test3"]["name"] == "t3name"
wait_for_metadata_propagation(client)
for con in metadata_client_service:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "test3")
metadata = json.loads(metadata_str)
assert metadata["metadata"] == service.metadata
delete_all(client, [env])
@if_compose_data_files
def test_metadata_scaleup(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_4.yml"
rc_file = "rc_metadata_4.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test4", METADATA_SUBDIR, dc_file, rc_file)
service_containers = get_service_container_list(client, service)
assert len(service_containers) == 2
wait_for_metadata_propagation(client)
con_metadata = {}
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name)
con_metadata[con.name] = json.loads(metadata_str)
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
for con in metadata_client_service:
validate_service_container_list(client, con, "test4",
con_metadata)
# Scale up service
launch_rancher_cli_from_file(
client, METADATA_SUBDIR, env_name,
"scale test4=3", "test4")
service = client.wait_success(service, 60)
assert service.state == "active"
service_containers = get_service_container_list(client, service)
assert len(service_containers) == 3
con_metadata = {}
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name)
con_metadata[con.name] = json.loads(metadata_str)
wait_for_metadata_propagation(client)
for con in metadata_client_service:
validate_service_container_list(client, con, "test4",
con_metadata)
delete_all(client, [env])
@if_compose_data_files
def test_metadata_scaledown(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_5.yml"
rc_file = "rc_metadata_5.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "test5", METADATA_SUBDIR, dc_file, rc_file)
service_containers = get_service_container_list(client, service)
assert len(service_containers) == 2
wait_for_metadata_propagation(client)
con_metadata = {}
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name)
con_metadata[con.name] = json.loads(metadata_str)
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
for con in metadata_client_service:
validate_service_container_list(client, con, "test5",
con_metadata)
# Scale down service
launch_rancher_cli_from_file(
client, METADATA_SUBDIR, env_name,
"scale test5=1", "test5")
service = client.wait_success(service, 60)
assert service.state == "active"
service_containers = get_service_container_list(client, service)
assert len(service_containers) == 1
con_metadata = {}
for con in service_containers:
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"containers/" + con.name)
con_metadata[con.name] = json.loads(metadata_str)
wait_for_metadata_propagation(client)
for con in metadata_client_service:
validate_service_container_list(client, con, "test5",
con_metadata)
delete_all(client, [env])
@if_compose_data_files
def test_metadata_sidekick(client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_sk.yml"
rc_file = "rc_metadata_sk.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "testsk", METADATA_SUBDIR, dc_file, rc_file)
service_containers = get_service_container_list(client, service)
con_names = []
for con in service_containers:
con_names.append(con.name)
print con_names
print metadata_client_service
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
wait_for_metadata_propagation(client)
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "testsk")
metadata = json.loads(metadata_str)
assert set(metadata["sidekicks"]) == set(["sk1", "sk2"])
delete_all(client, [env])
@if_compose_data_files
def test_metadata_links(client, rancher_cli_container):
env_name1 = "testlink"
dc_file = "dc_metadata_links_1.yml"
# Create an environment using up
linked_env, linked_service = create_stack_using_rancher_cli(
client, env_name1, "testl1", METADATA_SUBDIR, dc_file)
env_name2 = random_str().replace("-", "")
dc_file = "dc_metadata_links_2.yml"
env, service = create_stack_using_rancher_cli(
client, env_name2, "testl2", METADATA_SUBDIR, dc_file)
linked_services = {env_name1 + "/" + "testl1": "linkexttest",
env_name2 + "/" + "testl2": "linktest"}
linked_env, linked_service = \
get_env_service_by_name(client, env_name1, "testl1")
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
wait_for_metadata_propagation(client)
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "testl3")
metadata = json.loads(metadata_str)
print metadata["links"]
assert metadata["links"] == linked_services
delete_all(client, [env, linked_env])
@if_compose_data_files
def test_metadata_hostnet(client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_hostnet.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "testhostdns", METADATA_SUBDIR, dc_file)
service_containers = get_service_container_list(client, service)
assert len(service_containers) == service.scale
port = 33
wait_for_metadata_propagation(client)
for con in service_containers:
host = client.by_id('host', con.hosts[0].id)
# Host related metadata
metadata_str = fetch_rancher_metadata(client, con, port,
"self/host")
metadata = json.loads(metadata_str)
assert metadata["agent_ip"] == host.ipAddresses()[0].address
assert metadata["labels"] == host.labels
assert metadata["name"] == host.hostname
assert metadata["uuid"] == host.uuid
delete_all(client, [env])
@if_compose_data_files
def test_metadata_externalservice_ip(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_extservice_ip.yml"
rc_file = "rc_metadata_extservice_ip.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "testextip", METADATA_SUBDIR, dc_file, rc_file)
wait_for_metadata_propagation(client)
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "testextip")
metadata = json.loads(metadata_str)
print metadata["external_ips"]
assert set(metadata["external_ips"]) == set(["1.1.1.1", "2.2.2.2"])
assert metadata["kind"] == "externalService"
delete_all(client, [env])
@if_compose_data_files
def test_metadata_externalservice_cname(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_extservice_cname.yml"
rc_file = "rc_metadata_extservice_cname.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "testextcname", METADATA_SUBDIR, dc_file, rc_file)
wait_for_metadata_propagation(client)
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "testextcname")
metadata = json.loads(metadata_str)
print metadata["hostname"]
assert metadata["hostname"] == "google.com"
assert metadata["kind"] == "externalService"
delete_all(client, [env])
@if_compose_data_files
def test_metadata_lb(client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_lb.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "lb-1", METADATA_SUBDIR, dc_file)
linked_services = {env_name + "/" + "web1": "web1",
env_name + "/" + "web2": "web2"}
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
wait_for_metadata_propagation(client)
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "lb-1")
metadata = json.loads(metadata_str)
assert metadata["links"] == linked_services
assert metadata["kind"] == "loadBalancerService"
delete_all(client, [env])
@if_compose_data_files
def test_metadata_lb_updatetarget(
client, rancher_cli_container):
env_name = random_str().replace("-", "")
dc_file = "dc_metadata_lb_1.yml"
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "lb-2", METADATA_SUBDIR, dc_file)
linked_services = {env_name + "/" + "web1": "web1",
env_name + "/" + "web2": "web2"}
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
wait_for_metadata_propagation(client)
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "lb-2")
metadata = json.loads(metadata_str)
assert metadata["links"] == linked_services
assert metadata["kind"] == "loadBalancerService"
# Add another target to existing LB service
dc_file = "dc_metadata_lb_11.yml"
"""
# Create an environment using up
env, service = create_stack_using_rancher_cli(
client, env_name, "lb-2", METADATA_SUBDIR, dc_file)
"""
launch_rancher_cli_from_file(
client, METADATA_SUBDIR, env_name,
"up --upgrade -d", "Updating",
"dc_metadata_lb_11.yml")
linked_services = {env_name + "/" + "web1": "web1",
env_name + "/" + "web2": "web2",
env_name + "/" + "web3": "web3"}
assert len(metadata_client_service) == \
len(client.list_host(kind='docker', removed_null=True))
wait_for_metadata_propagation(client)
for con in metadata_client_service:
# Service related metadata
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/" + "lb-2")
metadata = json.loads(metadata_str)
assert metadata["links"] == linked_services
assert metadata["kind"] == "loadBalancerService"
delete_all(client, [env])
def get_env_service_by_name(client, env_name, service_name):
env = client.list_stack(name=env_name, removed_null=True)
assert len(env) == 1
service = client.list_service(name=service_name,
stackId=env[0].id,
removed_null=True)
assert len(service) == 1
return env[0], service[0]
def fetch_rancher_metadata(client, con, port, command, version=None):
host = client.by_id('host', con.hosts[0].id)
if version is None:
version = "latest"
rancher_metadata_cmd = \
"wget -O result.txt --header 'Accept: application/json' " + \
"http://rancher-metadata/"+version+"/" + command + "; cat result.txt"
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host.ipAddresses()[0].address,
username="root",
password="root",
port=port)
print rancher_metadata_cmd
stdin, stdout, stderr = ssh.exec_command(rancher_metadata_cmd)
response = stdout.readlines()
assert len(response) > 0
return response[0]
def validate_service_container_list(client, con, serviceName,
con_metadata):
metadata_str = fetch_rancher_metadata(client, con,
metadata_client_port,
"services/"+serviceName)
metadata = json.loads(metadata_str)
print metadata
con_list = metadata["containers"]
assert len(con_list) == len(con_metadata.keys())
for con in con_list:
print con
print con_metadata[con["name"]]
assert cmp(con, con_metadata[con["name"]]) == 0
| 40.705285 | 78 | 0.60708 | 4,352 | 40,054 | 5.314798 | 0.059743 | 0.071422 | 0.046304 | 0.046087 | 0.891569 | 0.871336 | 0.859922 | 0.846131 | 0.836879 | 0.826546 | 0 | 0.022052 | 0.286738 | 40,054 | 983 | 79 | 40.746694 | 0.787567 | 0.044989 | 0 | 0.728859 | 0 | 0 | 0.110439 | 0.010439 | 0 | 0 | 0 | 0 | 0.284564 | 0 | null | null | 0.001342 | 0.002685 | null | null | 0.033557 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c7cff4b194e352dcad59c27a85407929dfdeac69 | 29 | py | Python | services/__init__.py | kecorbin/isemon | af4ac5ab70c61733ef20b793d390ca8bfa0c800d | [
"MIT"
] | 3 | 2018-08-21T21:45:22.000Z | 2021-01-07T03:16:54.000Z | services/__init__.py | kecorbin/isemon | af4ac5ab70c61733ef20b793d390ca8bfa0c800d | [
"MIT"
] | 1 | 2021-12-13T19:46:27.000Z | 2021-12-13T19:46:27.000Z | services/__init__.py | kecorbin/isemon | af4ac5ab70c61733ef20b793d390ca8bfa0c800d | [
"MIT"
] | 1 | 2018-09-12T16:10:14.000Z | 2018-09-12T16:10:14.000Z | from ise import get_from_ise
| 14.5 | 28 | 0.862069 | 6 | 29 | 3.833333 | 0.666667 | 0.608696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
40057f90e7ac6b49bd919077770f1c535efe59ed | 130 | py | Python | neuralcode/data/__init__.py | neuralcode/neuralcode | bf03523fed0240477e153ee2beb319f0594e4095 | [
"MIT"
] | 5 | 2021-02-23T22:54:34.000Z | 2021-02-25T15:07:54.000Z | neuralcode/data/__init__.py | neuralcode/neuralcode | bf03523fed0240477e153ee2beb319f0594e4095 | [
"MIT"
] | null | null | null | neuralcode/data/__init__.py | neuralcode/neuralcode | bf03523fed0240477e153ee2beb319f0594e4095 | [
"MIT"
] | null | null | null | from .vocab import Vocab # NOQA
from .vocab import get_vocab_from_dict # NOQA
from .vocab import get_vocab_from_iterator # NOQA | 43.333333 | 50 | 0.8 | 21 | 130 | 4.666667 | 0.333333 | 0.27551 | 0.459184 | 0.387755 | 0.632653 | 0.632653 | 0.632653 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 130 | 3 | 50 | 43.333333 | 0.890909 | 0.107692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
400a52a88afa1fa6d9fe6d159242663719294f70 | 1,120 | py | Python | tests/test_loader.py | Pimax1/keras-gpt-2 | 0a4adaad651a5a51e8a9c647c50cc01c3e51055c | [
"MIT"
] | 131 | 2019-02-19T09:02:39.000Z | 2022-03-21T12:59:37.000Z | tests/test_loader.py | Pimax1/keras-gpt-2 | 0a4adaad651a5a51e8a9c647c50cc01c3e51055c | [
"MIT"
] | 12 | 2019-03-08T10:34:54.000Z | 2022-01-09T05:01:31.000Z | tests/test_loader.py | neeleshdodda44/keras-gpt-2-doublehead | ec77b17a8e5888979018c9e056a02bd9ad5c8d3d | [
"MIT"
] | 31 | 2019-02-19T10:29:39.000Z | 2021-12-20T19:07:19.000Z | import os
from unittest import TestCase
from keras_gpt_2 import load_trained_model_from_checkpoint


class TestLoader(TestCase):

    def test_load_from_checkpoint(self):
        current_path = os.path.dirname(os.path.abspath(__file__))
        toy_checkpoint_path = os.path.join(current_path, 'toy_checkpoint')
        config_path = os.path.join(toy_checkpoint_path, 'hparams.json')
        checkpoint_path = os.path.join(toy_checkpoint_path, 'model.ckpt')
        model = load_trained_model_from_checkpoint(config_path=config_path,
                                                   checkpoint_path=checkpoint_path)
        model.summary()

    def test_load_from_checkpoint_shorter(self):
        current_path = os.path.dirname(os.path.abspath(__file__))
        toy_checkpoint_path = os.path.join(current_path, 'toy_checkpoint')
        config_path = os.path.join(toy_checkpoint_path, 'hparams.json')
        checkpoint_path = os.path.join(toy_checkpoint_path, 'model.ckpt')
        model = load_trained_model_from_checkpoint(
            config_path=config_path,
            checkpoint_path=checkpoint_path,
            seq_len=10,
        )
        model.summary()
| 41.481481 | 108 | 0.722321 | 145 | 1,120 | 5.165517 | 0.227586 | 0.224299 | 0.106809 | 0.11215 | 0.846462 | 0.739653 | 0.739653 | 0.739653 | 0.739653 | 0.739653 | 0 | 0.0033 | 0.188393 | 1,120 | 26 | 109 | 43.076923 | 0.820682 | 0 | 0 | 0.454545 | 0 | 0 | 0.064286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.136364 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4072f0850ba4f9272d1f034a112756c958c376e1 | 346,582 | py | Python | streaming-api-client/proto/monitoring_pb2.py | k1z2/central-examples-only | 10aa1bcbcd1854ab272f9d250d49c11b7b0dcf2d | [
"MIT"
] | 21 | 2019-12-03T17:18:50.000Z | 2022-01-16T22:55:08.000Z | streaming-api-client/proto/monitoring_pb2.py | k1z2/central-examples-only | 10aa1bcbcd1854ab272f9d250d49c11b7b0dcf2d | [
"MIT"
] | 4 | 2019-11-01T19:12:24.000Z | 2021-09-17T00:48:06.000Z | streaming-api-client/proto/monitoring_pb2.py | k1z2/central-examples-only | 10aa1bcbcd1854ab272f9d250d49c11b7b0dcf2d | [
"MIT"
] | 25 | 2019-09-25T05:48:40.000Z | 2022-03-30T12:33:04.000Z | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: monitoring.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
  name='monitoring.proto',
  package='Monitoring',
  syntax='proto2',
  serialized_options=None,
  create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x10monitoring.proto\x12\nMonitoring\"\x84\x01\n\tIpAddress\x12-\n\x02\x61\x66\x18\x01 \x02(\x0e\x32!.Monitoring.IpAddress.addr_family\x12\x0c\n\x04\x61\x64\x64r\x18\x02 \x02(\x0c\":\n\x0b\x61\x64\x64r_family\x12\x14\n\x10\x41\x44\x44R_FAMILY_INET\x10\x02\x12\x15\n\x11\x41\x44\x44R_FAMILY_INET6\x10\n\"\x1a\n\nMacAddress\x12\x0c\n\x04\x61\x64\x64r\x18\x01 \x02(\x0c\"\xf2\x01\n\x05Swarm\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x10\n\x08swarm_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12&\n\x06status\x18\x04 \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12\x30\n\x11public_ip_address\x18\x05 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12)\n\nip_address\x18\x06 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x18\n\x10\x66irmware_version\x18\x07 \x01(\t\"\xdf\x02\n\x06Tunnel\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x10\n\x08swarm_id\x18\x02 \x01(\t\x12&\n\x05index\x18\x03 \x01(\x0e\x32\x17.Monitoring.TunnelIndex\x12+\n\x0b\x63rypto_type\x18\x04 \x01(\x0e\x32\x16.Monitoring.CryptoType\x12\x11\n\tpeer_name\x18\x05 \x01(\t\x12*\n\x0bpeer_tun_ip\x18\x06 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12(\n\ttunnel_ip\x18\x07 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12&\n\x06status\x18\x08 \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12\x0e\n\x06\x61\x63tive\x18\t \x01(\x08\x12\x0e\n\x06uptime\x18\n \x01(\r\x12\x11\n\ttunnel_id\x18\x0b \x01(\x04\"\xc1\x0e\n\tInterface\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x11\n\tdevice_id\x18\x02 \x01(\t\x12\'\n\x07macaddr\x18\x03 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12&\n\x06status\x18\x04 \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12%\n\x06ipaddr\x18\x05 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x31\n\x0b\x64uplex_mode\x18\x06 \x01(\x0e\x32\x1c.Monitoring.Interface.Duplex\x12\x0c\n\x04name\x18\x07 \x01(\t\x12\x13\n\x0bport_number\x18\x08 \x01(\t\x12,\n\x04type\x18\t 
\x01(\x0e\x32\x1e.Monitoring.Interface.IntfType\x12\x0c\n\x04mode\x18\n \x01(\t\x12\x0c\n\x04vlan\x18\x0b \x01(\r\x12\x35\n\x07has_poe\x18\x0c \x01(\x0e\x32 .Monitoring.Interface.PoeSupport:\x02NA\x12)\n\tpoe_state\x18\r \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12*\n\noper_state\x18\x0e \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12+\n\x0b\x61\x64min_state\x18\x0f \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12.\n\x05speed\x18\x10 \x01(\x0e\x32\x1f.Monitoring.Interface.SpeedType\x12\x0b\n\x03mux\x18\x11 \x01(\r\x12\x0f\n\x07trusted\x18\x12 \x01(\r\x12\x0c\n\x04slot\x18\x13 \x01(\t\x12\x30\n\x08phy_type\x18\x14 \x01(\x0e\x32\x1e.Monitoring.Interface.PortType\x12\x10\n\x08sub_type\x18\x15 \x01(\t\x12\x14\n\x0c\x61llowed_vlan\x18\x16 \x03(\r\x12\x13\n\x0bnative_vlan\x18\x17 \x01(\r\x12\x13\n\x0bvsx_enabled\x18\x18 \x01(\x08\x12@\n\x11state_down_reason\x18\x19 \x01(\x0e\x32%.Monitoring.Interface.StateDownReason\x12\x32\n\tvlan_mode\x18\x1a \x01(\x0e\x32\x1f.Monitoring.Interface.VlanModes\"&\n\x06\x44uplex\x12\x08\n\x04HALF\x10\x01\x12\x08\n\x04\x46ULL\x10\x02\x12\x08\n\x04\x41UTO\x10\x03\"\x91\x01\n\x08IntfType\x12\x0c\n\x08\x45THERNET\x10\x01\x12\x0c\n\x08LOOPBACK\x10\x02\x12\x08\n\x04VLAN\x10\x03\x12\n\n\x06TUNNEL\x10\x04\x12\x10\n\x0cPORT_CHANNEL\x10\x05\x12\x0b\n\x07STANDBY\x10\x06\x12\n\n\x06\x42RIDGE\x10\x07\x12\t\n\x05SPLIT\x10\x08\x12\t\n\x05STACK\x10\t\x12\x08\n\x04MGMT\x10\n\x12\x08\n\x04NONE\x10\x0b\"l\n\tSpeedType\x12\x11\n\rSPEED_INVALID\x10\x00\x12\x0e\n\nSPEED_AUTO\x10\x01\x12\x0c\n\x08SPEED_10\x10\x02\x12\r\n\tSPEED_100\x10\x03\x12\x0e\n\nSPEED_1000\x10\x04\x12\x0f\n\x0bSPEED_10000\x10\x05\"J\n\x08PortType\x12\x0b\n\x07PT_RJ45\x10\x00\x12\x0b\n\x07PT_GBIC\x10\x01\x12\r\n\tPT_SERIAL\x10\x02\x12\n\n\x06PT_USB\x10\x03\x12\t\n\x05PT_X2\x10\x04\"6\n\nPoeSupport\x12\x06\n\x02NA\x10\x00\x12\r\n\tSUPPORTED\x10\x01\x12\x11\n\rNOT_SUPPORTED\x10\x02\"\xdc\x03\n\x0fStateDownReason\x12\x11\n\rUNINITIALIZED\x10\x01\x12\x14\n\x10WAITING_FOR_LINK\x10\x02\x12\x18\n\x14
\x41\x44MIN_INTERFACE_DOWN\x10\x03\x12\x12\n\x0eMODULE_MISSING\x10\x04\x12\x17\n\x13MODULE_UNRECOGNIZED\x10\x05\x12\x16\n\x12MODULE_UNSUPPORTED\x10\x06\x12\x17\n\x13MODULE_INCOMPATIBLE\x10\x07\x12\x10\n\x0cMODULE_FAULT\x10\x08\x12\x18\n\x14GROUP_SPEED_MISMATCH\x10\t\x12\x0f\n\x0bLANES_SPLIT\x10\n\x12\x13\n\x0fLANES_NOT_SPLIT\x10\x0b\x12\x0f\n\x0bINVALID_MTU\x10\x0c\x12\x12\n\x0eINVALID_SPEEDS\x10\r\x12\x19\n\x15\x41UTONEG_NOT_SUPPORTED\x10\x0e\x12\x14\n\x10\x41UTONEG_REQUIRED\x10\x0f\x12\x14\n\x10INTERFACE_ABSENT\x10\x10\x12\x1d\n\x19PHYSICAL_INTERFACE_FAILED\x10\x11\x12\x1e\n\x1aPSPO_ENABLEMENT_LAYER_DOWN\x10\x12\x12\x19\n\x15\x43\x41RD_INTERFACE_ERRORS\x10\x13\x12\x10\n\x0cINTERFACE_OK\x10\x14\"?\n\tVlanModes\x12\n\n\x06\x41\x43\x43\x45SS\x10\x01\x12\x11\n\rNATIVE_TAGGED\x10\x02\x12\x13\n\x0fNATIVE_UNTAGGED\x10\x03\"\xd1\x01\n\x07VapInfo\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x11\n\tdevice_id\x18\x02 \x01(\t\x12)\n\tradio_mac\x18\x03 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\r\n\x05\x65ssid\x18\x04 \x01(\x0c\x12&\n\x06\x61p_mac\x18\x05 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12%\n\x05\x62ssid\x18\x06 \x01(\x0b\x32\x16.Monitoring.MacAddress\"\x84\x02\n\x05Radio\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x11\n\tdevice_id\x18\x02 \x01(\t\x12\r\n\x05index\x18\x03 \x01(\r\x12\'\n\x07macaddr\x18\x04 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12&\n\x06status\x18\x05 \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12\x0f\n\x07\x63hannel\x18\x06 \x01(\t\x12\x0c\n\x04\x62\x61nd\x18\x07 \x01(\r\x12\x15\n\rchannel_width\x18\x08 \x01(\r\x12&\n\x06\x61p_mac\x18\t \x01(\x0b\x32\x16.Monitoring.MacAddress\"\xc5\x03\n\x02\x41p\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x0e\n\x06serial\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\'\n\x07macaddr\x18\x04 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x12\n\ncluster_id\x18\x05 \x01(\t\x12&\n\x06status\x18\x06 
\x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12)\n\nip_address\x18\x07 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\r\n\x05model\x18\x08 \x01(\t\x12\x11\n\tmesh_role\x18\t \x01(\t\x12\x0c\n\x04mode\x18\n \x01(\t\x12\x14\n\x0cswarm_master\x18\x0b \x01(\x08\x12\x17\n\x0fmodem_connected\x18\x0c \x01(\x08\x12.\n\x0buplink_type\x18\r \x01(\x0e\x32\x19.Monitoring.Ap.UplinkType\x12\x18\n\x10\x66irmware_version\x18\x0e \x01(\t\"<\n\nUplinkType\x12\x0c\n\x08\x45THERNET\x10\x01\x12\x08\n\x04MESH\x10\x02\x12\x0b\n\x07STATION\x10\x03\x12\t\n\x05MODEM\x10\x04\"v\n\x07Network\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x10\n\x08swarm_id\x18\x02 \x01(\t\x12\r\n\x05\x65ssid\x18\x03 \x01(\x0c\x12\x10\n\x08security\x18\x04 \x01(\t\x12\x0c\n\x04type\x18\x05 \x01(\t\"\xf5\x02\n\x0eWirelessClient\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\'\n\x07macaddr\x18\x02 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x0c\n\x04name\x18\x03 \x01(\t\x12)\n\nip_address\x18\x04 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x10\n\x08username\x18\x05 \x01(\t\x12\x19\n\x11\x61ssociated_device\x18\x06 \x01(\t\x12)\n\tradio_mac\x18\x07 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x0f\n\x07network\x18\x08 \x01(\x0c\x12\x11\n\tuser_role\x18\t \x01(\t\x12\x14\n\x0cmanufacturer\x18\n \x01(\t\x12\x0f\n\x07os_type\x18\x0b \x01(\t\x12\x12\n\nconnection\x18\x0c \x01(\t\x12\x10\n\x08maxspeed\x18\r \x01(\r\x12\x0c\n\x04vlan\x18\x0e \x01(\r\"\xb8\x01\n\x0eHardwareModule\x12\r\n\x05index\x18\x01 \x01(\r\x12\x39\n\x06status\x18\x02 \x01(\x0e\x32).Monitoring.HardwareModule.HardwareStatus\"\\\n\x0eHardwareStatus\x12\x06\n\x02OK\x10\x00\x12\t\n\x05\x45RROR\x10\x01\x12\x11\n\rNOT_CONNECTED\x10\x02\x12\n\n\x06\x41\x43TIVE\x10\x03\x12\x0b\n\x07STANDBY\x10\x04\x12\x0b\n\x07OFFLINE\x10\x05\"\xf8\x05\n\x06Switch\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x0e\n\x06serial\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 
\x01(\t\x12\'\n\x07macaddr\x18\x04 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\r\n\x05model\x18\x05 \x01(\t\x12&\n\x06status\x18\x06 \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12\x30\n\x11public_ip_address\x18\x07 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12)\n\nip_address\x18\x08 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x18\n\x10\x66irmware_version\x18\t \x01(\t\x12.\n\x0f\x64\x65\x66\x61ult_gateway\x18\n \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x13\n\x0b\x64\x65vice_mode\x18\x0b \x01(\r\x12\x14\n\x0cuplink_ports\x18\x0c \x03(\t\x12\x11\n\tmax_slots\x18\r \x01(\r\x12\x12\n\nused_slots\x18\x0e \x03(\t\x12\x36\n\x12management_modules\x18\x0f \x03(\x0b\x32\x1a.Monitoring.HardwareModule\x12\x32\n\x0epower_supplies\x18\x10 \x03(\x0b\x32\x1a.Monitoring.HardwareModule\x12\x10\n\x08stack_id\x18\x11 \x01(\t\x12\x17\n\x0fstack_member_id\x18\x12 \x01(\r\x12=\n\x11stack_member_role\x18\x13 \x01(\x0e\x32\".Monitoring.Switch.StackMemberRole\x12-\n\rstack_macaddr\x18\x14 \x01(\x0b\x32\x16.Monitoring.MacAddress\"F\n\x0fStackMemberRole\x12\x0b\n\x07UNKNOWN\x10\x01\x12\r\n\tCOMMANDER\x10\x02\x12\x0b\n\x07STANDBY\x10\x03\x12\n\n\x06MEMBER\x10\x04\"\xdc\x03\n\x0bSwitchStack\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x10\n\x08stack_id\x18\x02 \x01(\t\x12&\n\x06status\x18\x03 \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12\x37\n\x08topology\x18\x04 \x01(\x0e\x32%.Monitoring.SwitchStack.StackTopology\x12\x33\n\x06policy\x18\x05 \x01(\x0e\x32#.Monitoring.SwitchStack.StackPolicy\x12\x18\n\x10\x66irmware_version\x18\x06 \x01(\t\x12\x15\n\rvsf_domain_id\x18\x07 \x01(\r\"]\n\rStackTopology\x12\x0e\n\nSTANDALONE\x10\x01\x12\t\n\x05\x43HAIN\x10\x02\x12\x08\n\x04RING\x10\x03\x12\x08\n\x04MESH\x10\x04\x12\x10\n\x0cPARTIAL_MESH\x10\x05\x12\x0b\n\x07UNKNOWN\x10\x06\"i\n\x0bStackPolicy\x12\x17\n\x13STACK_SPLIT_UNKNOWN\x10\x00\x12\x1f\n\x1bSTACK_SPLIT_ONE_FRAGMENT_UP\x10\x01\x12 
\n\x1cSTACK_SPLIT_ALL_FRAGMENTS_UP\x10\x02\"\xc1\x02\n\x0bWiredClient\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\'\n\x07macaddr\x18\x02 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x0c\n\x04name\x18\x03 \x01(\t\x12)\n\nip_address\x18\x04 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x10\n\x08username\x18\x05 \x01(\t\x12\x19\n\x11\x61ssociated_device\x18\x06 \x01(\t\x12-\n\rinterface_mac\x18\x07 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x11\n\tuser_role\x18\x08 \x01(\t\x12\x0c\n\x04vlan\x18\t \x01(\r\x12\'\n\tauth_type\x18\n \x01(\x0e\x32\x14.Monitoring.AuthType\"\xcb\x03\n\x12MobilityController\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x0e\n\x06serial\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\'\n\x07macaddr\x18\x04 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\r\n\x05model\x18\x05 \x01(\t\x12&\n\x06status\x18\x06 \x01(\x0e\x32\x12.Monitoring.Status:\x02UP\x12\x30\n\x11public_ip_address\x18\x07 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12)\n\nip_address\x18\x08 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x18\n\x10\x66irmware_version\x18\t \x01(\t\x12.\n\x0f\x64\x65\x66\x61ult_gateway\x18\n \x01(\x0b\x32\x15.Monitoring.IpAddress\x12;\n\x04mode\x18\x0b \x01(\x0e\x32-.Monitoring.MobilityController.ControllerMode\"\'\n\x0e\x43ontrollerMode\x12\x0b\n\x07GATEWAY\x10\x00\x12\x08\n\x04VPNC\x10\x01\"\xe9\x02\n\x06Uplink\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x11\n\tdevice_id\x18\x02 \x01(\t\x12\x12\n\nlink_index\x18\x03 \x01(\x04\x12\x0c\n\x04name\x18\x04 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x05 \x01(\t\x12\x10\n\x08priority\x18\x06 \x01(\r\x12\"\n\x06status\x18\x07 \x01(\x0e\x32\x12.Monitoring.Status\x12&\n\nwan_status\x18\x08 \x01(\x0e\x32\x12.Monitoring.Status\x12\x0c\n\x04vlan\x18\t \x01(\r\x12\x18\n\x10vlan_description\x18\n \x01(\t\x12\x30\n\x11public_ip_address\x18\x0b 
\x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x31\n\x12private_ip_address\x18\x0c \x01(\x0b\x32\x15.Monitoring.IpAddress\"\xb3\x02\n\tIkeTunnel\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x11\n\tdevice_id\x18\x02 \x01(\t\x12\x0e\n\x06map_id\x18\x03 \x01(\x04\x12(\n\x08peer_mac\x18\x04 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12)\n\tlocal_mac\x18\x05 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12%\n\x06src_ip\x18\x06 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12%\n\x06\x64st_ip\x18\x07 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\"\n\x06status\x18\x08 \x01(\x0e\x32\x12.Monitoring.Status\x12\x10\n\x08map_name\x18\t \x01(\t\"\xc5\x02\n\x0b\x44\x65viceStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\x11\n\ttimestamp\x18\x02 \x01(\r\x12\x0e\n\x06uptime\x18\x03 \x01(\x04\x12\x17\n\x0f\x63pu_utilization\x18\x04 \x01(\r\x12\x11\n\tmem_total\x18\x05 \x01(\x04\x12\x10\n\x08mem_free\x18\x06 \x01(\x04\x12\x19\n\x11power_consumption\x18\x07 \x01(\r\x12\x11\n\tfan_speed\x18\x08 \x01(\r\x12\x13\n\x0btemperature\x18\t \x01(\r\x12&\n\nfan_status\x18\n \x01(\x0e\x32\x12.Monitoring.Status\x12\x11\n\tmax_power\x18\x0b \x01(\r\x12\x17\n\x0fpoe_consumption\x18\x0c \x01(\r\x12\x12\n\npoe_budget\x18\r \x01(\r\x12\x17\n\x0fmem_utilization\x18\x0e \x01(\x04\"\xdd\x01\n\nRadioStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\'\n\x07macaddr\x18\x02 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\x12\x10\n\x08tx_drops\x18\x06 \x01(\r\x12\x10\n\x08tx_power\x18\x07 \x01(\r\x12\x13\n\x0bnoise_floor\x18\x08 \x01(\r\x12\x13\n\x0butilization\x18\t \x01(\r\x12\x0e\n\x06rx_bad\x18\n \x01(\x04\"\x90\x01\n\x08VapStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12)\n\tradio_mac\x18\x02 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x0f\n\x07network\x18\x03 \x01(\x0c\x12\x11\n\ttimestamp\x18\x04 \x01(\r\x12\x10\n\x08tx_bytes\x18\x05 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x06 
\x01(\x04\"\xa6\x01\n\x0bTunnelStats\x12\x10\n\x08swarm_id\x18\x01 \x01(\t\x12&\n\x05index\x18\x02 \x01(\x0e\x32\x17.Monitoring.TunnelIndex\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\x12\x11\n\ttunnel_id\x18\x06 \x01(\x04\x12\x13\n\x0btunnel_name\x18\x07 \x01(\t\"\xda\x01\n\x0b\x43lientStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\'\n\x07macaddr\x18\x02 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\x12\x12\n\nrx_retries\x18\x06 \x01(\r\x12\x12\n\ntx_retries\x18\x07 \x01(\r\x12\r\n\x05speed\x18\x08 \x01(\r\x12\x14\n\x0csignal_in_db\x18\t \x01(\r\x12\x0b\n\x03snr\x18\n \x01(\r\"\xa8\x05\n\x0eInterfaceStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\'\n\x07macaddr\x18\x02 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\x12\x19\n\x11power_consumption\x18\x06 \x01(\r\x12\x11\n\tin_errors\x18\x07 \x01(\x04\x12\x12\n\nout_errors\x18\x08 \x01(\x04\x12\x13\n\x0bin_discards\x18\t \x01(\x04\x12\x14\n\x0cout_discards\x18\n \x01(\x04\x12\x12\n\nin_packets\x18\x0b \x01(\x04\x12\x13\n\x0bout_packets\x18\x0c \x01(\x04\x12\x14\n\x0cin_other_err\x18\r \x01(\r\x12\x18\n\x10in_multicast_pkt\x18\x0e \x01(\x04\x12\x18\n\x10in_broadcast_pkt\x18\x0f \x01(\x04\x12\x16\n\x0ein_unicast_pkt\x18\x10 \x01(\x04\x12\x19\n\x11out_multicast_pkt\x18\x11 \x01(\x04\x12\x19\n\x11out_broadcast_pkt\x18\x12 \x01(\x04\x12\x17\n\x0fout_unicast_pkt\x18\x13 \x01(\x04\x12\x0e\n\x06in_fcs\x18\x14 \x01(\x04\x12\x14\n\x0cin_alignment\x18\x15 \x01(\x04\x12\x1f\n\x17out_excessive_collision\x18\x16 \x01(\r\x12\x12\n\nin_jabbers\x18\x17 \x01(\x04\x12\x15\n\rin_fragmented\x18\x18 \x01(\x04\x12\x10\n\x08in_giant\x18\x19 \x01(\r\x12\x0f\n\x07in_runt\x18\x1a \x01(\r\x12\x15\n\rout_collision\x18\x1b 
\x01(\x04\x12\x1a\n\x12out_late_collision\x18\x1c \x01(\r\x12\x14\n\x0cout_deferred\x18\x1d \x01(\r\"\xbc\x01\n\x0bUplinkStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\x0f\n\x07link_id\x18\x02 \x01(\r\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\x12\x17\n\x0ftunnel_tx_bytes\x18\x06 \x01(\x04\x12\x17\n\x0ftunnel_rx_bytes\x18\x07 \x01(\x04\x12\x0e\n\x06map_id\x18\x08 \x01(\x04\x12\x10\n\x08map_name\x18\t \x01(\t\"\x94\x01\n\x0eUplinkWanStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\x0f\n\x07link_id\x18\x02 \x01(\r\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x18\n\x10\x63ompressed_bytes\x18\x04 \x01(\x04\x12\x1a\n\x12uncompressed_bytes\x18\x05 \x01(\x04\x12\x15\n\rsavings_bytes\x18\x06 \x01(\x04\"V\n\nModemStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\x11\n\ttimestamp\x18\x02 \x01(\r\x12\x10\n\x08tx_bytes\x18\x03 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x04 \x01(\x04\"h\n\tRoleStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\x11\n\tuser_role\x18\x02 \x01(\t\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\"c\n\tVlanStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\x0c\n\x04vlan\x18\x02 \x01(\r\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\"d\n\tSsidStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\r\n\x05\x65ssid\x18\x02 \x01(\x0c\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12\x10\n\x08tx_bytes\x18\x04 \x01(\x04\x12\x10\n\x08rx_bytes\x18\x05 \x01(\x04\"\xe1\x01\n\x12TunnelIpProbeStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12-\n\x0ctunnel_index\x18\x02 \x01(\x0e\x32\x17.Monitoring.TunnelIndex\x12,\n\rprobe_ip_addr\x18\x03 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x14\n\x0cprobe_status\x18\x04 \x01(\r\x12\x1d\n\x15ip_probe_pkt_loss_pct\x18\x05 \x01(\r\x12\x13\n\x0btunnel_name\x18\x06 \x01(\t\x12\x11\n\ttunnel_id\x18\x11 
\x01(\x04\"\xdf\x05\n\x12UplinkIpProbeStats\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12\x0f\n\x07link_id\x18\x02 \x01(\r\x12\x11\n\ttimestamp\x18\x03 \x01(\r\x12)\n\nip_address\x18\x04 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x0c\n\x04vlan\x18\x05 \x01(\r\x12\x0f\n\x07\x61vg_rtt\x18\x06 \x01(\x04\x12\x0f\n\x07max_rtt\x18\x07 \x01(\x04\x12\x0f\n\x07min_rtt\x18\x08 \x01(\x04\x12\x12\n\navg_jitter\x18\t \x01(\x04\x12\x12\n\nmax_jitter\x18\n \x01(\x04\x12\x12\n\nmin_jitter\x18\x0b \x01(\x04\x12\x13\n\x0bmos_quality\x18\x0c \x01(\x04\x12\x16\n\x0esd_avg_latency\x18\r \x01(\x04\x12\x16\n\x0e\x64s_avg_latency\x18\x0e \x01(\x04\x12\x15\n\rsd_avg_jitter\x18\x0f \x01(\x04\x12\x15\n\rds_avg_jitter\x18\x10 \x01(\x04\x12\x14\n\x0cprobe_status\x18\x11 \x01(\r\x12\x10\n\x08loss_pct\x18\x12 \x01(\r\x12\x14\n\x0cvpnc_ip_addr\x18\x13 \x01(\x04\x12\x15\n\rprobe_ip_addr\x18\x14 \x01(\x04\x12\x15\n\ravg_rtt_float\x18\x15 \x01(\x02\x12\x15\n\rmax_rtt_float\x18\x16 \x01(\x02\x12\x15\n\rmin_rtt_float\x18\x17 \x01(\x02\x12\x18\n\x10\x61vg_jitter_float\x18\x18 \x01(\x02\x12\x18\n\x10max_jitter_float\x18\x19 \x01(\x02\x12\x18\n\x10min_jitter_float\x18\x1a \x01(\x02\x12\x19\n\x11mos_quality_float\x18\x1b \x01(\x02\x12\x1c\n\x14sd_avg_latency_float\x18\x1c \x01(\x02\x12\x1c\n\x14\x64s_avg_latency_float\x18\x1d \x01(\x02\x12\x1b\n\x13sd_avg_jitter_float\x18\x1e \x01(\x02\x12\x1b\n\x13\x64s_avg_jitter_float\x18\x1f \x01(\x02\"\xe3\x01\n\x0fUplinkSpeedtest\x12\x11\n\tdevice_id\x18\x01 \x01(\t\x12(\n\tserver_ip\x18\x02 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x0c\n\x04vlan\x18\x03 \x01(\r\x12\x10\n\x08protocol\x18\x04 \x01(\t\x12\x14\n\x0cupstream_bps\x18\x05 \x01(\x04\x12\x16\n\x0e\x64ownstream_bps\x18\x06 \x01(\x04\x12\x11\n\ttime_secs\x18\x07 \x01(\r\x12\x17\n\x0fupstream_jitter\x18\x08 \x01(\x02\x12\x19\n\x11\x64ownstream_jitter\x18\t \x01(\x02\"\x92\x0e\n\tWIDSEvent\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x33\n\nevent_type\x18\x02 
\x01(\x0e\x32\x1f.Monitoring.WIDSEvent.EventType\x12\'\n\x07macaddr\x18\x03 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x13\n\x0b\x64\x65tected_ap\x18\x04 \x01(\t\x12\x35\n\x0b\x61ttack_type\x18\x05 \x01(\x0e\x32 .Monitoring.WIDSEvent.AttackType\x12\x0f\n\x07\x63hannel\x18\x06 \x01(\t\x12\x0f\n\x07network\x18\x07 \x01(\x0c\"U\n\tEventType\x12\t\n\x05ROGUE\x10\x01\x12\x0f\n\x0bINTERFERING\x10\x02\x12\x19\n\x15INFRASTRUCTURE_ATTACK\x10\x03\x12\x11\n\rCLIENT_ATTACK\x10\x04\"\xb5\x0b\n\nAttackType\x12\x1c\n\x18\x44\x45TECT_VALID_SSID_MISUSE\x10\x01\x12\x18\n\x14\x44\x45TECT_ADHOC_NETWORK\x10\x02\x12\x13\n\x0f\x44\x45TECT_AP_FLOOD\x10\x03\x12\x1a\n\x16\x44\x45TECT_WIRELESS_BRIDGE\x10\x04\x12\x1d\n\x19\x44\x45TECT_INVALID_MAC_OUI_AP\x10\x05\x12\x1e\n\x1a\x44\x45TECT_INVALID_MAC_OUI_STA\x10\x06\x12\x12\n\x0e\x44\x45TECT_BAD_WEP\x10\x07\x12\x1b\n\x17\x44\x45TECT_AP_IMPERSONATION\x10\x08\x12\x19\n\x15\x44\x45TECT_WINDOWS_BRIDGE\x10\t\x12!\n\x1dSIGNATURE_DEAUTH_BROADCAST_AP\x10\n\x12\"\n\x1eSIGNATURE_DEAUTH_BROADCAST_STA\x10\x0b\x12\x18\n\x14\x44\x45TECT_HT_GREENFIELD\x10\x0c\x12\"\n\x1e\x44\x45TECT_HT_40MHZ_INTOLERANCE_AP\x10\r\x12#\n\x1f\x44\x45TECT_HT_40MHZ_INTOLERANCE_STA\x10\x0e\x12\x17\n\x13\x44\x45TECT_CLIENT_FLOOD\x10\x0f\x12!\n\x1d\x44\x45TECT_ADHOC_USING_VALID_SSID\x10\x10\x12\x16\n\x12\x44\x45TECT_AP_SPOOFING\x10\x11\x12%\n!DETECT_INVALID_ADDRESSCOMBINATION\x10\x12\x12\x19\n\x15\x44\x45TECT_MALFORMED_HTIE\x10\x13\x12\x1e\n\x1a\x44\x45TECT_MALFORMED_ASSOC_REQ\x10\x14\x12\x16\n\x12\x44\x45TECT_OVERFLOW_IE\x10\x15\x12\x1d\n\x19\x44\x45TECT_OVERFLOW_EAPOL_KEY\x10\x16\x12#\n\x1f\x44\x45TECT_MALFORMED_LARGE_DURATION\x10\x17\x12(\n$DETECT_MALFORMED_FRAME_WRONG_CHANNEL\x10\x18\x12\x1f\n\x1b\x44\x45TECT_MALFORMED_FRAME_AUTH\x10\x19\x12\x1b\n\x17\x44\x45TECT_CTS_RATE_ANOMALY\x10\x1a\x12\x1b\n\x17\x44\x45TECT_RTS_RATE_ANOMALY\x10\x1b\x12\x1e\n\x1aSIGNATURE_DEAUTH_BROADCAST\x10\x1c\x12%\n!SIGNATURE_DEASSOCIATION_BROADCAST\x10\x1d\x12\x1f\n\x1b\x44\x45TECT_RATE_ANOMALIES_BY_AP\
x10\x1e\x12 \n\x1c\x44\x45TECT_RATE_ANOMALIES_BY_STA\x10\x1f\x12\x1b\n\x17\x44\x45TECT_EAP_RATE_ANOMALY\x10 \x12\x19\n\x15\x44\x45TECT_DISCONNECT_STA\x10!\x12\x1c\n\x18SIGNATURE_ASLEAP_FROM_AP\x10\"\x12\x1d\n\x19SIGNATURE_ASLEAP_FROM_STA\x10#\x12\x1d\n\x19SIGNATURE_AIRJACK_FROM_AP\x10$\x12\x1e\n\x1aSIGNATURE_AIRJACK_FROM_STA\x10%\x12\'\n#DETECT_STATION_DISCONNECT_ATTACK_AP\x10&\x12\x1c\n\x18\x44\x45TECT_UNENCRYPTED_VALID\x10\'\x12\x18\n\x14\x44\x45TECT_OMERTA_ATTACK\x10(\x12\x1d\n\x19\x44\x45TECT_TKIP_REPLAY_ATTACK\x10)\x12\x1a\n\x16\x44\x45TECT_CHOPCHOP_ATTACK\x10*\x12\x13\n\x0f\x44\x45TECT_FATAJACK\x10+\x12&\n\"DETECT_VALID_CLIENT_MISASSOCIATION\x10,\x12\x1b\n\x17\x44\x45TECT_BLOCK_ACK_ATTACK\x10-\x12\x1c\n\x18\x44\x45TECT_HOTSPOTTER_ATTACK\x10.\x12 \n\x1c\x44\x45TECT_POWER_SAVE_DOS_ATTACK\x10/\"\xdd\x05\n\x13\x41irMonitorRogueInfo\x12\x46\n\nmatch_type\x18\x01 \x01(\x0e\x32\x32.Monitoring.AirMonitorRogueInfo.wms_rap_match_type\x12)\n\tmatch_mac\x18\x02 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\'\n\x08match_ip\x18\x03 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x14\n\x0cmonitor_name\x18\x04 \x01(\t\x12N\n\x0enat_match_type\x18\x05 
\x01(\x0e\x32\x36.Monitoring.AirMonitorRogueInfo.wms_rap_nat_match_type\"\xc6\x02\n\x12wms_rap_match_type\x12\x0f\n\x0bRAP_MT_NONE\x10\x00\x12\x11\n\rRAP_MT_CFG_WM\x10\x01\x12\x11\n\rRAP_MT_ETH_WM\x10\x02\x12\x10\n\x0cRAP_MT_AP_WM\x10\x03\x12\x11\n\rRAP_MT_EXT_WM\x10\x04\x12\x11\n\rRAP_MT_MANUAL\x10\x05\x12\x15\n\x11RAP_MT_BASE_BSSID\x10\x06\x12\x0e\n\nRAP_MT_EMS\x10\x07\x12\x14\n\x10RAP_MT_ETH_GW_WM\x10\x08\x12\x14\n\x10RAP_MT_CLASS_OFF\x10\t\x12\x13\n\x0fRAP_MT_AP_BSSID\x10\n\x12\x16\n\x12RAP_MT_PROP_ETH_WM\x10\x0b\x12\x12\n\x0eRAP_MT_AP_RULE\x10\x0c\x12\x14\n\x10RAP_MT_SYSTEM_WM\x10\r\x12\x17\n\x13RAP_MT_SYSTEM_GW_WM\x10\x0e\"{\n\x16wms_rap_nat_match_type\x12\x10\n\x0cRAP_NMT_NONE\x10\x00\x12\x11\n\rRAP_NMT_EQUAL\x10\x01\x12\x14\n\x10RAP_NMT_PLUS_ONE\x10\x02\x12\x15\n\x11RAP_NMT_MINUS_ONE\x10\x03\x12\x0f\n\x0bRAP_NMT_OUI\x10\x04\"\x93\x03\n\nRogueEvent\x12\'\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x03\x41\x44\x44\x12\x13\n\x0b\x64\x65tected_ap\x18\x02 \x01(\t\x12\'\n\x07macaddr\x18\x03 \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x0f\n\x07\x63hannel\x18\x04 \x01(\r\x12\x0f\n\x07network\x18\x05 \x01(\x0c\x12@\n\tencr_type\x18\x06 \x01(\x0e\x32-.Monitoring.RogueEvent.wms_snmp_encr_protocol\x12\x31\n\x08\x61m_rogue\x18\x07 \x01(\x0b\x32\x1f.Monitoring.AirMonitorRogueInfo\"\x86\x01\n\x16wms_snmp_encr_protocol\x12\x1a\n\x16WMS_SNMP_WPA_ENCR_OPEN\x10\x00\x12\x19\n\x15WMS_SNMP_WPA_ENCR_WEP\x10\x01\x12\x19\n\x15WMS_SNMP_WPA_ENCR_WPA\x10\x02\x12\x1a\n\x16WMS_SNMP_WPA_ENCR_WPA2\x10\x03\"\xbb\x01\n\x10\x44\x65viceNeighbours\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x11\n\tdevice_id\x18\x02 \x01(\t\x12\x0c\n\x04port\x18\x03 \x01(\t\x12\x18\n\x10remote_device_id\x18\x04 \x01(\t\x12\x13\n\x0bremote_port\x18\x05 \x01(\t\x12\x1a\n\x12remote_port_number\x18\x06 \x01(\t\x12\x0f\n\x07vlan_id\x18\x07 \x01(\t\"\xc9\r\n\x15MonitoringInformation\x12\x13\n\x0b\x63ustomer_id\x18\x01 \x02(\t\x12.\n\rdata_elements\x18\x02 
\x03(\x0e\x32\x17.Monitoring.DataElement\x12!\n\x06swarms\x18\x03 \x03(\x0b\x32\x11.Monitoring.Swarm\x12\x1b\n\x03\x61ps\x18\x04 \x03(\x0b\x32\x0e.Monitoring.Ap\x12%\n\x08networks\x18\x05 \x03(\x0b\x32\x13.Monitoring.Network\x12!\n\x06radios\x18\x06 \x03(\x0b\x32\x11.Monitoring.Radio\x12!\n\x04vaps\x18\x07 \x03(\x0b\x32\x13.Monitoring.VapInfo\x12)\n\ninterfaces\x18\x08 \x03(\x0b\x32\x15.Monitoring.Interface\x12#\n\x07tunnels\x18\t \x03(\x0b\x32\x12.Monitoring.Tunnel\x12\x34\n\x10wireless_clients\x18\n \x03(\x0b\x32\x1a.Monitoring.WirelessClient\x12$\n\x08switches\x18\x0b \x03(\x0b\x32\x12.Monitoring.Switch\x12.\n\rwired_clients\x18\x0c \x03(\x0b\x32\x17.Monitoring.WiredClient\x12-\n\x0c\x64\x65vice_stats\x18\r \x03(\x0b\x32\x17.Monitoring.DeviceStats\x12+\n\x0bradio_stats\x18\x0e \x03(\x0b\x32\x16.Monitoring.RadioStats\x12\x33\n\x0finterface_stats\x18\x0f \x03(\x0b\x32\x1a.Monitoring.InterfaceStats\x12\'\n\tvap_stats\x18\x10 \x03(\x0b\x32\x14.Monitoring.VapStats\x12-\n\x0c\x63lient_stats\x18\x11 \x03(\x0b\x32\x17.Monitoring.ClientStats\x12-\n\x0ctunnel_stats\x18\x12 \x03(\x0b\x32\x17.Monitoring.TunnelStats\x12*\n\x0bwids_events\x18\x13 \x03(\x0b\x32\x15.Monitoring.WIDSEvent\x12+\n\x0bmodem_stats\x18\x14 \x03(\x0b\x32\x16.Monitoring.ModemStats\x12)\n\nrole_stats\x18\x15 \x03(\x0b\x32\x15.Monitoring.RoleStats\x12)\n\nvlan_stats\x18\x16 \x03(\x0b\x32\x15.Monitoring.VlanStats\x12)\n\nssid_stats\x18\x17 \x03(\x0b\x32\x15.Monitoring.SsidStats\x12\x35\n\ripprobe_stats\x18\x18 \x03(\x0b\x32\x1e.Monitoring.TunnelIpProbeStats\x12,\n\x0crogue_events\x18\x19 \x03(\x0b\x32\x16.Monitoring.RogueEvent\x12<\n\x14mobility_controllers\x18\x1a \x03(\x0b\x32\x1e.Monitoring.MobilityController\x12#\n\x07uplinks\x18\x1b \x03(\x0b\x32\x12.Monitoring.Uplink\x12-\n\x0cuplink_stats\x18\x1c \x03(\x0b\x32\x17.Monitoring.UplinkStats\x12\x34\n\x10uplink_wan_stats\x18\x1d \x03(\x0b\x32\x1a.Monitoring.UplinkWanStats\x12:\n\x12uplink_probe_stats\x18\x1e 
\x03(\x0b\x32\x1e.Monitoring.UplinkIpProbeStats\x12\x35\n\x10uplink_speedtest\x18\x1f \x03(\x0b\x32\x1b.Monitoring.UplinkSpeedtest\x12\x37\n\x11\x64\x65vice_neighbours\x18 \x03(\x0b\x32\x1c.Monitoring.DeviceNeighbours\x12.\n\x0cnotification\x18! \x03(\x0b\x32\x18.Monitoring.Notification\x12.\n\rswitch_stacks\x18\" \x03(\x0b\x32\x17.Monitoring.SwitchStack\x12*\n\x0bike_tunnels\x18# \x03(\x0b\x32\x15.Monitoring.IkeTunnel\x12\x34\n\x10switch_vlan_info\x18$ \x01(\x0b\x32\x1a.Monitoring.SwitchVlanInfo\x12\x1f\n\x05vlans\x18% \x03(\x0b\x32\x10.Monitoring.Vlan\x12!\n\x03vsx\x18& \x01(\x0b\x32\x14.Monitoring.VSXState\x12\x11\n\ttimestamp\x18\' \x01(\r\"\xbc\x05\n\x1aMonitoringStateInformation\x12\x13\n\x0b\x63ustomer_id\x18\x01 \x02(\t\x12<\n\x14mobility_controllers\x18\x02 \x03(\x0b\x32\x1e.Monitoring.MobilityController\x12$\n\x08switches\x18\x03 \x03(\x0b\x32\x12.Monitoring.Switch\x12!\n\x06swarms\x18\x04 \x03(\x0b\x32\x11.Monitoring.Swarm\x12\x1b\n\x03\x61ps\x18\x05 \x03(\x0b\x32\x0e.Monitoring.Ap\x12!\n\x04vaps\x18\x06 \x03(\x0b\x32\x13.Monitoring.VapInfo\x12!\n\x06radios\x18\x07 \x03(\x0b\x32\x11.Monitoring.Radio\x12)\n\ninterfaces\x18\x08 \x03(\x0b\x32\x15.Monitoring.Interface\x12%\n\x08networks\x18\t \x03(\x0b\x32\x13.Monitoring.Network\x12#\n\x07tunnels\x18\n \x03(\x0b\x32\x12.Monitoring.Tunnel\x12\x34\n\x10wireless_clients\x18\x0b \x03(\x0b\x32\x1a.Monitoring.WirelessClient\x12.\n\rwired_clients\x18\x0c \x03(\x0b\x32\x17.Monitoring.WiredClient\x12#\n\x07uplinks\x18\r \x03(\x0b\x32\x12.Monitoring.Uplink\x12.\n\rswitch_stacks\x18\x0e \x03(\x0b\x32\x17.Monitoring.SwitchStack\x12*\n\x0bike_tunnels\x18\x0f \x03(\x0b\x32\x15.Monitoring.IkeTunnel\x12.\n\rdata_elements\x18\x10 \x03(\x0e\x32\x17.Monitoring.DataElement\x12\x11\n\ttimestamp\x18\x11 \x01(\r\"*\n\x0cKeyValueData\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t\"\x83\x03\n\x0cNotification\x12\n\n\x02id\x18\x01 \x01(\t\x12\x0c\n\x04type\x18\x02 \x01(\t\x12\x12\n\nsetting_id\x18\x03 
\x01(\t\x12\x11\n\tdevice_id\x18\x04 \x01(\t\x12\x33\n\x08severity\x18\x05 \x01(\x0e\x32!.Monitoring.Notification.Severity\x12\x11\n\ttimestamp\x18\x06 \x01(\r\x12\x39\n\x05state\x18\x07 \x01(\x0e\x32*.Monitoring.Notification.NotificationState\x12\x13\n\x0b\x64\x65scription\x18\x08 \x01(\t\x12\'\n\x05\x65xtra\x18\t \x03(\x0b\x32\x18.Monitoring.KeyValueData\"G\n\x08Severity\x12\n\n\x06Normal\x10\x01\x12\x0b\n\x07Warning\x10\x02\x12\t\n\x05Minor\x10\x03\x12\t\n\x05Major\x10\x04\x12\x0c\n\x08\x43ritical\x10\x05\"(\n\x11NotificationState\x12\x08\n\x04Open\x10\x00\x12\t\n\x05\x43lose\x10\x01\"J\n\x0eSwitchVlanInfo\x12\x11\n\tdevice_id\x18\x01 \x02(\t\x12%\n\x05vlans\x18\x02 \x03(\x0b\x32\x16.Monitoring.SwitchVlan\"\xbc\x04\n\nSwitchVlan\x12\n\n\x02id\x18\x01 \x01(\r\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\x14\n\x0ctagged_ports\x18\x03 \x03(\t\x12\x16\n\x0euntagged_ports\x18\x04 \x03(\t\x12\x17\n\x0fprimary_vlan_id\x18\x05 \x01(\r\x12\x19\n\x11primary_vlan_type\x18\x06 \x01(\t\x12\x19\n\x11promiscuous_ports\x18\x07 \x03(\t\x12\x11\n\tisl_ports\x18\x08 \x03(\t\x12\x1a\n\x12is_management_vlan\x18\t \x01(\x08\x12\x18\n\x10is_voice_enabled\x18\n \x01(\x08\x12\x18\n\x10is_jumbo_enabled\x18\x0b \x01(\x08\x12\x17\n\x0fis_igmp_enabled\x18\x0c \x01(\x08\x12(\n\tipaddress\x18\r \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x31\n\x06status\x18\x0e \x01(\x0e\x32!.Monitoring.SwitchVlan.VlanStatus\x12\x19\n\x11oper_state_reason\x18\x0f \x01(\t\x12-\n\x04type\x18\x10 \x01(\x0e\x32\x1f.Monitoring.SwitchVlan.VlanType\x12\x14\n\x0c\x61\x63\x63\x65ss_ports\x18\x11 \x03(\t\"\x1e\n\nVlanStatus\x12\x06\n\x02UP\x10\x01\x12\x08\n\x04\x44OWN\x10\x02\">\n\x08VlanType\x12\n\n\x06STATIC\x10\x01\x12\x0b\n\x07\x44YNAMIC\x10\x02\x12\x0c\n\x08INTERNAL\x10\x03\x12\x0b\n\x07\x44\x45\x46\x41ULT\x10\x04\"\xfe\x02\n\x04Vlan\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x0f\n\x07vlan_id\x18\x02 \x01(\r\x12#\n\x04ipv4\x18\x03 
\x01(\x0b\x32\x15.Monitoring.IpAddress\x12&\n\x07ipv6_ll\x18\x04 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12%\n\x06ipv6_1\x18\x05 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12%\n\x06ipv6_2\x18\x06 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12%\n\x06ipv6_3\x18\x07 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12&\n\noper_state\x18\x08 \x01(\x0e\x32\x12.Monitoring.Status\x12\x13\n\x0b\x64\x65scription\x18\t \x01(\t\x12\'\n\x0b\x61\x64min_state\x18\n \x01(\x0e\x32\x12.Monitoring.Status\x12\x11\n\taddr_mode\x18\x0b \x01(\t\"\xf6\n\n\x08VSXState\x12*\n\x06\x61\x63tion\x18\x01 \x01(\x0e\x32\x12.Monitoring.Action:\x06UPDATE\x12\x11\n\tdevice_id\x18\x02 \x01(\t\x12-\n\x04role\x18\x03 \x01(\x0e\x32\x1f.Monitoring.VSXState.DeviceRole\x12\x32\n\tpeer_role\x18\x04 \x01(\x0e\x32\x1f.Monitoring.VSXState.DeviceRole\x12\x10\n\x08isl_port\x18\x05 \x01(\t\x12\x15\n\rpeer_isl_port\x18\x06 \x01(\t\x12\x30\n\x11keepalive_peer_ip\x18\x07 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12/\n\x10keepalive_src_ip\x18\x08 \x01(\x0b\x32\x15.Monitoring.IpAddress\x12\x1b\n\x13last_sync_timestamp\x18\t \x01(\x04\x12#\n\x03mac\x18\n \x01(\x0b\x32\x16.Monitoring.MacAddress\x12(\n\x08peer_mac\x18\x0b \x01(\x0b\x32\x16.Monitoring.MacAddress\x12\x1b\n\x13\x63onfig_sync_disable\x18\x0c \x01(\x08\x12\x45\n\x17islp_device_state_value\x18\r \x01(\x0e\x32$.Monitoring.VSXState.ISLPDeviceState\x12>\n\x17\x63onfig_sync_state_value\x18\x0e \x01(\x0e\x32\x1d.Monitoring.VSXState.ISLState\x12;\n\x14isl_mgmt_state_value\x18\x0f \x01(\x0e\x32\x1d.Monitoring.VSXState.ISLState\x12\x36\n\x0fnae_state_value\x18\x10 \x01(\x0e\x32\x1d.Monitoring.VSXState.ISLState\x12?\n\x18https_server_state_value\x18\x11 
\x01(\x0e\x32\x1d.Monitoring.VSXState.ISLState\"(\n\nDeviceRole\x12\x0b\n\x07PRIMARY\x10\x01\x12\r\n\tSECONDARY\x10\x02\"\xba\x01\n\x0fISLPDeviceState\x12\x14\n\x10WAITING_FOR_PEER\x10\x01\x12\x14\n\x10PEER_ESTABLISHED\x10\x02\x12\x18\n\x14SPLIT_SYSTEM_PRIMARY\x10\x03\x12\x1a\n\x16SPLIT_SYSTEM_SECONDARY\x10\x04\x12\x10\n\x0cSYNC_PRIMARY\x10\x05\x12\x12\n\x0eSYNC_SECONDARY\x10\x06\x12\x1f\n\x1bSYNC_SECONDARY_LINKUP_DELAY\x10\x07\"\x8e\x03\n\x08ISLState\x12\x0b\n\x07IN_SYNC\x10\x01\x12\x0c\n\x08\x44ISABLED\x10\x02\x12#\n\x1fSW_IMAGE_VERSION_MISMATCH_ERROR\x10\x03\x12$\n CONFLICTING_OR_MISSING_DEV_ROLES\x10\x04\x12\x1c\n\x18PEER_DB_CONNECTION_ERROR\x10\x05\x12\x1f\n\x1b\x43ONFIGURATION_SYNC_CONFLICT\x10\x06\x12(\n$CONFIGURATION_SYNC_MISSING_REFERENCE\x10\x07\x12\x12\n\x0ePEER_REACHABLE\x10\x08\x12\x14\n\x10PEER_UNREACHABLE\x10\t\x12\x0f\n\x0bOPERATIONAL\x10\n\x12\x1f\n\x1bINTER_SWITCH_LINK_MGMT_INIT\x10\x0b\x12\'\n#CONFLICTING_OR_MISSING_DEVICE_ROLES\x10\x0c\x12\x1a\n\x16INTER_SWITCH_LINK_DOWN\x10\r\x12\x12\n\x0eINTERNAL_ERROR\x10\x0e*)\n\x06\x41\x63tion\x12\x07\n\x03\x41\x44\x44\x10\x01\x12\n\n\x06\x44\x45LETE\x10\x02\x12\n\n\x06UPDATE\x10\x03*\x1a\n\x06Status\x12\x06\n\x02UP\x10\x01\x12\x08\n\x04\x44OWN\x10\x02*&\n\x0bTunnelIndex\x12\x0b\n\x07PRIMARY\x10\x00\x12\n\n\x06\x42\x41\x43KUP\x10\x01*\"\n\nCryptoType\x12\x0b\n\x07\x43\x41_CERT\x10\x00\x12\x07\n\x03PSK\x10\x01*\xa3\x05\n\x0b\x44\x61taElement\x12\x14\n\x10STATE_CONTROLLER\x10\x01\x12\x10\n\x0cSTATE_SWITCH\x10\x02\x12\x0f\n\x0bSTATE_SWARM\x10\x03\x12\x0c\n\x08STATE_AP\x10\x04\x12\r\n\tSTATE_VAP\x10\x05\x12\x0f\n\x0bSTATE_RADIO\x10\x06\x12\x13\n\x0fSTATE_INTERFACE\x10\x07\x12\x11\n\rSTATE_NETWORK\x10\x08\x12\x10\n\x0cSTATE_TUNNEL\x10\t\x12\x18\n\x14STATE_WIRELESSCLIENT\x10\n\x12\x15\n\x11STATE_WIREDCLIENT\x10\x0b\x12\x10\n\x0cSTATE_UPLINK\x10\x0c\x12\x0f\n\x0bSTAT_DEVICE\x10\r\x12\x0e\n\nSTAT_RADIO\x10\x0e\x12\x0c\n\x08STAT_VAP\x10\x0f\x12\x12\n\x0eSTAT_INTERFACE\x10\x10\x12\x0f\n\x0bSTAT_CLIENT\x10\x11\x12\x0f\
n\x0bSTAT_TUNNEL\x10\x12\x12\x0e\n\nSTAT_MODEM\x10\x13\x12\r\n\tSTAT_ROLE\x10\x14\x12\r\n\tSTAT_VLAN\x10\x15\x12\r\n\tSTAT_SSID\x10\x16\x12\x10\n\x0cSTAT_IPPROBE\x10\x17\x12\x0f\n\x0bSTAT_UPLINK\x10\x18\x12\x12\n\x0eSTAT_UPLINKWAN\x10\x19\x12\x16\n\x12STAT_UPLINKIPPROBE\x10\x1a\x12\x0f\n\x0b\x45VENTS_WIDS\x10\x1b\x12\x10\n\x0c\x45VENTS_ROGUE\x10\x1c\x12\x1a\n\x16STATS_UPLINK_SPEEDTEST\x10\x1d\x12\x15\n\x11\x44\x45VICE_NEIGHBOURS\x10\x1e\x12\x11\n\rNOTIFICATIONS\x10\x1f\x12\x10\n\x0cSWITCH_STACK\x10 \x12\x14\n\x10STATE_IKE_TUNNEL\x10!\x12\x0f\n\x0bSWITCH_VLAN\x10\"\x12\x0e\n\nSTATE_VLAN\x10#\x12\r\n\tSTATE_VSX\x10$*\xb0\x06\n\x08\x41uthType\x12\x08\n\x04NONE\x10\x01\x12\x0c\n\x08MAC_AUTH\x10\x02\x12\x0e\n\nDOT1X_AUTH\x10\x03\x12\x0b\n\x07L3_AUTH\x10\x04\x12\x10\n\x0c\x43ONSOLE_AUTH\x10\x05\x12\x0f\n\x0bTELNET_AUTH\x10\x06\x12\x0e\n\nWEBUI_AUTH\x10\x07\x12\x0c\n\x08SSH_AUTH\x10\x08\x12\x0c\n\x08WEB_AUTH\x10\t\x12\r\n\tSNMP_AUTH\x10\n\x12\x11\n\rSSH_NONE_AUTH\x10\x0b\x12\x0c\n\x08LMA_AUTH\x10\x0c\x12\x0c\n\x08\x41NY_AUTH\x10\r\x12\x12\n\x0e\x43\x41PTIVE_PORTAL\x10\x0e\x12\x0c\n\x08VPN_AUTH\x10\x0f\x12\x15\n\x11STATEFUL_KERBEROS\x10\x10\x12\x15\n\x11RADIUS_ACCOUNTING\x10\x11\x12\r\n\tSECURE_ID\x10\x12\x12\x13\n\x0fSTATEFUL_RADIUS\x10\x13\x12\x15\n\x11SWITCH_MANAGEMENT\x10\x14\x12\x11\n\rDOT1X_MACHINE\x10\x15\x12\x0e\n\nDOT1X_USER\x10\x16\x12\x0f\n\x0b\x44OT1X_WIRED\x10\x17\x12\x17\n\x13\x44OT1X_WIRED_MACHINE\x10\x18\x12\x14\n\x10\x44OT1X_WIRED_USER\x10\x19\x12\x0e\n\nPUB_COOKIE\x10\x1a\x12\x10\n\x0cTACACAS_PLUS\x10\x1b\x12\x11\n\rWIRELESS_XSEC\x10\x1c\x12\x19\n\x15WIRELESS_XSEC_MACHINE\x10\x1d\x12\x16\n\x12WIRELESS_XSEC_USER\x10\x1e\x12\x17\n\x13WIRELESS_XSEC_WIRED\x10\x1f\x12\x1f\n\x1bWIRELESS_XSEC_WIRED_MACHINE\x10 
\x12\x1c\n\x18WIRELESS_XSEC_WIRED_USER\x10!\x12\x11\n\rSTATEFUL_NTLM\x10\"\x12\n\n\x06RAP_AP\x10#\x12\x0b\n\x07VIA_WEB\x10$\x12\x1a\n\x16GENERIC_INTERFACE_SPEC\x10%\x12\x11\n\rTRANSPORT_VPN\x10&\x12\x0b\n\x07VIA_VPN\x10\'\x12\x0e\n\nPUTN_DOT1X\x10(\x12\x0c\n\x08PUTN_MAC\x10)\x12\x0b\n\x07PUTN_CP\x10*\x12\x0c\n\x08PUTN_LMA\x10+\x12\x13\n\x0fNUM_AUTH_CLIENT\x10,'
)
_ACTION = _descriptor.EnumDescriptor(
name='Action',
full_name='Monitoring.Action',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='ADD', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DELETE', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='UPDATE', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=19136,
serialized_end=19177,
)
_sym_db.RegisterEnumDescriptor(_ACTION)
Action = enum_type_wrapper.EnumTypeWrapper(_ACTION)
_STATUS = _descriptor.EnumDescriptor(
name='Status',
full_name='Monitoring.Status',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UP', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOWN', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=19179,
serialized_end=19205,
)
_sym_db.RegisterEnumDescriptor(_STATUS)
Status = enum_type_wrapper.EnumTypeWrapper(_STATUS)
_TUNNELINDEX = _descriptor.EnumDescriptor(
name='TunnelIndex',
full_name='Monitoring.TunnelIndex',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='PRIMARY', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='BACKUP', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=19207,
serialized_end=19245,
)
_sym_db.RegisterEnumDescriptor(_TUNNELINDEX)
TunnelIndex = enum_type_wrapper.EnumTypeWrapper(_TUNNELINDEX)
_CRYPTOTYPE = _descriptor.EnumDescriptor(
name='CryptoType',
full_name='Monitoring.CryptoType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='CA_CERT', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PSK', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=19247,
serialized_end=19281,
)
_sym_db.RegisterEnumDescriptor(_CRYPTOTYPE)
CryptoType = enum_type_wrapper.EnumTypeWrapper(_CRYPTOTYPE)
_DATAELEMENT = _descriptor.EnumDescriptor(
name='DataElement',
full_name='Monitoring.DataElement',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='STATE_CONTROLLER', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_SWITCH', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_SWARM', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_AP', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_VAP', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_RADIO', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_INTERFACE', index=6, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_NETWORK', index=7, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_TUNNEL', index=8, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_WIRELESSCLIENT', index=9, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_WIREDCLIENT', index=10, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_UPLINK', index=11, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_DEVICE', index=12, number=13,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_RADIO', index=13, number=14,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_VAP', index=14, number=15,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_INTERFACE', index=15, number=16,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_CLIENT', index=16, number=17,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_TUNNEL', index=17, number=18,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_MODEM', index=18, number=19,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_ROLE', index=19, number=20,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_VLAN', index=20, number=21,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_SSID', index=21, number=22,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_IPPROBE', index=22, number=23,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_UPLINK', index=23, number=24,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_UPLINKWAN', index=24, number=25,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STAT_UPLINKIPPROBE', index=25, number=26,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='EVENTS_WIDS', index=26, number=27,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='EVENTS_ROGUE', index=27, number=28,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATS_UPLINK_SPEEDTEST', index=28, number=29,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DEVICE_NEIGHBOURS', index=29, number=30,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NOTIFICATIONS', index=30, number=31,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SWITCH_STACK', index=31, number=32,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_IKE_TUNNEL', index=32, number=33,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SWITCH_VLAN', index=33, number=34,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_VLAN', index=34, number=35,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATE_VSX', index=35, number=36,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=19284,
serialized_end=19959,
)
_sym_db.RegisterEnumDescriptor(_DATAELEMENT)
DataElement = enum_type_wrapper.EnumTypeWrapper(_DATAELEMENT)
_AUTHTYPE = _descriptor.EnumDescriptor(
name='AuthType',
full_name='Monitoring.AuthType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='NONE', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MAC_AUTH', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOT1X_AUTH', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='L3_AUTH', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CONSOLE_AUTH', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TELNET_AUTH', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WEBUI_AUTH', index=6, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SSH_AUTH', index=7, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WEB_AUTH', index=8, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SNMP_AUTH', index=9, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SSH_NONE_AUTH', index=10, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LMA_AUTH', index=11, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ANY_AUTH', index=12, number=13,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CAPTIVE_PORTAL', index=13, number=14,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='VPN_AUTH', index=14, number=15,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATEFUL_KERBEROS', index=15, number=16,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RADIUS_ACCOUNTING', index=16, number=17,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SECURE_ID', index=17, number=18,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATEFUL_RADIUS', index=18, number=19,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SWITCH_MANAGEMENT', index=19, number=20,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOT1X_MACHINE', index=20, number=21,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOT1X_USER', index=21, number=22,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOT1X_WIRED', index=22, number=23,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOT1X_WIRED_MACHINE', index=23, number=24,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOT1X_WIRED_USER', index=24, number=25,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PUB_COOKIE', index=25, number=26,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TACACAS_PLUS', index=26, number=27,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WIRELESS_XSEC', index=27, number=28,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WIRELESS_XSEC_MACHINE', index=28, number=29,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WIRELESS_XSEC_USER', index=29, number=30,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WIRELESS_XSEC_WIRED', index=30, number=31,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WIRELESS_XSEC_WIRED_MACHINE', index=31, number=32,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WIRELESS_XSEC_WIRED_USER', index=32, number=33,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATEFUL_NTLM', index=33, number=34,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_AP', index=34, number=35,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='VIA_WEB', index=35, number=36,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='GENERIC_INTERFACE_SPEC', index=36, number=37,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TRANSPORT_VPN', index=37, number=38,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='VIA_VPN', index=38, number=39,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PUTN_DOT1X', index=39, number=40,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PUTN_MAC', index=40, number=41,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PUTN_CP', index=41, number=42,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PUTN_LMA', index=42, number=43,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NUM_AUTH_CLIENT', index=43, number=44,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=19962,
serialized_end=20778,
)
_sym_db.RegisterEnumDescriptor(_AUTHTYPE)
AuthType = enum_type_wrapper.EnumTypeWrapper(_AUTHTYPE)
ADD = 1
DELETE = 2
UPDATE = 3
UP = 1
DOWN = 2
PRIMARY = 0
BACKUP = 1
CA_CERT = 0
PSK = 1
STATE_CONTROLLER = 1
STATE_SWITCH = 2
STATE_SWARM = 3
STATE_AP = 4
STATE_VAP = 5
STATE_RADIO = 6
STATE_INTERFACE = 7
STATE_NETWORK = 8
STATE_TUNNEL = 9
STATE_WIRELESSCLIENT = 10
STATE_WIREDCLIENT = 11
STATE_UPLINK = 12
STAT_DEVICE = 13
STAT_RADIO = 14
STAT_VAP = 15
STAT_INTERFACE = 16
STAT_CLIENT = 17
STAT_TUNNEL = 18
STAT_MODEM = 19
STAT_ROLE = 20
STAT_VLAN = 21
STAT_SSID = 22
STAT_IPPROBE = 23
STAT_UPLINK = 24
STAT_UPLINKWAN = 25
STAT_UPLINKIPPROBE = 26
EVENTS_WIDS = 27
EVENTS_ROGUE = 28
STATS_UPLINK_SPEEDTEST = 29
DEVICE_NEIGHBOURS = 30
NOTIFICATIONS = 31
SWITCH_STACK = 32
STATE_IKE_TUNNEL = 33
SWITCH_VLAN = 34
STATE_VLAN = 35
STATE_VSX = 36
NONE = 1
MAC_AUTH = 2
DOT1X_AUTH = 3
L3_AUTH = 4
CONSOLE_AUTH = 5
TELNET_AUTH = 6
WEBUI_AUTH = 7
SSH_AUTH = 8
WEB_AUTH = 9
SNMP_AUTH = 10
SSH_NONE_AUTH = 11
LMA_AUTH = 12
ANY_AUTH = 13
CAPTIVE_PORTAL = 14
VPN_AUTH = 15
STATEFUL_KERBEROS = 16
RADIUS_ACCOUNTING = 17
SECURE_ID = 18
STATEFUL_RADIUS = 19
SWITCH_MANAGEMENT = 20
DOT1X_MACHINE = 21
DOT1X_USER = 22
DOT1X_WIRED = 23
DOT1X_WIRED_MACHINE = 24
DOT1X_WIRED_USER = 25
PUB_COOKIE = 26
TACACAS_PLUS = 27
WIRELESS_XSEC = 28
WIRELESS_XSEC_MACHINE = 29
WIRELESS_XSEC_USER = 30
WIRELESS_XSEC_WIRED = 31
WIRELESS_XSEC_WIRED_MACHINE = 32
WIRELESS_XSEC_WIRED_USER = 33
STATEFUL_NTLM = 34
RAP_AP = 35
VIA_WEB = 36
GENERIC_INTERFACE_SPEC = 37
TRANSPORT_VPN = 38
VIA_VPN = 39
PUTN_DOT1X = 40
PUTN_MAC = 41
PUTN_CP = 42
PUTN_LMA = 43
NUM_AUTH_CLIENT = 44
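# --- Illustrative note (not part of the generated output) ---------------------
# The bare constants above (ADD, UP, STATE_AP, ...) mirror the enum values that
# protoc hoists to module level. The EnumTypeWrapper objects created earlier
# (Action, Status, TunnelIndex, CryptoType, DataElement, AuthType) additionally
# expose name/number lookups. A minimal, hedged sketch of how a consumer of
# this generated module would typically use them (shown as comments only, so
# nothing executes at import time):
#
#     Action.Name(ADD)        # -> 'ADD'
#     Action.Value('DELETE')  # -> 2
#     Status.keys()           # -> ['UP', 'DOWN']
#     DataElement.items()     # -> [('STATE_CONTROLLER', 1), ...]
#
# These accessors come from google.protobuf.internal.enum_type_wrapper and are
# the supported way to map between enum names and wire numbers; editing the
# descriptors or the serialized_pb bytes in this file by hand would break the
# serialized_start/serialized_end offsets and should never be done.
# ------------------------------------------------------------------------------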
_IPADDRESS_ADDR_FAMILY = _descriptor.EnumDescriptor(
name='addr_family',
full_name='Monitoring.IpAddress.addr_family',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='ADDR_FAMILY_INET', index=0, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ADDR_FAMILY_INET6', index=1, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=107,
serialized_end=165,
)
_sym_db.RegisterEnumDescriptor(_IPADDRESS_ADDR_FAMILY)
_INTERFACE_DUPLEX = _descriptor.EnumDescriptor(
name='Duplex',
full_name='Monitoring.Interface.Duplex',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='HALF', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='FULL', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='AUTO', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=1680,
serialized_end=1718,
)
_sym_db.RegisterEnumDescriptor(_INTERFACE_DUPLEX)
_INTERFACE_INTFTYPE = _descriptor.EnumDescriptor(
name='IntfType',
full_name='Monitoring.Interface.IntfType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='ETHERNET', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LOOPBACK', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='VLAN', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TUNNEL', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PORT_CHANNEL', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STANDBY', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='BRIDGE', index=6, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPLIT', index=7, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STACK', index=8, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MGMT', index=9, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NONE', index=10, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=1721,
serialized_end=1866,
)
_sym_db.RegisterEnumDescriptor(_INTERFACE_INTFTYPE)
_INTERFACE_SPEEDTYPE = _descriptor.EnumDescriptor(
name='SpeedType',
full_name='Monitoring.Interface.SpeedType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='SPEED_INVALID', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPEED_AUTO', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPEED_10', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPEED_100', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPEED_1000', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPEED_10000', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=1868,
serialized_end=1976,
)
_sym_db.RegisterEnumDescriptor(_INTERFACE_SPEEDTYPE)
_INTERFACE_PORTTYPE = _descriptor.EnumDescriptor(
name='PortType',
full_name='Monitoring.Interface.PortType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='PT_RJ45', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PT_GBIC', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PT_SERIAL', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PT_USB', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PT_X2', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=1978,
serialized_end=2052,
)
_sym_db.RegisterEnumDescriptor(_INTERFACE_PORTTYPE)
_INTERFACE_POESUPPORT = _descriptor.EnumDescriptor(
name='PoeSupport',
full_name='Monitoring.Interface.PoeSupport',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='NA', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SUPPORTED', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NOT_SUPPORTED', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2054,
serialized_end=2108,
)
_sym_db.RegisterEnumDescriptor(_INTERFACE_POESUPPORT)
_INTERFACE_STATEDOWNREASON = _descriptor.EnumDescriptor(
name='StateDownReason',
full_name='Monitoring.Interface.StateDownReason',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNINITIALIZED', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WAITING_FOR_LINK', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ADMIN_INTERFACE_DOWN', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MODULE_MISSING', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MODULE_UNRECOGNIZED', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MODULE_UNSUPPORTED', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MODULE_INCOMPATIBLE', index=6, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MODULE_FAULT', index=7, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='GROUP_SPEED_MISMATCH', index=8, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LANES_SPLIT', index=9, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LANES_NOT_SPLIT', index=10, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INVALID_MTU', index=11, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INVALID_SPEEDS', index=12, number=13,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='AUTONEG_NOT_SUPPORTED', index=13, number=14,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='AUTONEG_REQUIRED', index=14, number=15,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INTERFACE_ABSENT', index=15, number=16,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PHYSICAL_INTERFACE_FAILED', index=16, number=17,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PSPO_ENABLEMENT_LAYER_DOWN', index=17, number=18,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CARD_INTERFACE_ERRORS', index=18, number=19,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INTERFACE_OK', index=19, number=20,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2111,
serialized_end=2587,
)
_sym_db.RegisterEnumDescriptor(_INTERFACE_STATEDOWNREASON)
_INTERFACE_VLANMODES = _descriptor.EnumDescriptor(
name='VlanModes',
full_name='Monitoring.Interface.VlanModes',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='ACCESS', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NATIVE_TAGGED', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NATIVE_UNTAGGED', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2589,
serialized_end=2652,
)
_sym_db.RegisterEnumDescriptor(_INTERFACE_VLANMODES)
_AP_UPLINKTYPE = _descriptor.EnumDescriptor(
name='UplinkType',
full_name='Monitoring.Ap.UplinkType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='ETHERNET', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MESH', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STATION', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MODEM', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=3523,
serialized_end=3583,
)
_sym_db.RegisterEnumDescriptor(_AP_UPLINKTYPE)
_HARDWAREMODULE_HARDWARESTATUS = _descriptor.EnumDescriptor(
name='HardwareStatus',
full_name='Monitoring.HardwareModule.HardwareStatus',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='OK', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ERROR', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NOT_CONNECTED', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ACTIVE', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STANDBY', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='OFFLINE', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=4174,
serialized_end=4266,
)
_sym_db.RegisterEnumDescriptor(_HARDWAREMODULE_HARDWARESTATUS)
_SWITCH_STACKMEMBERROLE = _descriptor.EnumDescriptor(
name='StackMemberRole',
full_name='Monitoring.Switch.StackMemberRole',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='COMMANDER', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STANDBY', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MEMBER', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=4959,
serialized_end=5029,
)
_sym_db.RegisterEnumDescriptor(_SWITCH_STACKMEMBERROLE)
_SWITCHSTACK_STACKTOPOLOGY = _descriptor.EnumDescriptor(
name='StackTopology',
full_name='Monitoring.SwitchStack.StackTopology',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='STANDALONE', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CHAIN', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RING', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MESH', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PARTIAL_MESH', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=5308,
serialized_end=5401,
)
_sym_db.RegisterEnumDescriptor(_SWITCHSTACK_STACKTOPOLOGY)
_SWITCHSTACK_STACKPOLICY = _descriptor.EnumDescriptor(
name='StackPolicy',
full_name='Monitoring.SwitchStack.StackPolicy',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='STACK_SPLIT_UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STACK_SPLIT_ONE_FRAGMENT_UP', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STACK_SPLIT_ALL_FRAGMENTS_UP', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=5403,
serialized_end=5508,
)
_sym_db.RegisterEnumDescriptor(_SWITCHSTACK_STACKPOLICY)
_MOBILITYCONTROLLER_CONTROLLERMODE = _descriptor.EnumDescriptor(
name='ControllerMode',
full_name='Monitoring.MobilityController.ControllerMode',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='GATEWAY', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='VPNC', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=6255,
serialized_end=6294,
)
_sym_db.RegisterEnumDescriptor(_MOBILITYCONTROLLER_CONTROLLERMODE)
_WIDSEVENT_EVENTTYPE = _descriptor.EnumDescriptor(
name='EventType',
full_name='Monitoring.WIDSEvent.EventType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='ROGUE', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INTERFERING', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INFRASTRUCTURE_ATTACK', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CLIENT_ATTACK', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=10939,
serialized_end=11024,
)
_sym_db.RegisterEnumDescriptor(_WIDSEVENT_EVENTTYPE)
_WIDSEVENT_ATTACKTYPE = _descriptor.EnumDescriptor(
name='AttackType',
full_name='Monitoring.WIDSEvent.AttackType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='DETECT_VALID_SSID_MISUSE', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_ADHOC_NETWORK', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_AP_FLOOD', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_WIRELESS_BRIDGE', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_INVALID_MAC_OUI_AP', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_INVALID_MAC_OUI_STA', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_BAD_WEP', index=6, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_AP_IMPERSONATION', index=7, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_WINDOWS_BRIDGE', index=8, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_DEAUTH_BROADCAST_AP', index=9, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_DEAUTH_BROADCAST_STA', index=10, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_HT_GREENFIELD', index=11, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_HT_40MHZ_INTOLERANCE_AP', index=12, number=13,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_HT_40MHZ_INTOLERANCE_STA', index=13, number=14,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_CLIENT_FLOOD', index=14, number=15,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_ADHOC_USING_VALID_SSID', index=15, number=16,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_AP_SPOOFING', index=16, number=17,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_INVALID_ADDRESSCOMBINATION', index=17, number=18,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_MALFORMED_HTIE', index=18, number=19,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_MALFORMED_ASSOC_REQ', index=19, number=20,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_OVERFLOW_IE', index=20, number=21,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_OVERFLOW_EAPOL_KEY', index=21, number=22,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_MALFORMED_LARGE_DURATION', index=22, number=23,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_MALFORMED_FRAME_WRONG_CHANNEL', index=23, number=24,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_MALFORMED_FRAME_AUTH', index=24, number=25,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_CTS_RATE_ANOMALY', index=25, number=26,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_RTS_RATE_ANOMALY', index=26, number=27,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_DEAUTH_BROADCAST', index=27, number=28,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_DEASSOCIATION_BROADCAST', index=28, number=29,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_RATE_ANOMALIES_BY_AP', index=29, number=30,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_RATE_ANOMALIES_BY_STA', index=30, number=31,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_EAP_RATE_ANOMALY', index=31, number=32,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_DISCONNECT_STA', index=32, number=33,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_ASLEAP_FROM_AP', index=33, number=34,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_ASLEAP_FROM_STA', index=34, number=35,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_AIRJACK_FROM_AP', index=35, number=36,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SIGNATURE_AIRJACK_FROM_STA', index=36, number=37,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_STATION_DISCONNECT_ATTACK_AP', index=37, number=38,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_UNENCRYPTED_VALID', index=38, number=39,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_OMERTA_ATTACK', index=39, number=40,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_TKIP_REPLAY_ATTACK', index=40, number=41,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_CHOPCHOP_ATTACK', index=41, number=42,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_FATAJACK', index=42, number=43,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_VALID_CLIENT_MISASSOCIATION', index=43, number=44,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_BLOCK_ACK_ATTACK', index=44, number=45,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_HOTSPOTTER_ATTACK', index=45, number=46,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DETECT_POWER_SAVE_DOS_ATTACK', index=46, number=47,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=11027,
serialized_end=12488,
)
_sym_db.RegisterEnumDescriptor(_WIDSEVENT_ATTACKTYPE)
_AIRMONITORROGUEINFO_WMS_RAP_MATCH_TYPE = _descriptor.EnumDescriptor(
name='wms_rap_match_type',
full_name='Monitoring.AirMonitorRogueInfo.wms_rap_match_type',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='RAP_MT_NONE', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_CFG_WM', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_ETH_WM', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_AP_WM', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_EXT_WM', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_MANUAL', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_BASE_BSSID', index=6, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_EMS', index=7, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_ETH_GW_WM', index=8, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_CLASS_OFF', index=9, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_AP_BSSID', index=10, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_PROP_ETH_WM', index=11, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_AP_RULE', index=12, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_SYSTEM_WM', index=13, number=13,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_MT_SYSTEM_GW_WM', index=14, number=14,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=12773,
serialized_end=13099,
)
_sym_db.RegisterEnumDescriptor(_AIRMONITORROGUEINFO_WMS_RAP_MATCH_TYPE)
_AIRMONITORROGUEINFO_WMS_RAP_NAT_MATCH_TYPE = _descriptor.EnumDescriptor(
name='wms_rap_nat_match_type',
full_name='Monitoring.AirMonitorRogueInfo.wms_rap_nat_match_type',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='RAP_NMT_NONE', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_NMT_EQUAL', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_NMT_PLUS_ONE', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_NMT_MINUS_ONE', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RAP_NMT_OUI', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=13101,
serialized_end=13224,
)
_sym_db.RegisterEnumDescriptor(_AIRMONITORROGUEINFO_WMS_RAP_NAT_MATCH_TYPE)
_ROGUEEVENT_WMS_SNMP_ENCR_PROTOCOL = _descriptor.EnumDescriptor(
name='wms_snmp_encr_protocol',
full_name='Monitoring.RogueEvent.wms_snmp_encr_protocol',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='WMS_SNMP_WPA_ENCR_OPEN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WMS_SNMP_WPA_ENCR_WEP', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WMS_SNMP_WPA_ENCR_WPA', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WMS_SNMP_WPA_ENCR_WPA2', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=13496,
serialized_end=13630,
)
_sym_db.RegisterEnumDescriptor(_ROGUEEVENT_WMS_SNMP_ENCR_PROTOCOL)
_NOTIFICATION_SEVERITY = _descriptor.EnumDescriptor(
name='Severity',
full_name='Monitoring.Notification.Severity',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='Normal', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='Warning', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='Minor', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='Major', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='Critical', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=16584,
serialized_end=16655,
)
_sym_db.RegisterEnumDescriptor(_NOTIFICATION_SEVERITY)
_NOTIFICATION_NOTIFICATIONSTATE = _descriptor.EnumDescriptor(
name='NotificationState',
full_name='Monitoring.Notification.NotificationState',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='Open', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='Close', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=16657,
serialized_end=16697,
)
_sym_db.RegisterEnumDescriptor(_NOTIFICATION_NOTIFICATIONSTATE)
_SWITCHVLAN_VLANSTATUS = _descriptor.EnumDescriptor(
name='VlanStatus',
full_name='Monitoring.SwitchVlan.VlanStatus',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UP', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOWN', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=17254,
serialized_end=17284,
)
_sym_db.RegisterEnumDescriptor(_SWITCHVLAN_VLANSTATUS)
_SWITCHVLAN_VLANTYPE = _descriptor.EnumDescriptor(
name='VlanType',
full_name='Monitoring.SwitchVlan.VlanType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='STATIC', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DYNAMIC', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INTERNAL', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DEFAULT', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=17286,
serialized_end=17348,
)
_sym_db.RegisterEnumDescriptor(_SWITCHVLAN_VLANTYPE)
_VSXSTATE_DEVICEROLE = _descriptor.EnumDescriptor(
name='DeviceRole',
full_name='Monitoring.VSXState.DeviceRole',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='PRIMARY', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SECONDARY', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=18504,
serialized_end=18544,
)
_sym_db.RegisterEnumDescriptor(_VSXSTATE_DEVICEROLE)
_VSXSTATE_ISLPDEVICESTATE = _descriptor.EnumDescriptor(
name='ISLPDeviceState',
full_name='Monitoring.VSXState.ISLPDeviceState',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='WAITING_FOR_PEER', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PEER_ESTABLISHED', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPLIT_SYSTEM_PRIMARY', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SPLIT_SYSTEM_SECONDARY', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SYNC_PRIMARY', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SYNC_SECONDARY', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SYNC_SECONDARY_LINKUP_DELAY', index=6, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=18547,
serialized_end=18733,
)
_sym_db.RegisterEnumDescriptor(_VSXSTATE_ISLPDEVICESTATE)
_VSXSTATE_ISLSTATE = _descriptor.EnumDescriptor(
name='ISLState',
full_name='Monitoring.VSXState.ISLState',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='IN_SYNC', index=0, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DISABLED', index=1, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SW_IMAGE_VERSION_MISMATCH_ERROR', index=2, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CONFLICTING_OR_MISSING_DEV_ROLES', index=3, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PEER_DB_CONNECTION_ERROR', index=4, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CONFIGURATION_SYNC_CONFLICT', index=5, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CONFIGURATION_SYNC_MISSING_REFERENCE', index=6, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PEER_REACHABLE', index=7, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PEER_UNREACHABLE', index=8, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='OPERATIONAL', index=9, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INTER_SWITCH_LINK_MGMT_INIT', index=10, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CONFLICTING_OR_MISSING_DEVICE_ROLES', index=11, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INTER_SWITCH_LINK_DOWN', index=12, number=13,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='INTERNAL_ERROR', index=13, number=14,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=18736,
serialized_end=19134,
)
_sym_db.RegisterEnumDescriptor(_VSXSTATE_ISLSTATE)
_IPADDRESS = _descriptor.Descriptor(
name='IpAddress',
full_name='Monitoring.IpAddress',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='af', full_name='Monitoring.IpAddress.af', index=0,
number=1, type=14, cpp_type=8, label=2,
has_default_value=False, default_value=2,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='addr', full_name='Monitoring.IpAddress.addr', index=1,
number=2, type=12, cpp_type=9, label=2,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_IPADDRESS_ADDR_FAMILY,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=33,
serialized_end=165,
)
_MACADDRESS = _descriptor.Descriptor(
name='MacAddress',
full_name='Monitoring.MacAddress',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='addr', full_name='Monitoring.MacAddress.addr', index=0,
number=1, type=12, cpp_type=9, label=2,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=167,
serialized_end=193,
)
_SWARM = _descriptor.Descriptor(
name='Swarm',
full_name='Monitoring.Swarm',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Swarm.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='swarm_id', full_name='Monitoring.Swarm.swarm_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.Swarm.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.Swarm.status', index=3,
number=4, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='public_ip_address', full_name='Monitoring.Swarm.public_ip_address', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_address', full_name='Monitoring.Swarm.ip_address', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='firmware_version', full_name='Monitoring.Swarm.firmware_version', index=6,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=196,
serialized_end=438,
)
_TUNNEL = _descriptor.Descriptor(
name='Tunnel',
full_name='Monitoring.Tunnel',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Tunnel.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='swarm_id', full_name='Monitoring.Tunnel.swarm_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='index', full_name='Monitoring.Tunnel.index', index=2,
number=3, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='crypto_type', full_name='Monitoring.Tunnel.crypto_type', index=3,
number=4, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='peer_name', full_name='Monitoring.Tunnel.peer_name', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='peer_tun_ip', full_name='Monitoring.Tunnel.peer_tun_ip', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_ip', full_name='Monitoring.Tunnel.tunnel_ip', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.Tunnel.status', index=7,
number=8, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='active', full_name='Monitoring.Tunnel.active', index=8,
number=9, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uptime', full_name='Monitoring.Tunnel.uptime', index=9,
number=10, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_id', full_name='Monitoring.Tunnel.tunnel_id', index=10,
number=11, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=441,
serialized_end=792,
)
_INTERFACE = _descriptor.Descriptor(
name='Interface',
full_name='Monitoring.Interface',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Interface.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.Interface.device_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.Interface.macaddr', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.Interface.status', index=3,
number=4, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipaddr', full_name='Monitoring.Interface.ipaddr', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='duplex_mode', full_name='Monitoring.Interface.duplex_mode', index=5,
number=6, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.Interface.name', index=6,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='port_number', full_name='Monitoring.Interface.port_number', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='type', full_name='Monitoring.Interface.type', index=8,
number=9, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mode', full_name='Monitoring.Interface.mode', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan', full_name='Monitoring.Interface.vlan', index=10,
number=11, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='has_poe', full_name='Monitoring.Interface.has_poe', index=11,
number=12, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='poe_state', full_name='Monitoring.Interface.poe_state', index=12,
number=13, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='oper_state', full_name='Monitoring.Interface.oper_state', index=13,
number=14, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='admin_state', full_name='Monitoring.Interface.admin_state', index=14,
number=15, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='speed', full_name='Monitoring.Interface.speed', index=15,
number=16, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mux', full_name='Monitoring.Interface.mux', index=16,
number=17, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='trusted', full_name='Monitoring.Interface.trusted', index=17,
number=18, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='slot', full_name='Monitoring.Interface.slot', index=18,
number=19, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='phy_type', full_name='Monitoring.Interface.phy_type', index=19,
number=20, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='sub_type', full_name='Monitoring.Interface.sub_type', index=20,
number=21, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='allowed_vlan', full_name='Monitoring.Interface.allowed_vlan', index=21,
number=22, type=13, cpp_type=3, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='native_vlan', full_name='Monitoring.Interface.native_vlan', index=22,
number=23, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vsx_enabled', full_name='Monitoring.Interface.vsx_enabled', index=23,
number=24, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='state_down_reason', full_name='Monitoring.Interface.state_down_reason', index=24,
number=25, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan_mode', full_name='Monitoring.Interface.vlan_mode', index=25,
number=26, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_INTERFACE_DUPLEX,
_INTERFACE_INTFTYPE,
_INTERFACE_SPEEDTYPE,
_INTERFACE_PORTTYPE,
_INTERFACE_POESUPPORT,
_INTERFACE_STATEDOWNREASON,
_INTERFACE_VLANMODES,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=795,
serialized_end=2652,
)
_VAPINFO = _descriptor.Descriptor(
name='VapInfo',
full_name='Monitoring.VapInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.VapInfo.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.VapInfo.device_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='radio_mac', full_name='Monitoring.VapInfo.radio_mac', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='essid', full_name='Monitoring.VapInfo.essid', index=3,
number=4, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ap_mac', full_name='Monitoring.VapInfo.ap_mac', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='bssid', full_name='Monitoring.VapInfo.bssid', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=2655,
serialized_end=2864,
)
_RADIO = _descriptor.Descriptor(
name='Radio',
full_name='Monitoring.Radio',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Radio.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.Radio.device_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='index', full_name='Monitoring.Radio.index', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.Radio.macaddr', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.Radio.status', index=4,
number=5, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='channel', full_name='Monitoring.Radio.channel', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='band', full_name='Monitoring.Radio.band', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='channel_width', full_name='Monitoring.Radio.channel_width', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ap_mac', full_name='Monitoring.Radio.ap_mac', index=8,
number=9, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=2867,
serialized_end=3127,
)
_AP = _descriptor.Descriptor(
name='Ap',
full_name='Monitoring.Ap',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Ap.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='serial', full_name='Monitoring.Ap.serial', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.Ap.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.Ap.macaddr', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='cluster_id', full_name='Monitoring.Ap.cluster_id', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.Ap.status', index=5,
number=6, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_address', full_name='Monitoring.Ap.ip_address', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='model', full_name='Monitoring.Ap.model', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mesh_role', full_name='Monitoring.Ap.mesh_role', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mode', full_name='Monitoring.Ap.mode', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='swarm_master', full_name='Monitoring.Ap.swarm_master', index=10,
number=11, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='modem_connected', full_name='Monitoring.Ap.modem_connected', index=11,
number=12, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplink_type', full_name='Monitoring.Ap.uplink_type', index=12,
number=13, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='firmware_version', full_name='Monitoring.Ap.firmware_version', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_AP_UPLINKTYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=3130,
serialized_end=3583,
)
_NETWORK = _descriptor.Descriptor(
name='Network',
full_name='Monitoring.Network',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Network.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='swarm_id', full_name='Monitoring.Network.swarm_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='essid', full_name='Monitoring.Network.essid', index=2,
number=3, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='security', full_name='Monitoring.Network.security', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='type', full_name='Monitoring.Network.type', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=3585,
serialized_end=3703,
)
_WIRELESSCLIENT = _descriptor.Descriptor(
name='WirelessClient',
full_name='Monitoring.WirelessClient',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.WirelessClient.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.WirelessClient.macaddr', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.WirelessClient.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_address', full_name='Monitoring.WirelessClient.ip_address', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='username', full_name='Monitoring.WirelessClient.username', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='associated_device', full_name='Monitoring.WirelessClient.associated_device', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='radio_mac', full_name='Monitoring.WirelessClient.radio_mac', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='network', full_name='Monitoring.WirelessClient.network', index=7,
number=8, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_role', full_name='Monitoring.WirelessClient.user_role', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='manufacturer', full_name='Monitoring.WirelessClient.manufacturer', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='os_type', full_name='Monitoring.WirelessClient.os_type', index=10,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='connection', full_name='Monitoring.WirelessClient.connection', index=11,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='maxspeed', full_name='Monitoring.WirelessClient.maxspeed', index=12,
number=13, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan', full_name='Monitoring.WirelessClient.vlan', index=13,
number=14, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=3706,
serialized_end=4079,
)
_HARDWAREMODULE = _descriptor.Descriptor(
name='HardwareModule',
full_name='Monitoring.HardwareModule',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='index', full_name='Monitoring.HardwareModule.index', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.HardwareModule.status', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_HARDWAREMODULE_HARDWARESTATUS,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=4082,
serialized_end=4266,
)
_SWITCH = _descriptor.Descriptor(
name='Switch',
full_name='Monitoring.Switch',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Switch.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='serial', full_name='Monitoring.Switch.serial', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.Switch.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.Switch.macaddr', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='model', full_name='Monitoring.Switch.model', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.Switch.status', index=5,
number=6, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='public_ip_address', full_name='Monitoring.Switch.public_ip_address', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_address', full_name='Monitoring.Switch.ip_address', index=7,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='firmware_version', full_name='Monitoring.Switch.firmware_version', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='default_gateway', full_name='Monitoring.Switch.default_gateway', index=9,
number=10, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_mode', full_name='Monitoring.Switch.device_mode', index=10,
number=11, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplink_ports', full_name='Monitoring.Switch.uplink_ports', index=11,
number=12, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='max_slots', full_name='Monitoring.Switch.max_slots', index=12,
number=13, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='used_slots', full_name='Monitoring.Switch.used_slots', index=13,
number=14, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='management_modules', full_name='Monitoring.Switch.management_modules', index=14,
number=15, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='power_supplies', full_name='Monitoring.Switch.power_supplies', index=15,
number=16, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='stack_id', full_name='Monitoring.Switch.stack_id', index=16,
number=17, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='stack_member_id', full_name='Monitoring.Switch.stack_member_id', index=17,
number=18, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='stack_member_role', full_name='Monitoring.Switch.stack_member_role', index=18,
number=19, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='stack_macaddr', full_name='Monitoring.Switch.stack_macaddr', index=19,
number=20, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_SWITCH_STACKMEMBERROLE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=4269,
serialized_end=5029,
)
_SWITCHSTACK = _descriptor.Descriptor(
name='SwitchStack',
full_name='Monitoring.SwitchStack',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.SwitchStack.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='stack_id', full_name='Monitoring.SwitchStack.stack_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.SwitchStack.status', index=2,
number=3, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='topology', full_name='Monitoring.SwitchStack.topology', index=3,
number=4, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='policy', full_name='Monitoring.SwitchStack.policy', index=4,
number=5, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='firmware_version', full_name='Monitoring.SwitchStack.firmware_version', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vsf_domain_id', full_name='Monitoring.SwitchStack.vsf_domain_id', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_SWITCHSTACK_STACKTOPOLOGY,
_SWITCHSTACK_STACKPOLICY,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=5032,
serialized_end=5508,
)
_WIREDCLIENT = _descriptor.Descriptor(
name='WiredClient',
full_name='Monitoring.WiredClient',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.WiredClient.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.WiredClient.macaddr', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.WiredClient.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_address', full_name='Monitoring.WiredClient.ip_address', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='username', full_name='Monitoring.WiredClient.username', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='associated_device', full_name='Monitoring.WiredClient.associated_device', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='interface_mac', full_name='Monitoring.WiredClient.interface_mac', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_role', full_name='Monitoring.WiredClient.user_role', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan', full_name='Monitoring.WiredClient.vlan', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='auth_type', full_name='Monitoring.WiredClient.auth_type', index=9,
number=10, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=5511,
serialized_end=5832,
)
_MOBILITYCONTROLLER = _descriptor.Descriptor(
name='MobilityController',
full_name='Monitoring.MobilityController',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.MobilityController.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='serial', full_name='Monitoring.MobilityController.serial', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.MobilityController.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.MobilityController.macaddr', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='model', full_name='Monitoring.MobilityController.model', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.MobilityController.status', index=5,
number=6, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='public_ip_address', full_name='Monitoring.MobilityController.public_ip_address', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_address', full_name='Monitoring.MobilityController.ip_address', index=7,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='firmware_version', full_name='Monitoring.MobilityController.firmware_version', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='default_gateway', full_name='Monitoring.MobilityController.default_gateway', index=9,
number=10, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mode', full_name='Monitoring.MobilityController.mode', index=10,
number=11, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_MOBILITYCONTROLLER_CONTROLLERMODE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=5835,
serialized_end=6294,
)
_UPLINK = _descriptor.Descriptor(
name='Uplink',
full_name='Monitoring.Uplink',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Uplink.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.Uplink.device_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='link_index', full_name='Monitoring.Uplink.link_index', index=2,
number=3, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.Uplink.name', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='description', full_name='Monitoring.Uplink.description', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='priority', full_name='Monitoring.Uplink.priority', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.Uplink.status', index=6,
number=7, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wan_status', full_name='Monitoring.Uplink.wan_status', index=7,
number=8, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan', full_name='Monitoring.Uplink.vlan', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan_description', full_name='Monitoring.Uplink.vlan_description', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='public_ip_address', full_name='Monitoring.Uplink.public_ip_address', index=10,
number=11, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='private_ip_address', full_name='Monitoring.Uplink.private_ip_address', index=11,
number=12, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=6297,
serialized_end=6658,
)
_IKETUNNEL = _descriptor.Descriptor(
name='IkeTunnel',
full_name='Monitoring.IkeTunnel',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.IkeTunnel.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.IkeTunnel.device_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='map_id', full_name='Monitoring.IkeTunnel.map_id', index=2,
number=3, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='peer_mac', full_name='Monitoring.IkeTunnel.peer_mac', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='local_mac', full_name='Monitoring.IkeTunnel.local_mac', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='src_ip', full_name='Monitoring.IkeTunnel.src_ip', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='dst_ip', full_name='Monitoring.IkeTunnel.dst_ip', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.IkeTunnel.status', index=7,
number=8, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='map_name', full_name='Monitoring.IkeTunnel.map_name', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=6661,
serialized_end=6968,
)
_DEVICESTATS = _descriptor.Descriptor(
name='DeviceStats',
full_name='Monitoring.DeviceStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.DeviceStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.DeviceStats.timestamp', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uptime', full_name='Monitoring.DeviceStats.uptime', index=2,
number=3, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='cpu_utilization', full_name='Monitoring.DeviceStats.cpu_utilization', index=3,
number=4, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mem_total', full_name='Monitoring.DeviceStats.mem_total', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mem_free', full_name='Monitoring.DeviceStats.mem_free', index=5,
number=6, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='power_consumption', full_name='Monitoring.DeviceStats.power_consumption', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='fan_speed', full_name='Monitoring.DeviceStats.fan_speed', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='temperature', full_name='Monitoring.DeviceStats.temperature', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='fan_status', full_name='Monitoring.DeviceStats.fan_status', index=9,
number=10, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='max_power', full_name='Monitoring.DeviceStats.max_power', index=10,
number=11, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='poe_consumption', full_name='Monitoring.DeviceStats.poe_consumption', index=11,
number=12, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='poe_budget', full_name='Monitoring.DeviceStats.poe_budget', index=12,
number=13, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mem_utilization', full_name='Monitoring.DeviceStats.mem_utilization', index=13,
number=14, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=6971,
serialized_end=7296,
)
_RADIOSTATS = _descriptor.Descriptor(
name='RadioStats',
full_name='Monitoring.RadioStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.RadioStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.RadioStats.macaddr', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.RadioStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.RadioStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.RadioStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_drops', full_name='Monitoring.RadioStats.tx_drops', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_power', full_name='Monitoring.RadioStats.tx_power', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='noise_floor', full_name='Monitoring.RadioStats.noise_floor', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='utilization', full_name='Monitoring.RadioStats.utilization', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bad', full_name='Monitoring.RadioStats.rx_bad', index=9,
number=10, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=7299,
serialized_end=7520,
)
_VAPSTATS = _descriptor.Descriptor(
name='VapStats',
full_name='Monitoring.VapStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.VapStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='radio_mac', full_name='Monitoring.VapStats.radio_mac', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='network', full_name='Monitoring.VapStats.network', index=2,
number=3, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.VapStats.timestamp', index=3,
number=4, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.VapStats.tx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.VapStats.rx_bytes', index=5,
number=6, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=7523,
serialized_end=7667,
)
_TUNNELSTATS = _descriptor.Descriptor(
name='TunnelStats',
full_name='Monitoring.TunnelStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='swarm_id', full_name='Monitoring.TunnelStats.swarm_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='index', full_name='Monitoring.TunnelStats.index', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.TunnelStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.TunnelStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.TunnelStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_id', full_name='Monitoring.TunnelStats.tunnel_id', index=5,
number=6, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_name', full_name='Monitoring.TunnelStats.tunnel_name', index=6,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=7670,
serialized_end=7836,
)
_CLIENTSTATS = _descriptor.Descriptor(
name='ClientStats',
full_name='Monitoring.ClientStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.ClientStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.ClientStats.macaddr', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.ClientStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.ClientStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.ClientStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_retries', full_name='Monitoring.ClientStats.rx_retries', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_retries', full_name='Monitoring.ClientStats.tx_retries', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='speed', full_name='Monitoring.ClientStats.speed', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_in_db', full_name='Monitoring.ClientStats.signal_in_db', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='snr', full_name='Monitoring.ClientStats.snr', index=9,
number=10, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=7839,
serialized_end=8057,
)
_INTERFACESTATS = _descriptor.Descriptor(
name='InterfaceStats',
full_name='Monitoring.InterfaceStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.InterfaceStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.InterfaceStats.macaddr', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.InterfaceStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.InterfaceStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.InterfaceStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='power_consumption', full_name='Monitoring.InterfaceStats.power_consumption', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_errors', full_name='Monitoring.InterfaceStats.in_errors', index=6,
number=7, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_errors', full_name='Monitoring.InterfaceStats.out_errors', index=7,
number=8, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_discards', full_name='Monitoring.InterfaceStats.in_discards', index=8,
number=9, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_discards', full_name='Monitoring.InterfaceStats.out_discards', index=9,
number=10, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_packets', full_name='Monitoring.InterfaceStats.in_packets', index=10,
number=11, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_packets', full_name='Monitoring.InterfaceStats.out_packets', index=11,
number=12, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_other_err', full_name='Monitoring.InterfaceStats.in_other_err', index=12,
number=13, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_multicast_pkt', full_name='Monitoring.InterfaceStats.in_multicast_pkt', index=13,
number=14, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_broadcast_pkt', full_name='Monitoring.InterfaceStats.in_broadcast_pkt', index=14,
number=15, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_unicast_pkt', full_name='Monitoring.InterfaceStats.in_unicast_pkt', index=15,
number=16, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_multicast_pkt', full_name='Monitoring.InterfaceStats.out_multicast_pkt', index=16,
number=17, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_broadcast_pkt', full_name='Monitoring.InterfaceStats.out_broadcast_pkt', index=17,
number=18, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_unicast_pkt', full_name='Monitoring.InterfaceStats.out_unicast_pkt', index=18,
number=19, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_fcs', full_name='Monitoring.InterfaceStats.in_fcs', index=19,
number=20, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_alignment', full_name='Monitoring.InterfaceStats.in_alignment', index=20,
number=21, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_excessive_collision', full_name='Monitoring.InterfaceStats.out_excessive_collision', index=21,
number=22, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_jabbers', full_name='Monitoring.InterfaceStats.in_jabbers', index=22,
number=23, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_fragmented', full_name='Monitoring.InterfaceStats.in_fragmented', index=23,
number=24, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_giant', full_name='Monitoring.InterfaceStats.in_giant', index=24,
number=25, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='in_runt', full_name='Monitoring.InterfaceStats.in_runt', index=25,
number=26, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_collision', full_name='Monitoring.InterfaceStats.out_collision', index=26,
number=27, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_late_collision', full_name='Monitoring.InterfaceStats.out_late_collision', index=27,
number=28, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='out_deferred', full_name='Monitoring.InterfaceStats.out_deferred', index=28,
number=29, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=8060,
serialized_end=8740,
)
_UPLINKSTATS = _descriptor.Descriptor(
name='UplinkStats',
full_name='Monitoring.UplinkStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.UplinkStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='link_id', full_name='Monitoring.UplinkStats.link_id', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.UplinkStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.UplinkStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.UplinkStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_tx_bytes', full_name='Monitoring.UplinkStats.tunnel_tx_bytes', index=5,
number=6, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_rx_bytes', full_name='Monitoring.UplinkStats.tunnel_rx_bytes', index=6,
number=7, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='map_id', full_name='Monitoring.UplinkStats.map_id', index=7,
number=8, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='map_name', full_name='Monitoring.UplinkStats.map_name', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=8743,
serialized_end=8931,
)
_UPLINKWANSTATS = _descriptor.Descriptor(
name='UplinkWanStats',
full_name='Monitoring.UplinkWanStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.UplinkWanStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='link_id', full_name='Monitoring.UplinkWanStats.link_id', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.UplinkWanStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='compressed_bytes', full_name='Monitoring.UplinkWanStats.compressed_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uncompressed_bytes', full_name='Monitoring.UplinkWanStats.uncompressed_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='savings_bytes', full_name='Monitoring.UplinkWanStats.savings_bytes', index=5,
number=6, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=8934,
serialized_end=9082,
)
_MODEMSTATS = _descriptor.Descriptor(
name='ModemStats',
full_name='Monitoring.ModemStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.ModemStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.ModemStats.timestamp', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.ModemStats.tx_bytes', index=2,
number=3, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.ModemStats.rx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=9084,
serialized_end=9170,
)
_ROLESTATS = _descriptor.Descriptor(
name='RoleStats',
full_name='Monitoring.RoleStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.RoleStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_role', full_name='Monitoring.RoleStats.user_role', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.RoleStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.RoleStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.RoleStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=9172,
serialized_end=9276,
)
_VLANSTATS = _descriptor.Descriptor(
name='VlanStats',
full_name='Monitoring.VlanStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.VlanStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan', full_name='Monitoring.VlanStats.vlan', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.VlanStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.VlanStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.VlanStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=9278,
serialized_end=9377,
)
_SSIDSTATS = _descriptor.Descriptor(
name='SsidStats',
full_name='Monitoring.SsidStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.SsidStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='essid', full_name='Monitoring.SsidStats.essid', index=1,
number=2, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.SsidStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tx_bytes', full_name='Monitoring.SsidStats.tx_bytes', index=3,
number=4, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rx_bytes', full_name='Monitoring.SsidStats.rx_bytes', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=9379,
serialized_end=9479,
)
_TUNNELIPPROBESTATS = _descriptor.Descriptor(
name='TunnelIpProbeStats',
full_name='Monitoring.TunnelIpProbeStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.TunnelIpProbeStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_index', full_name='Monitoring.TunnelIpProbeStats.tunnel_index', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='probe_ip_addr', full_name='Monitoring.TunnelIpProbeStats.probe_ip_addr', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='probe_status', full_name='Monitoring.TunnelIpProbeStats.probe_status', index=3,
number=4, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_probe_pkt_loss_pct', full_name='Monitoring.TunnelIpProbeStats.ip_probe_pkt_loss_pct', index=4,
number=5, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_name', full_name='Monitoring.TunnelIpProbeStats.tunnel_name', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_id', full_name='Monitoring.TunnelIpProbeStats.tunnel_id', index=6,
number=17, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=9482,
serialized_end=9707,
)
_UPLINKIPPROBESTATS = _descriptor.Descriptor(
name='UplinkIpProbeStats',
full_name='Monitoring.UplinkIpProbeStats',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.UplinkIpProbeStats.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='link_id', full_name='Monitoring.UplinkIpProbeStats.link_id', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.UplinkIpProbeStats.timestamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_address', full_name='Monitoring.UplinkIpProbeStats.ip_address', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan', full_name='Monitoring.UplinkIpProbeStats.vlan', index=4,
number=5, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='avg_rtt', full_name='Monitoring.UplinkIpProbeStats.avg_rtt', index=5,
number=6, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='max_rtt', full_name='Monitoring.UplinkIpProbeStats.max_rtt', index=6,
number=7, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='min_rtt', full_name='Monitoring.UplinkIpProbeStats.min_rtt', index=7,
number=8, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='avg_jitter', full_name='Monitoring.UplinkIpProbeStats.avg_jitter', index=8,
number=9, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='max_jitter', full_name='Monitoring.UplinkIpProbeStats.max_jitter', index=9,
number=10, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='min_jitter', full_name='Monitoring.UplinkIpProbeStats.min_jitter', index=10,
number=11, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mos_quality', full_name='Monitoring.UplinkIpProbeStats.mos_quality', index=11,
number=12, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='sd_avg_latency', full_name='Monitoring.UplinkIpProbeStats.sd_avg_latency', index=12,
number=13, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ds_avg_latency', full_name='Monitoring.UplinkIpProbeStats.ds_avg_latency', index=13,
number=14, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='sd_avg_jitter', full_name='Monitoring.UplinkIpProbeStats.sd_avg_jitter', index=14,
number=15, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ds_avg_jitter', full_name='Monitoring.UplinkIpProbeStats.ds_avg_jitter', index=15,
number=16, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='probe_status', full_name='Monitoring.UplinkIpProbeStats.probe_status', index=16,
number=17, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='loss_pct', full_name='Monitoring.UplinkIpProbeStats.loss_pct', index=17,
number=18, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vpnc_ip_addr', full_name='Monitoring.UplinkIpProbeStats.vpnc_ip_addr', index=18,
number=19, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='probe_ip_addr', full_name='Monitoring.UplinkIpProbeStats.probe_ip_addr', index=19,
number=20, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='avg_rtt_float', full_name='Monitoring.UplinkIpProbeStats.avg_rtt_float', index=20,
number=21, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='max_rtt_float', full_name='Monitoring.UplinkIpProbeStats.max_rtt_float', index=21,
number=22, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='min_rtt_float', full_name='Monitoring.UplinkIpProbeStats.min_rtt_float', index=22,
number=23, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='avg_jitter_float', full_name='Monitoring.UplinkIpProbeStats.avg_jitter_float', index=23,
number=24, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='max_jitter_float', full_name='Monitoring.UplinkIpProbeStats.max_jitter_float', index=24,
number=25, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='min_jitter_float', full_name='Monitoring.UplinkIpProbeStats.min_jitter_float', index=25,
number=26, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mos_quality_float', full_name='Monitoring.UplinkIpProbeStats.mos_quality_float', index=26,
number=27, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='sd_avg_latency_float', full_name='Monitoring.UplinkIpProbeStats.sd_avg_latency_float', index=27,
number=28, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ds_avg_latency_float', full_name='Monitoring.UplinkIpProbeStats.ds_avg_latency_float', index=28,
number=29, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='sd_avg_jitter_float', full_name='Monitoring.UplinkIpProbeStats.sd_avg_jitter_float', index=29,
number=30, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ds_avg_jitter_float', full_name='Monitoring.UplinkIpProbeStats.ds_avg_jitter_float', index=30,
number=31, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=9710,
serialized_end=10445,
)
_UPLINKSPEEDTEST = _descriptor.Descriptor(
name='UplinkSpeedtest',
full_name='Monitoring.UplinkSpeedtest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.UplinkSpeedtest.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='server_ip', full_name='Monitoring.UplinkSpeedtest.server_ip', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan', full_name='Monitoring.UplinkSpeedtest.vlan', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='protocol', full_name='Monitoring.UplinkSpeedtest.protocol', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='upstream_bps', full_name='Monitoring.UplinkSpeedtest.upstream_bps', index=4,
number=5, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='downstream_bps', full_name='Monitoring.UplinkSpeedtest.downstream_bps', index=5,
number=6, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='time_secs', full_name='Monitoring.UplinkSpeedtest.time_secs', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='upstream_jitter', full_name='Monitoring.UplinkSpeedtest.upstream_jitter', index=7,
number=8, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='downstream_jitter', full_name='Monitoring.UplinkSpeedtest.downstream_jitter', index=8,
number=9, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=10448,
serialized_end=10675,
)
_WIDSEVENT = _descriptor.Descriptor(
name='WIDSEvent',
full_name='Monitoring.WIDSEvent',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.WIDSEvent.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='event_type', full_name='Monitoring.WIDSEvent.event_type', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.WIDSEvent.macaddr', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='detected_ap', full_name='Monitoring.WIDSEvent.detected_ap', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='attack_type', full_name='Monitoring.WIDSEvent.attack_type', index=4,
number=5, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='channel', full_name='Monitoring.WIDSEvent.channel', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='network', full_name='Monitoring.WIDSEvent.network', index=6,
number=7, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_WIDSEVENT_EVENTTYPE,
_WIDSEVENT_ATTACKTYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=10678,
serialized_end=12488,
)
_AIRMONITORROGUEINFO = _descriptor.Descriptor(
name='AirMonitorRogueInfo',
full_name='Monitoring.AirMonitorRogueInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='match_type', full_name='Monitoring.AirMonitorRogueInfo.match_type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='match_mac', full_name='Monitoring.AirMonitorRogueInfo.match_mac', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='match_ip', full_name='Monitoring.AirMonitorRogueInfo.match_ip', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='monitor_name', full_name='Monitoring.AirMonitorRogueInfo.monitor_name', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='nat_match_type', full_name='Monitoring.AirMonitorRogueInfo.nat_match_type', index=4,
number=5, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_AIRMONITORROGUEINFO_WMS_RAP_MATCH_TYPE,
_AIRMONITORROGUEINFO_WMS_RAP_NAT_MATCH_TYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=12491,
serialized_end=13224,
)
_ROGUEEVENT = _descriptor.Descriptor(
name='RogueEvent',
full_name='Monitoring.RogueEvent',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.RogueEvent.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='detected_ap', full_name='Monitoring.RogueEvent.detected_ap', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='macaddr', full_name='Monitoring.RogueEvent.macaddr', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='channel', full_name='Monitoring.RogueEvent.channel', index=3,
number=4, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='network', full_name='Monitoring.RogueEvent.network', index=4,
number=5, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='encr_type', full_name='Monitoring.RogueEvent.encr_type', index=5,
number=6, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='am_rogue', full_name='Monitoring.RogueEvent.am_rogue', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_ROGUEEVENT_WMS_SNMP_ENCR_PROTOCOL,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=13227,
serialized_end=13630,
)
_DEVICENEIGHBOURS = _descriptor.Descriptor(
name='DeviceNeighbours',
full_name='Monitoring.DeviceNeighbours',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.DeviceNeighbours.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.DeviceNeighbours.device_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='port', full_name='Monitoring.DeviceNeighbours.port', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='remote_device_id', full_name='Monitoring.DeviceNeighbours.remote_device_id', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='remote_port', full_name='Monitoring.DeviceNeighbours.remote_port', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='remote_port_number', full_name='Monitoring.DeviceNeighbours.remote_port_number', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan_id', full_name='Monitoring.DeviceNeighbours.vlan_id', index=6,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=13633,
serialized_end=13820,
)
_MONITORINGINFORMATION = _descriptor.Descriptor(
name='MonitoringInformation',
full_name='Monitoring.MonitoringInformation',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='customer_id', full_name='Monitoring.MonitoringInformation.customer_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='data_elements', full_name='Monitoring.MonitoringInformation.data_elements', index=1,
number=2, type=14, cpp_type=8, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='swarms', full_name='Monitoring.MonitoringInformation.swarms', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='aps', full_name='Monitoring.MonitoringInformation.aps', index=3,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='networks', full_name='Monitoring.MonitoringInformation.networks', index=4,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='radios', full_name='Monitoring.MonitoringInformation.radios', index=5,
number=6, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vaps', full_name='Monitoring.MonitoringInformation.vaps', index=6,
number=7, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='interfaces', full_name='Monitoring.MonitoringInformation.interfaces', index=7,
number=8, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnels', full_name='Monitoring.MonitoringInformation.tunnels', index=8,
number=9, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wireless_clients', full_name='Monitoring.MonitoringInformation.wireless_clients', index=9,
number=10, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='switches', full_name='Monitoring.MonitoringInformation.switches', index=10,
number=11, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wired_clients', full_name='Monitoring.MonitoringInformation.wired_clients', index=11,
number=12, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_stats', full_name='Monitoring.MonitoringInformation.device_stats', index=12,
number=13, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='radio_stats', full_name='Monitoring.MonitoringInformation.radio_stats', index=13,
number=14, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='interface_stats', full_name='Monitoring.MonitoringInformation.interface_stats', index=14,
number=15, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vap_stats', full_name='Monitoring.MonitoringInformation.vap_stats', index=15,
number=16, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='client_stats', full_name='Monitoring.MonitoringInformation.client_stats', index=16,
number=17, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnel_stats', full_name='Monitoring.MonitoringInformation.tunnel_stats', index=17,
number=18, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wids_events', full_name='Monitoring.MonitoringInformation.wids_events', index=18,
number=19, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='modem_stats', full_name='Monitoring.MonitoringInformation.modem_stats', index=19,
number=20, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='role_stats', full_name='Monitoring.MonitoringInformation.role_stats', index=20,
number=21, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan_stats', full_name='Monitoring.MonitoringInformation.vlan_stats', index=21,
number=22, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ssid_stats', full_name='Monitoring.MonitoringInformation.ssid_stats', index=22,
number=23, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipprobe_stats', full_name='Monitoring.MonitoringInformation.ipprobe_stats', index=23,
number=24, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rogue_events', full_name='Monitoring.MonitoringInformation.rogue_events', index=24,
number=25, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mobility_controllers', full_name='Monitoring.MonitoringInformation.mobility_controllers', index=25,
number=26, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplinks', full_name='Monitoring.MonitoringInformation.uplinks', index=26,
number=27, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplink_stats', full_name='Monitoring.MonitoringInformation.uplink_stats', index=27,
number=28, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplink_wan_stats', full_name='Monitoring.MonitoringInformation.uplink_wan_stats', index=28,
number=29, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplink_probe_stats', full_name='Monitoring.MonitoringInformation.uplink_probe_stats', index=29,
number=30, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplink_speedtest', full_name='Monitoring.MonitoringInformation.uplink_speedtest', index=30,
number=31, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_neighbours', full_name='Monitoring.MonitoringInformation.device_neighbours', index=31,
number=32, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='notification', full_name='Monitoring.MonitoringInformation.notification', index=32,
number=33, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='switch_stacks', full_name='Monitoring.MonitoringInformation.switch_stacks', index=33,
number=34, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ike_tunnels', full_name='Monitoring.MonitoringInformation.ike_tunnels', index=34,
number=35, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='switch_vlan_info', full_name='Monitoring.MonitoringInformation.switch_vlan_info', index=35,
number=36, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlans', full_name='Monitoring.MonitoringInformation.vlans', index=36,
number=37, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vsx', full_name='Monitoring.MonitoringInformation.vsx', index=37,
number=38, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.MonitoringInformation.timestamp', index=38,
number=39, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=13823,
serialized_end=15560,
)
_MONITORINGSTATEINFORMATION = _descriptor.Descriptor(
name='MonitoringStateInformation',
full_name='Monitoring.MonitoringStateInformation',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='customer_id', full_name='Monitoring.MonitoringStateInformation.customer_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mobility_controllers', full_name='Monitoring.MonitoringStateInformation.mobility_controllers', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='switches', full_name='Monitoring.MonitoringStateInformation.switches', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='swarms', full_name='Monitoring.MonitoringStateInformation.swarms', index=3,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='aps', full_name='Monitoring.MonitoringStateInformation.aps', index=4,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vaps', full_name='Monitoring.MonitoringStateInformation.vaps', index=5,
number=6, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='radios', full_name='Monitoring.MonitoringStateInformation.radios', index=6,
number=7, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='interfaces', full_name='Monitoring.MonitoringStateInformation.interfaces', index=7,
number=8, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='networks', full_name='Monitoring.MonitoringStateInformation.networks', index=8,
number=9, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tunnels', full_name='Monitoring.MonitoringStateInformation.tunnels', index=9,
number=10, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wireless_clients', full_name='Monitoring.MonitoringStateInformation.wireless_clients', index=10,
number=11, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='wired_clients', full_name='Monitoring.MonitoringStateInformation.wired_clients', index=11,
number=12, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='uplinks', full_name='Monitoring.MonitoringStateInformation.uplinks', index=12,
number=13, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='switch_stacks', full_name='Monitoring.MonitoringStateInformation.switch_stacks', index=13,
number=14, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ike_tunnels', full_name='Monitoring.MonitoringStateInformation.ike_tunnels', index=14,
number=15, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='data_elements', full_name='Monitoring.MonitoringStateInformation.data_elements', index=15,
number=16, type=14, cpp_type=8, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.MonitoringStateInformation.timestamp', index=16,
number=17, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=15563,
serialized_end=16263,
)
_KEYVALUEDATA = _descriptor.Descriptor(
name='KeyValueData',
full_name='Monitoring.KeyValueData',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='Monitoring.KeyValueData.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='Monitoring.KeyValueData.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=16265,
serialized_end=16307,
)
_NOTIFICATION = _descriptor.Descriptor(
name='Notification',
full_name='Monitoring.Notification',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Monitoring.Notification.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='type', full_name='Monitoring.Notification.type', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='setting_id', full_name='Monitoring.Notification.setting_id', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.Notification.device_id', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='severity', full_name='Monitoring.Notification.severity', index=4,
number=5, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp', full_name='Monitoring.Notification.timestamp', index=5,
number=6, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='state', full_name='Monitoring.Notification.state', index=6,
number=7, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='description', full_name='Monitoring.Notification.description', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='extra', full_name='Monitoring.Notification.extra', index=8,
number=9, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_NOTIFICATION_SEVERITY,
_NOTIFICATION_NOTIFICATIONSTATE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=16310,
serialized_end=16697,
)
_SWITCHVLANINFO = _descriptor.Descriptor(
name='SwitchVlanInfo',
full_name='Monitoring.SwitchVlanInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.SwitchVlanInfo.device_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlans', full_name='Monitoring.SwitchVlanInfo.vlans', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=16699,
serialized_end=16773,
)
_SWITCHVLAN = _descriptor.Descriptor(
name='SwitchVlan',
full_name='Monitoring.SwitchVlan',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='Monitoring.SwitchVlan.id', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='Monitoring.SwitchVlan.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tagged_ports', full_name='Monitoring.SwitchVlan.tagged_ports', index=2,
number=3, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='untagged_ports', full_name='Monitoring.SwitchVlan.untagged_ports', index=3,
number=4, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='primary_vlan_id', full_name='Monitoring.SwitchVlan.primary_vlan_id', index=4,
number=5, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='primary_vlan_type', full_name='Monitoring.SwitchVlan.primary_vlan_type', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='promiscuous_ports', full_name='Monitoring.SwitchVlan.promiscuous_ports', index=6,
number=7, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='isl_ports', full_name='Monitoring.SwitchVlan.isl_ports', index=7,
number=8, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='is_management_vlan', full_name='Monitoring.SwitchVlan.is_management_vlan', index=8,
number=9, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='is_voice_enabled', full_name='Monitoring.SwitchVlan.is_voice_enabled', index=9,
number=10, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='is_jumbo_enabled', full_name='Monitoring.SwitchVlan.is_jumbo_enabled', index=10,
number=11, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='is_igmp_enabled', full_name='Monitoring.SwitchVlan.is_igmp_enabled', index=11,
number=12, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipaddress', full_name='Monitoring.SwitchVlan.ipaddress', index=12,
number=13, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='status', full_name='Monitoring.SwitchVlan.status', index=13,
number=14, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='oper_state_reason', full_name='Monitoring.SwitchVlan.oper_state_reason', index=14,
number=15, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='type', full_name='Monitoring.SwitchVlan.type', index=15,
number=16, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='access_ports', full_name='Monitoring.SwitchVlan.access_ports', index=16,
number=17, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_SWITCHVLAN_VLANSTATUS,
_SWITCHVLAN_VLANTYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=16776,
serialized_end=17348,
)
_VLAN = _descriptor.Descriptor(
name='Vlan',
full_name='Monitoring.Vlan',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.Vlan.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vlan_id', full_name='Monitoring.Vlan.vlan_id', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipv4', full_name='Monitoring.Vlan.ipv4', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipv6_ll', full_name='Monitoring.Vlan.ipv6_ll', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipv6_1', full_name='Monitoring.Vlan.ipv6_1', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipv6_2', full_name='Monitoring.Vlan.ipv6_2', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipv6_3', full_name='Monitoring.Vlan.ipv6_3', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='oper_state', full_name='Monitoring.Vlan.oper_state', index=7,
number=8, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='description', full_name='Monitoring.Vlan.description', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='admin_state', full_name='Monitoring.Vlan.admin_state', index=9,
number=10, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='addr_mode', full_name='Monitoring.Vlan.addr_mode', index=10,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=17351,
serialized_end=17733,
)
_VSXSTATE = _descriptor.Descriptor(
name='VSXState',
full_name='Monitoring.VSXState',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='action', full_name='Monitoring.VSXState.action', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=True, default_value=3,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id', full_name='Monitoring.VSXState.device_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='role', full_name='Monitoring.VSXState.role', index=2,
number=3, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='peer_role', full_name='Monitoring.VSXState.peer_role', index=3,
number=4, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='isl_port', full_name='Monitoring.VSXState.isl_port', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='peer_isl_port', full_name='Monitoring.VSXState.peer_isl_port', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='keepalive_peer_ip', full_name='Monitoring.VSXState.keepalive_peer_ip', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='keepalive_src_ip', full_name='Monitoring.VSXState.keepalive_src_ip', index=7,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='last_sync_timestamp', full_name='Monitoring.VSXState.last_sync_timestamp', index=8,
number=9, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mac', full_name='Monitoring.VSXState.mac', index=9,
number=10, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='peer_mac', full_name='Monitoring.VSXState.peer_mac', index=10,
number=11, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='config_sync_disable', full_name='Monitoring.VSXState.config_sync_disable', index=11,
number=12, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='islp_device_state_value', full_name='Monitoring.VSXState.islp_device_state_value', index=12,
number=13, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='config_sync_state_value', full_name='Monitoring.VSXState.config_sync_state_value', index=13,
number=14, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='isl_mgmt_state_value', full_name='Monitoring.VSXState.isl_mgmt_state_value', index=14,
number=15, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='nae_state_value', full_name='Monitoring.VSXState.nae_state_value', index=15,
number=16, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='https_server_state_value', full_name='Monitoring.VSXState.https_server_state_value', index=16,
number=17, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=1,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_VSXSTATE_DEVICEROLE,
_VSXSTATE_ISLPDEVICESTATE,
_VSXSTATE_ISLSTATE,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=17736,
serialized_end=19134,
)
_IPADDRESS.fields_by_name['af'].enum_type = _IPADDRESS_ADDR_FAMILY
_IPADDRESS_ADDR_FAMILY.containing_type = _IPADDRESS
_SWARM.fields_by_name['action'].enum_type = _ACTION
_SWARM.fields_by_name['status'].enum_type = _STATUS
_SWARM.fields_by_name['public_ip_address'].message_type = _IPADDRESS
_SWARM.fields_by_name['ip_address'].message_type = _IPADDRESS
_TUNNEL.fields_by_name['action'].enum_type = _ACTION
_TUNNEL.fields_by_name['index'].enum_type = _TUNNELINDEX
_TUNNEL.fields_by_name['crypto_type'].enum_type = _CRYPTOTYPE
_TUNNEL.fields_by_name['peer_tun_ip'].message_type = _IPADDRESS
_TUNNEL.fields_by_name['tunnel_ip'].message_type = _IPADDRESS
_TUNNEL.fields_by_name['status'].enum_type = _STATUS
_INTERFACE.fields_by_name['action'].enum_type = _ACTION
_INTERFACE.fields_by_name['macaddr'].message_type = _MACADDRESS
_INTERFACE.fields_by_name['status'].enum_type = _STATUS
_INTERFACE.fields_by_name['ipaddr'].message_type = _IPADDRESS
_INTERFACE.fields_by_name['duplex_mode'].enum_type = _INTERFACE_DUPLEX
_INTERFACE.fields_by_name['type'].enum_type = _INTERFACE_INTFTYPE
_INTERFACE.fields_by_name['has_poe'].enum_type = _INTERFACE_POESUPPORT
_INTERFACE.fields_by_name['poe_state'].enum_type = _STATUS
_INTERFACE.fields_by_name['oper_state'].enum_type = _STATUS
_INTERFACE.fields_by_name['admin_state'].enum_type = _STATUS
_INTERFACE.fields_by_name['speed'].enum_type = _INTERFACE_SPEEDTYPE
_INTERFACE.fields_by_name['phy_type'].enum_type = _INTERFACE_PORTTYPE
_INTERFACE.fields_by_name['state_down_reason'].enum_type = _INTERFACE_STATEDOWNREASON
_INTERFACE.fields_by_name['vlan_mode'].enum_type = _INTERFACE_VLANMODES
_INTERFACE_DUPLEX.containing_type = _INTERFACE
_INTERFACE_INTFTYPE.containing_type = _INTERFACE
_INTERFACE_SPEEDTYPE.containing_type = _INTERFACE
_INTERFACE_PORTTYPE.containing_type = _INTERFACE
_INTERFACE_POESUPPORT.containing_type = _INTERFACE
_INTERFACE_STATEDOWNREASON.containing_type = _INTERFACE
_INTERFACE_VLANMODES.containing_type = _INTERFACE
_VAPINFO.fields_by_name['action'].enum_type = _ACTION
_VAPINFO.fields_by_name['radio_mac'].message_type = _MACADDRESS
_VAPINFO.fields_by_name['ap_mac'].message_type = _MACADDRESS
_VAPINFO.fields_by_name['bssid'].message_type = _MACADDRESS
_RADIO.fields_by_name['action'].enum_type = _ACTION
_RADIO.fields_by_name['macaddr'].message_type = _MACADDRESS
_RADIO.fields_by_name['status'].enum_type = _STATUS
_RADIO.fields_by_name['ap_mac'].message_type = _MACADDRESS
_AP.fields_by_name['action'].enum_type = _ACTION
_AP.fields_by_name['macaddr'].message_type = _MACADDRESS
_AP.fields_by_name['status'].enum_type = _STATUS
_AP.fields_by_name['ip_address'].message_type = _IPADDRESS
_AP.fields_by_name['uplink_type'].enum_type = _AP_UPLINKTYPE
_AP_UPLINKTYPE.containing_type = _AP
_NETWORK.fields_by_name['action'].enum_type = _ACTION
_WIRELESSCLIENT.fields_by_name['action'].enum_type = _ACTION
_WIRELESSCLIENT.fields_by_name['macaddr'].message_type = _MACADDRESS
_WIRELESSCLIENT.fields_by_name['ip_address'].message_type = _IPADDRESS
_WIRELESSCLIENT.fields_by_name['radio_mac'].message_type = _MACADDRESS
_HARDWAREMODULE.fields_by_name['status'].enum_type = _HARDWAREMODULE_HARDWARESTATUS
_HARDWAREMODULE_HARDWARESTATUS.containing_type = _HARDWAREMODULE
_SWITCH.fields_by_name['action'].enum_type = _ACTION
_SWITCH.fields_by_name['macaddr'].message_type = _MACADDRESS
_SWITCH.fields_by_name['status'].enum_type = _STATUS
_SWITCH.fields_by_name['public_ip_address'].message_type = _IPADDRESS
_SWITCH.fields_by_name['ip_address'].message_type = _IPADDRESS
_SWITCH.fields_by_name['default_gateway'].message_type = _IPADDRESS
_SWITCH.fields_by_name['management_modules'].message_type = _HARDWAREMODULE
_SWITCH.fields_by_name['power_supplies'].message_type = _HARDWAREMODULE
_SWITCH.fields_by_name['stack_member_role'].enum_type = _SWITCH_STACKMEMBERROLE
_SWITCH.fields_by_name['stack_macaddr'].message_type = _MACADDRESS
_SWITCH_STACKMEMBERROLE.containing_type = _SWITCH
_SWITCHSTACK.fields_by_name['action'].enum_type = _ACTION
_SWITCHSTACK.fields_by_name['status'].enum_type = _STATUS
_SWITCHSTACK.fields_by_name['topology'].enum_type = _SWITCHSTACK_STACKTOPOLOGY
_SWITCHSTACK.fields_by_name['policy'].enum_type = _SWITCHSTACK_STACKPOLICY
_SWITCHSTACK_STACKTOPOLOGY.containing_type = _SWITCHSTACK
_SWITCHSTACK_STACKPOLICY.containing_type = _SWITCHSTACK
_WIREDCLIENT.fields_by_name['action'].enum_type = _ACTION
_WIREDCLIENT.fields_by_name['macaddr'].message_type = _MACADDRESS
_WIREDCLIENT.fields_by_name['ip_address'].message_type = _IPADDRESS
_WIREDCLIENT.fields_by_name['interface_mac'].message_type = _MACADDRESS
_WIREDCLIENT.fields_by_name['auth_type'].enum_type = _AUTHTYPE
_MOBILITYCONTROLLER.fields_by_name['action'].enum_type = _ACTION
_MOBILITYCONTROLLER.fields_by_name['macaddr'].message_type = _MACADDRESS
_MOBILITYCONTROLLER.fields_by_name['status'].enum_type = _STATUS
_MOBILITYCONTROLLER.fields_by_name['public_ip_address'].message_type = _IPADDRESS
_MOBILITYCONTROLLER.fields_by_name['ip_address'].message_type = _IPADDRESS
_MOBILITYCONTROLLER.fields_by_name['default_gateway'].message_type = _IPADDRESS
_MOBILITYCONTROLLER.fields_by_name['mode'].enum_type = _MOBILITYCONTROLLER_CONTROLLERMODE
_MOBILITYCONTROLLER_CONTROLLERMODE.containing_type = _MOBILITYCONTROLLER
_UPLINK.fields_by_name['action'].enum_type = _ACTION
_UPLINK.fields_by_name['status'].enum_type = _STATUS
_UPLINK.fields_by_name['wan_status'].enum_type = _STATUS
_UPLINK.fields_by_name['public_ip_address'].message_type = _IPADDRESS
_UPLINK.fields_by_name['private_ip_address'].message_type = _IPADDRESS
_IKETUNNEL.fields_by_name['action'].enum_type = _ACTION
_IKETUNNEL.fields_by_name['peer_mac'].message_type = _MACADDRESS
_IKETUNNEL.fields_by_name['local_mac'].message_type = _MACADDRESS
_IKETUNNEL.fields_by_name['src_ip'].message_type = _IPADDRESS
_IKETUNNEL.fields_by_name['dst_ip'].message_type = _IPADDRESS
_IKETUNNEL.fields_by_name['status'].enum_type = _STATUS
_DEVICESTATS.fields_by_name['fan_status'].enum_type = _STATUS
_RADIOSTATS.fields_by_name['macaddr'].message_type = _MACADDRESS
_VAPSTATS.fields_by_name['radio_mac'].message_type = _MACADDRESS
_TUNNELSTATS.fields_by_name['index'].enum_type = _TUNNELINDEX
_CLIENTSTATS.fields_by_name['macaddr'].message_type = _MACADDRESS
_INTERFACESTATS.fields_by_name['macaddr'].message_type = _MACADDRESS
_TUNNELIPPROBESTATS.fields_by_name['tunnel_index'].enum_type = _TUNNELINDEX
_TUNNELIPPROBESTATS.fields_by_name['probe_ip_addr'].message_type = _IPADDRESS
_UPLINKIPPROBESTATS.fields_by_name['ip_address'].message_type = _IPADDRESS
_UPLINKSPEEDTEST.fields_by_name['server_ip'].message_type = _IPADDRESS
_WIDSEVENT.fields_by_name['action'].enum_type = _ACTION
_WIDSEVENT.fields_by_name['event_type'].enum_type = _WIDSEVENT_EVENTTYPE
_WIDSEVENT.fields_by_name['macaddr'].message_type = _MACADDRESS
_WIDSEVENT.fields_by_name['attack_type'].enum_type = _WIDSEVENT_ATTACKTYPE
_WIDSEVENT_EVENTTYPE.containing_type = _WIDSEVENT
_WIDSEVENT_ATTACKTYPE.containing_type = _WIDSEVENT
_AIRMONITORROGUEINFO.fields_by_name['match_type'].enum_type = _AIRMONITORROGUEINFO_WMS_RAP_MATCH_TYPE
_AIRMONITORROGUEINFO.fields_by_name['match_mac'].message_type = _MACADDRESS
_AIRMONITORROGUEINFO.fields_by_name['match_ip'].message_type = _IPADDRESS
_AIRMONITORROGUEINFO.fields_by_name['nat_match_type'].enum_type = _AIRMONITORROGUEINFO_WMS_RAP_NAT_MATCH_TYPE
_AIRMONITORROGUEINFO_WMS_RAP_MATCH_TYPE.containing_type = _AIRMONITORROGUEINFO
_AIRMONITORROGUEINFO_WMS_RAP_NAT_MATCH_TYPE.containing_type = _AIRMONITORROGUEINFO
_ROGUEEVENT.fields_by_name['action'].enum_type = _ACTION
_ROGUEEVENT.fields_by_name['macaddr'].message_type = _MACADDRESS
_ROGUEEVENT.fields_by_name['encr_type'].enum_type = _ROGUEEVENT_WMS_SNMP_ENCR_PROTOCOL
_ROGUEEVENT.fields_by_name['am_rogue'].message_type = _AIRMONITORROGUEINFO
_ROGUEEVENT_WMS_SNMP_ENCR_PROTOCOL.containing_type = _ROGUEEVENT
_DEVICENEIGHBOURS.fields_by_name['action'].enum_type = _ACTION
_MONITORINGINFORMATION.fields_by_name['data_elements'].enum_type = _DATAELEMENT
_MONITORINGINFORMATION.fields_by_name['swarms'].message_type = _SWARM
_MONITORINGINFORMATION.fields_by_name['aps'].message_type = _AP
_MONITORINGINFORMATION.fields_by_name['networks'].message_type = _NETWORK
_MONITORINGINFORMATION.fields_by_name['radios'].message_type = _RADIO
_MONITORINGINFORMATION.fields_by_name['vaps'].message_type = _VAPINFO
_MONITORINGINFORMATION.fields_by_name['interfaces'].message_type = _INTERFACE
_MONITORINGINFORMATION.fields_by_name['tunnels'].message_type = _TUNNEL
_MONITORINGINFORMATION.fields_by_name['wireless_clients'].message_type = _WIRELESSCLIENT
_MONITORINGINFORMATION.fields_by_name['switches'].message_type = _SWITCH
_MONITORINGINFORMATION.fields_by_name['wired_clients'].message_type = _WIREDCLIENT
_MONITORINGINFORMATION.fields_by_name['device_stats'].message_type = _DEVICESTATS
_MONITORINGINFORMATION.fields_by_name['radio_stats'].message_type = _RADIOSTATS
_MONITORINGINFORMATION.fields_by_name['interface_stats'].message_type = _INTERFACESTATS
_MONITORINGINFORMATION.fields_by_name['vap_stats'].message_type = _VAPSTATS
_MONITORINGINFORMATION.fields_by_name['client_stats'].message_type = _CLIENTSTATS
_MONITORINGINFORMATION.fields_by_name['tunnel_stats'].message_type = _TUNNELSTATS
_MONITORINGINFORMATION.fields_by_name['wids_events'].message_type = _WIDSEVENT
_MONITORINGINFORMATION.fields_by_name['modem_stats'].message_type = _MODEMSTATS
_MONITORINGINFORMATION.fields_by_name['role_stats'].message_type = _ROLESTATS
_MONITORINGINFORMATION.fields_by_name['vlan_stats'].message_type = _VLANSTATS
_MONITORINGINFORMATION.fields_by_name['ssid_stats'].message_type = _SSIDSTATS
_MONITORINGINFORMATION.fields_by_name['ipprobe_stats'].message_type = _TUNNELIPPROBESTATS
_MONITORINGINFORMATION.fields_by_name['rogue_events'].message_type = _ROGUEEVENT
_MONITORINGINFORMATION.fields_by_name['mobility_controllers'].message_type = _MOBILITYCONTROLLER
_MONITORINGINFORMATION.fields_by_name['uplinks'].message_type = _UPLINK
_MONITORINGINFORMATION.fields_by_name['uplink_stats'].message_type = _UPLINKSTATS
_MONITORINGINFORMATION.fields_by_name['uplink_wan_stats'].message_type = _UPLINKWANSTATS
_MONITORINGINFORMATION.fields_by_name['uplink_probe_stats'].message_type = _UPLINKIPPROBESTATS
_MONITORINGINFORMATION.fields_by_name['uplink_speedtest'].message_type = _UPLINKSPEEDTEST
_MONITORINGINFORMATION.fields_by_name['device_neighbours'].message_type = _DEVICENEIGHBOURS
_MONITORINGINFORMATION.fields_by_name['notification'].message_type = _NOTIFICATION
_MONITORINGINFORMATION.fields_by_name['switch_stacks'].message_type = _SWITCHSTACK
_MONITORINGINFORMATION.fields_by_name['ike_tunnels'].message_type = _IKETUNNEL
_MONITORINGINFORMATION.fields_by_name['switch_vlan_info'].message_type = _SWITCHVLANINFO
_MONITORINGINFORMATION.fields_by_name['vlans'].message_type = _VLAN
_MONITORINGINFORMATION.fields_by_name['vsx'].message_type = _VSXSTATE
_MONITORINGSTATEINFORMATION.fields_by_name['mobility_controllers'].message_type = _MOBILITYCONTROLLER
_MONITORINGSTATEINFORMATION.fields_by_name['switches'].message_type = _SWITCH
_MONITORINGSTATEINFORMATION.fields_by_name['swarms'].message_type = _SWARM
_MONITORINGSTATEINFORMATION.fields_by_name['aps'].message_type = _AP
_MONITORINGSTATEINFORMATION.fields_by_name['vaps'].message_type = _VAPINFO
_MONITORINGSTATEINFORMATION.fields_by_name['radios'].message_type = _RADIO
_MONITORINGSTATEINFORMATION.fields_by_name['interfaces'].message_type = _INTERFACE
_MONITORINGSTATEINFORMATION.fields_by_name['networks'].message_type = _NETWORK
_MONITORINGSTATEINFORMATION.fields_by_name['tunnels'].message_type = _TUNNEL
_MONITORINGSTATEINFORMATION.fields_by_name['wireless_clients'].message_type = _WIRELESSCLIENT
_MONITORINGSTATEINFORMATION.fields_by_name['wired_clients'].message_type = _WIREDCLIENT
_MONITORINGSTATEINFORMATION.fields_by_name['uplinks'].message_type = _UPLINK
_MONITORINGSTATEINFORMATION.fields_by_name['switch_stacks'].message_type = _SWITCHSTACK
_MONITORINGSTATEINFORMATION.fields_by_name['ike_tunnels'].message_type = _IKETUNNEL
_MONITORINGSTATEINFORMATION.fields_by_name['data_elements'].enum_type = _DATAELEMENT
_NOTIFICATION.fields_by_name['severity'].enum_type = _NOTIFICATION_SEVERITY
_NOTIFICATION.fields_by_name['state'].enum_type = _NOTIFICATION_NOTIFICATIONSTATE
_NOTIFICATION.fields_by_name['extra'].message_type = _KEYVALUEDATA
_NOTIFICATION_SEVERITY.containing_type = _NOTIFICATION
_NOTIFICATION_NOTIFICATIONSTATE.containing_type = _NOTIFICATION
_SWITCHVLANINFO.fields_by_name['vlans'].message_type = _SWITCHVLAN
_SWITCHVLAN.fields_by_name['ipaddress'].message_type = _IPADDRESS
_SWITCHVLAN.fields_by_name['status'].enum_type = _SWITCHVLAN_VLANSTATUS
_SWITCHVLAN.fields_by_name['type'].enum_type = _SWITCHVLAN_VLANTYPE
_SWITCHVLAN_VLANSTATUS.containing_type = _SWITCHVLAN
_SWITCHVLAN_VLANTYPE.containing_type = _SWITCHVLAN
_VLAN.fields_by_name['action'].enum_type = _ACTION
_VLAN.fields_by_name['ipv4'].message_type = _IPADDRESS
_VLAN.fields_by_name['ipv6_ll'].message_type = _IPADDRESS
_VLAN.fields_by_name['ipv6_1'].message_type = _IPADDRESS
_VLAN.fields_by_name['ipv6_2'].message_type = _IPADDRESS
_VLAN.fields_by_name['ipv6_3'].message_type = _IPADDRESS
_VLAN.fields_by_name['oper_state'].enum_type = _STATUS
_VLAN.fields_by_name['admin_state'].enum_type = _STATUS
_VSXSTATE.fields_by_name['action'].enum_type = _ACTION
_VSXSTATE.fields_by_name['role'].enum_type = _VSXSTATE_DEVICEROLE
_VSXSTATE.fields_by_name['peer_role'].enum_type = _VSXSTATE_DEVICEROLE
_VSXSTATE.fields_by_name['keepalive_peer_ip'].message_type = _IPADDRESS
_VSXSTATE.fields_by_name['keepalive_src_ip'].message_type = _IPADDRESS
_VSXSTATE.fields_by_name['mac'].message_type = _MACADDRESS
_VSXSTATE.fields_by_name['peer_mac'].message_type = _MACADDRESS
_VSXSTATE.fields_by_name['islp_device_state_value'].enum_type = _VSXSTATE_ISLPDEVICESTATE
_VSXSTATE.fields_by_name['config_sync_state_value'].enum_type = _VSXSTATE_ISLSTATE
_VSXSTATE.fields_by_name['isl_mgmt_state_value'].enum_type = _VSXSTATE_ISLSTATE
_VSXSTATE.fields_by_name['nae_state_value'].enum_type = _VSXSTATE_ISLSTATE
_VSXSTATE.fields_by_name['https_server_state_value'].enum_type = _VSXSTATE_ISLSTATE
_VSXSTATE_DEVICEROLE.containing_type = _VSXSTATE
_VSXSTATE_ISLPDEVICESTATE.containing_type = _VSXSTATE
_VSXSTATE_ISLSTATE.containing_type = _VSXSTATE
DESCRIPTOR.message_types_by_name['IpAddress'] = _IPADDRESS
DESCRIPTOR.message_types_by_name['MacAddress'] = _MACADDRESS
DESCRIPTOR.message_types_by_name['Swarm'] = _SWARM
DESCRIPTOR.message_types_by_name['Tunnel'] = _TUNNEL
DESCRIPTOR.message_types_by_name['Interface'] = _INTERFACE
DESCRIPTOR.message_types_by_name['VapInfo'] = _VAPINFO
DESCRIPTOR.message_types_by_name['Radio'] = _RADIO
DESCRIPTOR.message_types_by_name['Ap'] = _AP
DESCRIPTOR.message_types_by_name['Network'] = _NETWORK
DESCRIPTOR.message_types_by_name['WirelessClient'] = _WIRELESSCLIENT
DESCRIPTOR.message_types_by_name['HardwareModule'] = _HARDWAREMODULE
DESCRIPTOR.message_types_by_name['Switch'] = _SWITCH
DESCRIPTOR.message_types_by_name['SwitchStack'] = _SWITCHSTACK
DESCRIPTOR.message_types_by_name['WiredClient'] = _WIREDCLIENT
DESCRIPTOR.message_types_by_name['MobilityController'] = _MOBILITYCONTROLLER
DESCRIPTOR.message_types_by_name['Uplink'] = _UPLINK
DESCRIPTOR.message_types_by_name['IkeTunnel'] = _IKETUNNEL
DESCRIPTOR.message_types_by_name['DeviceStats'] = _DEVICESTATS
DESCRIPTOR.message_types_by_name['RadioStats'] = _RADIOSTATS
DESCRIPTOR.message_types_by_name['VapStats'] = _VAPSTATS
DESCRIPTOR.message_types_by_name['TunnelStats'] = _TUNNELSTATS
DESCRIPTOR.message_types_by_name['ClientStats'] = _CLIENTSTATS
DESCRIPTOR.message_types_by_name['InterfaceStats'] = _INTERFACESTATS
DESCRIPTOR.message_types_by_name['UplinkStats'] = _UPLINKSTATS
DESCRIPTOR.message_types_by_name['UplinkWanStats'] = _UPLINKWANSTATS
DESCRIPTOR.message_types_by_name['ModemStats'] = _MODEMSTATS
DESCRIPTOR.message_types_by_name['RoleStats'] = _ROLESTATS
DESCRIPTOR.message_types_by_name['VlanStats'] = _VLANSTATS
DESCRIPTOR.message_types_by_name['SsidStats'] = _SSIDSTATS
DESCRIPTOR.message_types_by_name['TunnelIpProbeStats'] = _TUNNELIPPROBESTATS
DESCRIPTOR.message_types_by_name['UplinkIpProbeStats'] = _UPLINKIPPROBESTATS
DESCRIPTOR.message_types_by_name['UplinkSpeedtest'] = _UPLINKSPEEDTEST
DESCRIPTOR.message_types_by_name['WIDSEvent'] = _WIDSEVENT
DESCRIPTOR.message_types_by_name['AirMonitorRogueInfo'] = _AIRMONITORROGUEINFO
DESCRIPTOR.message_types_by_name['RogueEvent'] = _ROGUEEVENT
DESCRIPTOR.message_types_by_name['DeviceNeighbours'] = _DEVICENEIGHBOURS
DESCRIPTOR.message_types_by_name['MonitoringInformation'] = _MONITORINGINFORMATION
DESCRIPTOR.message_types_by_name['MonitoringStateInformation'] = _MONITORINGSTATEINFORMATION
DESCRIPTOR.message_types_by_name['KeyValueData'] = _KEYVALUEDATA
DESCRIPTOR.message_types_by_name['Notification'] = _NOTIFICATION
DESCRIPTOR.message_types_by_name['SwitchVlanInfo'] = _SWITCHVLANINFO
DESCRIPTOR.message_types_by_name['SwitchVlan'] = _SWITCHVLAN
DESCRIPTOR.message_types_by_name['Vlan'] = _VLAN
DESCRIPTOR.message_types_by_name['VSXState'] = _VSXSTATE
DESCRIPTOR.enum_types_by_name['Action'] = _ACTION
DESCRIPTOR.enum_types_by_name['Status'] = _STATUS
DESCRIPTOR.enum_types_by_name['TunnelIndex'] = _TUNNELINDEX
DESCRIPTOR.enum_types_by_name['CryptoType'] = _CRYPTOTYPE
DESCRIPTOR.enum_types_by_name['DataElement'] = _DATAELEMENT
DESCRIPTOR.enum_types_by_name['AuthType'] = _AUTHTYPE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
IpAddress = _reflection.GeneratedProtocolMessageType('IpAddress', (_message.Message,), {
'DESCRIPTOR' : _IPADDRESS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.IpAddress)
})
_sym_db.RegisterMessage(IpAddress)
MacAddress = _reflection.GeneratedProtocolMessageType('MacAddress', (_message.Message,), {
'DESCRIPTOR' : _MACADDRESS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.MacAddress)
})
_sym_db.RegisterMessage(MacAddress)
Swarm = _reflection.GeneratedProtocolMessageType('Swarm', (_message.Message,), {
'DESCRIPTOR' : _SWARM,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Swarm)
})
_sym_db.RegisterMessage(Swarm)
Tunnel = _reflection.GeneratedProtocolMessageType('Tunnel', (_message.Message,), {
'DESCRIPTOR' : _TUNNEL,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Tunnel)
})
_sym_db.RegisterMessage(Tunnel)
Interface = _reflection.GeneratedProtocolMessageType('Interface', (_message.Message,), {
'DESCRIPTOR' : _INTERFACE,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Interface)
})
_sym_db.RegisterMessage(Interface)
VapInfo = _reflection.GeneratedProtocolMessageType('VapInfo', (_message.Message,), {
'DESCRIPTOR' : _VAPINFO,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.VapInfo)
})
_sym_db.RegisterMessage(VapInfo)
Radio = _reflection.GeneratedProtocolMessageType('Radio', (_message.Message,), {
'DESCRIPTOR' : _RADIO,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Radio)
})
_sym_db.RegisterMessage(Radio)
Ap = _reflection.GeneratedProtocolMessageType('Ap', (_message.Message,), {
'DESCRIPTOR' : _AP,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Ap)
})
_sym_db.RegisterMessage(Ap)
Network = _reflection.GeneratedProtocolMessageType('Network', (_message.Message,), {
'DESCRIPTOR' : _NETWORK,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Network)
})
_sym_db.RegisterMessage(Network)
WirelessClient = _reflection.GeneratedProtocolMessageType('WirelessClient', (_message.Message,), {
'DESCRIPTOR' : _WIRELESSCLIENT,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.WirelessClient)
})
_sym_db.RegisterMessage(WirelessClient)
HardwareModule = _reflection.GeneratedProtocolMessageType('HardwareModule', (_message.Message,), {
'DESCRIPTOR' : _HARDWAREMODULE,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.HardwareModule)
})
_sym_db.RegisterMessage(HardwareModule)
Switch = _reflection.GeneratedProtocolMessageType('Switch', (_message.Message,), {
'DESCRIPTOR' : _SWITCH,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Switch)
})
_sym_db.RegisterMessage(Switch)
SwitchStack = _reflection.GeneratedProtocolMessageType('SwitchStack', (_message.Message,), {
'DESCRIPTOR' : _SWITCHSTACK,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.SwitchStack)
})
_sym_db.RegisterMessage(SwitchStack)
WiredClient = _reflection.GeneratedProtocolMessageType('WiredClient', (_message.Message,), {
'DESCRIPTOR' : _WIREDCLIENT,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.WiredClient)
})
_sym_db.RegisterMessage(WiredClient)
MobilityController = _reflection.GeneratedProtocolMessageType('MobilityController', (_message.Message,), {
'DESCRIPTOR' : _MOBILITYCONTROLLER,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.MobilityController)
})
_sym_db.RegisterMessage(MobilityController)
Uplink = _reflection.GeneratedProtocolMessageType('Uplink', (_message.Message,), {
'DESCRIPTOR' : _UPLINK,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Uplink)
})
_sym_db.RegisterMessage(Uplink)
IkeTunnel = _reflection.GeneratedProtocolMessageType('IkeTunnel', (_message.Message,), {
'DESCRIPTOR' : _IKETUNNEL,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.IkeTunnel)
})
_sym_db.RegisterMessage(IkeTunnel)
DeviceStats = _reflection.GeneratedProtocolMessageType('DeviceStats', (_message.Message,), {
'DESCRIPTOR' : _DEVICESTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.DeviceStats)
})
_sym_db.RegisterMessage(DeviceStats)
RadioStats = _reflection.GeneratedProtocolMessageType('RadioStats', (_message.Message,), {
'DESCRIPTOR' : _RADIOSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.RadioStats)
})
_sym_db.RegisterMessage(RadioStats)
VapStats = _reflection.GeneratedProtocolMessageType('VapStats', (_message.Message,), {
'DESCRIPTOR' : _VAPSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.VapStats)
})
_sym_db.RegisterMessage(VapStats)
TunnelStats = _reflection.GeneratedProtocolMessageType('TunnelStats', (_message.Message,), {
'DESCRIPTOR' : _TUNNELSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.TunnelStats)
})
_sym_db.RegisterMessage(TunnelStats)
ClientStats = _reflection.GeneratedProtocolMessageType('ClientStats', (_message.Message,), {
'DESCRIPTOR' : _CLIENTSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.ClientStats)
})
_sym_db.RegisterMessage(ClientStats)
InterfaceStats = _reflection.GeneratedProtocolMessageType('InterfaceStats', (_message.Message,), {
'DESCRIPTOR' : _INTERFACESTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.InterfaceStats)
})
_sym_db.RegisterMessage(InterfaceStats)
UplinkStats = _reflection.GeneratedProtocolMessageType('UplinkStats', (_message.Message,), {
'DESCRIPTOR' : _UPLINKSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.UplinkStats)
})
_sym_db.RegisterMessage(UplinkStats)
UplinkWanStats = _reflection.GeneratedProtocolMessageType('UplinkWanStats', (_message.Message,), {
'DESCRIPTOR' : _UPLINKWANSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.UplinkWanStats)
})
_sym_db.RegisterMessage(UplinkWanStats)
ModemStats = _reflection.GeneratedProtocolMessageType('ModemStats', (_message.Message,), {
'DESCRIPTOR' : _MODEMSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.ModemStats)
})
_sym_db.RegisterMessage(ModemStats)
RoleStats = _reflection.GeneratedProtocolMessageType('RoleStats', (_message.Message,), {
'DESCRIPTOR' : _ROLESTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.RoleStats)
})
_sym_db.RegisterMessage(RoleStats)
VlanStats = _reflection.GeneratedProtocolMessageType('VlanStats', (_message.Message,), {
'DESCRIPTOR' : _VLANSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.VlanStats)
})
_sym_db.RegisterMessage(VlanStats)
SsidStats = _reflection.GeneratedProtocolMessageType('SsidStats', (_message.Message,), {
'DESCRIPTOR' : _SSIDSTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.SsidStats)
})
_sym_db.RegisterMessage(SsidStats)
TunnelIpProbeStats = _reflection.GeneratedProtocolMessageType('TunnelIpProbeStats', (_message.Message,), {
'DESCRIPTOR' : _TUNNELIPPROBESTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.TunnelIpProbeStats)
})
_sym_db.RegisterMessage(TunnelIpProbeStats)
UplinkIpProbeStats = _reflection.GeneratedProtocolMessageType('UplinkIpProbeStats', (_message.Message,), {
'DESCRIPTOR' : _UPLINKIPPROBESTATS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.UplinkIpProbeStats)
})
_sym_db.RegisterMessage(UplinkIpProbeStats)
UplinkSpeedtest = _reflection.GeneratedProtocolMessageType('UplinkSpeedtest', (_message.Message,), {
'DESCRIPTOR' : _UPLINKSPEEDTEST,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.UplinkSpeedtest)
})
_sym_db.RegisterMessage(UplinkSpeedtest)
WIDSEvent = _reflection.GeneratedProtocolMessageType('WIDSEvent', (_message.Message,), {
'DESCRIPTOR' : _WIDSEVENT,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.WIDSEvent)
})
_sym_db.RegisterMessage(WIDSEvent)
AirMonitorRogueInfo = _reflection.GeneratedProtocolMessageType('AirMonitorRogueInfo', (_message.Message,), {
'DESCRIPTOR' : _AIRMONITORROGUEINFO,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.AirMonitorRogueInfo)
})
_sym_db.RegisterMessage(AirMonitorRogueInfo)
RogueEvent = _reflection.GeneratedProtocolMessageType('RogueEvent', (_message.Message,), {
'DESCRIPTOR' : _ROGUEEVENT,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.RogueEvent)
})
_sym_db.RegisterMessage(RogueEvent)
DeviceNeighbours = _reflection.GeneratedProtocolMessageType('DeviceNeighbours', (_message.Message,), {
'DESCRIPTOR' : _DEVICENEIGHBOURS,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.DeviceNeighbours)
})
_sym_db.RegisterMessage(DeviceNeighbours)
MonitoringInformation = _reflection.GeneratedProtocolMessageType('MonitoringInformation', (_message.Message,), {
'DESCRIPTOR' : _MONITORINGINFORMATION,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.MonitoringInformation)
})
_sym_db.RegisterMessage(MonitoringInformation)
MonitoringStateInformation = _reflection.GeneratedProtocolMessageType('MonitoringStateInformation', (_message.Message,), {
'DESCRIPTOR' : _MONITORINGSTATEINFORMATION,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.MonitoringStateInformation)
})
_sym_db.RegisterMessage(MonitoringStateInformation)
KeyValueData = _reflection.GeneratedProtocolMessageType('KeyValueData', (_message.Message,), {
'DESCRIPTOR' : _KEYVALUEDATA,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.KeyValueData)
})
_sym_db.RegisterMessage(KeyValueData)
Notification = _reflection.GeneratedProtocolMessageType('Notification', (_message.Message,), {
'DESCRIPTOR' : _NOTIFICATION,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Notification)
})
_sym_db.RegisterMessage(Notification)
SwitchVlanInfo = _reflection.GeneratedProtocolMessageType('SwitchVlanInfo', (_message.Message,), {
'DESCRIPTOR' : _SWITCHVLANINFO,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.SwitchVlanInfo)
})
_sym_db.RegisterMessage(SwitchVlanInfo)
SwitchVlan = _reflection.GeneratedProtocolMessageType('SwitchVlan', (_message.Message,), {
'DESCRIPTOR' : _SWITCHVLAN,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.SwitchVlan)
})
_sym_db.RegisterMessage(SwitchVlan)
Vlan = _reflection.GeneratedProtocolMessageType('Vlan', (_message.Message,), {
'DESCRIPTOR' : _VLAN,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.Vlan)
})
_sym_db.RegisterMessage(Vlan)
VSXState = _reflection.GeneratedProtocolMessageType('VSXState', (_message.Message,), {
'DESCRIPTOR' : _VSXSTATE,
'__module__' : 'monitoring_pb2'
# @@protoc_insertion_point(class_scope:Monitoring.VSXState)
})
_sym_db.RegisterMessage(VSXState)
# @@protoc_insertion_point(module_scope)
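Every registration above follows one pattern: `GeneratedProtocolMessageType` manufactures a message class from a descriptor plus a class dict, and `_sym_db.RegisterMessage` records it in the symbol database. A rough pure-Python sketch of that dynamic-class-creation pattern (the real protobuf metaclass additionally wires field accessors from the `DESCRIPTOR`; the `Message` base, descriptor value, and helper names below are illustrative stand-ins, not the protobuf API):

```python
class Message:
    """Placeholder base class, analogous to _message.Message."""
    pass

def generated_message_type(name, bases, namespace):
    # The real GeneratedProtocolMessageType resolves fields from the
    # DESCRIPTOR entry; here we only perform the dynamic class creation.
    return type(name, bases, namespace)

RogueEvent = generated_message_type(
    'RogueEvent', (Message,),
    {'DESCRIPTOR': 'FAKE_ROGUEEVENT', '__module__': 'monitoring_pb2'},
)

registry = {}

def register_message(cls):
    # Stand-in for _sym_db.RegisterMessage: index the class by name.
    registry[cls.__name__] = cls

register_message(RogueEvent)
```

The `'__module__'` entry in the namespace is what makes the dynamically built class report itself as belonging to `monitoring_pb2` rather than to the module that called `type()`.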
| 50.004617 | 36,460 | 0.755204 | 45,291 | 346,582 | 5.443466 | 0.029851 | 0.056559 | 0.11429 | 0.089584 | 0.82284 | 0.773903 | 0.754924 | 0.736761 | 0.723546 | 0.707918 | 0 | 0.045864 | 0.127367 | 346,582 | 6,930 | 36,461 | 50.011833 | 0.769309 | 0.008359 | 0 | 0.729943 | 1 | 0.005219 | 0.163818 | 0.12507 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.000746 | 0 | 0.000746 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
40836c5328a76bba0dd8231cd53719ac24187881 | 8,786 | py | Python | rlzoo/algorithms/pg/default.py | tensorlayer/RLzoo | 9a587b97f706b2a59ac98555945822bf3987b1d1 | [
"Apache-2.0"
] | 750 | 2019-07-26T10:56:28.000Z | 2022-03-25T08:36:38.000Z | rlzoo/algorithms/pg/default.py | tensorlayer/RLzoo | 9a587b97f706b2a59ac98555945822bf3987b1d1 | [
"Apache-2.0"
] | 29 | 2019-10-25T02:46:19.000Z | 2022-02-09T23:41:24.000Z | rlzoo/algorithms/pg/default.py | tensorlayer/RLzoo | 9a587b97f706b2a59ac98555945822bf3987b1d1 | [
"Apache-2.0"
] | 101 | 2019-08-04T12:21:25.000Z | 2022-03-18T18:06:50.000Z | from rlzoo.common.policy_networks import *
from rlzoo.common.utils import set_seed

"""
full list of algorithm parameters (alg_params)
-----------------------------------------------
net_list: a list of networks (value and policy) used in the algorithm, from common functions or customization
optimizers_list: a list of optimizers for all networks and differentiable variables
-----------------------------------------------
full list of learning parameters (learn_params)
-----------------------------------------------
train_episodes: total number of episodes for training
test_episodes: total number of episodes for testing
max_steps: maximum number of steps for one episode
save_interval: number of time steps between model saves
mode: train or test
render: whether to render each step
gamma: reward discount factor
-----------------------------------------------
"""
def atari(env, default_seed=True):
if default_seed:
seed = 2
set_seed(seed, env) # reproducible
alg_params = dict()
if alg_params.get('net_list') is None:
num_hidden_layer = 1 # number of hidden layers for the networks
hidden_dim = 32 # dimension of hidden layers for the networks
with tf.name_scope('PG'):
with tf.name_scope('Policy'):
policy_net = StochasticPolicyNetwork(env.observation_space, env.action_space,
num_hidden_layer * [hidden_dim])
net_list = [policy_net]
alg_params['net_list'] = net_list
if alg_params.get('optimizers_list') is None:
learning_rate = 0.02
policy_optimizer = tf.optimizers.Adam(learning_rate)
optimizers_list = [policy_optimizer]
alg_params['optimizers_list'] = optimizers_list
learn_params = dict(
train_episodes=200,
test_episodes=100,
max_steps=200,
save_interval=20,
gamma=0.95
)
    return alg_params, learn_params


def classic_control(env, default_seed=True):
if default_seed:
seed = 2
set_seed(seed, env) # reproducible
alg_params = dict()
if alg_params.get('net_list') is None:
num_hidden_layer = 1 # number of hidden layers for the networks
hidden_dim = 32 # dimension of hidden layers for the networks
with tf.name_scope('PG'):
with tf.name_scope('Policy'):
policy_net = StochasticPolicyNetwork(env.observation_space, env.action_space,
num_hidden_layer * [hidden_dim])
net_list = [policy_net]
alg_params['net_list'] = net_list
if alg_params.get('optimizers_list') is None:
learning_rate = 0.02
policy_optimizer = tf.optimizers.Adam(learning_rate)
optimizers_list = [policy_optimizer]
alg_params['optimizers_list'] = optimizers_list
learn_params = dict(
train_episodes=200,
test_episodes=100,
max_steps=200,
save_interval=20,
gamma=0.95
)
    return alg_params, learn_params


def box2d(env, default_seed=True):
if default_seed:
seed = 2
set_seed(seed, env) # reproducible
alg_params = dict()
if alg_params.get('net_list') is None:
num_hidden_layer = 1 # number of hidden layers for the networks
hidden_dim = 32 # dimension of hidden layers for the networks
with tf.name_scope('PG'):
with tf.name_scope('Policy'):
policy_net = StochasticPolicyNetwork(env.observation_space, env.action_space,
num_hidden_layer * [hidden_dim])
net_list = [policy_net]
alg_params['net_list'] = net_list
if alg_params.get('optimizers_list') is None:
learning_rate = 0.02
policy_optimizer = tf.optimizers.Adam(learning_rate)
optimizers_list = [policy_optimizer]
alg_params['optimizers_list'] = optimizers_list
learn_params = dict(
train_episodes=200,
test_episodes=100,
max_steps=200,
save_interval=20,
gamma=0.95
)
    return alg_params, learn_params


def mujoco(env, default_seed=True):
if default_seed:
seed = 2
set_seed(seed, env) # reproducible
alg_params = dict()
if alg_params.get('net_list') is None:
num_hidden_layer = 1 # number of hidden layers for the networks
hidden_dim = 32 # dimension of hidden layers for the networks
with tf.name_scope('PG'):
with tf.name_scope('Policy'):
policy_net = StochasticPolicyNetwork(env.observation_space, env.action_space,
num_hidden_layer * [hidden_dim])
net_list = [policy_net]
alg_params['net_list'] = net_list
if alg_params.get('optimizers_list') is None:
learning_rate = 0.02
policy_optimizer = tf.optimizers.Adam(learning_rate)
optimizers_list = [policy_optimizer]
alg_params['optimizers_list'] = optimizers_list
learn_params = dict(
train_episodes=200,
test_episodes=100,
max_steps=200,
save_interval=20,
gamma=0.95
)
    return alg_params, learn_params


def robotics(env, default_seed=True):
if default_seed:
seed = 2
set_seed(seed, env) # reproducible
alg_params = dict()
if alg_params.get('net_list') is None:
num_hidden_layer = 1 # number of hidden layers for the networks
hidden_dim = 32 # dimension of hidden layers for the networks
with tf.name_scope('PG'):
with tf.name_scope('Policy'):
policy_net = StochasticPolicyNetwork(env.observation_space, env.action_space,
num_hidden_layer * [hidden_dim])
net_list = [policy_net]
alg_params['net_list'] = net_list
if alg_params.get('optimizers_list') is None:
learning_rate = 0.02
policy_optimizer = tf.optimizers.Adam(learning_rate)
optimizers_list = [policy_optimizer]
alg_params['optimizers_list'] = optimizers_list
learn_params = dict(
train_episodes=200,
test_episodes=100,
max_steps=200,
save_interval=20,
gamma=0.95
)
    return alg_params, learn_params


def dm_control(env, default_seed=True):
if default_seed:
seed = 2
set_seed(seed, env) # reproducible
alg_params = dict()
if alg_params.get('net_list') is None:
num_hidden_layer = 1 # number of hidden layers for the networks
hidden_dim = 32 # dimension of hidden layers for the networks
with tf.name_scope('PG'):
with tf.name_scope('Policy'):
policy_net = StochasticPolicyNetwork(env.observation_space, env.action_space,
num_hidden_layer * [hidden_dim])
net_list = [policy_net]
alg_params['net_list'] = net_list
if alg_params.get('optimizers_list') is None:
learning_rate = 0.02
policy_optimizer = tf.optimizers.Adam(learning_rate)
optimizers_list = [policy_optimizer]
alg_params['optimizers_list'] = optimizers_list
learn_params = dict(
train_episodes=200,
test_episodes=100,
max_steps=200,
save_interval=20,
gamma=0.95
)
    return alg_params, learn_params


def rlbench(env, default_seed=True):
if default_seed:
seed = 2
set_seed(seed, env) # reproducible
alg_params = dict()
if alg_params.get('net_list') is None:
num_hidden_layer = 1 # number of hidden layers for the networks
hidden_dim = 32 # dimension of hidden layers for the networks
with tf.name_scope('PG'):
with tf.name_scope('Policy'):
policy_net = StochasticPolicyNetwork(env.observation_space, env.action_space,
num_hidden_layer * [hidden_dim])
net_list = [policy_net]
alg_params['net_list'] = net_list
if alg_params.get('optimizers_list') is None:
learning_rate = 0.02
policy_optimizer = tf.optimizers.Adam(learning_rate)
optimizers_list = [policy_optimizer]
alg_params['optimizers_list'] = optimizers_list
learn_params = dict(
train_episodes=200,
test_episodes=100,
max_steps=200,
save_interval=20,
gamma=0.95
)
return alg_params, learn_params
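The seven per-environment defaults above (`atari` through `rlbench`) are byte-for-byte identical. One possible consolidation is a small factory that builds the shared `learn_params` once and accepts callables for the backend-specific pieces. This is a hypothetical refactor, not part of RLzoo; `make_default`, `_shared_learn_params`, and the lambda stand-ins for the TF network/optimizer construction are all illustrative names, and seeding is omitted for brevity:

```python
def _shared_learn_params():
    # Same learning parameters every environment preset returns.
    return dict(
        train_episodes=200,
        test_episodes=100,
        max_steps=200,
        save_interval=20,
        gamma=0.95,
    )

def make_default(build_nets, build_optimizers):
    """build_nets/build_optimizers are per-backend callables
    (in the real module: StochasticPolicyNetwork and tf.optimizers.Adam)."""
    def default(env, default_seed=True):
        alg_params = {
            'net_list': build_nets(env),
            'optimizers_list': build_optimizers(),
        }
        return alg_params, _shared_learn_params()
    return default

# Usage with trivial stand-ins in place of the TF objects:
atari_default = make_default(lambda env: ['policy_net'], lambda: ['adam'])
alg_params, learn_params = atari_default(env=None)
```

Each of the seven public names could then be a one-line alias of `make_default(...)`, so a change to the defaults is made in exactly one place.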
| 33.792308 | 110 | 0.599021 | 1,039 | 8,786 | 4.791145 | 0.098171 | 0.077742 | 0.030936 | 0.039373 | 0.900964 | 0.900964 | 0.888108 | 0.888108 | 0.888108 | 0.888108 | 0 | 0.024258 | 0.3056 | 8,786 | 259 | 111 | 33.92278 | 0.791674 | 0.077965 | 0 | 0.91623 | 0 | 0 | 0.053458 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036649 | false | 0 | 0.010471 | 0 | 0.08377 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
40a910aecf8f1fd5645a8e42a24dd4ac4b1a73f5 | 128 | py | Python | tests/schema/misc/gql/fragments/__init__.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | [
"MIT"
] | null | null | null | tests/schema/misc/gql/fragments/__init__.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | [
"MIT"
] | 24 | 2020-04-02T19:29:07.000Z | 2022-03-08T03:05:43.000Z | tests/schema/misc/gql/fragments/__init__.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | [
"MIT"
] | 1 | 2020-04-08T15:48:28.000Z | 2020-04-08T15:48:28.000Z | from .fragment_log import FRAGMENT_LOG # noqa: F401
from .fragment_log_connection import FRAGMENT_LOG_CONNECTION # noqa: F401
| 42.666667 | 74 | 0.828125 | 18 | 128 | 5.555556 | 0.388889 | 0.44 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053571 | 0.125 | 128 | 2 | 75 | 64 | 0.839286 | 0.164063 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
40b65755a2ef6bfe47a257b8c48d767942cc274f | 4,957 | py | Python | tests/unit/language/ast/test_variable_definition.py | matt-koevort/tartiflette | 5777866b133d846ce4f8aa03f735fa81832896cd | [
"MIT"
] | 530 | 2019-06-04T11:45:36.000Z | 2022-03-31T09:29:56.000Z | tests/unit/language/ast/test_variable_definition.py | matt-koevort/tartiflette | 5777866b133d846ce4f8aa03f735fa81832896cd | [
"MIT"
] | 242 | 2019-06-04T11:53:08.000Z | 2022-03-28T07:06:27.000Z | tests/unit/language/ast/test_variable_definition.py | matt-koevort/tartiflette | 5777866b133d846ce4f8aa03f735fa81832896cd | [
"MIT"
] | 36 | 2019-06-21T06:40:27.000Z | 2021-11-04T13:11:16.000Z | import pytest
from tartiflette.language.ast import VariableDefinitionNode


def test_variabledefinitionnode__init__():
variable_definition_node = VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
)
assert variable_definition_node.variable == "variableDefinitionVariable"
assert variable_definition_node.type == "variableDefinitionType"
assert (
variable_definition_node.default_value
== "variableDefinitionDefaultValue"
)
    assert variable_definition_node.location == "variableDefinitionLocation"


@pytest.mark.parametrize(
"variable_definition_node,other,expected",
[
(
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
Ellipsis,
False,
),
(
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
VariableDefinitionNode(
variable="variableDefinitionVariableBis",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
False,
),
(
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionTypeBis",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
False,
),
(
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValueBis",
location="variableDefinitionLocation",
),
False,
),
(
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocationBis",
),
False,
),
(
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
True,
),
],
)
def test_variabledefinitionnode__eq__(
variable_definition_node, other, expected
):
    assert (variable_definition_node == other) is expected


@pytest.mark.parametrize(
"variable_definition_node,expected",
[
(
VariableDefinitionNode(
variable="variableDefinitionVariable",
type="variableDefinitionType",
default_value="variableDefinitionDefaultValue",
location="variableDefinitionLocation",
),
"VariableDefinitionNode(variable='variableDefinitionVariable', "
"type='variableDefinitionType', "
"default_value='variableDefinitionDefaultValue', "
"location='variableDefinitionLocation')",
)
],
)
def test_variabledefinitionnode__repr__(variable_definition_node, expected):
assert variable_definition_node.__repr__() == expected
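The parametrized cases above pin down the node's contract: `__eq__` must compare all four fields (each case flips exactly one of `variable`, `type`, `default_value`, `location`) and return `False` against foreign objects like `Ellipsis`, while `__repr__` must echo every field. A minimal stand-in satisfying that contract (illustrative only; the real implementation lives in `tartiflette.language.ast`):

```python
class MiniVariableDefinitionNode:
    """Sketch of the equality/repr behaviour the tests above verify."""

    def __init__(self, variable, type, default_value, location):
        self.variable = variable
        self.type = type
        self.default_value = default_value
        self.location = location

    def __eq__(self, other):
        # Field-by-field comparison; non-node objects compare unequal.
        return self is other or (
            isinstance(other, MiniVariableDefinitionNode)
            and self.variable == other.variable
            and self.type == other.type
            and self.default_value == other.default_value
            and self.location == other.location
        )

    def __repr__(self):
        return (
            "VariableDefinitionNode("
            f"variable={self.variable!r}, type={self.type!r}, "
            f"default_value={self.default_value!r}, "
            f"location={self.location!r})"
        )

a = MiniVariableDefinitionNode("v", "t", "d", "l")
b = MiniVariableDefinitionNode("v", "t", "d", "l")
```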
| 36.182482 | 76 | 0.606213 | 232 | 4,957 | 12.711207 | 0.155172 | 0.061038 | 0.19939 | 0.264496 | 0.79315 | 0.757884 | 0.728722 | 0.719905 | 0.719905 | 0.719905 | 0 | 0 | 0.319145 | 4,957 | 136 | 77 | 36.448529 | 0.873778 | 0 | 0 | 0.674419 | 0 | 0 | 0.346782 | 0.346177 | 0 | 0 | 0 | 0 | 0.046512 | 1 | 0.023256 | false | 0 | 0.015504 | 0 | 0.03876 | 0 | 0 | 0 | 1 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
90c69ee10ca2b0b93784f251586e7171e4a02ff2 | 9,011 | py | Python | python/cuxfilter/layouts/layout_templates.py | rnyak/cuxfilter | 626e45af3b8a0f2e37bc5cdbe6d2da618141f995 | [
"Apache-2.0"
] | null | null | null | python/cuxfilter/layouts/layout_templates.py | rnyak/cuxfilter | 626e45af3b8a0f2e37bc5cdbe6d2da618141f995 | [
"Apache-2.0"
] | null | null | null | python/cuxfilter/layouts/layout_templates.py | rnyak/cuxfilter | 626e45af3b8a0f2e37bc5cdbe6d2da618141f995 | [
"Apache-2.0"
] | null | null | null | layout_0 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1">
{{ embed(roots.chart1) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_1 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1">
{{ embed(roots.chart1) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1">
{{ embed(roots.chart2) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_2 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-2" >
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-1-2" >
{{ embed(roots.chart2) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_3 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1 pure-u-md-3-5">
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-2-5" style="align-self:center">
<div class="pure-u-1">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1">
{{ embed(roots.chart3) }}
</div>
</div>
</div>
</div>
{% endblock %}
"""
layout_4 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g vertical-3-center" >
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart3) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_5 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1">
{{ embed(roots.chart1) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart3) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_6 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart2) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart3) }}
</div>
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart4) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_7 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1">
{{ embed(roots.chart1) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart3) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart4) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_8 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1">
{{ embed(roots.chart1) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart3) }}
</div>
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart4) }}
</div>
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart5) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_9 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1 pure-u-md-3-4">
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-1-4">
<div class="pure-u-1">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1">
{{ embed(roots.chart3) }}
</div>
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart4) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart5) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart6) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_10 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart3) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart4) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart5) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart6) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_11 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-1-2">
{{ embed(roots.chart2) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart3) }}
</div>
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart4) }}
</div>
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart5) }}
</div>
<div class="pure-u-1 pure-u-md-1-4">
{{ embed(roots.chart6) }}
</div>
</div>
</div>
{% endblock %}
"""
layout_12 = """
<!-- goes in body -->
{% block contents %}
<nav>
{{embed(roots.title)}}
<div class="nav-container"> {{ embed(roots.widgets) }} </div>
</nav>
<div class="container">
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart1) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart2) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart3) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart4) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart5) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart6) }}
</div>
</div>
<div class="pure-g">
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart7) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart8) }}
</div>
<div class="pure-u-1 pure-u-md-1-3">
{{ embed(roots.chart9) }}
</div>
</div>
</div>
{% endblock %}
"""
| 23.284238 | 70 | 0.465653 | 1,175 | 9,011 | 3.56 | 0.039149 | 0.200813 | 0.226632 | 0.174038 | 0.979919 | 0.969161 | 0.966292 | 0.966292 | 0.933541 | 0.933541 | 0 | 0.034839 | 0.302408 | 9,011 | 386 | 71 | 23.34456 | 0.630608 | 0 | 0 | 0.943662 | 0 | 0.002817 | 0.97159 | 0.066363 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |