hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0eb226ecb666e555e1ee21ca2021fa57c5336178 | 392 | py | Python | scripts/utils/convert_samplesheet_config.py | a113n/bcbio-nextgen | 1d4afef27ad2e84a4ecb6145ccc5058f2abb4616 | [
"MIT"
] | 3 | 2015-11-18T07:17:54.000Z | 2021-04-28T13:58:37.000Z | scripts/utils/convert_samplesheet_config.py | a113n/bcbio-nextgen | 1d4afef27ad2e84a4ecb6145ccc5058f2abb4616 | [
"MIT"
] | 2 | 2020-04-10T13:56:52.000Z | 2020-04-10T13:58:43.000Z | scripts/utils/convert_samplesheet_config.py | a113n/bcbio-nextgen | 1d4afef27ad2e84a4ecb6145ccc5058f2abb4616 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
"""Convert Illumina SampleSheet CSV files to the run_info.yaml input file.
This allows running the analysis pipeline without Galaxy, using CSV input
files from Illumina SampleSheet or Genesifter.
Usage:
convert_samplesheet_config.py <input csv>
"""
import sys
from bcbio.illumina import samplesheet
if __name__ == "__main__":
samplesheet.csv2yaml(sys.argv[1])
| 24.5 | 74 | 0.780612 | 55 | 392 | 5.363636 | 0.709091 | 0.128814 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005935 | 0.140306 | 392 | 15 | 75 | 26.133333 | 0.869436 | 0.678571 | 0 | 0 | 0 | 0 | 0.067227 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
0eb4081c3bfd193de0851db18c7a88b6625b08a5 | 230 | py | Python | commanderbot/ext/manifest/__init__.py | CommanderBot-Dev/commanderbot-ext | c8798b4475b892c234a1e4ffbfb4fed3fb702938 | [
"MIT"
] | 4 | 2020-09-25T19:22:48.000Z | 2021-06-16T18:08:49.000Z | commanderbot/ext/manifest/__init__.py | CommanderBot-Dev/commanderbot-py | 835841f733e466c5a0e6724d4020747c55856fe3 | [
"MIT"
] | 23 | 2021-08-30T04:07:29.000Z | 2021-11-08T17:44:41.000Z | commanderbot/ext/manifest/__init__.py | CommanderBot-Dev/commanderbot-py | 835841f733e466c5a0e6724d4020747c55856fe3 | [
"MIT"
] | 3 | 2020-09-25T19:23:22.000Z | 2021-03-16T18:19:48.000Z | from discord.ext.commands import Bot
from commanderbot.core.utils import add_configured_cog
from commanderbot.ext.manifest.manifest_cog import ManifestCog
def setup(bot: Bot):
add_configured_cog(bot, __name__, ManifestCog)
| 25.555556 | 62 | 0.826087 | 32 | 230 | 5.65625 | 0.53125 | 0.176796 | 0.176796 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108696 | 230 | 8 | 63 | 28.75 | 0.882927 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.6 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
0ecb0e59d1796d51b8b8691272f5027f62016317 | 120 | py | Python | 07-Array-Oriented-Programming-with-NumPy/7-6-astype.py | rajevac/deitel-intro-to-python-exercises | 05f427d35ebb4bd315904f6919659335b1bf3fc9 | [
"MIT"
] | null | null | null | 07-Array-Oriented-Programming-with-NumPy/7-6-astype.py | rajevac/deitel-intro-to-python-exercises | 05f427d35ebb4bd315904f6919659335b1bf3fc9 | [
"MIT"
] | null | null | null | 07-Array-Oriented-Programming-with-NumPy/7-6-astype.py | rajevac/deitel-intro-to-python-exercises | 05f427d35ebb4bd315904f6919659335b1bf3fc9 | [
"MIT"
] | null | null | null | import numpy as np
# 2 x 3
arr = np.linspace(1.1, 6.6, 6).reshape(2, 3)
print(arr)
arr = arr.astype('int')
print(arr) | 13.333333 | 44 | 0.633333 | 26 | 120 | 2.923077 | 0.576923 | 0.052632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 0.175 | 120 | 9 | 45 | 13.333333 | 0.676768 | 0.041667 | 0 | 0.4 | 0 | 0 | 0.026316 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
161d6cfbf4c3aa206056f34f63be614832d26dd7 | 3,604 | py | Python | tests/board/test_board.py | trewjames/tdd-chess | 7aa5c1942627cc93886ffede8e84b65726e44946 | [
"MIT"
] | null | null | null | tests/board/test_board.py | trewjames/tdd-chess | 7aa5c1942627cc93886ffede8e84b65726e44946 | [
"MIT"
] | 3 | 2020-08-19T18:07:16.000Z | 2020-08-24T20:57:13.000Z | tests/board/test_board.py | trewjames/tdd-chess | 7aa5c1942627cc93886ffede8e84b65726e44946 | [
"MIT"
] | null | null | null | import pytest
from chess.board import Board
def test_board_init_play_white(start_board):
assert start_board.player_white is True
assert start_board.white_to_move is True
assert start_board.moves == []
def test_board_to_array_white(start_board, game_grid_white):
assert start_board.to_array() == game_grid_white
assert start_board.to_array() == game_grid_white
def test_board_to_array_black(game_grid_black):
board = Board(player_white=False)
assert board.to_array() == game_grid_black
assert board.to_array() == game_grid_black
def test_board_init_from_array():
test_board = [
["br", "bn", "bb", "bq", "--", "bb", "bn", "br"],
["--", "--", "--", "bp", "bk", "bp", "bp", "--"],
["--", "--", "bp", "--", "bp", "--", "--", "--"],
["bp", "bp", "--", "--", "--", "--", "--", "bp"],
["wp", "wp", "--", "--", "--", "--", "--", "wp"],
["--", "--", "wp", "--", "wp", "--", "--", "--"],
["--", "wp", "--", "wp", "--", "wp", "wp", "wr"],
["wr", "wn", "wb", "wq", "wk", "wb", "wn", "--"]
]
board = Board(player_white=True, array=test_board, white_to_move=True)
assert board.to_array() == test_board
assert board[(3, 1)].first_move is False
assert board[(1, 5)].name == 'bp'
assert board[6, 7].name == 'wr'
assert board[6, 7].first_move is False
assert board[1, 4].name == 'bk'
assert board[1, 4].first_move is False
def test_board_print_white(start_board):
board_str = "br bn bb bq bk bb bn br 8\n" + \
"bp bp bp bp bp bp bp bp 7\n" + \
"-- -- -- -- -- -- -- -- 6\n" + \
"-- -- -- -- -- -- -- -- 5\n" + \
"-- -- -- -- -- -- -- -- 4\n" + \
"-- -- -- -- -- -- -- -- 3\n" + \
"wp wp wp wp wp wp wp wp 2\n" + \
"wr wn wb wq wk wb wn wr 1\n" + \
"-a -b -c -d -e -f -g -h"
assert start_board.__str__() == board_str
def test_board_print_black():
board = Board(player_white=False, white_to_move=True)
board_str = "wr wn wb wk wq wb wn wr 1\n" + \
"wp wp wp wp wp wp wp wp 2\n" + \
"-- -- -- -- -- -- -- -- 3\n" + \
"-- -- -- -- -- -- -- -- 4\n" + \
"-- -- -- -- -- -- -- -- 5\n" + \
"-- -- -- -- -- -- -- -- 6\n" + \
"bp bp bp bp bp bp bp bp 7\n" + \
"br bn bb bk bq bb bn br 8\n" + \
"-h -g -f -e -d -c -b -a"
assert board.__str__() == board_str
@pytest.fixture
def game_grid_white():
return [
["br", "bn", "bb", "bq", "bk", "bb", "bn", "br"],
["bp", "bp", "bp", "bp", "bp", "bp", "bp", "bp"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["wp", "wp", "wp", "wp", "wp", "wp", "wp", "wp"],
["wr", "wn", "wb", "wq", "wk", "wb", "wn", "wr"]
]
@pytest.fixture
def game_grid_black():
return [
["wr", "wn", "wb", "wk", "wq", "wb", "wn", "wr"],
["wp", "wp", "wp", "wp", "wp", "wp", "wp", "wp"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["--", "--", "--", "--", "--", "--", "--", "--"],
["bp", "bp", "bp", "bp", "bp", "bp", "bp", "bp"],
["br", "bn", "bb", "bk", "bq", "bb", "bn", "br"]
]
| 34.990291 | 74 | 0.372364 | 416 | 3,604 | 3.033654 | 0.139423 | 0.114105 | 0.147385 | 0.164818 | 0.568146 | 0.46038 | 0.411252 | 0.306656 | 0.179873 | 0.175119 | 0 | 0.011111 | 0.300777 | 3,604 | 102 | 75 | 35.333333 | 0.489683 | 0 | 0 | 0.405063 | 0 | 0 | 0.240844 | 0 | 0 | 0 | 0 | 0 | 0.202532 | 1 | 0.101266 | false | 0 | 0.025316 | 0.025316 | 0.151899 | 0.025316 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1621416e860da5e9b4d4fdf04fe0bdf2c8642029 | 585 | py | Python | Gaussian_Elimination_Versatile/Verilog/gen_test.sage.py | davidhoo1988/gaussian-elimination-hardware | ff8e9475438744f40fd57eb553c4b772316191a6 | [
"MIT"
] | null | null | null | Gaussian_Elimination_Versatile/Verilog/gen_test.sage.py | davidhoo1988/gaussian-elimination-hardware | ff8e9475438744f40fd57eb553c4b772316191a6 | [
"MIT"
] | null | null | null | Gaussian_Elimination_Versatile/Verilog/gen_test.sage.py | davidhoo1988/gaussian-elimination-hardware | ff8e9475438744f40fd57eb553c4b772316191a6 | [
"MIT"
] | null | null | null |
# This file was *autogenerated* from the file gen_test.sage
from sage.all_cmdline import * # import sage library
_sage_const_2 = Integer(2); _sage_const_1 = Integer(1)#example sage gen_test.sage 7 127
import sys
k = int(sys.argv[_sage_const_1 ])
l = int(sys.argv[_sage_const_2 ])
MS = MatrixSpace(GF(_sage_const_2 ), k, l)
M = MS.random_element()
#M_rref = M.echelon_form()
#write data_in file - row wise
with open("data.in", "w") as f:
for r in M.rows():
for i in r:
f.write("{0}".format(i))
f.write("\n")
f.close()
print "Input matrix:"
print M
print ""
| 18.870968 | 87 | 0.673504 | 106 | 585 | 3.509434 | 0.518868 | 0.120968 | 0.080645 | 0.075269 | 0.102151 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025105 | 0.182906 | 585 | 30 | 88 | 19.5 | 0.753138 | 0.278632 | 0 | 0 | 1 | 0 | 0.062802 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.125 | null | null | 0.1875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1624f0aa12611740b614927254ccd9dc894baaee | 5,748 | py | Python | test/experimental/test_new.py | allena29/pyyang-voodoo | 43e674ca1e18053a487d5169ce7c4cf5fc70c95d | [
"Apache-2.0"
] | 4 | 2019-04-10T23:02:22.000Z | 2020-08-03T03:33:02.000Z | test/experimental/test_new.py | allena29/pyyang-voodoo | 43e674ca1e18053a487d5169ce7c4cf5fc70c95d | [
"Apache-2.0"
] | 5 | 2019-08-06T23:14:02.000Z | 2021-06-01T23:52:34.000Z | test/experimental/test_new.py | allena29/pyyang-voodoo | 43e674ca1e18053a487d5169ce7c4cf5fc70c95d | [
"Apache-2.0"
] | null | null | null | import libyang
from jinja2 import Template
import unittest
import yangvoodoo
import yangvoodoo.stublydal
"""
This set of unit tests uses the stub backend datastore, which is not preseeded with
any data.
"""
class test_new(unittest.TestCase):
def setUp(self):
self.maxDiff = None
self.stub = yangvoodoo.stublydal.StubLyDataAbstractionLayer()
self.subject = yangvoodoo.DataAccess(
data_abstraction_layer=self.stub, disable_proxy=True
)
self.subject.connect("integrationtest", yang_location="yang")
self.root = self.subject.get_node()
#
# def test_a(self):
# print(self.root.morecomplex.inner.leaf63636363)
# self.root.morecomplex.inner.leaf63636363 = 'af11'
# print(self.root.morecomplex.inner.leaf63636363)
# self.root.morecomplex.inner.leaf63636363 = 63
# print(self.root.morecomplex.inner.leaf63636363)
# self.root.morecomplex.inner.leaf63636363 = 0
# print(self.root.morecomplex.inner.leaf63636363)
# self.root.morecomplex.inner.leaf63636363 = 'precedence'
# print(self.root.morecomplex.inner.leaf63636363)
# self.root.morecomplex.inner.list63636363.create(4)
# self.root.morecomplex.inner.list63636363.create('af41')
# self.root.morecomplex.inner.list63636363.create('precedence')
# print(self.subject.dumps())
#
# def test_b(self):
# x = """
# <morecomplex xmlns="http://brewerslabng.mellon-collie.net/yang/integrationtest"><inner><leaf63636363>precedence</leaf63636363><list63636363><key63636363>4</key63636363></list63636363><list63636363><key63636363>af41</key63636363></list63636363><list63636363><key63636363>precedence</key63636363></list63636363></inner></morecomplex>
# """
# self.subject.loads(x, trusted=True)
# print(self.subject.dumps())
#
# def test_c(self):
# print(self.root.morecomplex.inner.leaf63636363)
# self.root.morecomplex.inner.leaf63636363 = 10
# print(self.subject.dumps())
#
# def test_d(self):
# x = """
# <morecomplex xmlns="http://brewerslabng.mellon-collie.net/yang/integrationtest"><inner><leaf63636363>10</leaf63636363><list63636363><key63636363>4</key63636363></list63636363><list63636363><key63636363>af41</key63636363></list63636363><list63636363><key63636363>precedence</key63636363></list63636363></inner></morecomplex>
# """
# self.subject.loads(x, trusted=True)
# print(self.subject.dumps())
#
# def test_e(self):
# self.root.validator.types.u_int_8 = 0
# print(self.subject.dumps())
#
# def test_f(self):
# self.subject.loads("""<validator xmlns="http://brewerslabng.mellon-collie.net/yang/integrationtest"><types><u_int_8>0</u_int_8></types></validator>""")
# print(self.subject.dumps())
#
# def test_g(self):
# x = self.root.morecomplex.inner.uint8keylist.create(1)
# x.nonkey = 'o'
# self.assertEqual(x._path, "/integrationtest:morecomplex/inner/uint8keylist[mykey='1']")
# x = self.root.morecomplex.inner.uint8keylist.create(0)
# x.nonkey = 'z'
# self.assertEqual(x._path, "/integrationtest:morecomplex/inner/uint8keylist[mykey='0']")
# print(self.subject.dumps())
#
# def test_h(self):
# print('from the merged in case')
# self.subject.loads("""<morecomplex xmlns="http://brewerslabng.mellon-collie.net/yang/integrationtest"><inner><uint8keylist><mykey>1</mykey><nonkey>o</nonkey></uint8keylist><uint8keylist><mykey>0</mykey><nonkey>z</nonkey></uint8keylist></inner></morecomplex>""", trusted=True)
# print(self.subject.dumps())
#
# def test_i(self):
# class pythondata:
# def __getattr__(self, attr):
# return 0
#
# def __getitem__(self, attr):
# return 0
#
# class crap:
#
# @staticmethod
# def as_string(i):
# if i is None:
# return '0'
# return str(i)
#
# self.root.simpleleaf = 0
# self.root.validator.types.int_8 = 0
# x = self.root.morecomplex.inner.uint8keylist.create(1).nonkey = 'O'
# x = self.root.morecomplex.inner.uint8keylist.create(0).nonkey = 'Z'
# template = """
# root.simpleleaf {{ root.simpleleaf }} this should be 0 - this is a ynag string
# root.validator.types.int_8 {{ root.validator.types.int_8 }} this should be 0 - this is a ynan uint_8 ***IT COMES AS None ***
# root.validator.types.int_8 {{ workarounds.as_string(root.validator.types.int_8) }} this should be 0 - this is a ynan uint_8
#
# {% for x in root.morecomplex.inner.uint8keylist %}
# {{ x.mykey }} this should be either 1 or 0 ***THIS IS BROKEN FOR a number 0 ***
# {{ x.nonkey }} this should be either 0 or Z
# {% endfor %}
# {{ pythondata['x'] }} getitem
# {{ pythondata.x }} this is from the python datamodel so the problem is somewhere between Jinja2 and yangvoodoo
#
# """
# template = Template(template)
# answer = template.render(root=self.root, pythondata=pythondata(), workarounds=crap())
#
# print(answer)
# assert answer == ''
# def test_i(self):
# x = self.root.morecomplex.inner.uint8keylist.create(1)
# x = self.root.morecomplex.inner.uint8keylist.create(0)
# template = """
# {% for x in root.morecomplex.inner.uint8keylist %}
# {{root.morecomplex.inner.uint8keylist[x.mykey].nonkey}}
# {% endfor %}
# """
# template = Template(template)
# answer = template.render(root=self.root)
#
# print(answer)
# assert answer == ''
| 42.577778 | 341 | 0.639353 | 652 | 5,748 | 5.57362 | 0.205521 | 0.057237 | 0.126582 | 0.132086 | 0.654926 | 0.637039 | 0.564392 | 0.522014 | 0.471932 | 0.379472 | 0 | 0.089582 | 0.213466 | 5,748 | 134 | 342 | 42.895522 | 0.714223 | 0.818198 | 0 | 0 | 0 | 0 | 0.022754 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.357143 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
162bfa30ec6a9944e9ef9faf545cc44b2534b509 | 2,631 | py | Python | sympy/physics/vector/tests/test_output.py | utkarshdeorah/sympy | dcdf59bbc6b13ddbc329431adf72fcee294b6389 | [
"BSD-3-Clause"
] | 1 | 2020-09-09T20:40:17.000Z | 2020-09-09T20:40:17.000Z | sympy/physics/vector/tests/test_output.py | utkarshdeorah/sympy | dcdf59bbc6b13ddbc329431adf72fcee294b6389 | [
"BSD-3-Clause"
] | 14 | 2018-02-08T10:11:03.000Z | 2019-04-16T10:32:46.000Z | sympy/physics/vector/tests/test_output.py | utkarshdeorah/sympy | dcdf59bbc6b13ddbc329431adf72fcee294b6389 | [
"BSD-3-Clause"
] | 1 | 2022-02-04T13:50:29.000Z | 2022-02-04T13:50:29.000Z | from sympy.core.singleton import S
from sympy.physics.vector import Vector, ReferenceFrame, Dyadic
from sympy.testing.pytest import raises
Vector.simp = True
A = ReferenceFrame('A')
def test_output_type():
A = ReferenceFrame('A')
v = A.x + A.y
d = v | v
zerov = Vector(0)
zerod = Dyadic(0)
# dot products
assert isinstance(d & d, Dyadic)
assert isinstance(d & zerod, Dyadic)
assert isinstance(zerod & d, Dyadic)
assert isinstance(d & v, Vector)
assert isinstance(v & d, Vector)
assert isinstance(d & zerov, Vector)
assert isinstance(zerov & d, Vector)
raises(TypeError, lambda: d & S.Zero)
raises(TypeError, lambda: S.Zero & d)
raises(TypeError, lambda: d & 0)
raises(TypeError, lambda: 0 & d)
assert not isinstance(v & v, (Vector, Dyadic))
assert not isinstance(v & zerov, (Vector, Dyadic))
assert not isinstance(zerov & v, (Vector, Dyadic))
raises(TypeError, lambda: v & S.Zero)
raises(TypeError, lambda: S.Zero & v)
raises(TypeError, lambda: v & 0)
raises(TypeError, lambda: 0 & v)
# cross products
raises(TypeError, lambda: d ^ d)
raises(TypeError, lambda: d ^ zerod)
raises(TypeError, lambda: zerod ^ d)
assert isinstance(d ^ v, Dyadic)
assert isinstance(v ^ d, Dyadic)
assert isinstance(d ^ zerov, Dyadic)
assert isinstance(zerov ^ d, Dyadic)
assert isinstance(zerov ^ d, Dyadic)
raises(TypeError, lambda: d ^ S.Zero)
raises(TypeError, lambda: S.Zero ^ d)
raises(TypeError, lambda: d ^ 0)
raises(TypeError, lambda: 0 ^ d)
assert isinstance(v ^ v, Vector)
assert isinstance(v ^ zerov, Vector)
assert isinstance(zerov ^ v, Vector)
raises(TypeError, lambda: v ^ S.Zero)
raises(TypeError, lambda: S.Zero ^ v)
raises(TypeError, lambda: v ^ 0)
raises(TypeError, lambda: 0 ^ v)
# outer products
raises(TypeError, lambda: d | d)
raises(TypeError, lambda: d | zerod)
raises(TypeError, lambda: zerod | d)
raises(TypeError, lambda: d | v)
raises(TypeError, lambda: v | d)
raises(TypeError, lambda: d | zerov)
raises(TypeError, lambda: zerov | d)
raises(TypeError, lambda: zerov | d)
raises(TypeError, lambda: d | S.Zero)
raises(TypeError, lambda: S.Zero | d)
raises(TypeError, lambda: d | 0)
raises(TypeError, lambda: 0 | d)
assert isinstance(v | v, Dyadic)
assert isinstance(v | zerov, Dyadic)
assert isinstance(zerov | v, Dyadic)
raises(TypeError, lambda: v | S.Zero)
raises(TypeError, lambda: S.Zero | v)
raises(TypeError, lambda: v | 0)
raises(TypeError, lambda: 0 | v)
| 34.168831 | 63 | 0.651843 | 351 | 2,631 | 4.880342 | 0.105413 | 0.30648 | 0.429072 | 0.154116 | 0.780502 | 0.548745 | 0.512551 | 0.512551 | 0.481027 | 0.481027 | 0 | 0.006886 | 0.22729 | 2,631 | 76 | 64 | 34.618421 | 0.835711 | 0.015964 | 0 | 0.089552 | 0 | 0 | 0.000774 | 0 | 0 | 0 | 0 | 0 | 0.313433 | 1 | 0.014925 | false | 0 | 0.044776 | 0 | 0.059701 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
162d762ef6232bf4d34844359df7c61627b33393 | 5,602 | py | Python | cyber/python/cyber_py3/parameter.py | henryhxyao/apollo | 56afbe2a3be2078e05ae0b91a93fa6d1b6b51619 | [
"Apache-2.0"
] | null | null | null | cyber/python/cyber_py3/parameter.py | henryhxyao/apollo | 56afbe2a3be2078e05ae0b91a93fa6d1b6b51619 | [
"Apache-2.0"
] | null | null | null | cyber/python/cyber_py3/parameter.py | henryhxyao/apollo | 56afbe2a3be2078e05ae0b91a93fa6d1b6b51619 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python3
# ****************************************************************************
# Copyright 2019 The Apollo Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ****************************************************************************
# -*- coding: utf-8 -*-
"""Module for init environment."""
import sys
import os
import importlib
# init vars
CYBER_PATH = os.environ['CYBER_PATH']
CYBER_DIR = os.path.split(CYBER_PATH)[0]
sys.path.append(CYBER_PATH + "/third_party/")
sys.path.append(CYBER_PATH + "/lib/")
sys.path.append(CYBER_PATH + "/python/cyber")
sys.path.append(CYBER_PATH + "/python/cyber_py")
sys.path.append(CYBER_PATH + "/lib/python/")
sys.path.append(CYBER_DIR + "/python/")
sys.path.append(CYBER_DIR + "/cyber/")
_CYBER_PARAM = importlib.import_module('_cyber_parameter_py3')
class Parameter(object):
"""
Class for Parameter wrapper.
"""
def __init__(self, name, value=None):
if (name is not None and value is None):
self.param = name
elif (name is None and value is None):
self.param = _CYBER_PARAM.new_PyParameter_noparam()
elif isinstance(value, int):
self.param = _CYBER_PARAM.new_PyParameter_int(name, value)
elif isinstance(value, float):
self.param = _CYBER_PARAM.new_PyParameter_double(name, value)
elif isinstance(value, str):
self.param = _CYBER_PARAM.new_PyParameter_string(name, value)
else:
print("type is not supported: ", type(value))
def __del__(self):
_CYBER_PARAM.delete_PyParameter(self.param)
def type_name(self):
"""
return Parameter typename
"""
return _CYBER_PARAM.PyParameter_type_name(self.param)
def descriptor(self):
"""
return Parameter descriptor
"""
return _CYBER_PARAM.PyParameter_descriptor(self.param)
def name(self):
"""
return Parameter name
"""
return _CYBER_PARAM.PyParameter_name(self.param)
def debug_string(self):
"""
return Parameter debug string
"""
return _CYBER_PARAM.PyParameter_debug_string(self.param)
def as_string(self):
"""
return native value
"""
return _CYBER_PARAM.PyParameter_as_string(self.param)
def as_double(self):
"""
return native value
"""
return _CYBER_PARAM.PyParameter_as_double(self.param)
def as_int64(self):
"""
return native value
"""
return _CYBER_PARAM.PyParameter_as_int64(self.param)
class ParameterClient(object):
"""
Class for ParameterClient wrapper.
"""
##
# @brief constructor the ParameterClient by a node and the parameter server node name.
#
# @param node a node to create client.
# @param server_node_name the parameter server's node name.
def __init__(self, node, server_node_name):
self.param_clt = _CYBER_PARAM.new_PyParameterClient(
node.node, server_node_name)
def __del__(self):
_CYBER_PARAM.delete_PyParameterClient(self.param_clt)
def set_parameter(self, param):
"""
set parameter, param is Parameter.
"""
return _CYBER_PARAM.PyParameter_clt_set_parameter(self.param_clt, param.param)
def get_parameter(self, param_name):
"""
get Parameter by param name param_name.
"""
return Parameter(_CYBER_PARAM.PyParameter_clt_get_parameter(self.param_clt, param_name))
def get_paramslist(self):
"""
get all params of the server_node_name parameterserver.
"""
pycapsulelist = _CYBER_PARAM.PyParameter_clt_get_parameter_list(
self.param_clt)
param_list = []
for capsuleobj in pycapsulelist:
param_list.append(Parameter(capsuleobj))
return param_list
class ParameterServer(object):
"""
Class for ParameterServer wrapper.
"""
##
# @brief constructor the ParameterServer by the node object.
#
# @param node the node to support the parameter server.
def __init__(self, node):
self.param_srv = _CYBER_PARAM.new_PyParameterServer(node.node)
def __del__(self):
_CYBER_PARAM.delete_PyParameterServer(self.param_srv)
def set_parameter(self, param):
"""
set parameter, param is Parameter.
"""
return _CYBER_PARAM.PyParameter_srv_set_parameter(self.param_srv, param.param)
def get_parameter(self, param_name):
"""
get Parameter by param name param_name.
"""
return Parameter(_CYBER_PARAM.PyParameter_srv_get_parameter(self.param_srv, param_name))
def get_paramslist(self):
"""
get all params of this parameterserver.
"""
pycapsulelist = _CYBER_PARAM.PyParameter_srv_get_parameter_list(
self.param_srv)
param_list = []
for capsuleobj in pycapsulelist:
param_list.append(Parameter(capsuleobj))
return param_list
| 29.957219 | 96 | 0.642092 | 662 | 5,602 | 5.172205 | 0.222054 | 0.07097 | 0.079731 | 0.07097 | 0.45882 | 0.381133 | 0.262266 | 0.22722 | 0.22722 | 0.183411 | 0 | 0.003757 | 0.239736 | 5,602 | 186 | 97 | 30.11828 | 0.800188 | 0.28829 | 0 | 0.223684 | 0 | 0 | 0.03519 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.052632 | 0 | 0.513158 | 0.013158 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
163102785944ed53379248f4440280202da6afa3 | 334 | py | Python | SimCalorimetry/EcalElectronicsEmulation/python/EcalFEtoDigi_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 852 | 2015-01-11T21:03:51.000Z | 2022-03-25T21:14:00.000Z | SimCalorimetry/EcalElectronicsEmulation/python/EcalFEtoDigi_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 30,371 | 2015-01-02T00:14:40.000Z | 2022-03-31T23:26:05.000Z | SimCalorimetry/EcalElectronicsEmulation/python/EcalFEtoDigi_cfi.py | ckamtsikis/cmssw | ea19fe642bb7537cbf58451dcf73aa5fd1b66250 | [
"Apache-2.0"
] | 3,240 | 2015-01-02T05:53:18.000Z | 2022-03-31T17:24:21.000Z | import FWCore.ParameterSet.Config as cms
tccFlatToDigi = cms.EDProducer("EcalFEtoDigi",
FileEventOffset = cms.untracked.int32(0),
UseIdentityLUT = cms.untracked.bool(False),
SuperModuleId = cms.untracked.int32(-1),
debugPrintFlag = cms.untracked.bool(False),
FlatBaseName = cms.untracked.string('ecal_tcc_')
)
| 25.692308 | 52 | 0.736527 | 36 | 334 | 6.777778 | 0.638889 | 0.245902 | 0.139344 | 0.172131 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020906 | 0.140719 | 334 | 12 | 53 | 27.833333 | 0.829268 | 0 | 0 | 0 | 0 | 0 | 0.063444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
163c35d4ed0fbf3792f3a32b76398082bd1730ea | 779 | py | Python | dev/mssprocessor_20210422_quickpatch.py | XiminHu/mass-suite | c14733f1a33a80deb446e0de86b8d4a85fb17989 | [
"MIT"
] | 2 | 2020-04-03T20:54:33.000Z | 2022-02-16T11:37:32.000Z | dev/mssprocessor_20210422_quickpatch.py | XiminHu/mass-suite | c14733f1a33a80deb446e0de86b8d4a85fb17989 | [
"MIT"
] | 3 | 2020-06-17T18:35:33.000Z | 2022-01-21T22:57:49.000Z | dev/mssprocessor_20210422_quickpatch.py | XiminHu/mass-suite | c14733f1a33a80deb446e0de86b8d4a85fb17989 | [
"MIT"
] | 2 | 2020-04-14T18:31:03.000Z | 2020-08-26T23:17:23.000Z | import sys
sys.path.append('../')
import mss
from mss import mssmain as msm
from mss import align
from timeit import default_timer as timer
'''
path = input('filename to process:')
scans = msm.get_scans(path, ms_all=False, ms_lv=1)
#noise removal
msm.noise_removal(scans, 2000)
d_op = msm.peak_list(scans, 10, enable_score=True,peak_base=0.001,peak_area_thres=0)
output_path = input('path and filename to export:')
d_op.to_csv(output_path)
'''
def main():
    data_path = 'D:/UW/6ppdozonation_mzml/20210618_6PPDozonation_mzml_MSStest/'  # input('data path:')
    output_path = 'D:/UW/6ppdozonation_mzml/summary.csv'  # input('output path:')
    align.mss_process(data_path, output_path, err_ppm=20, thres_noise=3000, enable_score=False)
return
if __name__ == '__main__':
main()
| 32.458333 | 99 | 0.752246 | 126 | 779 | 4.365079 | 0.47619 | 0.090909 | 0.047273 | 0.072727 | 0.087273 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04209 | 0.115533 | 779 | 23 | 100 | 33.869565 | 0.756168 | 0.051348 | 0 | 0 | 0 | 0 | 0.248848 | 0.223502 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.384615 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1645dfc669a0aa9230aa95f67a836080889b2de0 | 5,753 | py | Python | reports/brokers/tests/test_databridge/test_download_from_doc_service.py | ITVaan/reports.brokers.bridge | 348dd3d47bc4769d3e86378c4c6709a980312ddd | [
"Apache-2.0"
] | null | null | null | reports/brokers/tests/test_databridge/test_download_from_doc_service.py | ITVaan/reports.brokers.bridge | 348dd3d47bc4769d3e86378c4c6709a980312ddd | [
"Apache-2.0"
] | null | null | null | reports/brokers/tests/test_databridge/test_download_from_doc_service.py | ITVaan/reports.brokers.bridge | 348dd3d47bc4769d3e86378c4c6709a980312ddd | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from gevent import monkey
from gevent.hub import LoopExit
from reports.brokers.databridge.doc_service_client import DocServiceClient
monkey.patch_all()
import requests_mock
import unittest
import hypothesis.strategies as st
from reports.brokers.databridge.utils import EdrDocument
from gevent.queue import Queue
from mock import MagicMock
from time import sleep
from hypothesis import given, assume
from yaml import dump as yaml_dump
from utils import AlmostAlwaysFalse
from reports.brokers.databridge.download_from_doc_service import DownloadFromDocServiceWorker
class TestDownloadFromDocServiceWorker(unittest.TestCase):
def setUp(self):
self.in_queue = Queue(100)
self.out_queue = Queue(100)
@given(st.text())
def test_init(self, url):
doc_client_mock = DocServiceClient(url)
worker = DownloadFromDocServiceWorker(MagicMock(), doc_client_mock, self.in_queue, self.out_queue)
self.assertEqual(worker.doc_client, doc_client_mock)
def test_start_jobs(self):
worker = DownloadFromDocServiceWorker(MagicMock(), MagicMock(), self.in_queue, self.out_queue)
worker.get_item_from_doc_service = MagicMock()
worker.retry_get_item_from_doc_service = MagicMock()
worker._start_jobs()
sleep(1)
worker.get_item_from_doc_service.assert_called_once()
worker.retry_get_item_from_doc_service.assert_called_once()
@given(st.uuids(), st.uuids(), st.text())
def test_get_items_from_doc_service(self, tender_id, bid_id, doc_url):
res = yaml_dump({"test": "test"})
doc_client_mock = MagicMock(download=MagicMock(return_value=MagicMock(raw=MagicMock(read=MagicMock(return_value=res)))))
data = EdrDocument(tender_id, bid_id, doc_url)
self.in_queue.put(data)
worker = DownloadFromDocServiceWorker(MagicMock(), doc_client_mock, self.in_queue, self.out_queue)
worker.exit = AlmostAlwaysFalse()
worker.get_item_from_doc_service()
self.assertEqual(worker.edr_data_to_database.get(), (tender_id, bid_id, {"test": "test"}))
@given(st.uuids(), st.uuids(), st.text())
def test_retry_get_items_from_doc_service(self, tender_id, bid_id, doc_url):
assume(doc_url != '')
res = yaml_dump({"test": "test"})
data = EdrDocument(tender_id, bid_id, doc_url)
doc_client_mock = MagicMock(download=MagicMock(return_value=MagicMock(raw=MagicMock(read=MagicMock(return_value=res)))))
worker = DownloadFromDocServiceWorker(MagicMock(), doc_client_mock, self.in_queue, self.out_queue)
worker.retry_items_to_download_queue.put(data)
worker.exit = AlmostAlwaysFalse()
worker.retry_get_item_from_doc_service()
self.assertEqual(worker.edr_data_to_database.get(), (tender_id, bid_id, {"test": "test"}))
#TODO this test will have to be updated after I finish interaction with DB
@requests_mock.Mocker()
def test_sent_get_request(self, mrequest):
url = '{host}:{port}/get/111'.format(host="127.0.0.1", port=6555)
mrequest.get(url, [{'json': {'data': {}}, 'status_code': 200}])
data = EdrDocument(1, 1, "127.0.0.1:6555/get/111")
self.in_queue.put(data)
client = DocServiceClient(host="127.0.0.1")
worker = DownloadFromDocServiceWorker(MagicMock(), client, self.in_queue, self.out_queue)
worker.exit = AlmostAlwaysFalse()
worker.get_item_from_doc_service()
self.assertEqual(len(mrequest.request_history), 1)
self.assertEqual(self.out_queue.get(), (data.tender_id, data.bid_id, {'data': {}}))
@requests_mock.Mocker()
def test_sent_get_request_fail_and_sux(self, mrequest):
url = '{host}:{port}/get/111'.format(host="127.0.0.1", port=6555)
mrequest.get(url, [{'status_code': 404}, {'json': {'data': {}}, 'status_code': 200}])
data = EdrDocument(1, 1, url)
self.in_queue.put(data)
client = DocServiceClient(host="127.0.0.1")
worker = DownloadFromDocServiceWorker(MagicMock(), client, self.in_queue, self.out_queue)
worker.exit = AlmostAlwaysFalse()
worker.temp_action()
self.assertEqual(len(mrequest.request_history), 2)
self.assertEqual(self.out_queue.get(), (data.tender_id, data.bid_id, {'data': {}}))
@requests_mock.Mocker()
def test_sent_get_request_exception(self, mrequest):
url = '{host}:{port}/get/111'.format(host="127.0.0.1", port=6555)
mrequest.get(url, [{'status_code': 404} for _ in range(6)])
data = EdrDocument(1, 1, "127.0.0.1:6555/get/111")
self.in_queue.put(data)
client = DocServiceClient(host="127.0.0.1")
worker = DownloadFromDocServiceWorker(MagicMock(), client, self.in_queue, self.out_queue)
worker.exit = AlmostAlwaysFalse()
worker.get_item_from_doc_service()
self.assertEqual(len(mrequest.request_history), 5)
self.assertEqual(self.out_queue.qsize(), 0)
@requests_mock.Mocker()
def test_retry_sent_get_request_exception(self, mrequest):
url = '{host}:{port}/get/111'.format(host="127.0.0.1", port=6555)
mrequest.get(url, [{'status_code': 404} for _ in range(6)] + [{'json': {'data': {}}, 'status_code': 200}])
data = EdrDocument(1, 1, url)
self.in_queue.put(data)
client = DocServiceClient(host="127.0.0.1")
worker = DownloadFromDocServiceWorker(MagicMock(), client, self.in_queue, self.out_queue)
worker.exit = AlmostAlwaysFalse()
worker.temp_action()
worker.retry_temp_action()
self.assertEqual(len(mrequest.request_history), 7)
self.assertEqual(self.out_queue.get(), (data.tender_id, data.bid_id, {'data': {}}))
| 47.941667 | 128 | 0.692334 | 756 | 5,753 | 5.027778 | 0.160053 | 0.022099 | 0.040516 | 0.015785 | 0.74875 | 0.73507 | 0.728756 | 0.683504 | 0.612207 | 0.589845 | 0 | 0.030137 | 0.175213 | 5,753 | 119 | 129 | 48.344538 | 0.770917 | 0.016339 | 0 | 0.509804 | 0 | 0 | 0.059052 | 0.022631 | 0 | 0 | 0 | 0.008403 | 0.127451 | 1 | 0.088235 | false | 0 | 0.137255 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1654be0eaafed11172c6d2958a3040ada760b046 | 1,344 | py | Python | Lib/test/test_frozen.py | arvindm95/unladen-swallow | 8175e37eaea7ca66ed03283b46bc1d2db0d3f9c3 | [
"PSF-2.0"
] | 2,293 | 2015-01-02T12:46:10.000Z | 2022-03-29T09:45:43.000Z | python/src/Lib/test/test_frozen.py | weiqiangzheng/sl4a | d3c17dca978cbeee545e12ea240a9dbf2a6999e9 | [
"Apache-2.0"
] | 315 | 2015-05-31T11:55:46.000Z | 2022-01-12T08:36:37.000Z | python/src/Lib/test/test_frozen.py | weiqiangzheng/sl4a | d3c17dca978cbeee545e12ea240a9dbf2a6999e9 | [
"Apache-2.0"
] | 1,033 | 2015-01-04T07:48:40.000Z | 2022-03-24T09:34:37.000Z | # Test the frozen module defined in frozen.c.
from test.test_support import captured_stdout, run_unittest
import unittest
import sys, os
class FrozenTests(unittest.TestCase):
def test_frozen(self):
with captured_stdout() as stdout:
try:
import __hello__
except ImportError, x:
self.fail("import __hello__ failed:" + str(x))
try:
import __phello__
except ImportError, x:
self.fail("import __phello__ failed:" + str(x))
try:
import __phello__.spam
except ImportError, x:
self.fail("import __phello__.spam failed:" + str(x))
if sys.platform != "mac": # On the Mac this import does succeed.
try:
import __phello__.foo
except ImportError:
pass
else:
self.fail("import __phello__.foo should have failed")
self.assertEquals(stdout.getvalue(),
'Hello world...\nHello world...\nHello world...\n')
del sys.modules['__hello__']
del sys.modules['__phello__']
del sys.modules['__phello__.spam']
def test_main():
run_unittest(FrozenTests)
if __name__ == '__main__':
test_main()
| 27.428571 | 77 | 0.551339 | 138 | 1,344 | 4.934783 | 0.391304 | 0.105727 | 0.082232 | 0.096916 | 0.232012 | 0.232012 | 0.111601 | 0 | 0 | 0 | 0 | 0 | 0.358631 | 1,344 | 48 | 78 | 28 | 0.790023 | 0.059524 | 0 | 0.205882 | 0 | 0 | 0.168121 | 0 | 0 | 0 | 0 | 0 | 0.029412 | 0 | null | null | 0.029412 | 0.441176 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
1657bca3d45344d66ae5d285560bdf6b1bcdf820 | 1,171 | py | Python | tests/test_get_invoices.py | tcosta84/python-bling | 9636718437edf6c4ee7e1da596e2590c4601770f | [
"MIT"
] | 4 | 2020-05-29T15:38:30.000Z | 2020-10-03T19:36:36.000Z | tests/test_get_invoices.py | WeisnerR/python-bling | 3624642dc8d0607e3e2486b516c6570f1798ad10 | [
"MIT"
] | null | null | null | tests/test_get_invoices.py | WeisnerR/python-bling | 3624642dc8d0607e3e2486b516c6570f1798ad10 | [
"MIT"
] | 6 | 2020-05-29T15:38:34.000Z | 2021-08-17T15:06:09.000Z | from bling import Api
def test_should_call_get_objects_with_correct_arguments_when_all_filters_are_provided(mocker):
mock_get_objects = mocker.patch.object(Api, '_get_objects')
api = Api(api_key='fake-api-key')
api.get_invoices(
issued_date=['17/05/2019', '17/05/2019'],
situation='pago',
type='S',
)
expected_params = {
'filters': 'dataEmissao[17/05/2019 TO 17/05/2019];situacao[pago];tipo[S]'
}
mock_get_objects.assert_called_with(
'notasfiscais', 'notafiscal', expected_params
)
def test_should_call_get_objects_with_correct_arguments_when_no_filters_are_provided(mocker):
mock_get_objects = mocker.patch.object(Api, '_get_objects')
api = Api(api_key='fake-api-key')
api.get_invoices()
expected_params = {}
mock_get_objects.assert_called_with(
'notasfiscais', 'notafiscal', expected_params
)
def test_should_return_correct_content(mocker):
mock_get_objects = mocker.patch.object(Api, '_get_objects')
api = Api(api_key='fake-api-key')
objs = api.get_invoices(['17/05/2019', '17/05/2019'], 1)
assert objs == mock_get_objects.return_value
| 28.560976 | 94 | 0.706234 | 162 | 1,171 | 4.722222 | 0.308642 | 0.143791 | 0.109804 | 0.078431 | 0.730719 | 0.730719 | 0.688889 | 0.688889 | 0.688889 | 0.688889 | 0 | 0.050463 | 0.170794 | 1,171 | 40 | 95 | 29.275 | 0.737384 | 0 | 0 | 0.357143 | 0 | 0.035714 | 0.194705 | 0.047822 | 0 | 0 | 0 | 0 | 0.107143 | 1 | 0.107143 | false | 0 | 0.035714 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
166351745a3b94d4972b9916a957b43b10bb6050 | 2,201 | py | Python | python/tests/extelem.py | sys-bio/libxslt | 7c1569a36fffc0449fa051d792d14957e894ff60 | [
"MIT"
] | 3 | 2016-06-27T19:01:47.000Z | 2021-05-04T01:21:16.000Z | python/tests/extelem.py | sys-bio/libxslt | 7c1569a36fffc0449fa051d792d14957e894ff60 | [
"MIT"
] | 5 | 2016-06-09T05:53:36.000Z | 2020-10-01T17:07:14.000Z | python/tests/extelem.py | sys-bio/libxslt | 7c1569a36fffc0449fa051d792d14957e894ff60 | [
"MIT"
] | 6 | 2015-12-13T11:33:08.000Z | 2018-09-18T17:11:49.000Z | #!/usr/bin/python -u
import sys
import string
import StringIO
import libxml2
# Memory debug specific
libxml2.debugMemory(1)
import libxslt
EXT_URL="http://example.com/foo"
insertNodeName = None
transformNodeName = None
def compile_test(style, inst, func):
pass
def transform_test(ctx, node, inst, comp):
global insertNodeName
#
# Small check to verify the context is correcly accessed
#
try:
#
# FIXME add some more sanity checks
#
tctxt = libxslt.transformCtxt(_obj=ctx)
insertNodeName = tctxt.insertNode().name
# FIXME find and confirm the note being replaced is called 'test'
# transformNodeName = libxml2.xmlNode(inst).name
except:
pass
tctxt.insertNode().addContent('SUCCESS')
styledoc = libxml2.parseDoc("""
<xsl:stylesheet version='1.0'
xmlns:xsl='http://www.w3.org/1999/XSL/Transform'
xmlns:foo='%s'
extension-element-prefixes='foo'>
<xsl:template match='/'>
<article><foo:test>FAILURE</foo:test></article>
<deeper><article><foo:test>something<foo:test>nested</foo:test>even</foo:test></article></deeper>
</xsl:template>
</xsl:stylesheet>
""" % EXT_URL)
style = libxslt.parseStylesheetDoc(styledoc)
libxslt.registerExtModuleElement("test", EXT_URL, compile_test, transform_test)
doc = libxml2.parseDoc("<doc/>")
result = style.applyStylesheet(doc, None)
style.freeStylesheet()
doc.freeDoc()
extensions = StringIO.StringIO()
libxslt.debugDumpExtensions(extensions)
if 0 and extensions.buf.find(EXT_URL) < 0:
print "Element extension not registered (or dumping broken)"
sys.exit(1)
root = result.children
if root.name != "article":
print "Unexpected root node name"
sys.exit(1)
if root.content != "SUCCESS":
print "Unexpected root node content, extension function failed"
sys.exit(1)
if insertNodeName != 'article':
print "The function callback failed to access its context"
sys.exit(1)
result.dump(sys.stdout)
result.freeDoc()
# Memory debug specific
libxslt.cleanup()
if libxml2.debugMemory(1) == 0:
print "OK"
else:
print "Memory leak %d bytes" % (libxml2.debugMemory(1))
libxml2.dumpMemory()
sys.exit(255)
| 24.186813 | 101 | 0.699228 | 277 | 2,201 | 5.523466 | 0.462094 | 0.027451 | 0.020915 | 0.026144 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015402 | 0.174012 | 2,201 | 90 | 102 | 24.455556 | 0.826183 | 0.119491 | 0 | 0.1 | 0 | 0.016667 | 0.320187 | 0.091853 | 0 | 0 | 0 | 0.011111 | 0 | 0 | null | null | 0.033333 | 0.083333 | null | null | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
166dcf04448d86163a95aec9592657857792e9b9 | 1,822 | py | Python | build/lib/openrobotics/robotics.py | XM522706601/robotics_tutorial_for_zhihu | 21d61098b69e22684de462f68b68f7550eaf7971 | [
"MIT"
] | 63 | 2019-05-07T02:51:26.000Z | 2021-03-22T12:58:44.000Z | build/lib/openrobotics/robotics.py | XM522706601/robotics_tutorial_for_zhihu | 21d61098b69e22684de462f68b68f7550eaf7971 | [
"MIT"
] | null | null | null | build/lib/openrobotics/robotics.py | XM522706601/robotics_tutorial_for_zhihu | 21d61098b69e22684de462f68b68f7550eaf7971 | [
"MIT"
] | 36 | 2019-05-22T03:09:51.000Z | 2021-03-19T08:01:38.000Z | import numpy as np
import math
PI = math.pi
TO_DEG = 180/PI
TO_RAD = PI/180
def cos(angle):
return math.cos(angle*TO_RAD)
def sin(angle):
return math.sin(angle*TO_RAD)
def rot_x(angle):
return np.array([[ 1, 0, 0 ],
[ 0, cos(angle), -sin(angle)],
[ 0, sin(angle), cos(angle)]])
def rot_y(angle):
return np.array([[ cos(angle), 0, sin(angle)],
[ 0, 1, 0 ],
[-sin(angle), 0, cos(angle)]])
def rot_z(angle):
return np.array([[ cos(angle), -sin(angle), 0],
[ sin(angle), cos(angle), 0],
[ 0, 0, 1]])
def rot_inv(rot):
return np.array(rot).T
def rot_to_euler(rot):
sy = math.sqrt(rot[0,0] * rot[0,0] + rot[1,0] * rot[1,0])
singular = (sy < 1e-6)
if (not singular):
x = math.atan2(rot[2,1] , rot[2,2])
y = math.atan2(-rot[2,0], sy)
z = math.atan2(rot[1,0], rot[0,0])
else:
x = math.atan2(-rot[1,2], rot[1,1])
y = math.atan2(-rot[2,0], sy)
z = 0
return np.array([x, y, z])
def euler_to_rot(euler):
return np.dot(rot_z(euler[2]), np.dot(rot_y(euler[1]), rot_x(euler[0])))
def trans_inv(trans):
pos, rot = decompose_trans(trans)
rot_i = rot_inv(rot)
return np.r_[np.c_[rot_i, -np.dot(rot_i, pos)], [[0, 0, 0, 1]]]
def pose_to_trans(pose):
pos = np.array(pose[0:3]).T
euler = pose[3:6]
rot = euler_to_rot(euler)
trans = compose_trans(pos, rot)
return trans
def compose_trans(pos, rot):
return np.array(np.r_[np.c_[np.array(rot), np.array([pos[0], pos[1], pos[2]])], [[0, 0, 0, 1]]])
def decompose_trans(trans):
trans = np.array(trans)
return np.array(trans[0: 3, 3]), np.array(trans[0: 3, 0: 3])
| 27.606061 | 100 | 0.523052 | 310 | 1,822 | 2.970968 | 0.145161 | 0.091205 | 0.098806 | 0.058632 | 0.30076 | 0.158523 | 0.110749 | 0.110749 | 0.071661 | 0 | 0 | 0.06 | 0.286498 | 1,822 | 65 | 101 | 28.030769 | 0.648462 | 0 | 0 | 0.038462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.038462 | 0.153846 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
1671ed62a3271ddf0b4e062b8e4fcd71aa1a7196 | 922 | py | Python | lalgebra/polygon.py | peaceknight05/linear | f478fcb93438bed4916c9af1fd07cf8ed2824edf | [
"MIT"
] | null | null | null | lalgebra/polygon.py | peaceknight05/linear | f478fcb93438bed4916c9af1fd07cf8ed2824edf | [
"MIT"
] | null | null | null | lalgebra/polygon.py | peaceknight05/linear | f478fcb93438bed4916c9af1fd07cf8ed2824edf | [
"MIT"
] | null | null | null | from point import Point
from line import Line
from vector import Vector
import math
class InvalidPolygonError(Exception):
pass
class Polygon:
vertices = []
__calc = property(lambda self: self.vertices + [self.vertices[0]])
area = property(lambda self: 0.5 * (sum([self.__calc[v].x * self.__calc[v+1].y for v in range(0, len(self.__calc)-1)]) + sum([-1*(self.__calc[v].x * self.__calc[v-1].y) for v in range(1, len(self.__calc))])))
centroid = property(lambda self: Vector(sum([a.x for a in self.vertices])/len(self.vertices),sum([a.y for a in self.vertices])/len(self.vertices)))
def __init__(self, vertices):
if len(vertices) < 3: raise InvalidPolygonError("A polygon must have at least 3 vertices!")
self.vertices = vertices
l = [[math.atan2(a.y - self.centroid.y, a.x - self.centroid.x), a] for a in vertices]
l.sort()
self.vertices = [a[1] for a in l] | 46.1 | 212 | 0.661605 | 150 | 922 | 3.946667 | 0.28 | 0.182432 | 0.060811 | 0.033784 | 0.219595 | 0.219595 | 0.219595 | 0.219595 | 0.108108 | 0.108108 | 0 | 0.01731 | 0.185466 | 922 | 20 | 213 | 46.1 | 0.770972 | 0 | 0 | 0 | 0 | 0 | 0.043337 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0.058824 | 0.235294 | 0 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
167b54a66f37c615a8c1b61008d700d7d66e1889 | 2,127 | py | Python | pyncm/apis/track.py | asdMild/pyncm | 493e5b7ff26547e049dfd3f25431e57cb6f61709 | [
"Apache-2.0"
] | null | null | null | pyncm/apis/track.py | asdMild/pyncm | 493e5b7ff26547e049dfd3f25431e57cb6f61709 | [
"Apache-2.0"
] | null | null | null | pyncm/apis/track.py | asdMild/pyncm | 493e5b7ff26547e049dfd3f25431e57cb6f61709 | [
"Apache-2.0"
] | null | null | null | '''Songs - Track APIs'''
from . import EapiEncipered, WeapiCryptoRequest,LoginRequiredApi
import json
@WeapiCryptoRequest
def GetTrackDetail(song_ids: list):
    '''Web client - Get track details
    Args:
        song_ids (list): track IDs
    Returns:
        dict
    '''
ids = song_ids if isinstance(song_ids,list) else [song_ids]
return '/weapi/v3/song/detail',{"c":json.dumps([{'id': str(id)} for id in ids])}
@WeapiCryptoRequest
def GetTrackAudio(song_ids: list, quality='lossless', encodeType='aac'):
    '''Web client - Get track audio details (file URL, MD5, ...)
    Args:
        song_ids (list): track IDs
        quality (str, optional): standard / high / lossless. Defaults to 'lossless'.
        encodeType (str, optional): audio encoding. Defaults to 'aac'.
    Returns:
        dict
    '''
ids = song_ids if isinstance(song_ids,list) else [song_ids]
return '/weapi/song/enhance/player/url/v1',{"ids":ids,"level":quality,"encodeType":encodeType}
@WeapiCryptoRequest
def GetTrackLyrics(song_id: str):
    '''Web client - Get track lyrics
    Args:
        song_id (str): track ID
    Returns:
        dict
    '''
return '/weapi/song/lyric',{"id": str(song_id), "lv": -1, "tv": -1}
@WeapiCryptoRequest
def GetTrackComments(song_id, offset=0, limit=20, beforeTime=0):
    '''Web client - Get track comments
    Args:
        song_id (str): track ID
        offset (int, optional): chronological offset. Defaults to 0.
        limit (int, optional): number of comments fetched per call. Defaults to 20.
        beforeTime (int, optional): fetch comments starting from this timestamp (in seconds). Defaults to 0.
    Returns:
        dict
    '''
return '/weapi/v1/resource/comments/R_SO_4_%s' % song_id,{"rid":str(song_id),"offset":str(offset),"total":"true","limit":str(limit),"beforeTime":str(beforeTime * 1000)}
@EapiEncipered
@WeapiCryptoRequest
@LoginRequiredApi
def SetLikeTrack(trackId, like=True, userid=0, e_r=True):
    '''PC client - Add a track to `My Favorite Music` (like)
    Args:
        trackId (int): track ID
        like (bool, optional): like or unlike. Defaults to True.
        userid (int, optional): reserved. Defaults to 0.
    Returns:
        dict
    '''
return '/api/song/like',{"trackId":str(trackId),"userid":str(userid),"like":str(like).lower(),"e_r":str(e_r).lower()} | 27.986842 | 172 | 0.631876 | 268 | 2,127 | 4.929104 | 0.354478 | 0.05299 | 0.049962 | 0.02271 | 0.171083 | 0.171083 | 0.099924 | 0.099924 | 0.099924 | 0.099924 | 0 | 0.012522 | 0.211566 | 2,127 | 76 | 173 | 27.986842 | 0.775194 | 0.362953 | 0 | 0.333333 | 0 | 0 | 0.179293 | 0.076599 | 0 | 0 | 0 | 0 | 0 | 1 | 0.238095 | false | 0 | 0.095238 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1680d7b56c0956364b744453f82d01af770e2654 | 8,046 | py | Python | tests/aws_batch_db/conftest.py | dazza-codes/aio-aws | 5bce9e0adbb6d4613748c3f9b6e265aa35144372 | [
"Apache-2.0"
] | 2 | 2021-09-14T10:10:23.000Z | 2021-12-07T02:42:00.000Z | tests/aws_batch_db/conftest.py | dazza-codes/aio-aws | 5bce9e0adbb6d4613748c3f9b6e265aa35144372 | [
"Apache-2.0"
] | 19 | 2020-10-13T02:41:25.000Z | 2022-03-29T06:09:38.000Z | tests/aws_batch_db/conftest.py | dazza-codes/aio-aws | 5bce9e0adbb6d4613748c3f9b6e265aa35144372 | [
"Apache-2.0"
] | null | null | null | import pytest
from aio_aws.aws_batch_models import AWSBatchJob
#
# TODO: create fixtures for the same job-name with different job-id and different status
#
#
# TODO: create fixtures for the same job-name with different status
#
#
# TODO: change all fixtures with different job-name and job-id with different status
#
@pytest.fixture
def aws_batch_job_submitted() -> AWSBatchJob:
# "87e36046-c675-4c84-bedd-7cffbca0528c"
job_data = {
"job_id": "1aac2eab-897a-4b89-9eb3-08986fbb7144",
"job_name": "sleep-1-job",
"job_queue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"job_definition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
"job_submission": {
"ResponseMetadata": {
"HTTPHeaders": {
"content-length": "75",
"content-type": "text/html; charset=utf-8",
"date": "Mon, 23 Mar 2020 15:29:33 GMT",
"server": "amazon.com",
},
"HTTPStatusCode": 200,
"RetryAttempts": 0,
},
"jobName": "sleep-1-job",
"jobId": "1aac2eab-897a-4b89-9eb3-08986fbb7144",
},
"job_description": None,
"container_overrides": {
"command": ["/bin/sh", "-c", "echo Hello && sleep 1 && echo Bye"]
},
"command": ["/bin/sh", "-c", "echo Hello && sleep 1 && echo Bye"],
"depends_on": [],
"status": "SUBMITTED",
"job_tries": ["1aac2eab-897a-4b89-9eb3-08986fbb7144"],
"max_tries": 4,
"num_tries": 1,
}
return AWSBatchJob(**job_data)
@pytest.fixture
def aws_batch_job() -> AWSBatchJob:
# TODO: use some kind of faker-db for pydantic models and
# create pydantic models or schema for all elements
job_data = {
"job_id": "1aac2eab-897a-4b89-9eb3-08986fbb7144",
"job_name": "sleep-1-job",
"job_queue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"job_definition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
"job_submission": {
"ResponseMetadata": {
"HTTPHeaders": {
"content-length": "75",
"content-type": "text/html; charset=utf-8",
"date": "Mon, 23 Mar 2020 15:29:33 GMT",
"server": "amazon.com",
},
"HTTPStatusCode": 200,
"RetryAttempts": 0,
},
"jobId": "1aac2eab-897a-4b89-9eb3-08986fbb7144",
"jobName": "sleep-1-job",
},
"job_tries": ["1aac2eab-897a-4b89-9eb3-08986fbb7144"],
"max_tries": 4,
"num_tries": 1,
"status": "SUCCEEDED",
"command": ["/bin/sh", "-c", "echo Hello && sleep 0.2 && echo Bye"],
"container_overrides": {
"command": ["/bin/sh", "-c", "echo Hello && sleep 0.2 && echo Bye"]
},
"depends_on": [],
"job_description": {
"container": {
"command": [
'/bin/sh -c "for a in `seq 1 '
"10`; do echo Hello World; "
'sleep 1; done"'
],
"logStreamName": "moto_test_job_definition/default/1aac2eab-897a-4b89-9eb3-08986fbb7144",
"privileged": False,
"readonlyRootFilesystem": False,
"ulimits": [],
"vcpus": 1,
"volumes": [],
},
"dependsOn": [],
"jobDefinition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
"jobId": "1aac2eab-897a-4b89-9eb3-08986fbb7144",
"jobName": "sleep-1-job",
"jobQueue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"createdAt": 1584977374, # https://github.com/spulec/moto/issues/2829
"startedAt": 1584977376,
"status": "SUCCEEDED",
"stoppedAt": 1584977386,
},
}
return AWSBatchJob(**job_data)
PENDING = {
"jobName": "sleep-1-job",
"jobId": "f6b11be1-1a92-4999-8628-bf8ff9380106",
"jobQueue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"status": "PENDING",
"dependsOn": [],
"jobDefinition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
}
RUNNABLE = {
"jobName": "sleep-1-job",
"jobId": "347a2870-d688-4fa7-bbec-4f9ffd556980",
"jobQueue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"status": "RUNNABLE",
"dependsOn": [],
"jobDefinition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
}
STARTING = {
"jobName": "sleep-1-job",
"jobId": "4190b441-0ade-4584-a474-162bb793bf50",
"jobQueue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"status": "STARTING",
"dependsOn": [],
"jobDefinition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
}
RUNNING = {
"jobName": "sleep-1-job",
"jobId": "fe93a1e2-b415-43df-bb82-f9081bd8b97b",
"jobQueue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"status": "RUNNING",
"startedAt": 1631597959,
"dependsOn": [],
"jobDefinition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
}
SUCCEEDED = {
"jobName": "sleep-1-job",
"jobId": "e8a69543-2cc3-4412-917d-a189d1567d47",
"jobQueue": "arn:aws:batch:us-west-2:123456789012:job-queue/moto_test_job_queue",
"status": "SUCCEEDED",
"startedAt": 1631598039,
"stoppedAt": 1631598040,
"dependsOn": [],
"jobDefinition": "arn:aws:batch:us-west-2:123456789012:job-definition/moto_test_job_definition:1",
"container": {
"vcpus": 1,
"command": [
'/bin/sh -c "for a in `seq 1 10`; do echo Hello World; sleep 1; done"'
],
"volumes": [],
"readonlyRootFilesystem": False,
"ulimits": [],
"privileged": False,
"logStreamName": "moto_test_job_definition/default/e8a69543-2cc3-4412-917d-a189d1567d47",
},
}
@pytest.fixture
def aws_batch_job_succeeded(aws_batch_job):
job = AWSBatchJob(**aws_batch_job.db_data)
job.job_name = "job-SUCCEEDED"
assert job.status == "SUCCEEDED"
return job
@pytest.fixture
def aws_batch_job_failed(aws_batch_job):
job = AWSBatchJob(**aws_batch_job.db_data)
job.job_name = "job-FAILED"
job.job_id = aws_batch_job.job_id.replace("08986fbb7144", "08986fbb7145")
job.job_submission["jobId"] = job.job_id
job.job_description["status"] = "FAILED"
job.status = job.job_description["status"]
return job
@pytest.fixture
def aws_batch_job_starting(aws_batch_job):
job = AWSBatchJob(**aws_batch_job.db_data)
job.job_name = "job-RUNNING"
job.job_id = aws_batch_job.job_id.replace("08986fbb7144", "08986fbb7146")
job.job_submission["jobId"] = job.job_id
job.job_description["status"] = "STARTING"
del job.job_description["startedAt"]
del job.job_description["stoppedAt"]
del job.job_description["container"]
job.status = job.job_description["status"]
return job
@pytest.fixture
def aws_batch_job_running(aws_batch_job):
job = AWSBatchJob(**aws_batch_job.db_data)
job.job_name = "job-RUNNING"
job.job_id = aws_batch_job.job_id.replace("08986fbb7144", "08986fbb7147")
job.job_submission["jobId"] = job.job_id
job.job_description["status"] = "RUNNING"
del job.job_description["stoppedAt"]
del job.job_description["container"]
job.status = job.job_description["status"]
return job
@pytest.fixture
def aws_batch_jobs(
aws_batch_job_succeeded, aws_batch_job_failed, aws_batch_job_running
):
return aws_batch_job_succeeded, aws_batch_job_failed, aws_batch_job_running
| 35.76 | 110 | 0.610117 | 945 | 8,046 | 5.01164 | 0.186243 | 0.069257 | 0.053421 | 0.043919 | 0.770904 | 0.725084 | 0.684544 | 0.679054 | 0.671453 | 0.663851 | 0 | 0.114845 | 0.238131 | 8,046 | 224 | 111 | 35.919643 | 0.657749 | 0.053318 | 0 | 0.611702 | 0 | 0.047872 | 0.46942 | 0.232277 | 0 | 0 | 0 | 0.004464 | 0.005319 | 1 | 0.037234 | false | 0 | 0.010638 | 0.005319 | 0.085106 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
168476fbfba632d2971ab991ca8fa0e5f6448613 | 850 | py | Python | Part-03-Understanding-Software-Crafting-Your-Own-Tools/models/edx-platform/openedx/core/djangoapps/util/tests/test_signals.py | osoco/better-ways-of-thinking-about-software | 83e70d23c873509e22362a09a10d3510e10f6992 | [
"MIT"
] | 3 | 2021-12-15T04:58:18.000Z | 2022-02-06T12:15:37.000Z | Part-03-Understanding-Software-Crafting-Your-Own-Tools/models/edx-platform/openedx/core/djangoapps/util/tests/test_signals.py | osoco/better-ways-of-thinking-about-software | 83e70d23c873509e22362a09a10d3510e10f6992 | [
"MIT"
] | null | null | null | Part-03-Understanding-Software-Crafting-Your-Own-Tools/models/edx-platform/openedx/core/djangoapps/util/tests/test_signals.py | osoco/better-ways-of-thinking-about-software | 83e70d23c873509e22362a09a10d3510e10f6992 | [
"MIT"
] | 1 | 2019-01-02T14:38:50.000Z | 2019-01-02T14:38:50.000Z | # pylint: disable=no-member, missing-docstring
from unittest import TestCase
from pytest import mark
from celery import shared_task
from django.test.utils import override_settings
from edx_django_utils.cache import RequestCache
@mark.django_db
class TestClearRequestCache(TestCase):
"""
Tests _clear_request_cache is called after celery task is run.
"""
def _get_cache(self):
return RequestCache("TestClearRequestCache")
@shared_task
def _dummy_task(self):
""" A task that adds stuff to the request cache. """
self._get_cache().set("cache_key", "blah blah")
@override_settings(CLEAR_REQUEST_CACHE_ON_TASK_COMPLETION=True)
def test_clear_cache_celery(self):
self._dummy_task.apply(args=(self,)).get()
assert not self._get_cache().get_cached_response('cache_key').is_found
| 29.310345 | 78 | 0.738824 | 114 | 850 | 5.219298 | 0.5 | 0.060504 | 0.057143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170588 | 850 | 28 | 79 | 30.357143 | 0.843972 | 0.181176 | 0 | 0 | 0 | 0 | 0.071217 | 0.031157 | 0 | 0 | 0 | 0 | 0.0625 | 1 | 0.1875 | false | 0 | 0.3125 | 0.0625 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
168bc5240af220fb0bd230ccbce3bd5e95b50c07 | 291 | py | Python | site_utils/handlers.py | ninemoreminutes/django-site-utils | d0c9f360451593f20ce0e80866f7e76185c0764b | [
"BSD-3-Clause"
] | null | null | null | site_utils/handlers.py | ninemoreminutes/django-site-utils | d0c9f360451593f20ce0e80866f7e76185c0764b | [
"BSD-3-Clause"
] | 13 | 2020-05-07T03:57:03.000Z | 2022-03-12T00:54:56.000Z | site_utils/handlers.py | ninemoreminutes/django-site-utils | d0c9f360451593f20ce0e80866f7e76185c0764b | [
"BSD-3-Clause"
] | null | null | null | # Python
from __future__ import unicode_literals
__all__ = ['handler400', 'handler403', 'handler404', 'handler500']
handler400 = 'site_utils.views.handle_400'
handler403 = 'site_utils.views.handle_403'
handler404 = 'site_utils.views.handle_404'
handler500 = 'site_utils.views.handle_500'
| 26.454545 | 66 | 0.786942 | 35 | 291 | 6.057143 | 0.542857 | 0.169811 | 0.264151 | 0.377358 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 0.092784 | 291 | 10 | 67 | 29.1 | 0.666667 | 0.020619 | 0 | 0 | 0 | 0 | 0.522968 | 0.381625 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
16c0ec38afd170e4928850b7995d5d4e1e5ff2bf | 1,546 | py | Python | log_viewer/controllers/session_controller.py | neotherack/log-viewer | 06f302afc355141fe0f5e75abc61d2389561c7ad | [
"MIT"
] | 2 | 2021-05-02T09:58:10.000Z | 2021-08-13T04:24:39.000Z | log_viewer/controllers/session_controller.py | neotherack/log-viewer | 06f302afc355141fe0f5e75abc61d2389561c7ad | [
"MIT"
] | null | null | null | log_viewer/controllers/session_controller.py | neotherack/log-viewer | 06f302afc355141fe0f5e75abc61d2389561c7ad | [
"MIT"
] | 1 | 2020-04-21T20:50:04.000Z | 2020-04-21T20:50:04.000Z | from flask import current_app, session
from log_viewer.controllers.exceptions import BadUserOrPasswordException
class SessionController:
"""
Class to control the user's session.
"""
def __init__(self):
self._db_user = current_app.config['USER']
self._db_psw = current_app.config['PASSWORD']
self._session = session
def is_login_needed(self):
"""
Returns True if a user or password is set, False otherwise.
:return bool:
"""
return any([self._db_user is not None, self._db_psw is not None])
def is_logged_in(self):
"""
Returns True if user is logged in, False otherwise.
:return bool:
"""
return all(['username' in self._session, 'password' in self._session])
def drop_session(self):
"""
Removes the username and password from the session.
"""
self._session.pop('username', None)
self._session.pop('password', None)
def login_user(self, user, password):
"""
Stores the user and password in the session.
:param user: User to login.
:param password: Password.
:raise BadUserOrPasswordException: Bad user or password.
"""
if (self._db_user is not None and self._db_user != user) or \
(self._db_psw is not None and self._db_psw != password):
raise BadUserOrPasswordException("Bad user or password.")
else:
self._session['username'] = user
self._session['password'] = password
| 31.55102 | 78 | 0.610608 | 186 | 1,546 | 4.876344 | 0.290323 | 0.052922 | 0.044101 | 0.037486 | 0.390298 | 0.277839 | 0.123484 | 0 | 0 | 0 | 0 | 0 | 0.289133 | 1,546 | 48 | 79 | 32.208333 | 0.825296 | 0.233506 | 0 | 0 | 0 | 0 | 0.07811 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.238095 | false | 0.380952 | 0.095238 | 0 | 0.47619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
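The login logic above can be exercised without Flask by substituting plain dicts for `current_app.config` and `session` (hypothetical stand-ins for illustration; the real class reads `flask.current_app` and `flask.session`):

```python
# Standalone sketch of SessionController's credential checks, with dicts
# standing in for Flask's app config and session objects (assumption).

class BadUserOrPasswordException(Exception):
    pass


class SessionController:
    def __init__(self, config, session):
        self._db_user = config.get("USER")
        self._db_psw = config.get("PASSWORD")
        self._session = session

    def is_login_needed(self):
        return any([self._db_user is not None, self._db_psw is not None])

    def is_logged_in(self):
        return all(["username" in self._session, "password" in self._session])

    def login_user(self, user, password):
        if (self._db_user is not None and self._db_user != user) or \
                (self._db_psw is not None and self._db_psw != password):
            raise BadUserOrPasswordException("Bad user or password.")
        self._session["username"] = user
        self._session["password"] = password


ctrl = SessionController({"USER": "admin", "PASSWORD": "s3cret"}, {})
assert ctrl.is_login_needed()
ctrl.login_user("admin", "s3cret")
assert ctrl.is_logged_in()
```

Note that `login_user` only rejects a credential when the corresponding config value is set, so a deployment with no `USER`/`PASSWORD` configured accepts any login.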
16c703d0741c32d21630fd90d0d7a7ad1bf55858 | 1,146 | py | Python | qulab/storage/schema/base.py | feihoo87/QuLab | cc16f4777e5523fca327f7f0a9725fd13f9b057f | [
"MIT"
] | 16 | 2018-03-16T12:08:31.000Z | 2022-03-20T08:53:35.000Z | qulab/storage/schema/base.py | Lagikna/QuLab | 417111cffd3941b4a581ebf80066863e78476f0a | [
"MIT"
] | 148 | 2018-03-18T09:33:18.000Z | 2022-03-21T16:00:15.000Z | qulab/storage/schema/base.py | feihoo87/QuLab | cc16f4777e5523fca327f7f0a9725fd13f9b057f | [
"MIT"
] | 14 | 2018-03-18T08:00:12.000Z | 2020-10-21T12:39:42.000Z | import datetime
import io
import lzma
import pickle
from mongoengine import signals
def now():
return datetime.datetime.now()
def to_pickle(obj):
buff = pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL)
cbuff = lzma.compress(buff, format=lzma.FORMAT_XZ)
return io.BytesIO(cbuff)
def from_pickle(buff):
return pickle.loads(lzma.decompress(buff.read(), format=lzma.FORMAT_XZ))
def handler(event):
"""Signal decorator to allow use of callback functions as class decorators."""
def decorator(fn):
def apply(cls):
event.connect(fn, sender=cls)
return cls
fn.apply = apply
return fn
return decorator
@handler(signals.pre_save)
def update_modified(sender, document, **kwargs):
if kwargs.get('finished', False):
document.finished_time = now()
document.modified_time = now()
@handler(signals.pre_delete)
def delete_children(sender, document):
for attrname in ['datafield']:
attr = getattr(document, attrname, None)
if attr is not None:
attr.delete()
for child in document.children:
child.delete()
| 22.038462 | 82 | 0.67452 | 146 | 1,146 | 5.219178 | 0.445205 | 0.026247 | 0.041995 | 0.047244 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222513 | 1,146 | 51 | 83 | 22.470588 | 0.855219 | 0.062827 | 0 | 0 | 0 | 0 | 0.015918 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.147059 | 0.058824 | 0.558824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
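The `handler` decorator in base.py attaches a callback function to a signal and gives it an `.apply` hook so the same function can later be bound to a document class. A self-contained sketch of the pattern, using a tiny stand-in `Signal` class instead of `mongoengine.signals` (the stand-in is an assumption; mongoengine's signals expose a compatible `connect(fn, sender=cls)`):

```python
# Minimal re-creation of the handler/apply pattern from base.py,
# with a toy Signal class replacing mongoengine.signals (assumption).

class Signal:
    def __init__(self):
        self._receivers = []

    def connect(self, fn, sender=None):
        self._receivers.append((sender, fn))

    def send(self, sender, **kwargs):
        for bound_sender, fn in self._receivers:
            if bound_sender is sender:
                fn(sender, **kwargs)


pre_save = Signal()


def handler(event):
    """Signal decorator to allow use of callback functions as class decorators."""
    def decorator(fn):
        def apply(cls):
            event.connect(fn, sender=cls)
            return cls
        fn.apply = apply
        return fn
    return decorator


@handler(pre_save)
def update_modified(sender, document, **kwargs):
    document.modified = True


class Doc:
    modified = False


update_modified.apply(Doc)          # bind the callback to the Doc class
doc = Doc()
pre_save.send(Doc, document=doc)    # fires the bound callback
assert doc.modified is True
```

Because `apply` returns the class, `update_modified.apply` can also be used directly as a class decorator, which is exactly how base.py wires `update_modified` and `delete_children` onto document classes.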
16d9ffa024558e26f91a08af50a04893d79d1848 | 389 | py | Python | mimesis/data/int/text.py | chinghwayu/mimesis | 3afcda8e68ee2f6feb61a5c7ca663328909828fa | [
"MIT"
] | 2,619 | 2017-07-18T13:25:46.000Z | 2022-03-31T17:52:53.000Z | mimesis/data/int/text.py | chinghwayu/mimesis | 3afcda8e68ee2f6feb61a5c7ca663328909828fa | [
"MIT"
] | 947 | 2017-07-15T18:32:12.000Z | 2022-03-28T10:04:15.000Z | mimesis/data/int/text.py | chinghwayu/mimesis | 3afcda8e68ee2f6feb61a5c7ca663328909828fa | [
"MIT"
] | 328 | 2017-07-18T01:11:12.000Z | 2022-03-30T09:20:48.000Z | # -*- coding: utf-8 -*-
"""Provides all the data related to text."""
SAFE_COLORS = [
"#1abc9c",
"#16a085",
"#2ecc71",
"#27ae60",
"#3498db",
"#2980b9",
"#9b59b6",
"#8e44ad",
"#34495e",
"#2c3e50",
"#f1c40f",
"#f39c12",
"#e67e22",
"#d35400",
"#e74c3c",
"#c0392b",
"#ecf0f1",
"#bdc3c7",
"#95a5a6",
"#7f8c8d",
]
| 14.407407 | 44 | 0.442159 | 32 | 389 | 5.34375 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.273063 | 0.303342 | 389 | 26 | 45 | 14.961538 | 0.357934 | 0.156812 | 0 | 0 | 0 | 0 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
16e27b0ade01738d9441683fb333be54674089ca | 415 | py | Python | pokedex/accounts/admin.py | JGabriel-AbreuM/Pokedex2 | e900850aa12974f1c34b08de5bb741616169b75e | [
"MIT"
] | null | null | null | pokedex/accounts/admin.py | JGabriel-AbreuM/Pokedex2 | e900850aa12974f1c34b08de5bb741616169b75e | [
"MIT"
] | null | null | null | pokedex/accounts/admin.py | JGabriel-AbreuM/Pokedex2 | e900850aa12974f1c34b08de5bb741616169b75e | [
"MIT"
] | null | null | null | from django.contrib import admin
from django.contrib.auth import admin as auth_admin
from .forms import UserChangeForm, UserCreationForm
from .models import User
@admin.register(User)
class UserAdmin(auth_admin.UserAdmin):
form = UserChangeForm
add_form = UserCreationForm
model = User
fieldsets = auth_admin.UserAdmin.fieldsets + (
("OTP", {"fields": ("codigo",)}),
)
| 25.9375 | 52 | 0.698795 | 46 | 415 | 6.217391 | 0.478261 | 0.094406 | 0.118881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.207229 | 415 | 15 | 53 | 27.666667 | 0.869301 | 0 | 0 | 0 | 0 | 0 | 0.0375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
bc43cbcf366b8ea445c927cbc4a067729e2f0e91 | 42,404 | py | Python | firestore/google/cloud/firestore_admin_v1/proto/firestore_admin_pb2.py | hugovk/google-cloud-python | b387134827dbc3be0e1b431201e0875798002fda | [
"Apache-2.0"
] | 1 | 2020-01-09T14:42:37.000Z | 2020-01-09T14:42:37.000Z | firestore/google/cloud/firestore_admin_v1/proto/firestore_admin_pb2.py | hugovk/google-cloud-python | b387134827dbc3be0e1b431201e0875798002fda | [
"Apache-2.0"
] | 6 | 2019-05-27T22:05:58.000Z | 2019-08-05T16:46:16.000Z | firestore/google/cloud/firestore_admin_v1/proto/firestore_admin_pb2.py | hugovk/google-cloud-python | b387134827dbc3be0e1b431201e0875798002fda | [
"Apache-2.0"
] | 1 | 2019-03-29T18:26:16.000Z | 2019-03-29T18:26:16.000Z | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: google/cloud/firestore/admin_v1/proto/firestore_admin.proto
import sys
_b = sys.version_info[0] < 3 and (lambda x: x) or (lambda x: x.encode("latin1"))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
from google.cloud.firestore_admin_v1.proto import (
field_pb2 as google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_field__pb2,
)
from google.cloud.firestore_admin_v1.proto import (
index_pb2 as google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_index__pb2,
)
from google.longrunning import (
operations_pb2 as google_dot_longrunning_dot_operations__pb2,
)
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
from google.protobuf import field_mask_pb2 as google_dot_protobuf_dot_field__mask__pb2
from google.api import client_pb2 as google_dot_api_dot_client__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name="google/cloud/firestore/admin_v1/proto/firestore_admin.proto",
package="google.firestore.admin.v1",
syntax="proto3",
serialized_options=_b(
"\n\035com.google.firestore.admin.v1B\023FirestoreAdminProtoP\001Z>google.golang.org/genproto/googleapis/firestore/admin/v1;admin\242\002\004GCFS\252\002\037Google.Cloud.Firestore.Admin.V1\312\002\037Google\\Cloud\\Firestore\\Admin\\V1"
),
serialized_pb=_b(
'\n;google/cloud/firestore/admin_v1/proto/firestore_admin.proto\x12\x19google.firestore.admin.v1\x1a\x1cgoogle/api/annotations.proto\x1a\x31google/cloud/firestore/admin_v1/proto/field.proto\x1a\x31google/cloud/firestore/admin_v1/proto/index.proto\x1a#google/longrunning/operations.proto\x1a\x1bgoogle/protobuf/empty.proto\x1a google/protobuf/field_mask.proto\x1a\x17google/api/client.proto"U\n\x12\x43reateIndexRequest\x12\x0e\n\x06parent\x18\x01 \x01(\t\x12/\n\x05index\x18\x02 \x01(\x0b\x32 .google.firestore.admin.v1.Index"[\n\x12ListIndexesRequest\x12\x0e\n\x06parent\x18\x01 \x01(\t\x12\x0e\n\x06\x66ilter\x18\x02 \x01(\t\x12\x11\n\tpage_size\x18\x03 \x01(\x05\x12\x12\n\npage_token\x18\x04 \x01(\t"a\n\x13ListIndexesResponse\x12\x31\n\x07indexes\x18\x01 \x03(\x0b\x32 .google.firestore.admin.v1.Index\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"\x1f\n\x0fGetIndexRequest\x12\x0c\n\x04name\x18\x01 \x01(\t""\n\x12\x44\x65leteIndexRequest\x12\x0c\n\x04name\x18\x01 \x01(\t"v\n\x12UpdateFieldRequest\x12/\n\x05\x66ield\x18\x01 \x01(\x0b\x32 .google.firestore.admin.v1.Field\x12/\n\x0bupdate_mask\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.FieldMask"\x1f\n\x0fGetFieldRequest\x12\x0c\n\x04name\x18\x01 \x01(\t"Z\n\x11ListFieldsRequest\x12\x0e\n\x06parent\x18\x01 \x01(\t\x12\x0e\n\x06\x66ilter\x18\x02 \x01(\t\x12\x11\n\tpage_size\x18\x03 \x01(\x05\x12\x12\n\npage_token\x18\x04 \x01(\t"_\n\x12ListFieldsResponse\x12\x30\n\x06\x66ields\x18\x01 \x03(\x0b\x32 .google.firestore.admin.v1.Field\x12\x17\n\x0fnext_page_token\x18\x02 \x01(\t"Y\n\x16\x45xportDocumentsRequest\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x16\n\x0e\x63ollection_ids\x18\x02 \x03(\t\x12\x19\n\x11output_uri_prefix\x18\x03 \x01(\t"X\n\x16ImportDocumentsRequest\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x16\n\x0e\x63ollection_ids\x18\x02 \x03(\t\x12\x18\n\x10input_uri_prefix\x18\x03 \x01(\t2\xfd\x0c\n\x0e\x46irestoreAdmin\x12\xaa\x01\n\x0b\x43reateIndex\x12-.google.firestore.admin.v1.CreateIndexRequest\x1a\x1d.google.longrunning.Operation"M\x82\xd3\xe4\x93\x02G">/v1/{parent=projects/*/databases/*/collectionGroups/*}/indexes:\x05index\x12\xb4\x01\n\x0bListIndexes\x12-.google.firestore.admin.v1.ListIndexesRequest\x1a..google.firestore.admin.v1.ListIndexesResponse"F\x82\xd3\xe4\x93\x02@\x12>/v1/{parent=projects/*/databases/*/collectionGroups/*}/indexes\x12\xa0\x01\n\x08GetIndex\x12*.google.firestore.admin.v1.GetIndexRequest\x1a .google.firestore.admin.v1.Index"F\x82\xd3\xe4\x93\x02@\x12>/v1/{name=projects/*/databases/*/collectionGroups/*/indexes/*}\x12\x9c\x01\n\x0b\x44\x65leteIndex\x12-.google.firestore.admin.v1.DeleteIndexRequest\x1a\x16.google.protobuf.Empty"F\x82\xd3\xe4\x93\x02@*>/v1/{name=projects/*/databases/*/collectionGroups/*/indexes/*}\x12\x9f\x01\n\x08GetField\x12*.google.firestore.admin.v1.GetFieldRequest\x1a .google.firestore.admin.v1.Field"E\x82\xd3\xe4\x93\x02?\x12=/v1/{name=projects/*/databases/*/collectionGroups/*/fields/*}\x12\xaf\x01\n\x0bUpdateField\x12-.google.firestore.admin.v1.UpdateFieldRequest\x1a\x1d.google.longrunning.Operation"R\x82\xd3\xe4\x93\x02L2C/v1/{field.name=projects/*/databases/*/collectionGroups/*/fields/*}:\x05\x66ield\x12\xb0\x01\n\nListFields\x12,.google.firestore.admin.v1.ListFieldsRequest\x1a-.google.firestore.admin.v1.ListFieldsResponse"E\x82\xd3\xe4\x93\x02?\x12=/v1/{parent=projects/*/databases/*/collectionGroups/*}/fields\x12\xa1\x01\n\x0f\x45xportDocuments\x12\x31.google.firestore.admin.v1.ExportDocumentsRequest\x1a\x1d.google.longrunning.Operation"<\x82\xd3\xe4\x93\x02\x36"1/v1/{name=projects/*/databases/*}:exportDocuments:\x01*\x12\xa1\x01\n\x0fImportDocuments\x12\x31.google.firestore.admin.v1.ImportDocumentsRequest\x1a\x1d.google.longrunning.Operation"<\x82\xd3\xe4\x93\x02\x36"1/v1/{name=projects/*/databases/*}:importDocuments:\x01*\x1av\xca\x41\x18\x66irestore.googleapis.com\xd2\x41Xhttps://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/datastoreB\xc1\x01\n\x1d\x63om.google.firestore.admin.v1B\x13\x46irestoreAdminProtoP\x01Z>google.golang.org/genproto/googleapis/firestore/admin/v1;admin\xa2\x02\x04GCFS\xaa\x02\x1fGoogle.Cloud.Firestore.Admin.V1\xca\x02\x1fGoogle\\Cloud\\Firestore\\Admin\\V1b\x06proto3'
),
dependencies=[
google_dot_api_dot_annotations__pb2.DESCRIPTOR,
google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_field__pb2.DESCRIPTOR,
google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_index__pb2.DESCRIPTOR,
google_dot_longrunning_dot_operations__pb2.DESCRIPTOR,
google_dot_protobuf_dot_empty__pb2.DESCRIPTOR,
google_dot_protobuf_dot_field__mask__pb2.DESCRIPTOR,
google_dot_api_dot_client__pb2.DESCRIPTOR,
],
)
_CREATEINDEXREQUEST = _descriptor.Descriptor(
name="CreateIndexRequest",
full_name="google.firestore.admin.v1.CreateIndexRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="parent",
full_name="google.firestore.admin.v1.CreateIndexRequest.parent",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="index",
full_name="google.firestore.admin.v1.CreateIndexRequest.index",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=347,
serialized_end=432,
)
_LISTINDEXESREQUEST = _descriptor.Descriptor(
name="ListIndexesRequest",
full_name="google.firestore.admin.v1.ListIndexesRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="parent",
full_name="google.firestore.admin.v1.ListIndexesRequest.parent",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="filter",
full_name="google.firestore.admin.v1.ListIndexesRequest.filter",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="page_size",
full_name="google.firestore.admin.v1.ListIndexesRequest.page_size",
index=2,
number=3,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="page_token",
full_name="google.firestore.admin.v1.ListIndexesRequest.page_token",
index=3,
number=4,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=434,
serialized_end=525,
)
_LISTINDEXESRESPONSE = _descriptor.Descriptor(
name="ListIndexesResponse",
full_name="google.firestore.admin.v1.ListIndexesResponse",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="indexes",
full_name="google.firestore.admin.v1.ListIndexesResponse.indexes",
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="next_page_token",
full_name="google.firestore.admin.v1.ListIndexesResponse.next_page_token",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=527,
serialized_end=624,
)
_GETINDEXREQUEST = _descriptor.Descriptor(
name="GetIndexRequest",
full_name="google.firestore.admin.v1.GetIndexRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.firestore.admin.v1.GetIndexRequest.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=626,
serialized_end=657,
)
_DELETEINDEXREQUEST = _descriptor.Descriptor(
name="DeleteIndexRequest",
full_name="google.firestore.admin.v1.DeleteIndexRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.firestore.admin.v1.DeleteIndexRequest.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=659,
serialized_end=693,
)
_UPDATEFIELDREQUEST = _descriptor.Descriptor(
name="UpdateFieldRequest",
full_name="google.firestore.admin.v1.UpdateFieldRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="field",
full_name="google.firestore.admin.v1.UpdateFieldRequest.field",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="update_mask",
full_name="google.firestore.admin.v1.UpdateFieldRequest.update_mask",
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=695,
serialized_end=813,
)
_GETFIELDREQUEST = _descriptor.Descriptor(
name="GetFieldRequest",
full_name="google.firestore.admin.v1.GetFieldRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.firestore.admin.v1.GetFieldRequest.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
)
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=815,
serialized_end=846,
)
_LISTFIELDSREQUEST = _descriptor.Descriptor(
name="ListFieldsRequest",
full_name="google.firestore.admin.v1.ListFieldsRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="parent",
full_name="google.firestore.admin.v1.ListFieldsRequest.parent",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="filter",
full_name="google.firestore.admin.v1.ListFieldsRequest.filter",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="page_size",
full_name="google.firestore.admin.v1.ListFieldsRequest.page_size",
index=2,
number=3,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="page_token",
full_name="google.firestore.admin.v1.ListFieldsRequest.page_token",
index=3,
number=4,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=848,
serialized_end=938,
)
_LISTFIELDSRESPONSE = _descriptor.Descriptor(
name="ListFieldsResponse",
full_name="google.firestore.admin.v1.ListFieldsResponse",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="fields",
full_name="google.firestore.admin.v1.ListFieldsResponse.fields",
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="next_page_token",
full_name="google.firestore.admin.v1.ListFieldsResponse.next_page_token",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=940,
serialized_end=1035,
)
_EXPORTDOCUMENTSREQUEST = _descriptor.Descriptor(
name="ExportDocumentsRequest",
full_name="google.firestore.admin.v1.ExportDocumentsRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.firestore.admin.v1.ExportDocumentsRequest.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="collection_ids",
full_name="google.firestore.admin.v1.ExportDocumentsRequest.collection_ids",
index=1,
number=2,
type=9,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="output_uri_prefix",
full_name="google.firestore.admin.v1.ExportDocumentsRequest.output_uri_prefix",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1037,
serialized_end=1126,
)
_IMPORTDOCUMENTSREQUEST = _descriptor.Descriptor(
name="ImportDocumentsRequest",
full_name="google.firestore.admin.v1.ImportDocumentsRequest",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="name",
full_name="google.firestore.admin.v1.ImportDocumentsRequest.name",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="collection_ids",
full_name="google.firestore.admin.v1.ImportDocumentsRequest.collection_ids",
index=1,
number=2,
type=9,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="input_uri_prefix",
full_name="google.firestore.admin.v1.ImportDocumentsRequest.input_uri_prefix",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1128,
serialized_end=1216,
)
_CREATEINDEXREQUEST.fields_by_name[
"index"
].message_type = (
google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_index__pb2._INDEX
)
_LISTINDEXESRESPONSE.fields_by_name[
"indexes"
].message_type = (
google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_index__pb2._INDEX
)
_UPDATEFIELDREQUEST.fields_by_name[
"field"
].message_type = (
google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_field__pb2._FIELD
)
_UPDATEFIELDREQUEST.fields_by_name[
"update_mask"
].message_type = google_dot_protobuf_dot_field__mask__pb2._FIELDMASK
_LISTFIELDSRESPONSE.fields_by_name[
"fields"
].message_type = (
google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_field__pb2._FIELD
)
DESCRIPTOR.message_types_by_name["CreateIndexRequest"] = _CREATEINDEXREQUEST
DESCRIPTOR.message_types_by_name["ListIndexesRequest"] = _LISTINDEXESREQUEST
DESCRIPTOR.message_types_by_name["ListIndexesResponse"] = _LISTINDEXESRESPONSE
DESCRIPTOR.message_types_by_name["GetIndexRequest"] = _GETINDEXREQUEST
DESCRIPTOR.message_types_by_name["DeleteIndexRequest"] = _DELETEINDEXREQUEST
DESCRIPTOR.message_types_by_name["UpdateFieldRequest"] = _UPDATEFIELDREQUEST
DESCRIPTOR.message_types_by_name["GetFieldRequest"] = _GETFIELDREQUEST
DESCRIPTOR.message_types_by_name["ListFieldsRequest"] = _LISTFIELDSREQUEST
DESCRIPTOR.message_types_by_name["ListFieldsResponse"] = _LISTFIELDSRESPONSE
DESCRIPTOR.message_types_by_name["ExportDocumentsRequest"] = _EXPORTDOCUMENTSREQUEST
DESCRIPTOR.message_types_by_name["ImportDocumentsRequest"] = _IMPORTDOCUMENTSREQUEST
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
CreateIndexRequest = _reflection.GeneratedProtocolMessageType(
"CreateIndexRequest",
(_message.Message,),
dict(
DESCRIPTOR=_CREATEINDEXREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.CreateIndex][google.firestore.admin.v1.FirestoreAdmin.CreateIndex].
Attributes:
parent:
A parent name of the form ``projects/{project_id}/databases/{d
atabase_id}/collectionGroups/{collection_id}``
index:
The composite index to create.
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.CreateIndexRequest)
),
)
_sym_db.RegisterMessage(CreateIndexRequest)
ListIndexesRequest = _reflection.GeneratedProtocolMessageType(
"ListIndexesRequest",
(_message.Message,),
dict(
DESCRIPTOR=_LISTINDEXESREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.ListIndexes][google.firestore.admin.v1.FirestoreAdmin.ListIndexes].
Attributes:
parent:
A parent name of the form ``projects/{project_id}/databases/{d
atabase_id}/collectionGroups/{collection_id}``
filter:
The filter to apply to list results.
page_size:
The number of results to return.
page_token:
A page token, returned from a previous call to [FirestoreAdmin
.ListIndexes][google.firestore.admin.v1.FirestoreAdmin.ListInd
exes], that may be used to get the next page of results.
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.ListIndexesRequest)
),
)
_sym_db.RegisterMessage(ListIndexesRequest)
ListIndexesResponse = _reflection.GeneratedProtocolMessageType(
"ListIndexesResponse",
(_message.Message,),
dict(
DESCRIPTOR=_LISTINDEXESRESPONSE,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The response for
[FirestoreAdmin.ListIndexes][google.firestore.admin.v1.FirestoreAdmin.ListIndexes].
Attributes:
indexes:
The requested indexes.
next_page_token:
A page token that may be used to request another page of
results. If blank, this is the last page.
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.ListIndexesResponse)
),
)
_sym_db.RegisterMessage(ListIndexesResponse)
GetIndexRequest = _reflection.GeneratedProtocolMessageType(
"GetIndexRequest",
(_message.Message,),
dict(
DESCRIPTOR=_GETINDEXREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.GetIndex][google.firestore.admin.v1.FirestoreAdmin.GetIndex].
Attributes:
name:
A name of the form ``projects/{project_id}/databases/{database
_id}/collectionGroups/{collection_id}/indexes/{index_id}``
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.GetIndexRequest)
),
)
_sym_db.RegisterMessage(GetIndexRequest)
DeleteIndexRequest = _reflection.GeneratedProtocolMessageType(
"DeleteIndexRequest",
(_message.Message,),
dict(
DESCRIPTOR=_DELETEINDEXREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.DeleteIndex][google.firestore.admin.v1.FirestoreAdmin.DeleteIndex].
Attributes:
name:
A name of the form ``projects/{project_id}/databases/{database
_id}/collectionGroups/{collection_id}/indexes/{index_id}``
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.DeleteIndexRequest)
),
)
_sym_db.RegisterMessage(DeleteIndexRequest)
UpdateFieldRequest = _reflection.GeneratedProtocolMessageType(
"UpdateFieldRequest",
(_message.Message,),
dict(
DESCRIPTOR=_UPDATEFIELDREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.UpdateField][google.firestore.admin.v1.FirestoreAdmin.UpdateField].
Attributes:
field:
The field to be updated.
update_mask:
A mask, relative to the field. If specified, only
configuration specified by this field\_mask will be updated in
the field.
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.UpdateFieldRequest)
),
)
_sym_db.RegisterMessage(UpdateFieldRequest)
GetFieldRequest = _reflection.GeneratedProtocolMessageType(
"GetFieldRequest",
(_message.Message,),
dict(
DESCRIPTOR=_GETFIELDREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.GetField][google.firestore.admin.v1.FirestoreAdmin.GetField].
Attributes:
name:
A name of the form ``projects/{project_id}/databases/{database
_id}/collectionGroups/{collection_id}/fields/{field_id}``
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.GetFieldRequest)
),
)
_sym_db.RegisterMessage(GetFieldRequest)
ListFieldsRequest = _reflection.GeneratedProtocolMessageType(
"ListFieldsRequest",
(_message.Message,),
dict(
DESCRIPTOR=_LISTFIELDSREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.ListFields][google.firestore.admin.v1.FirestoreAdmin.ListFields].
Attributes:
parent:
A parent name of the form ``projects/{project_id}/databases/{d
atabase_id}/collectionGroups/{collection_id}``
filter:
The filter to apply to list results. Currently, [FirestoreAdmi
n.ListFields][google.firestore.admin.v1.FirestoreAdmin.ListFie
lds] only supports listing fields that have been explicitly
overridden. To issue this query, call [FirestoreAdmin.ListFiel
ds][google.firestore.admin.v1.FirestoreAdmin.ListFields] with
the filter set to ``indexConfig.usesAncestorConfig:false``.
page_size:
The number of results to return.
page_token:
A page token, returned from a previous call to [FirestoreAdmin
.ListFields][google.firestore.admin.v1.FirestoreAdmin.ListFiel
ds], that may be used to get the next page of results.
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.ListFieldsRequest)
),
)
_sym_db.RegisterMessage(ListFieldsRequest)
ListFieldsResponse = _reflection.GeneratedProtocolMessageType(
"ListFieldsResponse",
(_message.Message,),
dict(
DESCRIPTOR=_LISTFIELDSRESPONSE,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The response for
[FirestoreAdmin.ListFields][google.firestore.admin.v1.FirestoreAdmin.ListFields].
Attributes:
fields:
The requested fields.
next_page_token:
A page token that may be used to request another page of
results. If blank, this is the last page.
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.ListFieldsResponse)
),
)
_sym_db.RegisterMessage(ListFieldsResponse)
ExportDocumentsRequest = _reflection.GeneratedProtocolMessageType(
"ExportDocumentsRequest",
(_message.Message,),
dict(
DESCRIPTOR=_EXPORTDOCUMENTSREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.ExportDocuments][google.firestore.admin.v1.FirestoreAdmin.ExportDocuments].
Attributes:
name:
Database to export. Should be of the form:
``projects/{project_id}/databases/{database_id}``.
collection_ids:
Which collection ids to export. Unspecified means all
collections.
output_uri_prefix:
The output URI. Currently only supports Google Cloud Storage
URIs of the form: ``gs://BUCKET_NAME[/NAMESPACE_PATH]``, where
``BUCKET_NAME`` is the name of the Google Cloud Storage bucket
and ``NAMESPACE_PATH`` is an optional Google Cloud Storage
namespace path. When choosing a name, be sure to consider
Google Cloud Storage naming guidelines:
https://cloud.google.com/storage/docs/naming. If the URI is a
bucket (without a namespace path), a prefix will be generated
based on the start time.
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.ExportDocumentsRequest)
),
)
_sym_db.RegisterMessage(ExportDocumentsRequest)
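The docstring for `ExportDocumentsRequest` above pins down the shape of `output_uri_prefix`. As a hedged illustration (a hypothetical helper, not part of this generated module), the documented `gs://BUCKET_NAME[/NAMESPACE_PATH]` form can be built like this:

```python
# Hypothetical helper illustrating the documented output_uri_prefix shape:
# gs://BUCKET_NAME[/NAMESPACE_PATH] — the namespace path is optional.
def make_output_uri_prefix(bucket_name, namespace_path=None):
    uri = f"gs://{bucket_name}"
    if namespace_path:
        uri = f"{uri}/{namespace_path}"
    return uri

print(make_output_uri_prefix("my-bucket"))                  # gs://my-bucket
print(make_output_uri_prefix("my-bucket", "backups/2020"))  # gs://my-bucket/backups/2020
```

When only the bucket is given, the service generates a prefix from the export start time, per the docstring above.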
ImportDocumentsRequest = _reflection.GeneratedProtocolMessageType(
"ImportDocumentsRequest",
(_message.Message,),
dict(
DESCRIPTOR=_IMPORTDOCUMENTSREQUEST,
__module__="google.cloud.firestore.admin_v1.proto.firestore_admin_pb2",
__doc__="""The request for
[FirestoreAdmin.ImportDocuments][google.firestore.admin.v1.FirestoreAdmin.ImportDocuments].
Attributes:
name:
Database to import into. Should be of the form:
``projects/{project_id}/databases/{database_id}``.
collection_ids:
Which collection ids to import. Unspecified means all
collections included in the import.
input_uri_prefix:
Location of the exported files. This must match the
output\_uri\_prefix of an ExportDocumentsResponse from an
export that has completed successfully. See: [google.firestore
.admin.v1.ExportDocumentsResponse.output\_uri\_prefix][google.
firestore.admin.v1.ExportDocumentsResponse.output\_uri\_prefix
].
""",
# @@protoc_insertion_point(class_scope:google.firestore.admin.v1.ImportDocumentsRequest)
),
)
_sym_db.RegisterMessage(ImportDocumentsRequest)
DESCRIPTOR._options = None
_FIRESTOREADMIN = _descriptor.ServiceDescriptor(
name="FirestoreAdmin",
full_name="google.firestore.admin.v1.FirestoreAdmin",
file=DESCRIPTOR,
index=0,
serialized_options=_b(
"\312A\030firestore.googleapis.com\322AXhttps://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/datastore"
),
serialized_start=1219,
serialized_end=2880,
methods=[
_descriptor.MethodDescriptor(
name="CreateIndex",
full_name="google.firestore.admin.v1.FirestoreAdmin.CreateIndex",
index=0,
containing_service=None,
input_type=_CREATEINDEXREQUEST,
output_type=google_dot_longrunning_dot_operations__pb2._OPERATION,
serialized_options=_b(
'\202\323\344\223\002G">/v1/{parent=projects/*/databases/*/collectionGroups/*}/indexes:\005index'
),
),
_descriptor.MethodDescriptor(
name="ListIndexes",
full_name="google.firestore.admin.v1.FirestoreAdmin.ListIndexes",
index=1,
containing_service=None,
input_type=_LISTINDEXESREQUEST,
output_type=_LISTINDEXESRESPONSE,
serialized_options=_b(
"\202\323\344\223\002@\022>/v1/{parent=projects/*/databases/*/collectionGroups/*}/indexes"
),
),
_descriptor.MethodDescriptor(
name="GetIndex",
full_name="google.firestore.admin.v1.FirestoreAdmin.GetIndex",
index=2,
containing_service=None,
input_type=_GETINDEXREQUEST,
output_type=google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_index__pb2._INDEX,
serialized_options=_b(
"\202\323\344\223\002@\022>/v1/{name=projects/*/databases/*/collectionGroups/*/indexes/*}"
),
),
_descriptor.MethodDescriptor(
name="DeleteIndex",
full_name="google.firestore.admin.v1.FirestoreAdmin.DeleteIndex",
index=3,
containing_service=None,
input_type=_DELETEINDEXREQUEST,
output_type=google_dot_protobuf_dot_empty__pb2._EMPTY,
serialized_options=_b(
"\202\323\344\223\002@*>/v1/{name=projects/*/databases/*/collectionGroups/*/indexes/*}"
),
),
_descriptor.MethodDescriptor(
name="GetField",
full_name="google.firestore.admin.v1.FirestoreAdmin.GetField",
index=4,
containing_service=None,
input_type=_GETFIELDREQUEST,
output_type=google_dot_cloud_dot_firestore_dot_admin__v1_dot_proto_dot_field__pb2._FIELD,
serialized_options=_b(
"\202\323\344\223\002?\022=/v1/{name=projects/*/databases/*/collectionGroups/*/fields/*}"
),
),
_descriptor.MethodDescriptor(
name="UpdateField",
full_name="google.firestore.admin.v1.FirestoreAdmin.UpdateField",
index=5,
containing_service=None,
input_type=_UPDATEFIELDREQUEST,
output_type=google_dot_longrunning_dot_operations__pb2._OPERATION,
serialized_options=_b(
"\202\323\344\223\002L2C/v1/{field.name=projects/*/databases/*/collectionGroups/*/fields/*}:\005field"
),
),
_descriptor.MethodDescriptor(
name="ListFields",
full_name="google.firestore.admin.v1.FirestoreAdmin.ListFields",
index=6,
containing_service=None,
input_type=_LISTFIELDSREQUEST,
output_type=_LISTFIELDSRESPONSE,
serialized_options=_b(
"\202\323\344\223\002?\022=/v1/{parent=projects/*/databases/*/collectionGroups/*}/fields"
),
),
_descriptor.MethodDescriptor(
name="ExportDocuments",
full_name="google.firestore.admin.v1.FirestoreAdmin.ExportDocuments",
index=7,
containing_service=None,
input_type=_EXPORTDOCUMENTSREQUEST,
output_type=google_dot_longrunning_dot_operations__pb2._OPERATION,
serialized_options=_b(
'\202\323\344\223\0026"1/v1/{name=projects/*/databases/*}:exportDocuments:\001*'
),
),
_descriptor.MethodDescriptor(
name="ImportDocuments",
full_name="google.firestore.admin.v1.FirestoreAdmin.ImportDocuments",
index=8,
containing_service=None,
input_type=_IMPORTDOCUMENTSREQUEST,
output_type=google_dot_longrunning_dot_operations__pb2._OPERATION,
serialized_options=_b(
'\202\323\344\223\0026"1/v1/{name=projects/*/databases/*}:importDocuments:\001*'
),
),
],
)
_sym_db.RegisterServiceDescriptor(_FIRESTOREADMIN)
DESCRIPTOR.services_by_name["FirestoreAdmin"] = _FIRESTOREADMIN
# @@protoc_insertion_point(module_scope)
# === File: aki/core/astree.py (repo: syegulalp/Akilang, license: MIT) ===
] | 5 | 2018-10-04T12:01:59.000Z | 2021-01-27T06:31:48.000Z | from core.error import AkiSyntaxErr
from llvmlite import ir
from typing import Optional
class ASTNode:
"""
Base type for all AST nodes, with helper functions.
"""
def __init__(self, index):
self.child = None
self.index = index
def __eq__(self, other):
raise NotImplementedError
def flatten(self):
return [self.__class__.__name__, "flatten unimplemented"]
class Expression(ASTNode):
"""
Base type for all expressions.
"""
pass
class Keyword(ASTNode):
"""
Base type for keywords.
"""
pass
class TopLevel(ASTNode):
"""
Mixin type for top-level AST nodes.
"""
pass
class VarTypeNode(Expression):
name: Optional[str] = None
class VarTypeName(VarTypeNode):
def __init__(self, p, name: str):
super().__init__(p)
self.name = name
def __eq__(self, other):
return self.name == other.name
def flatten(self):
return [self.__class__.__name__, self.name]
class VarTypePtr(VarTypeNode):
def __init__(self, p, pointee: VarTypeNode):
super().__init__(p)
self.pointee = pointee
self.name = f"ptr {pointee.name}"
def __eq__(self, other):
return self.pointee == other.pointee
def flatten(self):
return [self.__class__.__name__, self.pointee.flatten()]
class VarTypeFunc(VarTypeNode):
def __init__(self, p, arguments, return_type: VarTypeNode):
super().__init__(p)
self.arguments = arguments
self.return_type = return_type
def __eq__(self, other):
return (
self.arguments == other.arguments and self.return_type == other.return_type
)
def flatten(self):
return [
self.__class__.__name__,
self.arguments.flatten() if self.arguments else [],
self.return_type.flatten() if self.return_type else None,
]
class VarTypeAccessor(VarTypeNode):
def __init__(self, p, vartype: VarTypeNode, accessors: list):
super().__init__(p)
self.vartype = vartype
self.accessors = accessors
# self.name = f"array({vartype.name})[{','.join([str(_[0]) for _ in accessors.accessors])}]"
def __eq__(self, other):
return self.vartype == other.vartype and self.accessors == other.accessors
def flatten(self):
return [
self.__class__.__name__,
self.vartype.flatten(),
self.accessors.flatten() if self.accessors else [],
]
class Name(Expression):
"""
Variable reference.
"""
def __init__(self, p, name, val=None, vartype=None):
super().__init__(p)
self.name = name
self.val = val
# `val` is only used in variable assignment form
self.vartype = vartype
def __eq__(self, other):
return self.name == other.name
def flatten(self):
return [
self.__class__.__name__,
self.name,
self.val.flatten() if self.val else None,
self.vartype.flatten() if self.vartype else None,
]
class VarList(Expression):
"""
`var` declaration with one or more variables.
"""
def __init__(self, p, vars):
super().__init__(p)
self.vars = vars
def __eq__(self, other):
return self.vars == other.vars
def flatten(self):
return [
self.__class__.__name__,
[_.flatten() for _ in self.vars] if self.vars else [],
]
class UniList(TopLevel, VarList):
pass
class ConstList(TopLevel, VarList):
pass
class Argument(ASTNode):
"""
Function argument, with optional type declaration.
"""
def __init__(self, p, name, vartype=None, default_value=None):
super().__init__(p)
self.name = name
self.vartype = vartype
self.default_value = default_value
def __eq__(self, other):
return self.name == other.name and self.vartype == other.vartype
def flatten(self):
return [
self.__class__.__name__,
self.name,
self.vartype.flatten() if self.vartype else None,
self.default_value.flatten() if self.default_value else None,
]
class StarArgument(Argument):
pass
class Constant(Expression):
"""
LLVM constant value.
"""
def __init__(self, p, val, vartype):
super().__init__(p)
self.val = val
self.vartype = vartype
def __eq__(self, other):
return self.val == other.val and self.vartype == other.vartype
def flatten(self):
        return [
            self.__class__.__name__,
            self.val,
            self.vartype.flatten() if self.vartype is not None else None,
        ]
class String(Expression):
"""
String constant.
"""
def __init__(self, p, val, vartype):
super().__init__(p)
self.val = val
self.vartype = vartype
self.name = f'"{val}"'
def __eq__(self, other):
return self.val == other.val
def flatten(self):
return [self.__class__.__name__, self.val]
class UnOp(Expression):
"""
Unary operator expression.
"""
def __init__(self, p, op, lhs):
super().__init__(p)
self.op = op
self.lhs = lhs
def __eq__(self, other):
return self.op == other.op and self.lhs == other.lhs
def flatten(self):
return [self.__class__.__name__, self.op, self.lhs.flatten()]
class RefExpr(Expression):
"""
Reference expression (obtaining a pointer to an object)
"""
def __init__(self, p, ref):
super().__init__(p)
self.ref = ref
def __eq__(self, other):
return self.ref == other.ref
def flatten(self):
return [self.__class__.__name__, self.ref.flatten()]
class DerefExpr(RefExpr):
pass
class BinOp(Expression):
"""
Binary operator expression.
"""
def __init__(self, p, op, lhs, rhs):
super().__init__(p)
self.op = op
self.lhs = lhs
self.rhs = rhs
def __eq__(self, other):
return self.op == other.op and self.lhs == other.lhs and self.rhs == other.rhs
def flatten(self):
return [
self.__class__.__name__,
self.op,
self.lhs.flatten(),
self.rhs.flatten(),
]
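The `flatten` methods above serialize an AST into nested lists (handy for equality checks and dumps in tests). A minimal self-contained sketch of the same contract, using hypothetical stand-in classes rather than the real ones:

```python
# Stand-in miniatures of Name and BinOp, mirroring the flatten() contract above.
class MiniName:
    def __init__(self, name):
        self.name = name

    def flatten(self):
        # [class name, name, val, vartype] — val/vartype omitted here.
        return ["Name", self.name, None, None]


class MiniBinOp:
    def __init__(self, op, lhs, rhs):
        self.op, self.lhs, self.rhs = op, lhs, rhs

    def flatten(self):
        return ["BinOp", self.op, self.lhs.flatten(), self.rhs.flatten()]


tree = MiniBinOp("+", MiniName("x"), MiniName("y"))
print(tree.flatten())
# ['BinOp', '+', ['Name', 'x', None, None], ['Name', 'y', None, None]]
```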
class Assignment(BinOp):
pass
class BinOpComparison(BinOp):
pass
class IfExpr(ASTNode):
def __init__(self, p, if_expr, then_expr, else_expr=None):
super().__init__(p)
self.if_expr = if_expr
self.then_expr = then_expr
self.else_expr = else_expr
def __eq__(self, other):
return (
self.if_expr == other.if_expr
and self.then_expr == other.then_expr
and self.else_expr == other.else_expr
)
def flatten(self):
return [
self.__class__.__name__,
self.if_expr.flatten(),
self.then_expr.flatten(),
self.else_expr.flatten() if self.else_expr else None,
]
class WhenExpr(IfExpr):
pass
class Return(ASTNode):
def __init__(self, p, return_val):
super().__init__(p)
self.return_val = return_val
def __eq__(self, other):
return (
self.return_val == other.return_val
)
def flatten(self):
return [
self.__class__.__name__,
self.return_val.flatten()
]
class Prototype(ASTNode):
"""
Function prototype.
"""
def __init__(
self,
p,
name: str,
arguments: list,
return_type: VarTypeNode,
is_declaration=False,
):
super().__init__(p)
self.name = name
self.arguments = arguments
self.return_type = return_type
self.is_declaration = is_declaration
def __eq__(self, other):
return (
self.arguments == other.arguments and self.return_type == other.return_type
)
def flatten(self):
return [
self.__class__.__name__,
self.name,
[_.flatten() for _ in self.arguments] if self.arguments else [],
self.return_type.flatten() if self.return_type else None,
]
class Function(TopLevel, ASTNode):
"""
Function body.
"""
def __init__(self, p, prototype, body):
super().__init__(p)
self.prototype = prototype
self.body = body
def __eq__(self, other):
return self.prototype == other.prototype and self.body == other.body
def flatten(self):
return [
self.__class__.__name__,
self.prototype.flatten(),
[_.flatten() for _ in self.body],
]
class External(Function):
pass
class Call(Expression, Prototype):
"""
Function call.
Re-uses Prototype since it has the same basic structure.
Arguments contains a list of Expression-class ASTs.
"""
pass
class ExpressionBlock(Expression):
"""
{}-delimeted set of expressions, stored as a list in `body`.
"""
def __init__(self, p, body):
super().__init__(p)
self.body = body
def __eq__(self, other):
return self.body == other.body
def flatten(self):
return [self.__class__.__name__, [_.flatten() for _ in self.body]]
class LLVMNode(Expression):
"""
Repackages an LLVM op as if it were an unprocessed AST node.
You should use this if:
a) you have an op that has been produced by a previous codegen action
b) you're going to codegen a synthetic AST node using that result as a parameter
"""
def __init__(self, node, vartype, llvm_node):
super().__init__(node.index)
# Aki node, for position information
self.node = node
# Vartype (an AST vartype node provided by the caller)
        # This can also be an .akitype node
# so it can just be copied from the last instruction
self.vartype = vartype
# LLVM node
# This MUST have .akitype and .akinode data
        self.llvm_node = llvm_node
        assert isinstance(self.llvm_node, ir.Instruction)
        assert self.llvm_node.akinode
        assert self.llvm_node.akitype
# Name (optional)
self.name = None
class LoopExpr(Expression):
def __init__(self, p, conditions, body):
super().__init__(p)
self.conditions = conditions
self.body = body
def __eq__(self, other):
return self.conditions == other.conditions and self.body == other.body
def flatten(self):
return [
self.__class__.__name__,
[_.flatten() for _ in self.conditions],
self.body.flatten(),
]
class Break(Expression):
def __init__(self, p):
super().__init__(p)
def flatten(self):
return [self.__class__.__name__]
def __eq__(self, other):
return self.__class__ == other.__class__
class WithExpr(Expression):
def __init__(self, p, varlist: VarList, body: ExpressionBlock):
super().__init__(p)
self.varlist = varlist
self.body = body
def __eq__(self, other):
return self.varlist == other.varlist and self.body == other.body
def flatten(self):
return [
self.__class__.__name__,
[_.flatten() for _ in self.varlist.vars],
self.body.flatten(),
]
class ChainExpr(Expression):
def __init__(self, p, expr_chain: list):
super().__init__(p)
self.expr_chain = expr_chain
def __eq__(self, other):
return self.expr_chain == other.expr_chain
def flatten(self):
return [self.__class__.__name__, [_.flatten() for _ in self.expr_chain]]
class UnsafeBlock(Expression):
def __init__(self, p, expr_block):
super().__init__(p)
self.expr_block = expr_block
def __eq__(self, other):
return self.expr_block == other.expr_block
def flatten(self):
return [self.__class__.__name__, [_.flatten() for _ in self.expr_block]]
class Accessor(Expression):
def __init__(self, p, accessors):
super().__init__(p)
self.accessors = accessors
def __eq__(self, other):
return self.accessors == other.accessors
def flatten(self):
return [self.__class__.__name__, [_.flatten() for _ in self.accessors]]
class AccessorExpr(Expression):
def __init__(self, p, expr, accessors):
super().__init__(p)
self.expr = expr
self.accessors = accessors
def __eq__(self, other):
return self.expr == other.expr and self.accessors == other.accessors
def flatten(self):
return [
self.__class__.__name__,
self.expr.flatten(),
[_.flatten() for _ in self.accessors],
]
class ObjectRef(Expression):
"""
Target of an assignment operation.
The expr in question is a name, etc.
In every case we want to return a reference to the object,
not its value.
"""
def __init__(self, p, expr):
super().__init__(p)
self.expr = expr
def __eq__(self, other):
return self.expr == other.expr
def flatten(self):
return [self.__class__.__name__, self.expr.flatten()]
class ObjectValue(Expression):
"""
Extracted value.
"""
def __init__(self, p, expr):
super().__init__(p)
self.expr = expr
def __eq__(self, other):
return self.expr == other.expr
def flatten(self):
return [self.__class__.__name__, self.expr.flatten()]
class SelectExpr(Expression):
"""
`select` expression.
"""
def __init__(self, p, select_expr, case_list: list, default_case=None):
super().__init__(p)
self.select_expr = select_expr
self.case_list = case_list
self.default_case = default_case
def __eq__(self, other):
return (
self.select_expr == other.select_expr
and self.case_list == other.case_list
and self.default_case == other.default_case
)
def flatten(self):
return [
self.__class__.__name__,
self.select_expr.flatten(),
[_.flatten() for _ in self.case_list],
]
class CaseExpr(Expression):
"""
`case` expression.
"""
def __init__(self, p, case_value, case_expr):
super().__init__(p)
self.case_value = case_value
self.case_expr = case_expr
def __eq__(self, other):
return self.case_value == other.case_value and self.case_expr == other.case_expr
def flatten(self):
return [
self.__class__.__name__,
self.case_value.flatten(),
self.case_expr.flatten(),
]
class DefaultExpr(CaseExpr):
pass
class WhileExpr(Expression):
"""
`while` expression.
"""
def __init__(self, p, while_value, while_expr):
super().__init__(p)
self.while_value = while_value
self.while_expr = while_expr
def __eq__(self, other):
return (
self.while_value == other.while_value
and self.while_expr == other.while_expr
)
def flatten(self):
return [
self.__class__.__name__,
self.while_value.flatten(),
self.while_expr.flatten(),
]
class BaseDecorator(Expression):
def __init__(self, p, name, args, expr_block):
super().__init__(p)
self.name = name
self.args = args
self.expr_block = expr_block
def __eq__(self, other):
return self.name == other.name and self.args == other.args
def flatten(self):
return [
self.__class__.__name__,
self.name,
self.args.flatten() if self.args else [],
self.expr_block.flatten(),
]
class Decorator(BaseDecorator, TopLevel):
pass
class InlineDecorator(Decorator):
pass
# === File: implementations/PG2/loss.py (repo: pasan1992/Human-Pose-Transfer, license: MIT) ===
] | 17 | 2019-08-01T02:28:30.000Z | 2022-02-03T10:27:33.000Z | import torch.nn as nn
class MaskL1Loss(nn.Module):
"""
Loss from paper <Pose Guided Person Image Generation> Sec3.1 pose mask loss
"""
def __init__(self, ratio=1):
super(MaskL1Loss, self).__init__()
self.criterion = nn.L1Loss()
self.ratio = ratio
def forward(self, generated_img, target_img, mask):
pose_mask_l1 = self.criterion(generated_img * mask, target_img * mask)
return self.criterion(generated_img, target_img) + pose_mask_l1 * self.ratio
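Numerically, `MaskL1Loss` computes `L1(g, t) + L1(g * mask, t * mask) * ratio`, where `L1` is the mean absolute difference. A hedged, pure-Python check of that formula (a hypothetical helper on flat lists, no torch dependency):

```python
# Pure-Python check of the masked-L1 formula above (no torch dependency).
def l1(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)


def mask_l1_loss(generated, target, mask, ratio=1):
    masked = l1([g * m for g, m in zip(generated, mask)],
                [t * m for t, m in zip(target, mask)])
    return l1(generated, target) + masked * ratio


# plain L1 = 0.5, masked L1 = 0.5, total = 1.0
print(mask_l1_loss([1.0, 2.0], [0.0, 2.0], [1.0, 0.0]))
```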
# === File: constants.py (repo: Taiki-jp/A3C-tensorflow, license: MIT) ===
] | 2 | 2017-07-24T21:48:22.000Z | 2020-05-04T14:02:25.000Z | # Network parameters
IMAGE_WIDTH = 84
IMAGE_HEIGHT = 84
NUM_CHANNELS = 4  # the DQN takes 4 stacked frames at a time as one state
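A hedged illustration of how these constants are typically combined: the network input is a state tensor whose shape follows directly from them (the `(H, W, C)` ordering here is an assumption about the consuming code):

```python
# Self-contained sketch: the constants above imply a DQN-style state tensor
# of shape (IMAGE_HEIGHT, IMAGE_WIDTH, NUM_CHANNELS) = (84, 84, 4).
IMAGE_WIDTH = 84
IMAGE_HEIGHT = 84
NUM_CHANNELS = 4

state_shape = (IMAGE_HEIGHT, IMAGE_WIDTH, NUM_CHANNELS)
print(state_shape)  # (84, 84, 4)
```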
# === File: pretalx_vimeo/recording.py (repo: pretalx/pretalx-vimeo, license: Apache-2.0) ===
] | 1 | 2021-12-28T12:39:15.000Z | 2021-12-28T12:39:15.000Z | from pretalx.agenda.recording import BaseRecordingProvider
class VimeoProvider(BaseRecordingProvider):
def get_recording(self, submission):
vimeo = getattr(submission, "vimeo_link", None)
if vimeo:
return {"iframe": vimeo.iframe, "csp_header": "https://player.vimeo.com"}
# === File: layers/lstm.py (repo: blaisewang/seq2seq-topic-labelling, license: BSD-3-Clause) ===
] | null | null | null | import tensorflow as tf
l2 = tf.keras.regularizers.l2(0.01)
def rnn_layer(units):
return tf.keras.layers.LSTM(units, recurrent_activation="sigmoid", recurrent_initializer="glorot_uniform",
kernel_regularizer=l2, recurrent_regularizer=l2, dropout=0.5, recurrent_dropout=0.5,
return_sequences=True, return_state=True)
# === File: dataset/__init__.py (repo: haoxizhong/plm_scm, license: MIT) ===
] | 16 | 2020-07-15T07:24:30.000Z | 2022-03-19T05:41:11.000Z | from .nlp.JsonFromFiles import JsonFromFilesDataset
dataset_list = {
"JsonFromFiles": JsonFromFilesDataset
}
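Such name-to-class registries are usually consumed by looking the class up from a configured string. A hypothetical consumption sketch (stand-in class object, not the real `JsonFromFilesDataset`):

```python
# Hypothetical lookup sketch for a name-to-class dataset registry.
dataset_list = {"JsonFromFiles": dict}  # stand-in class object


def get_dataset_cls(name):
    try:
        return dataset_list[name]
    except KeyError:
        raise KeyError(f"Unknown dataset type: {name!r}") from None


print(get_dataset_cls("JsonFromFiles") is dict)  # True
```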
# === File: BPt/pipeline/Selector.py (repo: sahahn/BPt, license: MIT) ===
] | 2 | 2020-10-23T19:48:53.000Z | 2020-11-06T15:46:04.000Z | from sklearn.utils.metaestimators import _BaseComposition
from sklearn.utils.metaestimators import if_delegate_has_method
from ..default.params.Params import Dict, Choice
from .base import _get_est_fit_params, _get_est_trans_params
class Selector(_BaseComposition):
_needs_mapping = True
_needs_fit_index = True
_needs_transform_index = True
def __init__(self, estimators, to_use=0):
self.estimators = estimators
self.to_use = to_use
self.example_estimator_ = self.estimators[0][1]
def get_params(self, deep=True):
return self._get_params('estimators', deep=deep)
def set_params(self, **kwargs):
# Pass params as dict with key select
select = kwargs['select']
# Get to use from select
self.to_use = select['to_use']
# Set rest of select params
self._set_params('estimators', **select)
return self
@if_delegate_has_method(delegate='example_estimator_')
def fit(self, *args, **kwargs):
# Select correct estimator
self.estimator_ = self.estimators[self.to_use][1]
# Set correct fit params based on chosen estimator
fit_params = _get_est_fit_params(self.estimator_, other_params=kwargs)
# Fit
self.estimator_.fit(*args, **fit_params)
self.is_fitted_ = True
return self
@if_delegate_has_method(delegate='example_estimator_')
def fit_transform(self, *args, **kwargs):
self.estimator_ = self.estimators[self.to_use][1]
return self.estimator_.fit_transform(*args, **kwargs)
@if_delegate_has_method(delegate='example_estimator_')
def transform(self, *args, **kwargs):
if 'transform_index' in kwargs:
transform_index = kwargs.pop('transform_index')
else:
transform_index = None
        transform_params = _get_est_trans_params(self.estimator_,
                                                 transform_index)
        return self.estimator_.transform(*args, **transform_params)
@if_delegate_has_method(delegate='example_estimator_')
def fit_resample(self, *args, **kwargs):
self.estimator_ = self.estimators[self.to_use][1]
return self.estimator_.fit_resample(*args, **kwargs)
@if_delegate_has_method(delegate='example_estimator_')
def fit_predict(self, *args, **kwargs):
self.estimator_ = self.estimators[self.to_use][1]
return self.estimator_.fit_predict(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def predict(self, *args, **kwargs):
return self.estimator_.predict(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def predict_proba(self, *args, **kwargs):
return self.estimator_.predict_proba(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def decision_function(self, *args, **kwargs):
return self.estimator_.decision_function(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def predict_log_proba(self, *args, **kwargs):
return self.estimator_.predict_log_proba(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def score(self, *args, **kwargs):
return self.estimator_.score(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def inverse_transform(self, *args, **kwargs):
return self.estimator_.inverse_transform(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def transform_df(self, *args, **kwargs):
return self.estimator_.transform_df(*args, **kwargs)
@if_delegate_has_method(delegate='estimator_')
def _proc_new_names(self, *args, **kwargs):
return self.estimator_._proc_new_names(*args, **kwargs)
@property
def _estimator_type(self):
'''This should remain static across all passed estimators'''
return self.example_estimator_._estimator_type
@property
def feature_importances_(self):
if hasattr(self.estimator_, 'feature_importances_'):
return getattr(self.estimator_, 'feature_importances_')
return None
@property
def coef_(self):
if hasattr(self.estimator_, 'coef_'):
return getattr(self.estimator_, 'coef_')
return None
@property
def classes_(self):
if hasattr(self.estimator_, 'classes_'):
return getattr(self.estimator_, 'classes_')
return None
def selector_wrapper(objs, params, name):
selector = (name, Selector(objs))
p_dicts = []
for i in range(len(objs)):
obj_name = objs[i][0]
rel_params =\
{p: params[p] for p in params if p.split('__')[0] == obj_name}
rel_params['to_use'] = i
p_dict = Dict(**rel_params)
p_dicts.append(p_dict)
select = Choice(p_dicts, deterministic=True)
select_params = {name + '__select': select}
return selector, select_params
| 32.768212 | 78 | 0.671787 | 593 | 4,948 | 5.261383 | 0.17032 | 0.104167 | 0.058333 | 0.085256 | 0.492949 | 0.410577 | 0.357692 | 0.347115 | 0.300641 | 0.236859 | 0 | 0.002324 | 0.217462 | 4,948 | 150 | 79 | 32.986667 | 0.803461 | 0.044058 | 0 | 0.254902 | 0 | 0 | 0.06654 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205882 | false | 0 | 0.068627 | 0.088235 | 0.539216 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
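The `Selector` above holds a list of `(name, estimator)` pairs and forwards every scikit-learn method to whichever pair `to_use` points at. A stripped-down sketch of that delegation pattern, using hypothetical toy estimators in place of BPt's real pipeline objects:

```python
class MiniSelector:
    """Hold (name, estimator) pairs and delegate to the one chosen by to_use."""

    def __init__(self, estimators, to_use=0):
        self.estimators = estimators
        self.to_use = to_use

    def fit(self, X):
        # Pick the active estimator by index, then delegate fitting to it.
        self.estimator_ = self.estimators[self.to_use][1]
        self.estimator_.fit(X)
        return self

    def predict(self, X):
        # After fit, all calls go to the chosen estimator.
        return self.estimator_.predict(X)


class ConstantModel:
    """Toy estimator: always predicts a fixed value."""

    def __init__(self, value):
        self.value = value

    def fit(self, X):
        return self

    def predict(self, X):
        return [self.value] * len(X)


sel = MiniSelector([("a", ConstantModel(0)), ("b", ConstantModel(1))], to_use=1)
print(sel.fit([1, 2]).predict([1, 2, 3]))  # [1, 1, 1]
```

The real class additionally exposes the choice as a hyperparameter (via `selector_wrapper`'s `Choice` of per-estimator `Dict` params), so a search over `to_use` selects between whole estimators.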
bc7817b4326c707cc9196709b3b7baf22a85c5c7 | 299 | py | Python | pairRdd/sort/SortedWordCountProblem.py | shubozhang/pyspark-tutorial | 244f69bc75ad4238a00151dc7802bbb63e6f35e1 | [
"MIT"
] | null | null | null | pairRdd/sort/SortedWordCountProblem.py | shubozhang/pyspark-tutorial | 244f69bc75ad4238a00151dc7802bbb63e6f35e1 | [
"MIT"
] | null | null | null | pairRdd/sort/SortedWordCountProblem.py | shubozhang/pyspark-tutorial | 244f69bc75ad4238a00151dc7802bbb63e6f35e1 | [
"MIT"
] | null | null | null | from pyspark import SparkContext
if __name__ == "__main__":
'''
    Create a Spark program to read an article from in/word_count.text and
    output the number of occurrences of each word in descending order.
Sample output:
apple : 200
shoes : 193
bag : 176
...
'''
| 17.588235 | 74 | 0.64214 | 40 | 299 | 4.575 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042056 | 0.284281 | 299 | 16 | 75 | 18.6875 | 0.813084 | 0 | 0 | 0 | 0 | 0 | 0.119403 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
bc7834f72483d1f1a1e483b2ede65df340c8e27b | 709 | py | Python | odd_models_src/adapter/__init__.py | opendatadiscovery/odd-contract-package | 5444a328267c49884d6e1803d387ef0e4d7de705 | [
"Apache-2.0"
] | null | null | null | odd_models_src/adapter/__init__.py | opendatadiscovery/odd-contract-package | 5444a328267c49884d6e1803d387ef0e4d7de705 | [
"Apache-2.0"
] | 2 | 2021-11-16T08:49:50.000Z | 2022-02-10T17:12:21.000Z | odd_models_src/adapter/__init__.py | opendatadiscovery/odd-contract-package | 5444a328267c49884d6e1803d387ef0e4d7de705 | [
"Apache-2.0"
] | null | null | null | import os
import connexion
from flask import Response
from flask_compress import Compress
# Pieces of generated code
from odd_models.adapter.controllers import ODDController, ControllerHolder
def init_flask_app():
app = connexion.App(__name__, specification_dir='openapi')
app.add_api(specification='openapi.yaml',
arguments={'title': 'ODD adapter HTTP API contract'},
pythonic_params=True)
app = app.app
app.add_url_rule(os.environ.get('HEALTHCHECK_PATH', '/health'), "healthcheck", lambda: Response(status=200))
Compress().init_app(app)
return app
def init_controller(controller: ODDController):
ControllerHolder.init_controller(controller)
| 29.541667 | 112 | 0.739069 | 85 | 709 | 5.964706 | 0.541176 | 0.059172 | 0.035503 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005068 | 0.165021 | 709 | 23 | 113 | 30.826087 | 0.851351 | 0.03385 | 0 | 0 | 1 | 0 | 0.127379 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.3125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
bc8078fcaf33a11e52fab17143d9483683b03d5f | 406 | py | Python | database/db_init.py | zedoax/Hubris | 0f57fb72afbfbed22699628af63fa324c29ed3d2 | [
"MIT"
] | 1 | 2018-02-24T22:45:03.000Z | 2018-02-24T22:45:03.000Z | database/db_init.py | zedoax/Hubris | 0f57fb72afbfbed22699628af63fa324c29ed3d2 | [
"MIT"
] | 5 | 2018-02-24T22:49:46.000Z | 2018-03-02T00:27:14.000Z | database/db_init.py | zedoax/Hubris | 0f57fb72afbfbed22699628af63fa324c29ed3d2 | [
"MIT"
] | 2 | 2018-02-24T22:47:17.000Z | 2019-01-21T07:32:02.000Z | from database import metadata
import logging
logger = logging.getLogger(__name__)
def database_check_init():
if 'tournament' in metadata.tables and 'rule_set' in metadata.tables and \
'match' in metadata.tables and 'team' in metadata.tables:
logger.info("Database exists, and is proper")
else:
logger.error("Database does not exist or is not proper")
exit(1)
| 29 | 78 | 0.692118 | 55 | 406 | 4.981818 | 0.581818 | 0.145985 | 0.233577 | 0.208029 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003165 | 0.221675 | 406 | 13 | 79 | 31.230769 | 0.863924 | 0 | 0 | 0 | 0 | 0 | 0.238916 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.2 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
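The chained `and` membership tests in `database_check_init` can be collapsed into a single subset check; a small sketch, using a plain dict in place of SQLAlchemy's `metadata.tables` mapping:

```python
REQUIRED_TABLES = {"tournament", "rule_set", "match", "team"}


def tables_ok(tables):
    # True when every required table name is present in the mapping's keys.
    return REQUIRED_TABLES <= set(tables)


print(tables_ok({"tournament": 1, "rule_set": 1, "match": 1, "team": 1}))  # True
print(tables_ok({"tournament": 1}))  # False
```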
bc8784cffc5a1f07f30eac566943c6ee8187df51 | 387 | py | Python | Representative_Based_Clustering/gaussian_kernel_kmeans.py | galnagdeno/Data-Mining-and-Machine-Learning | 9038e5fc44e20ac13df7b61587b497068f28bbb6 | [
"MIT"
] | null | null | null | Representative_Based_Clustering/gaussian_kernel_kmeans.py | galnagdeno/Data-Mining-and-Machine-Learning | 9038e5fc44e20ac13df7b61587b497068f28bbb6 | [
"MIT"
] | null | null | null | Representative_Based_Clustering/gaussian_kernel_kmeans.py | galnagdeno/Data-Mining-and-Machine-Learning | 9038e5fc44e20ac13df7b61587b497068f28bbb6 | [
"MIT"
] | null | null | null | import pandas as pd
import scipy
from kernel_kmeans import kernel_kmeans
from scipy.spatial.distance import pdist, squareform
def gaussian_kmeans(df, var, num_clusters):
kernel_matrix = scipy.exp(- squareform(pdist(df, 'euclidean')) / var ** 2)
kernel_matrix = pd.DataFrame(kernel_matrix, columns = range(df.shape[0]))
return kernel_kmeans(kernel_matrix, num_clusters)
| 35.181818 | 78 | 0.762274 | 54 | 387 | 5.277778 | 0.518519 | 0.168421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006024 | 0.142119 | 387 | 10 | 79 | 38.7 | 0.85241 | 0 | 0 | 0 | 0 | 0 | 0.023256 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.5 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
bc8e61aff39807abd1d6dd46ac5befdb8ae653c3 | 315 | py | Python | monday/resources/users.py | tsh356/monday | 9350999b7ead71a4ad0ccaac3a651ff680af1c4b | [
"Apache-2.0"
] | 37 | 2020-06-14T20:33:03.000Z | 2022-03-21T09:46:49.000Z | monday/resources/users.py | tsh356/monday | 9350999b7ead71a4ad0ccaac3a651ff680af1c4b | [
"Apache-2.0"
] | 31 | 2019-12-12T20:26:52.000Z | 2022-03-14T17:39:00.000Z | monday/resources/users.py | tsh356/monday | 9350999b7ead71a4ad0ccaac3a651ff680af1c4b | [
"Apache-2.0"
] | 29 | 2020-09-15T11:07:32.000Z | 2022-03-21T09:46:30.000Z | from monday.resources.base import BaseResource
from monday.query_joins import get_users_query
class UserResource(BaseResource):
def __init__(self, token):
super().__init__(token)
def fetch_users(self, **kwargs):
query = get_users_query(**kwargs)
return self.client.execute(query)
| 26.25 | 46 | 0.72381 | 39 | 315 | 5.487179 | 0.564103 | 0.093458 | 0.121495 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180952 | 315 | 11 | 47 | 28.636364 | 0.829457 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
bc901ccd27fee146f6f002a2055fe476cce3db06 | 3,661 | py | Python | demo_setup/clients/DASHClient/Tapas/controllers/BaseController.py | tum-lkn/appaware | 6335dbca67e3d73a431c1f0433fc3819c45d1b2b | [
"MIT"
] | null | null | null | demo_setup/clients/DASHClient/Tapas/controllers/BaseController.py | tum-lkn/appaware | 6335dbca67e3d73a431c1f0433fc3819c45d1b2b | [
"MIT"
] | null | null | null | demo_setup/clients/DASHClient/Tapas/controllers/BaseController.py | tum-lkn/appaware | 6335dbca67e3d73a431c1f0433fc3819c45d1b2b | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- Mode: Python -*-
# -*- encoding: utf-8 -*-
# Copyright (c) Vito Caldaralo <vito.caldaralo@gmail.com>
# This file may be distributed and/or modified under the terms of
# the GNU General Public License version 2 as published by
# the Free Software Foundation.
# This file is distributed without any warranty; without even the implied
# warranty of merchantability or fitness for a particular purpose.
# See "LICENSE" in the source distribution for more information.
import os, sys, inspect
from ..utils_py.util import debug
DEBUG = 1
'''feedback = dict('queued_bytes':[B],
'queued_time':[s],
'max_buffer_time':[s],
'bwe':[B/s],
'level':[],
'max_level':[],
'cur_rate':[B/s],
'max_rate':[B/s],
'min_rate':[B/s],
'player_status':[boolean],
'paused_time':[s],
'last_fragment_size':[B],
'last_download_time':[s],
'downloaded_bytes':[B],
'fragment_duration':[s],
'rates':[B/s{list}],
'is_check_buffering':[boolean]
)'''
class BaseController(object):
def __init__(self):
self.idle_duration = 4
self.control_action = None
self.feedback = None
def __repr__(self):
return '<BaseController-%d>' %id(self)
def calcControlAction(self):
'''
Computes the control action. It must return a value in B/s
(It must be implemented for new controller).
'''
raise NotImplementedError("Subclasses should implement "+inspect.stack()[0][3]+"()")
def setControlAction(self,rate):
'''
Sets the value of control action in B/s
:param rate: the result of control action.
'''
self.control_action = rate
def getControlAction(self):
'''
Gets the value of control action in B/s
'''
return self.control_action
def isBuffering(self):
'''
Boolean expression returning true if the state of the player is buffering
:rtype: bool
'''
return self.feedback['queued_time'] < self.feedback['max_buffer_time']
def getIdleDuration(self):
'''
Gets the ``idle duration`` when the state of player is not buffering
'''
return self.idle_duration
def setIdleDuration(self, idle):
'''
Sets idle duration when in steady state
:param idle: seconds of idle between two consecutive downloads.
'''
if idle < 0:
idle = 0
debug(DEBUG, '%s setting Idle duration: %.2f', self, idle)
self.idle_duration = idle
def onPlaying(self):
'''
Called when changing state from pause to play
'''
#raise NotImplementedError("Subclasses should implement "+inspect.stack()[0][3]+"()")
pass
def onPaused(self):
'''
Called when changing state from play to pause (re-buffering event)'''
#raise NotImplementedError("Subclasses should implement "+inspect.stack()[0][3]+"()")
pass
def setPlayerFeedback(self, dict_params):
'''
Sets the dictionary of all player feedback used for the control.
This method is called from ``TapasPlayer`` before ``calcControlAction``
:param dict_params: dictionary of player feedbacks.
'''
self.feedback = dict_params
def quantizeRate(self, rate):
'''
Returns the highest level index below the ``rate``
:param rate: rate to be quantized.
:rtype: int
'''
new_level = 0
r=self.feedback['rates']
for i in range(0,len(r)):
if rate >= r[i]:
new_level = i
return new_level
| 28.601563 | 93 | 0.609123 | 446 | 3,661 | 4.90583 | 0.408072 | 0.007313 | 0.008227 | 0.054845 | 0.145795 | 0.145795 | 0.117459 | 0.117459 | 0.092779 | 0.063985 | 0 | 0.006013 | 0.273149 | 3,661 | 127 | 94 | 28.826772 | 0.816235 | 0.418738 | 0 | 0.052632 | 0 | 0 | 0.078348 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.315789 | false | 0.052632 | 0.052632 | 0.026316 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
bc948f4d43d1b72cc48ea4ada9d933e54a040ee6 | 639 | py | Python | Advanced_Calc.py | elazdi-al/AdvancedCalculator | fc6cf9dba12f24d0299b9216dbae2352c098a8a4 | [
"MIT"
] | null | null | null | Advanced_Calc.py | elazdi-al/AdvancedCalculator | fc6cf9dba12f24d0299b9216dbae2352c098a8a4 | [
"MIT"
] | null | null | null | Advanced_Calc.py | elazdi-al/AdvancedCalculator | fc6cf9dba12f24d0299b9216dbae2352c098a8a4 | [
"MIT"
] | null | null | null | print(" Calculator \n")
print(" ")
num1 = float(input("Please enter your first number: "))
op = input("Please enter your operator(+ or - or / or *): ")
num2 = float(input("Please enter your second number: "))
if op == "+":
print(num1 + num2)
elif op == "-":
print(num1 - num2)
elif op == "/":
print(num1 / num2)
elif op == "*":
print(num1*num2)
else:
print("Invalid expression. Read instructions carefully.") | 37.588235 | 145 | 0.411581 | 56 | 639 | 4.696429 | 0.428571 | 0.171103 | 0.1673 | 0.228137 | 0.463878 | 0.273764 | 0.273764 | 0.273764 | 0.273764 | 0.273764 | 0 | 0.029155 | 0.463224 | 639 | 17 | 146 | 37.588235 | 0.737609 | 0 | 0 | 0 | 0 | 0 | 0.594551 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.466667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
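The if/elif ladder above can also be written as a dictionary dispatch, which reduces an invalid operator to a single lookup failure. A sketch (the `calculate` function name is not part of the original script):

```python
import operator

# Map each operator symbol to the corresponding binary function.
OPS = {"+": operator.add, "-": operator.sub, "/": operator.truediv, "*": operator.mul}


def calculate(a, op, b):
    if op not in OPS:
        raise ValueError("Invalid expression. Read instructions carefully.")
    return OPS[op](a, b)


print(calculate(6.0, "/", 2.0))  # 3.0
```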
bc986dd17af5956c6e23001353873cca18449446 | 700 | py | Python | download-lostanimals.py | robertreppel/lostcreatures | cdeeaa6efd7c7f2bed81e2343c09fa966c12e605 | [
"MIT"
] | null | null | null | download-lostanimals.py | robertreppel/lostcreatures | cdeeaa6efd7c7f2bed81e2343c09fa966c12e605 | [
"MIT"
] | null | null | null | download-lostanimals.py | robertreppel/lostcreatures | cdeeaa6efd7c7f2bed81e2343c09fa966c12e605 | [
"MIT"
] | null | null | null |
# Import Vancouver's lost animals into Elasticsearch
import urllib
import json
from pprint import pprint
from datetime import datetime
from elasticsearch import Elasticsearch
vancouverLostAnimalsFtp = 'ftp://webftp.vancouver.ca/OpenData/json/LostAnimals.json'
print "Importing Vancouver lost & found animals from " + vancouverLostAnimalsFtp
lostAnimalsJson = urllib.urlopen(vancouverLostAnimalsFtp)
lostAnimalsJsonArray = json.load(lostAnimalsJson)
es = Elasticsearch("http://localhost")
animalCount = 0
for i in lostAnimalsJsonArray:
animalCount = animalCount + 1
res = es.index(index="animals", id=str(animalCount), doc_type="lost", body=i)
print
print "Imported " + str(animalCount)
| 29.166667 | 84 | 0.794286 | 79 | 700 | 7.025316 | 0.531646 | 0.068468 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003247 | 0.12 | 700 | 23 | 85 | 30.434783 | 0.897727 | 0.071429 | 0 | 0 | 0 | 0 | 0.213292 | 0.086553 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.4375 | null | null | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
bca853e846c6a6ef81fb0e0a2179ec0e2496da95 | 251 | py | Python | vff_push_config.py | ktanushree/cloudgenix_vff_push_config | 6fe9112296faaec606e7e8db28ca716c10525829 | [
"MIT"
] | 1 | 2021-02-03T23:47:34.000Z | 2021-02-03T23:47:34.000Z | vff_push_config.py | ktanushree/cloudgenix_vff_push_config | 6fe9112296faaec606e7e8db28ca716c10525829 | [
"MIT"
] | 1 | 2021-03-01T17:57:48.000Z | 2021-03-01T17:57:48.000Z | vff_push_config.py | ktanushree/cloudgenix_vff_push_config | 6fe9112296faaec606e7e8db28ca716c10525829 | [
"MIT"
] | 1 | 2021-02-03T23:55:27.000Z | 2021-02-03T23:55:27.000Z | #!/usr/bin/env python
"""
Stub script for vff_push_client
Should be installed via pip, but can use this script if it happens to be missed or not easy to find in path.
"""
from cloudgenix_vff_push_config import go
if __name__ == "__main__":
go()
| 22.818182 | 108 | 0.737052 | 44 | 251 | 3.909091 | 0.840909 | 0.081395 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.183267 | 251 | 10 | 109 | 25.1 | 0.839024 | 0.641434 | 0 | 0 | 0 | 0 | 0.097561 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
bcbcd9bf5e8da42c41a3b225033f44dbdbca3604 | 2,512 | py | Python | tests/autocomplete.py | silviogutierrez/reactivated | 8bbd3095cd07af5bb022f110233a184fd3946b44 | [
"MIT"
] | 178 | 2018-10-24T07:58:26.000Z | 2022-03-31T10:26:57.000Z | tests/autocomplete.py | silviogutierrez/reactivated | 8bbd3095cd07af5bb022f110233a184fd3946b44 | [
"MIT"
] | 26 | 2019-10-19T02:52:08.000Z | 2022-03-23T16:47:33.000Z | tests/autocomplete.py | silviogutierrez/reactivated | 8bbd3095cd07af5bb022f110233a184fd3946b44 | [
"MIT"
] | 5 | 2019-07-25T03:55:03.000Z | 2022-03-21T04:19:50.000Z | import pytest
from reactivated import forms
from sample.server.apps.samples import models
@pytest.mark.django_db
@pytest.mark.urls("tests.urls")
def test_autocomplete(client):
composer = models.Composer.objects.create(name="Richard Wagner")
models.Composer.objects.create(name="Wolfgang Amadeus Mozart")
assert client.get("/autocomplete-view/").status_code == 200
assert (
client.post(
"/autocomplete-view/",
{"name": "Zarzuela", "style": "BUFFA", "composer": composer.pk},
).status_code
== 302
)
response = client.get(
"/autocomplete-view/", {"autocomplete": "name", "query": "Wagner"}
)
assert "Rendered form" in str(response.content)
response = client.get(
"/autocomplete-view/", {"autocomplete": "composer", "query": "Wagner"}
)
assert response.json()["results"][0]["label"] == "Richard Wagner"
@pytest.mark.django_db
@pytest.mark.urls("tests.urls")
def test_invalid_value(client):
response = client.post(
"/autocomplete-view/", {"name": "Zarzuela", "composer": "21s7"}
)
assert "Select a valid choice" in response.context["form"].errors["composer"][0]
assert response.context["form"]["composer"].value() == "21s7"
@pytest.mark.django_db
@pytest.mark.urls("tests.urls")
def test_typed_autocomplete(client):
composer = models.Composer.objects.create(name="Richard Wagner")
models.Composer.objects.create(name="Wolfgang Amadeus Mozart")
assert client.get("/typed-autocomplete-view/").status_code == 200
assert (
client.post(
"/typed-autocomplete-view/", {"name": "Zarzuela", "composer": composer.pk}
).status_code
== 302
)
response = client.get(
"/typed-autocomplete-view/", {"autocomplete": "name", "query": "Wagner"}
)
assert "" in str(response.content)
response = client.get(
"/typed-autocomplete-view/", {"autocomplete": "composer", "query": "Wagner"}
)
assert response.json()["results"][0]["label"] == "Richard Wagner"
def test_prefix_calculation(client):
assert forms.get_form_or_form_set_descriptor("opera_form_set-0-composer_field") == (
"opera_form_set",
"composer_field",
)
assert forms.get_form_or_form_set_descriptor("opera_form-composer_field") == (
"opera_form",
"composer_field",
)
assert forms.get_form_or_form_set_descriptor("composer_field") == (
None,
"composer_field",
)
| 29.552941 | 88 | 0.646099 | 279 | 2,512 | 5.670251 | 0.247312 | 0.091024 | 0.053097 | 0.068268 | 0.777497 | 0.75158 | 0.709861 | 0.594817 | 0.537927 | 0.477244 | 0 | 0.010864 | 0.193869 | 2,512 | 84 | 89 | 29.904762 | 0.77037 | 0 | 0 | 0.415385 | 0 | 0 | 0.294188 | 0.062102 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.061538 | false | 0 | 0.046154 | 0 | 0.107692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bcc0a406924110c420b7d5a8ef367baf6d6ada85 | 1,470 | py | Python | dl/layers.py | nilayjain/DeepDive | 997b3380d1c65d0b2311220b24485fdbf4f0614c | [
"MIT"
] | 1 | 2020-12-14T15:41:16.000Z | 2020-12-14T15:41:16.000Z | dl/layers.py | nilayjain/DeepDive | 997b3380d1c65d0b2311220b24485fdbf4f0614c | [
"MIT"
] | null | null | null | dl/layers.py | nilayjain/DeepDive | 997b3380d1c65d0b2311220b24485fdbf4f0614c | [
"MIT"
] | null | null | null | import torch.nn as nn
from torch import Tensor as T
import torch.nn.functional as F
from typing import Type
class ConvLayer(nn.Module):
"""
A 2d conv layer, followed by bn, and relu nonlinearity.
N, C_in, H_in, W_in -> N, C_out, H_out, W_out
"""
def __init__(self, c_in: int, c_out: int, k: int, s: int = 1, p: int = 0):
"""
:param c_in: num input channels
:param c_out: num output channels
:param k: kernel size
:param s: stride
:param p: padding
"""
super().__init__()
self.conv = nn.Conv2d(c_in, c_out, k, s, p, bias=False)
self.bn = nn.BatchNorm2d(c_out)
def forward(self, x: T):
return F.relu(self.bn(self.conv(x)))
class LinearLayer(nn.Module):
"""
A linear layer, followed by bn, and relu nonlinearity.
"""
def __init__(self, c_in: int, c_out: int):
"""
:param c_in: in_features
:param c_out: out_features
"""
super().__init__()
self.linear = nn.Linear(c_in, c_out, bias=False)
self.bn = nn.BatchNorm1d(c_out)
def forward(self, x: T):
return F.relu(self.bn(self.linear(x)))
class ResBlock(nn.Module):
"""
implements a residual block with a skip connection.
connects 2 instances of a layer with a skip connection.
# TODO: clone or just take the layer function as input
"""
    def __init__(self, layer: Type[nn.Module]):
        super().__init__()
        self.layer = layer

    def forward(self, *input):
        pass
| 25.344828 | 78 | 0.591837 | 223 | 1,470 | 3.713004 | 0.345291 | 0.043478 | 0.039855 | 0.041063 | 0.285024 | 0.243961 | 0.243961 | 0.157005 | 0.157005 | 0.099034 | 0 | 0.006673 | 0.286395 | 1,470 | 57 | 79 | 25.789474 | 0.78265 | 0 | 0 | 0.190476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017544 | 0 | 0 | null | null | 0.047619 | 0.142857 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bcc2c8d5822a0fb3a7816235f4f9e5b31484871c | 419 | py | Python | views/graphs.py | clavesi/coronavirus-map | ad561af62be5cf3d28fd89bd47077236a5218171 | [
"MIT"
] | 1 | 2020-04-03T06:07:59.000Z | 2020-04-03T06:07:59.000Z | views/graphs.py | clavesi/coronavirus-map | ad561af62be5cf3d28fd89bd47077236a5218171 | [
"MIT"
] | 3 | 2020-03-24T01:49:56.000Z | 2020-09-02T05:18:36.000Z | views/graphs.py | clavesi/coronavirus-map | ad561af62be5cf3d28fd89bd47077236a5218171 | [
"MIT"
] | null | null | null | import requests
import json
from flask import render_template
def graphs():
"""
    Fetches the COVID-19 time series JSON from pomber's GitHub feed and passes
    the parsed data to the template.

    :return: the rendered graphs.html template, with the data bound as coviddata.
"""
info = requests.get("https://pomber.github.io/covid19/timeseries.json")
covid19 = json.loads(info.text)
return render_template('graphs.html', coviddata=covid19) | 29.928571 | 96 | 0.720764 | 59 | 419 | 5.084746 | 0.627119 | 0.093333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017544 | 0.183771 | 419 | 14 | 97 | 29.928571 | 0.859649 | 0.341289 | 0 | 0 | 0 | 0 | 0.229572 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
bcc2ce8344b9e1f529aff46dafcc36aa410c0644 | 653 | py | Python | app/__init__.py | katono254/News-Highlights | 032a80c3743a0300b09a24d48f2bc02c0a58bfea | [
"MIT"
] | null | null | null | app/__init__.py | katono254/News-Highlights | 032a80c3743a0300b09a24d48f2bc02c0a58bfea | [
"MIT"
] | null | null | null | app/__init__.py | katono254/News-Highlights | 032a80c3743a0300b09a24d48f2bc02c0a58bfea | [
"MIT"
] | null | null | null | from flask import Flask
from flask_bootstrap import Bootstrap
from config import config_options
bootstrap = Bootstrap()
def create_app(config_name):
    app = Flask(__name__, instance_relative_config=True)  # allows loading config from the instance folder

    # create the app configurations
    app.config.from_object(config_options[config_name])

    # initialize the Flask extensions that were installed
    bootstrap.init_app(app)
#registering the blueprint
from .main import main as main_blueprint
app.register_blueprint(main_blueprint)
#setting config
from .requests import configure_request
configure_request(app)
return app | 25.115385 | 92 | 0.764165 | 85 | 653 | 5.647059 | 0.447059 | 0.0375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.18683 | 653 | 26 | 93 | 25.115385 | 0.903955 | 0.229709 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.384615 | 0 | 0.538462 | 0.153846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
bcd8d26ff0e57a5e241b04f077ea45ecf71defc6 | 1,097 | py | Python | blog/models.py | Shawn9717/Neighberhood | d2cbc5e2489e56547703914f90fb663f8d1e92d0 | [
"MIT"
] | null | null | null | blog/models.py | Shawn9717/Neighberhood | d2cbc5e2489e56547703914f90fb663f8d1e92d0 | [
"MIT"
] | null | null | null | blog/models.py | Shawn9717/Neighberhood | d2cbc5e2489e56547703914f90fb663f8d1e92d0 | [
"MIT"
] | null | null | null | from django.db import models
from django.urls import reverse
from django.contrib.auth.models import User
from django.utils.timezone import now
# Create your models here.
class BlogPost(models.Model):
    title = models.CharField(max_length=255)
    author = models.ForeignKey(User, on_delete=models.CASCADE)
    slug = models.CharField(max_length=130)
    content = models.TextField()
    image = models.ImageField(upload_to="blog_pics", blank=True, null=True)
    dateTime = models.DateTimeField(auto_now_add=True)
def __str__(self):
return str(self.author) + " Blog Title: " + self.title
def get_absolute_url(self):
return reverse('blogs')
class Comment(models.Model):
user = models.ForeignKey(User, on_delete=models.CASCADE)
content = models.TextField()
blog = models.ForeignKey(BlogPost, on_delete=models.CASCADE)
parent_comment = models.ForeignKey('self', on_delete=models.CASCADE, null=True, blank=True)
dateTime=models.DateTimeField(default=now)
def __str__(self):
return self.user.username + " Comment: " + self.content
| 36.566667 | 98 | 0.724704 | 142 | 1,097 | 5.450704 | 0.408451 | 0.05168 | 0.072351 | 0.108527 | 0.105943 | 0.105943 | 0.105943 | 0 | 0 | 0 | 0 | 0.006557 | 0.165907 | 1,097 | 29 | 99 | 37.827586 | 0.839344 | 0.021878 | 0 | 0.173913 | 0 | 0 | 0.038282 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.130435 | false | 0 | 0.173913 | 0.130435 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
bce8bbe39e015fc7132410a2318f740f9c5ffc84 | 1,703 | py | Python | setup.py | bukzor/pre-commit | 7d546c1f815a41f17af276e2f8d0636efdd87a6d | [
"MIT"
] | null | null | null | setup.py | bukzor/pre-commit | 7d546c1f815a41f17af276e2f8d0636efdd87a6d | [
"MIT"
] | null | null | null | setup.py | bukzor/pre-commit | 7d546c1f815a41f17af276e2f8d0636efdd87a6d | [
"MIT"
] | null | null | null | from setuptools import find_packages
from setuptools import setup
setup(
name='pre_commit',
description=(
'A framework for managing and maintaining multi-language pre-commit '
'hooks.'
),
url='https://github.com/pre-commit/pre-commit',
version='0.3.0',
author='Anthony Sottile',
author_email='asottile@umich.edu',
platforms='linux',
classifiers=[
'License :: OSI Approved :: MIT License',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: Implementation :: CPython',
'Programming Language :: Python :: Implementation :: PyPy',
],
packages=find_packages('.', exclude=('tests*', 'testing*')),
package_data={
'pre_commit': [
'resources/pre-commit-hook',
'resources/rbenv.tar.gz',
'resources/ruby-build.tar.gz',
'resources/ruby-download.tar.gz',
]
},
install_requires=[
'argparse',
'aspy.yaml',
'cached-property',
'jsonschema',
'nodeenv>=0.11.1',
'ordereddict',
'plumbum',
'pyyaml',
'simplejson',
'virtualenv',
],
entry_points={
'console_scripts': [
'pre-commit = pre_commit.main:main',
'validate-config = pre_commit.clientlib.validate_config:run',
'validate-manifest = pre_commit.clientlib.validate_manifest:run',
],
},
)
| 28.864407 | 77 | 0.568409 | 162 | 1,703 | 5.888889 | 0.54321 | 0.09434 | 0.209644 | 0.081761 | 0.056604 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013957 | 0.284792 | 1,703 | 58 | 78 | 29.362069 | 0.769294 | 0 | 0 | 0.056604 | 0 | 0 | 0.549031 | 0.109219 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.037736 | 0 | 0.037736 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bcf1e0738f66e5a789d9f1bda6959e1f58569165 | 1,733 | py | Python | meraki_sdk/models/update_organization_branding_policies_priorities_model.py | meraki/meraki-python-sdk | 9894089eb013318243ae48869cc5130eb37f80c0 | [
"MIT"
] | 37 | 2019-04-24T14:01:33.000Z | 2022-01-28T01:37:21.000Z | meraki_sdk/models/update_organization_branding_policies_priorities_model.py | ankita66666666/meraki-python-sdk | 9894089eb013318243ae48869cc5130eb37f80c0 | [
"MIT"
] | 10 | 2019-07-09T16:35:11.000Z | 2021-12-07T03:47:53.000Z | meraki_sdk/models/update_organization_branding_policies_priorities_model.py | ankita66666666/meraki-python-sdk | 9894089eb013318243ae48869cc5130eb37f80c0 | [
"MIT"
] | 17 | 2019-04-30T23:53:21.000Z | 2022-02-07T22:57:44.000Z | # -*- coding: utf-8 -*-
"""
meraki_sdk
This file was automatically generated for meraki by APIMATIC v2.0 ( https://apimatic.io ).
"""
class UpdateOrganizationBrandingPoliciesPrioritiesModel(object):
"""Implementation of the 'updateOrganizationBrandingPoliciesPriorities' model.
TODO: type model description here.
Attributes:
branding_policy_ids (list of string): A list of branding policy IDs
arranged in ascending priority order (IDs later in the array have
higher priority).
"""
# Create a mapping from Model property names to API property names
_names = {
"branding_policy_ids":'brandingPolicyIds'
}
def __init__(self,
branding_policy_ids=None):
"""Constructor for the UpdateOrganizationBrandingPoliciesPrioritiesModel class"""
# Initialize members of the class
self.branding_policy_ids = branding_policy_ids
@classmethod
def from_dictionary(cls,
dictionary):
"""Creates an instance of this model from a dictionary
Args:
dictionary (dictionary): A dictionary representation of the object as
obtained from the deserialization of the server's response. The keys
MUST match property names in the API description.
Returns:
object: An instance of this structure class.
"""
if dictionary is None:
return None
# Extract variables from the dictionary
branding_policy_ids = dictionary.get('brandingPolicyIds')
# Return an object of this model
return cls(branding_policy_ids)
| 28.883333 | 95 | 0.637046 | 178 | 1,733 | 6.08427 | 0.477528 | 0.103416 | 0.125577 | 0.038781 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002494 | 0.305828 | 1,733 | 59 | 96 | 29.372881 | 0.897756 | 0.567224 | 0 | 0 | 1 | 0 | 0.092982 | 0 | 0 | 0 | 0 | 0.016949 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
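The generated model above follows APIMATIC's dictionary round-trip convention: `_names` maps Python attribute names to the API's JSON property names, and `from_dictionary` builds an instance from a deserialized response. A minimal standalone sketch of that convention (`ExampleModel` is hypothetical, not part of the SDK):

```python
class ExampleModel(object):
    """Tiny stand-in for an APIMATIC-style generated model."""

    # Maps Python attribute names to the API's JSON property names.
    _names = {"branding_policy_ids": "brandingPolicyIds"}

    def __init__(self, branding_policy_ids=None):
        self.branding_policy_ids = branding_policy_ids

    @classmethod
    def from_dictionary(cls, dictionary):
        """Build an instance from a deserialized JSON dictionary."""
        if dictionary is None:
            return None
        return cls(branding_policy_ids=dictionary.get("brandingPolicyIds"))


model = ExampleModel.from_dictionary({"brandingPolicyIds": ["1", "2", "3"]})
print(model.branding_policy_ids)  # -> ['1', '2', '3']
```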
bcf35b533df1d43e13ab2e0c705ff67ab122eb31 | 860 | py | Python | administration/serializers.py | HunterPen/PmpApi | c50ef9f8ecac7575fc6a45e96bacd18e56b9c0bd | [
"Apache-2.0"
] | null | null | null | administration/serializers.py | HunterPen/PmpApi | c50ef9f8ecac7575fc6a45e96bacd18e56b9c0bd | [
"Apache-2.0"
] | null | null | null | administration/serializers.py | HunterPen/PmpApi | c50ef9f8ecac7575fc6a45e96bacd18e56b9c0bd | [
"Apache-2.0"
] | null | null | null | from rest_framework import serializers
from login.models import Employee
from .models import News, Activity
class EmployeeSerializer(serializers.HyperlinkedModelSerializer):
part = serializers.ReadOnlyField(source='part.part_name')
room = serializers.ReadOnlyField(source='room.room_id')
class Meta:
model = Employee
fields = ('emp_id', 'name', 'sex', 'part', 'room')
class NewsSerializer(serializers.HyperlinkedModelSerializer):
part = serializers.ReadOnlyField(source='part.part_name')
class Meta:
model = News
fields = ('news_id', 'part', 'news_date', 'title', 'news_content', 'original_content', 'picture_path', )
class ActivitySerializer(serializers.HyperlinkedModelSerializer):
class Meta:
model = Activity
fields = ('act_id', 'act_name', 'act_content', 'true_content', )
| 29.655172 | 112 | 0.709302 | 89 | 860 | 6.696629 | 0.382022 | 0.186242 | 0.151007 | 0.174497 | 0.278523 | 0.278523 | 0.278523 | 0.278523 | 0.278523 | 0 | 0 | 0 | 0.173256 | 860 | 28 | 113 | 30.714286 | 0.838256 | 0 | 0 | 0.277778 | 0 | 0 | 0.189535 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
4c009936c2023e07deda9dc32695c49d29fe2596 | 2,125 | py | Python | src/knn.py | zhangxyz/MLFSdel | d12f173b0fd3aeea5278a16809df2576f63b97d6 | [
"BSD-2-Clause"
] | null | null | null | src/knn.py | zhangxyz/MLFSdel | d12f173b0fd3aeea5278a16809df2576f63b97d6 | [
"BSD-2-Clause"
] | null | null | null | src/knn.py | zhangxyz/MLFSdel | d12f173b0fd3aeea5278a16809df2576f63b97d6 | [
"BSD-2-Clause"
] | null | null | null | #!usr/bin/env python
#-*- coding: utf-8 -*-
import logging
import numpy as np
import sklearn
from sklearn import metrics
from sklearn import neighbors
from sklearn import cross_validation
from sklearn.datasets import load_svmlight_file
from sklearn.cross_validation import cross_val_score
import math
import sys
import os
from subprocess import *
from sklearn.feature_selection import SelectPercentile, f_classif, SelectKBest
from sklearn.pipeline import Pipeline
from sklearn.neighbors import DistanceMetric
from sklearn.neighbors import KNeighborsClassifier
def calculate_result(actual,pred):
    m_precision = metrics.precision_score(actual,pred)
    m_recall = metrics.recall_score(actual,pred)
    print 'predict info:'
    print 'accuracy:{0:.3f}'.format(metrics.accuracy_score(actual,pred))
    print 'precision:{0:.3f}'.format(m_precision)
    print 'recall:{0:.3f}'.format(m_recall)
    print 'f1-score:{0:.3f}'.format(metrics.f1_score(actual,pred))
def calculate_k(x,y):
clf = KNeighborsClassifier(n_neighbors=3)
max_metric=cross_val_score(clf,x,y,cv=10,scoring='f1').mean()
k=3
for i in range(4,100):
        clf = KNeighborsClassifier(n_neighbors=i)  # try each candidate k
metric = cross_val_score(clf,x,y,cv=10,scoring='f1').mean()
if max_metric < metric :
max_metric=metric
k=i
return k
def knn_result(train_x,train_y,test_x,test_y,out_file):
print '*************************\nKNN\n*************************'
#c= calculate_k(train_x,train_y)
clf = KNeighborsClassifier(n_neighbors=10)
#clf = KNeighborsClassifier(n_neighbors=c)
clf.fit(train_x,train_y)
    pred = clf.predict(test_x)
    calculate_result(test_y,pred)
np.savetxt(out_file,pred,fmt='%d')
#print(c)
def knn_select(train_x,train_y,test_x,test_y):
#c= calculate_k(train_x,train_y)
clf = KNeighborsClassifier(n_neighbors=10)
#clf = KNeighborsClassifier(n_neighbors=c)
clf.fit(train_x,train_y)
    pred = clf.predict(test_x)
return metrics.precision_score(test_y,pred)
#print(c)
| 32.19697 | 78 | 0.709176 | 307 | 2,125 | 4.716612 | 0.283388 | 0.075967 | 0.099448 | 0.13674 | 0.337017 | 0.313536 | 0.265193 | 0.265193 | 0.234807 | 0.234807 | 0 | 0.016237 | 0.159529 | 2,125 | 65 | 79 | 32.692308 | 0.794513 | 0.101647 | 0 | 0.166667 | 0 | 0 | 0.073684 | 0.03 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.354167 | null | null | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
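The search in `calculate_k` is an argmax over candidate k values of a cross-validated F1 score, keeping the first k seen on ties. That selection logic, isolated from sklearn into a self-contained sketch (`best_k` and the toy `scores` table are illustrative, not part of the original script):

```python
def best_k(score_fn, candidates):
    """Return the candidate with the highest score, keeping the first
    candidate seen on ties (mirroring the loop in calculate_k)."""
    best, best_score = None, float('-inf')
    for k in candidates:
        score = score_fn(k)
        if score > best_score:
            best, best_score = k, score
    return best

# Toy stand-in for cross_val_score(KNeighborsClassifier(n_neighbors=k), ...).mean()
scores = {3: 0.71, 5: 0.78, 7: 0.78, 9: 0.74}
print(best_k(lambda k: scores[k], [3, 5, 7, 9]))  # -> 5 (first of the tied best)
```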
4c03d48e533ce8b54414229f1e9e6cb360c43bf5 | 666 | py | Python | packages/openshift/__init__.py | jinho10/openshift-client-python | e369c6c37f961b674b70041a6597a0c9086637de | [
"Apache-2.0"
] | null | null | null | packages/openshift/__init__.py | jinho10/openshift-client-python | e369c6c37f961b674b70041a6597a0c9086637de | [
"Apache-2.0"
] | null | null | null | packages/openshift/__init__.py | jinho10/openshift-client-python | e369c6c37f961b674b70041a6597a0c9086637de | [
"Apache-2.0"
] | null | null | null | from __future__ import absolute_import
from .context import *
from .base_verbs import *
from .model import OpenShiftPythonException
from .model import Model, Missing
from .selector import *
from .apiobject import *
from . import naming
from . import status
from . import config
from .ansible import ansible
# Single source for module version
__VERSION__ = __version__ = '1.0.6'
null = None # Allow scripts to specify null in object definitions
# Allows modules to trigger errors
def error(msg, **kwargs):
raise OpenShiftPythonException(msg, **kwargs)
# Convenience method for accessing the module version
def get_module_version():
return __VERSION__
| 23.785714 | 66 | 0.777778 | 86 | 666 | 5.790698 | 0.55814 | 0.100402 | 0.060241 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005376 | 0.162162 | 666 | 27 | 67 | 24.666667 | 0.887097 | 0.253754 | 0 | 0 | 0 | 0 | 0.010163 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.647059 | 0.058824 | 0.823529 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
4c0a8bf4c1a5de32a32f24829bd3bceaa0fb251c | 556 | py | Python | firefly/distributed/reference.py | genghaolove/firefly | 7120b10ae3ebd7b89fd97274eff78fab54764110 | [
"MIT"
] | 675 | 2015-01-01T05:18:30.000Z | 2022-03-18T08:27:06.000Z | firefly/distributed/reference.py | fuhongxue/Firefly | 1c4b11f90782ef8aa2c96afe2fb67f3d65c03a2e | [
"MIT"
] | 3 | 2015-01-29T02:36:14.000Z | 2022-01-21T09:19:21.000Z | firefly/distributed/reference.py | fuhongxue/Firefly | 1c4b11f90782ef8aa2c96afe2fb67f3d65c03a2e | [
"MIT"
] | 248 | 2015-01-04T08:24:31.000Z | 2022-02-18T07:14:02.000Z | #coding:utf8
'''
Created on 2013-8-14
@author: lan (www.9miao.com)
'''
from twisted.spread import pb
from firefly.utils.services import Service
class ProxyReference(pb.Referenceable):
    '''Proxy channel.'''
def __init__(self):
        '''Initialize with a default proxy service.'''
self._service = Service('proxy')
def addService(self,service):
        '''Add a service channel.'''
self._service = service
def remote_callChild(self, command,*arg,**kw):
        '''Forward data from the proxy to the target service.'''
return self._service.callTarget(command,*arg,**kw)
| 19.172414 | 58 | 0.57554 | 59 | 556 | 5.288136 | 0.677966 | 0.141026 | 0.115385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0225 | 0.280576 | 556 | 29 | 59 | 19.172414 | 0.7575 | 0.172662 | 0 | 0 | 0 | 0 | 0.011547 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.222222 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
4c0cd92d5bb05763e086b13002fedbb518032f74 | 640 | py | Python | hopstack/yamlparser.py | willpatterson/HOPSTACK | 5904063f08d75c916d7e652e30375cf0c8f0a59f | [
"MIT"
] | 1 | 2017-11-25T18:24:27.000Z | 2017-11-25T18:24:27.000Z | hopstack/yamlparser.py | willpatterson/HOPSTACK | 5904063f08d75c916d7e652e30375cf0c8f0a59f | [
"MIT"
] | null | null | null | hopstack/yamlparser.py | willpatterson/HOPSTACK | 5904063f08d75c916d7e652e30375cf0c8f0a59f | [
"MIT"
] | null | null | null | """
TODO:
Turn read_yaml() into LASDO factory
explore multiple yaml files per actual file
"""
import yaml
from collections import namedtuple
def read_yaml(yaml_path):
"""
Reads YAML file and yields its objects
"""
YmlObj = namedtuple('YmlObj', ['name', 'data'])
with open(yaml_path, 'r') as yfile:
raw_yaml_data = yaml.load(yfile)
print(raw_yaml_data)
for yobj in raw_yaml_data:
#yield YmlObj(name=yobj, data=yobj[
print(yobj.keys())
yield yobj
if __name__ == '__main__':
yaml_path = './test/yaml_reader_test.yml'
for i in read_yaml(yaml_path):
pass
| 21.333333 | 51 | 0.64375 | 89 | 640 | 4.370787 | 0.52809 | 0.082262 | 0.084833 | 0.082262 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.246875 | 640 | 29 | 52 | 22.068966 | 0.807054 | 0.260938 | 0 | 0 | 0 | 0 | 0.111111 | 0.06 | 0 | 0 | 0 | 0.034483 | 0 | 1 | 0.071429 | false | 0.071429 | 0.142857 | 0 | 0.214286 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
4c1396f68c8b0917047856e040a9d2462e7a07c0 | 802 | py | Python | corehq/util/workbook_reading/__init__.py | dimagilg/commcare-hq | ea1786238eae556bb7f1cbd8d2460171af1b619c | [
"BSD-3-Clause"
] | 471 | 2015-01-10T02:55:01.000Z | 2022-03-29T18:07:18.000Z | corehq/util/workbook_reading/__init__.py | dimagilg/commcare-hq | ea1786238eae556bb7f1cbd8d2460171af1b619c | [
"BSD-3-Clause"
] | 14,354 | 2015-01-01T07:38:23.000Z | 2022-03-31T20:55:14.000Z | corehq/util/workbook_reading/__init__.py | dimagilg/commcare-hq | ea1786238eae556bb7f1cbd8d2460171af1b619c | [
"BSD-3-Clause"
] | 175 | 2015-01-06T07:16:47.000Z | 2022-03-29T13:27:01.000Z | from corehq.util.workbook_reading.exceptions import (
CellValueError,
SpreadsheetFileEncrypted,
SpreadsheetFileError,
SpreadsheetFileExtError,
SpreadsheetFileInvalidError,
SpreadsheetFileNotFound,
)
from .datamodels import Cell, Workbook, Worksheet
from .adapters import (
make_worksheet,
open_any_workbook,
open_csv_workbook,
open_xls_workbook,
open_xlsx_workbook,
valid_extensions,
)
__all__ = [
'open_csv_workbook',
'open_xls_workbook',
'open_xlsx_workbook',
'open_any_workbook',
'make_worksheet',
'valid_extensions',
'SpreadsheetFileError',
'SpreadsheetFileExtError',
'SpreadsheetFileInvalidError',
'SpreadsheetFileNotFound',
'SpreadsheetFileEncrypted',
'Workbook',
'Worksheet',
'Cell',
]
| 20.564103 | 53 | 0.721945 | 63 | 802 | 8.793651 | 0.412698 | 0.129964 | 0.252708 | 0.33574 | 0.166065 | 0.166065 | 0.166065 | 0.166065 | 0.166065 | 0 | 0 | 0 | 0.193267 | 802 | 38 | 54 | 21.105263 | 0.85626 | 0 | 0 | 0 | 0 | 0 | 0.295511 | 0.120948 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c1512d94a8a14451661a5c853159e9f9dda1dd4 | 223 | py | Python | sdk/tables/azure-data-tables/azure/data/tables/_generated/aio/__init__.py | iamvishnuks/azure-sdk-for-python | 4df435651ab32f57b1e9f33fc65fd46632055704 | [
"MIT"
] | null | null | null | sdk/tables/azure-data-tables/azure/data/tables/_generated/aio/__init__.py | iamvishnuks/azure-sdk-for-python | 4df435651ab32f57b1e9f33fc65fd46632055704 | [
"MIT"
] | null | null | null | sdk/tables/azure-data-tables/azure/data/tables/_generated/aio/__init__.py | iamvishnuks/azure-sdk-for-python | 4df435651ab32f57b1e9f33fc65fd46632055704 | [
"MIT"
] | 1 | 2020-07-31T16:33:51.000Z | 2020-07-31T16:33:51.000Z | # ------------------------------------
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
# ------------------------------------
__all__ = ['AzureTable']
from ._azure_table_async import AzureTable
| 20.272727 | 42 | 0.497758 | 17 | 223 | 6.117647 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116592 | 223 | 10 | 43 | 22.3 | 0.527919 | 0.636771 | 0 | 0 | 0 | 0 | 0.135135 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
4c15ee5f62d8c5692b73dbe236abe532b530edb1 | 464 | py | Python | challenges/list_sorts/insertion_sort/insertion_sort.py | ravewillow6383/data-structures-and-algorithms-python | 98533ee241a3ae452dab1ecb87aab39742005e35 | [
"MIT"
] | null | null | null | challenges/list_sorts/insertion_sort/insertion_sort.py | ravewillow6383/data-structures-and-algorithms-python | 98533ee241a3ae452dab1ecb87aab39742005e35 | [
"MIT"
] | null | null | null | challenges/list_sorts/insertion_sort/insertion_sort.py | ravewillow6383/data-structures-and-algorithms-python | 98533ee241a3ae452dab1ecb87aab39742005e35 | [
"MIT"
] | null | null | null | def insertion_sort(lst):
"""Returns a sorted array.
A provided list will be sorted out of place. Returns a new list sorted smallest to largest."""
    for i in range(1, len(lst)):
        current_idx = i
        temp_value = lst[i]
        while current_idx > 0 and lst[current_idx - 1] > temp_value:
            lst[current_idx] = lst[current_idx - 1]
            current_idx = current_idx - 1
        lst[current_idx] = temp_value
return lst | 30.933333 | 98 | 0.616379 | 69 | 464 | 3.971014 | 0.478261 | 0.291971 | 0.237226 | 0.10219 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015385 | 0.299569 | 464 | 15 | 99 | 30.933333 | 0.827692 | 0.247845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
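A quick standalone check of the routine (re-declared here, with the `temp_value` spelling normalized, so the snippet runs on its own):

```python
def insertion_sort(lst):
    """In-place insertion sort: shift larger elements right, then drop
    the saved value into its slot; returns the same (now sorted) list."""
    for i in range(1, len(lst)):
        current_idx = i
        temp_value = lst[i]
        while current_idx > 0 and lst[current_idx - 1] > temp_value:
            lst[current_idx] = lst[current_idx - 1]
            current_idx -= 1
        lst[current_idx] = temp_value
    return lst

print(insertion_sort([5, 2, 9, 1, 5, 6]))  # -> [1, 2, 5, 5, 6, 9]
```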
4c27d53c6b7ba45a76af771be2d62c38430c04c7 | 238 | py | Python | website/src/rss/urls.py | iamcholo/videoplatform | 72dd1db73e1c940e5992dacbb63feb8fc11394e3 | [
"Apache-2.0"
] | null | null | null | website/src/rss/urls.py | iamcholo/videoplatform | 72dd1db73e1c940e5992dacbb63feb8fc11394e3 | [
"Apache-2.0"
] | 9 | 2020-06-05T19:18:35.000Z | 2022-03-11T23:30:50.000Z | website/src/rss/urls.py | iamcholo/videoplatform | 72dd1db73e1c940e5992dacbb63feb8fc11394e3 | [
"Apache-2.0"
] | null | null | null | from django.conf.urls import url
from utilities.generic.base import TemplateView
from rss import views
# Create your views here.
urlpatterns = [
url(r'^rss.xml$',
views.rss,
name='rss'
),
] | 23.8 | 48 | 0.596639 | 29 | 238 | 4.896552 | 0.655172 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.306723 | 238 | 10 | 49 | 23.8 | 0.860606 | 0.096639 | 0 | 0 | 0 | 0 | 0.058537 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
4c303e0c7666d731b93c1c531688762d259c5a5e | 3,562 | py | Python | infoBottleneck.py | usc-sail/IBdiar | 1d13b7a2bacfc82bb74c15b66a739b4d5872af46 | [
"MIT"
] | 3 | 2019-02-04T02:44:59.000Z | 2020-09-08T15:42:58.000Z | infoBottleneck.py | usc-sail/IBdiar | 1d13b7a2bacfc82bb74c15b66a739b4d5872af46 | [
"MIT"
] | null | null | null | infoBottleneck.py | usc-sail/IBdiar | 1d13b7a2bacfc82bb74c15b66a739b4d5872af46 | [
"MIT"
] | 3 | 2018-05-22T17:16:46.000Z | 2019-10-17T05:20:33.000Z | #!/bin/python
# Date created: Nov 5 2017
#PRINT AN USAGE INFORMATION
import sys
import argparse
import numpy as np
from functions import *
from scipy.cluster.hierarchy import fcluster
parser = argparse.ArgumentParser()
parser.add_argument("--beta", help="Lagrangian parameter in the IB criterion (10)", default=10,type=float)
parser.add_argument("--segLen", help="Size of each segment during uniform segmentation (seconds, 2)",default=2,type=int)
parser.add_argument("--frameRate", help="Number of frames per second (Hz, 50). Must match with VAD file, if supplied",default=50,type=int)
parser.add_argument("--numCluster", help="Number of speakers (2)",default=2,type=int)
parser.add_argument("--library", help="For feature extractio and GMM training: kaldi/sklearn (kaldi)",default="kaldi")
parser.add_argument("--vadFile", help="Voiced(1)/Unvoiced(0) values. One frame per line",default=None)
parser.add_argument("--gmmFile", help="Pre-trained GMM model",default=None)
parser.add_argument("--localGMM", help="Whether to train a GMM locally (1) or not (0). This argument overrides --gmmFile",default=1,type=int)
parser.add_argument("--kaldiRoot", help="Kaldi root location",default=None)
parser.add_argument("--numMix", help="Number of Gaussian components. Will be over-written if supplied with pre-trained GMM model",default=64,type=int)
parser.add_argument("--minBlockLen", help="Minimum number of frames to be treated as a contiguous unit during re-alignment (Number of frames,25)",default=25,type=int)
parser.add_argument("--numRealignments", help="Number of Viterbi realignments (1)",default=1,type=int)
parser.add_argument("wavFile")
parser.add_argument("rttmFile")
args = parser.parse_args()
if args.localGMM == 0 and args.gmmFile == None:
print("Please provide a GMM file if not training locally\n\n")
parser.print_help()
sys.exit(1)
if args.localGMM == 1 and args.gmmFile != None:
print("Overriding pre-trained GMM model since localGMM is set to 1")
if args.library == "kaldi" and args.kaldiRoot == None:
print("Please provide the kaldi root directory\n\n")
parser.print_help()
sys.exit(1)
np.random.seed(1000)
if args.vadFile is not None:
vad = np.loadtxt(args.vadFile).astype('bool')
#vad = np.interp(np.linspace(0,len(vad),int(len(vad)*args.frameRate/100.0)),np.arange(len(vad)),vad).astype('bool')
else:
vad = None
if args.library == "kaldi":
mfcc,vad,p_y_x = trainGMMWithKaldi(args.wavFile, args.gmmFile, args.frameRate, args.segLen, args.kaldiRoot, vad, args.localGMM, args.numMix)
else:
mfcc,vad,p_y_x = trainGMMWithSklearn(args.wavFile, args.gmmFile, args.frameRate, args.segLen, vad, args.localGMM, args.numMix)
p_y_x[p_y_x<1e-10] = 1e-10
Z,C = cluster(p_y_x,args.beta,0)
clust = fcluster(Z,args.numCluster,criterion='maxclust')
frameClust = convertDecisionsSegToFrame(clust, args.segLen, args.frameRate, mfcc.shape[0])
pass1hyp = -1*np.ones(len(vad))
pass1hyp[vad] = frameClust
prevPassHyp = pass1hyp
for realignIter in range(args.numRealignments):
frameClust = viterbiRealignment(mfcc,frameClust,args.segLen,args.frameRate,args.minBlockLen,numMix=16)
nextPassHyp = -1*np.ones(len(vad))
nextPassHyp[vad] = frameClust
# If any speaker was lost during realignment, use hypothesis from previous iteration
if len(np.unique(nextPassHyp)) < args.numCluster+1:
writeRttmFile(prevPassHyp, args.frameRate, args.wavFile, args.rttmFile)
break
else:
prevPassHyp = nextPassHyp
writeRttmFile(nextPassHyp, args.frameRate, args.wavFile, args.rttmFile)
| 44.525 | 166 | 0.74621 | 524 | 3,562 | 5.020992 | 0.343511 | 0.047891 | 0.09046 | 0.042569 | 0.259597 | 0.129989 | 0.102623 | 0.078297 | 0 | 0 | 0 | 0.019078 | 0.117069 | 3,562 | 79 | 167 | 45.088608 | 0.817488 | 0.072993 | 0 | 0.122807 | 0 | 0.017544 | 0.29633 | 0.006369 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.157895 | 0.087719 | null | null | 0.087719 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
4c4a11045ea0c10cb9e72b3baba06ada502d72c2 | 813 | py | Python | src/28-ImplementStrstr.py | Jiezhi/myleetcode | b346e94c46da2a3033ebc8ff50e621aa179c4f62 | [
"MIT"
] | 1 | 2022-03-03T15:11:48.000Z | 2022-03-03T15:11:48.000Z | src/28-ImplementStrstr.py | Jiezhi/myleetcode | b346e94c46da2a3033ebc8ff50e621aa179c4f62 | [
"MIT"
] | null | null | null | src/28-ImplementStrstr.py | Jiezhi/myleetcode | b346e94c46da2a3033ebc8ff50e621aa179c4f62 | [
"MIT"
] | 2 | 2022-01-20T22:49:58.000Z | 2022-01-20T22:53:13.000Z | #!/usr/bin/env python
"""
https://leetcode.com/problems/implement-strstr/description/
Created on 2018-11-13
@author: 'Jiezhi.G@gmail.com'
Reference:
"""
class Solution:
def strStr(self, haystack, needle):
"""
:type haystack: str
:type needle: str
:rtype: int
"""
if not needle or haystack == needle:
return 0
len_n = len(needle)
for i in range(len(haystack) - len_n + 1):
if haystack[i:i + len_n] == needle:
return i
return -1
def test():
assert Solution().strStr("test", "") == 0
assert Solution().strStr("hello", "ll") == 2
assert Solution().strStr("hello", "k") == -1
assert Solution().strStr("hello", "hello") == 0
assert Solution().strStr("mississippi", "pi") == 9
| 23.911765 | 59 | 0.557196 | 100 | 813 | 4.5 | 0.52 | 0.155556 | 0.222222 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027304 | 0.279213 | 813 | 33 | 60 | 24.636364 | 0.740614 | 0.241082 | 0 | 0 | 0 | 0 | 0.070175 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.133333 | false | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c5f1bc11a54135f463312f95775796d3f97a21f | 1,463 | py | Python | setup.py | BlackBoiler/pyap | 2653dd75c37c8f305422ae849419e18a7a2c9486 | [
"MIT"
] | null | null | null | setup.py | BlackBoiler/pyap | 2653dd75c37c8f305422ae849419e18a7a2c9486 | [
"MIT"
] | null | null | null | setup.py | BlackBoiler/pyap | 2653dd75c37c8f305422ae849419e18a7a2c9486 | [
"MIT"
] | null | null | null | from setuptools import setup
def readme():
with open('README.rst') as f:
return f.read()
setup(name='pyap',
version='0.2.0-bb',
description='Library for detecting and parsing addresses.'
' Currently supports US, Canadian and British addresses.',
long_description=readme(),
long_description_content_type="text/x-rst",
keywords='address detection, address parsing',
url='http://github.com/vladimarius/pyap',
author='Vladimir Goncharov',
author_email='vladimarius@gmail.com',
license='MIT',
packages=['pyap', 'pyap.packages', 'pyap.source_CA', 'pyap.source_US', 'pyap.source_GB'],
download_url='https://github.com/vladimarius/pyap',
zip_safe=False,
classifiers=[
'Intended Audience :: Developers',
'Development Status :: 4 - Beta',
'License :: OSI Approved :: MIT License',
'Natural Language :: English',
'Programming Language :: Python',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.1',
'Programming Language :: Python :: 3.2',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Topic :: Software Development :: Libraries',
'Topic :: Scientific/Engineering :: Information Analysis',
'Topic :: Utilities'
],
)
| 36.575 | 95 | 0.602187 | 150 | 1,463 | 5.806667 | 0.573333 | 0.152698 | 0.200918 | 0.149254 | 0.061998 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013812 | 0.25769 | 1,463 | 39 | 96 | 37.512821 | 0.788214 | 0 | 0 | 0 | 0 | 0 | 0.564593 | 0.029392 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028571 | true | 0 | 0.028571 | 0 | 0.085714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c633744e0bad8ef9cafe3faa3cef31df0a68f24 | 5,404 | py | Python | ceee/testing/utils/com_mock.py | Gitman1989/chromium | 2b1cceae1075ef012fb225deec8b4c8bbe4bc897 | [
"BSD-3-Clause"
] | 2 | 2017-09-02T19:08:28.000Z | 2021-11-15T15:15:14.000Z | ceee/testing/utils/com_mock.py | Gitman1989/chromium | 2b1cceae1075ef012fb225deec8b4c8bbe4bc897 | [
"BSD-3-Clause"
] | null | null | null | ceee/testing/utils/com_mock.py | Gitman1989/chromium | 2b1cceae1075ef012fb225deec8b4c8bbe4bc897 | [
"BSD-3-Clause"
] | 1 | 2020-04-13T05:45:10.000Z | 2020-04-13T05:45:10.000Z | #!/bin/env python
# Copyright (c) 2010 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Generates a file with MOCK_METHODX(...) macros for a COM interface.
For interfaces that will not change, you can run this file by hand. For
interfaces that may still change, you could run it as part of your build
system to generate code, then #include the generated list of methods inside an
appropriate wrapper class.
Usage:
com_mock.py INTERFACE_NAME HEADER [ADD_HEADERS]* > output-inl.h
"""
import re
import sys
# Finds an interface block in a header file generated by IDL.
_INTERFACE_PATTERN = (r'MIDL_INTERFACE\("[A-Za-z0-9\-]+"\)\s*%s\s*:\s*'
r'public\s*(?P<base>\w+)\s*{\s*public:'
r'(?P<method_block>[^}]+)};')
# Parses out a method from within an interface block.
_METHOD_RE = re.compile(r'virtual\s*(?:/\*[^*]+\*/)?\s*(?P<ret>\w+)\s*'
r'STDMETHODCALLTYPE\s*(?P<name>\w+)\s*\('
r'(?P<params>[^\)]+)\)\s*=\s*0;')
# Finds inline C-style comments.
_COMMENTS_RE = re.compile(r'/\*.*?\*/')
# Finds __RPC_xyz defines.
_RPC_RE = re.compile(r' __RPC\w*')
# Finds whitespace.
_WHITESPACE_RE = re.compile(r'\s+')
# Finds default argument values.
_DEFAULT_ARG_RE = re.compile(r'\s*=\s*[^,]*')
# Header for generated files.
_FILE_HEADER = '''\
// Copyright (c) 2010 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
//
// Auto-generated by com_mock.py
'''
class Mocker(object):
"""Given a set of header files, can generate mocky goodness for interfaces
defined in them.
"""
def __init__(self):
self.headers_ = ''
def AddHeaders(self, header_files):
"""Adds the specified files to the headers to parse."""
assert header_files
# Slurp all the header files into a single huge big string.
# Yummy nonperformant code - but it probably doesn't need to be.
headers = []
for f in header_files:
fo = open(f, 'r')
headers.append(fo.read())
fo.close()
self.headers_ += '\n'.join(headers)
def _GetInterfaceInfo(self, interface_name):
"""Returns a tuple (base_interface, method_block) or None."""
m = re.search(_INTERFACE_PATTERN % interface_name, self.headers_)
if m:
return (m.group('base'), m.group('method_block'))
else:
return None
@staticmethod
def _ProcessParams(params):
"""Return a tuple (count, cleaned_up_params)."""
if params == ' void':
return (0, '')
else:
params = params.replace('\n', ' ')
params = _COMMENTS_RE.sub('', params)
params = _RPC_RE.sub(' ', params)
params = _WHITESPACE_RE.sub(' ', params)
params = _DEFAULT_ARG_RE.sub(' ', params)
return (params.count(',') + 1, params.strip())
def _MockMethodsInBlock(self, block):
"""Return a list of mock functions for each of the methods defined
in 'block', which is the text between 'public:' and '};' in an interface
definition.
"""
methods = []
for m in _METHOD_RE.finditer(block):
return_type = m.group('ret')
name = m.group('name')
params = m.group('params')
(param_count, clean_params) = self._ProcessParams(params)
method_definition = ('MOCK_METHOD%d_WITH_CALLTYPE(__stdcall, %s, %s(%s));'
% (param_count, name, return_type, clean_params))
if len(method_definition) > 78:
method_definition = (
'MOCK_METHOD%d_WITH_CALLTYPE(__stdcall, %s, %s(\n %s));' %
(param_count, name, return_type, clean_params))
methods.append(method_definition)
return methods
def MockInterface(self, name):
"""Returns a list of all the mock methods needed for the given
interface (including methods from inherited interfaces, but stopping
short of IDispatch and IUnknown).
"""
info = self._GetInterfaceInfo(name)
if not info:
return ''
# Generate inherited methods
methods = []
if info[0] not in ('IUnknown', 'IDispatch'):
methods.extend(self.MockInterface(info[0]))
methods.extend(self._MockMethodsInBlock(info[1]))
return methods
def MockMethods(interface_name, header_files):
"""Returns a string with a correctly filled-out MOCK_METHODX(...) line
for each method in the given interface, plus all of its inherited interfaces,
terminating when IDispatch or IUnknown is reached.
You must list all header files required to find the interface itself and
all of the interfaces it inherits from, except IDispatch and IUnknown.
Header files must be IDL-generated for the pattern matching used in the
code to work correctly.
Args:
interface_name: 'IWebBrowser2'
header_files: ['c:\\platform_sdk\\files\\Include\\exdisp.h',
'c:\\platform_sdk\\files\\Include\\oaidl.h']
Returns:
'MOCK_METHOD1_WITH_CALLTYPE(__stdcall, GetWindow, HRESULT()); ...'
"""
mocker = Mocker()
mocker.AddHeaders(header_files)
return '\n'.join(mocker.MockInterface(interface_name))
def Main(args):
if not args or len(args) < 2:
print __doc__
return 1
else:
print _FILE_HEADER
print MockMethods(args[0], args[1:])
return 0
if __name__ == '__main__':
sys.exit(Main(sys.argv[1:]))
| 32.359281 | 80 | 0.657476 | 740 | 5,404 | 4.647297 | 0.314865 | 0.005816 | 0.015993 | 0.01483 | 0.136086 | 0.122129 | 0.122129 | 0.122129 | 0.101192 | 0.101192 | 0 | 0.006113 | 0.21299 | 5,404 | 166 | 81 | 32.554217 | 0.802492 | 0.105292 | 0 | 0.082353 | 1 | 0 | 0.206206 | 0.095023 | 0 | 0 | 0 | 0 | 0.011765 | 0 | null | null | 0 | 0.023529 | null | null | 0.035294 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
#!/usr/bin/env python
# Source: strategy/src/example.py from errrr0501/wrs2020 (MIT license)
"""Use to generate arm task and run."""
import os
import sys
import rospy
from arm_control import ArmTask, SuctionTask
from std_msgs.msg import String, Float64, Bool, Int32
idle = 0
busy = 1
initPose = 2
frontSafetyPos = 3
rearSafetyPos = 4
move2Bin = 5
move2Shelf = 6
moveIn2Shelf = 7
leaveBin = 8
leaveShelf = 9
move2Object = 10
move2PlacedPos = 11
pickObject = 12
placeObject = 13
move2point = 14
catchobj = 15
back2point = 16
def start_callback(msg):
global is_start
if not is_start:
is_start = msg.data
def next_pub(msg):
pub = rospy.Publisher(
'scan_black/strategy_behavior',
Int32,
# latch=True,
queue_size=1
)
if msg != 0:
pub.publish(msg)
def start_sub():
global is_start
rospy.Subscriber(
'scan_black/dualarm_start',
Bool,
start_callback,
queue_size=1
)
class exampleTask:
def __init__(self, _name ='/robotis'):
"""Initial object."""
en_sim = False
if len(sys.argv) >= 2:
rospy.set_param('en_sim', sys.argv[1])
en_sim = rospy.get_param('en_sim')
# if en_sim:
# print en_sim
# return
self.name = _name
self.state = initPose
self.nextState = idle
self.arm = ArmTask(self.name +'_arm',1)
self.pick_list = 2
self.pos = (0, 0, 0)
self.euler = (0, 0, 0)
self.phi = 0
if en_sim:
self.suction = SuctionTask(self.name + '_gazebo')
print "aa"
else:
self.suction = SuctionTask(self.name)
print "bb"
@property
def finish(self):
return self.pick_list == 0
def getRearSafetyPos(self):
if self.name == 'right':
self.pos, self.euler, self.phi = (-0.1, -0.45, -0.45), (90, 0, 0), -30
#self.pos, self.euler, self.phi = (0.3, -0.3006, -0.46), (5.029, 82.029, 4.036), 60
print "a1"
elif self.name == 'left':
self.pos, self.euler, self.phi = (-0.1, 0.45, -0.45), (-90, 0, 0), 30
#self.pos, self.euler, self.phi = (0.3, 0.3506, -0.46), (5.029, 82.029, 4.036), -60
print "b1"
def getFrontSafetyPos(self):
if self.name == 'right':
self.pos, self.euler, self.phi = (0.1, -0.45, -0.45), (0, 20, 0), 45
print "a2"
elif self.name == 'left':
self.pos, self.euler, self.phi = (0.1, 0.45, -0.45), (0, 20, 0), -45
print "b2"
def getObjectPos(self):
lunchboxPos = [[-0.4, -0.15, -0.63],
[-0.4, -0.15, -0.68]]
drinkPos = [[-0.4, 0.15, -0.63],
[-0.4, 0.15, -0.68]]
if self.name == 'right':
self.pos, self.euler, self.phi = lunchboxPos[2-self.pick_list], (90, 0, 0), -30
print "a3"
elif self.name == 'left':
self.pos, self.euler, self.phi = drinkPos[2-self.pick_list], (-90, 0, 0), 30
print "b3"
def getPlacePos(self):
lunchboxPos = [[0.5, -0.25, -0.54],
[0.5, -0.25, -0.49]]
drinkPos = [[0.5, 0.25, -0.5],
[0.42, 0.25, -0.5]]
if self.name == 'right':
self.pos, self.euler, self.phi = lunchboxPos[2-self.pick_list], (0, 90, 0), 45
print "a4"
elif self.name == 'left':
self.pos, self.euler, self.phi = drinkPos[2-self.pick_list], (0, 90, 0), -45
print "b4"
#new test
def getPlace(self):
if self.name == 'right':
self.pos, self.euler, self.phi = (0.3, -0.3006, -0.56), (5.029, 82.029, 4.036), 60
print "newA"
elif self.name == 'left':
self.pos, self.euler, self.phi = (0.3, 0.3506, -0.56), (5.029, 82.029, 4.036), -60
print "newB"
def catchobj(self):
if self.name == 'right':
self.pos, self.euler, self.phi = (0.55, -0.3006, -0.56), (5.029, 82.029, 4.036), 60
print "catchA"
elif self.name == 'left':
self.pos, self.euler, self.phi = (0.55, 0.3506, -0.56), (5.029, 82.029, 4.036), -60
print "catchB"
def backPlace(self):
if self.name == 'right':
self.pos, self.euler, self.phi = (0.3, -0.3006, -0.46), (5.029, 82.029, 4.036), 60
print "newA"
elif self.name == 'left':
self.pos, self.euler, self.phi = (0.3, 0.3506, -0.46), (5.029, 82.029, 4.036), -60
print "newB"
#end
    def proces(self):
        if self.arm.is_stop:                  # must be included in your strategy
            self.pick_list = 0                # 'finish' is a read-only property; clearing pick_list marks completion
            print "!!! Robot is stopped !!!"  # must be included in your strategy
            return                            # must be included in your strategy
if self.state == idle:
if self.finish:
return
else:
self.state = rearSafetyPos
print "self.pick_list = " + str(self.pick_list)
elif self.state == initPose:
self.state = busy
self.nextState = idle
self.arm.set_speed(30)
self.arm.jointMove(0, (0, -0.5, 0, 1, 0, -0.5, 0))
#self.arm.jointMove(0, (0, 0, 0, 0, 0, 0, 0))
self.suction.gripper_suction_deg(0)
print "1"
elif self.state == frontSafetyPos:
self.state = busy
self.nextState = move2Shelf
self.getFrontSafetyPos()
self.arm.set_speed(30)
self.arm.ikMove('line', self.pos, self.euler, self.phi)
self.suction.gripper_suction_deg(-20)
print "2"
elif self.state == rearSafetyPos:
self.state = busy
self.nextState = move2Bin
#self.nextState = move2point
self.getRearSafetyPos()
self.arm.ikMove('line', self.pos, self.euler, self.phi)
print "3"
#new test
elif self.state == move2point:
self.state = busy
self.nextState = catchobj
self.getPlace()
self.arm.ikMove('line', self.pos, self.euler, self.phi)
print "1111111"
elif self.state == catchobj:
self.state = busy
self.nextState = back2point
self.catchobj()
self.arm.ikMove('line', self.pos, self.euler, self.phi)
print "2222222"
elif self.state == back2point:
self.state = busy
self.nextState = initPose
self.backPlace()
self.arm.ikMove('line', self.pos, self.euler, self.phi)
print "3333333"
#end
elif self.state == move2Bin:
self.state = busy
self.nextState = move2Object
print("pos[2] type = ",type(self.pos))
self.getObjectPos()
self.pos[2] += 0.2
print("pos[2] type = ",type(self.pos))
self.arm.ikMove('line', self.pos, self.euler, self.phi)
print "4"
elif self.state == move2Shelf:
self.state = busy
self.nextState = moveIn2Shelf
#print("pos[2] type = ",type(self.pos))
self.getPlacePos()
self.pos[0] += -0.3
self.pos[2] += 0.1
self.arm.ikMove('line', self.pos, self.euler, self.phi)
self.suction.gripper_suction_deg(-90)
print "5"
elif self.state == moveIn2Shelf:
self.state = busy
self.nextState = move2PlacedPos
self.getPlacePos()
#print("pos[2] type = ",type(self.pos[2]))
self.pos[2] += 0.1
self.arm.ikMove('line', self.pos, self.euler, self.phi)
print "6"
elif self.state == leaveBin:
self.state = busy
self.nextState = frontSafetyPos
self.arm.set_speed(20)
self.arm.relative_move_pose('line', [0, 0, 0.2])
print "7"
elif self.state == leaveShelf:
self.state = busy
self.nextState = idle
self.arm.relative_move_pose('line', [-0.3, 0, 0.1])
self.pick_list -= 1
self.suction.gripper_suction_deg(0)
print "8"
elif self.state == move2Object:
self.state = busy
self.nextState = pickObject
self.arm.relative_move_pose('line', [0, 0, -0.2])
print "9"
elif self.state == move2PlacedPos:
self.state = busy
self.nextState = placeObject
self.arm.relative_move_pose('line', [0, 0, -0.1])
print "10"
elif self.state == pickObject:
self.state = busy
self.nextState = leaveBin
#self.suction.gripper_vaccum_on()
print "11"
elif self.state == placeObject:
self.state = busy
self.nextState = leaveShelf
#self.suction.gripper_vaccum_off()
print "12"
elif self.state == busy:
if self.arm.is_busy:
return
else:
self.state = self.nextState
if __name__ == '__main__':
#print "2222222"
rospy.init_node('example') #enable this node
right = exampleTask('right') #Set up right arm controller
left = exampleTask('left') #Set up left arm controller
rospy.sleep(0.3)
is_start = False
is_stop = False
print is_start
start_sub()
next_pub(0)
print 'aaaa'
rate = rospy.Rate(30) # 30hz
# R_Pos = [0.3, -0.3006, -0.46]
# R_Euler = [5.029, 82.029, 4.036]
# R_Redun = 60
# L_Pos = [0.3, 0.3506, -0.46]
# L_Euler = [5.029, 82.029, 4.036]
# L_Redun = -60
# right.arm.ikMove('line', R_Pos, R_Euler, R_Redun)
# left.arm.ikMove('line', L_Pos, L_Euler, L_Redun)
while not rospy.is_shutdown() and not is_stop:
global is_start
if is_start:
while not rospy.is_shutdown() and (not right.finish or not left.finish):
left.proces()
right.proces()
rate.sleep()
is_start = False
is_stop = True
next_pub(3)
rospy.sleep(3)
rate.sleep()
# rospy.spin()
# Source: utils/configs.py from GENZxM/PyLyricsBot (MIT license)
import os
import time
class Var(object):
# Get a bot token from botfather
BOT_TOKEN = os.environ.get("BOT_TOKEN", "")
# Get from my.telegram.org
API_ID = int(os.environ.get("API_ID", 12345))
# Get from my.telegram.org
API_HASH = os.environ.get("API_HASH", "")
# ID of users that can't use the bot commands
BANNED_USERS = set(
int(x) for x in os.environ.get(
"BANNED_USERS", "").split())
# To record start time of bot
BOT_START_TIME = time.time()
# Genius Api From Here : https://genius.com/api-clients
API = os.environ.get("GENIUS_API", None)
# buttons
PAGENUM = int(os.environ.get("PAGENUM", 20))
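The `BANNED_USERS` line above turns a space-separated environment string into a set of integer user IDs. A standalone sketch of the same idiom, using a hypothetical variable name and values so it runs without the bot's real environment:

```python
import os

# Hypothetical value; in the bot this comes from the real environment.
os.environ["BANNED_USERS_DEMO"] = "111 222 333"

banned = set(int(x) for x in os.environ.get("BANNED_USERS_DEMO", "").split())
print(banned == {111, 222, 333})  # True; an unset variable yields an empty set
```

Because `str.split()` with no arguments drops leading, trailing, and repeated whitespace, the parse is tolerant of sloppy values like `" 111  222 "`.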
class Tr(object):
START_TEXT = """
👋 Hi ! {} Welcome To @PyLyricsXflickbot !
PyLyricsXflickbot Is An [Open-Source](https://github.com/GENZxM/PyLyricsBot/fork) Bot That Can Help You Get Song Lyrics
"""
ABOUT_TEXT = """🤖 **My Name:** [Py Lyrics by XFlick](t.me/PyLyricsXflickbot)
📝 **Language:** [Python 3](https://www.python.org)
📚 **Framework:** [Pyrogram](https://github.com/pyrogram/pyrogram)
📡 **Hosted On:** [Heroku](heroku.com)
👨💻 **Developer:** [XFlickbot](t.me/XFlick)
💡 **Source Code:** [Github](https://github.com/GENZxM/PyLyricsBot/fork)
👥 **Support Group:** [XFlick Support](https://t.me/+hpCwlBcPJtI1ZDU9)
📢 **Updates Channel:** [AnimeXflickz](https://t.me/animeXflickz)
❤ [Donate](https://www.paypal.me/AmineSoukara) (PayPal)
"""
HELP_TEXT = """💡 Just Send Me The Name Of The Song. That's it
❤ [Donate](https://www.paypal.me/AmineSoukara) (PayPal)
"""
ERR_TEXT = "⚠️ Genius API Not Found"
    ERRTOKEN_TEXT = "😶 The Access Token Provided Is Expired, Revoked, Malformed Or Invalid For Other Reasons."  # trailing comma removed: it made this a 1-tuple instead of a string
NORES = "💬 No Results"
SEARCHING = "🔍 Searching For :"
WAIT = "💬 Please Wait !!"
ARTIST = "🗣 Artist :"
SONG = "🎵 Song :"
# Source: Cheetah/Tools/RecursiveNull.py (Cheetah 2.2.2, CC-BY-3.0)
"""
Nothing, but in a friendly way. Good for filling in for objects you want to
hide. If $form.f1 is a RecursiveNull object, then
$form.f1.anything["you"].might("use") will resolve to the empty string.
This module was contributed by Ian Bicking.
"""
class RecursiveNull(object):
def __getattr__(self, attr):
return self
def __getitem__(self, item):
return self
def __call__(self, *args, **kwargs):
return self
def __str__(self):
return ''
def __repr__(self):
return ''
def __nonzero__(self):
return 0
def __eq__(self, x):
if x:
return False
return True
def __ne__(self, x):
return x and True or False
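The class above is Python 2 (`__nonzero__`). A minimal Python 3 port of the same pattern, shown only to demonstrate the chaining behavior the module docstring describes (the name `RecursiveNull3` is local to this sketch):

```python
class RecursiveNull3(object):
    """Python 3 sketch of the same pattern (__bool__ replaces __nonzero__)."""
    def __getattr__(self, attr):
        return self
    def __getitem__(self, item):
        return self
    def __call__(self, *args, **kwargs):
        return self
    def __str__(self):
        return ''
    def __repr__(self):
        return ''
    def __bool__(self):
        return False
    def __eq__(self, x):
        return not x
    def __ne__(self, x):
        return bool(x)

null = RecursiveNull3()
# Arbitrary chains of attribute access, indexing, and calls resolve to ''.
print(str(null.anything["you"].might("use")))  # prints an empty line
```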
# Source: Locker_Pro_V4/Library/Constand_Hs.py from huynhquoc1990/Locker_Pro_V4 (MIT license)
class Constand_Hs:
    HEADERSIZE = 1024
    Dangkyvantay = 'FDK'                  # "dang ky van tay": register a fingerprint
    Dangkythetu = 'CDK'                   # "dang ky the tu": register a magnetic card
    Van_tay_mo_tu_F = 'Fopen\n'           # fingerprint command: open locker
    Van_tay_dong_tu_F = 'Fclose\n'        # fingerprint command: close locker
    Van_Tay_Su_Dung_Tu = 'Fused\n'        # locker in use (fingerprint)
    The_Tu_Su_Dung_Tu = 'Cused\n'         # locker in use (card)
    Van_tay_mo_tu_C = 'Copen'             # card command: open locker
    Van_tay_dong_tu_C = 'Cclose'          # card command: close locker
    Server = ('192.168.1.128', 3003)
    lstouputtemp = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
    lstinputtemp = [7, 6, 5, 4, 3, 2, 1, 0, 11, 10, 9, 8, 15, 14, 13, 12]
    dooropen = 'Dooropen'
    Dongtu = False                        # "dong tu": locker closed
    Motu = True                           # "mo tu": locker open

    def __init__(self):
        pass
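The two lookup tables in `Constand_Hs` appear to remap between output and input channel numbering (the wiring order is reversed within channel banks of 8, 4, and 4). A quick check that the input table is its own inverse, restated at module level so it runs on its own:

```python
lstouputtemp = list(range(16))
lstinputtemp = [7, 6, 5, 4, 3, 2, 1, 0, 11, 10, 9, 8, 15, 14, 13, 12]

# Applying the input remap twice returns the original channel index,
# i.e. the table is an involution (a reversal within each channel bank).
print(all(lstinputtemp[lstinputtemp[i]] == i for i in lstouputtemp))  # True
```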
# Source: library/books/models.py from Truta446/django-library (MIT license)
from django.db import models
from authors.models import Author
class Book(models.Model):
author = models.ForeignKey(Author, on_delete=models.CASCADE)
title = models.CharField(max_length=200)
description = models.TextField()
pages = models.IntegerField()
editor = models.CharField(max_length=50)
edition = models.IntegerField()
publication_year = models.IntegerField()
language = models.CharField(max_length=20)
isbn = models.CharField(max_length=20)
cover = models.ImageField(upload_to='covers/%d/%m/%Y', blank=True)
borrowed = models.BooleanField(default=False)
def __str__(self) -> str:
return self.title
# Source: examples/oauth_sqlite3_app_org_level.py from hirosassa/bolt-python (MIT license)
import logging
logging.basicConfig(level=logging.DEBUG)
from slack_bolt import App, BoltContext
from slack_bolt.oauth import OAuthFlow
from slack_sdk import WebClient
app = App(oauth_flow=OAuthFlow.sqlite3(database="./slackapp.db"))
@app.use
def dump(context, next, logger):
logger.info(context)
next()
@app.use
def call_apis_with_team_id(context: BoltContext, client: WebClient, next):
# client.users_list()
client.bots_info(bot=context.bot_id)
next()
@app.event("app_mention")
def handle_app_mentions(body, say, logger):
logger.info(body)
say("What's up?")
@app.command("/org-level-command")
def command(ack):
ack("I got it!")
@app.shortcut("org-level-shortcut")
def shortcut(ack):
ack()
@app.event("team_access_granted")
def team_access_granted(event):
pass
@app.event("team_access_revoked")
def team_access_revoked(event):
pass
if __name__ == "__main__":
app.start(3000)
# pip install slack_bolt
# export SLACK_SIGNING_SECRET=***
# export SLACK_BOT_TOKEN=xoxb-***
# export SLACK_CLIENT_ID=111.111
# export SLACK_CLIENT_SECRET=***
# export SLACK_SCOPES=app_mentions:read,channels:history,im:history,chat:write
# python oauth_app.py
#!/usr/bin/env python
# Source: reports/gen_usage_report.py from vmware-archive/vcloud-availability-examples (Apache-2.0)
"""
This script generates a usage report from the vCloud Director API.
Copyright (c) 2016 VMware, Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not
use this file except in compliance with the License. You may obtain a copy of
the License at http://www.apache.org/licenses/LICENSE-2.0
Some files may be comprised of various open source software components, each of which
has its own license that is located in the source code of the respective component."
"""
import argparse
import csv
import json
import requests
import xml.etree.ElementTree as ET
from pprint import pprint
from requests.packages.urllib3.exceptions import InsecureRequestWarning
#Hide warning when we are skipping the SSL verification check.
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
HEADERS = {'Accept' : 'application/*+xml;version=6.0;vr-version=3.0'}
VERIFY_CERT = True
#Login
def vcd_login(username, password, url):
"""
Perform login to vCloud Director, saving the vcloud auth token to header.
"""
response = requests.post(url + "sessions", headers=HEADERS,
auth=(username, password), verify=VERIFY_CERT)
if response.status_code == requests.codes.OK:
HEADERS["x-vcloud-authorization"] = response.headers.get('x-vcloud-authorization')
else:
print "ERROR: There was an error logging into vCloud Director."
print "STATUS: %s \n\n %s" % (response.status_code, response.content)
exit()
def vcd_get(url):
"""
Generic call that will be used for all HTTP GET calls.
"""
response = requests.get(url, headers=HEADERS, verify=VERIFY_CERT)
if response.status_code == requests.codes.OK:
try:
return ET.fromstring(response.content)
except Exception as err:
print "ERROR - Unable to convert response from %s to XML" % url
print err
exit()
else:
print "ERROR - GET call to %s returned:" % url
print "STATUS: %s" % response.status_code
return None
def get_orgs(url):
"""
Return a dict of organizations in vCloud Director.
"""
orgs = {}
org_list = vcd_get(url + "org")
for org in org_list:
orgs[org.attrib['name']] = org.attrib['href']
return orgs
def fix_ns(tag):
"""
A quick hack to clean out the namespace in XML.
"""
return tag.split('}')[1]
def process_children(element):
"""
    Recursively walk the XML tree and return the results in a dict.
"""
if list(element):
ret_data = {}
for child in list(element):
ret_data[fix_ns(child.tag)] = process_children(child)
return ret_data
else:
return element.text
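Together, `fix_ns` and `process_children` flatten a namespaced XML tree into nested dicts (leaf elements become their text). A self-contained check of that behavior on a tiny document; the functions are restated here with `_demo` suffixes so the snippet runs on its own:

```python
import xml.etree.ElementTree as ET

def fix_ns_demo(tag):
    # Strip the '{namespace}' prefix ElementTree puts on tag names.
    return tag.split('}')[1]

def process_children_demo(element):
    if list(element):
        return {fix_ns_demo(c.tag): process_children_demo(c) for c in list(element)}
    return element.text

xml = '<a xmlns="urn:x"><b>1</b><c><d>2</d></c></a>'
root = ET.fromstring(xml)
print(process_children_demo(root))  # {'b': '1', 'c': {'d': '2'}}
```

Note that, like the original, this keeps only the last child when sibling tags repeat, which is fine here because the replication detail elements are unique per VM.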
def process_detail(href):
"""
Return dict of replication details.
Computes TotalVMReplicationSize as a helper.
"""
repl_detail = {'Disks':[], 'TotalVMReplicationSize': 0}
replication_detail = vcd_get(href)
if replication_detail is not None:
for repd in replication_detail:
replicated_data_size = 0
if repd.tag.endswith('Vm'):
for vmd in list(repd):
child_tag = fix_ns(vmd.tag)
if child_tag == 'Disk':
disk_detail = process_children(vmd)
replicated_data_size += int(disk_detail['SpaceRequirement'])
repl_detail['Disks'].append(disk_detail)
else:
repl_detail[child_tag] = process_children(vmd)
repl_detail['TotalVMReplicationSize'] = replicated_data_size
return repl_detail
def get_paged_data(url, api):
"""
Return replication_group data that is paged
"""
ret_data = []
get_data = True
page_num = 1
while get_data:
replication_references = vcd_get("%s/%s?page=%s" % (url, api, page_num))
total_vms = int(replication_references.attrib['total'])
page_size = int(replication_references.attrib['pageSize'])
for reference in replication_references:
if reference.tag.endswith('Reference'):
repl_data = {'Detail': {}}
replication_group = vcd_get(reference.attrib['href'])
repl_data['Name'] = replication_group.attrib['name']
repl_data['Id'] = replication_group.attrib['id']
for el_rg in replication_group:
if el_rg.tag.endswith('Link') and el_rg.attrib['rel'] == 'down:details':
repl_data['Detail'] = process_detail(el_rg.attrib['href'])
elif not el_rg.tag.endswith('Link'):
repl_data[fix_ns(el_rg.tag)] = process_children(el_rg)
ret_data.append(repl_data)
if (page_size * page_num) < total_vms:
page_num += 1
else:
get_data = False
return ret_data
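The loop's termination test `(page_size * page_num) < total_vms` means the function fetches `ceil(total / page_size)` pages (and always at least page 1). A small sketch of just that paging arithmetic, with a hypothetical helper name:

```python
def pages_fetched(total_vms, page_size):
    """Count iterations of the paging condition used above (hypothetical helper)."""
    page_num = 1
    while (page_size * page_num) < total_vms:
        page_num += 1
    return page_num

# 25 records at 10 per page -> 3 pages; an exact multiple needs no extra page.
print(pages_fetched(25, 10), pages_fetched(20, 10))  # 3 2
```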
def get_replications(orgs):
"""
Return a dict of all the replicated VMs.
"""
repl_info = {}
for key, url in orgs.iteritems():
if key != "System":
# Creating Dict to allow for expansion of Organization data.
repl_info[key] = {'ReplicatedVMs':[], 'TotalOrganizationReplicationSize': 0}
# Process incoming
incoming_repl_data = get_paged_data(url, "replications")
for ird in incoming_repl_data:
repl_info[key]['TotalOrganizationReplicationSize'] += ird['Detail'].get('TotalVMReplicationSize', 0)
repl_info[key]['ReplicatedVMs'].extend(incoming_repl_data)
# Process outgoing
outgoing_repl_data = get_paged_data(url, "failbackreplications")
repl_info[key]['ReplicatedVMs'].extend(outgoing_repl_data)
return repl_info
def main():
"""
Main function executed when script is run.
"""
parser = argparse.ArgumentParser(description='Gather usage data from the vCD-SP API.')
parser.add_argument('username', help='vCloud Director System user.')
parser.add_argument('password', help='Provide the password for the System user.')
parser.add_argument('vcdaddress',
help='The URL of the vCloud Director Instance. Example: vcd.vmware.com')
parser.add_argument('--skip-ssl-check', help='Skip SSL verification.', action='store_false')
    parser.add_argument('-o', '--output', help='Write data to file. Default is JSON format.')
parser.add_argument('--csv', help='Write output in csv instead of JSON', action='store_true')
parser.add_argument('--detail', help='Output a detailed view', action='store_true')
args = parser.parse_args()
vcd_url = "https://%s/api/" % args.vcdaddress
#Set as global so we don't have to pass around.
global VERIFY_CERT
VERIFY_CERT = args.skip_ssl_check
vcd_login("%s@System" % args.username, args.password, vcd_url)
vcloud_orgs = get_orgs(vcd_url)
vcloud_repl_information = get_replications(vcloud_orgs)
# If we only want the summary, dump the rest of the data.
if not args.detail:
vcloud_repl_information = [(key,
len(repl_data['ReplicatedVMs']),
repl_data['TotalOrganizationReplicationSize'])
for key, repl_data in vcloud_repl_information.iteritems()]
if args.output:
print "Output saved to: %s" % args.output
with open(args.output, 'w') as outfile:
if args.csv:
writer = csv.writer(outfile, delimiter=',', quoting=csv.QUOTE_NONNUMERIC)
if not args.detail:
writer.writerow(['Organization', 'TotalReplicatedVMs',
'TotalOrganizationReplicationSize'])
writer.writerows(vcloud_repl_information)
else:
writer.writerow(['Organization', 'Name', 'Id', 'RecoveryState',
'TestRecoveryState', 'CurrentRpoViolation', 'Paused',
'ReplicationState', 'Rpo', 'TotalVMReplicationSize'])
for key, repl_data in vcloud_repl_information.iteritems():
writer.writerows([(key, rd['Name'], rd['Id'],
rd['RecoveryState'], rd['TestRecoveryState'],
rd['CurrentRpoViolation'], rd['Paused'],
rd['ReplicationState'], rd['Rpo'],
rd['Detail']['TotalVMReplicationSize'])
for rd in repl_data['ReplicatedVMs']])
else:
# Write output in JSON to file.
json.dump(vcloud_repl_information, outfile)
else:
# Dump data to screen.
pprint(vcloud_repl_information)
if __name__ == '__main__':
main()
# Source: examples/pylab_examples/tripcolor_demo.py from matplotlib (f0k/matplotlib fork)
"""
Pseudocolor plots of unstructured triangular grids.
"""
import matplotlib.pyplot as plt
import matplotlib.tri as tri
import numpy as np
import math
# Creating a Triangulation without specifying the triangles results in the
# Delaunay triangulation of the points.
# First create the x and y coordinates of the points.
n_angles = 36
n_radii = 8
min_radius = 0.25
radii = np.linspace(min_radius, 0.95, n_radii)
angles = np.linspace(0, 2*math.pi, n_angles, endpoint=False)
angles = np.repeat(angles[...,np.newaxis], n_radii, axis=1)
angles[:,1::2] += math.pi/n_angles
x = (radii*np.cos(angles)).flatten()
y = (radii*np.sin(angles)).flatten()
z = (np.cos(radii)*np.cos(angles*3.0)).flatten()
# Create the Triangulation; no triangles so Delaunay triangulation created.
triang = tri.Triangulation(x, y)
# Mask off unwanted triangles.
xmid = x[triang.triangles].mean(axis=1)
ymid = y[triang.triangles].mean(axis=1)
mask = np.where(xmid*xmid + ymid*ymid < min_radius*min_radius, 1, 0)
triang.set_mask(mask)
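The mask above flags every triangle whose centroid falls inside the inner radius, so the hole in the middle of the annulus is not drawn. The same midpoint test in miniature, on two hand-picked centroids:

```python
import numpy as np

min_radius = 0.25
# Two triangle midpoints: one inside the central hole, one outside it.
xmid = np.array([0.1, 0.6])
ymid = np.array([0.1, 0.0])
mask = np.where(xmid * xmid + ymid * ymid < min_radius * min_radius, 1, 0)
print(mask)  # [1 0]
```

Comparing squared distances avoids a `sqrt` per triangle, which is why the demo writes the test this way.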
# pcolor plot.
plt.figure()
plt.gca().set_aspect('equal')
plt.tripcolor(triang, z, shading='flat')
plt.colorbar()
plt.title('tripcolor of Delaunay triangulation: flat')
# Illustrate Gouraud shading.
plt.figure()
plt.gca().set_aspect('equal')
plt.tripcolor(triang, z, shading='gouraud')
plt.colorbar()
plt.title('tripcolor with Gouraud shading')
# You can specify your own triangulation rather than perform a Delaunay
# triangulation of the points, where each triangle is given by the indices of
# the three points that make up the triangle, ordered in either a clockwise or
# anticlockwise manner.
xy = np.asarray([
[-0.101,0.872],[-0.080,0.883],[-0.069,0.888],[-0.054,0.890],[-0.045,0.897],
[-0.057,0.895],[-0.073,0.900],[-0.087,0.898],[-0.090,0.904],[-0.069,0.907],
[-0.069,0.921],[-0.080,0.919],[-0.073,0.928],[-0.052,0.930],[-0.048,0.942],
[-0.062,0.949],[-0.054,0.958],[-0.069,0.954],[-0.087,0.952],[-0.087,0.959],
[-0.080,0.966],[-0.085,0.973],[-0.087,0.965],[-0.097,0.965],[-0.097,0.975],
[-0.092,0.984],[-0.101,0.980],[-0.108,0.980],[-0.104,0.987],[-0.102,0.993],
[-0.115,1.001],[-0.099,0.996],[-0.101,1.007],[-0.090,1.010],[-0.087,1.021],
[-0.069,1.021],[-0.052,1.022],[-0.052,1.017],[-0.069,1.010],[-0.064,1.005],
[-0.048,1.005],[-0.031,1.005],[-0.031,0.996],[-0.040,0.987],[-0.045,0.980],
[-0.052,0.975],[-0.040,0.973],[-0.026,0.968],[-0.020,0.954],[-0.006,0.947],
[ 0.003,0.935],[ 0.006,0.926],[ 0.005,0.921],[ 0.022,0.923],[ 0.033,0.912],
[ 0.029,0.905],[ 0.017,0.900],[ 0.012,0.895],[ 0.027,0.893],[ 0.019,0.886],
[ 0.001,0.883],[-0.012,0.884],[-0.029,0.883],[-0.038,0.879],[-0.057,0.881],
[-0.062,0.876],[-0.078,0.876],[-0.087,0.872],[-0.030,0.907],[-0.007,0.905],
[-0.057,0.916],[-0.025,0.933],[-0.077,0.990],[-0.059,0.993] ])
x = xy[:,0]*180/3.14159
y = xy[:,1]*180/3.14159
x0 = -5
y0 = 52
z = np.exp(-0.01*( (x-x0)*(x-x0) + (y-y0)*(y-y0) ))
triangles = np.asarray([
[67,66, 1],[65, 2,66],[ 1,66, 2],[64, 2,65],[63, 3,64],[60,59,57],
[ 2,64, 3],[ 3,63, 4],[ 0,67, 1],[62, 4,63],[57,59,56],[59,58,56],
[61,60,69],[57,69,60],[ 4,62,68],[ 6, 5, 9],[61,68,62],[69,68,61],
[ 9, 5,70],[ 6, 8, 7],[ 4,70, 5],[ 8, 6, 9],[56,69,57],[69,56,52],
[70,10, 9],[54,53,55],[56,55,53],[68,70, 4],[52,56,53],[11,10,12],
[69,71,68],[68,13,70],[10,70,13],[51,50,52],[13,68,71],[52,71,69],
[12,10,13],[71,52,50],[71,14,13],[50,49,71],[49,48,71],[14,16,15],
[14,71,48],[17,19,18],[17,20,19],[48,16,14],[48,47,16],[47,46,16],
[16,46,45],[23,22,24],[21,24,22],[17,16,45],[20,17,45],[21,25,24],
[27,26,28],[20,72,21],[25,21,72],[45,72,20],[25,28,26],[44,73,45],
[72,45,73],[28,25,29],[29,25,31],[43,73,44],[73,43,40],[72,73,39],
[72,31,25],[42,40,43],[31,30,29],[39,73,40],[42,41,40],[72,33,31],
[32,31,33],[39,38,72],[33,72,38],[33,38,34],[37,35,38],[34,38,35],
[35,37,36] ])
# Rather than create a Triangulation object, can simply pass x, y and triangles
# arrays to tripcolor directly. It would be better to use a Triangulation object
# if the same triangulation was to be used more than once to save duplicated
# calculations.
plt.figure()
plt.gca().set_aspect('equal')
plt.tripcolor(x, y, triangles, z, shading='flat', edgecolors='k')
plt.colorbar()
plt.title('tripcolor of user-specified triangulation')
plt.xlabel('Longitude (degrees)')
plt.ylabel('Latitude (degrees)')
plt.show()
# Source: src/test_it.py from Nova-Striker/Open-Palm (MIT license)
# It is highly recommended that you check your code in IDLE for syntax errors first, and then test it here.
#If your code has syntax errors ,Open-Palm will freeze. You have to restart it in that case
def check_even(n):
    """Return True if n is even."""
    return n % 2 == 0
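The warning above, that syntax errors freeze Open-Palm, can be guarded against programmatically: Python's built-in `compile()` parses code and reports syntax errors without executing anything. A minimal sketch (the snippet in `source` is just an example):

```python
# Parse (but do not run) a user snippet to catch syntax errors up front.
source = "def check_even(n):\n    return n % 2 == 0\n"
try:
    compile(source, "<user code>", "exec")  # parse only; nothing is executed
    print("no syntax errors")
except SyntaxError as exc:
    print("syntax error:", exc)
```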
# ---- pysnmp/ALTIGA-MULTILINK-STATS-MIB.py (repo: agustinhenze/mibs.snmplabs.com, license: Apache-2.0) ----
#
# PySNMP MIB module ALTIGA-MULTILINK-STATS-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/ALTIGA-MULTILINK-STATS-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 17:05:49 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
# Note: `mibBuilder` is not imported here; pysnmp's MibBuilder injects it into
# this generated module's namespace when the module is loaded.
alMultiLinkMibModule, = mibBuilder.importSymbols("ALTIGA-GLOBAL-REG", "alMultiLinkMibModule")
alMultiLinkGroup, alStatsMultiLink = mibBuilder.importSymbols("ALTIGA-MIB", "alMultiLinkGroup", "alStatsMultiLink")
ObjectIdentifier, Integer, OctetString = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "Integer", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, SingleValueConstraint, ValueRangeConstraint, ConstraintsUnion, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "SingleValueConstraint", "ValueRangeConstraint", "ConstraintsUnion", "ConstraintsIntersection")
ObjectGroup, NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "ObjectGroup", "NotificationGroup", "ModuleCompliance")
MibIdentifier, IpAddress, ObjectIdentity, Counter32, TimeTicks, Bits, Unsigned32, MibScalar, MibTable, MibTableRow, MibTableColumn, Integer32, ModuleIdentity, Gauge32, NotificationType, iso, Counter64 = mibBuilder.importSymbols("SNMPv2-SMI", "MibIdentifier", "IpAddress", "ObjectIdentity", "Counter32", "TimeTicks", "Bits", "Unsigned32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Integer32", "ModuleIdentity", "Gauge32", "NotificationType", "iso", "Counter64")
DisplayString, TextualConvention, RowStatus = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention", "RowStatus")
altigaMultiLinkStatsMibModule = ModuleIdentity((1, 3, 6, 1, 4, 1, 3076, 1, 1, 39, 2))
altigaMultiLinkStatsMibModule.setRevisions(('2002-09-05 13:00', '2002-07-10 00:00',))
if mibBuilder.loadTexts: altigaMultiLinkStatsMibModule.setLastUpdated('200209051300Z')
if mibBuilder.loadTexts: altigaMultiLinkStatsMibModule.setOrganization('Cisco Systems, Inc.')
alStatsMultiLinkGlobal = MibIdentifier((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 1))
alMultiLinkStatsTable = MibTable((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2), )
if mibBuilder.loadTexts: alMultiLinkStatsTable.setStatus('current')
alMultiLinkStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1), ).setIndexNames((0, "ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsIndex"))
if mibBuilder.loadTexts: alMultiLinkStatsEntry.setStatus('current')
alMultiLinkStatsRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 1), RowStatus()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: alMultiLinkStatsRowStatus.setStatus('current')
alMultiLinkStatsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsIndex.setStatus('current')
alMultiLinkStatsTxOctets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 3), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsTxOctets.setStatus('current')
alMultiLinkStatsTxPackets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsTxPackets.setStatus('current')
alMultiLinkStatsTxMlpFragments = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 5), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsTxMlpFragments.setStatus('current')
alMultiLinkStatsTxMlpPackets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 6), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsTxMlpPackets.setStatus('current')
alMultiLinkStatsTxNonMlpPackets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 7), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsTxNonMlpPackets.setStatus('current')
alMultiLinkStatsTxThroughput = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 8), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsTxThroughput.setStatus('current')
alMultiLinkStatsRxOctets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 9), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxOctets.setStatus('current')
alMultiLinkStatsRxPackets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 10), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxPackets.setStatus('current')
alMultiLinkStatsRxMlpFragments = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 11), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxMlpFragments.setStatus('current')
alMultiLinkStatsRxMlpPackets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 12), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxMlpPackets.setStatus('current')
alMultiLinkStatsRxNonMlpPackets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 13), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxNonMlpPackets.setStatus('current')
alMultiLinkStatsRxThroughput = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 14), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxThroughput.setStatus('current')
alMultiLinkStatsRxLostEnd = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 15), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxLostEnd.setStatus('current')
alMultiLinkStatsRxStalePackets = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 16), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxStalePackets.setStatus('current')
alMultiLinkStatsRxStaleFragments = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 17), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxStaleFragments.setStatus('current')
alMultiLinkStatsRxDroppedFragments = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 18), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxDroppedFragments.setStatus('current')
alMultiLinkStatsRxOOSFragments = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 19), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsRxOOSFragments.setStatus('current')
alMultiLinkStatsIdleTmrCleanup = MibTableColumn((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 20), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alMultiLinkStatsIdleTmrCleanup.setStatus('current')
altigaMultiLinkStatsMibConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 3076, 1, 1, 39, 2, 1))
altigaMultiLinkStatsMibCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 3076, 1, 1, 39, 2, 1, 1))
altigaMultiLinkStatsMibCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 3076, 1, 1, 39, 2, 1, 1, 1)).setObjects(("ALTIGA-MULTILINK-STATS-MIB", "altigaMultiLinkStatsGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
altigaMultiLinkStatsMibCompliance = altigaMultiLinkStatsMibCompliance.setStatus('current')
altigaMultiLinkStatsGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 3076, 2, 1, 1, 1, 34, 2)).setObjects(("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRowStatus"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsIndex"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsTxOctets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsTxPackets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsTxMlpFragments"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsTxMlpPackets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsTxNonMlpPackets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsTxThroughput"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxOctets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxPackets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxMlpFragments"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxMlpPackets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxNonMlpPackets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxThroughput"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxLostEnd"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxStalePackets"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxStaleFragments"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxDroppedFragments"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsRxOOSFragments"), ("ALTIGA-MULTILINK-STATS-MIB", "alMultiLinkStatsIdleTmrCleanup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
altigaMultiLinkStatsGroup = altigaMultiLinkStatsGroup.setStatus('current')
mibBuilder.exportSymbols("ALTIGA-MULTILINK-STATS-MIB", altigaMultiLinkStatsMibConformance=altigaMultiLinkStatsMibConformance, altigaMultiLinkStatsMibModule=altigaMultiLinkStatsMibModule, alMultiLinkStatsRxOctets=alMultiLinkStatsRxOctets, alMultiLinkStatsTxMlpFragments=alMultiLinkStatsTxMlpFragments, alMultiLinkStatsRowStatus=alMultiLinkStatsRowStatus, alMultiLinkStatsTxPackets=alMultiLinkStatsTxPackets, alMultiLinkStatsRxThroughput=alMultiLinkStatsRxThroughput, alMultiLinkStatsRxStalePackets=alMultiLinkStatsRxStalePackets, alMultiLinkStatsTxThroughput=alMultiLinkStatsTxThroughput, alMultiLinkStatsEntry=alMultiLinkStatsEntry, alMultiLinkStatsIndex=alMultiLinkStatsIndex, alMultiLinkStatsRxNonMlpPackets=alMultiLinkStatsRxNonMlpPackets, alMultiLinkStatsRxMlpFragments=alMultiLinkStatsRxMlpFragments, alMultiLinkStatsTxOctets=alMultiLinkStatsTxOctets, alMultiLinkStatsRxLostEnd=alMultiLinkStatsRxLostEnd, alMultiLinkStatsRxOOSFragments=alMultiLinkStatsRxOOSFragments, PYSNMP_MODULE_ID=altigaMultiLinkStatsMibModule, altigaMultiLinkStatsMibCompliance=altigaMultiLinkStatsMibCompliance, altigaMultiLinkStatsGroup=altigaMultiLinkStatsGroup, alMultiLinkStatsTxMlpPackets=alMultiLinkStatsTxMlpPackets, alMultiLinkStatsTxNonMlpPackets=alMultiLinkStatsTxNonMlpPackets, altigaMultiLinkStatsMibCompliances=altigaMultiLinkStatsMibCompliances, alStatsMultiLinkGlobal=alStatsMultiLinkGlobal, alMultiLinkStatsRxDroppedFragments=alMultiLinkStatsRxDroppedFragments, alMultiLinkStatsRxMlpPackets=alMultiLinkStatsRxMlpPackets, alMultiLinkStatsTable=alMultiLinkStatsTable, alMultiLinkStatsRxPackets=alMultiLinkStatsRxPackets, alMultiLinkStatsRxStaleFragments=alMultiLinkStatsRxStaleFragments, alMultiLinkStatsIdleTmrCleanup=alMultiLinkStatsIdleTmrCleanup)
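Every scalar and column above is registered under a numeric OID tuple. For logging or debugging it is handy to render those tuples in dotted-decimal form; a small stdlib helper (`oid_str` is illustrative, not part of pysnmp):

```python
def oid_str(oid):
    # Render an OID tuple, e.g. (1, 3, 6), as the dotted string '1.3.6'.
    return ".".join(str(n) for n in oid)

# OID of alMultiLinkStatsTxOctets as defined above
print(oid_str((1, 3, 6, 1, 4, 1, 3076, 2, 1, 2, 34, 2, 1, 3)))
# -> 1.3.6.1.4.1.3076.2.1.2.34.2.1.3
```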
# ---- opengl/demo/demo.py (repo: acsstudios/space, license: MIT) ----
"""
Python version: 3.8.10
"""
import pygame as pg
from pygame.locals import *
from win32api import GetSystemMetrics  # Windows-only (requires pywin32)
import os
import math
from OpenGL.GL import *
from OpenGL.GLU import *
pg.init()
# Variables
x, y = [int(GetSystemMetrics(i) / 2) for i in range(2)]  # half the screen width and height
windowSize = (x, y)
pg.display.set_mode(windowSize, DOUBLEBUF | OPENGL)
# ---- language-modeling/lmtool-fwms/src/mem_transformer.py (repo: minhtannguyen/transformer-mgk, license: CC0-1.0) ----
import torch
import torch.nn as nn
import torch.nn.functional as F
import numpy as np
import pdb
# import sys
from utils.proj_adaptive_softmax import ProjectedAdaptiveLogSoftmax
from utils.log_uniform_sampler import LogUniformSampler, sample_logits
from utils.performer_helper import prime, draw_orthogonal_random_matrix
from utils.fast_weight import StepWiseLinearTransformerLayer
from utils.fast_weight import StepWiseDPFPLinearTransformerLayer
from utils.fast_weight import DebugStepWiseLinearTransformerLayer
from utils.cuda_fast_weight_layer import CudaFastWeightLinearTransformerLayer
from utils.cuda_fast_weight_layer import CudaFastWeightPerformerLayer
from utils.cuda_fast_weight_layer import CudaFastWeightSumLinearTransformerLayer
from utils.cuda_fast_weight_layer import CudaFastWeightSumPerformerLayer
from utils.cuda_fast_weight_layer import CudaNormFastWeightLinearTransformerLayer
from utils.cuda_fast_weight_layer import CudaNormFastWeightPerformerLayer
from utils.cuda_fast_weight_layer import CudaFastWeightDPFPTransformerLayer
from utils.cuda_fast_weight_layer import CudaNormFastWeightDPFPTransformerLayer
from utils.fast_fast_weight import fast_weight_memory
from utils.fast_transformers import fast_weight_sum
class PositionalEmbedding(nn.Module):
def __init__(self, demb):
super(PositionalEmbedding, self).__init__()
self.demb = demb
inv_freq = 1 / (10000 ** (torch.arange(0.0, demb, 2.0) / demb))
self.register_buffer('inv_freq', inv_freq)
def forward(self, pos_seq, bsz=None):
sinusoid_inp = torch.ger(pos_seq, self.inv_freq)
pos_emb = torch.cat([sinusoid_inp.sin(), sinusoid_inp.cos()], dim=-1)
if bsz is not None:
return pos_emb[:,None,:].expand(-1, bsz, -1)
else:
return pos_emb[:,None,:]
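The `inv_freq` buffer above encodes frequencies 1/10000^(2i/demb), and the forward pass concatenates the sines and cosines of pos times those frequencies. A pure-Python mirror (a hedged sketch for illustration; `positional_embedding` is not part of the module):

```python
import math

def positional_embedding(pos, demb):
    # inv_freq[i] = 1 / 10000**(2i/demb), matching torch.arange(0, demb, 2) / demb
    inv_freq = [1.0 / (10000 ** (i / demb)) for i in range(0, demb, 2)]
    sines = [math.sin(pos * f) for f in inv_freq]
    cosines = [math.cos(pos * f) for f in inv_freq]
    return sines + cosines  # length demb: sin half followed by cos half

emb = positional_embedding(0, 8)
print(emb[:4], emb[4:])  # sines of position 0 are all 0.0; cosines are all 1.0
```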
class PositionwiseFF(nn.Module):
def __init__(self, d_model, d_inner, dropout, pre_lnorm=False):
super(PositionwiseFF, self).__init__()
self.d_model = d_model
self.d_inner = d_inner
self.dropout = dropout
self.CoreNet = nn.Sequential(
nn.Linear(d_model, d_inner), nn.ReLU(inplace=True),
nn.Dropout(dropout),
nn.Linear(d_inner, d_model),
nn.Dropout(dropout),
)
self.layer_norm = nn.LayerNorm(d_model)
self.pre_lnorm = pre_lnorm
def forward(self, inp):
if self.pre_lnorm:
# layer normalization + positionwise feed-forward
core_out = self.CoreNet(self.layer_norm(inp))
# residual connection
output = core_out + inp
else:
# positionwise feed-forward
core_out = self.CoreNet(inp)
# residual connection + layer normalization
output = self.layer_norm(inp + core_out)
return output
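The `pre_lnorm` flag above switches between the pre-LN wiring (normalize, transform, add residual) and the post-LN wiring (transform, add residual, normalize). A toy NumPy sketch of the two orderings, assuming NumPy; `residual_block`, `layer_norm`, and the stand-in `f` are illustrative, not part of the module:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    mu = x.mean(-1, keepdims=True)
    var = x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def residual_block(x, f, pre_lnorm):
    if pre_lnorm:
        return x + f(layer_norm(x))  # pre-LN: normalize first, then residual
    return layer_norm(x + f(x))      # post-LN: residual first, then normalize

x = np.random.default_rng(0).normal(size=(3, 8))
f = lambda h: 0.1 * h                # stand-in for the CoreNet sub-layer
print(residual_block(x, f, True).shape, residual_block(x, f, False).shape)
```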
class MGKAttn(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
pre_lnorm=False, update_mode='hard'):
super(MGKAttn, self).__init__()
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.update_mode = update_mode
self.q_net = nn.Linear(d_model, n_head * d_head, bias=False)
if self.update_mode == 'hard2keys' or self.update_mode == 'soft2keys' or self.update_mode == 'rbf2keys':
self.kv_net = nn.Linear(d_model, 3 * n_head * d_head, bias=False)
else:
self.kv_net = nn.Linear(d_model, 2 * n_head * d_head, bias=False)
# for mgk
if self.update_mode == 'hard' or self.update_mode == 'soft' or self.update_mode == 'rbf':
self.mu = nn.Parameter((torch.empty(2, n_head, d_head).normal_(mean = 0.0, std = .5))*torch.tensor([0.,1.])[:, None, None], requires_grad= True)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
if self.update_mode == 'soft' or self.update_mode == 'soft2keys':
self.register_buffer("pi", 0.5 * torch.ones(self.n_head, 1, 1, 256, requires_grad= False))
# self.register_buffer("pi", 0.5 * torch.ones(self.n_head, 1, 1, 1, requires_grad= False))
if self.update_mode == 'rbf' or self.update_mode == 'rbf2keys':
# self.pi0 = nn.Parameter(torch.rand(self.n_head, 1, 1, 256), requires_grad= True)
# self.pi1 = nn.Parameter(torch.rand(self.n_head, 1, 1, 256), requires_grad= True)
self.pi0 = nn.Parameter(0.5 * torch.ones(self.n_head, 1, 1, 256), requires_grad= True)
self.pi1 = nn.Parameter(0.5 * torch.ones(self.n_head, 1, 1, 256), requires_grad= True)
def forward(self, h, attn_mask=None, mems=None,
carry_over_fast_weight=False):
assert not carry_over_fast_weight, "Not supported."
# multihead attention
# [hlen x bsz x n_head x d_head]
if mems is not None:
c = torch.cat([mems, h], 0)
else:
c = h
if self.pre_lnorm:
# layer normalization
c = self.layer_norm(c)
head_q = self.q_net(h)
if self.update_mode == 'hard2keys' or self.update_mode == 'soft2keys' or self.update_mode == 'rbf2keys':
head_k, head_k1, head_v = torch.chunk(self.kv_net(c), 3, -1)
else:
head_k, head_v = torch.chunk(self.kv_net(c), 2, -1)
head_q = head_q.view(h.size(0), h.size(1), self.n_head, self.d_head)
head_k = head_k.view(c.size(0), c.size(1), self.n_head, self.d_head)
head_v = head_v.view(c.size(0), c.size(1), self.n_head, self.d_head)
if self.update_mode == 'hard2keys' or self.update_mode == 'soft2keys' or self.update_mode == 'rbf2keys':
head_k1 = head_k1.view(c.size(0), c.size(1), self.n_head, self.d_head)
if self.update_mode == 'hard2keys' or self.update_mode == 'soft2keys' or self.update_mode == 'rbf2keys':
QK_distance0 = (-self.scale/2.0)*torch.square(torch.cdist(head_q.transpose(0,2), head_k.transpose(0,2)))
QK_distance1 = (-1.5*self.scale)*torch.square(torch.cdist(head_q.transpose(0,2), head_k1.transpose(0,2)))
# nu = 1.0
# QK_distance1 = (-nu*0.5 - 0.5) * torch.log(1 + self.scale*torch.square(torch.cdist(head_q.transpose(0,2), head_k1.transpose(0,2)))/nu)
else:
QK_distance0 = (-self.scale/2.0)*torch.square(torch.cdist(head_q.transpose(0,2), (head_k - self.mu[0]).transpose(0,2))) # n_head x bsz x qlen x klen
QK_distance1 = (-1.5*self.scale)*torch.square(torch.cdist(head_q.transpose(0,2), (head_k - self.mu[1]).transpose(0,2))) # n_head x bsz x qlen x klen
if self.update_mode == 'hard' or self.update_mode == 'hard2keys':
attn_score = torch.maximum(QK_distance0, QK_distance1)
attn_score = attn_score.permute(2, 3, 1, 0)
# [qlen x klen x bsz x n_head]
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
attn_score.masked_fill_(
attn_mask[None,:,:,None], -float('inf'))
elif attn_mask.dim() == 3:
attn_score.masked_fill_(attn_mask[:,:,:,None], -float('inf'))
# [qlen x klen x bsz x n_head]
attn_prob = F.softmax(attn_score, dim=1)
elif self.update_mode == 'soft' or self.update_mode == 'soft2keys':
pi = self.pi.clone().detach()
attn_prob = pi[:,:,:,:c.size(0)] * torch.exp(QK_distance0) + (1.0 - pi[:,:,:,:c.size(0)]) * torch.exp(QK_distance1)
# attn_prob = pi * torch.exp(QK_distance0) + (1.0 - pi) * torch.exp(QK_distance1)
if self.training is True:
resp0 = pi[:,:,:,:c.size(0)] * torch.exp(QK_distance0) / (attn_prob + 1e-6)
# resp0 = pi * torch.exp(QK_distance0) / (attn_prob + 1e-6)
pi_new = torch.sum(resp0, dim=(1,2), keepdim=True)/(h.size(0) * h.size(1))
pi_new = torch.cat((pi_new, pi[:,:,:,c.size(0):]), dim=3)
# pi_new = torch.sum(resp0, dim=(1,2,3), keepdim=True)/(h.size(0) * h.size(1) * c.size(0))
pi_new = pi_new.to(h)
self.pi.copy_(pi_new.detach())
attn_prob = attn_prob.permute(2, 3, 1, 0)
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
attn_prob.masked_fill_(
attn_mask[None,:,:,None], 0.0)
elif attn_mask.dim() == 3:
attn_prob.masked_fill_(attn_mask[:,:,:,None], 0.0)
attn_prob = attn_prob / ((attn_prob.sum(dim=1))[:, None, :, :] + 1e-6)
else:
attn_prob = torch.clamp(self.pi0[:,:,:,:c.size(0)], min=0.0, max=1.0) * torch.exp(QK_distance0) + torch.clamp(self.pi1[:,:,:,:c.size(0)], min=0.0, max=1.0) * torch.exp(QK_distance1)
# attn_prob = self.pi0 * torch.exp(QK_distance0) + self.pi1 * torch.exp(QK_distance1)
attn_prob = attn_prob.permute(2, 3, 1, 0)
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
attn_prob.masked_fill_(
attn_mask[None,:,:,None], 0.0)
elif attn_mask.dim() == 3:
attn_prob.masked_fill_(attn_mask[:,:,:,None], 0.0)
attn_prob = attn_prob / ((attn_prob.sum(dim=1))[:, None, :, :] + 1e-6)
attn_prob = self.dropatt(attn_prob)
# [qlen x klen x bsz x n_head] + [klen x bsz x n_head x d_head]
# -> [qlen x bsz x n_head x d_head]
attn_vec = torch.einsum('ijbn,jbnd->ibnd', (attn_prob, head_v))
attn_vec = attn_vec.contiguous().view(
attn_vec.size(0), attn_vec.size(1), self.n_head * self.d_head)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
return output
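MGKAttn's 'hard' mode above replaces the dot-product score with the larger of two Gaussian log-kernels centred on shifted keys: bandwidth scale/2 for k - mu0 and 3*scale/2 for k - mu1, matching QK_distance0 and QK_distance1. A single-head NumPy sketch of that scoring rule, with no masking (assumes NumPy; `mgk_hard_scores` is an illustrative name):

```python
import numpy as np

def mgk_hard_scores(q, k, mu0, mu1, scale):
    # d0[i, j] = -(scale / 2)     * ||q_i - (k_j - mu0)||^2
    # d1[i, j] = -(3 * scale / 2) * ||q_i - (k_j - mu1)||^2
    d0 = -0.5 * scale * ((q[:, None, :] - (k - mu0)[None, :, :]) ** 2).sum(-1)
    d1 = -1.5 * scale * ((q[:, None, :] - (k - mu1)[None, :, :]) ** 2).sum(-1)
    return np.maximum(d0, d1)  # hard assignment: keep the better of the two kernels

rng = np.random.default_rng(0)
q, k = rng.normal(size=(5, 4)), rng.normal(size=(7, 4))
mu0, mu1 = np.zeros(4), rng.normal(size=4)
scores = mgk_hard_scores(q, k, mu0, mu1, scale=1 / 4 ** 0.5)
print(scores.shape)  # (5, 7): one row of key scores per query
```

A softmax over the key axis would then give the attention weights, as in the layer above.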
# Standard multihead attention.
class MultiHeadAttn(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
pre_lnorm=False):
super(MultiHeadAttn, self).__init__()
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.q_net = nn.Linear(d_model, n_head * d_head, bias=False)
self.kv_net = nn.Linear(d_model, 2 * n_head * d_head, bias=False)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
def forward(self, h, attn_mask=None, mems=None,
carry_over_fast_weight=False):
assert not carry_over_fast_weight, "Not supported."
# multihead attention
# [hlen x bsz x n_head x d_head]
if mems is not None:
c = torch.cat([mems, h], 0)
else:
c = h
if self.pre_lnorm:
# layer normalization
c = self.layer_norm(c)
head_q = self.q_net(h)
head_k, head_v = torch.chunk(self.kv_net(c), 2, -1)
head_q = head_q.view(h.size(0), h.size(1), self.n_head, self.d_head)
head_k = head_k.view(c.size(0), c.size(1), self.n_head, self.d_head)
head_v = head_v.view(c.size(0), c.size(1), self.n_head, self.d_head)
# [qlen x klen x bsz x n_head]
attn_score = torch.einsum('ibnd,jbnd->ijbn', (head_q, head_k))
attn_score.mul_(self.scale)
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
attn_score.masked_fill_(
attn_mask[None,:,:,None], -float('inf'))
elif attn_mask.dim() == 3:
attn_score.masked_fill_(attn_mask[:,:,:,None], -float('inf'))
# [qlen x klen x bsz x n_head]
attn_prob = F.softmax(attn_score, dim=1)
attn_prob = self.dropatt(attn_prob)
# [qlen x klen x bsz x n_head] + [klen x bsz x n_head x d_head]
# -> [qlen x bsz x n_head x d_head]
attn_vec = torch.einsum('ijbn,jbnd->ibnd', (attn_prob, head_v))
attn_vec = attn_vec.contiguous().view(
attn_vec.size(0), attn_vec.size(1), self.n_head * self.d_head)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
return output
# Linear multihead attention from Katharopoulos et al.
# Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
# https://arxiv.org/abs/2006.16236
class LinearMultiHeadAttn(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
pre_lnorm=False):
super(LinearMultiHeadAttn, self).__init__()
print("Using LinearMultiHeadAttn --")
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.q_net = nn.Linear(d_model, n_head * d_head, bias=False)
self.kv_net = nn.Linear(d_model, 2 * n_head * d_head, bias=False)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
def forward(self, h, attn_mask=None, mems=None,
carry_over_fast_weight=False):
assert not carry_over_fast_weight, "Not supported."
# multihead attention
# [hlen x bsz x n_head x d_head]
if mems is not None:
c = torch.cat([mems, h], 0)
else:
c = h
if self.pre_lnorm:
# layer normalization
c = self.layer_norm(c)
head_q = self.q_net(h)
head_k, head_v = torch.chunk(self.kv_net(c), 2, -1)
head_q = head_q.view(h.size(0), h.size(1), self.n_head, self.d_head)
head_k = head_k.view(c.size(0), c.size(1), self.n_head, self.d_head)
head_v = head_v.view(c.size(0), c.size(1), self.n_head, self.d_head)
# transform q and k
head_q = F.elu(head_q, 1., False) + 1.
head_k = F.elu(head_k, 1., False) + 1.
# [qlen x klen x bsz x n_head]
attn_score = torch.einsum('ibnd,jbnd->ijbn', (head_q, head_k))
attn_score.mul_(self.scale)
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
# masked position to 0
attn_score.masked_fill_(attn_mask[None, :, :, None], 0)
elif attn_mask.dim() == 3:
attn_score.masked_fill_(attn_mask[:, :, :, None], 0)
# normalize attn scores over keys
eps = 1e-5
denominator = torch.sum(attn_score, 1, keepdim=True) + eps
# get (q_len, 1, B, n_head)
attn_score = self.dropatt(attn_score) # change
attn_prob = attn_score / denominator
# [qlen x klen x bsz x n_head] + [klen x bsz x n_head x d_head]
# -> [qlen x bsz x n_head x d_head]
attn_vec = torch.einsum('ijbn,jbnd->ibnd', (attn_prob, head_v))
attn_vec = attn_vec.contiguous().view(
attn_vec.size(0), attn_vec.size(1), self.n_head * self.d_head)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
return output
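LinearMultiHeadAttn above uses the elu(x)+1 feature map from Katharopoulos et al. and normalizes scores by their sum over keys. The same computation can be rearranged so the key-value summary is built once, giving O(L*d^2) instead of O(L^2*d). A single-head, non-causal NumPy sketch (assumes NumPy; masking and dropout omitted; `phi` and `linear_attention` are illustrative names):

```python
import numpy as np

def phi(x):
    # elu(x) + 1: strictly positive feature map, as in the layer above
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(q, k, v, eps=1e-5):
    q, k = phi(q), phi(k)
    kv = k.T @ v                 # (d_k, d_v) summary, built once
    z = q @ k.sum(axis=0) + eps  # per-query normalizer over keys
    return (q @ kv) / z[:, None]

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(6, 4)) for _ in range(3))
out = linear_attention(q, k, v)
print(out.shape)  # (6, 4)
```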
class CudaFastWeightSumTwoLinearTransformerLayer(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
pre_lnorm=False, eps=1e-5, layer_id=None, num_layer=None,
skip_attn_normalization=False, update_mode = 'rbf2keys', scale_w = 1.):
super(CudaFastWeightSumTwoLinearTransformerLayer, self).__init__()
print(f"Using CudaFastWeightSumTwoLinearTransformerLayer {layer_id} -")
assert layer_id is not None
assert num_layer is not None
self.layer_id = layer_id
self.num_layer = num_layer
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.update_mode = update_mode
self.scale_w = scale_w
# self.learn_scale_w = learn_scale_w
        # qkv projection: 4 * d_head per head (q, k1, k2, v) in 'rbf2keys' mode,
        # otherwise 3 * d_head per head (q, k, v).
if update_mode == 'rbf2keys':
self.qkv_net = nn.Linear(d_model, n_head * (4 * d_head), bias=False)
else:
self.qkv_net = nn.Linear(d_model, n_head * (3 * d_head), bias=False)
if update_mode == 'rbf':
self.mu = nn.Parameter((torch.empty(2, n_head, d_head).normal_(mean = 0.0, std = .5))*torch.tensor([0.,1.])[:, None, None], requires_grad= True)
# self.qkv_net = nn.Linear(
# d_model, n_head * 3 * d_head, bias=False)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
self.normalize_attn_scores = (not skip_attn_normalization)
self.eps = eps
if self.update_mode in ['rbf', 'rbf2keys']:
self.pi0 = nn.Parameter(0.5 * torch.ones(self.n_head, 256), requires_grad= True)
self.pi1 = nn.Parameter(0.5 * torch.ones(self.n_head, 256), requires_grad= True)
# pi0 = torch.rand(self.n_head, 256)
# self.pi0 = nn.Parameter(pi0, requires_grad= True)
# self.pi1 = nn.Parameter(1 - pi0, requires_grad= True)
def forward(self, h, attn_mask=None, mems=None,
carry_over_fast_weight=False):
# multihead attention
# shape h: (len, B, n_head * d_head)
        l = h.size()[0]  # sequence length, used to slice the learned pi parameters
if self.pre_lnorm:
# layer normalization
h = self.layer_norm(h)
slen, bsz, _ = h.size()
qkv = self.qkv_net(h)
if self.update_mode == 'rbf2keys':
qkv = qkv.view(slen, bsz, self.n_head, 4 * self.d_head)
head_q, head_k1,head_k2, head_v = torch.split(
qkv, (self.d_head,) * 4, -1)
else:
qkv = qkv.view(slen, bsz, self.n_head, 3 * self.d_head)
head_q, head_k, head_v = torch.split(
qkv, (self.d_head,) * 3, -1)
# reshape to (B, heads, len, dim)
# head_q = head_q.permute(1, 2, 0, 3)
# head_k = head_k.permute(1, 2, 0, 3)
# head_v = head_v.permute(1, 2, 0, 3)
head_q = head_q.permute(1, 2, 0, 3)
head_v = head_v.permute(1, 2, 0, 3)
if self.update_mode == 'rbf2keys':
head_k1 = head_k1.permute(1, 2, 0, 3)
head_k2 = head_k2.permute(1, 2, 0, 3)
elif self.update_mode == 'rbf':
head_k1 = (head_k - self.mu[0]).permute(1, 2, 0, 3) # (B, n_head, len, proj_dim)
head_k2 = (head_k - self.mu[1]).permute(1, 2, 0, 3)
else:
head_k = head_k.permute(1, 2, 0, 3)
# TODO add dropout here?
# transform q and k
head_q = F.elu(head_q, 1., False) + 1.
head_q = head_q / head_q.sum(-1, keepdim=True)
if self.update_mode in ['rbf2keys', 'rbf']:
head_k1 = F.elu(head_k1, 1., False) + 1.
head_k2 = F.elu(head_k2, 1., False) + 1.
head_k1 = head_k1 / head_k1.sum(-1, keepdim=True)
head_k2 = head_k2 / head_k2.sum(-1, keepdim=True)
# head_k = head_k1*(self.pi0[None,:,:l,None]) + head_k2*(self.pi1[None,:,:l,None])
head_k = head_k1*(torch.clamp(self.pi0, min = 0., max = 1.)[None,:,:l,None]) + self.scale_w*head_k2*(torch.clamp(self.pi1, min = 0., max = 1.)[None,:,:l,None])
# head_k = head_k / head_k.sum(-1, keepdim=True)
else:
head_k = F.elu(head_k, 1., False) + 1.
head_k = head_k / head_k.sum(-1, keepdim=True)
if self.normalize_attn_scores:
denominator_acc = torch.cumsum(head_k, dim=2)
if mems is None:
mem_fast_weights = torch.zeros(
bsz, self.n_head, self.d_head, self.d_head,
device=head_k.device)
else:
assert carry_over_fast_weight
mem_fast_weights, fast_denom = mems
# bsz can be smaller for the last batch
mem_fast_weights = mem_fast_weights[:bsz]
if self.normalize_attn_scores:
denominator_acc = denominator_acc + fast_denom[:bsz]
if self.normalize_attn_scores:
denominator = torch.einsum(
'lbij,lbij->lbi', denominator_acc, head_q).unsqueeze(-1)
layer_out = fast_weight_sum(
head_q, head_k, head_v, mem_fast_weights)
# shape (B, n_head, len, d_head)
if self.normalize_attn_scores:
layer_out = self.scale * layer_out / (denominator + self.eps)
else:
layer_out = self.scale * layer_out
layer_out = layer_out.transpose(1, 2)
layer_out = layer_out.reshape(
bsz, slen, self.n_head * self.d_head)
layer_out = layer_out.transpose(0, 1)
# expect [qlen, B, n_head * d_head]
#######
# linear projection
attn_out = self.o_net(layer_out)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
if carry_over_fast_weight:
# last values of accumulator should be carried over.
# clone is needed as backward modifies the data of fast weight
if self.normalize_attn_scores:
new_k_acc = denominator_acc[:, :, -1, :].unsqueeze(2).detach()
else:
new_k_acc = None
new_mem = (mem_fast_weights.clone().detach(), new_k_acc)
return output, new_mem
return output
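# Hedged reference sketch: `fast_weight_sum` used above is an external CUDA
# kernel with no definition in this file. The pure-PyTorch version below shows
# the semantics it is assumed to implement (causal cumulative outer-product
# fast weights) and is handy for shape checks on CPU; the name
# `fast_weight_sum_reference` is illustrative, not from the original code.

```python
import torch

def fast_weight_sum_reference(head_q, head_k, head_v, mem_fast_weights):
    """Sketch of the assumed `fast_weight_sum` semantics:
    W_t = W_{t-1} + k_t v_t^T, out_t = W_t^T q_t (causal linear attention).
    head_q/head_k: (B, H, L, d_key); head_v: (B, H, L, d_head);
    mem_fast_weights: (B, H, d_key, d_head) carried-over state."""
    B, H, L, _ = head_q.shape
    W = mem_fast_weights.clone()
    outs = []
    for t in range(L):
        # accumulate the outer product k_t v_t^T into the fast weights
        W = W + torch.einsum('bhi,bhj->bhij', head_k[:, :, t], head_v[:, :, t])
        # read out with the current query
        outs.append(torch.einsum('bhi,bhij->bhj', head_q[:, :, t], W))
    return torch.stack(outs, dim=2)  # (B, H, L, d_head)
```

# This matches unnormalized masked linear attention,
# out_t = sum_{s<=t} (q_t . k_s) v_s, which is what the cumulative
# denominator above then normalizes.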
class CudaFastWeightSumQueryTwoLinearTransformerLayer(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
pre_lnorm=False, eps=1e-5, layer_id=None, num_layer=None,
skip_attn_normalization=False, update_mode = 'rbf2keys', scale_w = 1.):
super(CudaFastWeightSumQueryTwoLinearTransformerLayer, self).__init__()
print(f"Using CudaFastWeightSumQueryTwoLinearTransformerLayer {layer_id} -")
assert layer_id is not None
assert num_layer is not None
self.layer_id = layer_id
self.num_layer = num_layer
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.update_mode = update_mode
self.scale_w = scale_w
        # qkv projection: 4 sub-heads (q, k1, k2, v) for 'rbf2keys',
        # otherwise 3 (q, k, v)
if update_mode == 'rbf2keys':
self.qkv_net = nn.Linear(d_model, n_head * (4 * d_head), bias=False)
else:
self.qkv_net = nn.Linear(d_model, n_head * (3 * d_head), bias=False)
if update_mode == 'rbf':
            self.mu = nn.Parameter(
                torch.empty(2, n_head, d_head).normal_(mean=0.0, std=.5)
                * torch.tensor([0., 1.])[:, None, None],
                requires_grad=True)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
self.normalize_attn_scores = (not skip_attn_normalization)
self.eps = eps
if self.update_mode in ['rbf', 'rbf2keys']:
            pi0 = torch.rand(self.n_head, 256)
            self.pi0 = nn.Parameter(pi0, requires_grad=True)
def forward(self, h, attn_mask=None, mems=None,
carry_over_fast_weight=False):
# multihead attention
# shape h: (len, B, n_head * d_head)
l = h.size()[0]
if self.pre_lnorm:
# layer normalization
h = self.layer_norm(h)
slen, bsz, _ = h.size()
qkv = self.qkv_net(h)
if self.update_mode == 'rbf2keys':
qkv = qkv.view(slen, bsz, self.n_head, 4 * self.d_head)
            head_q, head_k1, head_k2, head_v = torch.split(
                qkv, (self.d_head,) * 4, -1)
else:
qkv = qkv.view(slen, bsz, self.n_head, 3 * self.d_head)
head_q, head_k, head_v = torch.split(
qkv, (self.d_head,) * 3, -1)
# reshape to (B, heads, len, dim)
head_q = head_q.permute(1, 2, 0, 3)
head_v = head_v.permute(1, 2, 0, 3)
if self.update_mode == 'rbf2keys':
head_k1 = head_k1.permute(1, 2, 0, 3)
head_k2 = head_k2.permute(1, 2, 0, 3)
elif self.update_mode == 'rbf':
head_k1 = (head_k - self.mu[0]).permute(1, 2, 0, 3) # (B, n_head, len, proj_dim)
head_k2 = (head_k - self.mu[1]).permute(1, 2, 0, 3)
else:
head_k = head_k.permute(1, 2, 0, 3)
# TODO add dropout here?
# transform q and k
head_q = F.elu(head_q, 1., False) + 1.
head_q = head_q / head_q.sum(-1, keepdim=True)
if self.update_mode in ['rbf2keys', 'rbf']:
head_k1 = F.elu(head_k1, 1., False) + 1.
head_k2 = F.elu(head_k2, 1., False) + 1.
head_k1 = head_k1 / head_k1.sum(-1, keepdim=True)
head_k2 = head_k2 / head_k2.sum(-1, keepdim=True)
else:
head_k = F.elu(head_k, 1., False) + 1.
head_k = head_k / head_k.sum(-1, keepdim=True)
if self.normalize_attn_scores:
if self.update_mode in ['rbf2keys', 'rbf']:
denominator_acc1 = torch.cumsum(head_k1, dim=2)
denominator_acc2 = torch.cumsum(head_k2, dim=2)
else:
denominator_acc = torch.cumsum(head_k, dim=2)
if mems is None:
mem_fast_weights = torch.zeros(
bsz, self.n_head, self.d_head, self.d_head,
device=head_q.device)
else:
            # memory carry-over is untested for this layer
            raise NotImplementedError(
                'mems are not supported in '
                'CudaFastWeightSumQueryTwoLinearTransformerLayer')
assert carry_over_fast_weight
mem_fast_weights, fast_denom = mems
# bsz can be smaller for the last batch
mem_fast_weights = mem_fast_weights[:bsz]
if self.normalize_attn_scores:
denominator_acc = denominator_acc + fast_denom[:bsz]
if self.normalize_attn_scores:
if self.update_mode in ['rbf2keys', 'rbf']:
denominator1 = torch.einsum('lbij,lbij->lbi', denominator_acc1, head_q).unsqueeze(-1)
denominator2 = torch.einsum('lbij,lbij->lbi', denominator_acc2, head_q).unsqueeze(-1)
else:
denominator = torch.einsum(
'lbij,lbij->lbi', denominator_acc, head_q).unsqueeze(-1)
if self.update_mode in ['rbf2keys', 'rbf']:
layer_out1 = fast_weight_sum(head_q, head_k1, head_v, mem_fast_weights)
layer_out2 = fast_weight_sum(head_q, head_k2, head_v, mem_fast_weights)
else:
layer_out = fast_weight_sum(head_q, head_k, head_v, mem_fast_weights)
# shape (B, n_head, len, d_head)
        if self.normalize_attn_scores:
            if self.update_mode in ['rbf2keys', 'rbf']:
                layer_out1 = layer_out1 / (denominator1 + self.eps)
                layer_out2 = layer_out2 / (denominator2 + self.eps)
            else:
                layer_out = self.scale * layer_out / (denominator + self.eps)
        # the two-key modes must combine the two (possibly normalized) outputs
        # in both cases; previously `layer_out` was left undefined when
        # normalize_attn_scores was set together with an rbf mode
        if self.update_mode in ['rbf2keys', 'rbf']:
            w0 = torch.clamp(self.pi0, min=0., max=1.)[None, :, :l, None]
            layer_out = self.scale * (layer_out1 * w0 + layer_out2 * (1. - w0))
        elif not self.normalize_attn_scores:
            layer_out = self.scale * layer_out
layer_out = layer_out.transpose(1, 2)
layer_out = layer_out.reshape(
bsz, slen, self.n_head * self.d_head)
layer_out = layer_out.transpose(0, 1)
# expect [qlen, B, n_head * d_head]
#######
# linear projection
attn_out = self.o_net(layer_out)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
if carry_over_fast_weight:
            # untested path; the two-key modes would also need to carry over
            # denominator_acc1/2, which are not handled below
            raise NotImplementedError(
                'carry_over_fast_weight is untested in this layer')
# last values of accumulator should be carried over.
# clone is needed as backward modifies the data of fast weight
if self.normalize_attn_scores:
new_k_acc = denominator_acc[:, :, -1, :].unsqueeze(2).detach()
else:
new_k_acc = None
new_mem = (mem_fast_weights.clone().detach(), new_k_acc)
return output, new_mem
return output
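# Hedged standalone sketch of the two-key mixture used in the forward pass
# above: clamped per-position weights pi in [0, 1] blend two candidate
# outputs. The helper name `mix_two_outputs` and the variable `w0` are
# illustrative, not from the original code.

```python
import torch

def mix_two_outputs(out1, out2, pi0, scale):
    """Blend two (B, n_head, len, d_head) outputs with clamped mixture
    weights pi0 of shape (n_head, max_len), mirroring the rbf2keys path."""
    l = out1.size(2)
    # clamp the learned weights into [0, 1] and broadcast over batch and dim
    w0 = torch.clamp(pi0, min=0., max=1.)[None, :, :l, None]
    return scale * (out1 * w0 + out2 * (1. - w0))
```

# With pi0 = 0.25 everywhere, the result is 0.25 * out1 + 0.75 * out2.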
# DPFP linear attention.
class DPFPMultiHeadAttn(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
pre_lnorm=False, n_roll=3):
super(DPFPMultiHeadAttn, self).__init__()
print(f"Using DPFPMultiHeadAttn with {n_roll} rolls --")
self.n_roll = n_roll
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.q_net = nn.Linear(d_model, n_head * d_head, bias=False)
self.kv_net = nn.Linear(d_model, 2 * n_head * d_head, bias=False)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
def mul_roll_repeat(self, x):
rolls = []
for i in range(1, self.n_roll + 1):
rolls.append(x * x.roll(shifts=i, dims=-1))
return torch.cat(rolls, dim=-1)
def forward(self, h, attn_mask=None, mems=None,
carry_over_fast_weight=False):
assert not carry_over_fast_weight, "Not supported."
# multihead attention
# [hlen x bsz x n_head x d_head]
if mems is not None:
c = torch.cat([mems, h], 0)
else:
c = h
if self.pre_lnorm:
# layer normalization
c = self.layer_norm(c)
head_q = self.q_net(h)
head_k, head_v = torch.chunk(self.kv_net(c), 2, -1)
head_q = head_q.view(h.size(0), h.size(1), self.n_head, self.d_head)
head_k = head_k.view(c.size(0), c.size(1), self.n_head, self.d_head)
head_v = head_v.view(c.size(0), c.size(1), self.n_head, self.d_head)
# transform q and k
        act = F.relu  # alternative: torch.exp
head_k = torch.cat([act(head_k), act(-head_k)], dim=-1)
head_q = torch.cat([act(head_q), act(-head_q)], dim=-1)
head_k = self.mul_roll_repeat(head_k)
head_q = self.mul_roll_repeat(head_q)
# [qlen x klen x bsz x n_head]
attn_score = torch.einsum('ibnd,jbnd->ijbn', (head_q, head_k))
attn_score.mul_(self.scale)
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
# masked position to 0
attn_score.masked_fill_(attn_mask[None, :, :, None], 0)
elif attn_mask.dim() == 3:
attn_score.masked_fill_(attn_mask[:, :, :, None], 0)
# normalize attn scores over keys
eps = 1e-5
denominator = torch.sum(attn_score, 1, keepdim=True) + eps
# get (q_len, 1, B, n_head)
attn_score = self.dropatt(attn_score) # change
attn_prob = attn_score / denominator
# [qlen x klen x bsz x n_head] + [klen x bsz x n_head x d_head]
# -> [qlen x bsz x n_head x d_head]
attn_vec = torch.einsum('ijbn,jbnd->ibnd', (attn_prob, head_v))
attn_vec = attn_vec.contiguous().view(
attn_vec.size(0), attn_vec.size(1), self.n_head * self.d_head)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
return output
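# A small illustration of the DPFP feature map built from `mul_roll_repeat`
# above: concatenating relu(x) and relu(-x) and multiplying by rolled copies
# yields 2 * d_head * n_roll non-negative features per head. The function
# name `dpfp_features` is illustrative; the logic mirrors the forward pass.

```python
import torch
import torch.nn.functional as F

def dpfp_features(x, n_roll=3):
    """Sketch of the DPFP map used by DPFPMultiHeadAttn: relu(+/-x)
    concatenation followed by element-wise products with rolled copies."""
    x = torch.cat([F.relu(x), F.relu(-x)], dim=-1)  # (..., 2 * d)
    rolls = [x * x.roll(shifts=i, dims=-1) for i in range(1, n_roll + 1)]
    return torch.cat(rolls, dim=-1)  # (..., 2 * d * n_roll)
```

# Non-negativity of the features keeps the attention denominator positive.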
# Performer multihead attention from Choromanski et al.
# Rethinking Attention with Performers. https://arxiv.org/abs/2009.14794
class PerformerMultiHeadAttn(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
pre_lnorm=False, proj_dim=256, device='cuda',
skip_attn_normalization=False):
assert not skip_attn_normalization, "Not implemented."
# proj_dim: projected dimension
print(f"Using PerformerMultiHeadAttn -- proj_dim: {proj_dim}")
super(PerformerMultiHeadAttn, self).__init__()
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.q_net = nn.Linear(d_model, n_head * d_head, bias=False)
self.kv_net = nn.Linear(d_model, 2 * n_head * d_head, bias=False)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
self.proj_dim = proj_dim
# so that we can keep the same matrix at test time.
self.proj_matrix = draw_orthogonal_random_matrix(
d_head, proj_dim, device=device) # TODO store this as param?
def forward(self, h, attn_mask=None, mems=None, redraw=True,
carry_over_fast_weight=False):
assert not carry_over_fast_weight, "Not supported."
# multihead attention
# [hlen x bsz x n_head x d_head]
if mems is not None:
c = torch.cat([mems, h], 0)
else:
c = h
if self.pre_lnorm:
# layer normalization
c = self.layer_norm(c)
head_q = self.q_net(h)
head_k, head_v = torch.chunk(self.kv_net(c), 2, -1)
head_q = head_q.view(h.size(0), h.size(1), self.n_head, self.d_head)
head_k = head_k.view(c.size(0), c.size(1), self.n_head, self.d_head)
head_v = head_v.view(c.size(0), c.size(1), self.n_head, self.d_head)
if redraw:
self.proj_matrix = draw_orthogonal_random_matrix(
self.d_head, self.proj_dim, device=h.device)
# transform q and k
head_q = prime(head_q, self.proj_matrix) # (len, B, n_head, proj_dim)
head_k = prime(head_k, self.proj_matrix)
# [qlen x klen x bsz x n_head]
attn_score = torch.einsum('ibnd,jbnd->ijbn', (head_q, head_k))
attn_score.mul_(self.scale)
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
# set masked positions to 0
attn_score.masked_fill_(attn_mask[None, :, :, None], 0)
elif attn_mask.dim() == 3:
attn_score.masked_fill_(attn_mask[:, :, :, None], 0)
# normalize attn scores over keys
eps = 1e-5
denominator = torch.sum(attn_score, 1, keepdim=True) + eps
# get (q_len, 1, B, n_head)
attn_score = self.dropatt(attn_score) # change
attn_prob = attn_score / denominator
# [qlen x klen x bsz x n_head] + [klen x bsz x n_head x d_head]
# -> [qlen x bsz x n_head x d_head]
attn_vec = torch.einsum('ijbn,jbnd->ibnd', (attn_prob, head_v))
attn_vec = attn_vec.contiguous().view(
attn_vec.size(0), attn_vec.size(1), self.n_head * self.d_head)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
return output
class RelMultiHeadAttn(nn.Module):
def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
tgt_len=None, ext_len=None, mem_len=None, pre_lnorm=False):
super(RelMultiHeadAttn, self).__init__()
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.qkv_net = nn.Linear(d_model, 3 * n_head * d_head, bias=False)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
def _parallelogram_mask(self, h, w, left=False):
mask = torch.ones((h, w)).byte()
m = min(h, w)
mask[:m,:m] = torch.triu(mask[:m,:m])
mask[-m:,-m:] = torch.tril(mask[-m:,-m:])
if left:
return mask
else:
return mask.flip(0)
def _shift(self, x, qlen, klen, mask, left=False):
if qlen > 1:
zero_pad = torch.zeros((x.size(0), qlen-1, x.size(2), x.size(3)),
device=x.device, dtype=x.dtype)
else:
zero_pad = torch.zeros(0, device=x.device, dtype=x.dtype)
if left:
mask = mask.flip(1)
x_padded = torch.cat([zero_pad, x], dim=1).expand(qlen, -1, -1, -1)
else:
x_padded = torch.cat([x, zero_pad], dim=1).expand(qlen, -1, -1, -1)
x = x_padded.masked_select(mask[:,:,None,None]) \
.view(qlen, klen, x.size(2), x.size(3))
return x
def _rel_shift(self, x, zero_triu=False):
zero_pad = torch.zeros((x.size(0), 1, *x.size()[2:]),
device=x.device, dtype=x.dtype)
x_padded = torch.cat([zero_pad, x], dim=1)
x_padded = x_padded.view(x.size(1) + 1, x.size(0), *x.size()[2:])
x = x_padded[1:].view_as(x)
if zero_triu:
ones = torch.ones((x.size(0), x.size(1)))
x = x * torch.tril(ones, x.size(1) - x.size(0))[:,:,None,None]
return x
def forward(self, w, r, attn_mask=None, mems=None):
raise NotImplementedError
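# A minimal standalone check of the `_rel_shift` trick above: padding one
# zero column and reshaping slides row i left by (qlen - 1 - i), which turns
# absolute column indices into relative offsets. The function below copies
# the method body (zero_triu omitted) so it can be exercised in isolation.

```python
import torch

def rel_shift(x):
    """Standalone copy of RelMultiHeadAttn._rel_shift (zero_triu omitted)."""
    zero_pad = torch.zeros((x.size(0), 1, *x.size()[2:]),
                           device=x.device, dtype=x.dtype)
    x_padded = torch.cat([zero_pad, x], dim=1)
    # reinterpret the padded buffer so each row is offset by one element
    x_padded = x_padded.view(x.size(1) + 1, x.size(0), *x.size()[2:])
    return x_padded[1:].view_as(x)
```

# For a 2x3 input, row 0 is shifted left by 1 and row 1 is unchanged.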
class RelPartialLearnableMultiHeadAttn(RelMultiHeadAttn):
def __init__(self, *args, **kwargs):
super(RelPartialLearnableMultiHeadAttn, self).__init__(*args, **kwargs)
self.r_net = nn.Linear(
self.d_model, self.n_head * self.d_head, bias=False)
def forward(self, w, r, r_w_bias, r_r_bias, attn_mask=None, mems=None):
qlen, rlen, bsz = w.size(0), r.size(0), w.size(1)
if mems is not None:
cat = torch.cat([mems, w], 0)
if self.pre_lnorm:
w_heads = self.qkv_net(self.layer_norm(cat))
else:
w_heads = self.qkv_net(cat)
r_head_k = self.r_net(r)
w_head_q, w_head_k, w_head_v = torch.chunk(w_heads, 3, dim=-1)
w_head_q = w_head_q[-qlen:]
else:
if self.pre_lnorm:
w_heads = self.qkv_net(self.layer_norm(w))
else:
w_heads = self.qkv_net(w)
r_head_k = self.r_net(r)
w_head_q, w_head_k, w_head_v = torch.chunk(w_heads, 3, dim=-1)
klen = w_head_k.size(0)
w_head_q = w_head_q.view(qlen, bsz, self.n_head, self.d_head)
w_head_k = w_head_k.view(klen, bsz, self.n_head, self.d_head)
w_head_v = w_head_v.view(klen, bsz, self.n_head, self.d_head)
r_head_k = r_head_k.view(rlen, self.n_head, self.d_head)
# compute attention score
rw_head_q = w_head_q + r_w_bias
AC = torch.einsum('ibnd,jbnd->ijbn', (rw_head_q, w_head_k))
rr_head_q = w_head_q + r_r_bias
BD = torch.einsum('ibnd,jnd->ijbn', (rr_head_q, r_head_k))
BD = self._rel_shift(BD)
# [qlen x klen x bsz x n_head]
attn_score = AC + BD
attn_score.mul_(self.scale)
# compute attention probability
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
attn_score = attn_score.float().masked_fill(
attn_mask[None,:,:,None], -float('inf')).type_as(
attn_score)
elif attn_mask.dim() == 3:
attn_score = attn_score.float().masked_fill(
attn_mask[:,:,:,None], -float('inf')).type_as(attn_score)
# [qlen x klen x bsz x n_head]
attn_prob = F.softmax(attn_score, dim=1)
attn_prob = self.dropatt(attn_prob)
# compute attention vector
attn_vec = torch.einsum('ijbn,jbnd->ibnd', (attn_prob, w_head_v))
# [qlen x bsz x n_head x d_head]
attn_vec = attn_vec.contiguous().view(
attn_vec.size(0), attn_vec.size(1), self.n_head * self.d_head)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = w + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(w + attn_out)
return output
class RelLearnableMultiHeadAttn(RelMultiHeadAttn):
def __init__(self, *args, **kwargs):
super(RelLearnableMultiHeadAttn, self).__init__(*args, **kwargs)
def forward(self, w, r_emb, r_w_bias, r_bias, attn_mask=None, mems=None):
# r_emb: [klen, n_head, d_head], used for term B
# r_w_bias: [n_head, d_head], used for term C
# r_bias: [klen, n_head], used for term D
qlen, bsz = w.size(0), w.size(1)
if mems is not None:
cat = torch.cat([mems, w], 0)
if self.pre_lnorm:
w_heads = self.qkv_net(self.layer_norm(cat))
else:
w_heads = self.qkv_net(cat)
w_head_q, w_head_k, w_head_v = torch.chunk(w_heads, 3, dim=-1)
w_head_q = w_head_q[-qlen:]
else:
if self.pre_lnorm:
w_heads = self.qkv_net(self.layer_norm(w))
else:
w_heads = self.qkv_net(w)
w_head_q, w_head_k, w_head_v = torch.chunk(w_heads, 3, dim=-1)
klen = w_head_k.size(0)
w_head_q = w_head_q.view(qlen, bsz, self.n_head, self.d_head)
w_head_k = w_head_k.view(klen, bsz, self.n_head, self.d_head)
w_head_v = w_head_v.view(klen, bsz, self.n_head, self.d_head)
if klen > r_emb.size(0):
r_emb_pad = r_emb[0:1].expand(klen-r_emb.size(0), -1, -1)
r_emb = torch.cat([r_emb_pad, r_emb], 0)
r_bias_pad = r_bias[0:1].expand(klen-r_bias.size(0), -1)
r_bias = torch.cat([r_bias_pad, r_bias], 0)
else:
r_emb = r_emb[-klen:]
r_bias = r_bias[-klen:]
# compute attention score
# qlen x bsz x n_head x d_head
rw_head_q = w_head_q + r_w_bias[None]
# qlen x klen x bsz x n_head
AC = torch.einsum('ibnd,jbnd->ijbn', (rw_head_q, w_head_k))
# qlen x klen x bsz x n_head
B_ = torch.einsum('ibnd,jnd->ijbn', (w_head_q, r_emb))
# 1 x klen x 1 x n_head
D_ = r_bias[None, :, None]
BD = self._rel_shift(B_ + D_)
# [qlen x klen x bsz x n_head]
attn_score = AC + BD
attn_score.mul_(self.scale)
# compute attention probability
if attn_mask is not None and attn_mask.any().item():
if attn_mask.dim() == 2:
attn_score.masked_fill_(
attn_mask[None,:,:,None], -float('inf'))
elif attn_mask.dim() == 3:
attn_score.masked_fill_(attn_mask[:,:,:,None], -float('inf'))
# [qlen x klen x bsz x n_head]
attn_prob = F.softmax(attn_score, dim=1)
attn_prob = self.dropatt(attn_prob)
# compute attention vector
attn_vec = torch.einsum('ijbn,jbnd->ibnd', (attn_prob, w_head_v))
# [qlen x bsz x n_head x d_head]
attn_vec = attn_vec.contiguous().view(
attn_vec.size(0), attn_vec.size(1), self.n_head * self.d_head)
# linear projection
attn_out = self.o_net(attn_vec)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = w + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(w + attn_out)
return output
class CudaFastWeightSumTwoPerformerLayer(nn.Module):
    def __init__(self, n_head, d_model, d_head, dropout, dropatt=0,
                 pre_lnorm=False, eps=1e-5, skip_attn_normalization=False,
                 qkv_bias=False, proj_dim=256, device='cuda',
                 update_mode='rbf2keys', scale_w=1., two_proj_matrix=False,
                 learn_scale_w=False):
super(CudaFastWeightSumTwoPerformerLayer, self).__init__()
print(f"Using CudaFastWeightSumTwoPerformerLayer - "
f"proj_dim: {proj_dim}")
self.n_head = n_head
self.d_model = d_model
self.d_head = d_head
self.dropout = dropout
self.update_mode = update_mode
self.learn_scale_w = learn_scale_w
if self.learn_scale_w:
            # initialize the per-dimension scales uniformly in (0.4, 1.0]
            self.scale_w = nn.Parameter(
                (.4 - 1.) * torch.rand(d_head) + 1., requires_grad=True)
else:
self.scale_w = scale_w
self.device = device
self.two_proj_matrix = two_proj_matrix
assert update_mode in ['rbf2keys', 'rbf', 'standard']
# (3 * d_head * n_head) for qkv and (1 * n_head) for beta.
if update_mode == 'rbf2keys':
self.qkv_net = nn.Linear(d_model, n_head * (4 * d_head), bias=qkv_bias)
else:
self.qkv_net = nn.Linear(d_model, n_head * (3 * d_head), bias=qkv_bias)
if update_mode == 'rbf':
            self.mu = nn.Parameter(
                torch.empty(2, n_head, d_head).normal_(mean=0.0, std=.5)
                * torch.tensor([0., 1.])[:, None, None],
                requires_grad=True)
self.drop = nn.Dropout(dropout)
self.dropatt = nn.Dropout(dropatt)
self.o_net = nn.Linear(n_head * d_head, d_model, bias=False)
self.layer_norm = nn.LayerNorm(d_model)
self.scale = 1 / (d_head ** 0.5)
self.pre_lnorm = pre_lnorm
self.normalize_attn_scores = (not skip_attn_normalization)
self.eps = eps
self.proj_dim = proj_dim
        if self.update_mode in ['rbf', 'rbf2keys']:
            self.pi0 = nn.Parameter(
                0.5 * torch.ones(self.n_head, 256), requires_grad=True)
            self.pi1 = nn.Parameter(
                0.5 * torch.ones(self.n_head, 256), requires_grad=True)
if self.two_proj_matrix:
self.proj_matrix1 = draw_orthogonal_random_matrix(d_head, proj_dim, device=device)
self.proj_matrix2 = draw_orthogonal_random_matrix(d_head, proj_dim, device=device)
else:
self.proj_matrix = draw_orthogonal_random_matrix(d_head, proj_dim, device=device)
    def scale_proj_matrix(self, proj_matrix):
        if self.learn_scale_w:
            # learned per-dimension scaling of the projection rows
            return torch.diag(self.scale_w) @ proj_matrix
        else:
            return proj_matrix * self.scale_w
def forward(self, h, attn_mask=None, mems=None, redraw=True,
carry_over_fast_weight=False):
self.device = h.device
# shape h: (len, B, n_head * d_head)
l = h.size()[0]
if self.pre_lnorm:
# layer normalization
h = self.layer_norm(h)
slen, bsz, _ = h.size()
qkv = self.qkv_net(h)
if self.update_mode == 'rbf2keys':
qkv = qkv.view(slen, bsz, self.n_head, 4 * self.d_head)
            head_q, head_k1, head_k2, head_v = torch.split(
                qkv, (self.d_head,) * 4, -1)
else:
qkv = qkv.view(slen, bsz, self.n_head, 3 * self.d_head)
head_q, head_k, head_v = torch.split(
qkv, (self.d_head,) * 3, -1)
# reshape to (B, heads, len, dim)
head_q = head_q.permute(1, 2, 0, 3)
head_v = head_v.permute(1, 2, 0, 3)
if self.update_mode == 'rbf2keys':
head_k1 = head_k1.permute(1, 2, 0, 3)
head_k2 = head_k2.permute(1, 2, 0, 3)
elif self.update_mode == 'rbf':
head_k1 = (head_k - self.mu[0]).permute(1, 2, 0, 3) # (B, n_head, len, proj_dim)
head_k2 = (head_k - self.mu[1]).permute(1, 2, 0, 3)
else:
head_k = head_k.permute(1, 2, 0, 3)
if redraw:
if self.two_proj_matrix:
self.proj_matrix1 = draw_orthogonal_random_matrix(self.d_head, self.proj_dim, device=self.device)
self.proj_matrix2 = draw_orthogonal_random_matrix(self.d_head, self.proj_dim, device=self.device)
else:
self.proj_matrix = draw_orthogonal_random_matrix(self.d_head, self.proj_dim, device=self.device)
if self.two_proj_matrix:
head_q = prime(head_q, self.proj_matrix1) # (B, n_head, len, proj_dim)
head_q = head_q / head_q.sum(-1, keepdim=True)
head_k1 = prime(head_k1, self.proj_matrix1)
head_k1 = head_k1 / head_k1.sum(-1, keepdim=True)
head_k2 = prime(head_k2, self.scale_proj_matrix(self.proj_matrix2))
head_k2 = head_k2 / head_k2.sum(-1, keepdim=True)
            head_k = (head_k1 * self.pi0[None, :, :l, None]
                      + head_k2 * self.pi1[None, :, :l, None])
else:
            # this normalization follows Eq. 29 of the paper
head_q = prime(head_q, self.proj_matrix) # (B, n_head, len, proj_dim)
head_q = head_q / head_q.sum(-1, keepdim=True)
if self.update_mode in ['rbf2keys','rbf']:
head_k1 = prime(head_k1, self.proj_matrix)
head_k1 = head_k1 / head_k1.sum(-1, keepdim=True)
head_k2 = prime(head_k2, self.scale_proj_matrix(self.proj_matrix))
head_k2 = head_k2 / head_k2.sum(-1, keepdim=True)
                # pi0, pi1: (n_head, klen)
                # Tan's trick: combine the two keys here, instead of combining
                # the two outputs, to make the computation faster
                head_k = (head_k1 * self.pi0[None, :, :l, None]
                          + head_k2 * self.pi1[None, :, :l, None])
else:
head_k = prime(head_k, self.proj_matrix)
head_k = head_k / head_k.sum(-1, keepdim=True)
        # nothing below this point differs from the single-key layer
        # denominator of the softmax normalization
if self.normalize_attn_scores:
denominator_acc = torch.cumsum(head_k, dim=2)
if mems is None:
mem_fast_weights = torch.zeros(
bsz, self.n_head, 2 * self.proj_dim, self.d_head,
device=head_k.device)
else:
assert carry_over_fast_weight
mem_fast_weights, fast_denom = mems
# bsz can be smaller for the last batch
mem_fast_weights = mem_fast_weights[:bsz]
if self.normalize_attn_scores:
denominator_acc = denominator_acc + fast_denom[:bsz]
if self.normalize_attn_scores:
denominator = torch.einsum(
'lbij,lbij->lbi', denominator_acc, head_q).unsqueeze(-1)
layer_out = fast_weight_sum(
head_q, head_k, head_v, mem_fast_weights)
# shape (B, n_head, len, d_head)
if self.normalize_attn_scores:
layer_out = self.scale * layer_out / (denominator + self.eps)
else:
layer_out = self.scale * layer_out
layer_out = layer_out.transpose(1, 2)
layer_out = layer_out.reshape(
bsz, slen, self.n_head * self.d_head)
layer_out = layer_out.transpose(0, 1)
# expect [qlen, B, n_head * d_head]
# linear projection
attn_out = self.o_net(layer_out)
attn_out = self.drop(attn_out)
if self.pre_lnorm:
# residual connection
output = h + attn_out
else:
# residual connection + layer normalization
output = self.layer_norm(h + attn_out)
if carry_over_fast_weight:
# last values of accumulator should be carried over.
# clone is needed as backward modifies the data of fast weight
if self.normalize_attn_scores:
new_k_acc = denominator_acc[:, :, -1, :].unsqueeze(2).detach()
else:
new_k_acc = None
new_mem = (mem_fast_weights.clone().detach(), new_k_acc)
return output, new_mem
return output
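# `prime` and `draw_orthogonal_random_matrix` are defined elsewhere in this
# file; the sketch below shows the positive random-feature map they are
# assumed to implement (the FAVOR+ softmax-kernel estimator of Choromanski
# et al.), here with plain Gaussian features and no orthogonalization. The
# name `softmax_kernel_features` is illustrative.

```python
import torch

def softmax_kernel_features(x, proj):
    """phi(x) = exp(W^T x - |x|^2 / 2) / sqrt(m). For W with i.i.d. N(0, 1)
    entries, E[phi(q) . phi(k)] = exp(q . k), the (unnormalized) softmax
    kernel, so attention can be linearized as phi(q)^T (sum_k phi(k) v^T)."""
    m = proj.size(-1)
    return torch.exp(x @ proj - (x * x).sum(-1, keepdim=True) / 2.) / m ** 0.5
```

# The features are strictly positive, which keeps the cumulative attention
# denominator well-behaved, unlike trigonometric random features.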
class PerformerDecoderLayer(nn.Module):
def __init__(self, n_head, d_model, d_head, d_inner, dropout, attn_type,
                 scale_w=1., two_proj_matrix=False, learn_scale_w=False,
                 update_mode='rbf2keys',
**kwargs):
super(PerformerDecoderLayer, self).__init__()
if attn_type == 25:
attn_func = CudaFastWeightPerformerLayer
elif attn_type == 35:
attn_func = CudaFastWeightSumPerformerLayer
elif attn_type == 45:
attn_func = CudaNormFastWeightPerformerLayer
elif attn_type == 5:
attn_func = PerformerMultiHeadAttn
elif attn_type == 300:
attn_func = CudaFastWeightSumTwoPerformerLayer
else:
raise Exception(f"attn_type {attn_type} not allowed "
f"in PerformerDecoderLayer.")
        if attn_type == 300:
            # forward update_mode as well; it was previously accepted by this
            # constructor but silently dropped
            self.dec_attn = attn_func(
                n_head, d_model, d_head, dropout, scale_w=scale_w,
                two_proj_matrix=two_proj_matrix, learn_scale_w=learn_scale_w,
                update_mode=update_mode, **kwargs)
        else:
            self.dec_attn = attn_func(
                n_head, d_model, d_head, dropout, **kwargs)
self.pos_ff = PositionwiseFF(
d_model, d_inner, dropout, pre_lnorm=kwargs.get('pre_lnorm'))
def forward(self, dec_inp, dec_attn_mask=None, mems=None, redraw=True,
carry_over_fast_weight=False):
output = self.dec_attn(
dec_inp, attn_mask=dec_attn_mask, mems=mems, redraw=redraw,
carry_over_fast_weight=carry_over_fast_weight)
if carry_over_fast_weight:
output, new_mem = output
output = self.pos_ff(output)
if carry_over_fast_weight:
return output, new_mem
return output
class DecoderLayer(nn.Module):
    def __init__(self, n_head, d_model, d_head, d_inner, dropout, attn_type,
                 update_mode='rbf2keys', scale_w=1., **kwargs):
super(DecoderLayer, self).__init__()
if attn_type == 2:
attn_func = MultiHeadAttn
elif attn_type == 4:
attn_func = LinearMultiHeadAttn
elif attn_type == 6:
attn_func = DPFPMultiHeadAttn
elif attn_type == 14:
attn_func = StepWiseLinearTransformerLayer
elif attn_type == 16:
attn_func = StepWiseDPFPLinearTransformerLayer
elif attn_type == 10:
attn_func = DebugStepWiseLinearTransformerLayer
elif attn_type == 24:
attn_func = CudaFastWeightLinearTransformerLayer
elif attn_type == 26:
attn_func = CudaFastWeightDPFPTransformerLayer
elif attn_type == 44:
attn_func = CudaNormFastWeightLinearTransformerLayer
elif attn_type == 46:
attn_func = CudaNormFastWeightDPFPTransformerLayer
elif attn_type == 34:
attn_func = CudaFastWeightSumLinearTransformerLayer
elif attn_type == 200:
attn_func = MGKAttn
elif attn_type == 400:
attn_func = CudaFastWeightSumTwoLinearTransformerLayer
elif attn_type == 500:
attn_func = CudaFastWeightSumQueryTwoLinearTransformerLayer
else:
raise Exception(f"attn_type {attn_type} not allowed here.")
        if attn_type in [400, 500]:
            self.dec_attn = attn_func(n_head, d_model, d_head, dropout,
                                      update_mode=update_mode,
                                      scale_w=scale_w, **kwargs)
else:
self.dec_attn = attn_func(n_head, d_model, d_head, dropout, **kwargs)
self.pos_ff = PositionwiseFF(
d_model, d_inner, dropout, pre_lnorm=kwargs.get('pre_lnorm'))
self.attn_type = attn_type
def forward(self, *dec_inp, dec_attn_mask=None, mems=None, redraw=True,
carry_over_fast_weight=False):
        output = self.dec_attn(*dec_inp, attn_mask=dec_attn_mask, mems=mems,
                               carry_over_fast_weight=carry_over_fast_weight)
if carry_over_fast_weight:
output, new_mem = output
output = self.pos_ff(output)
if carry_over_fast_weight:
return output, new_mem
return output
class RelLearnableDecoderLayer(nn.Module):
def __init__(self, n_head, d_model, d_head, d_inner, dropout,
**kwargs):
super(RelLearnableDecoderLayer, self).__init__()
self.dec_attn = RelLearnableMultiHeadAttn(
n_head, d_model, d_head, dropout, **kwargs)
self.pos_ff = PositionwiseFF(
d_model, d_inner, dropout, pre_lnorm=kwargs.get('pre_lnorm'))
def forward(self, dec_inp, r_emb, r_w_bias, r_bias,
dec_attn_mask=None, mems=None):
output = self.dec_attn(dec_inp, r_emb, r_w_bias, r_bias,
attn_mask=dec_attn_mask, mems=mems)
output = self.pos_ff(output)
return output
class RelPartialLearnableDecoderLayer(nn.Module):
def __init__(self, n_head, d_model, d_head, d_inner, dropout,
**kwargs):
super(RelPartialLearnableDecoderLayer, self).__init__()
self.dec_attn = RelPartialLearnableMultiHeadAttn(
n_head, d_model, d_head, dropout, **kwargs)
self.pos_ff = PositionwiseFF(
d_model, d_inner, dropout, pre_lnorm=kwargs.get('pre_lnorm'))
def forward(self, dec_inp, r, r_w_bias, r_r_bias, dec_attn_mask=None,
mems=None):
output = self.dec_attn(dec_inp, r, r_w_bias, r_r_bias,
attn_mask=dec_attn_mask, mems=mems)
output = self.pos_ff(output)
return output
class AdaptiveEmbedding(nn.Module):
def __init__(self, n_token, d_embed, d_proj, cutoffs, div_val=1,
sample_softmax=False):
super(AdaptiveEmbedding, self).__init__()
self.n_token = n_token
self.d_embed = d_embed
self.cutoffs = cutoffs + [n_token]
self.div_val = div_val
self.d_proj = d_proj
self.emb_scale = d_proj ** 0.5
self.cutoff_ends = [0] + self.cutoffs
self.emb_layers = nn.ModuleList()
self.emb_projs = nn.ParameterList()
if div_val == 1:
self.emb_layers.append(
nn.Embedding(n_token, d_embed, sparse=sample_softmax>0)
)
if d_proj != d_embed:
self.emb_projs.append(
nn.Parameter(torch.Tensor(d_proj, d_embed)))
else:
for i in range(len(self.cutoffs)):
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i+1]
d_emb_i = d_embed // (div_val ** i)
self.emb_layers.append(nn.Embedding(r_idx-l_idx, d_emb_i))
self.emb_projs.append(
nn.Parameter(torch.Tensor(d_proj, d_emb_i)))
def forward(self, inp):
if self.div_val == 1:
embed = self.emb_layers[0](inp)
if self.d_proj != self.d_embed:
embed = F.linear(embed, self.emb_projs[0])
else:
param = next(self.parameters())
inp_flat = inp.view(-1)
emb_flat = torch.zeros([inp_flat.size(0), self.d_proj],
dtype=param.dtype, device=param.device)
for i in range(len(self.cutoffs)):
l_idx, r_idx = self.cutoff_ends[i], self.cutoff_ends[i + 1]
mask_i = (inp_flat >= l_idx) & (inp_flat < r_idx)
indices_i = mask_i.nonzero().squeeze()
if indices_i.numel() == 0:
continue
inp_i = inp_flat.index_select(0, indices_i) - l_idx
emb_i = self.emb_layers[i](inp_i)
emb_i = F.linear(emb_i, self.emb_projs[i])
emb_flat.index_copy_(0, indices_i, emb_i)
embed = emb_flat.view(*inp.size(), self.d_proj)
embed.mul_(self.emb_scale)
return embed
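As a standalone illustration (not part of the model), the bucketing above can be traced with plain Python: `cutoffs` split the vocabulary into contiguous id ranges, and each successive cluster's embedding width shrinks by a factor of `div_val`. The helper names below are hypothetical, introduced only for this sketch.

```python
def cluster_layout(n_token, d_embed, cutoffs, div_val):
    """Return (l_idx, r_idx, d_emb_i) per cluster, mirroring
    AdaptiveEmbedding.__init__'s cutoff_ends / d_emb_i computation."""
    cutoff_ends = [0] + cutoffs + [n_token]
    layout = []
    for i in range(len(cutoff_ends) - 1):
        l_idx, r_idx = cutoff_ends[i], cutoff_ends[i + 1]
        layout.append((l_idx, r_idx, d_embed // (div_val ** i)))
    return layout

def cluster_of(token_id, layout):
    """Index of the cluster whose [l_idx, r_idx) range holds token_id,
    i.e. which emb_layers[i] the forward pass would route it to."""
    for i, (l_idx, r_idx, _) in enumerate(layout):
        if l_idx <= token_id < r_idx:
            return i
    raise ValueError(token_id)
```

For example, `cluster_layout(10000, 512, [2000, 6000], 2)` yields widths 512, 256, and 128 for the head, first tail, and second tail clusters.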
class MemTransformerLM(nn.Module):
def __init__(self,
n_token,
n_layer,
n_head,
d_model,
d_head,
d_inner,
dropout,
dropatt,
tie_weight=True,
d_embed=None,
div_val=1,
tie_projs=[False],
pre_lnorm=False,
tgt_len=None,
ext_len=None,
mem_len=None,
cutoffs=[],
adapt_inp=False,
same_length=False,
attn_type=0,
clamp_len=-1,
sample_softmax=-1,
proj_dim=256, # for performer layers
n_roll=3, # mirrored attention
skip_attn_normalization=False,
no_pos=False, # no positional encoding
device='cuda',
update_mode='hard',
scale_w=1.,
two_proj_matrix=False,
learn_scale_w=False):
super(MemTransformerLM, self).__init__()
self.n_token = n_token
self.no_pos = no_pos
d_embed = d_model if d_embed is None else d_embed
self.d_embed = d_embed
self.d_model = d_model
self.n_head = n_head
self.d_head = d_head
self.word_emb = AdaptiveEmbedding(
n_token, d_embed, d_model, cutoffs, div_val=div_val)
self.drop = nn.Dropout(dropout)
self.n_layer = n_layer
self.tgt_len = tgt_len
self.mem_len = mem_len
self.ext_len = ext_len
self.max_klen = tgt_len + ext_len + mem_len
self.attn_type = attn_type
self.layers = nn.ModuleList()
if attn_type == 0: # the default attention
for i in range(n_layer):
self.layers.append(
RelPartialLearnableDecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
tgt_len=tgt_len, ext_len=ext_len, mem_len=mem_len,
dropatt=dropatt, pre_lnorm=pre_lnorm)
)
elif attn_type == 1: # learnable embeddings
for i in range(n_layer):
self.layers.append(
RelLearnableDecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
tgt_len=tgt_len, ext_len=ext_len, mem_len=mem_len,
dropatt=dropatt, pre_lnorm=pre_lnorm)
)
elif attn_type in [2, 3, 4]: # absolute embeddings
# 2: baseline vanilla transformer
# 3: absolute deeper SA (see _create_params)
# 4: linear transformer
for i in range(n_layer):
self.layers.append(
DecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
dropatt=dropatt, pre_lnorm=pre_lnorm,
attn_type=attn_type)
)
elif attn_type in [200,]: # absolute embeddings
for i in range(n_layer):
self.layers.append(
DecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
dropatt=dropatt, pre_lnorm=pre_lnorm,
attn_type=attn_type, update_mode=update_mode)
)
elif attn_type in [6, 7]: # absolute embeddings
# 6: mirrored attention
# 7: mirrored attention v2
for i in range(n_layer):
self.layers.append(
DecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
dropatt=dropatt, pre_lnorm=pre_lnorm,
attn_type=attn_type, n_roll=n_roll)
)
elif attn_type in [10, 14, 24, 34, 44, 400, 500]: # fast weights
# 10: debugging, same as linear trafo but step by step
# 14: linear fast weight
for i in range(n_layer):
self.layers.append(
DecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
dropatt=dropatt, pre_lnorm=pre_lnorm,
attn_type=attn_type, layer_id=i, num_layer=n_layer,
skip_attn_normalization=skip_attn_normalization, )
)
elif attn_type in [16, 26, 46]: # fast weights w/ absolute embeddings
for i in range(n_layer):
self.layers.append(
DecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
dropatt=dropatt, pre_lnorm=pre_lnorm,
attn_type=attn_type, layer_id=i, num_layer=n_layer,
n_roll=n_roll,
skip_attn_normalization=skip_attn_normalization)
)
elif attn_type in [5, 25, 35, 45, 300]: # absolute embeddings, performer
# performer case needs to be separate from the case above
# so that we can deal with custom logic for random projections.
for i in range(n_layer):
self.layers.append(
PerformerDecoderLayer(
n_head, d_model, d_head, d_inner, dropout,
dropatt=dropatt, pre_lnorm=pre_lnorm,
attn_type=attn_type, proj_dim=proj_dim, device=device,
update_mode=update_mode,
skip_attn_normalization=skip_attn_normalization, scale_w=scale_w,
two_proj_matrix=two_proj_matrix, learn_scale_w=learn_scale_w)
)
self.sample_softmax = sample_softmax
# use sampled softmax
if sample_softmax > 0:
self.out_layer = nn.Linear(d_model, n_token)
if tie_weight:
self.out_layer.weight = self.word_emb.weight
self.tie_weight = tie_weight
self.sampler = LogUniformSampler(n_token, sample_softmax)
# use adaptive softmax (including standard softmax)
else:
self.crit = ProjectedAdaptiveLogSoftmax(n_token, d_embed, d_model,
cutoffs, div_val=div_val)
if tie_weight:
for i in range(len(self.crit.out_layers)):
self.crit.out_layers[i].weight = self.word_emb.emb_layers[i].weight
if tie_projs:
for i, tie_proj in enumerate(tie_projs):
if tie_proj and div_val == 1 and d_model != d_embed:
self.crit.out_projs[i] = self.word_emb.emb_projs[0]
elif tie_proj and div_val != 1:
self.crit.out_projs[i] = self.word_emb.emb_projs[i]
self.same_length = same_length
self.clamp_len = clamp_len
self._create_params()
def backward_compatible(self):
self.sample_softmax = -1
def _create_params(self):
if self.attn_type == 0: # default attention
self.pos_emb = PositionalEmbedding(self.d_model)
self.r_w_bias = nn.Parameter(
torch.Tensor(self.n_head, self.d_head))
self.r_r_bias = nn.Parameter(
torch.Tensor(self.n_head, self.d_head))
elif self.attn_type == 1: # learnable
self.r_emb = nn.Parameter(torch.Tensor(
self.n_layer, self.max_klen, self.n_head, self.d_head))
self.r_w_bias = nn.Parameter(torch.Tensor(
self.n_layer, self.n_head, self.d_head))
self.r_bias = nn.Parameter(torch.Tensor(
self.n_layer, self.max_klen, self.n_head))
elif self.attn_type in [2, 4, 5, 6, 7,
10, 14, 16,
24, 25, 26,
34, 35,
44, 45, 46, 200, 300, 400, 500]:
# standard absolute pos
self.pos_emb = PositionalEmbedding(self.d_model)
elif self.attn_type == 3: # absolute deeper SA
self.r_emb = nn.Parameter(torch.Tensor(
self.n_layer, self.max_klen, self.n_head, self.d_head))
def reset_length(self, tgt_len, ext_len, mem_len):
self.tgt_len = tgt_len
self.mem_len = mem_len
self.ext_len = ext_len
def init_mems(self):
if self.mem_len > 0:
mems = []
param = next(self.parameters())
for i in range(self.n_layer+1):
empty = torch.empty(0, dtype=param.dtype, device=param.device)
mems.append(empty)
return mems
else:
return None
def _update_mems(self, hids, mems, qlen, mlen):
# does not deal with None
if mems is None:
return None
# mems is not None
assert len(hids) == len(mems), 'len(hids) != len(mems)'
# There are `mlen + qlen` steps that can be cached into mems
# For the next step, the last `ext_len` of the `qlen` tokens
# will be used as the extended context. Hence, we only cache
# the tokens from `mlen + qlen - self.ext_len - self.mem_len`
# to `mlen + qlen - self.ext_len`.
with torch.no_grad():
new_mems = []
end_idx = mlen + max(0, qlen - self.ext_len)
beg_idx = max(0, end_idx - self.mem_len)
for i in range(len(hids)):
cat = torch.cat([mems[i], hids[i]], dim=0)
new_mems.append(cat[beg_idx:end_idx].detach())
return new_mems
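The cache-window arithmetic in `_update_mems` can be checked in isolation. This is a sketch with a hypothetical helper name; it reproduces only the index computation, not the tensor slicing.

```python
def mem_slice(qlen, mlen, ext_len, mem_len):
    """[beg_idx, end_idx) slice kept from the concatenated (mlen + qlen)
    hidden steps, matching _update_mems' end_idx/beg_idx computation."""
    end_idx = mlen + max(0, qlen - ext_len)
    beg_idx = max(0, end_idx - mem_len)
    return beg_idx, end_idx
```

With the unit-test defaults below (`tgt_len = mem_len = 36`, `ext_len = 0`), the slice keeps exactly the newest 36 steps.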
def _forward(self, dec_inp, mems=None, carry_over_fast_weight=False):
qlen, bsz = dec_inp.size()
word_emb = self.word_emb(dec_inp)
if carry_over_fast_weight:
mlen = 0
else:
mlen = mems[0].size(0) if mems is not None else 0
klen = mlen + qlen
if self.same_length:
all_ones = word_emb.new_ones(qlen, klen)
mask_len = klen - self.mem_len
if mask_len > 0:
mask_shift_len = qlen - mask_len
else:
mask_shift_len = qlen
dec_attn_mask = (torch.triu(all_ones, 1+mlen)
+ torch.tril(all_ones, -mask_shift_len)
).bool()[:, :, None]
else:
dec_attn_mask = torch.triu(
word_emb.new_ones(qlen, klen),
diagonal=1+mlen).bool()[:, :, None]
hids = []
if self.attn_type == 0: # default
pos_seq = torch.arange(
klen-1, -1, -1.0, device=word_emb.device, dtype=word_emb.dtype)
if self.clamp_len > 0:
pos_seq.clamp_(max=self.clamp_len)
pos_emb = self.pos_emb(pos_seq)
pos_emb = self.drop(pos_emb)
core_out = self.drop(word_emb)
hids.append(core_out)
for i, layer in enumerate(self.layers):
mems_i = None if mems is None else mems[i]
core_out = layer(
core_out, pos_emb, self.r_w_bias, self.r_r_bias,
dec_attn_mask=dec_attn_mask, mems=mems_i)
hids.append(core_out)
elif self.attn_type == 1: # learnable
core_out = self.drop(word_emb)
hids.append(core_out)
for i, layer in enumerate(self.layers):
if self.clamp_len > 0:
r_emb = self.r_emb[i][-self.clamp_len:]
r_bias = self.r_bias[i][-self.clamp_len:]
else:
r_emb, r_bias = self.r_emb[i], self.r_bias[i]
mems_i = None if mems is None else mems[i]
core_out = layer(
core_out, r_emb, self.r_w_bias[i], r_bias,
dec_attn_mask=dec_attn_mask, mems=mems_i)
hids.append(core_out)
elif self.attn_type in [2, 4, 5, 6, 7, 10, 14, 16, 200]: # absolute
if self.no_pos:
core_out = self.drop(word_emb)
else:
pos_seq = torch.arange(klen-1, -1, -1.0,
device=word_emb.device,
dtype=word_emb.dtype)
# pos_seq = torch.arange(0, klen, device=word_emb.device,
# dtype=word_emb.dtype)
if self.clamp_len > 0:
pos_seq.clamp_(max=self.clamp_len)
pos_emb = self.pos_emb(pos_seq)
core_out = self.drop(word_emb + pos_emb[-qlen:])
hids.append(core_out)
for i, layer in enumerate(self.layers):
mems_i = None if mems is None else mems[i]
if mems_i is not None and i == 0:
mems_i += pos_emb[:mlen]
if self.attn_type == 5:
core_out = layer(core_out, dec_attn_mask=dec_attn_mask,
mems=mems_i, redraw=self.training)
else:
core_out = layer(core_out, dec_attn_mask=dec_attn_mask,
mems=mems_i)
hids.append(core_out)
elif self.attn_type in [24, 25, 26, 34, 35, 44, 45, 46, 300, 400, 500]: # absolute
if self.no_pos:
core_out = self.drop(word_emb)
else:
pos_seq = torch.arange(klen-1, -1, -1.0,
device=word_emb.device,
dtype=word_emb.dtype)
# pos_seq = torch.arange(0, klen, device=word_emb.device,
# dtype=word_emb.dtype)
if self.clamp_len > 0:
pos_seq.clamp_(max=self.clamp_len)
pos_emb = self.pos_emb(pos_seq)
core_out = self.drop(word_emb + pos_emb[-qlen:])
hids.append(core_out)
if carry_over_fast_weight:
new_mems = []
for i, layer in enumerate(self.layers):
mems_i = None if mems is None else mems[i]
if self.attn_type in [25, 35, 45]:
out = layer(
core_out, dec_attn_mask=dec_attn_mask,
mems=mems_i, redraw=self.training,
carry_over_fast_weight=carry_over_fast_weight)
else:
out = layer(
core_out, dec_attn_mask=dec_attn_mask, mems=mems_i,
carry_over_fast_weight=carry_over_fast_weight)
if carry_over_fast_weight:
core_out, new_fast_weight = out
new_mems.append(new_fast_weight)
else:
core_out = out
hids.append(core_out)
elif self.attn_type == 3:
core_out = self.drop(word_emb)
hids.append(core_out)
for i, layer in enumerate(self.layers):
mems_i = None if mems is None else mems[i]
if mems_i is not None and mlen > 0:
cur_emb = self.r_emb[i][:-qlen]
cur_size = cur_emb.size(0)
if cur_size < mlen:
cur_emb_pad = cur_emb[0:1].expand(
mlen-cur_size, -1, -1)
cur_emb = torch.cat([cur_emb_pad, cur_emb], 0)
else:
cur_emb = cur_emb[-mlen:]
mems_i += cur_emb.view(mlen, 1, -1)
core_out += self.r_emb[i][-qlen:].view(qlen, 1, -1)
core_out = layer(
core_out, dec_attn_mask=dec_attn_mask, mems=mems_i)
hids.append(core_out)
core_out = self.drop(core_out)
if not carry_over_fast_weight:
new_mems = self._update_mems(hids, mems, qlen, mlen)  # qlen, mlen in signature order
return core_out, new_mems
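The decoder attention mask built in `_forward` (`torch.triu(ones(qlen, klen), diagonal=1 + mlen)`) marks a key position as masked whenever it lies strictly in the query's future, after shifting by the `mlen` memory slots. A pure-Python sketch of the same pattern, with a hypothetical helper name:

```python
def causal_mask(qlen, mlen):
    """True = masked. Query i may attend to keys 0..i+mlen (the mlen
    cached memory slots plus the causal prefix); key j is masked when
    j > i + mlen, matching triu(ones(qlen, klen), diagonal=1 + mlen)."""
    klen = mlen + qlen
    return [[j > i + mlen for j in range(klen)] for i in range(qlen)]
```

For `qlen=3, mlen=2`, the first query row can only see the two memory slots and itself, while the last row sees all five key positions.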
def forward(self, data, target, *mems, softmax_keep_order=False,
carry_over_fast_weight=False):
# nn.DataParallel does not allow size(0) tensors to be broadcasted.
# So, have to initialize size(0) mems inside the model forward.
# Moreover, have to return new_mems to allow nn.DataParallel to piece
# them together.
if not mems:
mems = self.init_mems()
tgt_len = target.size(0)
hidden, new_mems = self._forward(
data, mems=mems, carry_over_fast_weight=carry_over_fast_weight)
pred_hid = hidden[-tgt_len:]
if self.sample_softmax > 0 and self.training:
assert self.tie_weight
logit = sample_logits(self.word_emb,
self.out_layer.bias, target, pred_hid, self.sampler)
loss = -F.log_softmax(logit, -1)[:, :, 0]
else:
loss = self.crit(pred_hid.reshape(-1, pred_hid.size(-1)),
target.reshape(-1),
keep_order=softmax_keep_order)
loss = loss.view(tgt_len, -1)
if new_mems is None:
return [loss]
else:
return [loss] + new_mems
def get_pi(self):
pi_list = []
if self.attn_type in [200, 300]:
for pi_indx in range(len(self.layers)):
pi_list.append(self.layers[pi_indx].dec_attn.pi)
return pi_list
def get_pi0(self):
pi_list = []
if self.attn_type in [200, 300]:
for pi_indx in range(len(self.layers)):
pi_list.append(self.layers[pi_indx].dec_attn.pi0)
return pi_list
def get_pi1(self):
pi_list = []
if self.attn_type in [200, 300]:
for pi_indx in range(len(self.layers)):
pi_list.append(self.layers[pi_indx].dec_attn.pi1)
return pi_list
def get_pi0_data(self):
pi_list = []
if self.attn_type in [200, 300]:
for pi_indx in range(len(self.layers)):
pi_list.append(self.layers[pi_indx].dec_attn.pi0.data)
return pi_list
def get_pi1_data(self):
pi_list = []
if self.attn_type in [200, 300]:
for pi_indx in range(len(self.layers)):
pi_list.append(self.layers[pi_indx].dec_attn.pi1.data)
return pi_list
def get_mu_diff(self):
md_list = []
if self.attn_type in [200, 300]:
for md_indx in range(len(self.layers)):
md_list.append(-torch.sum((self.layers[md_indx].dec_attn.mu[0] - self.layers[md_indx].dec_attn.mu[1])**2))
return md_list
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser(description='unit test')
parser.add_argument('--n_layer', type=int, default=4, help='')
parser.add_argument('--n_rel_layer', type=int, default=4, help='')
parser.add_argument('--n_head', type=int, default=2, help='')
parser.add_argument('--d_head', type=int, default=2, help='')
parser.add_argument('--d_model', type=int, default=200, help='')
parser.add_argument('--d_embed', type=int, default=200, help='')
parser.add_argument('--d_inner', type=int, default=200, help='')
parser.add_argument('--dropout', type=float, default=0.0, help='')
parser.add_argument('--cuda', action='store_true', help='')
parser.add_argument('--seed', type=int, default=1111, help='')
parser.add_argument('--multi_gpu', action='store_true', help='')
args = parser.parse_args()
device = torch.device("cuda" if args.cuda else "cpu")
B = 4
tgt_len, mem_len, ext_len = 36, 36, 0
data_len = tgt_len * 20
args.n_token = 10000
import data_utils
data = torch.LongTensor(data_len*B).random_(0, args.n_token).to(device)
diter = data_utils.LMOrderedIterator(
data, B, tgt_len, device=device, ext_len=ext_len)
cutoffs = [args.n_token // 2]
tie_projs = [False] + [True] * len(cutoffs)
for div_val in [1, 2]:
for d_embed in [200, 100]:
model = MemTransformerLM(
args.n_token, args.n_layer, args.n_head, args.d_model,
args.d_head, args.d_inner, args.dropout, dropatt=args.dropout,
tie_weight=True, d_embed=d_embed, div_val=div_val,
tie_projs=tie_projs, pre_lnorm=True, tgt_len=tgt_len,
ext_len=ext_len, mem_len=mem_len, cutoffs=cutoffs,
attn_type=0, update_mode='hard').to(device)
print(sum(p.numel() for p in model.parameters()))
mems = tuple()
for idx, (inp, tgt, seqlen) in enumerate(diter):
print('batch {}'.format(idx))
out = model(inp, tgt, *mems)
mems = out[1:]
# --- exopy_i3py/__init__.py (Exopy/exopy_i3py, BSD-3-Clause) ---
# -*- coding: utf-8 -*-
# -----------------------------------------------------------------------------
# Copyright 2018-2018 by ExopyI3py Authors, see AUTHORS for more details.
#
# Distributed under the terms of the BSD license.
#
# The full license is in the file LICENCE, distributed with this software.
# -----------------------------------------------------------------------------
"""Package allowing a seemless integration of I3py in Exopy.
"""
import enaml
def list_manifests():
"""List all the manifest contributed by the package.
"""
with enaml.imports():
from .instruments.manifest import I3pyInstrManifest
from .tasks.manifest import I3pyTaskManifest
return [I3pyInstrManifest, I3pyTaskManifest]
# --- usage/usage.py (m-lab/analysis, Apache-2.0) ---
"""Visualize usage stats for each server on M-Lab.
Generates frames of a movie, each of which visualizes a single day of activity
on the M-Lab platform. Each row is a tool running on the platform, and each
column is one of the servers. The cell is coloured according to the relative
usage.
"""
__author__ = 'dominich@google.com (Dominic Hamon)'
import gflags
import math
import numpy
import os
import sys
from scipy.misc import pilutil
FLAGS = gflags.FLAGS
gflags.DEFINE_string('input', 'usage_stats_servers.txt', 'Input filename', short_name = 'i')
gflags.DEFINE_string('output', 'usage', 'Output prefix', short_name = 'o')
gflags.MarkFlagAsRequired('input')
gflags.MarkFlagAsRequired('output')
stats = {}
max_test_count = {}
num_tools = 0
num_servers = 0
tools = []
servers = []
def read_counts():
global stats
global max_test_count
global num_tools
global num_servers
rows = [row.strip() for row in open(FLAGS.input)]
rows.pop(0)
for row in rows:
# print 'row: %s' % row
parts = row.split(',')
date = parts[2]
tool_name = parts[0]
server_name = parts[1]
test_count = float(parts[3])
if not date in stats:
stats[date] = {}
if not tool_name in stats[date]:
stats[date][tool_name] = {}
stats[date][tool_name][server_name] = test_count
if not tool_name in tools:
# print 'Adding tool %s' % tool_name
tools.append(tool_name)
if not server_name in servers:
# print 'Adding server %s' % server_name
servers.append(server_name)
if date in max_test_count and tool_name in max_test_count[date]:
max_test_count[date][tool_name] = max(test_count, max_test_count[date][tool_name])
else:
if not date in max_test_count:
max_test_count[date] = {}
max_test_count[date][tool_name] = test_count
def main():
global stats
try:
FLAGS(sys.argv)
except gflags.FlagsError, err:
print '%s\nUsage: %s ARGS\n%s' % (err, sys.argv[0], FLAGS)
sys.exit(1)
read_counts()
print 'Completed reading input'
for date in stats.keys():
# note: number of rows is first.
# Initialize with ones so we start all white
array = numpy.ones((len(tools), len(servers), 3), dtype = numpy.float)
y = 0
x = 0
for tool in tools:
if tool in stats[date].keys() and max_test_count[date][tool] != 0:
for server in servers:
if server in stats[date][tool].keys():
color = stats[date][tool][server] / max_test_count[date][tool]
array[y, x] = [1.0, 1.0 - color, 1.0 - color]
x += 1
x = 0
y += 1
if not os.path.exists('out'):
os.makedirs('out')
output = 'out/' + FLAGS.output + '.' + date + '.bmp'
sys.stdout.write('Saving usage to %s\r' % output)
sys.stdout.flush()
resized_array = pilutil.imresize(array, (10*len(tools), 10*len(servers)),
'nearest')
pilutil.imsave(output, resized_array)
print '\nDone'
if __name__ == '__main__':
main()
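The per-cell colouring in `main()` above normalises each tool's daily count by that tool's maximum and fades from white (no usage) to red (peak usage). A minimal sketch of that mapping, with a hypothetical helper name:

```python
def cell_color(count, max_count):
    """RGB triple in [0, 1]: white for zero usage, fully saturated red at
    the per-tool daily maximum, matching [1.0, 1.0 - c, 1.0 - c] in main()."""
    c = count / max_count
    return [1.0, 1.0 - c, 1.0 - c]
```

Halfway between zero and the maximum therefore renders as a half-saturated pink, `[1.0, 0.5, 0.5]`.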
# --- app/pubmed/__init__.py (aaronnorrish/PubMedConnections, CC-BY-4.0) ---
"""
Entrez is an API that provides access to many databases, with most databases
dealing with the biomedical and molecular fields.
In this project we are concerned with the PubMed and the PubMed Central
databases that hold biomedical literature.
See:
- Links to API documentation & examples: https://www.ncbi.nlm.nih.gov/pmc/tools/developers/
- Documentation about the E-utilities APIs: https://www.ncbi.nlm.nih.gov/books/NBK25501/
- List of the databases available through Entrez: https://www.ncbi.nlm.nih.gov/books/NBK3837/
"""
# --- instagram/models.py (LewisNjagi/instagram, MIT) ---
from django.db import models
from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver
from django.core.exceptions import ObjectDoesNotExist
# Create your models here.
# likes = models.ManyToManyField(User, related_name='likes')
class Image(models.Model):
user = models.ForeignKey('Profile', on_delete=models.CASCADE, related_name='images')
image = models.ImageField(upload_to = 'gallery/', null=True, blank=True)
name = models.CharField(max_length=30)
caption = models.CharField(max_length=30)
class Meta:
ordering = ["-pk"]
@classmethod
def images(cls):
images = cls.objects.all()
return images
def image_url(self):
if self.image and hasattr(self.image, 'url'):
return self.image.url
def save_image(self):
self.save()
def delete_image(self):
self.delete()
@classmethod
def update_image(cls,old,new):
cap = Image.objects.filter(caption=old).update(caption=new)
return cap
def __str__(self):
return self.name
class Profile(models.Model):
user = models.OneToOneField(User, on_delete=models.CASCADE, related_name='profile',null=True)
photo = models.ImageField(upload_to = 'gallery/', null=True, blank=True, default='download.jpeg')
bio = models.CharField(max_length=300)
name = models.CharField(blank=True, max_length=120)
@receiver(post_save, sender=User)
def create_user_profile(sender, instance, created, **kwargs):
try:
instance.profile.save()
except ObjectDoesNotExist:
Profile.objects.create(user=instance)
@receiver(post_save, sender=User)
def save_user_profile(sender, instance, **kwargs):
instance.profile.save()
@classmethod
def profile(cls):
profiles = cls.objects.all()
return profiles
def photo_url(self):
if self.photo and hasattr(self.photo, 'url'):
return self.photo.url
def save_profile(self):
self.save()
def __str__(self):
return self.name
@classmethod
def search_profile(cls, name):
return cls.objects.filter(user__username__icontains=name).all()
class Follow(models.Model):
follower = models.ForeignKey(Profile, on_delete=models.CASCADE, related_name='following')
followed = models.ForeignKey(Profile, on_delete=models.CASCADE, related_name='followers')
def __str__(self):
return f'{self.follower} Follow'
class Comment(models.Model):
comment = models.TextField()
user = models.ForeignKey('Profile',on_delete=models.CASCADE,related_name='comment')
photo = models.ForeignKey('Image',on_delete=models.CASCADE,related_name='comment')
class Meta:
ordering = ["-pk"]
def __str__(self):
return f'{self.user.name} Image'
# --- src/poetry/console/command_loader.py (hadialqattan/poetry, MIT) ---
from __future__ import annotations
from typing import Callable
from cleo.exceptions import LogicException
from cleo.loaders.factory_command_loader import FactoryCommandLoader
class CommandLoader(FactoryCommandLoader):
def register_factory(self, command_name: str, factory: Callable) -> None:
if command_name in self._factories:
raise LogicException(f'The command "{command_name}" already exists.')
self._factories[command_name] = factory
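The duplicate-name guard in `register_factory` above is the whole point of this subclass: command factories are lazily constructed, and registering the same name twice is an error. A self-contained, dependency-free sketch of that pattern (all names here are hypothetical; the real class delegates storage to cleo's `FactoryCommandLoader`):

```python
class FactoryLoader:
    """Maps command names to zero-argument factories; duplicates are rejected."""
    def __init__(self):
        self._factories = {}

    def register_factory(self, command_name, factory):
        if command_name in self._factories:
            raise ValueError(f'The command "{command_name}" already exists.')
        self._factories[command_name] = factory

    def resolve(self, command_name):
        # Commands are only constructed when first requested.
        return self._factories[command_name]()
```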
# --- packages/pyright-internal/src/tests/samples/final5.py (Jasha10/pyright, MIT) ---
# This sample tests that instance variables declared as Final within
# a dataclass do not need to have an explicit assignment because
# the generated __init__ method will assign them.
from dataclasses import dataclass
from typing import Final
class Foo1:
x: Final[int]
def __init__(self, x: int) -> None:
self.x = x
@dataclass
class Foo2:
x: Final[int]
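A quick runtime sketch of the same idea (separate from the type-checker sample above; the class name here is hypothetical): the dataclass-generated `__init__` performs the one permitted assignment to the `Final` attribute, so the class body needs no explicit value.

```python
from dataclasses import dataclass
from typing import Final

@dataclass
class Point:
    x: Final[int]  # assigned exactly once, by the generated __init__

p = Point(x=7)
```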
# --- Quizzo/questions.py (Tauseef-Hilal/Quizzo-GUI, MIT) ---
"""
questions.py
[1] Read questions from a JSON file.
[2] Set up a mechanism for sharing questions one by one with controller.py
"""
import json
from random import shuffle
from os.path import join
class Question:
"""Read question data"""
def __init__(self):
self.filename = join("Data", "question-data.json")
self.question_list = []
# Read the json data
with open(self.filename) as file:
self.question_list = json.load(file)
# Shuffle question list
shuffle(self.question_list)
def _getQuestion(self):
"""Send a question dict to the controller"""
return self.question_list.pop(0)
def __repr__(self):
return f"Question() => Keeps the list of all questions"
# --- nautobot/extras/tests/dummy_plugin/__init__.py (steffann/nautobot, Apache-2.0) ---
from nautobot.extras.plugins import PluginConfig
class DummyPluginConfig(PluginConfig):
name = "nautobot.extras.tests.dummy_plugin"
verbose_name = "Dummy plugin"
version = "0.0"
description = "For testing purposes only"
base_url = "dummy-plugin"
min_version = "0.9"
max_version = "9.0"
middleware = ["nautobot.extras.tests.dummy_plugin.middleware.DummyMiddleware"]
installed_apps = ["nautobot.extras.tests.dummy_plugin_dependency"]
default_settings = {
"dummy_default_key": "dummy_default_value",
}
config = DummyPluginConfig
e67dffce6020bd9bf9e0c1bd5c53b10bdbe4316c | 1,942 | py | Python | asv_bench/benchmarks/benchmarks_get_keepbits.py | observingClouds/bitinformation_pipeline | c91a21f1e5e2164f3c1e6e543fad9e2a2547ae8f | [
"MIT"
] | null | null | null | asv_bench/benchmarks/benchmarks_get_keepbits.py | observingClouds/bitinformation_pipeline | c91a21f1e5e2164f3c1e6e543fad9e2a2547ae8f | [
"MIT"
] | null | null | null | asv_bench/benchmarks/benchmarks_get_keepbits.py | observingClouds/bitinformation_pipeline | c91a21f1e5e2164f3c1e6e543fad9e2a2547ae8f | [
"MIT"
] | null | null | null | import numpy as np
import xarray as xr
from xbitinfo import get_keepbits
from . import _skip_slow, ensure_loaded, parameterized, randn, requires_dask
class GetKeepbits:
"""
Benchmark time and peak memory of `get_keepbits`.
"""
# https://asv.readthedocs.io/en/stable/benchmarks.html
timeout = 30.0
repeat = 3
number = 5
def setup(self, *args, **kwargs):
self.info_per_bit = {
"air": np.array(
[
0.00000000e00,
0.00000000e00,
0.00000000e00,
0.00000000e00,
0.00000000e00,
3.94447851e-01,
3.94447851e-01,
3.94447851e-01,
3.94447851e-01,
3.94447851e-01,
3.94310542e-01,
7.36739987e-01,
5.62682836e-01,
3.60511555e-01,
1.52471111e-01,
4.18818055e-02,
3.65276146e-03,
1.19975820e-05,
4.39366160e-05,
4.18329296e-05,
2.54572089e-05,
1.44121797e-04,
1.34144798e-03,
1.55468479e-06,
5.38601212e-04,
8.09862581e-04,
1.74893445e-04,
4.97915410e-05,
3.88027711e-04,
0.00000000e00,
3.95323228e-05,
6.88854435e-04,
]
)
}
def time_get_keepbits(self, **kwargs):
"""Take time for `get_keepbits`."""
get_keepbits(self.info_per_bit, **kwargs)
def peakmem_get_keepbits(self, **kwargs):
"""Take memory peak for `get_keepbits`."""
get_keepbits(self.info_per_bit, **kwargs)
| 29.424242 | 76 | 0.444387 | 187 | 1,942 | 4.508021 | 0.475936 | 0.104389 | 0.071174 | 0.077106 | 0.309609 | 0.250297 | 0.250297 | 0.250297 | 0.179122 | 0.179122 | 0 | 0.341954 | 0.46241 | 1,942 | 65 | 77 | 29.876923 | 0.465517 | 0.087539 | 0 | 0.254902 | 0 | 0 | 0.001718 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.078431 | 0 | 0.215686 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
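The benchmark class above follows the asv (airspeed velocity) naming convention: methods prefixed `time_` are timed, `peakmem_` methods are measured for peak memory, and `setup` runs before each measurement, with `timeout`, `repeat`, and `number` controlling the harness. A simplified, hypothetical runner illustrating that discovery-by-prefix convention (the real harness lives in the `asv` package and is considerably more careful):

```python
import time


class MiniRunner:
    """Discover and time ``time_*`` methods, mimicking asv's convention."""

    def run(self, bench_cls):
        results = {}
        inst = bench_cls()
        for attr in dir(inst):
            if attr.startswith("time_"):
                if hasattr(inst, "setup"):
                    inst.setup()          # asv also calls setup before each run
                start = time.perf_counter()
                getattr(inst, attr)()
                results[attr] = time.perf_counter() - start
        return results


class Example:
    def setup(self):
        self.data = list(range(1000))

    def time_sum(self):
        sum(self.data)


timings = MiniRunner().run(Example)  # {'time_sum': <elapsed seconds>}
```

In the real tool, `GetKeepbits.time_get_keepbits` and `peakmem_get_keepbits` would be picked up the same way, repeated `repeat * number` times.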
e67f5820d381e9eaf5ae3b886676bddc083f33ae | 386 | py | Python | line/admin.py | j3ygh/j3y-line-app | ce864d1da5161f4c667e3dd1802b1c413edffeb8 | [
"MIT"
] | 1 | 2020-09-03T00:09:57.000Z | 2020-09-03T00:09:57.000Z | line/admin.py | j3ygithub/j3y-line-app | ce864d1da5161f4c667e3dd1802b1c413edffeb8 | [
"MIT"
] | null | null | null | line/admin.py | j3ygithub/j3y-line-app | ce864d1da5161f4c667e3dd1802b1c413edffeb8 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import QNA, Follower
class QNAadmin(admin.ModelAdmin):
list_display = ['question', 'answer', 'learned_from', ]
admin.site.register(QNA, QNAadmin)
class Followeradmin(admin.ModelAdmin):
list_display = ['user_id', 'display_name', 'picture_url', 'status_message', 'timestamp', 'state', ]
admin.site.register(Follower, Followeradmin) | 35.090909 | 103 | 0.751295 | 46 | 386 | 6.152174 | 0.608696 | 0.106007 | 0.134276 | 0.183746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108808 | 386 | 11 | 104 | 35.090909 | 0.822674 | 0 | 0 | 0 | 0 | 0 | 0.217054 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
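The admin module above relies on `admin.site.register(Model, ModelAdmin)` maintaining a model-to-admin registry; in Django, registering the same model twice raises `AlreadyRegistered`. A simplified, dependency-free sketch of that registry pattern (not Django's actual implementation):

```python
class AlreadyRegistered(Exception):
    pass


class AdminSite:
    """Minimal model -> admin-class registry, in the spirit of django.contrib.admin."""

    def __init__(self):
        self._registry = {}

    def register(self, model, admin_class=None):
        if model in self._registry:
            raise AlreadyRegistered(f"{model.__name__} is already registered")
        self._registry[model] = admin_class


# Stand-ins for the QNA and Follower models from the module above.
class QNA: pass
class Follower: pass


site = AdminSite()
site.register(QNA)
site.register(Follower)
```

A second `site.register(QNA)` call would raise `AlreadyRegistered`, which is why each model appears exactly once in the original file.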
e6882e10e2d7bd654d59299e0f26d2a9157c2c0f | 60,897 | py | Python | pysnmp-with-texts/TPT-DDOS-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/TPT-DDOS-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/TPT-DDOS-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module TPT-DDOS-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/TPT-DDOS-MIB
# Produced by pysmi-0.3.4 at Wed May 1 15:26:17 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsIntersection, ConstraintsUnion, ValueRangeConstraint, ValueSizeConstraint, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsIntersection", "ConstraintsUnion", "ValueRangeConstraint", "ValueSizeConstraint", "SingleValueConstraint")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
NotificationType, Unsigned32, Counter32, Bits, ModuleIdentity, TimeTicks, Gauge32, ObjectIdentity, IpAddress, Integer32, iso, MibIdentifier, Counter64, MibScalar, MibTable, MibTableRow, MibTableColumn = mibBuilder.importSymbols("SNMPv2-SMI", "NotificationType", "Unsigned32", "Counter32", "Bits", "ModuleIdentity", "TimeTicks", "Gauge32", "ObjectIdentity", "IpAddress", "Integer32", "iso", "MibIdentifier", "Counter64", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
tpt_tpa_objs, = mibBuilder.importSymbols("TPT-TPAMIBS-MIB", "tpt-tpa-objs")
tpt_ddos = ModuleIdentity((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9)).setLabel("tpt-ddos")
tpt_ddos.setRevisions(('2016-05-25 18:54',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
if mibBuilder.loadTexts: tpt_ddos.setRevisionsDescriptions(('Updated copyright information. Minor MIB syntax fixes.',))
if mibBuilder.loadTexts: tpt_ddos.setLastUpdated('201605251854Z')
if mibBuilder.loadTexts: tpt_ddos.setOrganization('Trend Micro, Inc.')
if mibBuilder.loadTexts: tpt_ddos.setContactInfo('www.trendmicro.com')
if mibBuilder.loadTexts: tpt_ddos.setDescription("DDoS management (statistics). Copyright (C) 2016 Trend Micro Incorporated. All Rights Reserved. Trend Micro makes no warranty of any kind with regard to this material, including, but not limited to, the implied warranties of merchantability and fitness for a particular purpose. Trend Micro shall not be liable for errors contained herein or for incidental or consequential damages in connection with the furnishing, performance, or use of this material. This document contains proprietary information, which is protected by copyright. No part of this document may be photocopied, reproduced, or translated into another language without the prior written consent of Trend Micro. The information is provided 'as is' without warranty of any kind and is subject to change without notice. The only warranties for Trend Micro products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. Trend Micro shall not be liable for technical or editorial errors or omissions contained herein. TippingPoint(R), the TippingPoint logo, and Digital Vaccine(R) are registered trademarks of Trend Micro. All other company and product names may be trademarks of their respective holders. All rights reserved. This document contains confidential information, trade secrets or both, which are the property of Trend Micro. No part of this documentation may be reproduced in any form or by any means or used to make any derivative work (such as translation, transformation, or adaptation) without written permission from Trend Micro or one of its subsidiaries. All other company and product names may be trademarks of their respective holders. ")
rejectSynHistSecondsTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 5), )
if mibBuilder.loadTexts: rejectSynHistSecondsTable.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistSecondsTable.setDescription('Historical (sampled) data every second for a minute.')
rejectSynHistSecondsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 5, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectSynHistSecondsGlobalID"), (0, "TPT-DDOS-MIB", "rejectSynHistSecondsIndex"))
if mibBuilder.loadTexts: rejectSynHistSecondsEntry.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistSecondsEntry.setDescription('An entry in the rejected SYNs per second history seconds table. Rows cannot be created or deleted. ')
rejectSynHistSecondsGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 5, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectSynHistSecondsGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistSecondsGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectSynHistSecondsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 5, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectSynHistSecondsIndex.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistSecondsIndex.setDescription('The index (0-59) of the second.')
rejectSynHistSecondsUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 5, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistSecondsUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistSecondsUnitCount.setDescription('The count of filter-specific units matching the criteria for this filter in the specified second.')
rejectSynHistSecondsTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 5, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistSecondsTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistSecondsTimestamp.setDescription('The time SecondsUnitCount was updated (in seconds since January 1, 1970).')
rejectSynHistMinutesTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 6), )
if mibBuilder.loadTexts: rejectSynHistMinutesTable.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistMinutesTable.setDescription('Historical (sampled) data every minute for an hour.')
rejectSynHistMinutesEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 6, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectSynHistMinutesGlobalID"), (0, "TPT-DDOS-MIB", "rejectSynHistMinutesIndex"))
if mibBuilder.loadTexts: rejectSynHistMinutesEntry.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistMinutesEntry.setDescription('An entry in the rejected SYNs per second history minutes table. Rows cannot be created or deleted. ')
rejectSynHistMinutesGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 6, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectSynHistMinutesGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistMinutesGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectSynHistMinutesIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 6, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectSynHistMinutesIndex.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistMinutesIndex.setDescription('The index (0-59) of the minute.')
rejectSynHistMinutesUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 6, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistMinutesUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistMinutesUnitCount.setDescription('The average of the SecondsUnitCount values corresponding to this minute.')
rejectSynHistMinutesTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 6, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistMinutesTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistMinutesTimestamp.setDescription('The time MinutesUnitCount was updated (in seconds since January 1, 1970).')
rejectSynHistHoursTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 7), )
if mibBuilder.loadTexts: rejectSynHistHoursTable.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistHoursTable.setDescription('Historical (sampled) data every hour for a day.')
rejectSynHistHoursEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 7, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectSynHistHoursGlobalID"), (0, "TPT-DDOS-MIB", "rejectSynHistHoursIndex"))
if mibBuilder.loadTexts: rejectSynHistHoursEntry.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistHoursEntry.setDescription('An entry in the rejected SYNs per second history hours table. Rows cannot be created or deleted. ')
rejectSynHistHoursGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 7, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectSynHistHoursGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistHoursGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectSynHistHoursIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 7, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectSynHistHoursIndex.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistHoursIndex.setDescription('The index (0-23) of the hour.')
rejectSynHistHoursUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 7, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistHoursUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistHoursUnitCount.setDescription('The average of the MinutesUnitCount values corresponding to this hour.')
rejectSynHistHoursTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 7, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistHoursTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistHoursTimestamp.setDescription('The time HoursUnitCount was updated (in seconds since January 1, 1970).')
rejectSynHistDaysTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 8), )
if mibBuilder.loadTexts: rejectSynHistDaysTable.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistDaysTable.setDescription('Historical (sampled) data every day for 35 days.')
rejectSynHistDaysEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 8, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectSynHistDaysGlobalID"), (0, "TPT-DDOS-MIB", "rejectSynHistDaysIndex"))
if mibBuilder.loadTexts: rejectSynHistDaysEntry.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistDaysEntry.setDescription('An entry in the rejected SYNs per second history days table. Rows cannot be created or deleted. ')
rejectSynHistDaysGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 8, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectSynHistDaysGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistDaysGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectSynHistDaysIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 8, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectSynHistDaysIndex.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistDaysIndex.setDescription('The index (0-34) of the day.')
rejectSynHistDaysUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 8, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistDaysUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistDaysUnitCount.setDescription('The average of the HoursUnitCount values corresponding to this day.')
rejectSynHistDaysTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 8, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectSynHistDaysTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectSynHistDaysTimestamp.setDescription('The time DaysUnitCount was updated (in seconds since January 1, 1970).')
proxyConnHistSecondsTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 9), )
if mibBuilder.loadTexts: proxyConnHistSecondsTable.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistSecondsTable.setDescription('Historical (sampled) data every second for a minute.')
proxyConnHistSecondsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 9, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "proxyConnHistSecondsGlobalID"), (0, "TPT-DDOS-MIB", "proxyConnHistSecondsIndex"))
if mibBuilder.loadTexts: proxyConnHistSecondsEntry.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistSecondsEntry.setDescription('An entry in the proxied connections per second history seconds table. Rows cannot be created or deleted. ')
proxyConnHistSecondsGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 9, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: proxyConnHistSecondsGlobalID.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistSecondsGlobalID.setDescription('The global identifier of a DDoS filter group.')
proxyConnHistSecondsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 9, 1, 2), Unsigned32())
if mibBuilder.loadTexts: proxyConnHistSecondsIndex.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistSecondsIndex.setDescription('The index (0-59) of the second.')
proxyConnHistSecondsUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 9, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistSecondsUnitCount.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistSecondsUnitCount.setDescription('The count of filter-specific units matching the traffic criteria for this filter in the specified second.')
proxyConnHistSecondsTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 9, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistSecondsTimestamp.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistSecondsTimestamp.setDescription('The time SecondsUnitCount was updated (in seconds since January 1, 1970).')
proxyConnHistMinutesTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 10), )
if mibBuilder.loadTexts: proxyConnHistMinutesTable.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistMinutesTable.setDescription('Historical (sampled) data every minute for an hour.')
proxyConnHistMinutesEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 10, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "proxyConnHistMinutesGlobalID"), (0, "TPT-DDOS-MIB", "proxyConnHistMinutesIndex"))
if mibBuilder.loadTexts: proxyConnHistMinutesEntry.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistMinutesEntry.setDescription('An entry in the proxied connections per second history minutes table. Rows cannot be created or deleted. ')
proxyConnHistMinutesGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 10, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: proxyConnHistMinutesGlobalID.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistMinutesGlobalID.setDescription('The global identifier of a DDoS filter group.')
proxyConnHistMinutesIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 10, 1, 2), Unsigned32())
if mibBuilder.loadTexts: proxyConnHistMinutesIndex.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistMinutesIndex.setDescription('The index (0-59) of the minute.')
proxyConnHistMinutesUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 10, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistMinutesUnitCount.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistMinutesUnitCount.setDescription('The average of the SecondsUnitCount values corresponding to this minute.')
proxyConnHistMinutesTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 10, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistMinutesTimestamp.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistMinutesTimestamp.setDescription('The time MinutesUnitCount was updated (in seconds since January 1, 1970).')
proxyConnHistHoursTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 11), )
if mibBuilder.loadTexts: proxyConnHistHoursTable.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistHoursTable.setDescription('Historical (sampled) data every hour for a day.')
proxyConnHistHoursEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 11, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "proxyConnHistHoursGlobalID"), (0, "TPT-DDOS-MIB", "proxyConnHistHoursIndex"))
if mibBuilder.loadTexts: proxyConnHistHoursEntry.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistHoursEntry.setDescription('An entry in the proxied connections per second history hours table. Rows cannot be created or deleted. ')
proxyConnHistHoursGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 11, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: proxyConnHistHoursGlobalID.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistHoursGlobalID.setDescription('The global identifier of a DDoS filter group.')
proxyConnHistHoursIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 11, 1, 2), Unsigned32())
if mibBuilder.loadTexts: proxyConnHistHoursIndex.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistHoursIndex.setDescription('The index (0-23) of the hour.')
proxyConnHistHoursUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 11, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistHoursUnitCount.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistHoursUnitCount.setDescription('The average of the MinutesUnitCount values corresponding to this hour.')
proxyConnHistHoursTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 11, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistHoursTimestamp.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistHoursTimestamp.setDescription('The time HoursUnitCount was updated (in seconds since January 1, 1970).')
proxyConnHistDaysTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 12), )
if mibBuilder.loadTexts: proxyConnHistDaysTable.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistDaysTable.setDescription('Historical (sampled) data every day for 35 days.')
proxyConnHistDaysEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 12, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "proxyConnHistDaysGlobalID"), (0, "TPT-DDOS-MIB", "proxyConnHistDaysIndex"))
if mibBuilder.loadTexts: proxyConnHistDaysEntry.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistDaysEntry.setDescription('An entry in the proxied connections per second history days table. Rows cannot be created or deleted. ')
proxyConnHistDaysGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 12, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: proxyConnHistDaysGlobalID.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistDaysGlobalID.setDescription('The global identifier of a DDoS filter group.')
proxyConnHistDaysIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 12, 1, 2), Unsigned32())
if mibBuilder.loadTexts: proxyConnHistDaysIndex.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistDaysIndex.setDescription('The index (0-34) of the day.')
proxyConnHistDaysUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 12, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistDaysUnitCount.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistDaysUnitCount.setDescription('The average of the HoursUnitCount values corresponding to this day.')
proxyConnHistDaysTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 12, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: proxyConnHistDaysTimestamp.setStatus('current')
if mibBuilder.loadTexts: proxyConnHistDaysTimestamp.setDescription('The time DaysUnitCount was updated (in seconds since January 1, 1970).')
rejectCpsHistSecondsTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 15), )
if mibBuilder.loadTexts: rejectCpsHistSecondsTable.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistSecondsTable.setDescription('Historical (sampled) data every second for a minute.')
rejectCpsHistSecondsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 15, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectCpsHistSecondsGlobalID"), (0, "TPT-DDOS-MIB", "rejectCpsHistSecondsIndex"))
if mibBuilder.loadTexts: rejectCpsHistSecondsEntry.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistSecondsEntry.setDescription('An entry in the rejected connections per sec (CPS) history seconds table. Rows cannot be created or deleted. ')
rejectCpsHistSecondsGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 15, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectCpsHistSecondsGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistSecondsGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectCpsHistSecondsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 15, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectCpsHistSecondsIndex.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistSecondsIndex.setDescription('The index (0-59) of the second.')
rejectCpsHistSecondsUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 15, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistSecondsUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistSecondsUnitCount.setDescription('The count of filter-specific units matching the criteria for this filter in the specified second.')
rejectCpsHistSecondsTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 15, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistSecondsTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistSecondsTimestamp.setDescription('The time SecondsUnitCount was updated (in seconds since January 1, 1970).')
rejectCpsHistMinutesTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 16), )
if mibBuilder.loadTexts: rejectCpsHistMinutesTable.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistMinutesTable.setDescription('Historical (sampled) data every minute for an hour.')
rejectCpsHistMinutesEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 16, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectCpsHistMinutesGlobalID"), (0, "TPT-DDOS-MIB", "rejectCpsHistMinutesIndex"))
if mibBuilder.loadTexts: rejectCpsHistMinutesEntry.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistMinutesEntry.setDescription('An entry in the rejected connections per sec (CPS) history minutes table. Rows cannot be created or deleted. ')
rejectCpsHistMinutesGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 16, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectCpsHistMinutesGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistMinutesGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectCpsHistMinutesIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 16, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectCpsHistMinutesIndex.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistMinutesIndex.setDescription('The index (0-59) of the minute.')
rejectCpsHistMinutesUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 16, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistMinutesUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistMinutesUnitCount.setDescription('The average of the SecondsUnitCount values corresponding to this minute.')
rejectCpsHistMinutesTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 16, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistMinutesTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistMinutesTimestamp.setDescription('The time MinutesUnitCount was updated (in seconds since January 1, 1970).')
rejectCpsHistHoursTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 17), )
if mibBuilder.loadTexts: rejectCpsHistHoursTable.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistHoursTable.setDescription('Historical (sampled) data every hour for a day.')
rejectCpsHistHoursEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 17, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectCpsHistHoursGlobalID"), (0, "TPT-DDOS-MIB", "rejectCpsHistHoursIndex"))
if mibBuilder.loadTexts: rejectCpsHistHoursEntry.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistHoursEntry.setDescription('An entry in the rejected connections per sec (CPS) history hours table. Rows cannot be created or deleted. ')
rejectCpsHistHoursGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 17, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectCpsHistHoursGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistHoursGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectCpsHistHoursIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 17, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectCpsHistHoursIndex.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistHoursIndex.setDescription('The index (0-23) of the hour.')
rejectCpsHistHoursUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 17, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistHoursUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistHoursUnitCount.setDescription('The average of the MinutesUnitCount values corresponding to this hour.')
rejectCpsHistHoursTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 17, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistHoursTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistHoursTimestamp.setDescription('The time HoursUnitCount was updated (in seconds since January 1, 1970).')
rejectCpsHistDaysTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 18), )
if mibBuilder.loadTexts: rejectCpsHistDaysTable.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistDaysTable.setDescription('Historical (sampled) data every day for 35 days.')
rejectCpsHistDaysEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 18, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectCpsHistDaysGlobalID"), (0, "TPT-DDOS-MIB", "rejectCpsHistDaysIndex"))
if mibBuilder.loadTexts: rejectCpsHistDaysEntry.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistDaysEntry.setDescription('An entry in the rejected connections per sec (CPS) history days table. Rows cannot be created or deleted. ')
rejectCpsHistDaysGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 18, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectCpsHistDaysGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistDaysGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectCpsHistDaysIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 18, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectCpsHistDaysIndex.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistDaysIndex.setDescription('The index (0-34) of the day.')
rejectCpsHistDaysUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 18, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistDaysUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistDaysUnitCount.setDescription('The average of the HoursUnitCount values corresponding to this day.')
rejectCpsHistDaysTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 18, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectCpsHistDaysTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectCpsHistDaysTimestamp.setDescription('The time DaysUnitCount was updated (in seconds since January 1, 1970).')
acceptCpsHistSecondsTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 19), )
if mibBuilder.loadTexts: acceptCpsHistSecondsTable.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistSecondsTable.setDescription('Historical (sampled) data every second for a minute.')
acceptCpsHistSecondsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 19, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptCpsHistSecondsGlobalID"), (0, "TPT-DDOS-MIB", "acceptCpsHistSecondsIndex"))
if mibBuilder.loadTexts: acceptCpsHistSecondsEntry.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistSecondsEntry.setDescription('An entry in the accepted connections per sec (CPS) history seconds table. Rows cannot be created or deleted. ')
acceptCpsHistSecondsGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 19, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptCpsHistSecondsGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistSecondsGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptCpsHistSecondsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 19, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptCpsHistSecondsIndex.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistSecondsIndex.setDescription('The index (0-59) of the second.')
acceptCpsHistSecondsUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 19, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistSecondsUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistSecondsUnitCount.setDescription('The count of filter-specific units matching the criteria for this filter in the specified second.')
acceptCpsHistSecondsTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 19, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistSecondsTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistSecondsTimestamp.setDescription('The time SecondsUnitCount was updated (in seconds since January 1, 1970).')
acceptCpsHistMinutesTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 20), )
if mibBuilder.loadTexts: acceptCpsHistMinutesTable.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistMinutesTable.setDescription('Historical (sampled) data every minute for an hour.')
acceptCpsHistMinutesEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 20, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptCpsHistMinutesGlobalID"), (0, "TPT-DDOS-MIB", "acceptCpsHistMinutesIndex"))
if mibBuilder.loadTexts: acceptCpsHistMinutesEntry.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistMinutesEntry.setDescription('An entry in the accepted connections per sec (CPS) history minutes table. Rows cannot be created or deleted. ')
acceptCpsHistMinutesGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 20, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptCpsHistMinutesGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistMinutesGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptCpsHistMinutesIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 20, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptCpsHistMinutesIndex.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistMinutesIndex.setDescription('The index (0-59) of the minute.')
acceptCpsHistMinutesUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 20, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistMinutesUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistMinutesUnitCount.setDescription('The average of the SecondsUnitCount values corresponding to this minute.')
acceptCpsHistMinutesTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 20, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistMinutesTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistMinutesTimestamp.setDescription('The time MinutesUnitCount was updated (in seconds since January 1, 1970).')
acceptCpsHistHoursTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 21), )
if mibBuilder.loadTexts: acceptCpsHistHoursTable.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistHoursTable.setDescription('Historical (sampled) data every hour for a day.')
acceptCpsHistHoursEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 21, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptCpsHistHoursGlobalID"), (0, "TPT-DDOS-MIB", "acceptCpsHistHoursIndex"))
if mibBuilder.loadTexts: acceptCpsHistHoursEntry.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistHoursEntry.setDescription('An entry in the accepted connections per sec (CPS) history hours table. Rows cannot be created or deleted. ')
acceptCpsHistHoursGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 21, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptCpsHistHoursGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistHoursGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptCpsHistHoursIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 21, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptCpsHistHoursIndex.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistHoursIndex.setDescription('The index (0-23) of the hour.')
acceptCpsHistHoursUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 21, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistHoursUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistHoursUnitCount.setDescription('The average of the MinutesUnitCount values corresponding to this hour.')
acceptCpsHistHoursTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 21, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistHoursTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistHoursTimestamp.setDescription('The time HoursUnitCount was updated (in seconds since January 1, 1970).')
acceptCpsHistDaysTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 22), )
if mibBuilder.loadTexts: acceptCpsHistDaysTable.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistDaysTable.setDescription('Historical (sampled) data every day for 35 days.')
acceptCpsHistDaysEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 22, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptCpsHistDaysGlobalID"), (0, "TPT-DDOS-MIB", "acceptCpsHistDaysIndex"))
if mibBuilder.loadTexts: acceptCpsHistDaysEntry.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistDaysEntry.setDescription('An entry in the accepted connections per sec (CPS) history days table. Rows cannot be created or deleted. ')
acceptCpsHistDaysGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 22, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptCpsHistDaysGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistDaysGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptCpsHistDaysIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 22, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptCpsHistDaysIndex.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistDaysIndex.setDescription('The index (0-34) of the day.')
acceptCpsHistDaysUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 22, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistDaysUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistDaysUnitCount.setDescription('The average of the HoursUnitCount values corresponding to this day.')
acceptCpsHistDaysTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 22, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptCpsHistDaysTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptCpsHistDaysTimestamp.setDescription('The time DaysUnitCount was updated (in seconds since January 1, 1970).')
rejectEstHistSecondsTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 25), )
if mibBuilder.loadTexts: rejectEstHistSecondsTable.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistSecondsTable.setDescription('Historical (sampled) data every second for a minute.')
rejectEstHistSecondsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 25, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectEstHistSecondsGlobalID"), (0, "TPT-DDOS-MIB", "rejectEstHistSecondsIndex"))
if mibBuilder.loadTexts: rejectEstHistSecondsEntry.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistSecondsEntry.setDescription('An entry in the rejected established connections (EST) history seconds table. Rows cannot be created or deleted.')
rejectEstHistSecondsGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 25, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectEstHistSecondsGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistSecondsGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectEstHistSecondsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 25, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectEstHistSecondsIndex.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistSecondsIndex.setDescription('The index (0-59) of the second.')
rejectEstHistSecondsUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 25, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistSecondsUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistSecondsUnitCount.setDescription('The count of filter-specific units matching the criteria for this filter in the specified second.')
rejectEstHistSecondsTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 25, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistSecondsTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistSecondsTimestamp.setDescription('The time SecondsUnitCount was updated (in seconds since January 1, 1970).')
rejectEstHistMinutesTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 26), )
if mibBuilder.loadTexts: rejectEstHistMinutesTable.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistMinutesTable.setDescription('Historical (sampled) data every minute for an hour.')
rejectEstHistMinutesEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 26, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectEstHistMinutesGlobalID"), (0, "TPT-DDOS-MIB", "rejectEstHistMinutesIndex"))
if mibBuilder.loadTexts: rejectEstHistMinutesEntry.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistMinutesEntry.setDescription('An entry in the rejected established connections (EST) history minutes table. Rows cannot be created or deleted.')
rejectEstHistMinutesGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 26, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectEstHistMinutesGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistMinutesGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectEstHistMinutesIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 26, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectEstHistMinutesIndex.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistMinutesIndex.setDescription('The index (0-59) of the minute.')
rejectEstHistMinutesUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 26, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistMinutesUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistMinutesUnitCount.setDescription('The average of the SecondsUnitCount values corresponding to this minute.')
rejectEstHistMinutesTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 26, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistMinutesTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistMinutesTimestamp.setDescription('The time MinutesUnitCount was updated (in seconds since January 1, 1970).')
rejectEstHistHoursTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 27), )
if mibBuilder.loadTexts: rejectEstHistHoursTable.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistHoursTable.setDescription('Historical (sampled) data every hour for a day.')
rejectEstHistHoursEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 27, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectEstHistHoursGlobalID"), (0, "TPT-DDOS-MIB", "rejectEstHistHoursIndex"))
if mibBuilder.loadTexts: rejectEstHistHoursEntry.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistHoursEntry.setDescription('An entry in the rejected established connections (EST) history hours table. Rows cannot be created or deleted.')
rejectEstHistHoursGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 27, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectEstHistHoursGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistHoursGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectEstHistHoursIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 27, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectEstHistHoursIndex.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistHoursIndex.setDescription('The index (0-23) of the hour.')
rejectEstHistHoursUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 27, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistHoursUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistHoursUnitCount.setDescription('The average of the MinutesUnitCount values corresponding to this hour.')
rejectEstHistHoursTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 27, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistHoursTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistHoursTimestamp.setDescription('The time HoursUnitCount was updated (in seconds since January 1, 1970).')
rejectEstHistDaysTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 28), )
if mibBuilder.loadTexts: rejectEstHistDaysTable.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistDaysTable.setDescription('Historical (sampled) data every day for 35 days.')
rejectEstHistDaysEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 28, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "rejectEstHistDaysGlobalID"), (0, "TPT-DDOS-MIB", "rejectEstHistDaysIndex"))
if mibBuilder.loadTexts: rejectEstHistDaysEntry.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistDaysEntry.setDescription('An entry in the rejected established connections (EST) history days table. Rows cannot be created or deleted.')
rejectEstHistDaysGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 28, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: rejectEstHistDaysGlobalID.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistDaysGlobalID.setDescription('The global identifier of a DDoS filter group.')
rejectEstHistDaysIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 28, 1, 2), Unsigned32())
if mibBuilder.loadTexts: rejectEstHistDaysIndex.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistDaysIndex.setDescription('The index (0-34) of the day.')
rejectEstHistDaysUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 28, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistDaysUnitCount.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistDaysUnitCount.setDescription('The average of the HoursUnitCount values corresponding to this day.')
rejectEstHistDaysTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 28, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rejectEstHistDaysTimestamp.setStatus('current')
if mibBuilder.loadTexts: rejectEstHistDaysTimestamp.setDescription('The time DaysUnitCount was updated (in seconds since January 1, 1970).')
acceptEstHistSecondsTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 29), )
if mibBuilder.loadTexts: acceptEstHistSecondsTable.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistSecondsTable.setDescription('Historical (sampled) data every second for a minute.')
acceptEstHistSecondsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 29, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptEstHistSecondsGlobalID"), (0, "TPT-DDOS-MIB", "acceptEstHistSecondsIndex"))
if mibBuilder.loadTexts: acceptEstHistSecondsEntry.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistSecondsEntry.setDescription('An entry in the accepted established connections (EST) history seconds table. Rows cannot be created or deleted.')
acceptEstHistSecondsGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 29, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptEstHistSecondsGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistSecondsGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptEstHistSecondsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 29, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptEstHistSecondsIndex.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistSecondsIndex.setDescription('The index (0-59) of the second.')
acceptEstHistSecondsUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 29, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistSecondsUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistSecondsUnitCount.setDescription('The count of filter-specific units matching the criteria for this filter in the specified second.')
acceptEstHistSecondsTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 29, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistSecondsTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistSecondsTimestamp.setDescription('The time SecondsUnitCount was updated (in seconds since January 1, 1970).')
acceptEstHistMinutesTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 30), )
if mibBuilder.loadTexts: acceptEstHistMinutesTable.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistMinutesTable.setDescription('Historical (sampled) data every minute for an hour.')
acceptEstHistMinutesEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 30, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptEstHistMinutesGlobalID"), (0, "TPT-DDOS-MIB", "acceptEstHistMinutesIndex"))
if mibBuilder.loadTexts: acceptEstHistMinutesEntry.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistMinutesEntry.setDescription('An entry in the accepted established connections (EST) history minutes table. Rows cannot be created or deleted.')
acceptEstHistMinutesGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 30, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptEstHistMinutesGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistMinutesGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptEstHistMinutesIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 30, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptEstHistMinutesIndex.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistMinutesIndex.setDescription('The index (0-59) of the minute.')
acceptEstHistMinutesUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 30, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistMinutesUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistMinutesUnitCount.setDescription('The average of the SecondsUnitCount values corresponding to this minute.')
acceptEstHistMinutesTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 30, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistMinutesTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistMinutesTimestamp.setDescription('The time MinutesUnitCount was updated (in seconds since January 1, 1970).')
acceptEstHistHoursTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 31), )
if mibBuilder.loadTexts: acceptEstHistHoursTable.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistHoursTable.setDescription('Historical (sampled) data every hour for a day.')
acceptEstHistHoursEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 31, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptEstHistHoursGlobalID"), (0, "TPT-DDOS-MIB", "acceptEstHistHoursIndex"))
if mibBuilder.loadTexts: acceptEstHistHoursEntry.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistHoursEntry.setDescription('An entry in the accepted established connections (EST) history hours table. Rows cannot be created or deleted.')
acceptEstHistHoursGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 31, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptEstHistHoursGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistHoursGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptEstHistHoursIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 31, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptEstHistHoursIndex.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistHoursIndex.setDescription('The index (0-23) of the hour.')
acceptEstHistHoursUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 31, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistHoursUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistHoursUnitCount.setDescription('The average of the MinutesUnitCount values corresponding to this hour.')
acceptEstHistHoursTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 31, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistHoursTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistHoursTimestamp.setDescription('The time HoursUnitCount was updated (in seconds since January 1, 1970).')
acceptEstHistDaysTable = MibTable((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 32), )
if mibBuilder.loadTexts: acceptEstHistDaysTable.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistDaysTable.setDescription('Historical (sampled) data every day for 35 days.')
acceptEstHistDaysEntry = MibTableRow((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 32, 1), ).setIndexNames((0, "TPT-DDOS-MIB", "acceptEstHistDaysGlobalID"), (0, "TPT-DDOS-MIB", "acceptEstHistDaysIndex"))
if mibBuilder.loadTexts: acceptEstHistDaysEntry.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistDaysEntry.setDescription('An entry in the accepted established connections (EST) history days table. Rows cannot be created or deleted.')
acceptEstHistDaysGlobalID = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 32, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 40)))
if mibBuilder.loadTexts: acceptEstHistDaysGlobalID.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistDaysGlobalID.setDescription('The global identifier of a DDoS filter group.')
acceptEstHistDaysIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 32, 1, 2), Unsigned32())
if mibBuilder.loadTexts: acceptEstHistDaysIndex.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistDaysIndex.setDescription('The index (0-34) of the day.')
acceptEstHistDaysUnitCount = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 32, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistDaysUnitCount.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistDaysUnitCount.setDescription('The average of the HoursUnitCount values corresponding to this day.')
acceptEstHistDaysTimestamp = MibTableColumn((1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 32, 1, 4), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: acceptEstHistDaysTimestamp.setStatus('current')
if mibBuilder.loadTexts: acceptEstHistDaysTimestamp.setDescription('The time DaysUnitCount was updated (in seconds since January 1, 1970).')
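# The column descriptions above define a roll-up cascade: each minute's
# UnitCount is the average of its 60 per-second samples, each hour's the
# average of its 60 per-minute values, and each day's the average of its
# 24 per-hour values. A minimal sketch of that aggregation (the helper name
# is illustrative, not part of the agent):

```python
def roll_up(samples):
    """Average the finer-grained UnitCount values for one coarser slot,
    truncated to an integer as a Counter64-valued column would report it."""
    return sum(samples) // len(samples) if samples else 0

# One minute of per-second counts -> the MinutesUnitCount for that minute.
seconds = [10] * 30 + [20] * 30
minute_unit_count = roll_up(seconds)  # 15
```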
mibBuilder.exportSymbols("TPT-DDOS-MIB", rejectEstHistMinutesTable=rejectEstHistMinutesTable, proxyConnHistHoursGlobalID=proxyConnHistHoursGlobalID, rejectCpsHistSecondsTable=rejectCpsHistSecondsTable, proxyConnHistDaysGlobalID=proxyConnHistDaysGlobalID, acceptEstHistDaysUnitCount=acceptEstHistDaysUnitCount, acceptCpsHistMinutesGlobalID=acceptCpsHistMinutesGlobalID, proxyConnHistHoursEntry=proxyConnHistHoursEntry, acceptCpsHistHoursIndex=acceptCpsHistHoursIndex, acceptEstHistDaysTimestamp=acceptEstHistDaysTimestamp, rejectEstHistDaysEntry=rejectEstHistDaysEntry, acceptEstHistSecondsTable=acceptEstHistSecondsTable, rejectEstHistHoursEntry=rejectEstHistHoursEntry, acceptEstHistDaysGlobalID=acceptEstHistDaysGlobalID, rejectEstHistSecondsTable=rejectEstHistSecondsTable, rejectEstHistSecondsEntry=rejectEstHistSecondsEntry, acceptCpsHistSecondsEntry=acceptCpsHistSecondsEntry, acceptCpsHistSecondsUnitCount=acceptCpsHistSecondsUnitCount, proxyConnHistSecondsUnitCount=proxyConnHistSecondsUnitCount, acceptCpsHistDaysTable=acceptCpsHistDaysTable, acceptEstHistMinutesIndex=acceptEstHistMinutesIndex, rejectSynHistHoursUnitCount=rejectSynHistHoursUnitCount, rejectEstHistDaysTable=rejectEstHistDaysTable, rejectSynHistMinutesEntry=rejectSynHistMinutesEntry, acceptEstHistDaysTable=acceptEstHistDaysTable, acceptCpsHistHoursEntry=acceptCpsHistHoursEntry, rejectEstHistMinutesEntry=rejectEstHistMinutesEntry, acceptCpsHistDaysIndex=acceptCpsHistDaysIndex, acceptEstHistHoursGlobalID=acceptEstHistHoursGlobalID, rejectEstHistHoursGlobalID=rejectEstHistHoursGlobalID, acceptEstHistSecondsTimestamp=acceptEstHistSecondsTimestamp, rejectCpsHistDaysUnitCount=rejectCpsHistDaysUnitCount, rejectEstHistSecondsUnitCount=rejectEstHistSecondsUnitCount, acceptEstHistSecondsGlobalID=acceptEstHistSecondsGlobalID, proxyConnHistHoursIndex=proxyConnHistHoursIndex, rejectEstHistSecondsGlobalID=rejectEstHistSecondsGlobalID, rejectSynHistMinutesTable=rejectSynHistMinutesTable, 
rejectSynHistSecondsIndex=rejectSynHistSecondsIndex, acceptCpsHistMinutesIndex=acceptCpsHistMinutesIndex, acceptEstHistMinutesTimestamp=acceptEstHistMinutesTimestamp, rejectEstHistDaysTimestamp=rejectEstHistDaysTimestamp, acceptCpsHistDaysGlobalID=acceptCpsHistDaysGlobalID, rejectCpsHistSecondsIndex=rejectCpsHistSecondsIndex, acceptCpsHistMinutesUnitCount=acceptCpsHistMinutesUnitCount, proxyConnHistMinutesUnitCount=proxyConnHistMinutesUnitCount, acceptCpsHistDaysEntry=acceptCpsHistDaysEntry, proxyConnHistMinutesEntry=proxyConnHistMinutesEntry, rejectCpsHistMinutesGlobalID=rejectCpsHistMinutesGlobalID, acceptEstHistHoursEntry=acceptEstHistHoursEntry, rejectEstHistHoursIndex=rejectEstHistHoursIndex, rejectSynHistMinutesGlobalID=rejectSynHistMinutesGlobalID, acceptCpsHistHoursTable=acceptCpsHistHoursTable, rejectEstHistDaysIndex=rejectEstHistDaysIndex, rejectSynHistSecondsTable=rejectSynHistSecondsTable, rejectEstHistSecondsTimestamp=rejectEstHistSecondsTimestamp, rejectSynHistDaysGlobalID=rejectSynHistDaysGlobalID, rejectEstHistHoursTimestamp=rejectEstHistHoursTimestamp, acceptCpsHistSecondsGlobalID=acceptCpsHistSecondsGlobalID, rejectCpsHistSecondsUnitCount=rejectCpsHistSecondsUnitCount, rejectSynHistHoursTimestamp=rejectSynHistHoursTimestamp, acceptEstHistMinutesTable=acceptEstHistMinutesTable, rejectSynHistDaysUnitCount=rejectSynHistDaysUnitCount, acceptEstHistMinutesUnitCount=acceptEstHistMinutesUnitCount, PYSNMP_MODULE_ID=tpt_ddos, rejectEstHistHoursUnitCount=rejectEstHistHoursUnitCount, rejectCpsHistSecondsEntry=rejectCpsHistSecondsEntry, proxyConnHistMinutesIndex=proxyConnHistMinutesIndex, rejectCpsHistHoursEntry=rejectCpsHistHoursEntry, rejectEstHistMinutesIndex=rejectEstHistMinutesIndex, acceptEstHistSecondsUnitCount=acceptEstHistSecondsUnitCount, rejectSynHistSecondsUnitCount=rejectSynHistSecondsUnitCount, rejectCpsHistMinutesIndex=rejectCpsHistMinutesIndex, acceptCpsHistMinutesTimestamp=acceptCpsHistMinutesTimestamp, 
rejectSynHistMinutesTimestamp=rejectSynHistMinutesTimestamp, rejectCpsHistMinutesTable=rejectCpsHistMinutesTable, rejectEstHistMinutesUnitCount=rejectEstHistMinutesUnitCount, rejectSynHistDaysTable=rejectSynHistDaysTable, proxyConnHistDaysTimestamp=proxyConnHistDaysTimestamp, acceptCpsHistSecondsTable=acceptCpsHistSecondsTable, acceptCpsHistDaysTimestamp=acceptCpsHistDaysTimestamp, acceptCpsHistSecondsTimestamp=acceptCpsHistSecondsTimestamp, acceptEstHistDaysIndex=acceptEstHistDaysIndex, rejectSynHistSecondsTimestamp=rejectSynHistSecondsTimestamp, rejectCpsHistMinutesEntry=rejectCpsHistMinutesEntry, rejectEstHistHoursTable=rejectEstHistHoursTable, rejectCpsHistMinutesTimestamp=rejectCpsHistMinutesTimestamp, proxyConnHistDaysIndex=proxyConnHistDaysIndex, acceptEstHistSecondsIndex=acceptEstHistSecondsIndex, rejectCpsHistMinutesUnitCount=rejectCpsHistMinutesUnitCount, proxyConnHistSecondsTable=proxyConnHistSecondsTable, acceptCpsHistDaysUnitCount=acceptCpsHistDaysUnitCount, rejectEstHistDaysUnitCount=rejectEstHistDaysUnitCount, rejectCpsHistHoursIndex=rejectCpsHistHoursIndex, proxyConnHistMinutesGlobalID=proxyConnHistMinutesGlobalID, tpt_ddos=tpt_ddos, proxyConnHistSecondsGlobalID=proxyConnHistSecondsGlobalID, rejectCpsHistHoursGlobalID=rejectCpsHistHoursGlobalID, proxyConnHistDaysUnitCount=proxyConnHistDaysUnitCount, acceptEstHistDaysEntry=acceptEstHistDaysEntry, rejectSynHistSecondsEntry=rejectSynHistSecondsEntry, acceptCpsHistHoursTimestamp=acceptCpsHistHoursTimestamp, rejectCpsHistSecondsGlobalID=rejectCpsHistSecondsGlobalID, rejectCpsHistHoursUnitCount=rejectCpsHistHoursUnitCount, proxyConnHistSecondsTimestamp=proxyConnHistSecondsTimestamp, acceptEstHistHoursUnitCount=acceptEstHistHoursUnitCount, rejectCpsHistDaysTable=rejectCpsHistDaysTable, rejectSynHistHoursIndex=rejectSynHistHoursIndex, proxyConnHistSecondsIndex=proxyConnHistSecondsIndex, acceptEstHistMinutesGlobalID=acceptEstHistMinutesGlobalID, acceptEstHistHoursTimestamp=acceptEstHistHoursTimestamp, 
rejectSynHistSecondsGlobalID=rejectSynHistSecondsGlobalID, rejectCpsHistDaysIndex=rejectCpsHistDaysIndex, rejectEstHistDaysGlobalID=rejectEstHistDaysGlobalID, rejectSynHistDaysEntry=rejectSynHistDaysEntry, rejectSynHistMinutesUnitCount=rejectSynHistMinutesUnitCount, rejectSynHistHoursEntry=rejectSynHistHoursEntry, proxyConnHistSecondsEntry=proxyConnHistSecondsEntry, rejectCpsHistHoursTable=rejectCpsHistHoursTable, rejectSynHistHoursTable=rejectSynHistHoursTable, rejectCpsHistDaysEntry=rejectCpsHistDaysEntry, acceptEstHistSecondsEntry=acceptEstHistSecondsEntry, rejectSynHistDaysTimestamp=rejectSynHistDaysTimestamp, rejectEstHistMinutesGlobalID=rejectEstHistMinutesGlobalID, acceptEstHistHoursIndex=acceptEstHistHoursIndex, rejectSynHistMinutesIndex=rejectSynHistMinutesIndex, rejectSynHistHoursGlobalID=rejectSynHistHoursGlobalID, rejectCpsHistHoursTimestamp=rejectCpsHistHoursTimestamp, proxyConnHistHoursTimestamp=proxyConnHistHoursTimestamp, acceptEstHistMinutesEntry=acceptEstHistMinutesEntry, proxyConnHistDaysEntry=proxyConnHistDaysEntry, acceptCpsHistSecondsIndex=acceptCpsHistSecondsIndex, rejectEstHistSecondsIndex=rejectEstHistSecondsIndex, proxyConnHistMinutesTable=proxyConnHistMinutesTable, rejectCpsHistDaysTimestamp=rejectCpsHistDaysTimestamp, proxyConnHistMinutesTimestamp=proxyConnHistMinutesTimestamp, rejectSynHistDaysIndex=rejectSynHistDaysIndex, proxyConnHistHoursUnitCount=proxyConnHistHoursUnitCount, rejectCpsHistSecondsTimestamp=rejectCpsHistSecondsTimestamp, acceptEstHistHoursTable=acceptEstHistHoursTable, proxyConnHistHoursTable=proxyConnHistHoursTable, proxyConnHistDaysTable=proxyConnHistDaysTable, acceptCpsHistMinutesTable=acceptCpsHistMinutesTable, acceptCpsHistHoursGlobalID=acceptCpsHistHoursGlobalID, acceptCpsHistHoursUnitCount=acceptCpsHistHoursUnitCount, rejectEstHistMinutesTimestamp=rejectEstHistMinutesTimestamp, acceptCpsHistMinutesEntry=acceptCpsHistMinutesEntry, rejectCpsHistDaysGlobalID=rejectCpsHistDaysGlobalID)
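# Each history table above is indexed by a variable-length OCTET STRING
# (the filter-group GlobalID) followed by an Unsigned32 slot index. Under
# SMIv2 rules, a variable-length string index contributes its length plus
# one sub-identifier per byte to the instance OID. A minimal sketch of
# building such an instance OID (the helper name is ours, not part of the
# MIB or pysnmp):

```python
def hist_row_oid(column_oid, global_id, slot):
    """Build the instance OID for one history-table cell.

    column_oid: tuple OID of the column (e.g. a ...UnitCount object).
    global_id:  the GlobalID string index (variable-length OCTET STRING,
                encoded as its length then one sub-id per byte).
    slot:       the Unsigned32 second/minute/hour/day index.
    """
    return column_oid + (len(global_id),) + tuple(global_id.encode()) + (slot,)

# Column OID of acceptCpsHistSecondsUnitCount, row for GlobalID "grp1"
# at second 5:
col = (1, 3, 6, 1, 4, 1, 10734, 3, 3, 2, 9, 19, 1, 3)
instance = hist_row_oid(col, "grp1", 5)
```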
| 133.253829 | 7,886 | 0.799021 | 6,713 | 60,897 | 7.246239 | 0.06912 | 0.07228 | 0.12649 | 0.011923 | 0.523004 | 0.406525 | 0.378896 | 0.377703 | 0.308897 | 0.256229 | 0 | 0.057934 | 0.087591 | 60,897 | 456 | 7,887 | 133.546053 | 0.81754 | 0.005222 | 0 | 0 | 0 | 0.037946 | 0.241291 | 0.020736 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.015625 | 0 | 0.015625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e68861ee650e637e33dfc25055f830d4d6b071a4 | 1,826 | py | Python | app/util/utils.py | TIHLDE/Lepton | 60ec0793381f1c1b222f305586e8c2d4345fb566 | [
"MIT"
] | 7 | 2021-03-04T18:49:12.000Z | 2021-03-08T18:25:51.000Z | app/util/utils.py | TIHLDE/Lepton | 60ec0793381f1c1b222f305586e8c2d4345fb566 | [
"MIT"
] | 251 | 2021-03-04T19:19:14.000Z | 2022-03-31T14:47:53.000Z | app/util/utils.py | tihlde/Lepton | 5cab3522c421b76373a5c25f49267cfaef7b826a | [
"MIT"
] | 3 | 2021-10-05T19:03:04.000Z | 2022-02-25T13:32:09.000Z | import logging
from datetime import datetime, timedelta
from functools import wraps
from django.conf import settings
from pytz import timezone as pytz_timezone
logger = logging.getLogger(__name__)
def getTimezone():
return pytz_timezone(settings.TIME_ZONE)
def yesterday():
return now() - timedelta(days=1)
def now():
return datetime.now(tz=getTimezone())
def datetime_format(date_time):
from django.template import Context, Template
    # Use a Django Template to format the date, since templates apply both
    # localization and timezone conversion automatically
return Template("{{ date_to_format }}").render(
Context(dict(date_to_format=date_time))
)
def midday(date_time):
    return date_time.replace(hour=12, minute=0, second=0)
def week_nr(date):
return date.isocalendar()[1]
def disable_for_loaddata(signal_handler):
"""
Disable signals for the 'loaddata' command
to avoid conflicts while loading fixtures.
"""
@wraps(signal_handler)
def wrapper(*args, **kwargs):
if kwargs.get("raw", False):
logger.info(f"Skipping signal for {args} {kwargs}")
return
signal_handler(*args, **kwargs)
return wrapper
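A framework-free sketch of the decorator above, assuming only that Django passes `raw=True` to signal handlers during `loaddata` (the logging call is dropped, and the names `on_save` and `calls` are illustrative, not part of the original module):

```python
from functools import wraps


def disable_for_loaddata(signal_handler):
    """Skip the handler when the caller passes raw=True (fixture loading)."""
    @wraps(signal_handler)
    def wrapper(*args, **kwargs):
        if kwargs.get("raw", False):
            return None
        return signal_handler(*args, **kwargs)
    return wrapper


calls = []


@disable_for_loaddata
def on_save(sender, **kwargs):
    calls.append(sender)


on_save("Model", raw=True)   # skipped: raw fixture load
on_save("Model", raw=False)  # runs normally
print(calls)  # → ['Model']
```

The check reads `kwargs`, so the decorated handler must receive `raw` as a keyword argument, which matches how Django dispatches `post_save`/`pre_save` signals.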
class CaseInsensitiveBooleanQueryParam:
value = None
def __init__(self, value):
if value is not None:
value = value.lower()
if value == "true":
self.value = True
elif value == "false":
self.value = False
def __bool__(self):
return bool(self.value)
def __str__(self):
return f"<{self.__class__.__name__} object ({self.value})"
def chunk_list(lst, n):
"""Chunk a list into smaller lists with a max-length of n"""
lst = list(lst)
for i in range(0, len(lst), n):
yield lst[i : i + n]
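The generator above can be exercised like this (the function is re-declared so the snippet is self-contained):

```python
def chunk_list(lst, n):
    """Chunk a list into smaller lists with a max-length of n."""
    lst = list(lst)
    for i in range(0, len(lst), n):
        yield lst[i : i + n]


# Any iterable works, since the input is materialized with list() first.
chunks = list(chunk_list(range(7), 3))
print(chunks)  # → [[0, 1, 2], [3, 4, 5], [6]]
```

The final chunk is simply shorter when `len(lst)` is not a multiple of `n`; slicing past the end of a list is safe in Python.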
| 22.825 | 107 | 0.653888 | 234 | 1,826 | 4.918803 | 0.452991 | 0.039096 | 0.024327 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006522 | 0.24425 | 1,826 | 79 | 108 | 23.113924 | 0.827536 | 0.133078 | 0 | 0 | 0 | 0 | 0.073813 | 0.016688 | 0 | 0 | 0 | 0 | 0 | 1 | 0.26087 | false | 0 | 0.130435 | 0.152174 | 0.652174 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
e688f601915a6fea0ff46df26e2d820a3ff361a6 | 654 | py | Python | fem/utilities/command_dispatcher/test.py | mjredmond/FEMApp | dd8cc53acf80d0a1bb83ce9c89bcfd51e85c6be8 | [
"MIT"
] | 1 | 2019-08-03T21:40:26.000Z | 2019-08-03T21:40:26.000Z | fem/utilities/command_dispatcher/test.py | mjredmond/FEMApp | dd8cc53acf80d0a1bb83ce9c89bcfd51e85c6be8 | [
"MIT"
] | null | null | null | fem/utilities/command_dispatcher/test.py | mjredmond/FEMApp | dd8cc53acf80d0a1bb83ce9c89bcfd51e85c6be8 | [
"MIT"
] | null | null | null | import inspect
def check_types(*args):
def check_args(func, *args2):
types = tuple(map(type, args2))
for i in range(len(types)):
            if not issubclass(types[i], args[i]):
raise TypeError("Argument types for %s%s do not match! %s" % (func.__name__, str(args), str(types)))
def add_args_checking(func):
def _func(*args2):
check_args(func, *args2)
return func(*args2)
return _func
return add_args_checking
class Dummy(object):
@check_types(object, int, int, float)
def func1(self, a, b, c):
print(a, b, c)
print(type((int, int, float))) | 21.096774 | 116 | 0.584098 | 91 | 654 | 4.043956 | 0.450549 | 0.097826 | 0.070652 | 0.097826 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012793 | 0.282875 | 654 | 31 | 117 | 21.096774 | 0.771855 | 0 | 0 | 0 | 0 | 0 | 0.061069 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.277778 | false | 0 | 0.055556 | 0 | 0.555556 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
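A minimal, runnable sketch of the type-checking decorator above, assuming the `issubclass` comparison of each argument's type against the declared type (the `scale` function is illustrative, not part of the original module):

```python
def check_types(*expected):
    """Reject a call whose positional argument types don't match `expected`."""
    def add_args_checking(func):
        def _func(*args2):
            actual = tuple(map(type, args2))
            for i in range(len(actual)):
                if not issubclass(actual[i], expected[i]):
                    raise TypeError(
                        "Argument types for %s%s do not match! %s"
                        % (func.__name__, str(expected), str(actual))
                    )
            return func(*args2)
        return _func
    return add_args_checking


@check_types(int, float)
def scale(a, b):
    return a * b


print(scale(2, 1.5))  # → 3.0
try:
    scale(2, "x")
except TypeError as exc:
    print("rejected:", exc)
```

Because the check uses `issubclass`, a `bool` is accepted where `int` is declared, but an `int` is rejected where `float` is declared; `isinstance(args2[i], expected[i])` would behave the same way and is the more common idiom.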
e69e2f4488986fd7cc1e34517c7adb66a11b4150 | 751 | py | Python | georiviere/maintenance/filters.py | Georiviere/Georiviere-admin | f59e1b979a758958a64899916b8b72d580128ee9 | [
"BSD-2-Clause"
] | 7 | 2021-11-05T14:52:25.000Z | 2022-03-24T21:18:02.000Z | georiviere/maintenance/filters.py | Georiviere/Georiviere-admin | f59e1b979a758958a64899916b8b72d580128ee9 | [
"BSD-2-Clause"
] | 57 | 2021-11-02T10:27:34.000Z | 2022-03-31T14:08:32.000Z | georiviere/maintenance/filters.py | Georiviere/Georiviere-admin | f59e1b979a758958a64899916b8b72d580128ee9 | [
"BSD-2-Clause"
] | 1 | 2021-12-05T14:55:42.000Z | 2021-12-05T14:55:42.000Z | from django_filters import CharFilter
from django.utils.translation import gettext_lazy as _
from mapentity.filters import MapEntityFilterSet, PythonPolygonFilter
from geotrek.zoning.filters import ZoningFilterSet
from georiviere.maintenance.models import Intervention
from georiviere.watershed.filters import WatershedFilterSet
class InterventionFilterSet(WatershedFilterSet, ZoningFilterSet, MapEntityFilterSet):
bbox = PythonPolygonFilter(field_name='geom')
name = CharFilter(label=_('Name'), lookup_expr='icontains')
class Meta(MapEntityFilterSet.Meta):
model = Intervention
fields = MapEntityFilterSet.Meta.fields + [
'name', 'intervention_type', 'disorders', 'stake', 'intervention_status'
]
| 37.55 | 85 | 0.780293 | 71 | 751 | 8.140845 | 0.549296 | 0.089965 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142477 | 751 | 19 | 86 | 39.526316 | 0.897516 | 0 | 0 | 0 | 0 | 0 | 0.094541 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
e6a25486802c0b7fd89290d2e22816d4deb362ca | 17,826 | py | Python | pysnmp/CNT251-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/CNT251-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/CNT251-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module CNT251-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CNT251-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:09:31 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, ConstraintsUnion, ValueSizeConstraint, SingleValueConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "ConstraintsUnion", "ValueSizeConstraint", "SingleValueConstraint", "ConstraintsIntersection")
cnt2CfgSystemProbe, = mibBuilder.importSymbols("CNT25-MIB", "cnt2CfgSystemProbe")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Bits, MibIdentifier, Unsigned32, IpAddress, ModuleIdentity, Integer32, iso, Counter64, NotificationType, Gauge32, Counter32, TimeTicks, MibScalar, MibTable, MibTableRow, MibTableColumn, ObjectIdentity = mibBuilder.importSymbols("SNMPv2-SMI", "Bits", "MibIdentifier", "Unsigned32", "IpAddress", "ModuleIdentity", "Integer32", "iso", "Counter64", "NotificationType", "Gauge32", "Counter32", "TimeTicks", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "ObjectIdentity")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
cnt2SysChassisType = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(2, 6, 12, 13, 14, 15, 16, 17))).clone(namedValues=NamedValues(("slot-2", 2), ("slot-6", 6), ("slot-12", 12), ("osg", 13), ("usg", 14), ("usd6", 15), ("usd12", 16), ("tm", 17)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysChassisType.setStatus('mandatory')
cnt2SysZachCardType = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("not-present", 1), ("rs232", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysZachCardType.setStatus('mandatory')
cnt2SysHmbFirmwareRevision = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysHmbFirmwareRevision.setStatus('mandatory')
cnt2SysScnrcVersion = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysScnrcVersion.setStatus('mandatory')
cnt2SysDatPresent = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("yes", 1), ("no", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysDatPresent.setStatus('mandatory')
cnt2SysCdRomPresent = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("yes", 1), ("no", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCdRomPresent.setStatus('mandatory')
cnt2SysProbeDateTime = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysProbeDateTime.setStatus('mandatory')
cnt2SysSlotCount = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysSlotCount.setStatus('mandatory')
cnt2SysPowerSupplyTable = MibTable((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 9), )
if mibBuilder.loadTexts: cnt2SysPowerSupplyTable.setStatus('mandatory')
cnt2SysPowerSupplyEntry = MibTableRow((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 9, 1), ).setIndexNames((0, "CNT251-MIB", "cnt2SysPowerSupplyIndex"))
if mibBuilder.loadTexts: cnt2SysPowerSupplyEntry.setStatus('mandatory')
cnt2SysPowerSupplyIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 9, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysPowerSupplyIndex.setStatus('mandatory')
cnt2SysPowerSupplyPresent = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 9, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("yes", 1), ("no", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysPowerSupplyPresent.setStatus('mandatory')
cnt2SysFanTable = MibTable((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 10), )
if mibBuilder.loadTexts: cnt2SysFanTable.setStatus('mandatory')
cnt2SysFanEntry = MibTableRow((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 10, 1), ).setIndexNames((0, "CNT251-MIB", "cnt2SysFanIndex"))
if mibBuilder.loadTexts: cnt2SysFanEntry.setStatus('mandatory')
cnt2SysFanIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 10, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysFanIndex.setStatus('mandatory')
cnt2SysFanPresent = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 10, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("yes", 1), ("no", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysFanPresent.setStatus('mandatory')
cnt2SysAdapterTable = MibTable((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11), )
if mibBuilder.loadTexts: cnt2SysAdapterTable.setStatus('mandatory')
cnt2SysAdapterEntry = MibTableRow((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1), ).setIndexNames((0, "CNT251-MIB", "cnt2SysAdapterIndex"))
if mibBuilder.loadTexts: cnt2SysAdapterEntry.setStatus('mandatory')
cnt2SysAdapterIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterIndex.setStatus('mandatory')
cnt2SysAdapterType = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("unknown", 1), ("absent", 2), ("sparc", 3), ("escon", 4), ("ppc", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterType.setStatus('mandatory')
cnt2SysAdapterName = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15))).clone(namedValues=NamedValues(("absent", 1), ("unknown", 2), ("zsp1", 3), ("zen1", 4), ("zap1", 5), ("zsp2", 6), ("zen2", 7), ("zap2", 8), ("zen3", 9), ("usg1", 10), ("usg2", 11), ("zap3", 12), ("zap4", 13), ("zen4", 14), ("o1x1", 15)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterName.setStatus('mandatory')
cnt2SysAdapterPartNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterPartNumber.setStatus('mandatory')
cnt2SysAdapterSerialNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterSerialNumber.setStatus('mandatory')
cnt2SysAdapterHostId = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 6), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterHostId.setStatus('mandatory')
cnt2SysAdapterBoardRevision = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterBoardRevision.setStatus('mandatory')
cnt2SysAdapterFirmwareMajorRevision = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterFirmwareMajorRevision.setStatus('mandatory')
cnt2SysAdapterFirmwareMinorRevision = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterFirmwareMinorRevision.setStatus('mandatory')
cnt2SysAdapterHostName = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 10), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterHostName.setStatus('mandatory')
cnt2SysAdapterOsName = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 11), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterOsName.setStatus('mandatory')
cnt2SysAdapterOsMajorVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 12), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterOsMajorVersion.setStatus('mandatory')
cnt2SysAdapterOsMinorVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 13), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterOsMinorVersion.setStatus('mandatory')
cnt2SysAdapterServiceMonitorStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 11, 1, 14), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("none", 1), ("primary", 2), ("secondary", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysAdapterServiceMonitorStatus.setStatus('mandatory')
cnt2SysBusTable = MibTable((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 12), )
if mibBuilder.loadTexts: cnt2SysBusTable.setStatus('mandatory')
cnt2SysBusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 12, 1), ).setIndexNames((0, "CNT251-MIB", "cnt2SysBusAdapterIndex"), (0, "CNT251-MIB", "cnt2SysBusIndex"))
if mibBuilder.loadTexts: cnt2SysBusEntry.setStatus('mandatory')
cnt2SysBusAdapterIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 12, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysBusAdapterIndex.setStatus('mandatory')
cnt2SysBusIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 12, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysBusIndex.setStatus('mandatory')
cnt2SysBusType = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 12, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("unknown", 1), ("sbus", 2), ("pci", 3), ("vme", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysBusType.setStatus('mandatory')
cnt2SysCardTable = MibTable((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13), )
if mibBuilder.loadTexts: cnt2SysCardTable.setStatus('mandatory')
cnt2SysCardEntry = MibTableRow((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1), ).setIndexNames((0, "CNT251-MIB", "cnt2SysCardAdapterIndex"), (0, "CNT251-MIB", "cnt2SysCardBusIndex"), (0, "CNT251-MIB", "cnt2SysCardIndex"))
if mibBuilder.loadTexts: cnt2SysCardEntry.setStatus('mandatory')
cnt2SysCardAdapterIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardAdapterIndex.setStatus('mandatory')
cnt2SysCardBusIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardBusIndex.setStatus('mandatory')
cnt2SysCardIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardIndex.setStatus('mandatory')
cnt2SysCardFunction = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("unknown", 1), ("interface", 2), ("compression", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardFunction.setStatus('mandatory')
cnt2SysCardFirmwareMajorRevision = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardFirmwareMajorRevision.setStatus('mandatory')
cnt2SysCardFirmwareMinorRevision = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardFirmwareMinorRevision.setStatus('mandatory')
cnt2SysCardVendorOctetString = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardVendorOctetString.setStatus('mandatory')
cnt2SysCardVendorDisplayString = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 13, 1, 8), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysCardVendorDisplayString.setStatus('mandatory')
cnt2SysIfTable = MibTable((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14), )
if mibBuilder.loadTexts: cnt2SysIfTable.setStatus('mandatory')
cnt2SysIfEntry = MibTableRow((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1), ).setIndexNames((0, "CNT251-MIB", "cnt2SysIfAdapterIndex"), (0, "CNT251-MIB", "cnt2SysIfBusIndex"), (0, "CNT251-MIB", "cnt2SysIfIndex"))
if mibBuilder.loadTexts: cnt2SysIfEntry.setStatus('mandatory')
cnt2SysIfAdapterIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfAdapterIndex.setStatus('mandatory')
cnt2SysIfBusIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfBusIndex.setStatus('mandatory')
cnt2SysIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfIndex.setStatus('mandatory')
cnt2SysIfType = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13))).clone(namedValues=NamedValues(("unknown", 1), ("ethernetCsmacd", 2), ("async", 3), ("escon", 4), ("atm", 5), ("fibreChannel", 6), ("scsi-2", 7), ("scsi-3", 8), ("ds3", 9), ("fddi", 10), ("fastEther", 11), ("isdn", 12), ("gigabitEthernet", 13)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfType.setStatus('mandatory')
cnt2SysIfCardIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfCardIndex.setStatus('mandatory')
cnt2SysIfName = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 6), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfName.setStatus('mandatory')
cnt2SysIfConnector = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10))).clone(namedValues=NamedValues(("absent", 1), ("unknown", 2), ("micro-d15", 3), ("scsi-2", 4), ("scsi-3", 5), ("sc-duplex", 6), ("rj45", 7), ("bnc", 8), ("hssdc", 9), ("rsd-duplex", 10)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfConnector.setStatus('mandatory')
cnt2SysIfSnmpIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 14, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysIfSnmpIndex.setStatus('mandatory')
cnt2SysSerialNumber = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 15), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysSerialNumber.setStatus('mandatory')
cnt2SysOsVersion = MibScalar((1, 3, 6, 1, 4, 1, 333, 2, 5, 1, 16), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cnt2SysOsVersion.setStatus('mandatory')
mibBuilder.exportSymbols("CNT251-MIB", cnt2SysZachCardType=cnt2SysZachCardType, cnt2SysBusTable=cnt2SysBusTable, cnt2SysAdapterEntry=cnt2SysAdapterEntry, cnt2SysCardFirmwareMinorRevision=cnt2SysCardFirmwareMinorRevision, cnt2SysFanIndex=cnt2SysFanIndex, cnt2SysChassisType=cnt2SysChassisType, cnt2SysFanPresent=cnt2SysFanPresent, cnt2SysIfEntry=cnt2SysIfEntry, cnt2SysIfConnector=cnt2SysIfConnector, cnt2SysBusType=cnt2SysBusType, cnt2SysFanEntry=cnt2SysFanEntry, cnt2SysAdapterType=cnt2SysAdapterType, cnt2SysCardBusIndex=cnt2SysCardBusIndex, cnt2SysSlotCount=cnt2SysSlotCount, cnt2SysAdapterTable=cnt2SysAdapterTable, cnt2SysPowerSupplyIndex=cnt2SysPowerSupplyIndex, cnt2SysAdapterName=cnt2SysAdapterName, cnt2SysProbeDateTime=cnt2SysProbeDateTime, cnt2SysCardIndex=cnt2SysCardIndex, cnt2SysAdapterFirmwareMinorRevision=cnt2SysAdapterFirmwareMinorRevision, cnt2SysIfAdapterIndex=cnt2SysIfAdapterIndex, cnt2SysOsVersion=cnt2SysOsVersion, cnt2SysIfType=cnt2SysIfType, cnt2SysPowerSupplyEntry=cnt2SysPowerSupplyEntry, cnt2SysCardEntry=cnt2SysCardEntry, cnt2SysAdapterOsMajorVersion=cnt2SysAdapterOsMajorVersion, cnt2SysCardVendorDisplayString=cnt2SysCardVendorDisplayString, cnt2SysCardVendorOctetString=cnt2SysCardVendorOctetString, cnt2SysCardFunction=cnt2SysCardFunction, cnt2SysDatPresent=cnt2SysDatPresent, cnt2SysAdapterFirmwareMajorRevision=cnt2SysAdapterFirmwareMajorRevision, cnt2SysIfBusIndex=cnt2SysIfBusIndex, cnt2SysPowerSupplyPresent=cnt2SysPowerSupplyPresent, cnt2SysAdapterHostId=cnt2SysAdapterHostId, cnt2SysAdapterBoardRevision=cnt2SysAdapterBoardRevision, cnt2SysIfName=cnt2SysIfName, cnt2SysCardFirmwareMajorRevision=cnt2SysCardFirmwareMajorRevision, cnt2SysAdapterSerialNumber=cnt2SysAdapterSerialNumber, cnt2SysFanTable=cnt2SysFanTable, cnt2SysBusIndex=cnt2SysBusIndex, cnt2SysIfSnmpIndex=cnt2SysIfSnmpIndex, cnt2SysAdapterOsMinorVersion=cnt2SysAdapterOsMinorVersion, cnt2SysAdapterIndex=cnt2SysAdapterIndex, 
cnt2SysAdapterServiceMonitorStatus=cnt2SysAdapterServiceMonitorStatus, cnt2SysBusAdapterIndex=cnt2SysBusAdapterIndex, cnt2SysSerialNumber=cnt2SysSerialNumber, cnt2SysPowerSupplyTable=cnt2SysPowerSupplyTable, cnt2SysCardTable=cnt2SysCardTable, cnt2SysAdapterHostName=cnt2SysAdapterHostName, cnt2SysScnrcVersion=cnt2SysScnrcVersion, cnt2SysBusEntry=cnt2SysBusEntry, cnt2SysCardAdapterIndex=cnt2SysCardAdapterIndex, cnt2SysIfIndex=cnt2SysIfIndex, cnt2SysHmbFirmwareRevision=cnt2SysHmbFirmwareRevision, cnt2SysAdapterOsName=cnt2SysAdapterOsName, cnt2SysIfCardIndex=cnt2SysIfCardIndex, cnt2SysCdRomPresent=cnt2SysCdRomPresent, cnt2SysIfTable=cnt2SysIfTable, cnt2SysAdapterPartNumber=cnt2SysAdapterPartNumber)
| 133.029851 | 2,633 | 0.755133 | 1,981 | 17,826 | 6.795053 | 0.113074 | 0.009509 | 0.013149 | 0.017532 | 0.435852 | 0.379467 | 0.331179 | 0.324939 | 0.232524 | 0.232449 | 0 | 0.094991 | 0.082857 | 17,826 | 133 | 2,634 | 134.030075 | 0.728363 | 0.017615 | 0 | 0 | 0 | 0 | 0.119008 | 0.007599 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e6b968245b82089e667eb2d923d7df182118e963 | 762 | py | Python | webapp/photos/models.py | hnarayanan/Stylist | ea7471d06955b0a8c386f444b936e43c591b8200 | [
"Unlicense"
] | null | null | null | webapp/photos/models.py | hnarayanan/Stylist | ea7471d06955b0a8c386f444b936e43c591b8200 | [
"Unlicense"
] | null | null | null | webapp/photos/models.py | hnarayanan/Stylist | ea7471d06955b0a8c386f444b936e43c591b8200 | [
"Unlicense"
] | null | null | null | from __future__ import unicode_literals
from django.db import models
from django.urls import reverse
class Style(models.Model):
image = models.ImageField(upload_to='styles/%Y/%m/%d/')
title = models.CharField(max_length=100)
def __unicode__(self):
return self.title
class Photo(models.Model):
image = models.ImageField(upload_to='uploads/%Y/%m/%d/')
title = models.CharField(max_length=100, null=True, blank=True)
style = models.ForeignKey(Style, on_delete=models.PROTECT)
is_highlighted = models.BooleanField(default=False)
class Meta:
ordering = ['-id']
def __unicode__(self):
return self.title
def get_absolute_url(self):
return reverse('photo:detail', kwargs={'pk': self.id})
| 25.4 | 67 | 0.694226 | 100 | 762 | 5.08 | 0.52 | 0.059055 | 0.062992 | 0.086614 | 0.409449 | 0.409449 | 0.295276 | 0.137795 | 0.137795 | 0 | 0 | 0.009615 | 0.181102 | 762 | 29 | 68 | 26.275862 | 0.804487 | 0 | 0 | 0.210526 | 0 | 0 | 0.065617 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.157895 | false | 0 | 0.157895 | 0.157895 | 0.947368 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
e6bccba67cb25497175a5a69101dce98be4eae74 | 3,108 | py | Python | openstack/tests/unit/compute/v2/test_hypervisor.py | catalinpopc/openstacksdk | adaf758076b0c74cf4bb55e88fdee7072764f5f3 | [
"Apache-2.0"
] | null | null | null | openstack/tests/unit/compute/v2/test_hypervisor.py | catalinpopc/openstacksdk | adaf758076b0c74cf4bb55e88fdee7072764f5f3 | [
"Apache-2.0"
] | null | null | null | openstack/tests/unit/compute/v2/test_hypervisor.py | catalinpopc/openstacksdk | adaf758076b0c74cf4bb55e88fdee7072764f5f3 | [
"Apache-2.0"
] | null | null | null | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from openstack.tests.unit import base
from openstack.compute.v2 import hypervisor
EXAMPLE = {
"status": "enabled",
"service": {
"host": "fake-mini",
"disabled_reason": None,
"id": 6
},
"vcpus_used": 0,
"hypervisor_type": "QEMU",
"local_gb_used": 0,
"vcpus": 8,
"hypervisor_hostname": "fake-mini",
"memory_mb_used": 512,
"memory_mb": 7980,
"current_workload": 0,
"state": "up",
"host_ip": "23.253.248.171",
"cpu_info": "some cpu info",
"running_vms": 0,
"free_disk_gb": 157,
"hypervisor_version": 2000000,
"disk_available_least": 140,
"local_gb": 157,
"free_ram_mb": 7468,
"id": 1
}
class TestHypervisor(base.TestCase):
def test_basic(self):
sot = hypervisor.Hypervisor()
self.assertEqual('hypervisor', sot.resource_key)
self.assertEqual('hypervisors', sot.resources_key)
self.assertEqual('/os-hypervisors', sot.base_path)
self.assertEqual('compute', sot.service.service_type)
self.assertTrue(sot.allow_get)
self.assertTrue(sot.allow_list)
def test_make_it(self):
sot = hypervisor.Hypervisor(**EXAMPLE)
self.assertEqual(EXAMPLE['id'], sot.id)
self.assertEqual(EXAMPLE['hypervisor_hostname'], sot.name)
self.assertEqual(EXAMPLE['state'], sot.state)
self.assertEqual(EXAMPLE['status'], sot.status)
self.assertEqual(EXAMPLE['service'], sot.service_details)
self.assertEqual(EXAMPLE['vcpus_used'], sot.vcpus_used)
self.assertEqual(EXAMPLE['hypervisor_type'], sot.hypervisor_type)
self.assertEqual(EXAMPLE['local_gb_used'], sot.local_disk_used)
self.assertEqual(EXAMPLE['vcpus'], sot.vcpus)
self.assertEqual(EXAMPLE['vcpus_used'], sot.vcpus_used)
self.assertEqual(EXAMPLE['memory_mb_used'], sot.memory_used)
self.assertEqual(EXAMPLE['memory_mb'], sot.memory_size)
self.assertEqual(EXAMPLE['current_workload'], sot.current_workload)
self.assertEqual(EXAMPLE['host_ip'], sot.host_ip)
self.assertEqual(EXAMPLE['cpu_info'], sot.cpu_info)
self.assertEqual(EXAMPLE['running_vms'], sot.running_vms)
self.assertEqual(EXAMPLE['free_disk_gb'], sot.local_disk_free)
self.assertEqual(EXAMPLE['hypervisor_version'], sot.hypervisor_version)
self.assertEqual(EXAMPLE['disk_available_least'], sot.disk_available)
self.assertEqual(EXAMPLE['local_gb'], sot.local_disk_size)
self.assertEqual(EXAMPLE['free_ram_mb'], sot.memory_free)
| 39.341772 | 79 | 0.686293 | 394 | 3,108 | 5.236041 | 0.35533 | 0.181774 | 0.223946 | 0.050412 | 0.111488 | 0.083374 | 0.063015 | 0.063015 | 0.063015 | 0.063015 | 0 | 0.019732 | 0.184685 | 3,108 | 78 | 80 | 39.846154 | 0.794396 | 0.167954 | 0 | 0.033333 | 0 | 0 | 0.219114 | 0 | 0 | 0 | 0 | 0 | 0.45 | 1 | 0.033333 | false | 0 | 0.033333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e6bf15f6cb30a7e0d3398046a71cdb14731cf785 | 10,920 | py | Python | collection_manager/tests/services/test_CollectionWatcher.py | kevinmarlis/incubator-sdap-ingester | 7ee17fdf16201c499f7bd35cf398844f2c70f046 | [
"Apache-2.0"
] | null | null | null | collection_manager/tests/services/test_CollectionWatcher.py | kevinmarlis/incubator-sdap-ingester | 7ee17fdf16201c499f7bd35cf398844f2c70f046 | [
"Apache-2.0"
] | 1 | 2021-05-03T22:13:11.000Z | 2021-05-03T22:13:11.000Z | collection_manager/tests/services/test_CollectionWatcher.py | kevinmarlis/incubator-sdap-ingester | 7ee17fdf16201c499f7bd35cf398844f2c70f046 | [
"Apache-2.0"
] | null | null | null | import os
import tempfile
import unittest
from datetime import datetime
from unittest.mock import Mock
from collection_manager.entities import Collection
from collection_manager.entities.exceptions import CollectionConfigParsingError, CollectionConfigFileNotFoundError, \
RelativePathCollectionError, ConflictingPathCollectionError
from collection_manager.services import CollectionWatcher
from common.async_test_utils.AsyncTestUtils import AsyncAssert, AsyncMock, async_test
class TestCollectionWatcher(unittest.TestCase):
def test_collections_returns_all_collections(self):
collection_watcher = CollectionWatcher('/foo', Mock(), Mock())
collection_watcher._collections_by_dir = {
"/foo": {
Collection("id1", "var1", "path1", 1, 2, datetime.now(), datetime.now()),
Collection("id2", "var2", "path2", 3, 4, datetime.now(), datetime.now()),
},
"/bar": {
Collection("id3", "var3", "path3", 5, 6, datetime.now(), datetime.now()),
Collection("id4", "var4", "path4", 7, 8, datetime.now(), datetime.now()),
}
}
flattened_collections = collection_watcher._collections()
self.assertEqual(len(flattened_collections), 4)
def test_load_collections_loads_all_collections(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections.yml')
collection_watcher = CollectionWatcher(collections_path, AsyncMock(), AsyncMock())
collection_watcher._load_collections()
self.assertEqual(len(collection_watcher._collections_by_dir), 2)
self.assertEqual(len(collection_watcher._collections_by_dir['/opt/data/grace']), 2)
self.assertEqual(len(collection_watcher._collections_by_dir['/opt/data/avhrr']), 1)
def test_load_collections_with_bad_yaml_syntax(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections_bad_syntax.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
self.assertRaises(CollectionConfigParsingError, collection_watcher._load_collections)
def test_load_collections_with_bad_schema(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections_bad_schema.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
self.assertRaises(CollectionConfigParsingError, collection_watcher._load_collections)
def test_load_collections_with_file_not_found(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/does_not_exist.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
self.assertRaises(CollectionConfigFileNotFoundError, collection_watcher._load_collections)
def test_get_updated_collections_returns_all_collections(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
updated_collections = collection_watcher._get_updated_collections()
self.assertSetEqual(updated_collections, collection_watcher._collections())
def test_get_updated_collections_returns_no_collections(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
collection_watcher._load_collections()
updated_collections = collection_watcher._get_updated_collections()
self.assertEqual(len(updated_collections), 0)
def test_get_updated_collections_returns_some_collections(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
collection_watcher._load_collections()
new_collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections_alternate.yml')
collection_watcher._collections_path = new_collections_path
updated_collections = collection_watcher._get_updated_collections()
self.assertEqual(len(updated_collections), 1)
def test_validate_collection(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
collection = Collection(dataset_id="test_dataset",
path="/absolute/path",
projection="Grid",
slices=frozenset(),
dimension_names=frozenset(),
historical_priority=1,
forward_processing_priority=2,
date_from=None,
date_to=None)
collection_watcher._validate_collection(collection)
def test_validate_collection_with_relative_path(self):
collections_path = os.path.join(os.path.dirname(__file__), '../resources/collections.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
collection = Collection(dataset_id="test_dataset",
path="relative/path",
projection="Grid",
slices=frozenset(),
dimension_names=frozenset(),
historical_priority=1,
forward_processing_priority=2,
date_from=None,
date_to=None)
self.assertRaises(RelativePathCollectionError, collection_watcher._validate_collection, collection)
def test_validate_collection_with_conflicting_path(self):
collections_path = os.path.join(os.path.dirname(__file__), '/resources/collections.yml')
collection_watcher = CollectionWatcher(collections_path, Mock(), Mock())
collection = Collection(dataset_id="test_dataset",
path="/resources/*.nc",
projection="Grid",
slices=frozenset(),
dimension_names=frozenset(),
historical_priority=1,
forward_processing_priority=2,
date_from=None,
date_to=None)
self.assertRaises(ConflictingPathCollectionError, collection_watcher._validate_collection, collection)
@async_test
async def test_collection_callback_is_called(self):
collections_config = tempfile.NamedTemporaryFile("w+b", buffering=0, delete=False)
granule_dir = tempfile.TemporaryDirectory()
collections_str = f"""collections:
- id: TELLUS_GRACE_MASCON_CRI_GRID_RL05_V2_LAND
path: {granule_dir.name}
priority: 1
forward-processing-priority: 5
projection: Grid
dimensionNames:
latitude: lat
longitude: lon
time: time
variable: lwe_thickness
slices:
time: 1
lat: 30
lon: 30
"""
collections_config.write(collections_str.encode("utf-8"))
collection_callback = AsyncMock()
collection_watcher = CollectionWatcher(collections_path=collections_config.name,
collection_updated_callback=collection_callback,
granule_updated_callback=AsyncMock(),
collections_refresh_interval=0.1)
await collection_watcher.start_watching()
collections_str = f"""
- id: TELLUS_GRACE_MASCON_CRI_GRID_RL05_V2_LAND
path: {granule_dir.name}
priority: 10
forward-processing-priority: 5
projection: Grid
dimensionNames:
latitude: lat
longitude: lon
time: time
variable: lwe_thickness
slices:
time: 1
lat: 30
lon: 30
"""
collections_config.write(collections_str.encode("utf-8"))
await AsyncAssert.assert_called_within_timeout(collection_callback, call_count=2)
collections_config.close()
granule_dir.cleanup()
os.remove(collections_config.name)
@async_test
async def test_granule_callback_is_called_on_new_file(self):
with tempfile.NamedTemporaryFile("w+b", buffering=0) as collections_config:
granule_dir = tempfile.TemporaryDirectory()
collections_str = f"""
collections:
- id: TELLUS_GRACE_MASCON_CRI_GRID_RL05_V2_LAND
path: {granule_dir.name}
priority: 1
forward-processing-priority: 5
projection: Grid
dimensionNames:
latitude: lat
longitude: lon
time: time
variable: lwe_thickness
slices:
time: 1
lat: 30
lon: 30
"""
collections_config.write(collections_str.encode("utf-8"))
granule_callback = AsyncMock()
collection_watcher = CollectionWatcher(collections_config.name, AsyncMock(), granule_callback)
await collection_watcher.start_watching()
new_granule = open(os.path.join(granule_dir.name, 'test.nc'), "w+")
await AsyncAssert.assert_called_within_timeout(granule_callback)
new_granule.close()
granule_dir.cleanup()
@async_test
async def test_granule_callback_is_called_on_modified_file(self):
with tempfile.NamedTemporaryFile("w+b", buffering=0) as collections_config:
granule_dir = tempfile.TemporaryDirectory()
collections_str = f"""
collections:
- id: TELLUS_GRACE_MASCON_CRI_GRID_RL05_V2_LAND
path: {granule_dir.name}
priority: 1
forward-processing-priority: 5
projection: Grid
dimensionNames:
latitude: lat
longitude: lon
time: time
variable: lwe_thickness
slices:
time: 1
lat: 30
lon: 30
"""
collections_config.write(collections_str.encode("utf-8"))
new_granule = open(os.path.join(granule_dir.name, 'test.nc'), "w+")
granule_callback = AsyncMock()
collection_watcher = CollectionWatcher(collections_config.name, AsyncMock(), granule_callback)
await collection_watcher.start_watching()
new_granule.write("hello world")
new_granule.close()
await AsyncAssert.assert_called_within_timeout(granule_callback)
granule_dir.cleanup()
@async_test
async def test_run_periodically(self):
callback = AsyncMock()
await CollectionWatcher._run_periodically(None, 0.1, callback)
await AsyncAssert.assert_called_within_timeout(callback, timeout_sec=0.3, call_count=2)
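The last test awaits `_run_periodically` directly and still expects the callback to fire repeatedly afterwards, which implies the helper invokes the callback once and re-schedules itself before returning. A minimal asyncio sketch of that pattern — my reading of the test, not the actual `CollectionWatcher` implementation — is:

```python
import asyncio


async def run_periodically(interval, callback):
    """Invoke `callback` now, then re-schedule this coroutine.

    Hypothetical stand-in for CollectionWatcher._run_periodically:
    awaiting it returns after the first invocation, while further
    invocations keep firing on the running event loop.
    """
    await callback()
    loop = asyncio.get_running_loop()
    loop.call_later(
        interval,
        lambda: asyncio.ensure_future(run_periodically(interval, callback)))


async def demo():
    calls = []

    async def callback():
        calls.append(1)

    await run_periodically(0.01, callback)  # returns after the first call
    await asyncio.sleep(0.05)               # let a few more calls fire
    return len(calls)


if __name__ == "__main__":
    print(asyncio.run(demo()))
```

With a 0.01 s interval and a 0.05 s wait, `demo()` observes at least two calls, matching the `call_count=2` assertion in the test above.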
# File: src/clamnotif/terminal.py
# Repo: pisoftmacao/clamnotif (MIT)
import os
class Colors:
    HEADER = '\033[95m'
    OKBLUE = '\033[94m'
    OKCYAN = '\033[96m'
    OKGREEN = '\033[92m'
    WARNING = '\033[93m'
    FAIL = '\033[91m'
    ENDC = '\033[0m'
    BOLD = '\033[1m'
    UNDERLINE = '\033[4m'


class Checklist:
    UNCHECKED = '\u2717'
    CHECKED = '\u2714'


class Layout:
    INDENT = " "


class Terminal(object):
    def success(self, msg):
        print(Colors.OKGREEN + msg + Colors.ENDC)

    def warning(self, msg):
        print(Colors.WARNING + msg + Colors.ENDC)

    def welcome(self, msg):
        print(Colors.HEADER + msg + Colors.ENDC)

    def fail(self, msg):
        print(Colors.FAIL + msg + Colors.ENDC)

    def checked(self, msg):
        print(Layout.INDENT + "[" + Colors.OKGREEN +
              Checklist.CHECKED + Colors.ENDC + "] " + msg)

    def unchecked(self, msg):
        print(Layout.INDENT + "[" + Colors.WARNING +
              Checklist.UNCHECKED + Colors.ENDC + "] " + msg)

    def header(self, msg):
        print(Colors.BOLD + "* " + msg + Colors.ENDC)

    def please_run(self, cmd, purpose):
        print(Layout.INDENT * 3 + "Tips:")
        print(Layout.INDENT * 3 +
              "please run the following command to {}".format(purpose))
        print(Layout.INDENT * 4 + Colors.BOLD + cmd + Colors.ENDC)


terminal = Terminal()
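`Terminal` is a thin wrapper over raw ANSI SGR escape sequences. A standalone sketch of the same pattern (re-declaring the two codes rather than importing clamnotif; the message text is invented) shows how the coloring composes and can be verified by capturing stdout:

```python
import io
from contextlib import redirect_stdout

# ANSI SGR codes, matching Colors.OKGREEN / Colors.ENDC above
GREEN = '\033[92m'
ENDC = '\033[0m'


def success(msg):
    # same shape as Terminal.success: color on, message, color off
    print(GREEN + msg + ENDC)


buf = io.StringIO()
with redirect_stdout(buf):
    success("virus database up to date")
# the captured text carries the escape codes verbatim
assert buf.getvalue() == '\033[92mvirus database up to date\033[0m\n'
```

Capturing with `redirect_stdout` is also a convenient way to unit-test these helpers without a real TTY.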
# File: sympy/parsing/tests/test_latex_deps.py
# Repo: STALKER2010/sympy-bleeding-edge (BSD-3-Clause)
from sympy.external import import_module
from sympy.utilities import pytest
antlr4 = import_module("antlr4")
# these tests exercise the no-antlr4 path, so disable them
# when antlr4-python*-runtime IS present
if antlr4:
disabled = True
def test_no_import():
from sympy.parsing.latex import parse_latex
with pytest.raises(ImportError):
parse_latex('1 + 1')
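`sympy.external.import_module` returns the module object when the import succeeds and `None` otherwise, which is what makes the `if antlr4:` guard above work without a try/except at the call site. A simplified stand-in (not sympy's actual implementation, which also supports minimum-version and attribute checks) looks like:

```python
import importlib


def import_module(name):
    """Return the named module, or None if it cannot be imported.

    Simplified sketch of sympy.external.import_module; the real
    helper also handles version and attribute requirements.
    """
    try:
        return importlib.import_module(name)
    except ImportError:
        return None


assert import_module("math") is not None            # stdlib: importable
assert import_module("no_such_module_xyz") is None  # missing: swallowed
```

Returning `None` instead of raising lets optional-dependency checks read as plain truthiness tests, as in the `if antlr4:` guard.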
# File: plugin-python/proto/pyvcloudprovider_pb2.py
# Repo: srinarayanant/terraform-provider-vcloud-director-1 (BSD-2-Clause)
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: proto/pyvcloudprovider.proto
import sys
_b = sys.version_info[0] < 3 and (lambda x: x) or (
lambda x: x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from proto import vapp_pb2 as proto_dot_vapp__pb2
from proto import catalog_item_pb2 as proto_dot_catalog__item__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='proto/pyvcloudprovider.proto',
package='proto',
syntax='proto3',
serialized_pb=_b(
        '\n\x1cproto/pyvcloudprovider.proto\x12\x05proto\x1a\x10proto/vapp.proto\x1a\x18proto/catalog_item.proto\"\x89\x01\n\x10LoginCredentials\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x10\n\x08password\x18\x02 \x01(\t\x12\x0b\n\x03org\x18\x03 \x01(\t\x12\x1b\n\x13use_vcd_cli_profile\x18\x04 \x01(\x08\x12\n\n\x02ip\x18\x05 \x01(\t\x12\x1b\n\x13\x61llow_insecure_flag\x18\x06 \x01(\x08\"\x1c\n\x0bLoginResult\x12\r\n\x05token\x18\x01 \x01(\t\"<\n\x07\x43\x61talog\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x02 \x01(\t\x12\x0e\n\x06shared\x18\x03 \x01(\x08\"W\n\x11ReadCatalogResult\x12\x0f\n\x07present\x18\x01 \x01(\x08\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x03 \x01(\t\x12\x0e\n\x06shared\x18\x04 \x01(\x08\"&\n\x13\x43reateCatalogResult\x12\x0f\n\x07\x63reated\x18\x01 \x01(\x08\"&\n\x13\x44\x65leteCatalogResult\x12\x0f\n\x07\x64\x65leted\x18\x01 \x01(\x08\"\'\n\x13\x43heckResolvedResult\x12\x10\n\x08resolved\x18\x01 \x01(\x08\"\x1d\n\nStopResult\x12\x0f\n\x07stopped\x18\x01 '
        '\x01(\x08\"\n\n\x08StopInfo2\xe4\x07\n\x10PyVcloudProvider\x12\x36\n\x05Login\x12\x17.proto.LoginCredentials\x1a\x12.proto.LoginResult\"\x00\x12\x39\n\x0bReadCatalog\x12\x0e.proto.Catalog\x1a\x18.proto.ReadCatalogResult\"\x00\x12=\n\rCreateCatalog\x12\x0e.proto.Catalog\x1a\x1a.proto.CreateCatalogResult\"\x00\x12=\n\rDeleteCatalog\x12\x0e.proto.Catalog\x1a\x1a.proto.DeleteCatalogResult\"\x00\x12V\n\x12\x43\x61talogUploadMedia\x12\x1d.proto.CatalogUploadMediaInfo\x1a\x1f.proto.CatalogUploadMediaResult\"\x00\x12P\n\x10\x43\x61talogUploadOva\x12\x1b.proto.CatalogUploadOvaInfo\x1a\x1d.proto.CatalogUploadOvaResult\"\x00\x12Q\n\x10OvaCheckResolved\x12\x1f.proto.CatalogCheckResolvedInfo\x1a\x1a.proto.CheckResolvedResult\"\x00\x12S\n\x11\x44\x65leteCatalogItem\x12\x1c.proto.DeleteCatalogItemInfo\x1a\x1e.proto.DeleteCatalogItemResult\"\x00\x12\\\n\x14isPresentCatalogItem\x12\x1f.proto.IsPresentCatalogItemInfo\x1a!.proto.IsPresentCatalogItemResult\"\x00\x12\x41\n\x0b\x43\x61ptureVapp\x12\x16.proto.CaptureVAppInfo\x1a\x18.proto.CaptureVAppResult\"\x00\x12>\n\nCreateVApp\x12\x15.proto.CreateVAppInfo\x1a\x17.proto.CreateVAppResult\"\x00\x12>\n\nDeleteVApp\x12\x15.proto.DeleteVAppInfo\x1a\x17.proto.DeleteVAppResult\"\x00\x12\x38\n\x08ReadVApp\x12\x13.proto.ReadVAppInfo\x1a\x15.proto.ReadVAppResult\"\x00\x12\x32\n\nStopPlugin\x12\x0f.proto.StopInfo\x1a\x11.proto.StopResult\"\x00\x42\x37\n\x1c\x63om.vmware.pyvcloud.providerB\x15PyVcloudProviderProtoP\x01\x62\x06proto3'
),
dependencies=[
proto_dot_vapp__pb2.DESCRIPTOR,
proto_dot_catalog__item__pb2.DESCRIPTOR,
])
_LOGINCREDENTIALS = _descriptor.Descriptor(
name='LoginCredentials',
full_name='proto.LoginCredentials',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='username',
full_name='proto.LoginCredentials.username',
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='password',
full_name='proto.LoginCredentials.password',
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='org',
full_name='proto.LoginCredentials.org',
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='use_vcd_cli_profile',
full_name='proto.LoginCredentials.use_vcd_cli_profile',
index=3,
number=4,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ip',
full_name='proto.LoginCredentials.ip',
index=4,
number=5,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='allow_insecure_flag',
full_name='proto.LoginCredentials.allow_insecure_flag',
index=5,
number=6,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=84,
serialized_end=221,
)
_LOGINRESULT = _descriptor.Descriptor(
name='LoginResult',
full_name='proto.LoginResult',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='token',
full_name='proto.LoginResult.token',
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=223,
serialized_end=251,
)
_CATALOG = _descriptor.Descriptor(
name='Catalog',
full_name='proto.Catalog',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='name',
full_name='proto.Catalog.name',
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description',
full_name='proto.Catalog.description',
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='shared',
full_name='proto.Catalog.shared',
index=2,
number=3,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=253,
serialized_end=313,
)
_READCATALOGRESULT = _descriptor.Descriptor(
name='ReadCatalogResult',
full_name='proto.ReadCatalogResult',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='present',
full_name='proto.ReadCatalogResult.present',
index=0,
number=1,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='name',
full_name='proto.ReadCatalogResult.name',
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='description',
full_name='proto.ReadCatalogResult.description',
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=_b("").decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='shared',
full_name='proto.ReadCatalogResult.shared',
index=3,
number=4,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=315,
serialized_end=402,
)
_CREATECATALOGRESULT = _descriptor.Descriptor(
name='CreateCatalogResult',
full_name='proto.CreateCatalogResult',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='created',
full_name='proto.CreateCatalogResult.created',
index=0,
number=1,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=404,
serialized_end=442,
)
_DELETECATALOGRESULT = _descriptor.Descriptor(
name='DeleteCatalogResult',
full_name='proto.DeleteCatalogResult',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='deleted',
full_name='proto.DeleteCatalogResult.deleted',
index=0,
number=1,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=444,
serialized_end=482,
)
_CHECKRESOLVEDRESULT = _descriptor.Descriptor(
name='CheckResolvedResult',
full_name='proto.CheckResolvedResult',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='resolved',
full_name='proto.CheckResolvedResult.resolved',
index=0,
number=1,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=484,
serialized_end=523,
)
_STOPRESULT = _descriptor.Descriptor(
name='StopResult',
full_name='proto.StopResult',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='stopped',
full_name='proto.StopResult.stopped',
index=0,
number=1,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
options=None),
],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=525,
serialized_end=554,
)
_STOPINFO = _descriptor.Descriptor(
name='StopInfo',
full_name='proto.StopInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[],
extensions=[],
nested_types=[],
enum_types=[],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[],
serialized_start=556,
serialized_end=566,
)
DESCRIPTOR.message_types_by_name['LoginCredentials'] = _LOGINCREDENTIALS
DESCRIPTOR.message_types_by_name['LoginResult'] = _LOGINRESULT
DESCRIPTOR.message_types_by_name['Catalog'] = _CATALOG
DESCRIPTOR.message_types_by_name['ReadCatalogResult'] = _READCATALOGRESULT
DESCRIPTOR.message_types_by_name['CreateCatalogResult'] = _CREATECATALOGRESULT
DESCRIPTOR.message_types_by_name['DeleteCatalogResult'] = _DELETECATALOGRESULT
DESCRIPTOR.message_types_by_name['CheckResolvedResult'] = _CHECKRESOLVEDRESULT
DESCRIPTOR.message_types_by_name['StopResult'] = _STOPRESULT
DESCRIPTOR.message_types_by_name['StopInfo'] = _STOPINFO
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
LoginCredentials = _reflection.GeneratedProtocolMessageType(
'LoginCredentials',
(_message.Message, ),
dict(
DESCRIPTOR=_LOGINCREDENTIALS,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.LoginCredentials)
))
_sym_db.RegisterMessage(LoginCredentials)
LoginResult = _reflection.GeneratedProtocolMessageType(
'LoginResult',
(_message.Message, ),
dict(
DESCRIPTOR=_LOGINRESULT,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.LoginResult)
))
_sym_db.RegisterMessage(LoginResult)
Catalog = _reflection.GeneratedProtocolMessageType(
'Catalog',
(_message.Message, ),
dict(
DESCRIPTOR=_CATALOG,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.Catalog)
))
_sym_db.RegisterMessage(Catalog)
ReadCatalogResult = _reflection.GeneratedProtocolMessageType(
'ReadCatalogResult',
(_message.Message, ),
dict(
DESCRIPTOR=_READCATALOGRESULT,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.ReadCatalogResult)
))
_sym_db.RegisterMessage(ReadCatalogResult)
CreateCatalogResult = _reflection.GeneratedProtocolMessageType(
'CreateCatalogResult',
(_message.Message, ),
dict(
DESCRIPTOR=_CREATECATALOGRESULT,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.CreateCatalogResult)
))
_sym_db.RegisterMessage(CreateCatalogResult)
DeleteCatalogResult = _reflection.GeneratedProtocolMessageType(
'DeleteCatalogResult',
(_message.Message, ),
dict(
DESCRIPTOR=_DELETECATALOGRESULT,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.DeleteCatalogResult)
))
_sym_db.RegisterMessage(DeleteCatalogResult)
CheckResolvedResult = _reflection.GeneratedProtocolMessageType(
'CheckResolvedResult',
(_message.Message, ),
dict(
DESCRIPTOR=_CHECKRESOLVEDRESULT,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.CheckResolvedResult)
))
_sym_db.RegisterMessage(CheckResolvedResult)
StopResult = _reflection.GeneratedProtocolMessageType(
'StopResult',
(_message.Message, ),
dict(
DESCRIPTOR=_STOPRESULT,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.StopResult)
))
_sym_db.RegisterMessage(StopResult)
StopInfo = _reflection.GeneratedProtocolMessageType(
'StopInfo',
(_message.Message, ),
dict(
DESCRIPTOR=_STOPINFO,
__module__='proto.pyvcloudprovider_pb2'
# @@protoc_insertion_point(class_scope:proto.StopInfo)
))
_sym_db.RegisterMessage(StopInfo)
DESCRIPTOR.has_options = True
DESCRIPTOR._options = _descriptor._ParseOptions(
descriptor_pb2.FileOptions(),
_b('\n\034com.vmware.pyvcloud.providerB\025PyVcloudProviderProtoP\001'))
_PYVCLOUDPROVIDER = _descriptor.ServiceDescriptor(
name='PyVcloudProvider',
full_name='proto.PyVcloudProvider',
file=DESCRIPTOR,
index=0,
options=None,
serialized_start=569,
serialized_end=1565,
methods=[
_descriptor.MethodDescriptor(
name='Login',
full_name='proto.PyVcloudProvider.Login',
index=0,
containing_service=None,
input_type=_LOGINCREDENTIALS,
output_type=_LOGINRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='ReadCatalog',
full_name='proto.PyVcloudProvider.ReadCatalog',
index=1,
containing_service=None,
input_type=_CATALOG,
output_type=_READCATALOGRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='CreateCatalog',
full_name='proto.PyVcloudProvider.CreateCatalog',
index=2,
containing_service=None,
input_type=_CATALOG,
output_type=_CREATECATALOGRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='DeleteCatalog',
full_name='proto.PyVcloudProvider.DeleteCatalog',
index=3,
containing_service=None,
input_type=_CATALOG,
output_type=_DELETECATALOGRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='CatalogUploadMedia',
full_name='proto.PyVcloudProvider.CatalogUploadMedia',
index=4,
containing_service=None,
input_type=proto_dot_catalog__item__pb2._CATALOGUPLOADMEDIAINFO,
output_type=proto_dot_catalog__item__pb2._CATALOGUPLOADMEDIARESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='CatalogUploadOva',
full_name='proto.PyVcloudProvider.CatalogUploadOva',
index=5,
containing_service=None,
input_type=proto_dot_catalog__item__pb2._CATALOGUPLOADOVAINFO,
output_type=proto_dot_catalog__item__pb2._CATALOGUPLOADOVARESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='OvaCheckResolved',
full_name='proto.PyVcloudProvider.OvaCheckResolved',
index=6,
containing_service=None,
input_type=proto_dot_catalog__item__pb2._CATALOGCHECKRESOLVEDINFO,
output_type=_CHECKRESOLVEDRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='DeleteCatalogItem',
full_name='proto.PyVcloudProvider.DeleteCatalogItem',
index=7,
containing_service=None,
input_type=proto_dot_catalog__item__pb2._DELETECATALOGITEMINFO,
output_type=proto_dot_catalog__item__pb2._DELETECATALOGITEMRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='isPresentCatalogItem',
full_name='proto.PyVcloudProvider.isPresentCatalogItem',
index=8,
containing_service=None,
input_type=proto_dot_catalog__item__pb2._ISPRESENTCATALOGITEMINFO,
output_type=proto_dot_catalog__item__pb2.
_ISPRESENTCATALOGITEMRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='CaptureVapp',
full_name='proto.PyVcloudProvider.CaptureVapp',
index=9,
containing_service=None,
input_type=proto_dot_catalog__item__pb2._CAPTUREVAPPINFO,
output_type=proto_dot_catalog__item__pb2._CAPTUREVAPPRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='CreateVApp',
full_name='proto.PyVcloudProvider.CreateVApp',
index=10,
containing_service=None,
input_type=proto_dot_vapp__pb2._CREATEVAPPINFO,
output_type=proto_dot_vapp__pb2._CREATEVAPPRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='DeleteVApp',
full_name='proto.PyVcloudProvider.DeleteVApp',
index=11,
containing_service=None,
input_type=proto_dot_vapp__pb2._DELETEVAPPINFO,
output_type=proto_dot_vapp__pb2._DELETEVAPPRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='ReadVApp',
full_name='proto.PyVcloudProvider.ReadVApp',
index=12,
containing_service=None,
input_type=proto_dot_vapp__pb2._READVAPPINFO,
output_type=proto_dot_vapp__pb2._READVAPPRESULT,
options=None,
),
_descriptor.MethodDescriptor(
name='StopPlugin',
full_name='proto.PyVcloudProvider.StopPlugin',
index=13,
containing_service=None,
input_type=_STOPINFO,
output_type=_STOPRESULT,
options=None,
),
])
_sym_db.RegisterServiceDescriptor(_PYVCLOUDPROVIDER)
DESCRIPTOR.services_by_name['PyVcloudProvider'] = _PYVCLOUDPROVIDER
try:
# THESE ELEMENTS WILL BE DEPRECATED.
# Please use the generated *_pb2_grpc.py files instead.
import grpc
from grpc.beta import implementations as beta_implementations
from grpc.beta import interfaces as beta_interfaces
from grpc.framework.common import cardinality
from grpc.framework.interfaces.face import utilities as face_utilities
class PyVcloudProviderStub(object):
"""Interface exported by the server.
"""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.Login = channel.unary_unary(
'/proto.PyVcloudProvider/Login',
request_serializer=LoginCredentials.SerializeToString,
response_deserializer=LoginResult.FromString,
)
self.ReadCatalog = channel.unary_unary(
'/proto.PyVcloudProvider/ReadCatalog',
request_serializer=Catalog.SerializeToString,
response_deserializer=ReadCatalogResult.FromString,
)
self.CreateCatalog = channel.unary_unary(
'/proto.PyVcloudProvider/CreateCatalog',
request_serializer=Catalog.SerializeToString,
response_deserializer=CreateCatalogResult.FromString,
)
self.DeleteCatalog = channel.unary_unary(
'/proto.PyVcloudProvider/DeleteCatalog',
request_serializer=Catalog.SerializeToString,
response_deserializer=DeleteCatalogResult.FromString,
)
self.CatalogUploadMedia = channel.unary_unary(
'/proto.PyVcloudProvider/CatalogUploadMedia',
request_serializer=proto_dot_catalog__item__pb2.
CatalogUploadMediaInfo.SerializeToString,
response_deserializer=proto_dot_catalog__item__pb2.
CatalogUploadMediaResult.FromString,
)
self.CatalogUploadOva = channel.unary_unary(
'/proto.PyVcloudProvider/CatalogUploadOva',
request_serializer=proto_dot_catalog__item__pb2.
CatalogUploadOvaInfo.SerializeToString,
response_deserializer=proto_dot_catalog__item__pb2.
CatalogUploadOvaResult.FromString,
)
self.OvaCheckResolved = channel.unary_unary(
'/proto.PyVcloudProvider/OvaCheckResolved',
request_serializer=proto_dot_catalog__item__pb2.
CatalogCheckResolvedInfo.SerializeToString,
response_deserializer=CheckResolvedResult.FromString,
)
self.DeleteCatalogItem = channel.unary_unary(
'/proto.PyVcloudProvider/DeleteCatalogItem',
request_serializer=proto_dot_catalog__item__pb2.
DeleteCatalogItemInfo.SerializeToString,
response_deserializer=proto_dot_catalog__item__pb2.
DeleteCatalogItemResult.FromString,
)
self.isPresentCatalogItem = channel.unary_unary(
'/proto.PyVcloudProvider/isPresentCatalogItem',
request_serializer=proto_dot_catalog__item__pb2.
IsPresentCatalogItemInfo.SerializeToString,
response_deserializer=proto_dot_catalog__item__pb2.
IsPresentCatalogItemResult.FromString,
)
self.CaptureVapp = channel.unary_unary(
'/proto.PyVcloudProvider/CaptureVapp',
request_serializer=proto_dot_catalog__item__pb2.
CaptureVAppInfo.SerializeToString,
response_deserializer=proto_dot_catalog__item__pb2.
CaptureVAppResult.FromString,
)
self.CreateVApp = channel.unary_unary(
'/proto.PyVcloudProvider/CreateVApp',
request_serializer=proto_dot_vapp__pb2.CreateVAppInfo.
SerializeToString,
response_deserializer=proto_dot_vapp__pb2.CreateVAppResult.
FromString,
)
self.DeleteVApp = channel.unary_unary(
'/proto.PyVcloudProvider/DeleteVApp',
request_serializer=proto_dot_vapp__pb2.DeleteVAppInfo.
SerializeToString,
response_deserializer=proto_dot_vapp__pb2.DeleteVAppResult.
FromString,
)
self.ReadVApp = channel.unary_unary(
'/proto.PyVcloudProvider/ReadVApp',
request_serializer=proto_dot_vapp__pb2.ReadVAppInfo.
SerializeToString,
response_deserializer=proto_dot_vapp__pb2.ReadVAppResult.
FromString,
)
self.StopPlugin = channel.unary_unary(
'/proto.PyVcloudProvider/StopPlugin',
request_serializer=StopInfo.SerializeToString,
response_deserializer=StopResult.FromString,
)
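    # Editorial usage sketch (not generated code; the address below is a
    # placeholder assumption, not defined by this module):
    #
    #   channel = grpc.insecure_channel('localhost:50051')
    #   stub = PyVcloudProviderStub(channel)
    #   result = stub.Login(LoginCredentials(...))  # returns a LoginResult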
class PyVcloudProviderServicer(object):
"""Interface exported by the server.
"""
def Login(self, request, context):
"""Tenant Loging to VCD
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ReadCatalog(self, request, context):
"""check if catalog is preset and return true and the catalog details
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CreateCatalog(self, request, context):
"""create a new catalog
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteCatalog(self, request, context):
"""delete a catalog
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CatalogUploadMedia(self, request, context):
"""catalog upload Media - anything other than ova
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CatalogUploadOva(self, request, context):
"""catalog upload ova
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def OvaCheckResolved(self, request, context):
"""check resolved after upload
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteCatalogItem(self, request, context):
"""catalog item delete
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def isPresentCatalogItem(self, request, context):
"""check if catalog item is preset and return true
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CaptureVapp(self, request, context):
            # missing associated documentation comment in .proto file
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CreateVApp(self, request, context):
"""create vApp
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteVApp(self, request, context):
"""delete VApp
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ReadVApp(self, request, context):
"""Read VApp
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def StopPlugin(self, request, context):
"""remote stop interface for the plugin
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_PyVcloudProviderServicer_to_server(servicer, server):
rpc_method_handlers = {
'Login':
grpc.unary_unary_rpc_method_handler(
servicer.Login,
request_deserializer=LoginCredentials.FromString,
response_serializer=LoginResult.SerializeToString,
),
'ReadCatalog':
grpc.unary_unary_rpc_method_handler(
servicer.ReadCatalog,
request_deserializer=Catalog.FromString,
response_serializer=ReadCatalogResult.SerializeToString,
),
'CreateCatalog':
grpc.unary_unary_rpc_method_handler(
servicer.CreateCatalog,
request_deserializer=Catalog.FromString,
response_serializer=CreateCatalogResult.SerializeToString,
),
'DeleteCatalog':
grpc.unary_unary_rpc_method_handler(
servicer.DeleteCatalog,
request_deserializer=Catalog.FromString,
response_serializer=DeleteCatalogResult.SerializeToString,
),
'CatalogUploadMedia':
grpc.unary_unary_rpc_method_handler(
servicer.CatalogUploadMedia,
request_deserializer=proto_dot_catalog__item__pb2.
CatalogUploadMediaInfo.FromString,
response_serializer=proto_dot_catalog__item__pb2.
CatalogUploadMediaResult.SerializeToString,
),
'CatalogUploadOva':
grpc.unary_unary_rpc_method_handler(
servicer.CatalogUploadOva,
request_deserializer=proto_dot_catalog__item__pb2.
CatalogUploadOvaInfo.FromString,
response_serializer=proto_dot_catalog__item__pb2.
CatalogUploadOvaResult.SerializeToString,
),
'OvaCheckResolved':
grpc.unary_unary_rpc_method_handler(
servicer.OvaCheckResolved,
request_deserializer=proto_dot_catalog__item__pb2.
CatalogCheckResolvedInfo.FromString,
response_serializer=CheckResolvedResult.SerializeToString,
),
'DeleteCatalogItem':
grpc.unary_unary_rpc_method_handler(
servicer.DeleteCatalogItem,
request_deserializer=proto_dot_catalog__item__pb2.
DeleteCatalogItemInfo.FromString,
response_serializer=proto_dot_catalog__item__pb2.
DeleteCatalogItemResult.SerializeToString,
),
'isPresentCatalogItem':
grpc.unary_unary_rpc_method_handler(
servicer.isPresentCatalogItem,
request_deserializer=proto_dot_catalog__item__pb2.
IsPresentCatalogItemInfo.FromString,
response_serializer=proto_dot_catalog__item__pb2.
IsPresentCatalogItemResult.SerializeToString,
),
'CaptureVapp':
grpc.unary_unary_rpc_method_handler(
servicer.CaptureVapp,
request_deserializer=proto_dot_catalog__item__pb2.
CaptureVAppInfo.FromString,
response_serializer=proto_dot_catalog__item__pb2.
CaptureVAppResult.SerializeToString,
),
'CreateVApp':
grpc.unary_unary_rpc_method_handler(
servicer.CreateVApp,
request_deserializer=proto_dot_vapp__pb2.CreateVAppInfo.
FromString,
response_serializer=proto_dot_vapp__pb2.CreateVAppResult.
SerializeToString,
),
'DeleteVApp':
grpc.unary_unary_rpc_method_handler(
servicer.DeleteVApp,
request_deserializer=proto_dot_vapp__pb2.DeleteVAppInfo.
FromString,
response_serializer=proto_dot_vapp__pb2.DeleteVAppResult.
SerializeToString,
),
'ReadVApp':
grpc.unary_unary_rpc_method_handler(
servicer.ReadVApp,
request_deserializer=proto_dot_vapp__pb2.ReadVAppInfo.
FromString,
response_serializer=proto_dot_vapp__pb2.ReadVAppResult.
SerializeToString,
),
'StopPlugin':
grpc.unary_unary_rpc_method_handler(
servicer.StopPlugin,
request_deserializer=StopInfo.FromString,
response_serializer=StopResult.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'proto.PyVcloudProvider', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler, ))
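    # Editorial usage sketch (not generated code): a concrete server would
    # subclass PyVcloudProviderServicer, override the RPC methods it
    # implements, and register it -- assuming `from concurrent import futures`
    # and a placeholder port:
    #
    #   server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    #   add_PyVcloudProviderServicer_to_server(MyServicer(), server)
    #   server.add_insecure_port('[::]:50051')
    #   server.start()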
class BetaPyVcloudProviderServicer(object):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This class was generated
only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0."""
"""Interface exported by the server.
"""
def Login(self, request, context):
"""Tenant Loging to VCD
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def ReadCatalog(self, request, context):
"""check if catalog is preset and return true and the catalog details
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def CreateCatalog(self, request, context):
"""create a new catalog
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def DeleteCatalog(self, request, context):
"""delete a catalog
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def CatalogUploadMedia(self, request, context):
"""catalog upload Media - anything other than ova
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def CatalogUploadOva(self, request, context):
"""catalog upload ova
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def OvaCheckResolved(self, request, context):
"""check resolved after upload
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def DeleteCatalogItem(self, request, context):
"""catalog item delete
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def isPresentCatalogItem(self, request, context):
"""check if catalog item is preset and return true
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def CaptureVapp(self, request, context):
            # missing associated documentation comment in .proto file
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def CreateVApp(self, request, context):
"""create vApp
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def DeleteVApp(self, request, context):
"""delete VApp
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def ReadVApp(self, request, context):
"""Read VApp
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def StopPlugin(self, request, context):
"""remote stop interface for the plugin
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
class BetaPyVcloudProviderStub(object):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This class was generated
only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0."""
"""Interface exported by the server.
"""
def Login(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""Tenant Loging to VCD
"""
raise NotImplementedError()
Login.future = None
def ReadCatalog(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""check if catalog is preset and return true and the catalog details
"""
raise NotImplementedError()
ReadCatalog.future = None
def CreateCatalog(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""create a new catalog
"""
raise NotImplementedError()
CreateCatalog.future = None
def DeleteCatalog(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""delete a catalog
"""
raise NotImplementedError()
DeleteCatalog.future = None
def CatalogUploadMedia(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""catalog upload Media - anything other than ova
"""
raise NotImplementedError()
CatalogUploadMedia.future = None
def CatalogUploadOva(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""catalog upload ova
"""
raise NotImplementedError()
CatalogUploadOva.future = None
def OvaCheckResolved(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""check resolved after upload
"""
raise NotImplementedError()
OvaCheckResolved.future = None
def DeleteCatalogItem(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""catalog item delete
"""
raise NotImplementedError()
DeleteCatalogItem.future = None
def isPresentCatalogItem(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""check if catalog item is preset and return true
"""
raise NotImplementedError()
isPresentCatalogItem.future = None
def CaptureVapp(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
            # missing associated documentation comment in .proto file
raise NotImplementedError()
CaptureVapp.future = None
def CreateVApp(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""create vApp
"""
raise NotImplementedError()
CreateVApp.future = None
def DeleteVApp(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""delete VApp
"""
raise NotImplementedError()
DeleteVApp.future = None
def ReadVApp(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""Read VApp
"""
raise NotImplementedError()
ReadVApp.future = None
def StopPlugin(self,
request,
timeout,
metadata=None,
with_call=False,
protocol_options=None):
"""remote stop interface for the plugin
"""
raise NotImplementedError()
StopPlugin.future = None
def beta_create_PyVcloudProvider_server(servicer,
pool=None,
pool_size=None,
default_timeout=None,
maximum_timeout=None):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This function was
generated only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0"""
request_deserializers = {
('proto.PyVcloudProvider', 'CaptureVapp'):
proto_dot_catalog__item__pb2.CaptureVAppInfo.FromString,
('proto.PyVcloudProvider', 'CatalogUploadMedia'):
proto_dot_catalog__item__pb2.CatalogUploadMediaInfo.FromString,
('proto.PyVcloudProvider', 'CatalogUploadOva'):
proto_dot_catalog__item__pb2.CatalogUploadOvaInfo.FromString,
('proto.PyVcloudProvider', 'CreateCatalog'):
Catalog.FromString,
('proto.PyVcloudProvider', 'CreateVApp'):
proto_dot_vapp__pb2.CreateVAppInfo.FromString,
('proto.PyVcloudProvider', 'DeleteCatalog'):
Catalog.FromString,
('proto.PyVcloudProvider', 'DeleteCatalogItem'):
proto_dot_catalog__item__pb2.DeleteCatalogItemInfo.FromString,
('proto.PyVcloudProvider', 'DeleteVApp'):
proto_dot_vapp__pb2.DeleteVAppInfo.FromString,
('proto.PyVcloudProvider', 'Login'):
LoginCredentials.FromString,
('proto.PyVcloudProvider', 'OvaCheckResolved'):
proto_dot_catalog__item__pb2.CatalogCheckResolvedInfo.FromString,
('proto.PyVcloudProvider', 'ReadCatalog'):
Catalog.FromString,
('proto.PyVcloudProvider', 'ReadVApp'):
proto_dot_vapp__pb2.ReadVAppInfo.FromString,
('proto.PyVcloudProvider', 'StopPlugin'):
StopInfo.FromString,
('proto.PyVcloudProvider', 'isPresentCatalogItem'):
proto_dot_catalog__item__pb2.IsPresentCatalogItemInfo.FromString,
}
response_serializers = {
('proto.PyVcloudProvider', 'CaptureVapp'):
proto_dot_catalog__item__pb2.CaptureVAppResult.SerializeToString,
('proto.PyVcloudProvider', 'CatalogUploadMedia'):
proto_dot_catalog__item__pb2.CatalogUploadMediaResult.
SerializeToString,
('proto.PyVcloudProvider', 'CatalogUploadOva'):
proto_dot_catalog__item__pb2.CatalogUploadOvaResult.
SerializeToString,
('proto.PyVcloudProvider', 'CreateCatalog'):
CreateCatalogResult.SerializeToString,
('proto.PyVcloudProvider', 'CreateVApp'):
proto_dot_vapp__pb2.CreateVAppResult.SerializeToString,
('proto.PyVcloudProvider', 'DeleteCatalog'):
DeleteCatalogResult.SerializeToString,
('proto.PyVcloudProvider', 'DeleteCatalogItem'):
proto_dot_catalog__item__pb2.DeleteCatalogItemResult.
SerializeToString,
('proto.PyVcloudProvider', 'DeleteVApp'):
proto_dot_vapp__pb2.DeleteVAppResult.SerializeToString,
('proto.PyVcloudProvider', 'Login'):
LoginResult.SerializeToString,
('proto.PyVcloudProvider', 'OvaCheckResolved'):
CheckResolvedResult.SerializeToString,
('proto.PyVcloudProvider', 'ReadCatalog'):
ReadCatalogResult.SerializeToString,
('proto.PyVcloudProvider', 'ReadVApp'):
proto_dot_vapp__pb2.ReadVAppResult.SerializeToString,
('proto.PyVcloudProvider', 'StopPlugin'):
StopResult.SerializeToString,
('proto.PyVcloudProvider', 'isPresentCatalogItem'):
proto_dot_catalog__item__pb2.IsPresentCatalogItemResult.
SerializeToString,
}
method_implementations = {
('proto.PyVcloudProvider', 'CaptureVapp'):
face_utilities.unary_unary_inline(servicer.CaptureVapp),
('proto.PyVcloudProvider', 'CatalogUploadMedia'):
face_utilities.unary_unary_inline(servicer.CatalogUploadMedia),
('proto.PyVcloudProvider', 'CatalogUploadOva'):
face_utilities.unary_unary_inline(servicer.CatalogUploadOva),
('proto.PyVcloudProvider', 'CreateCatalog'):
face_utilities.unary_unary_inline(servicer.CreateCatalog),
('proto.PyVcloudProvider', 'CreateVApp'):
face_utilities.unary_unary_inline(servicer.CreateVApp),
('proto.PyVcloudProvider', 'DeleteCatalog'):
face_utilities.unary_unary_inline(servicer.DeleteCatalog),
('proto.PyVcloudProvider', 'DeleteCatalogItem'):
face_utilities.unary_unary_inline(servicer.DeleteCatalogItem),
('proto.PyVcloudProvider', 'DeleteVApp'):
face_utilities.unary_unary_inline(servicer.DeleteVApp),
('proto.PyVcloudProvider', 'Login'):
face_utilities.unary_unary_inline(servicer.Login),
('proto.PyVcloudProvider', 'OvaCheckResolved'):
face_utilities.unary_unary_inline(servicer.OvaCheckResolved),
('proto.PyVcloudProvider', 'ReadCatalog'):
face_utilities.unary_unary_inline(servicer.ReadCatalog),
('proto.PyVcloudProvider', 'ReadVApp'):
face_utilities.unary_unary_inline(servicer.ReadVApp),
('proto.PyVcloudProvider', 'StopPlugin'):
face_utilities.unary_unary_inline(servicer.StopPlugin),
('proto.PyVcloudProvider', 'isPresentCatalogItem'):
face_utilities.unary_unary_inline(servicer.isPresentCatalogItem),
}
server_options = beta_implementations.server_options(
request_deserializers=request_deserializers,
response_serializers=response_serializers,
thread_pool=pool,
thread_pool_size=pool_size,
default_timeout=default_timeout,
maximum_timeout=maximum_timeout)
return beta_implementations.server(
method_implementations, options=server_options)
def beta_create_PyVcloudProvider_stub(channel,
host=None,
metadata_transformer=None,
pool=None,
pool_size=None):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This function was
generated only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0"""
request_serializers = {
('proto.PyVcloudProvider', 'CaptureVapp'):
proto_dot_catalog__item__pb2.CaptureVAppInfo.SerializeToString,
('proto.PyVcloudProvider', 'CatalogUploadMedia'):
proto_dot_catalog__item__pb2.CatalogUploadMediaInfo.
SerializeToString,
('proto.PyVcloudProvider', 'CatalogUploadOva'):
proto_dot_catalog__item__pb2.CatalogUploadOvaInfo.
SerializeToString,
('proto.PyVcloudProvider', 'CreateCatalog'):
Catalog.SerializeToString,
('proto.PyVcloudProvider', 'CreateVApp'):
proto_dot_vapp__pb2.CreateVAppInfo.SerializeToString,
('proto.PyVcloudProvider', 'DeleteCatalog'):
Catalog.SerializeToString,
('proto.PyVcloudProvider', 'DeleteCatalogItem'):
proto_dot_catalog__item__pb2.DeleteCatalogItemInfo.
SerializeToString,
('proto.PyVcloudProvider', 'DeleteVApp'):
proto_dot_vapp__pb2.DeleteVAppInfo.SerializeToString,
('proto.PyVcloudProvider', 'Login'):
LoginCredentials.SerializeToString,
('proto.PyVcloudProvider', 'OvaCheckResolved'):
proto_dot_catalog__item__pb2.CatalogCheckResolvedInfo.
SerializeToString,
('proto.PyVcloudProvider', 'ReadCatalog'):
Catalog.SerializeToString,
('proto.PyVcloudProvider', 'ReadVApp'):
proto_dot_vapp__pb2.ReadVAppInfo.SerializeToString,
('proto.PyVcloudProvider', 'StopPlugin'):
StopInfo.SerializeToString,
('proto.PyVcloudProvider', 'isPresentCatalogItem'):
proto_dot_catalog__item__pb2.IsPresentCatalogItemInfo.
SerializeToString,
}
response_deserializers = {
('proto.PyVcloudProvider', 'CaptureVapp'):
proto_dot_catalog__item__pb2.CaptureVAppResult.FromString,
('proto.PyVcloudProvider', 'CatalogUploadMedia'):
proto_dot_catalog__item__pb2.CatalogUploadMediaResult.FromString,
('proto.PyVcloudProvider', 'CatalogUploadOva'):
proto_dot_catalog__item__pb2.CatalogUploadOvaResult.FromString,
('proto.PyVcloudProvider', 'CreateCatalog'):
CreateCatalogResult.FromString,
('proto.PyVcloudProvider', 'CreateVApp'):
proto_dot_vapp__pb2.CreateVAppResult.FromString,
('proto.PyVcloudProvider', 'DeleteCatalog'):
DeleteCatalogResult.FromString,
('proto.PyVcloudProvider', 'DeleteCatalogItem'):
proto_dot_catalog__item__pb2.DeleteCatalogItemResult.FromString,
('proto.PyVcloudProvider', 'DeleteVApp'):
proto_dot_vapp__pb2.DeleteVAppResult.FromString,
('proto.PyVcloudProvider', 'Login'):
LoginResult.FromString,
('proto.PyVcloudProvider', 'OvaCheckResolved'):
CheckResolvedResult.FromString,
('proto.PyVcloudProvider', 'ReadCatalog'):
ReadCatalogResult.FromString,
('proto.PyVcloudProvider', 'ReadVApp'):
proto_dot_vapp__pb2.ReadVAppResult.FromString,
('proto.PyVcloudProvider', 'StopPlugin'):
StopResult.FromString,
('proto.PyVcloudProvider', 'isPresentCatalogItem'):
proto_dot_catalog__item__pb2.IsPresentCatalogItemResult.FromString,
}
cardinalities = {
'CaptureVapp': cardinality.Cardinality.UNARY_UNARY,
'CatalogUploadMedia': cardinality.Cardinality.UNARY_UNARY,
'CatalogUploadOva': cardinality.Cardinality.UNARY_UNARY,
'CreateCatalog': cardinality.Cardinality.UNARY_UNARY,
'CreateVApp': cardinality.Cardinality.UNARY_UNARY,
'DeleteCatalog': cardinality.Cardinality.UNARY_UNARY,
'DeleteCatalogItem': cardinality.Cardinality.UNARY_UNARY,
'DeleteVApp': cardinality.Cardinality.UNARY_UNARY,
'Login': cardinality.Cardinality.UNARY_UNARY,
'OvaCheckResolved': cardinality.Cardinality.UNARY_UNARY,
'ReadCatalog': cardinality.Cardinality.UNARY_UNARY,
'ReadVApp': cardinality.Cardinality.UNARY_UNARY,
'StopPlugin': cardinality.Cardinality.UNARY_UNARY,
'isPresentCatalogItem': cardinality.Cardinality.UNARY_UNARY,
}
stub_options = beta_implementations.stub_options(
host=host,
metadata_transformer=metadata_transformer,
request_serializers=request_serializers,
response_deserializers=response_deserializers,
thread_pool=pool,
thread_pool_size=pool_size)
return beta_implementations.dynamic_stub(
channel,
'proto.PyVcloudProvider',
cardinalities,
options=stub_options)
except ImportError:
pass
# @@protoc_insertion_point(module_scope)
e6e1c3d976604c6739b0fbb293b5a5b396e6a535 | 863 | py | Python | common/ops/gather_ops.py | vahidk/TensorflowFramework | a9377d0dd8f5ac93e810876fbe8987990e3c728f | [
"BSD-3-Clause"
] | 129 | 2017-08-19T07:18:55.000Z | 2020-07-16T03:05:31.000Z | common/ops/gather_ops.py | vahidk/TensorflowFramework | a9377d0dd8f5ac93e810876fbe8987990e3c728f | [
"BSD-3-Clause"
] | 5 | 2017-09-13T08:55:31.000Z | 2019-07-12T06:52:07.000Z | common/ops/gather_ops.py | vahidk/TensorflowFramework | a9377d0dd8f5ac93e810876fbe8987990e3c728f | [
"BSD-3-Clause"
] | 46 | 2017-08-21T21:18:50.000Z | 2022-03-12T05:57:02.000Z | """Gather ops."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
from common.ops import shape_ops
def batch_gather(tensor, indices):
"""Gather in batch from a tensor of arbitrary size.
  In pseudocode this module will produce the following:

    output[i] = tf.gather(tensor[i], indices[i])
Args:
tensor: Tensor of arbitrary size.
indices: Vector of indices.
Returns:
output: A tensor of gathered values.
"""
shape = shape_ops.get_shape(tensor)
flat_first = tf.reshape(tensor, [shape[0] * shape[1]] + shape[2:])
indices = tf.convert_to_tensor(indices)
offset_shape = [shape[0]] + [1] * (indices.shape.ndims - 1)
offset = tf.reshape(tf.range(shape[0]) * shape[1], offset_shape)
output = tf.gather(flat_first, indices + offset)
return output
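# Illustrative example (editorial sketch, not part of the original module):
#
#   tensor  = [[10, 20, 30],
#              [40, 50, 60]]
#   indices = [2, 0]
#   batch_gather(tensor, indices)  # -> [30, 40]
#
# i.e. element 2 of row 0 and element 0 of row 1, matching the pseudocode
# output[i] = tf.gather(tensor[i], indices[i]).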
e6fcc4450c1e5202ad2e054c99228230385b189a | 1,451 | py | Python | align/system/forms.py | Robbie1977/AlignmentPipe | f7979cbf67a40619fd36ae1873c460439d7ecd64 | [
"MIT"
] | null | null | null | align/system/forms.py | Robbie1977/AlignmentPipe | f7979cbf67a40619fd36ae1873c460439d7ecd64 | [
"MIT"
] | 18 | 2015-03-03T15:55:37.000Z | 2016-07-15T13:53:52.000Z | align/system/forms.py | Robbie1977/AlignmentPipe | f7979cbf67a40619fd36ae1873c460439d7ecd64 | [
"MIT"
] | null | null | null | from django.forms import MultipleChoiceField
from django.forms.widgets import CheckboxSelectMultiple
class CSICheckboxSelectMultiple(CheckboxSelectMultiple):
def value_from_datadict(self, data, files, name):
        # Return a string of comma-separated integers, since the database and
        # the field expect a string (not a list).
return ','.join(data.getlist(name))
def render(self, name, value, attrs=None, choices=()):
# Convert comma separated integer string to a list, since the checkbox
# rendering code expects a list (not a string)
if type(value) == str:
value = value.split(',')
return super(CSICheckboxSelectMultiple, self).render(
name, value, attrs=attrs, choices=choices
)
class CSIMultipleChoiceField(MultipleChoiceField):
widget = CSICheckboxSelectMultiple
# Value is stored and retrieved as a string of comma separated
# integers. We don't want to do processing to convert the value to
# a list like the normal MultipleChoiceField does.
def to_python(self, value):
return value
def validate(self, value):
# If we have a value, then we know it is a string of comma separated
# integers. To use the MultipleChoiceField validator, we first have
# to convert the value to a list.
if value:
value = value.split(',')
super(CSIMultipleChoiceField, self).validate(value)
| 39.216216 | 78 | 0.684356 | 178 | 1,451 | 5.561798 | 0.410112 | 0.035354 | 0.027273 | 0.042424 | 0.142424 | 0.142424 | 0.048485 | 0 | 0 | 0 | 0 | 0 | 0.243281 | 1,451 | 36 | 79 | 40.305556 | 0.901639 | 0.383873 | 0 | 0.105263 | 0 | 0 | 0.003398 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.105263 | 0.105263 | 0.631579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
fc0356fecab14ec88b450f8b02f9a6cd65372895 | 1,799 | py | Python | OpenMRS/test_public_data/test_cf_hf_gd.py | BerryAI/Acai | a3dc7d29c3ca4df00817e7ee94440c9e947ebcb4 | [
"MIT"
] | 14 | 2016-08-08T14:57:26.000Z | 2021-04-01T04:17:22.000Z | OpenMRS/test_public_data/test_cf_hf_gd.py | BerryAI/Acai | a3dc7d29c3ca4df00817e7ee94440c9e947ebcb4 | [
"MIT"
] | 5 | 2016-08-17T05:36:09.000Z | 2016-12-28T09:55:51.000Z | OpenMRS/test_public_data/test_cf_hf_gd.py | BerryAI/Acai | a3dc7d29c3ca4df00817e7ee94440c9e947ebcb4 | [
"MIT"
] | 3 | 2016-08-13T03:20:29.000Z | 2017-11-05T12:44:22.000Z | """
test_cf_hf_gd.py
~~~~~~~~~~~~~~~~
This module contains tests of the SVD method for discovering hidden
features in the collaborative filtering method.

In this testing file, we generate a rating matrix for the 1k-user play
history of songs in the Million Song Dataset. Because there is a large
amount of mismatch between the two data sources, we only generate the
rating matrix for the tracks in MSD that are played in the 1k-user
dataset. We then use the gradient descent method to discover the hidden
features of the CF model.

:author: Alexander Z Wang
"""
import numpy
import matplotlib.pyplot as plt
import sys
sys.path.append('./cf')
sys.path.append('./read_public_data')
import msd
import echo_nest as en
import cf_hidden_feature as ch
k = 5
lean_rate = 0.00001
lambda_rate = 0.00
max_iter = 5000
GD_method = 1
filename_tracks = "../../data/subset_unique_tracks.txt"
filename_echo_nest = "../../data/train_triplets.txt"
num = 1000
print "Reading MSD and Echo Nest Data..."
song_index, name_index = msd.get_song_ID_index(filename_tracks)
echo_nest_user_history = en.get_echo_nest_user_history(
filename_echo_nest, song_index)
user_rate_dict = en.get_top_user_rating_from_history_echo_nest(
echo_nest_user_history, num)
print "Calculating Hidden Features..."
user_weight, hidden_feature, res_norm = ch.get_hidden_feature_matrix_GD(
user_rate_dict, k, lean_rate, lambda_rate, max_iter, GD_method)
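# Editorial note (assumption about ch.get_hidden_feature_matrix_GD, stated
# for orientation, not taken from its source): matrix-factorization CF
# typically minimizes
#   || R - U V^T ||^2 + lambda * (||U||^2 + ||V||^2)
# by gradient descent, where R is the user rating matrix, U is user_weight
# (users x k), and V is hidden_feature (tracks x k); res_norm records the
# residual norm at each iteration, which is what the convergence plot shows.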
print "hidden features of 10 songs"
print hidden_feature[0:10, :]
hist, bin_edges = numpy.histogram(hidden_feature, bins=20)
print hist
print bin_edges
# Plot convergence
plt.plot(res_norm)
plt.ylabel('Norm of Error')
plt.xlabel('Iteration Steps')
plt.show()
fc0c98e118d92145677e23f936dc46ea7f3f374e | 7,912 | py | Python | src/sage/combinat/posets/elements.py | bopopescu/sage | 2d495be78e0bdc7a0a635454290b27bb4f5f70f0 | [
"BSL-1.0"
] | 4 | 2020-07-17T04:49:44.000Z | 2020-07-29T06:33:51.000Z | src/sage/combinat/posets/elements.py | Ivo-Maffei/sage | 467fbc70a08b552b3de33d9065204ee9cbfb02c7 | [
"BSL-1.0"
] | 2 | 2018-10-30T13:40:20.000Z | 2020-07-23T12:13:30.000Z | src/sage/combinat/posets/elements.py | dimpase/sage | 468f23815ade42a2192b0a9cd378de8fdc594dcd | [
"BSL-1.0"
] | 7 | 2021-11-08T10:01:59.000Z | 2022-03-03T11:25:52.000Z | r"""
Elements of posets, lattices, semilattices, etc.
"""
#*****************************************************************************
# Copyright (C) 2008 Peter Jipsen <jipsen@chapman.edu>,
# Franco Saliola <saliola@gmail.com>
#
# Distributed under the terms of the GNU General Public License (GPL)
#
# This code is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# The full text of the GPL is available at:
#
# http://www.gnu.org/licenses/
#*****************************************************************************
from sage.structure.element import Element
from sage.structure.element import have_same_parent
class PosetElement(Element):
def __init__(self, poset, element, vertex):
r"""
Establish the parent-child relationship between ``poset``
and ``element``, where ``element`` is associated to the
vertex ``vertex`` of the Hasse diagram of the poset.
INPUT:
- ``poset`` -- a poset object
- ``element`` -- any object
- ``vertex`` -- a vertex of the Hasse diagram of the poset
TESTS::
sage: from sage.combinat.posets.elements import PosetElement
sage: P = Poset([[1,2],[4],[3],[4],[]], facade = False)
sage: e = P(0)
sage: e.parent() is P
True
sage: TestSuite(e).run()
"""
Element.__init__(self, poset)
if isinstance(element, self.parent().element_class):
self.element = element.element
else:
self.element = element
self.vertex = vertex
def __hash__(self):
r"""
TESTS::
sage: P = Poset([[1,2],[4],[3],[4],[]], facade = False)
sage: e = P(0)
sage: hash(e)
0
"""
return hash(self.element)
def _repr_(self):
"""
TESTS::
sage: Poset([[1,2],[4],[3],[4],[]], facade = False)(0)._repr_()
'0'
"""
return "%s" % str(self.element)
def _latex_(self):
r"""
Return the latex code of the poset element.
EXAMPLES::
sage: m = matrix(2,[1,2,3,4])
sage: m.set_immutable()
sage: P = Poset(([m],[]), facade = False)
sage: [e] = P
sage: type(e)
<class 'sage.combinat.posets.posets.FinitePoset_with_category.element_class'>
sage: latex(e) #indirect doctest
\left(\begin{array}{rr}
1 & 2 \\
3 & 4
\end{array}\right)
"""
from sage.misc.latex import latex
return latex(self.element)
def __eq__(self, other):
"""
TESTS::
sage: P = Poset([["a","b"],["d"],["c"],["d"],[]], facade = False)
sage: Q = Poset([["a","b"],["d"],["c"],[],[]], facade = False)
sage: P(0).__eq__(P(4))
False
sage: from sage.combinat.posets.elements import PosetElement
sage: PosetElement(P,0,"c") == PosetElement(P,0,"c")
True
sage: PosetElement(P,0,"c") == PosetElement(Q,0,"c")
False
sage: PosetElement(P,0,"b") == PosetElement(P,0,"c")
False
.. warning:: as an optimization, this only compares the parent
and vertex, using the invariant that, in a proper poset
 element, ``self.element == other.element`` if and only
 if ``self.vertex == other.vertex``::
sage: PosetElement(P,1,"c") == PosetElement(P,0,"c")
True
Test that :trac:`12351` is fixed::
sage: P(0) == int(0)
False
"""
# This should instead exploit unique representation, using
# self is other, or best inherit __eq__ from there. But there
# are issues around pickling and rich comparison functions.
return have_same_parent(self, other) \
and self.vertex == other.vertex
def __ne__(self, other):
r"""
TESTS::
sage: P = Poset([[1,2],[4],[3],[4],[]])
sage: P = Poset([["a","b"],["d"],["c"],["d"],[]])
sage: P(0).__ne__(P(4))
True
sage: from sage.combinat.posets.elements import PosetElement
sage: PosetElement(P,0,"c") != PosetElement(P,0,"c")
False
sage: PosetElement(P,0,"b") != PosetElement(P,0,"c")
True
For this one, see comment in :meth:`__eq__`::
sage: PosetElement(P,1,"c") != PosetElement(P,0,"c")
False
"""
return not self == other
def _cmp(self, other):
"""
TESTS::
sage: P = Poset([[1,2],[4],[3],[4],[]], facade = False)
sage: P(0)._cmp(P(4))
-1
sage: P(4)._cmp(P(0))
1
sage: P(0)._cmp(P(0))
0
sage: P(1)._cmp(P(2))
"""
return self.parent().compare_elements(self, other)
def __lt__(self, other):
"""
TESTS
::
sage: dag = DiGraph({0:[2,3], 1:[3,4], 2:[5], 3:[5], 4:[5]})
sage: P = Poset(dag, facade = False)
sage: P(0) < P(1)
False
sage: P(4) < P(1)
False
sage: P(0) < P(0)
False
"""
return self._cmp(other) == -1 or False
def __le__(self, other):
"""
TESTS
::
sage: dag = DiGraph({0:[2,3], 1:[3,4], 2:[5], 3:[5], 4:[5]})
sage: P = Poset(dag, facade = False)
sage: P(1) <= P(0)
False
sage: P(0) <= P(1)
False
sage: P(0) <= P(3)
True
sage: P(0) <= P(0)
True
"""
return self == other or self._cmp(other) == -1 or False
def __gt__(self, other):
"""
TESTS
::
sage: dag = DiGraph({0:[2,3], 1:[3,4], 2:[5], 3:[5], 4:[5]})
sage: P = Poset(dag)
sage: P(0).__gt__(P(5))
False
sage: P(5).__gt__(P(0))
True
sage: P(0).__gt__(P(0))
False
"""
return self._cmp(other) == 1 or False
def __ge__(self, other):
"""
TESTS
::
sage: dag = DiGraph({0:[2,3], 1:[3,4], 2:[5], 3:[5], 4:[5]})
sage: P = Poset(dag)
sage: P(0).__ge__(P(5))
False
sage: P(5).__ge__(P(0))
True
sage: P(0).__ge__(P(0))
True
"""
return self == other or self._cmp(other) == 1 or False
class MeetSemilatticeElement(PosetElement):
def __mul__(self, other):
r"""
Return the meet of ``self`` and ``other`` in the lattice.
EXAMPLES::
sage: D = posets.DiamondPoset(5,facade=False)
sage: D(1) * D(2)
0
sage: D(1) * D(1)
1
sage: D(1) * D(0)
0
sage: D(1) * D(4)
1
"""
return self.parent().meet(self, other)
class JoinSemilatticeElement(PosetElement):
def __add__(self, other):
r"""
Return the join of ``self`` and ``other`` in the lattice.
EXAMPLES::
sage: D = posets.DiamondPoset(5,facade=False)
sage: D(1) + D(2)
4
sage: D(1) + D(1)
1
sage: D(1) + D(4)
4
sage: D(1) + D(0)
1
"""
return self.parent().join(self, other)
class LatticePosetElement(MeetSemilatticeElement, JoinSemilatticeElement):
pass
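`D(1) * D(2)` and `D(1) + D(2)` above delegate to the parent lattice's `meet` and `join`. As a minimal stand-in (not Sage's implementation) for how meet and join can be computed from a Hasse diagram given as upper covers plus a linear extension:

```python
def down_set(covers, x):
    """All elements <= x, where covers[y] lists the upper covers of y."""
    below = {x}
    changed = True
    while changed:
        changed = False
        for y, ups in covers.items():
            if y not in below and any(u in below for u in ups):
                below.add(y)
                changed = True
    return below


def up_set(covers, x):
    """All elements >= x, by walking upward through the cover relation."""
    above, frontier = {x}, [x]
    while frontier:
        for u in covers[frontier.pop()]:
            if u not in above:
                above.add(u)
                frontier.append(u)
    return above


def meet(covers, order, a, b):
    """Greatest common lower bound; `order` is any linear extension."""
    common = down_set(covers, a) & down_set(covers, b)
    return max(common, key=order.index)


def join(covers, order, a, b):
    """Least common upper bound; `order` is any linear extension."""
    common = up_set(covers, a) & up_set(covers, b)
    return min(common, key=order.index)
```

On the diamond lattice used in the doctests (bottom `0`, atoms `1,2,3`, top `4`), this reproduces `D(1) * D(2) == 0` and `D(1) + D(2) == 4`.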
| 28.156584 | 89 | 0.463473 | 947 | 7,912 | 3.757128 | 0.197466 | 0.020236 | 0.023609 | 0.037943 | 0.463744 | 0.392355 | 0.359472 | 0.354413 | 0.321529 | 0.277684 | 0 | 0.036629 | 0.368554 | 7,912 | 280 | 90 | 28.257143 | 0.67554 | 0.610086 | 0 | 0 | 0 | 0 | 0.001157 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.276596 | false | 0.021277 | 0.06383 | 0 | 0.680851 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
fc11019b0e747e0f8cf7d5a976f5a6af853dce93 | 753 | py | Python | src/DCGAN/experiments/02basic_lsun.py | jimmy-academia/GAN_studies | b3198e0290ac3b78a2b3e949a89aea43e532e558 | [
"MIT"
] | 1 | 2019-02-06T07:53:14.000Z | 2019-02-06T07:53:14.000Z | src/DCGAN/experiments/02basic_lsun.py | jimmy-academia/GAN_studies | b3198e0290ac3b78a2b3e949a89aea43e532e558 | [
"MIT"
] | null | null | null | src/DCGAN/experiments/02basic_lsun.py | jimmy-academia/GAN_studies | b3198e0290ac3b78a2b3e949a89aea43e532e558 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""trains DCGAN on LSUN for 20 epochs and creates grid display generated from fixed noise
Example:
    in parent directory: python experiments/02basic_lsun.py
Todo:
grid display
move check directory to utils
move default result directory to result
__author__ = '{Jimmy Yeh}'
__email__ = '{marrch30@gmail.com}'
"""
import sys
sys.path.append('..')
sys.path.append('.')
from module.trainer import Trainer
from module.config import configurations
from module.utils import check_directories
def main():
config, args, opt = configurations('BASIC_LSUN', 'lsun')
check_directories(opt.dir_list)
trainer = Trainer(config, args, opt)
trainer.train()
if __name__ == '__main__':
main()
| 22.147059 | 90 | 0.706507 | 97 | 753 | 5.268041 | 0.618557 | 0.058708 | 0.050881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011419 | 0.185923 | 753 | 33 | 91 | 22.818182 | 0.822186 | 0.475432 | 0 | 0 | 0 | 0 | 0.064599 | 0 | 0 | 0 | 0 | 0.030303 | 0 | 1 | 0.076923 | false | 0 | 0.307692 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
fc1ac96c556048629747f48f71ddb6e9da234861 | 890 | py | Python | src/coordinate.py | blairck/jaeger | a4b52d1684fd8a2d216360f87aad3ebc2f341cf4 | [
"MIT"
] | 1 | 2016-07-28T04:48:40.000Z | 2016-07-28T04:48:40.000Z | src/coordinate.py | blairck/jaeger | a4b52d1684fd8a2d216360f87aad3ebc2f341cf4 | [
"MIT"
] | 18 | 2016-08-15T03:03:28.000Z | 2018-02-01T08:15:31.000Z | src/coordinate.py | blairck/jaeger | a4b52d1684fd8a2d216360f87aad3ebc2f341cf4 | [
"MIT"
] | null | null | null | """ Implementation of the coordinate class """
from src import helper
class Coordinate(object):
""" Simple interface to pass game coordinates around
Can return 2 types of coordinates:
Board - The coordinate on a board, which is 1-indexed
and is used in the constructor.
Array - Coordinate in the array, which is 0-indexed """
def __init__(self, x, y):
helper.checkIfCoordinateIsValid(x, y)
self.x = x
self.y = y
def get_x_board(self):
""" Get the X coordinate in board notation """
return self.x
def get_y_board(self):
""" Get the Y coordinate in board notation """
return self.y
def get_x_array(self):
""" Get the X coordinate in array notation """
return self.x - 1
def get_y_array(self):
""" Get the Y coordinate in array notation """
return self.y - 1
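Since `src.helper` is not shown, here is a stand-alone usage sketch with an inline validator (the board size of 8 is an assumption for illustration; the real bound lives in `helper.checkIfCoordinateIsValid`):

```python
class Coordinate:
    """Stand-in for src.coordinate.Coordinate with an inline validator.
    The real class delegates validation to src.helper, which is not shown."""
    BOARD_SIZE = 8  # assumed bound, for illustration only

    def __init__(self, x, y):
        # Board notation is 1-indexed, so valid values run 1..BOARD_SIZE
        if not (1 <= x <= self.BOARD_SIZE and 1 <= y <= self.BOARD_SIZE):
            raise ValueError("coordinate off the board: (%d, %d)" % (x, y))
        self.x = x
        self.y = y

    def get_x_array(self):
        return self.x - 1  # array notation is 0-indexed

    def get_y_array(self):
        return self.y - 1
```

For example, board square (3, 5) corresponds to array indices (2, 4), which is exactly the 1-indexed/0-indexed split the class's docstring describes.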
| 28.709677 | 59 | 0.625843 | 127 | 890 | 4.291339 | 0.330709 | 0.110092 | 0.073395 | 0.029358 | 0.337615 | 0.337615 | 0 | 0 | 0 | 0 | 0 | 0.007874 | 0.286517 | 890 | 30 | 60 | 29.666667 | 0.850394 | 0.473034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.357143 | false | 0 | 0.071429 | 0 | 0.785714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |