hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
16776763f681621474515bcfff2ed039c2598d66 | 47 | py | Python | cupy_alias/manipulation/tiling.py | fixstars/clpy | 693485f85397cc110fa45803c36c30c24c297df0 | [
"BSD-3-Clause"
] | 142 | 2018-06-07T07:43:10.000Z | 2021-10-30T21:06:32.000Z | cupy_alias/manipulation/tiling.py | fixstars/clpy | 693485f85397cc110fa45803c36c30c24c297df0 | [
"BSD-3-Clause"
] | 282 | 2018-06-07T08:35:03.000Z | 2021-03-31T03:14:32.000Z | cupy_alias/manipulation/tiling.py | fixstars/clpy | 693485f85397cc110fa45803c36c30c24c297df0 | [
"BSD-3-Clause"
] | 19 | 2018-06-19T11:07:53.000Z | 2021-05-13T20:57:04.000Z | from clpy.manipulation.tiling import * # NOQA
| 23.5 | 46 | 0.765957 | 6 | 47 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148936 | 47 | 1 | 47 | 47 | 0.9 | 0.085106 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1693e93a9a8abe0079b77dd7495701e2f34051aa | 60 | py | Python | validpy/__init__.py | 0scarB/validpy | c557dcc0fd770d4401993bf4875b25c767469b3b | [
"MIT"
] | null | null | null | validpy/__init__.py | 0scarB/validpy | c557dcc0fd770d4401993bf4875b25c767469b3b | [
"MIT"
] | null | null | null | validpy/__init__.py | 0scarB/validpy | c557dcc0fd770d4401993bf4875b25c767469b3b | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
def func() -> bool:
return True
| 15 | 23 | 0.516667 | 8 | 60 | 3.875 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022222 | 0.25 | 60 | 3 | 24 | 20 | 0.666667 | 0.35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
bc4de663c37ea4fd1d297cd41ba880d36d9d5f3b | 5,513 | py | Python | py/ops/itests/test_pods_ports.py | clchiou/garage | 446ff34f86cdbd114b09b643da44988cf5d027a3 | [
"MIT"
] | 3 | 2016-01-04T06:28:52.000Z | 2020-09-20T13:18:40.000Z | py/ops/itests/test_pods_ports.py | clchiou/garage | 446ff34f86cdbd114b09b643da44988cf5d027a3 | [
"MIT"
] | null | null | null | py/ops/itests/test_pods_ports.py | clchiou/garage | 446ff34f86cdbd114b09b643da44988cf5d027a3 | [
"MIT"
] | null | null | null | import unittest
from subprocess import Popen
from .fixtures import Fixture
@Fixture.inside_container
class PodsPortsTest(Fixture, unittest.TestCase):
@classmethod
def setUpClass(cls):
super().setUpClass()
cls.httpd_proc = Popen(
['python3', '-m', 'http.server', '8080'],
cwd=str(cls.testdata_path),
)
@classmethod
def tearDownClass(cls):
cls.httpd_proc.terminate()
cls.httpd_proc.wait()
super().tearDownClass()
# NOTE: Use test name format "test_XXXX_..." to ensure test order.
# (We need this because integration tests are stateful.)
def test_0000_no_pods(self):
self.assertEqual([], self.list_pods())
self.assertEqual([], self.list_ports())
def test_0001_deploy_v1001(self):
self.deploy(self.testdata_path / 'test_pods_ports/1001.json')
self.assertEqual(['//foo/bar:test-ports-pod@1001'], self.list_pods())
self.assertEqual(
[
'//foo/bar:test-ports-pod@1001 http 8000',
'//foo/bar:test-ports-pod@1001 https 8443',
'//foo/bar:test-ports-pod@1001 service1 30000',
'//foo/bar:test-ports-pod@1001 service2 30001',
],
self.list_ports(),
)
def test_0002_stop_v1001(self):
self.stop('//foo/bar:test-ports-pod@1001')
self.assertEqual(['//foo/bar:test-ports-pod@1001'], self.list_pods())
self.assertEqual(
[
'//foo/bar:test-ports-pod@1001 http 8000',
'//foo/bar:test-ports-pod@1001 https 8443',
'//foo/bar:test-ports-pod@1001 service1 30000',
'//foo/bar:test-ports-pod@1001 service2 30001',
],
self.list_ports(),
)
def test_0003_restart_v1001(self):
self.start('//foo/bar:test-ports-pod@1001')
self.assertEqual(['//foo/bar:test-ports-pod@1001'], self.list_pods())
self.assertEqual(
[
'//foo/bar:test-ports-pod@1001 http 8000',
'//foo/bar:test-ports-pod@1001 https 8443',
'//foo/bar:test-ports-pod@1001 service1 30000',
'//foo/bar:test-ports-pod@1001 service2 30001',
],
self.list_ports(),
)
def test_0004_deploy_v1002(self):
self.deploy(self.testdata_path / 'test_pods_ports/1002.json')
self.assertEqual(
[
'//foo/bar:test-ports-pod@1001',
'//foo/bar:test-ports-pod@1002',
],
self.list_pods(),
)
self.assertEqual(
[
'//foo/bar:test-ports-pod@1001 http 8000',
'//foo/bar:test-ports-pod@1001 https 8443',
'//foo/bar:test-ports-pod@1001 service1 30000',
'//foo/bar:test-ports-pod@1001 service2 30001',
'//foo/bar:test-ports-pod@1002 http 8001',
'//foo/bar:test-ports-pod@1002 https 8444',
'//foo/bar:test-ports-pod@1002 service1 30002',
'//foo/bar:test-ports-pod@1002 service2 30003',
],
self.list_ports(),
)
def test_0005_undeploy_v1001(self):
self.undeploy('//foo/bar:test-ports-pod@1001')
self.assertEqual(['//foo/bar:test-ports-pod@1002'], self.list_pods())
self.assertEqual(
[
'//foo/bar:test-ports-pod@1002 http 8001',
'//foo/bar:test-ports-pod@1002 https 8444',
'//foo/bar:test-ports-pod@1002 service1 30002',
'//foo/bar:test-ports-pod@1002 service2 30003',
],
self.list_ports(),
)
def test_0006_redeploy_v1001(self):
self.deploy(self.testdata_path / 'test_pods_ports/1001.json')
self.assertEqual(
[
'//foo/bar:test-ports-pod@1001',
'//foo/bar:test-ports-pod@1002',
],
self.list_pods(),
)
self.assertEqual(
[
'//foo/bar:test-ports-pod@1001 http 8000',
'//foo/bar:test-ports-pod@1001 https 8443',
'//foo/bar:test-ports-pod@1001 service1 30004',
'//foo/bar:test-ports-pod@1001 service2 30005',
'//foo/bar:test-ports-pod@1002 http 8001',
'//foo/bar:test-ports-pod@1002 https 8444',
'//foo/bar:test-ports-pod@1002 service1 30002',
'//foo/bar:test-ports-pod@1002 service2 30003',
],
self.list_ports(),
)
def test_0007_undeploy_all(self):
self.undeploy('//foo/bar:test-ports-pod@1001')
self.undeploy('//foo/bar:test-ports-pod@1002')
self.assertEqual([], self.list_pods())
self.assertEqual([], self.list_ports())
def test_0008_redeploy_v1002(self):
self.deploy(self.testdata_path / 'test_pods_ports/1002.json')
self.assertEqual(
[
'//foo/bar:test-ports-pod@1002',
],
self.list_pods(),
)
self.assertEqual(
[
'//foo/bar:test-ports-pod@1002 http 8000',
'//foo/bar:test-ports-pod@1002 https 8443',
'//foo/bar:test-ports-pod@1002 service1 30000',
'//foo/bar:test-ports-pod@1002 service2 30001',
],
self.list_ports(),
)
if __name__ == '__main__':
unittest.main()
| 35.11465 | 77 | 0.536006 | 642 | 5,513 | 4.490654 | 0.14486 | 0.104058 | 0.17343 | 0.260146 | 0.796393 | 0.790149 | 0.790149 | 0.734651 | 0.734651 | 0.734651 | 0 | 0.123179 | 0.315255 | 5,513 | 156 | 78 | 35.339744 | 0.64053 | 0.021585 | 0 | 0.588235 | 0 | 0 | 0.378594 | 0.287516 | 0 | 0 | 0 | 0 | 0.132353 | 1 | 0.080882 | false | 0 | 0.022059 | 0 | 0.110294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bcd6d4a49c532e9fbd3dc2b3ce65d334809e6c17 | 97 | py | Python | qsearch/__init__.py | ejwilson3/qsearch | 9f966db0f3d54fdaff15f1dd7a0e1d28de55b1b4 | [
"BSD-3-Clause-LBNL"
] | 20 | 2020-11-16T17:07:31.000Z | 2022-01-26T11:00:26.000Z | qsearch/__init__.py | WolfLink/search_compiler | efbc02717600beaa083d3afb9b94e34264f637d7 | [
"BSD-3-Clause-LBNL"
] | 11 | 2020-10-14T17:37:09.000Z | 2022-01-13T02:32:14.000Z | qsearch/__init__.py | WolfLink/search_compiler | efbc02717600beaa083d3afb9b94e34264f637d7 | [
"BSD-3-Clause-LBNL"
] | 6 | 2021-04-09T22:38:40.000Z | 2022-03-13T07:49:40.000Z | from .compiler import *
from . import gatesets
from . import gates
from .project import Project
| 16.166667 | 28 | 0.773196 | 13 | 97 | 5.769231 | 0.461538 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175258 | 97 | 5 | 29 | 19.4 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4c25e45a5df6a34662fa8bd9d70fb8909911d05b | 154 | py | Python | 3rdparty/huawei-lte-api/huawei_lte_api/ApiGroup.py | tux1c0/plugin-huawei4g | af83ce6321db23fcfd51eb204e00c287d3ea2325 | [
"MIT"
] | null | null | null | 3rdparty/huawei-lte-api/huawei_lte_api/ApiGroup.py | tux1c0/plugin-huawei4g | af83ce6321db23fcfd51eb204e00c287d3ea2325 | [
"MIT"
] | 1 | 2020-06-21T20:35:27.000Z | 2020-07-14T20:08:55.000Z | 3rdparty/huawei-lte-api/huawei_lte_api/ApiGroup.py | tux1c0/plugin-huawei4g | af83ce6321db23fcfd51eb204e00c287d3ea2325 | [
"MIT"
] | 3 | 2020-02-27T11:35:55.000Z | 2021-03-25T08:51:51.000Z | from huawei_lte_api.Connection import Connection
class ApiGroup:
def __init__(self, connection: Connection):
self._connection = connection
| 19.25 | 48 | 0.75974 | 17 | 154 | 6.470588 | 0.647059 | 0.254545 | 0.436364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 154 | 7 | 49 | 22 | 0.873016 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
4c3a3bf1467ef37216c40e0dbd9877acbfe8564a | 59 | py | Python | emmett_mongorest/queries/__init__.py | gi0baro/emmett-mongorest | 814658b3d8051ad649f1bc7cffb32354b3b7a97c | [
"BSD-3-Clause"
] | null | null | null | emmett_mongorest/queries/__init__.py | gi0baro/emmett-mongorest | 814658b3d8051ad649f1bc7cffb32354b3b7a97c | [
"BSD-3-Clause"
] | 3 | 2020-12-27T12:29:12.000Z | 2022-02-27T10:34:33.000Z | emmett_mongorest/queries/__init__.py | gi0baro/emmett-mongorest | 814658b3d8051ad649f1bc7cffb32354b3b7a97c | [
"BSD-3-Clause"
] | null | null | null | from .helpers import JSONQueryPipe, AggregateJSONQueryPipe
| 29.5 | 58 | 0.881356 | 5 | 59 | 10.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084746 | 59 | 1 | 59 | 59 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
912ec6f581f1e92b263b2b99918a289f603376aa | 47 | py | Python | morphablegraphs/motion_generator/optimization/__init__.py | dfki-asr/morphablegraphs | 02c77aab72aa4b58f4067c720f5d124f0be3ea80 | [
"MIT"
] | 5 | 2020-03-03T21:07:01.000Z | 2021-05-12T16:59:28.000Z | morphablegraphs/motion_generator/optimization/__init__.py | dfki-asr/morphablegraphs | 02c77aab72aa4b58f4067c720f5d124f0be3ea80 | [
"MIT"
] | null | null | null | morphablegraphs/motion_generator/optimization/__init__.py | dfki-asr/morphablegraphs | 02c77aab72aa4b58f4067c720f5d124f0be3ea80 | [
"MIT"
] | 1 | 2020-07-20T06:57:08.000Z | 2020-07-20T06:57:08.000Z | from .optimizer_builder import OptimizerBuilder | 47 | 47 | 0.914894 | 5 | 47 | 8.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06383 | 47 | 1 | 47 | 47 | 0.954545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9140cc4c3a0d4ca2d6f932f6add315ebe6bb6c8b | 202 | py | Python | src/VarDACAE/__init__.py | scheng1992/Data_Assimilation | b4d43895229205ee2cd16b15ee20beccb33b71d6 | [
"MIT"
] | 1 | 2021-11-25T12:46:48.000Z | 2021-11-25T12:46:48.000Z | src/VarDACAE/__init__.py | bugsuse/Data_Assimilation | 2965ccf78951df11f8686282cd6814bae18afde5 | [
"MIT"
] | null | null | null | src/VarDACAE/__init__.py | bugsuse/Data_Assimilation | 2965ccf78951df11f8686282cd6814bae18afde5 | [
"MIT"
] | 2 | 2021-03-02T13:29:34.000Z | 2022-03-12T11:01:08.000Z | from VarDACAE.data.load import GetData
from VarDACAE.data.split import SplitData
from VarDACAE.train import TrainAE
from VarDACAE.VarDA.batch_DA import BatchDA
from VarDACAE.train.retrain import retrain | 40.4 | 43 | 0.861386 | 30 | 202 | 5.766667 | 0.5 | 0.346821 | 0.184971 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094059 | 202 | 5 | 44 | 40.4 | 0.945355 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e6984f1db2756d9bbae746a13617b3368a9cf460 | 23 | py | Python | erqa/__init__.py | msu-video-group/ERQA | 5a0c64cfd02e48d9a5f749027de479b42fd0a59d | [
"MIT"
] | 18 | 2021-09-10T14:59:59.000Z | 2022-03-09T07:02:20.000Z | erqa/__init__.py | msu-video-group/erqa | 5a0c64cfd02e48d9a5f749027de479b42fd0a59d | [
"MIT"
] | null | null | null | erqa/__init__.py | msu-video-group/erqa | 5a0c64cfd02e48d9a5f749027de479b42fd0a59d | [
"MIT"
] | 1 | 2022-01-25T13:28:02.000Z | 2022-01-25T13:28:02.000Z | from .erqa import ERQA
| 11.5 | 22 | 0.782609 | 4 | 23 | 4.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e6d94e7acd1f3f91a12f8bcad3ee1f83df34a995 | 17,635 | py | Python | lg_attract_loop/test/online/test_lg_attract_loop.py | carlosvquezada/lg_ros_nodes | 7560e99272d06ef5c80a5444131dad72c078a718 | [
"Apache-2.0"
] | null | null | null | lg_attract_loop/test/online/test_lg_attract_loop.py | carlosvquezada/lg_ros_nodes | 7560e99272d06ef5c80a5444131dad72c078a718 | [
"Apache-2.0"
] | null | null | null | lg_attract_loop/test/online/test_lg_attract_loop.py | carlosvquezada/lg_ros_nodes | 7560e99272d06ef5c80a5444131dad72c078a718 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
PKG = 'lg_attract_loop'
NAME = 'test_lg_attract_loop'
import rospy
import unittest
import json
from lg_attract_loop import AttractLoop
from std_msgs.msg import Bool
class MockAPI:
def __init__(self):
self.presentation_group = {
"meta": {
"limit": 1000,
"next": None,
"offset": 0,
"previous": None,
"total_count": 1
},
"objects": [
{
"attract_loop": True,
"description": "Test",
"name": "Test",
"resource_uri": "/director_api/presentationgroup/kml-test/",
"slug": "kml-test"
}
]
}
self.presentations = {
"attract_loop": True,
"description": "Test",
"name": "Test",
"presentations": [
{
"description": "test presentation",
"name": "Test presentation",
"resource_uri": "/director_api/presentation/kml-test-presentation/",
"slug": "kml-test-presentation"
}
],
"resource_uri": "/director_api/presentationgroup/kml-test/",
"slug": "kml-test"
}
self.presentation = {
"description": "test presentation",
"name": "Test presentation",
"resource_uri": "/director_api/presentation/kml-test-presentation/",
"scenes": [
{
"description": "Openflights data",
"duration": 1,
"name": "Flights",
"resource_uri": "/director_api/scene/flights/",
"slug": "flights"
},
{
"description": "",
"duration": 1,
"name": "Mplayer two different instances",
"resource_uri": "/director_api/scene/mplayer-two-different-instances/",
"slug": "mplayer-two-different-instances"
}
],
"slug": "kml-test-presentation"
}
self.mplayer_scene = {
"description": "",
"duration": 10000,
"name": "Mplayer two different instances",
"resource_uri": "/director_api/scene/mplayer-two-different-instances/",
"slug": "mplayer-two-different-instances",
"windows": [
{
"activity": "video",
"assets": [
"http://lg-head/lg/assets/videos/1222.mp4"
],
"height": 500,
"presentation_viewport": "left_one",
"width": 500,
"x_coord": 200,
"y_coord": 200
}
]
}
self.flights_scene = {
"description": "Openflights data",
"duration": 1000,
"name": "Flights",
"resource_uri": "/director_api/scene/flights/",
"slug": "flights",
"windows": [
{
"activity": "earth",
"assets": [
"http://www.doogal.co.uk/LondonTubeLinesKML.php",
"http://openflights.org/demo/openflights-sample.kml"
],
"height": 1,
"presentation_viewport": "center",
"width": 1,
"x_coord": 1,
"y_coord": 1
}
]
}
def get(self, url):
rospy.loginfo("MockAPI call %s" % url)
if url == '/director_api/presentationgroup/?attract_loop=True':
return json.dumps(self.presentation_group)
elif url == '/director_api/presentationgroup/kml-test/':
return json.dumps(self.presentations)
elif url == '/director_api/presentation/kml-test-presentation/':
return json.dumps(self.presentation)
elif url == '/director_api/scene/flights/?format=json':
return json.dumps(self.flights_scene)
elif url == '/director_api/scene/mplayer-two-different-instances/?format=json':
return json.dumps(self.mplayer_scene)
else:
rospy.logerr("Unknown call to MockAPI %s" % url)
class MockDirectorScenePublisher:
def __init__(self):
self.published_scenes = []
def publish(self, message):
rospy.loginfo("MDSP publishing message %s" % message)
self.published_scenes.append(message)
class MockDirectorPresentationPublisher:
def __init__(self):
self.published_presentations = []
def publish(self, message):
rospy.loginfo("MDPP publishing message %s" % message)
self.published_presentations.append(message)
class MockEarthQueryPublisher:
def __init__(self):
self.published_messages = []
def publish(self, message):
rospy.loginfo("MEQP publishing message %s" % message)
self.published_messages.append(message)
class MockEarthPlanetPublisher:
def __init__(self):
self.published_messages = []
def publish(self, message):
rospy.loginfo("MEPP publishing message %s" % message)
self.published_messages.append(message)
class TestAttractLoop(unittest.TestCase):
def setUp(self):
self.maxDiff = None
rospy.init_node("lg_attract_loop_testing")
def _init_mocks(self):
self.mock_api = MockAPI()
self.mock_director_scene_publisher = MockDirectorScenePublisher()
self.mock_director_presentation_publisher = MockDirectorPresentationPublisher()
self.earth_query_publisher = MockEarthQueryPublisher()
self.earth_planet_publisher = MockEarthPlanetPublisher()
def _deactivate_lg(self):
rospy.loginfo("Changing LG state to inactive to start playback")
self.attract_loop_controller._process_activity_state_change(Bool(data=False))
rospy.loginfo("Sleeping 2 seconds before making asserts")
rospy.sleep(2)
def _activate_lg(self):
rospy.loginfo("Activating LG - attract loop should stop")
self.attract_loop_controller._process_activity_state_change(Bool(data=True))
rospy.loginfo("Sleeping 2 seconds for the attract loop to stop")
rospy.sleep(2)
def test_1_entering_and_exiting_attract_loop_with_go_blank(self):
self._init_mocks()
self.stop_action = 'go_blank'
self.attract_loop_controller = AttractLoop(
api_proxy=self.mock_api, director_scene_publisher=self.mock_director_scene_publisher,
director_presentation_publisher=self.mock_director_presentation_publisher,
stop_action=self.stop_action, earth_query_publisher=self.earth_query_publisher,
earth_planet_publisher=self.earth_planet_publisher, default_presentation=None)
self.assertEqual(isinstance(self.attract_loop_controller, AttractLoop), True)
self.assertEqual(self.attract_loop_controller.play_loop, False)
self.assertEqual(self.attract_loop_controller.attract_loop_queue, [])
rospy.loginfo("game is on - first wait for initialization and check whether attract loop was populated with scenes")
rospy.sleep(2)
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # one item waiting for playback
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue[0]['scenes']), 2) # there are two scenes in the queue
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 0) # no scenes were published yet
self._deactivate_lg()
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # the item is still in the queue
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue[0]['scenes']), 1) # one out of two scenes left in the queue
print("Published scenes %s" % self.mock_director_scene_publisher.published_scenes)
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 1) # one scene playing back right now (1000 seconds :)
self.assertEqual(self.attract_loop_controller.attract_loop_queue[0]['scenes'][0]['slug'], self.mock_api.mplayer_scene['slug']) # mplayer scene is waiting for publication
self.assertEqual(self.attract_loop_controller.attract_loop_queue[0]['scenes'][0]['description'], self.mock_api.mplayer_scene['description']) # mplayer scene again
self.assertEqual(self.attract_loop_controller.scene_timer > 500, True)  # scene timer should be something like 997 here
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[0].message), self.mock_api.flights_scene) # flights scene got published
self._activate_lg()
self.assertEqual(self.attract_loop_controller.scene_timer <= 0, True) # scene timer is going down
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 2) # blank message was published because of 'go_blank'
self.assertEqual(len(self.earth_query_publisher.published_messages), 1) # earth was stopped
self._deactivate_lg()
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 3) # flights + blank + mplayer
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[2].message)['slug'], 'mplayer-two-different-instances') # blank message was published because of 'go_blank'
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # still one item with one scene waiting to be published
self.assertEqual(self.attract_loop_controller.scene_timer >= 9000, True) # scene timer is going down
self._activate_lg()
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 4) # flights + blank + mplayer + blank
self.assertEqual(len(self.earth_query_publisher.published_messages), 2) # earth was stopped
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[0].message)['description'], 'Openflights data') # flights scene was published in attract loop
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[1].message)['description'], 'attract loop blank scene') # empty attract loop scene
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[2].message)['description'], '') # scene without descriptor was the 3rd scene
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[3].message)['description'], 'attract loop blank scene') # blank message was published because of 'go_blank'
def test_2_entering_and_exiting_attract_loop_with_stop_playtour(self):
self._init_mocks()
self.stop_action = 'stop_playtour'
self.attract_loop_controller = AttractLoop(
api_proxy=self.mock_api, director_scene_publisher=self.mock_director_scene_publisher,
director_presentation_publisher=self.mock_director_presentation_publisher,
stop_action=self.stop_action,
earth_planet_publisher=self.earth_planet_publisher,
earth_query_publisher=self.earth_query_publisher,
default_presentation=None)
self.assertEqual(isinstance(self.attract_loop_controller, AttractLoop), True)
self.assertEqual(self.attract_loop_controller.play_loop, False)
self.assertEqual(self.attract_loop_controller.attract_loop_queue, [])
rospy.loginfo("game is on - first wait for initialization and check whether attract loop was populated with scenes")
rospy.sleep(2)
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # one item waiting for playback
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue[0]['scenes']), 2) # there are two scenes in the queue
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 0) # no scenes were published yet
self._deactivate_lg()
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 1) # one scene playing back right now (1000 seconds :))
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # the other one in the queue
self.assertEqual(self.attract_loop_controller.attract_loop_queue[0]['scenes'][0]['slug'], self.mock_api.mplayer_scene['slug']) # mplayer scene is waiting for publication
self.assertEqual(self.attract_loop_controller.scene_timer > 500, True) # scene timer should be something like 997 here
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[0].message), self.mock_api.flights_scene) # flights scene got published
self._activate_lg()
self.assertEqual(self.attract_loop_controller.scene_timer <= 0, True) # scene timer is going down
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 1) # only playtour was published
self.assertEqual(len(self.earth_query_publisher.published_messages), 1) # playtour was published
self._deactivate_lg()
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 2) # flights + mplayer
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[1].message)['slug'], 'mplayer-two-different-instances')
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[0].message)['slug'], 'flights')
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # attract loop filled with new content again
self.assertEqual(self.attract_loop_controller.scene_timer >= 9000, True) # scene timer is going down
self._activate_lg()
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 2) # flights + mplayer
self.assertEqual(len(self.earth_query_publisher.published_messages), 2) # earth was stopped
def test_3_entering_and_exiting_attract_loop_with_go_blank_and_switch_to_planet(self):
self._init_mocks()
self.stop_action = 'go_blank_and_switch_to_planet'
self.attract_loop_controller = AttractLoop(
api_proxy=self.mock_api, director_scene_publisher=self.mock_director_scene_publisher,
director_presentation_publisher=self.mock_director_presentation_publisher,
stop_action=self.stop_action,
earth_planet_publisher=self.earth_planet_publisher,
earth_query_publisher=self.earth_query_publisher,
default_presentation=None)
self.assertEqual(isinstance(self.attract_loop_controller, AttractLoop), True)
self.assertEqual(self.attract_loop_controller.play_loop, False)
self.assertEqual(self.attract_loop_controller.attract_loop_queue, [])
rospy.loginfo("game is on - first wait for initialization and check whether attract loop was populated with scenes")
rospy.sleep(3)
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # two scenes waiting for playback
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 0) # no scenes published yet
self._deactivate_lg()
self.assertEqual(len(self.earth_planet_publisher.published_messages), 1) # planet message was published for the first time
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 1) # one scene playing back right now (1000 seconds :))
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # the other one in the queue
self.assertEqual(self.attract_loop_controller.attract_loop_queue[0]['scenes'][0]['slug'], self.mock_api.mplayer_scene['slug']) # mplayer scene is waiting for publication
self.assertEqual(self.attract_loop_controller.scene_timer > 500, True) # scene timer should be something like 997 here
self.assertEqual(json.loads(self.mock_director_scene_publisher.published_scenes[0].message), self.mock_api.flights_scene) # flights scene got published
self._activate_lg()
self.assertEqual(self.attract_loop_controller.scene_timer <= 0, True) # scene timer has run down to zero
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 2)
self.assertEqual(len(self.earth_query_publisher.published_messages), 1) # playtour was published
self.assertEqual(len(self.earth_planet_publisher.published_messages), 2) # planet message was published for the second time
self._deactivate_lg()
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 3) # flights + mplayer + more
self.assertEqual(len(self.attract_loop_controller.attract_loop_queue), 1) # attract loop filled with new content again
self.assertEqual(self.attract_loop_controller.scene_timer >= 9000, True) # scene timer was reset to the full scene duration
self._activate_lg()
self.assertEqual(len(self.mock_director_scene_publisher.published_scenes), 4)
self.assertEqual(len(self.earth_query_publisher.published_messages), 2) # earth was stopped
self.assertEqual(len(self.earth_planet_publisher.published_messages), 4) # planet change was published 4 times
if __name__ == '__main__':
import rostest
rostest.rosrun(PKG, NAME, TestAttractLoop)
| 50.821326 | 196 | 0.673774 | 2,018 | 17,635 | 5.626858 | 0.113479 | 0.075561 | 0.051519 | 0.085865 | 0.812329 | 0.783708 | 0.740731 | 0.711669 | 0.699251 | 0.675738 | 0 | 0.011187 | 0.229544 | 17,635 | 346 | 197 | 50.968208 | 0.824538 | 0.104225 | 0 | 0.52669 | 0 | 0 | 0.165047 | 0.053618 | 0 | 0 | 0 | 0 | 0.245552 | 0 | null | null | 0 | 0.021352 | null | null | 0.003559 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e6f1d8d5c6959abd0c135470fe4e9ca0177296d8 | 201 | py | Python | tests/deprecation_rules/deprecated_django_local_flavor_module/checked_file.py | iwoca/django-seven | c7be98b73c139c9e74a9be94a0f20a723c739c80 | [
"BSD-3-Clause"
] | 9 | 2016-05-25T22:33:17.000Z | 2021-05-29T18:38:07.000Z | tests/deprecation_rules/deprecated_django_local_flavor_module/checked_file.py | iwoca/django-upgrade-tools | c7be98b73c139c9e74a9be94a0f20a723c739c80 | [
"BSD-3-Clause"
] | 3 | 2016-06-19T21:17:06.000Z | 2016-07-20T20:26:14.000Z | tests/deprecation_rules/deprecated_django_local_flavor_module/checked_file.py | iwoca/django-upgrade-tools | c7be98b73c139c9e74a9be94a0f20a723c739c80 | [
"BSD-3-Clause"
] | 1 | 2016-08-11T07:27:30.000Z | 2016-08-11T07:27:30.000Z | # Old django core localflavor import
from django.contrib import localflavor
import django.contrib.localflavor
# New correct localflavor import, as it is not an independent package.
import localflavor
| 28.714286 | 70 | 0.830846 | 27 | 201 | 6.185185 | 0.592593 | 0.305389 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134328 | 201 | 6 | 71 | 33.5 | 0.95977 | 0.512438 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fc243e4575ed503ed304f79464daab8424c23389 | 204 | py | Python | mwptoolkit/module/Decoder/__init__.py | ShubhamAnandJain/MWP-CS229 | ce86233504fdb37e104a3944fd81d4606fbfa621 | [
"MIT"
] | 71 | 2021-03-08T06:06:15.000Z | 2022-03-30T11:59:37.000Z | mwptoolkit/module/Decoder/__init__.py | ShubhamAnandJain/MWP-CS229 | ce86233504fdb37e104a3944fd81d4606fbfa621 | [
"MIT"
] | 13 | 2021-09-07T12:38:23.000Z | 2022-03-22T15:08:16.000Z | mwptoolkit/module/Decoder/__init__.py | ShubhamAnandJain/MWP-CS229 | ce86233504fdb37e104a3944fd81d4606fbfa621 | [
"MIT"
] | 21 | 2021-02-16T07:46:36.000Z | 2022-03-23T13:41:33.000Z | from __future__ import absolute_import
from __future__ import print_function
from __future__ import division
from mwptoolkit.module.Decoder import ept_decoder,rnn_decoder,transformer_decoder,tree_decoder | 40.8 | 94 | 0.897059 | 27 | 204 | 6.111111 | 0.518519 | 0.181818 | 0.290909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078431 | 204 | 5 | 94 | 40.8 | 0.87766 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.25 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fc77d5c24012352b85cdf5acd2c4b4affbcea6f0 | 31,946 | py | Python | pymcr/tests/test_constraints.py | usnistgov/pyMCR | ad49381573c27a1e3502589f62f1817c988dcb13 | [
"Python-2.0"
] | 34 | 2019-04-17T08:15:55.000Z | 2022-02-25T18:12:03.000Z | pymcr/tests/test_constraints.py | usnistgov/pyMCR | ad49381573c27a1e3502589f62f1817c988dcb13 | [
"Python-2.0"
] | 21 | 2019-04-12T16:24:40.000Z | 2022-02-18T16:57:25.000Z | pymcr/tests/test_constraints.py | usnistgov/pyMCR | ad49381573c27a1e3502589f62f1817c988dcb13 | [
"Python-2.0"
] | 12 | 2019-04-15T14:07:58.000Z | 2021-11-24T09:40:28.000Z | """
Testing pymcr.constraints
"""
import numpy as np
from numpy.testing import assert_allclose
from pymcr.constraints import (ConstraintNonneg, ConstraintNorm,
ConstraintCumsumNonneg, ConstraintZeroEndPoints,
ConstraintZeroCumSumEndPoints, ConstraintCompressAbove,
ConstraintCompressBelow, ConstraintCutAbove, ConstraintCutBelow,
ConstraintReplaceZeros, ConstraintPlanarize)
import pytest
def test_nonneg():
A = np.array([[1, 2, 3], [-1, -2, -3], [1, 2, 3]])
A_nn = np.array([[1, 2, 3], [0, 0, 0], [1, 2, 3]])
constr_nn = ConstraintNonneg(copy=True)
out = constr_nn.transform(A)
assert_allclose(A_nn, out)
constr_nn = ConstraintNonneg(copy=False)
out = constr_nn.transform(A)
assert_allclose(A_nn, A)
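For reference, the nonnegativity transform checked above reduces to a single elementwise operation. This numpy-only sketch is not the pymcr implementation itself (which also handles the copy/in-place switch), just the math it applies:

```python
import numpy as np

# Negative entries are zeroed; everything else passes through unchanged.
A = np.array([[1, 2, 3], [-1, -2, -3], [1, 2, 3]])
A_nn = np.where(A > 0, A, 0)
print(A_nn.tolist())  # [[1, 2, 3], [0, 0, 0], [1, 2, 3]]
```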
def test_cumsumnonneg():
""" Cum-Sum Nonnegativity Constraint """
A = np.array([[2, -2, 3, -2], [-1, -2, -3, 7], [1, -2, -3, 7]])
A_nn_ax1 = np.array([[2, 0, 3, -2], [0, 0, 0, 7], [1, 0, 0, 7]])
A_nn_ax0 = np.array([[2, 0, 3, 0], [-1, 0, 0, 7], [1, 0, 0, 7]])
# Axis -1
constr_nn = ConstraintCumsumNonneg(copy=True, axis=-1)
out = constr_nn.transform(A)
assert_allclose(A_nn_ax1, out)
# Axis 0
constr_nn = ConstraintCumsumNonneg(copy=False, axis=0)
out = constr_nn.transform(A)
assert_allclose(A_nn_ax0, A)
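The expected arrays above are consistent with a cumulative-sum mask: an entry survives only where the running cumsum along the chosen axis is strictly positive. A numpy-only sketch (the helper name `cumsum_nonneg` is illustrative, not pymcr's API, and the real class may differ in details such as copying):

```python
import numpy as np

def cumsum_nonneg(A, axis=-1):
    # Keep each entry only where the running cumulative sum along `axis`
    # is strictly positive; all other entries are zeroed.
    return A * (np.cumsum(A, axis=axis) > 0)

A = np.array([[2, -2, 3, -2], [-1, -2, -3, 7], [1, -2, -3, 7]])
print(cumsum_nonneg(A, axis=-1).tolist())  # [[2, 0, 3, -2], [0, 0, 0, 7], [1, 0, 0, 7]]
```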
def test_zeroendpoints():
""" 0-Endpoints Constraint """
A = np.array([[1, 2, 3, 4], [3, 6, 9, 12], [4, 8, 12, 16]]).astype(float)
A_ax1 = np.array([[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]).astype(float)
A_ax0 = np.array([[0, 0, 0, 0], [0.5, 1, 1.5, 2], [0, 0, 0, 0]]).astype(float)
# Axis 0
constr_ax0 = ConstraintZeroEndPoints(copy=True, axis=0)
out = constr_ax0.transform(A)
assert_allclose(A_ax0, out)
# Axis -1
constr_ax1 = ConstraintZeroEndPoints(copy=True, axis=-1)
out = constr_ax1.transform(A)
assert_allclose(A_ax1, out)
with pytest.raises(TypeError):
constr_ax1 = ConstraintZeroEndPoints(copy=True, axis=3)
# Axis 0 -- NOT copies
constr_ax0 = ConstraintZeroEndPoints(copy=False, axis=0)
out = constr_ax0.transform(A)
assert_allclose(A_ax0, A)
def test_zeroendpoints_span():
""" 0-Endpoints Constraint """
A = np.array([[1, 2, 3, 4], [3, 6, 9, 12], [4, 8, 12, 16]]).astype(float)
# Axis 1
constr_ax1 = ConstraintZeroEndPoints(copy=True, axis=1, span=2)
out = constr_ax1.transform(A)
assert_allclose(out[:, [0,1]].mean(axis=1), 0)
assert_allclose(out[:, [1,2]].mean(axis=1), 0)
# Axis 0
constr_ax0 = ConstraintZeroEndPoints(copy=True, axis=0, span=2)
out = constr_ax0.transform(A)
assert_allclose(out[[0,1], :].mean(axis=0), 0)
assert_allclose(out[[1,2], :].mean(axis=0), 0)
# effectively an assert_not_equal
assert_allclose([q != 0 for q in out[:,0]], True)
assert_allclose([q != 0 for q in out[:,-1]], True)
# Axis 1 -- no copy
constr_ax1 = ConstraintZeroEndPoints(copy=False, axis=1, span=2)
out = constr_ax1.transform(A)
assert_allclose(A[:, [0,1]].mean(axis=1), 0)
assert_allclose(A[:, [1,2]].mean(axis=1), 0)
def test_zerocumsumendpoints():
""" Cum-Sum 0-Endpoints Constraint """
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_diff1 = np.array([[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]]).astype(float)
A_diff0 = np.array([[3, 3, 3, 3], [3, 3, 3, 3], [3, 3, 3, 3]]).astype(float)
# A_ax1 = np.array([[0, 0, 0], [0, 0, 0], [0, 0, 0]])
# A_ax0 = np.array([[0, 0, 0], [0.5, 1, 1.5], [0, 0, 0]])
# Axis 0
constr_ax0 = ConstraintZeroCumSumEndPoints(copy=True, axis=0)
out = constr_ax0.transform(A_diff0)
assert_allclose(out, 0)
assert_allclose(np.cumsum(out, axis=0), 0)
# Axis -1
constr_ax1 = ConstraintZeroCumSumEndPoints(copy=True, axis=-1)
out = constr_ax1.transform(A_diff1)
assert_allclose(out, 0)
assert_allclose(np.cumsum(out, axis=1), 0)
# Axis = -1 -- NOT copy
constr_ax1 = ConstraintZeroCumSumEndPoints(copy=False, axis=-1)
out = constr_ax1.transform(A_diff1)
assert_allclose(A_diff1, 0)
assert_allclose(np.cumsum(A_diff1, axis=1), 0)
def test_zerocumsumendpoints_nodes():
""" Cum-Sum 0-Endpoints Constraint with defined nodes"""
A = np.array([[1, 2, 1, 12], [1, 2, 2, 3], [1, 2, 3, 6]]).astype(float)
A_transform_ax0 = np.array([[0, 0, -1, 5], [0, 0, 0, -4], [0, 0, 1, -1]])
A_transform_ax1 = np.array([[-3, -2, -3, 8], [-1, 0, 0, 1], [-2, -1, 0, 3]])
# COPY Axis 0, node=0 (same as endpoints)
constr_ax0 = ConstraintZeroCumSumEndPoints(copy=True, nodes=[0], axis=0)
out = constr_ax0.transform(A)
assert_allclose(out, A_transform_ax0)
# assert_allclose(np.cumsum(out, axis=0), 0)
# COPY Axis -1, node=0 (same as endpoints)
constr_ax1 = ConstraintZeroCumSumEndPoints(copy=True, nodes=[0], axis=-1)
out = constr_ax1.transform(A)
assert_allclose(out, A_transform_ax1)
# assert_allclose(np.cumsum(out, axis=1), 0)
# OVERWRITE, Axis = 0, node=0 (same as endpoints)
A = np.array([[1, 2, 1, 12], [1, 2, 2, 3], [1, 2, 3, 6]]).astype(float)
constr_ax0 = ConstraintZeroCumSumEndPoints(copy=False, nodes=[0], axis=0)
out = constr_ax0.transform(A)
assert_allclose(A, A_transform_ax0)
# OVERWRITE, Axis = 1, node=0 (same as endpoints)
A = np.array([[1, 2, 1, 12], [1, 2, 2, 3], [1, 2, 3, 6]]).astype(float)
constr_ax1 = ConstraintZeroCumSumEndPoints(copy=False, nodes=[0], axis=-1)
out = constr_ax1.transform(A)
assert_allclose(A, A_transform_ax1)
# COPY, Axis = 0, Nodes = all
A = np.array([[1, 2, 1, 12], [1, 2, 2, 3], [1, 2, 3, 6]]).astype(float)
constr_ax0 = ConstraintZeroCumSumEndPoints(copy=True, nodes=[0,1,2], axis=0)
out = constr_ax0.transform(A)
assert_allclose(out, 0)
# COPY, Axis = 1, Nodes = all
A = np.array([[1, 2, 1, 12], [1, 2, 2, 3], [1, 2, 3, 6]]).astype(float)
constr_ax1 = ConstraintZeroCumSumEndPoints(copy=True, nodes=[0,1,2,3], axis=1)
out = constr_ax1.transform(A)
assert_allclose(out, 0)
# OVERWRITE, Axis = 0, Nodes = all
A = np.array([[1, 2, 1, 12], [1, 2, 2, 3], [1, 2, 3, 6]]).astype(float)
constr_ax0 = ConstraintZeroCumSumEndPoints(copy=False, nodes=[0,1,2], axis=0)
out = constr_ax0.transform(A)
assert_allclose(A, 0)
# OVERWRITE, Axis = 1, Nodes = all
A = np.array([[1, 2, 1, 12], [1, 2, 2, 3], [1, 2, 3, 6]]).astype(float)
constr_ax1 = ConstraintZeroCumSumEndPoints(copy=False, nodes=[0,1,2,3], axis=1)
out = constr_ax1.transform(A)
assert_allclose(A, 0)
def test_norm():
""" Test normalization """
# A must have a float dtype for in-place math (copy=False)
constr_norm = ConstraintNorm(axis=0, copy=False)
A = np.array([[1, 2, 3], [-1, -2, -3], [1, 2, 3]]) # dtype: int32
with pytest.raises(TypeError):
out = constr_norm.transform(A)
# Axis must be 0, 1, or -1
with pytest.raises(ValueError):
constr_norm = ConstraintNorm(axis=2, copy=False)
A = np.array([[1, 2, 3], [-1, -2, -3], [1, 2, 3]], dtype=float)
A_norm0 = A / A.sum(axis=0)[None,:]
A_norm1 = A / A.sum(axis=1)[:,None]
constr_norm = ConstraintNorm(axis=0, copy=True)
out = constr_norm.transform(A)
assert_allclose(A_norm0, out)
constr_norm = ConstraintNorm(axis=1, copy=True)
out = constr_norm.transform(A)
assert_allclose(A_norm1, out)
constr_norm = ConstraintNorm(axis=-1, copy=True)
out = constr_norm.transform(A)
assert_allclose(A_norm1, out)
constr_norm = ConstraintNorm(axis=0, copy=False)
out = constr_norm.transform(A)
assert_allclose(A_norm0, A)
A = np.array([[1, 2, 3], [-1, -2, -3], [1, 2, 3]], dtype=float)
constr_norm = ConstraintNorm(axis=1, copy=False)
out = constr_norm.transform(A)
assert_allclose(A_norm1, A)
def test_norm_fixed_axes():
# AXIS = 1
A = np.array([[0.0, 0.2, 1.0, 0.0], [0.25, 0.25, 0.0, 0.0], [0.3, 0.9, 0.6, 0.0]], dtype=float)
A_fix2_ax1 = np.array([[0.0, 0.0, 1.0, 0.0], [0.5, 0.5, 0.0, 0.0], [0.1, 0.3, 0.6, 0.0]], dtype=float)
# Fixed axes must be integers
with pytest.raises(TypeError):
constr_norm = ConstraintNorm(axis=1, fix=2.2, copy=True)
# Dtype must be integer related
with pytest.raises(TypeError):
constr_norm = ConstraintNorm(axis=1, fix=np.array([2.2]), copy=True)
# COPY: True
# Fix of type int
constr_norm = ConstraintNorm(axis=1, fix=2, copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# Fix of type list
constr_norm = ConstraintNorm(axis=1, fix=[2, 3], copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# Fix of type tuple
constr_norm = ConstraintNorm(axis=1, fix=(2), copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# Fix of type ndarray
constr_norm = ConstraintNorm(axis=1, fix=np.array([2]), copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# COPY: False
A_fix2_ax1 = np.array([[0.0, 0.0, 1.0, 0.0], [0.5, 0.5, 0.0, 0.0], [0.1, 0.3, 0.6, 0.0]], dtype=float)
# Fix of type int
A = np.array([[0.0, 0.2, 1.0, 0.0], [0.25, 0.25, 0.0, 0.0], [0.3, 0.9, 0.6, 0.0]], dtype=float)
constr_norm = ConstraintNorm(axis=1, fix=2, copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
# Fix of type list
A = np.array([[0.0, 0.2, 1.0, 0.0], [0.25, 0.25, 0.0, 0.0], [0.3, 0.9, 0.6, 0.0]], dtype=float)
constr_norm = ConstraintNorm(axis=1, fix=[2,3], copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
# Fix of type tuple
A = np.array([[0.0, 0.2, 1.0, 0.0], [0.25, 0.25, 0.0, 0.0], [0.3, 0.9, 0.6, 0.0]], dtype=float)
constr_norm = ConstraintNorm(axis=1, fix=(2), copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
# Fix of type ndarray
A = np.array([[0.0, 0.2, 1.0, 0.0], [0.25, 0.25, 0.0, 0.0], [0.3, 0.9, 0.6, 0.0]], dtype=float)
constr_norm = ConstraintNorm(axis=1, fix=np.array([2]), copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
# AXIS = 0
# Lazy, so just transposed
A = np.array([[0.0, 0.2, 1.0, 0.0], [0.25, 0.25, 0.0, 0.0], [0.3, 0.9, 0.6, 0.0]], dtype=float).T
A_fix2_ax1 = np.array([[0.0, 0.0, 1.0, 0.0], [0.5, 0.5, 0.0, 0.0], [0.1, 0.3, 0.6, 0.0]], dtype=float).T
# COPY: True
# Fix of type int
constr_norm = ConstraintNorm(axis=0, fix=2, copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# Fix of type list
constr_norm = ConstraintNorm(axis=0, fix=[2,3], copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# Fix of type tuple
constr_norm = ConstraintNorm(axis=0, fix=(2), copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# Fix of type ndarray
constr_norm = ConstraintNorm(axis=0, fix=np.array([2]), copy=True)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, out)
# COPY: False
A_fix2_ax1 = np.array([[0.0, 0.0, 1.0], [0.5, 0.5, 0.0], [0.1, 0.3, 0.6]], dtype=float).T
# Fix of type int
A = np.array([[0.0, 0.2, 1.0], [0.25, 0.25, 0.0], [0.3, 0.9, 0.6]], dtype=float).T
constr_norm = ConstraintNorm(axis=0, fix=2, copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
# Fix of type list
A = np.array([[0.0, 0.2, 1.0], [0.25, 0.25, 0.0], [0.3, 0.9, 0.6]], dtype=float).T
constr_norm = ConstraintNorm(axis=0, fix=[2], copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
# Fix of type tuple
A = np.array([[0.0, 0.2, 1.0], [0.25, 0.25, 0.0], [0.3, 0.9, 0.6]], dtype=float).T
constr_norm = ConstraintNorm(axis=0, fix=(2), copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
# Fix of type ndarray
A = np.array([[0.0, 0.2, 1.0], [0.25, 0.25, 0.0], [0.3, 0.9, 0.6]], dtype=float).T
constr_norm = ConstraintNorm(axis=0, fix=np.array([2]), copy=False)
out = constr_norm.transform(A)
assert_allclose(A_fix2_ax1, A)
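The fixed-axis cases above all follow one rule for axis=1: entries in the `fix` columns are held constant and the remaining entries of each row are rescaled so the full row sums to 1. A simplified sketch (the helper `norm_rows_with_fixed` is illustrative and covers only axis=1; ConstraintNorm itself also supports axis=0 and validates the `fix` type):

```python
import numpy as np

def norm_rows_with_fixed(A, fix):
    # Hold the `fix` columns constant; rescale the free columns of each
    # row so the full row sums to 1. If the free entries sum to zero (or
    # the fixed entries already use the whole budget), free entries -> 0.
    out = A.astype(float)
    free = np.ones(out.shape[1], dtype=bool)
    free[np.atleast_1d(fix)] = False
    fixed_sum = out[:, ~free].sum(axis=1)
    free_sum = out[:, free].sum(axis=1)
    safe = np.where(free_sum > 0, free_sum, 1.0)  # avoid divide-by-zero
    scale = np.where(free_sum > 0, (1.0 - fixed_sum) / safe, 0.0)
    out[:, free] *= scale[:, None]
    return out

A = np.array([[0.0, 0.2, 1.0, 0.0], [0.25, 0.25, 0.0, 0.0], [0.3, 0.9, 0.6, 0.0]])
out = norm_rows_with_fixed(A, fix=2)
```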
def test_cut_below():
""" Test cutting below (and not equal to) a value """
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_transform = np.array([[0, 0, 0, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
constr = ConstraintCutBelow(copy=True, value=4)
out = constr.transform(A)
assert_allclose(out, A_transform)
# No Copy
constr = ConstraintCutBelow(copy=False, value=4)
out = constr.transform(A)
assert_allclose(A, A_transform)
def test_cut_below_exclude():
""" Test cutting below (and not equal to) a value """
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_transform_excl_0_ax1 = np.array([[1, 0, 0, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_transform_excl_0_ax0 = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
# COPY
constr = ConstraintCutBelow(copy=True, value=4, exclude=0, exclude_axis=1)
out = constr.transform(A)
assert_allclose(out, A_transform_excl_0_ax1)
constr = ConstraintCutBelow(copy=True, value=4, exclude=0, exclude_axis=0)
out = constr.transform(A)
assert_allclose(out, A_transform_excl_0_ax0)
# OVERWRITE
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
constr = ConstraintCutBelow(copy=False, value=4, exclude=0, exclude_axis=1)
out = constr.transform(A)
assert_allclose(A, A_transform_excl_0_ax1)
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
constr = ConstraintCutBelow(copy=False, value=4, exclude=0, exclude_axis=0)
out = constr.transform(A)
assert_allclose(A, A_transform_excl_0_ax0)
def test_cut_below_nonzerosum():
"""
Test cutting below (and not equal to) a value with the constraint that
no columns (axis=-1) will all become 0
"""
A = np.array([[0.3, 0.7], [0.4, 0.6], [0.5, 0.5]])
cutoff = 0.7
A_correct = np.array([[0.0, 0.7], [0.4, 0.6], [0.5, 0.5]])
constr = ConstraintCutBelow(copy=True, value=cutoff, axis_sumnz=-1)
out = constr.transform(A)
assert_allclose(out, A_correct)
constr = ConstraintCutBelow(copy=False, value=cutoff, axis_sumnz=-1)
constr.transform(A)
assert_allclose(A, A_correct)
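The axis_sumnz guard tested above can be sketched as: cut entries below the value as usual, but leave untouched any row that the cut would otherwise turn entirely to zeros. A numpy-only sketch (the helper name is illustrative, not pymcr's API):

```python
import numpy as np

def cut_below_keep_rows(A, value):
    # Zero entries strictly below `value`, except that a row which would
    # become all zeros is left unchanged (the axis_sumnz=-1 behavior).
    mask = A >= value
    keep_row = ~mask.any(axis=-1, keepdims=True)
    return A * (mask | keep_row)

A = np.array([[0.3, 0.7], [0.4, 0.6], [0.5, 0.5]])
print(cut_below_keep_rows(A, 0.7).tolist())  # [[0.0, 0.7], [0.4, 0.6], [0.5, 0.5]]
```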
def test_cut_below_nonzerosum_exclude():
"""
Test cutting below (and not equal to) a value with the constraint that
no columns (axis=-1) will all become 0
"""
A = np.array([[0.3, 0.7], [0.3, 0.7], [0.7, 0.3]])
cutoff = 0.7
A_correct = np.array([[0.3, 0.7], [0.3, 0.7], [0.7, 0.0]])
# COPY
constr = ConstraintCutBelow(copy=True, value=cutoff, axis_sumnz=-1,
exclude=0, exclude_axis=-1)
out = constr.transform(A)
assert_allclose(out, A_correct)
# OVERWRITE
constr = ConstraintCutBelow(copy=False, value=cutoff, axis_sumnz=-1,
exclude=0, exclude_axis=-1)
_ = constr.transform(A)
assert_allclose(A, A_correct)
def test_cut_above_nonzerosum():
"""
Test cutting above (and not equal to) a value with the constraint that
no columns (axis=-1) will all become 0
"""
A = np.array([[0.3, 0.7], [0.4, 0.6], [0.5, 0.5]])
cutoff = 0.4
A_correct = np.array([[0.3, 0.0], [0.4, 0.0], [0.5, 0.5]])
constr = ConstraintCutAbove(copy=True, value=cutoff, axis_sumnz=-1)
out = constr.transform(A)
assert_allclose(out, A_correct)
# NO Copy
A = np.array([[0.3, 0.7], [0.4, 0.6], [0.5, 0.5]])
cutoff = 0.4
A_correct = np.array([[0.3, 0.0], [0.4, 0.0], [0.5, 0.5]])
constr = ConstraintCutAbove(copy=False, value=cutoff, axis_sumnz=-1)
constr.transform(A)
assert_allclose(A, A_correct)
def test_compress_below():
""" Test compressing below (and not equal to) a value """
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_transform = np.array([[4, 4, 4, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
constr = ConstraintCompressBelow(copy=True, value=4)
out = constr.transform(A)
assert_allclose(out, A_transform)
# No Copy
constr = ConstraintCompressBelow(copy=False, value=4)
out = constr.transform(A)
assert_allclose(A, A_transform)
def test_cut_above():
""" Test cutting above (and not equal to) a value """
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_transform = np.array([[1, 2, 3, 4], [4, 0, 0, 0], [0, 0, 0, 0]]).astype(float)
constr = ConstraintCutAbove(copy=True, value=4)
out = constr.transform(A)
assert_allclose(out, A_transform)
# No Copy
constr = ConstraintCutAbove(copy=False, value=4)
out = constr.transform(A)
assert_allclose(A, A_transform)
def test_cut_above_exclude():
""" Test cutting above (and not equal to) a value """
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_transform_excl_0_ax1 = np.array([[1, 2, 3, 4], [4, 0, 0, 0], [7, 0, 0, 0]]).astype(float)
A_transform_excl_2_ax0 = np.array([[1, 2, 3, 4], [4, 0, 0, 0], [7, 8, 9, 10]]).astype(float)
constr = ConstraintCutAbove(copy=True, value=4, exclude=0, exclude_axis=-1)
out = constr.transform(A)
assert_allclose(out, A_transform_excl_0_ax1)
constr = ConstraintCutAbove(copy=True, value=4, exclude=2, exclude_axis=0)
out = constr.transform(A)
assert_allclose(out, A_transform_excl_2_ax0)
# No Copy
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
constr = ConstraintCutAbove(copy=False, value=4, exclude=0, exclude_axis=-1)
_ = constr.transform(A)
assert_allclose(A, A_transform_excl_0_ax1)
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
constr = ConstraintCutAbove(copy=False, value=4, exclude=2, exclude_axis=0)
_ = constr.transform(A)
assert_allclose(A, A_transform_excl_2_ax0)
def test_compress_above():
""" Test compressing above (and not equal to) a value """
A = np.array([[1, 2, 3, 4], [4, 5, 6, 7], [7, 8, 9, 10]]).astype(float)
A_transform = np.array([[1, 2, 3, 4], [4, 4, 4, 4], [4, 4, 4, 4]]).astype(float)
constr = ConstraintCompressAbove(copy=True, value=4)
out = constr.transform(A)
assert_allclose(out, A_transform)
# No Copy
constr = ConstraintCompressAbove(copy=False, value=4)
out = constr.transform(A)
assert_allclose(A, A_transform)
def test_replace_zeros():
""" Test replace zeros using a single feature """
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=0, feature = 0
A_transform_ax0 = np.array([[0.0, 0.0, 0.0, 1.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=1, feature = 0
A_transform_ax1 = np.array([[1.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[1.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis 0, copy=True
constr = ConstraintReplaceZeros(copy=True, axis=0, feature=0)
out = constr.transform(A)
assert_allclose(out, A_transform_ax0)
# Axis 1, copy=True
constr = ConstraintReplaceZeros(copy=True, axis=1, feature=0)
out = constr.transform(A)
assert_allclose(out, A_transform_ax1)
# Axis 0, copy=FALSE
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=0, feature=0)
out = constr.transform(A)
assert_allclose(A, A_transform_ax0)
# Axis 1, copy=FALSE
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=1, feature=0)
out = constr.transform(A)
assert_allclose(A, A_transform_ax1)
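The behavior exercised above amounts, for axis=0, to: find every all-zero column and split `fval` (default 1) evenly across the rows listed in `feature`. A numpy-only sketch (the helper name is illustrative; `feature` and `fval` mirror the constructor arguments used in these tests):

```python
import numpy as np

def replace_zero_columns(A, feature, fval=1.0):
    # Sketch of the axis=0 case: every all-zero column gets `fval`
    # distributed evenly across the rows listed in `feature`.
    out = A.copy()
    rows = np.atleast_1d(feature)
    zero_cols = np.where(~out.any(axis=0))[0]
    out[np.ix_(rows, zero_cols)] = fval / rows.size
    return out

A = np.array([[0.0, 0.0, 0.0, 0.0],
              [0.25, 0.25, 0.0, 0.0],
              [0.3, 0.9, 0.6, 0.0],
              [0.0, 0.0, 0.0, 0.0]])
out = replace_zero_columns(A, feature=0)
```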
def test_replace_zeros_multifeature():
""" Replace zeros using multiple features """
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=0, feature = [0,1]
A_transform_ax0 = np.array([[0.0, 0.0, 0.0, 0.5],
[0.25, 0.25, 0.0, 0.5],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=1, feature = [0,1]
A_transform_ax1 = np.array([[0.5, 0.5, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.5, 0.5, 0.0, 0.0]], dtype=float)
# Axis 0, copy=True, feature=[0,1]
constr = ConstraintReplaceZeros(copy=True, axis=0, feature=[0,1])
out = constr.transform(A)
assert_allclose(out, A_transform_ax0)
# Axis 1, copy=True, feature=[0,1]
constr = ConstraintReplaceZeros(copy=True, axis=1, feature=[0,1])
out = constr.transform(A)
assert_allclose(out, A_transform_ax1)
# Axis 1, copy=True, feature=np.array([0,1])
constr = ConstraintReplaceZeros(copy=True, axis=1, feature=np.array([0,1]))
out = constr.transform(A)
assert_allclose(out, A_transform_ax1)
# Axis 1, copy=True, feature=None
constr = ConstraintReplaceZeros(copy=True, axis=1, feature=None)
out = constr.transform(A)
assert_allclose(out, A)
# Axis 0, copy=FALSE, feature=[0,1]
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=0, feature=[0,1])
out = constr.transform(A)
assert_allclose(A, A_transform_ax0)
# Axis 1, copy=FALSE, feature=[0,1]
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=1, feature=[0,1])
out = constr.transform(A)
assert_allclose(A, A_transform_ax1)
def test_replace_zeros_non1fval():
""" Test replace zeros using a single feature with fval != 1 """
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=0, feature = 0
A_transform_ax0 = np.array([[0.0, 0.0, 0.0, 4.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=1, feature = 0
A_transform_ax1 = np.array([[4.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[4.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis 0, copy=True
constr = ConstraintReplaceZeros(copy=True, axis=0, feature=0, fval=4)
out = constr.transform(A)
assert_allclose(out, A_transform_ax0)
# Axis 1, copy=True
constr = ConstraintReplaceZeros(copy=True, axis=1, feature=0, fval=4)
out = constr.transform(A)
assert_allclose(out, A_transform_ax1)
# Axis 0, copy=FALSE
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=0, feature=0, fval=4)
out = constr.transform(A)
assert_allclose(A, A_transform_ax0)
# Axis 1, copy=FALSE
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=1, feature=0, fval=4)
out = constr.transform(A)
assert_allclose(A, A_transform_ax1)
def test_replace_zeros_non1fval_multifeature():
""" Test replace zeros using a 2 features with fval != 1 """
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=0, feature = [0,1]
A_transform_ax0 = np.array([[0.0, 0.0, 0.0, 2.0],
[0.25, 0.25, 0.0, 2.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
# Axis=1, feature = [0,1]
A_transform_ax1 = np.array([[2.0, 2.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[2.0, 2.0, 0.0, 0.0]], dtype=float)
# Axis 0, copy=True
constr = ConstraintReplaceZeros(copy=True, axis=0, feature=[0,1], fval=4)
out = constr.transform(A)
assert_allclose(out, A_transform_ax0)
# Axis 1, copy=True
constr = ConstraintReplaceZeros(copy=True, axis=1, feature=[0,1], fval=4)
out = constr.transform(A)
assert_allclose(out, A_transform_ax1)
# Axis 0, copy=FALSE
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=0, feature=[0,1], fval=4)
out = constr.transform(A)
assert_allclose(A, A_transform_ax0)
# Axis 1, copy=FALSE
A = np.array([[0.0, 0.0, 0.0, 0.0],
[0.25, 0.25, 0.0, 0.0],
[0.3, 0.9, 0.6, 0.0],
[0.0, 0.0, 0.0, 0.0]], dtype=float)
constr = ConstraintReplaceZeros(copy=False, axis=1, feature=[0,1], fval=4)
out = constr.transform(A)
assert_allclose(A, A_transform_ax1)
def test_planarize_no_noise():
""" Test ConstraintPlanarize with no noise """
C_img = np.zeros((10, 20, 2)) # Y, X, Target
x = np.arange(C_img.shape[1])
y = np.arange(C_img.shape[0])
n_targets = C_img.shape[-1]
X, Y = np.meshgrid(x,y)
C_0_ideal = 0.1*X + 0.3*Y + 3
C_img[:,:,0] = C_0_ideal
C_img[:,:,1] = 0.2*X + 0.4*Y + 3
C_ravel = C_img.reshape((-1, n_targets))
# COPY
constr = ConstraintPlanarize(0, (10, 20), scaler=None, copy=True)
out = constr.transform(C_ravel)
out_img = out.reshape(C_img.shape)
assert_allclose(out_img[..., 1], C_img[..., 1])
# Data is a plane. Should return the same plane to within numerical error
assert_allclose(out_img[...,0], C_0_ideal)
# OVERWRITE
constr = ConstraintPlanarize(0, (10, 20), scaler=None, copy=False)
_ = constr.transform(C_ravel)
assert_allclose(C_ravel.reshape(C_img.shape)[..., 0], C_img[..., 0])
assert_allclose(C_ravel.reshape(C_img.shape)[..., 1], C_img[..., 1])
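The planarization idea behind these tests is an ordinary least-squares plane fit, z = a*x + b*y + c, over the image grid; on noiseless plane data the fit reproduces the input to within numerical error, which is what the test above asserts. A minimal sketch (the real ConstraintPlanarize adds scaling, optional value limits, and per-target bookkeeping):

```python
import numpy as np

# Build a noiseless plane on a 10x20 grid, then recover it by least squares.
y, x = np.indices((10, 20))
z = 0.1 * x + 0.3 * y + 3.0
M = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])  # [x, y, 1] design matrix
coef, *_ = np.linalg.lstsq(M, z.ravel(), rcond=None)          # [a, b, c]
plane = (M @ coef).reshape(z.shape)
```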
def test_planarize_noisy():
""" Test ConstraintPlanarize """
C_img = np.zeros((10, 20, 2)) # Y, X, Target
x = np.arange(C_img.shape[1])
y = np.arange(C_img.shape[0])
n_targets = C_img.shape[-1]
X, Y = np.meshgrid(x,y)
C_img[:,:,0] = 0.1*X + 0.3*Y + 3 + 0.1*np.random.randn(10,20)
C_img[:,:,1] = 0.1*X + 0.3*Y + 3
C_ravel = C_img.reshape((-1, n_targets))
constr = ConstraintPlanarize(0, (10, 20), scaler=None, copy=True)
out = constr.transform(C_ravel)
out_img = out.reshape(C_img.shape)
assert_allclose(out_img[..., 1], C_img[..., 1])
assert np.sum((out_img[..., 0] - C_img[..., 1])**2) < np.sum((C_img[...,0] - C_img[..., 1])**2)
def test_planarize_noisy_list_target():
""" Test ConstraintPlanarize """
C_img = np.zeros((10, 20, 2)) # Y, X, Target
x = np.arange(C_img.shape[1])
y = np.arange(C_img.shape[0])
n_targets = C_img.shape[-1]
X, Y = np.meshgrid(x,y)
C_img[:,:,0] = 0.1*X + 0.3*Y + 3 + 0.1*np.random.randn(10,20)
C_img[:,:,1] = 0.1*X + 0.3*Y + 3 + 0.1*np.random.randn(10,20)
C_ideal_01 = 0.1*X + 0.3*Y + 3
C_ravel = C_img.reshape((-1, n_targets))
constr = ConstraintPlanarize([0, 1], (10, 20), scaler=None, copy=True)
out = constr.transform(C_ravel)
out_img = out.reshape(C_img.shape)
# Test that RSS is better after planarize than before
assert np.sum((C_img[...,0] - C_ideal_01)**2) > np.sum((out_img[...,0] - C_ideal_01)**2)
assert np.sum((C_img[...,1] - C_ideal_01)**2) > np.sum((out_img[...,1] - C_ideal_01)**2)
# Two output targets are not identical
assert np.sum((out_img[..., 0] - out_img[..., 1])**2) > 0
def test_planarize_err_type_input():
""" Inputting a target that is not a list, tuple, or ndarray results in Type Error """
with pytest.raises(TypeError):
constr = ConstraintPlanarize({'a' : 0}, (10, 20), scaler=None, copy=True)
def test_planarize_set_scaler():
""" Test setting or not setting scaler """
C_img = np.zeros((10, 20, 2)) # Y, X, Target
x = np.arange(C_img.shape[1])
y = np.arange(C_img.shape[0])
n_targets = C_img.shape[-1]
X, Y = np.meshgrid(x,y)
C_img[:,:,0] = 0.1*X + 0.3*Y - 2.5
C_img[:,:,1] = 0.1*X + 0.3*Y - 2.5
C_ravel = C_img.reshape((-1, n_targets))
constr = ConstraintPlanarize(0, (10, 20), scaler=None, copy=True)
out = constr.transform(C_ravel)
assert constr.scaler is not None
assert constr.scaler > 100
constr = ConstraintPlanarize(0, (10, 20), scaler=1.0, copy=True)
out = constr.transform(C_ravel)
assert constr.scaler is not None
assert constr.scaler == 1.0
def test_planarize_use_above_and_below_on_plane():
    """ Test ConstraintPlanarize """
    C_img = np.zeros((10, 20, 2))  # Y, X, Target
    x = np.arange(C_img.shape[1])
    y = np.arange(C_img.shape[0])
    n_targets = C_img.shape[-1]
    X, Y = np.meshgrid(x, y)
    C_img[:, :, 0] = 0.1*X + 0.3*Y - 2.5
    C_img[:, :, 1] = 0.1*X + 0.3*Y - 2.5
    C_ravel = C_img.reshape((-1, n_targets))

    # COPY -- DO NOT apply limits to plane
    constr = ConstraintPlanarize(0, (10, 20), scaler=None, copy=True, use_vals_above=0,
                                 use_vals_below=1, lims_to_plane=False)
    out = constr.transform(C_ravel)
    assert C_ravel.min() < 0
    assert C_ravel.max() > 1
    assert out[:, 0].min() < 0
    assert out[:, 0].max() > 1
    assert out[:, 1].min() < 0
    assert out[:, 1].max() > 1

    # COPY -- Apply limits to plane
    constr = ConstraintPlanarize(0, (10, 20), scaler=None, copy=True, use_vals_above=0,
                                 use_vals_below=1, lims_to_plane=True)
    out = constr.transform(C_ravel)
    assert C_ravel.min() < 0
    assert C_ravel.max() > 1
    assert out[:, 0].min() >= 0
    assert out[:, 0].max() <= 1
    assert out[:, 1].min() < 0
    assert out[:, 1].max() > 1

    # OVERWRITE
    assert C_ravel[:, 0].min() < 0
    assert C_ravel[:, 0].max() > 1
    assert C_ravel[:, 1].min() < 0
    assert C_ravel[:, 1].max() > 1
    constr = ConstraintPlanarize(0, (10, 20), scaler=None, copy=False, use_vals_above=0,
                                 use_vals_below=1, lims_to_plane=True)
    out = constr.transform(C_ravel)
    assert C_ravel[:, 0].min() >= 0
    assert C_ravel[:, 0].max() <= 1
    assert C_ravel[:, 1].min() < 0
    assert C_ravel[:, 1].max() > 1
# File: foxlink_clustering/hash_functions/type_converter.py (a-wars/AGIW_First_Project, MIT)
def hex_msb_to_int(hex_string):
    return int(hex_string[-2:], 16)


def bin_msb_to_int(bin_string):
    return int(bin_string[-8:], 2)
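Despite the `msb` in their names, both helpers parse the tail of the input string, i.e. the lowest-order hex byte / eight bits as written. A self-contained sketch demonstrating that behavior (the two helpers are repeated here so the example runs on its own):

```python
def hex_msb_to_int(hex_string):
    # Parses the last two hex digits of the string.
    return int(hex_string[-2:], 16)


def bin_msb_to_int(bin_string):
    # Parses the last eight binary digits of the string.
    return int(bin_string[-8:], 2)


# "2B" is the final hex byte of "1A2B"; "11111111" the final 8 bits.
assert hex_msb_to_int("1A2B") == 0x2B == 43
assert bin_msb_to_int("0000000111111111") == 0b11111111 == 255
```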
# File: boardStateDriver/tests/__init__.py (terpyPy/ButtonBox, MIT)
from .simMyBoard import simMyBoard
# File: autoprocess/utils/fitio.py (michel4j/auto-process, BSD-3-Clause)
import copy
import numpy
CALIB_TEMPLATE = """POWDER DIFFRACTION (2-D)
INPUT
{file_name}
O.K.
O.K.
TILT
X-PIXEL SIZE
{pixel_size:0.4f}
Y-PIXEL SIZE
{pixel_size:0.4f}
DISTANCE
{distance:0.4f}
WAVELENGTH
{wavelength:0.6f}
X-BEAM CENTRE
{beam_x:0.3f}
Y-BEAM CENTRE
{beam_y:0.3f}
TILT ROTATION
0
ANGLE OF TILT
0
DETECTOR ROTATION
0.0
O.K.
O.K.
ELLIPSE COORDINATES
{coordinates}
1
INTEGRATE
O.K.
GEOMETRY COR.
YES
CONSERVE INT.
{intensity}
MAX. ANGLE
{max_angle:0.2f}
O.K.
OUTPUT
CHIPLOT
FILE NAME
{data_name}.chi
O.K.
EXIT
SET-UP
SAVE FIT2D DEFAULTS
{db_file}
EXIT
EXIT FIT2D
YES
"""
INTEGRATE_TEMPLATE = """
SET-UP
LOAD FIT2D DEFAULTS
{db_file}
EXIT
POWDER DIFFRACTION (2-D)
INPUT
{file_name}
O.K.
O.K.
INTEGRATE
X-PIXEL SIZE
{pixel_size:0.4f}
Y-PIXEL SIZE
{pixel_size:0.4f}
DISTANCE
{distance:0.4f}
WAVELENGTH
{wavelength:0.6f}
O.K.
CONSERVE INT.
{intensity}
MAX. ANGLE
{max_angle:0.2f}
O.K.
OUTPUT
CHIPLOT
FILE NAME
{data_name}.chi
O.K.
EXIT
EXIT FIT2D
YES
"""
def write_calib_macro(parameters, macro_file='macro.mac'):
"""
Create calib macro file for fit2d
params = {
'wavelength': float
'distance': float
'start_angle': float
'first_frame': int
'file_name': str (full/relative path including directory)
'file_format': str (TIFF)
'detector_size': tuple of 2 ints
'pixel_size' : float
'beam_center': tuple of 2 floats
'ellipse': [(x1,y1), (x2,y2), ..., ]
'rings': [(x1,y1), (x2,y2), ..., ]
'ring_width': integer,
'db_file': file path for defaults, will be overwritten
}
"""
# Fix Fit2D geometry
max_x = max(
parameters['detector_size'][0] - parameters['beam_center'][0],
parameters['beam_center'][0],
)
max_y = max(
parameters['detector_size'][1] - parameters['beam_center'][1],
parameters['beam_center'][1],
)
max_r = 0.99 * numpy.sqrt(max_y ** 2 + max_x ** 2) * parameters['pixel_size']
max_angle = numpy.degrees(numpy.arctan(max_r / parameters['distance']))
params = copy.deepcopy(parameters)
params.update({
'beam_x': params['beam_center'][0],
'beam_y': params['detector_size'][1] - params['beam_center'][1],
'rings': [(x, params['detector_size'][1] - y) for x, y in params['rings']],
'ellipse': [(x, params['detector_size'][1] - y) for x, y in params['ellipse']],
'pixel_size': 1000 * params['pixel_size'],
'intensity': 'YES' if params.get('intensity') else 'NO',
'max_angle': max_angle
})
coords_text = [
'{:12d}'.format(len(params['ellipse']))
] + [
'{:14.7E}\n{:14.7E}'.format(x, y) for x, y in params['ellipse']
] + [
'{:12d}'.format(1),
'{:14.7E}\n{:14.7E}'.format(params['ellipse'][0][0] + params['ring_width'],
params['ellipse'][0][1] + params['ring_width']),
'{}'.format(len(params['rings'])),
] + [
'{:14.7E}\n{:14.7E}'.format(x, y) for x, y in params['rings']
]
params['coordinates'] = '\n'.join(coords_text)
macro = CALIB_TEMPLATE.format(**params)
with open(macro_file, 'w') as outfile:
outfile.write(macro)
def write_integrate_macro(parameters, macro_file='integrate.mac'):
"""
Create calib macro file for fit2d
params = {
'wavelength': float
'distance': float
'start_angle': float
'first_frame': int
'file_name': str (full/relative path including directory)
'file_format': str (TIFF)
'detector_size': tuple of 2 ints
'pixel_size' : float
'beam_center': tuple of 2 floats
'ellipse': [(x1,y1), (x2,y2), ..., ]
'rings': [(x1,y1), (x2,y2), ..., ]
'ring_width': integer,
'intensity': bool (preserve intensities?),
'db_file': file path for defaults
}
"""
# Fix Fit2D geometry
max_x = max(
parameters['detector_size'][0] - parameters['beam_center'][0],
parameters['beam_center'][0],
)
max_y = max(
parameters['detector_size'][1] - parameters['beam_center'][1],
parameters['beam_center'][1],
)
max_r = 0.99 * numpy.sqrt(max_y ** 2 + max_x ** 2) * parameters['pixel_size']
max_angle = numpy.degrees(numpy.arctan(max_r / parameters['distance']))
params = copy.deepcopy(parameters)
params.update({
'beam_x': params['beam_center'][0],
'beam_y': params['detector_size'][1] - params['beam_center'][1],
'pixel_size': 1000 * params['pixel_size'],
'intensity': 'YES' if params.get('intensity') else 'NO',
'max_angle': max_angle
})
macro = INTEGRATE_TEMPLATE.format(**params)
with open(macro_file, 'w') as outfile:
outfile.write(macro)
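Both macro writers derive `max_angle` from the detector corner farthest from the beam centre. A self-contained sketch of that geometry, using made-up (illustrative) detector values rather than any real instrument's:

```python
import math

# Hypothetical detector geometry; the values are illustrative only.
detector_size = (2463, 2527)    # pixels (x, y)
beam_center = (1231.5, 1263.5)  # pixels
pixel_size = 172e-6             # metres per pixel
distance = 0.25                 # metres, sample-to-detector

# Largest pixel offset from the beam centre along each axis.
max_x = max(detector_size[0] - beam_center[0], beam_center[0])
max_y = max(detector_size[1] - beam_center[1], beam_center[1])

# Radial distance in metres (0.99 keeps the last ring just inside the edge),
# then the corresponding scattering angle; about 50 degrees here.
max_r = 0.99 * math.sqrt(max_x ** 2 + max_y ** 2) * pixel_size
max_angle = math.degrees(math.atan(max_r / distance))
```

This is the same arithmetic the two functions perform with `numpy`, shown on scalars with the standard library's `math` module.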
# File: Fumagalli_Motta_Tarantino_2020/tests/Test_AdditionalModels.py (manuelbieri/Fumagalli_2020, MIT)
import Fumagalli_Motta_Tarantino_2020.tests.Test_Models as Test
import Fumagalli_Motta_Tarantino_2020 as FMT20
class TestMircoFoundationModel(Test.TestOptimalMergerPolicyModel):
    def setUp(self) -> None:
        self.calculate_properties_profits_consumer_surplus()

    def setupModel(self, **kwargs) -> None:
        self.model = FMT20.MicroFoundationModel(**kwargs)

    def calculate_properties_profits_consumer_surplus(self) -> None:
        # calculations made with Gamma = 0.3
        self.test_incumbent_profit_without_innovation = 0.25
        self.test_cs_without_innovation = 0.125
        self.test_incumbent_profit_with_innovation = 1 / 2.6
        self.test_cs_with_innovation = 1 / 5.2
        self.test_incumbent_profit_duopoly = 1 / (2.3**2)
        self.test_startup_profit_duopoly = self.test_incumbent_profit_duopoly
        self.test_cs_duopoly = 1.3 / (2.3**2)

    def get_welfare_value(self, market_situation: str) -> float:
        if market_situation == "duopoly":
            return (
                self.test_cs_duopoly
                + self.test_startup_profit_duopoly
                + self.test_incumbent_profit_duopoly
            )
        if market_situation == "without_innovation":
            return (
                self.test_cs_without_innovation
                + self.test_incumbent_profit_without_innovation
            )
        if market_situation == "with_innovation":
            return (
                self.test_cs_with_innovation
                + self.test_incumbent_profit_with_innovation
            )

    def test_properties_profits_consumer_surplus(self):
        self.setupModel()
        self.assertTrue(
            self.are_floats_equal(
                self.test_cs_without_innovation, self.model.cs_without_innovation
            )
        )
        self.assertTrue(
            self.are_floats_equal(
                self.test_incumbent_profit_without_innovation,
                self.model.incumbent_profit_without_innovation,
            )
        )
        self.assertTrue(
            self.are_floats_equal(
                self.test_cs_duopoly,
                self.model.cs_duopoly,
            )
        )
        self.assertTrue(
            self.are_floats_equal(
                self.test_incumbent_profit_duopoly,
                self.model.incumbent_profit_duopoly,
            )
        )
        self.assertTrue(
            self.are_floats_equal(
                self.test_startup_profit_duopoly,
                self.model.startup_profit_duopoly,
            )
        )
        self.assertTrue(
            self.are_floats_equal(
                self.test_cs_with_innovation,
                self.model.cs_with_innovation,
            )
        )
        self.assertTrue(
            self.are_floats_equal(
                self.test_incumbent_profit_with_innovation,
                self.model.incumbent_profit_with_innovation,
            )
        )

    def test_intermediate_optimal_merger_policy(self):
        self.setupModel(gamma=0.2)
        self.assertEqual(
            FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            self.model.get_optimal_merger_policy(),
        )
        self.assertTrue(self.model.is_intermediate_optimal())

    def test_string_representation(self):
        self.setupModel(gamma=0.3)
        self.assertEqual(
            "Merger Policy: Strict\n"
            "Is start-up credit rationed?: False\n"
            "Type of early takeover attempt: No bid\n"
            "Is the early takeover approved?: False\n"
            "Does the owner attempt the development?: True\n"
            "Is the development successful?: True\n"
            "Type of late takeover attempt: No bid\n"
            "Is the late takeover approved?: False\n"
            "Optimal merger policy: Strict",
            str(self.model),
        )

    def test_laissez_faire_optimal_merger_policy(self):
        # laissez-faire is never optimal -> dominated by strict
        self.setupModel()
        self.assertFalse(self.model.is_laissez_faire_optimal())

    def test_tolerated_harm_strict(self):
        self.setupModel()
        self.assertEqual(0, self.model.tolerated_harm)

    def test_tolerated_harm_intermediate_late_takeover_allowed(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Intermediate_late_takeover_prohibited
        )
        self.assertTrue(
            self.are_floats_equal(
                (1 - 0.5070508811267713)
                * (
                    0.7
                    * (
                        self.get_welfare_value("duopoly")
                        - self.get_welfare_value("without_innovation")
                    )
                    - 0.1
                ),
                self.model.tolerated_harm,
            )
        )

    def test_tolerated_harm_intermediate_late_takeover_prohibited(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Intermediate_late_takeover_allowed
        )
        self.assertEqual(
            self.get_welfare_value("duopoly")
            - self.get_welfare_value("with_innovation"),
            self.model.tolerated_harm,
        )

    def test_tolerated_harm_laissez_faire(self):
        self.setupModel(merger_policy=FMT20.MergerPolicies.Laissez_faire)
        self.assertEqual(float("inf"), self.model.tolerated_harm)

class TestPerfectInformationModel(Test.TestOptimalMergerPolicyModel):
    def setupModel(self, **kwargs) -> None:
        self.model = FMT20.PerfectInformationModel(**kwargs)

    def test_laissez_faire_optimal_merger_policy(self):
        self.setupModel()
        self.assertFalse(self.model.is_laissez_faire_optimal())

class TestStrictPerfectInformationModel(TestPerfectInformationModel):
    def test_not_profitable_not_credit_rationed(self):
        self.setupModel()
        self.assertEqual(FMT20.MergerPolicies.Strict, self.model.merger_policy)
        self.assertFalse(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.No, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertFalse(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_not_profitable_credit_rationed(self):
        self.setupModel(startup_assets=0.01)
        self.assertEqual(FMT20.MergerPolicies.Strict, self.model.merger_policy)
        self.assertTrue(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.No, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertFalse(self.model.is_owner_investing)
        self.assertFalse(self.model.is_development_successful)
        self.assertFalse(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_profitable_not_credit_rationed(self):
        self.setupModel(
            startup_assets=0.06,
            development_costs=0.075,
            success_probability=0.79,
            private_benefit=0.07,
            incumbent_profit_without_innovation=0.3,
            startup_profit_duopoly=0.11,
            incumbent_profit_with_innovation=0.4,
        )
        self.assertEqual(FMT20.MergerPolicies.Strict, self.model.merger_policy)
        self.assertFalse(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.No, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertFalse(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_profitable_credit_rationed(self):
        self.setupModel(
            development_costs=0.075,
            success_probability=0.79,
            private_benefit=0.07,
            incumbent_profit_without_innovation=0.3,
            startup_profit_duopoly=0.11,
            incumbent_profit_with_innovation=0.4,
        )
        self.assertEqual(FMT20.MergerPolicies.Strict, self.model.merger_policy)
        self.assertTrue(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.Separating, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertTrue(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

class TestIntermediatePerfectInformationModel(TestPerfectInformationModel):
    def test_not_profitable_not_credit_rationed(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            consumer_surplus_duopoly=0.46,
            consumer_surplus_without_innovation=0.2,
            consumer_surplus_with_innovation=0.35,
        )
        self.assertEqual(
            FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            self.model.merger_policy,
        )
        self.assertFalse(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.No, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.Pooling, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertFalse(self.model.is_early_takeover)
        self.assertTrue(self.model.is_late_takeover)

    def test_not_profitable_not_credit_rationed_unsuccessful(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            consumer_surplus_duopoly=0.46,
            consumer_surplus_without_innovation=0.2,
            consumer_surplus_with_innovation=0.35,
            development_success=False,
        )
        self.assertEqual(
            FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            self.model.merger_policy,
        )
        self.assertFalse(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.No, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertFalse(self.model.is_development_successful)
        self.assertFalse(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_profitable_not_credit_rationed(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            development_costs=0.09,
            incumbent_profit_without_innovation=0.35,
            consumer_surplus_duopoly=0.46,
            consumer_surplus_without_innovation=0.2,
            consumer_surplus_with_innovation=0.35,
        )
        self.assertEqual(
            FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            self.model.merger_policy,
        )
        self.assertFalse(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.Pooling, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertTrue(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_profitable_credit_rationed(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            private_benefit=0.075,
            startup_assets=0.005,
            development_costs=0.076,
            success_probability=0.79,
            incumbent_profit_with_innovation=0.179,
            incumbent_profit_without_innovation=0.08,
            incumbent_profit_duopoly=0.05,
            startup_profit_duopoly=0.1,
            consumer_surplus_duopoly=0.46,
            consumer_surplus_without_innovation=0.2,
            consumer_surplus_with_innovation=0.35,
        )
        self.assertEqual(
            FMT20.MergerPolicies.Intermediate_late_takeover_allowed,
            self.model.merger_policy,
        )
        self.assertTrue(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.Separating, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertTrue(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

class TestLaissezFairePerfectInformationModel(TestPerfectInformationModel):
    def test_not_profitable_not_credit_rationed(self):
        self.setupModel(merger_policy=FMT20.MergerPolicies.Laissez_faire)
        self.assertEqual(FMT20.MergerPolicies.Laissez_faire, self.model.merger_policy)
        self.assertFalse(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.Pooling, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertFalse(self.model.is_owner_investing)
        self.assertFalse(self.model.is_development_successful)
        self.assertTrue(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_not_profitable_credit_rationed(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Laissez_faire,
            startup_assets=0.01,
            private_benefit=0.099,
            success_probability=0.51,
            development_costs=0.1,
            startup_profit_duopoly=0.339,
            incumbent_profit_duopoly=0.01,
            incumbent_profit_with_innovation=0.35,
            consumer_surplus_with_innovation=0.4,
            incumbent_profit_without_innovation=0.3,
        )
        self.assertEqual(FMT20.MergerPolicies.Laissez_faire, self.model.merger_policy)
        self.assertTrue(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.No, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertFalse(self.model.is_owner_investing)
        self.assertFalse(self.model.is_development_successful)
        self.assertFalse(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_profitable_not_credit_rationed(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Laissez_faire,
            private_benefit=0.075,
            development_costs=0.078,
            success_probability=0.76,
            incumbent_profit_with_innovation=0.51,
        )
        self.assertEqual(FMT20.MergerPolicies.Laissez_faire, self.model.merger_policy)
        self.assertFalse(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.Pooling, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertTrue(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

    def test_profitable_credit_rationed(self):
        self.setupModel(
            merger_policy=FMT20.MergerPolicies.Laissez_faire,
            private_benefit=0.075,
            startup_assets=0.005,
            development_costs=0.076,
            success_probability=0.79,
            incumbent_profit_with_innovation=0.179,
            incumbent_profit_without_innovation=0.08,
            incumbent_profit_duopoly=0.05,
            startup_profit_duopoly=0.1,
        )
        self.assertEqual(FMT20.MergerPolicies.Laissez_faire, self.model.merger_policy)
        self.assertTrue(self.model.is_startup_credit_rationed)
        self.assertEqual(FMT20.Takeover.Separating, self.model.get_early_bidding_type)
        self.assertEqual(FMT20.Takeover.No, self.model.get_late_bidding_type)
        self.assertTrue(self.model.is_owner_investing)
        self.assertTrue(self.model.is_development_successful)
        self.assertTrue(self.model.is_early_takeover)
        self.assertFalse(self.model.is_late_takeover)

class TestEquityModel(Test.TestOptimalMergerPolicyModel):
    def setupModel(self, **kwargs) -> None:
        self.model = FMT20.EquityContract(**kwargs)

    def test_thresholds(self):
        self.setupModel()
        self.assertEqual(
            self.model.asset_threshold, self.model.asset_threshold_late_takeover
        )

    def test_debt_not_preferred(self):
        self.setupModel()
        self.assertFalse(self.model.does_startup_strictly_prefer_debt())

    def test_debt_preferred(self):
        self.setupModel(merger_policy=FMT20.MergerPolicies.Laissez_faire)
        self.assertTrue(self.model.does_startup_strictly_prefer_debt())

    # TODO: Adjust optimal merger policies tests.
    def test_laissez_faire_optimal_merger_policy(self):
        pass

    def test_intermediate_optimal_merger_policy(self):
        pass
# File: Autostart/acconeerStreamingServerStart.py (Kandidatarbete-Chalmers-MCCX02-19-06/RaspberryPiRadarProgram, MIT)
import subprocess  # used to launch the Acconeer streaming server at boot
# Absolute path to the streaming server binary, so the script works
# regardless of the current working directory.
subprocess.call(
    "/home/pi/Documents/evk_service_linux_armv7l_xc112/utils/acc_streaming_server_rpi_xc112_r2b_xr112_r2b_a111_r2c"
)
# File: tests/test_rest_client.py (gbenson/google-photos-archiver, MIT)
import json
import pytest
from google_photos_archiver.album import create_album
from google_photos_archiver.media_item import create_media_item
from google_photos_archiver.rest_client import GooglePhotosApiRestClientError
from tests.conftest import MockFailureResponse, MockSuccessResponse, test_date_filter
class TestGooglePhotosApiRestClient:
    def test_get_albums_success(self, google_photos_api_rest_client, mocker):
        mock_get = mocker.patch(
            "google_photos_archiver.rest_client.requests.get",
            return_value=MockSuccessResponse(),
        )

        get_albums_response = google_photos_api_rest_client.get_albums()

        assert get_albums_response.ok
        mock_get.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/albums",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params={"pageSize": 50},
        )

    def test_get_albums_failure(self, google_photos_api_rest_client, mocker):
        mock_get = mocker.patch(
            "google_photos_archiver.rest_client.requests.get",
            return_value=MockFailureResponse(),
        )

        with pytest.raises(
            GooglePhotosApiRestClientError, match="Failed to execute: `get_albums`"
        ):
            google_photos_api_rest_client.get_albums()

        mock_get.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/albums",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params={"pageSize": 50},
        )

    def test_get_albums_paginated(
        self, google_photos_api_rest_client, mocker, test_album_dict
    ):
        mocker.patch(
            "google_photos_archiver.rest_client.requests.get",
            side_effect=[
                MockSuccessResponse(
                    bytes(
                        json.dumps(
                            {
                                "albums": [
                                    test_album_dict,
                                ],
                                "nextPageToken": "abc123",
                            }
                        ),
                        "utf-8",
                    )
                ),
                MockSuccessResponse(
                    bytes(
                        json.dumps({"albums": [test_album_dict]}),
                        "utf-8",
                    )
                ),
            ],
        )

        albums = google_photos_api_rest_client.get_albums_paginated()
        album = create_album(test_album_dict)

        assert list(albums) == [album, album]

    def test_get_media_items_success(self, google_photos_api_rest_client, mocker):
        mock_get = mocker.patch(
            "google_photos_archiver.rest_client.requests.get",
            return_value=MockSuccessResponse(),
        )

        get_media_items_response = google_photos_api_rest_client.get_media_items()

        assert get_media_items_response.ok
        mock_get.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/mediaItems",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params={"pageSize": 25},
        )

    @pytest.mark.parametrize(
        "params,expected_params",
        [
            (None, dict(pageSize=25)),
            (dict(page_size=123), dict(pageSize=123)),
            (dict(page_token="abc"), dict(pageSize=25, pageToken="abc")),
            (
                dict(page_size=123, page_token="abc"),
                dict(pageSize=123, pageToken="abc"),
            ),
        ],
    )
    def test_get_media_items_params(
        self, mocker, google_photos_api_rest_client, params, expected_params
    ):
        mock_get = mocker.patch(
            "google_photos_archiver.rest_client.requests.get",
            return_value=MockSuccessResponse(),
        )

        if params is None:
            get_media_items_response = google_photos_api_rest_client.get_media_items()
        else:
            get_media_items_response = google_photos_api_rest_client.get_media_items(
                **params
            )

        assert get_media_items_response.ok
        mock_get.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/mediaItems",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params=expected_params,
        )

    def test_get_media_items_failure(self, google_photos_api_rest_client, mocker):
        mock_get = mocker.patch(
            "google_photos_archiver.rest_client.requests.get",
            return_value=MockFailureResponse(),
        )

        with pytest.raises(
            GooglePhotosApiRestClientError, match="Failed to execute: `get_media_items`"
        ):
            google_photos_api_rest_client.get_media_items()

        mock_get.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/mediaItems",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params={"pageSize": 25},
        )

    def test_get_media_items_paginated(
        self,
        google_photos_api_rest_client,
        mocker,
        test_photo_media_item_dict,
        test_video_media_item_dict,
    ):
        mocker.patch(
            "google_photos_archiver.rest_client.requests.get",
            side_effect=[
                MockSuccessResponse(
                    bytes(
                        json.dumps(
                            {
                                "mediaItems": [
                                    test_photo_media_item_dict,
                                ],
                                "nextPageToken": "abc123",
                            }
                        ),
                        "utf-8",
                    )
                ),
                MockSuccessResponse(
                    bytes(
                        json.dumps({"mediaItems": [test_video_media_item_dict]}),
                        "utf-8",
                    )
                ),
            ],
        )

        media_items = google_photos_api_rest_client.get_media_items_paginated()
        photo_media_item = create_media_item(test_photo_media_item_dict)
        video_media_item = create_media_item(test_video_media_item_dict)

        assert list(media_items) == [photo_media_item, video_media_item]

    def test_search_media_items_success(self, google_photos_api_rest_client, mocker):
        mock_post = mocker.patch(
            "google_photos_archiver.rest_client.requests.post",
            return_value=MockSuccessResponse(),
        )

        search_media_items_response = google_photos_api_rest_client.search_media_items()

        assert search_media_items_response.ok
        mock_post.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/mediaItems:search",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params={"alt": "json"},
            json={"pageSize": 100},
        )

    @pytest.mark.parametrize(
        "_json,expected_json",
        [
            (None, dict(pageSize=100)),
            (dict(page_size=123), dict(pageSize=123)),
            (dict(page_token="abc"), dict(pageSize=100, pageToken="abc")),
            (
                dict(page_size=123, page_token="abc"),
                dict(pageSize=123, pageToken="abc"),
            ),
            (
                dict(filters=[test_date_filter()]),
                {"pageSize": 100, "filters": test_date_filter().get_filter()},
            ),
            (
                dict(album_id=1),
                {"pageSize": 100, "albumId": 1},
            ),
        ],
    )
    def test_search_media_items_params(
        self, mocker, google_photos_api_rest_client, _json, expected_json
    ):
        mock_get = mocker.patch(
            "google_photos_archiver.rest_client.requests.post",
            return_value=MockSuccessResponse(),
        )

        if _json is None:
            search_media_items_response = (
                google_photos_api_rest_client.search_media_items()
            )
        else:
            search_media_items_response = (
                google_photos_api_rest_client.search_media_items(**_json)
            )

        assert search_media_items_response.ok
        mock_get.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/mediaItems:search",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params={"alt": "json"},
            json=expected_json,
        )

    def test_search_media_items_failure(self, google_photos_api_rest_client, mocker):
        mock_get = mocker.patch(
            "google_photos_archiver.rest_client.requests.post",
            return_value=MockFailureResponse(),
        )

        with pytest.raises(
            GooglePhotosApiRestClientError,
            match="Failed to execute: `search_media_items`",
        ):
            google_photos_api_rest_client.search_media_items()

        mock_get.assert_called_with(
            "https://photoslibrary.googleapis.com/v1/mediaItems:search",
            headers={"Authorization": "Bearer TEST_TOKEN"},
            params={"alt": "json"},
            json={"pageSize": 100},
        )

    def test_search_media_items_paginated(
        self,
        google_photos_api_rest_client,
        mocker,
        test_photo_media_item_dict,
        test_video_media_item_dict,
    ):
        mocker.patch(
            "google_photos_archiver.rest_client.requests.post",
            side_effect=[
                MockSuccessResponse(
                    bytes(
                        json.dumps(
                            {
                                "mediaItems": [
                                    test_photo_media_item_dict,
                                ],
                                "nextPageToken": "abc123",
                            }
                        ),
                        "utf-8",
                    )
                ),
                MockSuccessResponse(
                    bytes(
                        json.dumps({"mediaItems": [test_video_media_item_dict]}),
                        "utf-8",
                    )
                ),
            ],
        )

        media_items = google_photos_api_rest_client.search_media_items_paginated()
        photo_media_item = create_media_item(test_photo_media_item_dict)
video_media_item = create_media_item(test_video_media_item_dict)
assert list(media_items) == [photo_media_item, video_media_item]
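The two pagination tests above drive a generator through a sequence of mocked responses via `side_effect`. A minimal, self-contained sketch of that pattern; the `fetch_all` helper and the page payloads are invented for illustration, only the mocking technique mirrors the tests:

```python
from unittest.mock import MagicMock

def fetch_all(get_page):
    """Keep requesting pages until no nextPageToken is returned."""
    token = None
    while True:
        page = get_page(pageToken=token)
        yield from page.get("mediaItems", [])
        token = page.get("nextPageToken")
        if token is None:
            break

# Each call to the mock consumes the next item from side_effect,
# simulating two pages of API results.
mock_get = MagicMock(side_effect=[
    {"mediaItems": ["photo"], "nextPageToken": "abc123"},  # first page
    {"mediaItems": ["video"]},                             # last page: no token
])
items = list(fetch_all(mock_get))  # → ["photo", "video"]
```

Because the generator stops as soon as a page carries no `nextPageToken`, the mock is called exactly twice.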

# ibcspy/__init__.py (simfrep/ibcspy, BSD-3-Clause)
from .ibcs import Ibcs

# misc/python/mango/mpiTest/__init__.py (pymango/pymango, BSD-2-Clause)
from .mpiTest import *

# run.py (larryworm1127/tic-tac-toe-python, MIT)
"""
Module that runs the entire game
"""
from ttt_game.ttt_gui import game_loop
from ttt_game.ttt_computer import get_move
from ttt_game.ttt_board import PLAYERO
if __name__ == '__main__':
    game_loop(3, PLAYERO, get_move)

# tools/accuracy_checker/tests/test_dataset.py (Ohtani-y/open_model_zoo, Apache-2.0)
"""
Copyright (c) 2018-2021 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import copy
from pathlib import Path
import pytest
from .common import make_representation
from openvino.tools.accuracy_checker.config import ConfigError
from openvino.tools.accuracy_checker.annotation_converters.format_converter import ConverterReturn
from openvino.tools.accuracy_checker.dataset import Dataset
def copy_dataset_config(config):
    new_config = copy.deepcopy(config)
    return new_config


class MockPreprocessor:
    @staticmethod
    def process(images):
        return images
class TestDataset:
    dataset_config = {
        'name': 'custom',
        'annotation': 'custom',
        'data_source': 'custom',
        'metrics': [{'type': 'map'}]
    }

    def test_missed_name_raises_config_error_exception(self):
        local_dataset = copy_dataset_config(self.dataset_config)
        local_dataset.pop('name')

        with pytest.raises(ConfigError):
            Dataset(local_dataset)

    def test_setting_custom_dataset_with_missed_annotation_raises_config_error_exception(self):
        local_dataset = copy_dataset_config(self.dataset_config)
        local_dataset.pop('annotation')

        with pytest.raises(ConfigError):
            Dataset(local_dataset)
@pytest.mark.usefixtures('mock_path_exists')
class TestAnnotationConversion:
    dataset_config = {
        'name': 'custom',
        'data_source': 'custom',
        'metrics': [{'type': 'map'}]
    }

    def test_annotation_conversion_unknown_converter_raise_config_error(self):
        addition_options = {'annotation_conversion': {'converter': 'unknown'}}
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)

        with pytest.raises(ValueError):
            Dataset(config)

    def test_annotation_conversion_converter_without_required_options_raise_config_error(self):
        addition_options = {'annotation_conversion': {'converter': 'wider'}}
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_annotation_conversion_raise_config_error_on_extra_args(self):
        addition_options = {'annotation_conversion': {'converter': 'wider', 'annotation_file': 'file', 'something_extra': 'extra'}}
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_successful_annotation_conversion(self, mocker):
        addition_options = {'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')}}
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        annotation_converter_mock = mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(make_representation("0 0 0 5 5", True), None, None)
        )

        Dataset(config)

        annotation_converter_mock.assert_called_once_with()

    def test_annotation_conversion_not_convert_twice(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'annotation': Path('custom')
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation('0 0 0 5 5', True)
        annotation_reader_mock = mocker.patch(
            'openvino.tools.accuracy_checker.dataset.read_annotation',
            return_value=converted_annotation
        )

        Dataset(config)

        annotation_reader_mock.assert_called_once_with(Path('custom'), True)

    def test_annotation_conversion_with_store_annotation(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'annotation': Path('custom')
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation('0 0 0 5 5', True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )
        mocker.patch('pathlib.Path.exists', return_value=False)
        annotation_saver_mock = mocker.patch(
            'openvino.tools.accuracy_checker.dataset.save_annotation'
        )

        Dataset(config)

        annotation_saver_mock.assert_called_once_with(converted_annotation, None, Path('custom'), None, config)

    def test_annotation_conversion_subset_size(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 1
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        dataset = Dataset(config)

        assert dataset.data_provider.annotation_provider._data_buffer == {converted_annotation[1].identifier: converted_annotation[1]}

    def test_annotation_conversion_subset_ratio(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': '50%'
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )
        subset_maker_mock = mocker.patch(
            'openvino.tools.accuracy_checker.dataset.make_subset'
        )

        Dataset.load_annotation(config)

        subset_maker_mock.assert_called_once_with(converted_annotation, 1, 666, True, False)

    def test_annotation_conversion_subset_more_than_dataset_size(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 3,
            'subsample_seed': 1
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        with pytest.warns(UserWarning):
            annotation, _ = Dataset.load_annotation(config)

        assert annotation == converted_annotation

    def test_annotation_conversion_with_zero_subset_size(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 0
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_annotation_conversion_with_negative_subset_size(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': -1
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_annotation_conversion_negative_subset_ratio_raise_config_error(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': '-50%'
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_annotation_conversion_zero_subset_ratio_raise_config_error(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': '0%'
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_annotation_conversion_invalid_subset_ratio_raise_config_error(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 'aaa%'
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_annotation_conversion_invalid_subset_size_raise_config_error(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 'aaa'
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        with pytest.raises(ConfigError):
            Dataset(config)

    def test_annotation_conversion_closer_to_zero_subset_ratio(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': '0.001%'
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )
        subset_maker_mock = mocker.patch(
            'openvino.tools.accuracy_checker.dataset.make_subset'
        )

        Dataset.load_annotation(config)

        subset_maker_mock.assert_called_once_with(converted_annotation, 1, 666, True, False)

    def test_annotation_conversion_subset_with_seed(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 1,
            'subsample_seed': 1
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        annotation, _ = Dataset.load_annotation(config)

        assert annotation == [converted_annotation[0]]

    def test_annotation_conversion_save_subset(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'annotation': Path('custom'),
            'subsample_size': 1,
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )
        annotation_saver_mock = mocker.patch(
            'openvino.tools.accuracy_checker.dataset.save_annotation'
        )
        mocker.patch('pathlib.Path.exists', return_value=False)

        Dataset(config)

        annotation_saver_mock.assert_called_once_with([converted_annotation[1]], None, Path('custom'), None, config)

    def test_annotation_conversion_subset_with_disabled_shuffle(self, mocker):
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 1,
            'shuffle': False
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        annotation, _ = Dataset.load_annotation(config)

        assert annotation == [converted_annotation[0]]

    def test_ignore_subset_parameters_if_file_provided(self, mocker):
        mocker.patch('pathlib.Path.exists', return_value=True)
        mocker.patch('openvino.tools.accuracy_checker.utils.read_yaml', return_value=['1'])
        subset_maker_mock = mocker.patch(
            'openvino.tools.accuracy_checker.dataset.make_subset'
        )
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            'subsample_size': 1,
            'shuffle': False,
            "subset_file": "subset.yml"
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        Dataset.load_annotation(config)

        assert not subset_maker_mock.called

    def test_create_data_provider_with_subset_file(self, mocker):
        mocker.patch('pathlib.Path.exists', return_value=True)
        mocker.patch('openvino.tools.accuracy_checker.dataset.read_yaml', return_value=['1'])
        addition_options = {
            'annotation_conversion': {'converter': 'wider', 'annotation_file': Path('file')},
            "subset_file": "subset.yml"
        }
        config = copy_dataset_config(self.dataset_config)
        config.update(addition_options)
        converted_annotation = make_representation(['0 0 0 5 5', '0 1 1 10 10'], True)
        mocker.patch(
            'openvino.tools.accuracy_checker.annotation_converters.WiderFormatConverter.convert',
            return_value=ConverterReturn(converted_annotation, None, None)
        )

        dataset = Dataset(config)

        assert len(dataset.data_provider) == 1
        assert dataset.identifiers == ['1']
        assert dataset.data_provider.full_size == 2
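The subset tests above pin down the expected behaviour for `subsample_size`: both counts and percentage strings are accepted, non-positive or unparseable values are rejected, and a tiny ratio still keeps at least one item. A hedged sketch of that resolution logic; `parse_subsample` is illustrative, not the accuracy_checker API (and where the real code warns and keeps the full set for oversized requests, this sketch simply clips):

```python
def parse_subsample(subsample_size, total):
    """Resolve a subsample spec (count or 'NN%') to an item count."""
    if isinstance(subsample_size, str):
        if not subsample_size.endswith('%'):
            raise ValueError('subsample_size must be a number or a percentage string')
        ratio = float(subsample_size[:-1]) / 100  # ValueError for strings like 'aaa%'
        if ratio <= 0:
            raise ValueError('subsample ratio must be positive')
        count = max(1, int(total * ratio))  # tiny ratios still keep one item
    else:
        count = int(subsample_size)
        if count <= 0:
            raise ValueError('subsample size must be positive')
    return min(count, total)  # never request more items than exist
```

With a two-item dataset this reproduces the expectations encoded in the tests: `'50%'` resolves to 1 item, `'0.001%'` clamps to 1, and a request for 3 items cannot exceed 2.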

# core/views.py (Udantu/portfolio-v1, MIT)
from django.shortcuts import render
def index(request):
    return render(request, "core/index.html")


def dashboard(request):
    return render(request, "core/dashboard.html")


def about(request):
    return render(request, "core/about.html")

# tasks/nooisee/flask/ids_and_flags.py (irdkwmnsb/lkshl-ctf, MIT)
ids = ['i2yctSSMGde1TjBIor4T', 'DZrIgOypvPqMctJS06O9', 'lH4uWfzCg1kFeULgUIVC', 'N9mUMmG8maidbCD0s4mV', 'PuqvlpXgaa5s42mZKE4j', 'CoRVyJtEoNE87xUoRqw0', '8uKC7RXAOBlOXdP7KVGl', '2cVvsSZY3QLRmoAYRFkL', '22enKTGqQwxga5RFQdus', 'Sw1vkSnYDQdEtOjTnfH0', 'S7mUlJwhAJhCFgxCOoAW', 'JkTjzY6JUEgo9G7tHciV', 'xn8irINr3sDQLIAb2CkI', 'WPnNc5oYE4s3IPkL0fGg', 'dVqlZVLNgCCm1Sy4qfOW', 'l5cEE0hLMVCt2TjTit5c', '7BcmXXs7kbip5gfT853u', 'QZrfCklvGpdwt4vB6oh8', 'hAiHcE7oRy0zXqpoNeWh', 'LbSDhksl1vNujaAmhVhJ', 'oZZ1TyF3Ysg9KxtBGVs8', 'FKerevwgUjdoWUGyZ652', 'jvutz2HwlzcUGbSoZkgu', 'oV28KmKrYyvSdosdRck1', 'tuacfxYJA1CE54Ixao5F', 'q2B8TWtgwVD2rsGEeehx', '5bO4XLG9OyswG01jVleq', 'rZBIgEB002nWqVjMIBzg', '5ojT7jrimtbZP6mp7MAh', 'Z31bdvkrb3NtMIC3MenW', 'bbH9tpTiZz7V8a7i848m', '9xWkjKVCX91TzD933bMD', '3Jq5yRa0S0DKpIi96kjH', '2h3bhgxJ3ohZeNuG5Noq', 'o2YpZKg7619CB6yzN5SB', '1JZsemZRho77QrN0skkl', 'K9ySRqaGklft8OAY4l0G', 'r4wgNCzZbxtMhx8SHZlv', 'zLUWPrq0JEvsXn0yb2c5', 'fa87JWxShRCUK1xAV611', 'Kcr65dOqJVpTDU2cuUP1', 'aidLGpK1oiHQ2lv7gC12', 'Ttxgw2BqBD1jrm5l7hYc', 'YbQgYJDKArj2kOh95j86', 'tF82UdE6QhLaNfxbJ5PB', '3lN47CbPRR22TsFVdP6i', '2msPQ0ruOJ5rvOzvi0Gz', 'MUdVTNaq9i2k85AqIlVl', 'AiPFQJJBnJGTKtU6Ifrc', 'a7patLnLlBvWKjzfoPUy', 'sAsApUGUvINTq68IVRXX', 'Ne2J72wYP6LAnrckWZIh', '0D774XxxX56yC2WM7qkr', 'oggJm9bOxLAEEb1FswRZ', 'DXonEXUOBi61oAAj24MG', 'udpI7111q5lofsEmVkZN', 'lSoDYeb6RnTtOUZCG0y5', 'foE9zcv1E3n4VlnbnMGO', 'H69Fr2lKjto9PoNhbGdO', 'rKPI7ONeLkxt6p0bkCnC', '0gXXuhUYpAfF5TWuouUu', 'R4AMaHKEDnMx2fxdFR4v', 'p8HtmlwqQFbRvmlWj09t', 'Gv8gN89ZvU32kvteW3Kw', 'xOtLyoBFxgTskjoHCA74', 'zWesr92l1uJKm9RGGFwt', 'iCkYhQIGrM46gZyOeivp', 'R6BRGvq6uPZ7cRHdx5iq', 'UiIfbhnbY0J7SmnPOiHI', 'tgzeloib826aSzWzJRqr', '2XNmQ1h3uOXxvBY5cf45', 'TronVfZ9KWuimzkX7o56', 'iN2I21k3xQ1x4slI1q81', 'Tg0LMsdZFyyuv5spDpcM', '5Tg0M84vOTpM5jee3xCj', 'xS5dAwdL8ZLCUBCRZnEf', 'r9lyMRAjrWp3BxpT9NB5', 'hhsJYamIQskGD4BkLPUK', 'Du0ryLhVJFhfoDDOquON', 'yJkQ3y6kv93edSSoPcXl', 'ax6V8yGruNZMWmQJW8s8', 'U7GWGaCUjYSJQROeJNPl',
'0Kr0BkEkkbj0fHi0HVt9', 'TvX8REHZmZGFWchiFFff', 'unBi2qvY43oHIf67DpJk', 'f70Yf2D6U2cokFrE4M89', 'UHR1IIsj0RqR0tCMWjaY', 'k7noV2tQkX0mXSwbRTeA', 'G2RCr57Ur0r8Bd0msCnh', 'uEs94rT8dvyMSK3tgUmb', 'Cn86UI8t4tEU5LslZLhT', 'GasDs9W7O7zE5UOLlUyU', '8yBxaH7rrliGXrE9iN6l', '2ib88yqt6IMAhZLbP5eR', 'OqZnGIsjGmw2MGsS4Y7F', 'JRj9gpEn7JmdErJthkDN', 'OHoTjcphFL1tWxfdmwfb', 'CmLSRN8HL8CpH7Je56lw', 'BQPKERa1jaW7lv99ETYL', 'tshNpN967rtSld8NYCgt', 'MWVDlAwG0z6KSFKDVdxy', 'Aj6Wbseon4e6CM2a8N1s', 'ZtDlWQnQaURZqkWVxPSc', 'KdtXIlqS604uHDKoqenD', 'QO0sLYHHSnzPUKHjZgiy', 'C1Tn5OZCJfSewPe9qxnP', 'plyCyxg03GwFY0kc9Mh7', 'BV5FLHzyksnrHn4qgb7Z', 'uz2OONdBpGpWREOFn2Dp', 'nRafhWiq9ady7YMjMfmD', 'jbKC4rnCJqnSWNybL0ho', 'v15v2YSAdmmwFsbJzv17', 'wrA9JCI3vQ17wKRznfmT', 'Wrppq31UosvJyjTECnOB', 'MxloukUZxGkyVaDtGyG1', 'CoeFnlaucvloF9dGCV1K', 'q0z4Vqf8uVOJjXMooAd0', 'tEu9sRK1yGBC8kH7sDUb', 'lgMRRPCNwMKSjb82hJN0', 'UhYGSPsJx6dNUCzuVaEw', '0lzOfTipjEFqzJsLiyYI', 'hbTF34f4iv7a981bHuV7', 'DYvTEJqsIk8pPuQS5Hai', 'mG9i1Q9oOtimP7Xpge4R', 'fI4oQBFO67nD6GeFxPjb', 'TkZRLa98jQEvEk5xiBi2', 'gJdNhYVuNHsXemBEU0XX', 'TxyeoCCGL80nX6hv7OcE', '5SayX4207FIvIIjqBANn', 'Kt1CH39vMZQO8mVVbPVG', 'sZInNDG5pKcfzsL1GeJk', 'lSoiTyLc5Pm2g8bS25fO', '8RygrlGoU9tVd20nB6pU', 'QjpdFah2GgN73iMOcKgu', '5uqRz6u3P0kcLddR0X1Q', 'JlYTWI9NtbINkqA9Gcx8', 'O48qYJt8hOoOeIRFZiZ5', 'HKhGZkv4v4SCwVNgowvF', 'fV4BZGFvUS2PUNCHQJoQ', 'kqSSIrlgFzfrYzW7LeC6', 'Yf2waPjPfokxDnTkbKR0', '41fFikB0JwoZ2d3bbT2N', 'XzITKmVvLkUGANuolqjs', 'WeL0qrrg3VUSvce4eOqH', 'bnC1DabRzDnbLOHlVs95', 'iNVSQXEm7uamgR2uM8ub', 'Us0NNywPFasbh18rhKJL', '0UJ9ZTXiVSsXjpJCXYw5', 'MAZtrsxoVBMmh1KxOXIC', 'XMHVLcPDSFEL4PBgO5Uw', '57aBZrcavR7OsV8mFfdL', 'SDYrvSy5HNxZvCHpdPBa', 'qZjSvBWwtiV16tprHaJZ', 'cwYVhgQzRGa1UZa6wVXJ', 'ZurZjCN8DEZ87clrXlr0', 'sLTnm7u7ZtN9JfT29Vxg', 'sYdSmELU7dTZzi9tPmkG', 's0JdJwu3TsxmUDccEpKQ', 'jZrz2x2pItkIAFD2oyWF', 'ooOzVV9Dd1POoGwBiWsd', 'MuFsWSaRIWDWxUtMtHdH', 'mYEILDLnIaMBAg4LJZYA', 'w402UedR0qSN03uxFKro', 'Xh4tAImtQ11tnq25JIwt', 'Q6YOeR6OYFXpRc2vaqp4', 
'6EaLzsqaq7s0qRavwFOR', 'KipsQDR4RI6fgYkiYQeu', '5TByalXqgofiGuUFQ8ga', '6WSjCzum30MOuHJHI25r', '8o9mxMvUtlKMAIFxuQkY', 'x2N5Jp1uJPsIeLAknqrk', 'fn1DYGLRxayGv91i3ico', '1vpNXERfEuuvKG1yt6Es', 'YDgM6cyCeZ3WMbKtnZRA', 'VXcpNpWmcOD4ZuH0vvqE', '67bfDWVAqymhAV8xoow2', 'shuxt8SQoWuiSjmNCrq6', 'SUAwETaPiK5yZWwWgzLe', 'v8c6KsbsY0O7r20NcTc9', 'Z3I5tZoUE9Sl80IPDio7', 'erz0CZLp38LLQtw5CEyE', 'qiV6CQW3Np8fLUi4aUx1', 'UMxKLOtyDTZsD89IVXn5', '6Ue63hlYvUd2vHbNQTSZ', 'zLwT4gUVggNYF1Qz3eJK', 'EcfEf5UUER30630SJtcM', 'd1GbTz3UiUdCZAtOiSfH', '8I6JNrQL7zXkoMLQ14AI', '9oylE0h4WnWRlJJJ81RO', 'nSMZmbS7vIdnKGym2NOB', 'CJCVx5gq2zEVFZSsHlUi', '1okbUDCHJuIZJ4c4r0cN', 'rc8HONSCGpF0WTct384T', 'EcrLmnCC47uM5uNzapU7', 'BKcxCqu6kH2eB5tvqbp8', 'zxcpVWFMGRo96KdhAWC4', 'pNAbg6kLWHvgWU18GSDR', 'rXsOIcfQbrObgjhKFD1y', 'gng3koJU2ngLBOMBkn09', '6eDv9WvCunSJ3rbR7P41']
flags = ['LKL{g00D_N0isss3_M0VwcT}', 'LKL{g00D_N0isss3_fIh2JH}', 'LKL{g00D_N0isss3_oD1gJ7}', 'LKL{g00D_N0isss3_SfipqG}', 'LKL{g00D_N0isss3_oS5Nnz}', 'LKL{g00D_N0isss3_Btipdn}', 'LKL{g00D_N0isss3_Mo2isN}', 'LKL{g00D_N0isss3_gfjVax}', 'LKL{g00D_N0isss3_89DjDR}', 'LKL{g00D_N0isss3_U9rTxu}', 'LKL{g00D_N0isss3_zkT5Ks}', 'LKL{g00D_N0isss3_vVa7nj}', 'LKL{g00D_N0isss3_6PTYIO}', 'LKL{g00D_N0isss3_yXAKpI}', 'LKL{g00D_N0isss3_UXYisz}', 'LKL{g00D_N0isss3_485o6m}', 'LKL{g00D_N0isss3_IAfQoF}', 'LKL{g00D_N0isss3_u7jwOR}', 'LKL{g00D_N0isss3_0eVf9D}', 'LKL{g00D_N0isss3_cJEXvX}', 'LKL{g00D_N0isss3_r8yGte}', 'LKL{g00D_N0isss3_0Wg6vG}', 'LKL{g00D_N0isss3_2yxorP}', 'LKL{g00D_N0isss3_4F6Syl}', 'LKL{g00D_N0isss3_Sfy6NZ}', 'LKL{g00D_N0isss3_MHIZ0f}', 'LKL{g00D_N0isss3_besNuI}', 'LKL{g00D_N0isss3_3Ofy6n}', 'LKL{g00D_N0isss3_bU4Enb}', 'LKL{g00D_N0isss3_jTy3F5}', 'LKL{g00D_N0isss3_ZeCN3f}', 'LKL{g00D_N0isss3_qJE6fK}', 'LKL{g00D_N0isss3_86VxMN}', 'LKL{g00D_N0isss3_VXRzes}', 'LKL{g00D_N0isss3_JyPPq5}', 'LKL{g00D_N0isss3_JGYTE9}', 'LKL{g00D_N0isss3_NcaQzt}', 'LKL{g00D_N0isss3_Py2Jbl}', 'LKL{g00D_N0isss3_yepRkv}', 'LKL{g00D_N0isss3_2SsIXv}', 'LKL{g00D_N0isss3_O1Hz6r}', 'LKL{g00D_N0isss3_H6n4Z9}', 'LKL{g00D_N0isss3_Ncw3Z8}', 'LKL{g00D_N0isss3_KUcuzK}', 'LKL{g00D_N0isss3_qIY0i2}', 'LKL{g00D_N0isss3_084rcz}', 'LKL{g00D_N0isss3_CSOVie}', 'LKL{g00D_N0isss3_Tx304O}', 'LKL{g00D_N0isss3_NQHYem}', 'LKL{g00D_N0isss3_j2yrJp}', 'LKL{g00D_N0isss3_fYETyb}', 'LKL{g00D_N0isss3_KFKGph}', 'LKL{g00D_N0isss3_Y67kzX}', 'LKL{g00D_N0isss3_DFaPLi}', 'LKL{g00D_N0isss3_pH9R0C}', 'LKL{g00D_N0isss3_Jz9TY7}', 'LKL{g00D_N0isss3_JGxdKo}', 'LKL{g00D_N0isss3_EEUsf3}', 'LKL{g00D_N0isss3_tffJEU}', 'LKL{g00D_N0isss3_mCsaLE}', 'LKL{g00D_N0isss3_F8J0OW}', 'LKL{g00D_N0isss3_9l20a6}', 'LKL{g00D_N0isss3_bZHXxr}', 'LKL{g00D_N0isss3_WXInmT}', 'LKL{g00D_N0isss3_giBP9c}', 'LKL{g00D_N0isss3_S3Oxlh}', 'LKL{g00D_N0isss3_fVRZxk}', 'LKL{g00D_N0isss3_OePWlp}', 'LKL{g00D_N0isss3_VrqnRw}', 'LKL{g00D_N0isss3_IoLWv0}', 'LKL{g00D_N0isss3_IyM6fA}', 
'LKL{g00D_N0isss3_auHrW6}', 'LKL{g00D_N0isss3_oK579V}', 'LKL{g00D_N0isss3_RVElQC}', 'LKL{g00D_N0isss3_oR9Aqc}', 'LKL{g00D_N0isss3_zPD9Za}', 'LKL{g00D_N0isss3_5khQWk}', 'LKL{g00D_N0isss3_wydJs2}', 'LKL{g00D_N0isss3_ttNaud}', 'LKL{g00D_N0isss3_kIMIU7}', 'LKL{g00D_N0isss3_SNahdB}', 'LKL{g00D_N0isss3_kBCPmL}', 'LKL{g00D_N0isss3_BpNCv3}', 'LKL{g00D_N0isss3_IZPzC4}', 'LKL{g00D_N0isss3_s6kihA}', 'LKL{g00D_N0isss3_KX4A5L}', 'LKL{g00D_N0isss3_uQUZzA}', 'LKL{g00D_N0isss3_632Y2A}', 'LKL{g00D_N0isss3_W135ft}', 'LKL{g00D_N0isss3_LE6N7W}', 'LKL{g00D_N0isss3_KyICZe}', 'LKL{g00D_N0isss3_zkD0rf}', 'LKL{g00D_N0isss3_9buyIv}', 'LKL{g00D_N0isss3_kGEOoy}', 'LKL{g00D_N0isss3_ZfBib1}', 'LKL{g00D_N0isss3_z0slZ2}', 'LKL{g00D_N0isss3_88A01U}', 'LKL{g00D_N0isss3_oUNEDP}', 'LKL{g00D_N0isss3_Cnyscg}', 'LKL{g00D_N0isss3_7IkYG0}', 'LKL{g00D_N0isss3_gF0wmI}', 'LKL{g00D_N0isss3_yMF2cR}', 'LKL{g00D_N0isss3_TXzhcc}', 'LKL{g00D_N0isss3_3vUVPT}', 'LKL{g00D_N0isss3_75g5Wu}', 'LKL{g00D_N0isss3_ZGkNWN}', 'LKL{g00D_N0isss3_9baV51}', 'LKL{g00D_N0isss3_emoXAO}', 'LKL{g00D_N0isss3_pVghGT}', 'LKL{g00D_N0isss3_tQFOWQ}', 'LKL{g00D_N0isss3_jd4Zue}', 'LKL{g00D_N0isss3_kcVj6F}', 'LKL{g00D_N0isss3_XBIDjP}', 'LKL{g00D_N0isss3_hCVw6C}', 'LKL{g00D_N0isss3_tkYVgw}', 'LKL{g00D_N0isss3_t7tZkx}', 'LKL{g00D_N0isss3_6xlFZ6}', 'LKL{g00D_N0isss3_HSWb9c}', 'LKL{g00D_N0isss3_sLOi9l}', 'LKL{g00D_N0isss3_YXkZdr}', 'LKL{g00D_N0isss3_K5w8aU}', 'LKL{g00D_N0isss3_mv8ziu}', 'LKL{g00D_N0isss3_vxVAEt}', 'LKL{g00D_N0isss3_azgJlU}', 'LKL{g00D_N0isss3_Z2NJdp}', 'LKL{g00D_N0isss3_JaF5vV}', 'LKL{g00D_N0isss3_KxSi7R}', 'LKL{g00D_N0isss3_OI6SRb}', 'LKL{g00D_N0isss3_4R6m2i}', 'LKL{g00D_N0isss3_xtOTsi}', 'LKL{g00D_N0isss3_8ulVa0}', 'LKL{g00D_N0isss3_HkjTle}', 'LKL{g00D_N0isss3_FcrnrL}', 'LKL{g00D_N0isss3_zIDDbw}', 'LKL{g00D_N0isss3_wh2Fh6}', 'LKL{g00D_N0isss3_pkrF9v}', 'LKL{g00D_N0isss3_1Lq22A}', 'LKL{g00D_N0isss3_Vyf8vW}', 'LKL{g00D_N0isss3_VZ9rR0}', 'LKL{g00D_N0isss3_aeVraB}', 'LKL{g00D_N0isss3_hSoDcd}', 'LKL{g00D_N0isss3_RkTNkY}', 
'LKL{g00D_N0isss3_2jRJ44}', 'LKL{g00D_N0isss3_p6PYM7}', 'LKL{g00D_N0isss3_nODrjr}', 'LKL{g00D_N0isss3_Btlsll}', 'LKL{g00D_N0isss3_48wYnO}', 'LKL{g00D_N0isss3_TBcmal}', 'LKL{g00D_N0isss3_lErmPs}', 'LKL{g00D_N0isss3_fEHtQe}', 'LKL{g00D_N0isss3_gjShxr}', 'LKL{g00D_N0isss3_Daj3S7}', 'LKL{g00D_N0isss3_CfIRqC}', 'LKL{g00D_N0isss3_pXUtMd}', 'LKL{g00D_N0isss3_rhVZVx}', 'LKL{g00D_N0isss3_CqsRWp}', 'LKL{g00D_N0isss3_yNBCA6}', 'LKL{g00D_N0isss3_vw6ySl}', 'LKL{g00D_N0isss3_JzxHxq}', 'LKL{g00D_N0isss3_Wcjjdr}', 'LKL{g00D_N0isss3_AKedWk}', 'LKL{g00D_N0isss3_hs10Sa}', 'LKL{g00D_N0isss3_5WBLqq}', 'LKL{g00D_N0isss3_1riPbD}', 'LKL{g00D_N0isss3_dV1wxO}', 'LKL{g00D_N0isss3_or6wJE}', 'LKL{g00D_N0isss3_bfr8E6}', 'LKL{g00D_N0isss3_Jlgc1D}', 'LKL{g00D_N0isss3_t1J8ZG}', 'LKL{g00D_N0isss3_8m9ery}', 'LKL{g00D_N0isss3_hiVkBd}', 'LKL{g00D_N0isss3_vIrWAD}', 'LKL{g00D_N0isss3_Mn9K3B}', 'LKL{g00D_N0isss3_pgjdiB}', 'LKL{g00D_N0isss3_azAstf}', 'LKL{g00D_N0isss3_wwURNX}', 'LKL{g00D_N0isss3_dtXquC}', 'LKL{g00D_N0isss3_qYuvXY}', 'LKL{g00D_N0isss3_rIkruu}', 'LKL{g00D_N0isss3_ATULAI}', 'LKL{g00D_N0isss3_wernRd}', 'LKL{g00D_N0isss3_pvziV6}', 'LKL{g00D_N0isss3_WPIIJQ}', 'LKL{g00D_N0isss3_yJPisd}', 'LKL{g00D_N0isss3_xrXPrQ}', 'LKL{g00D_N0isss3_j0IkqH}', 'LKL{g00D_N0isss3_wXBlZx}', 'LKL{g00D_N0isss3_DKBsw5}', 'LKL{g00D_N0isss3_l9JeSM}', 'LKL{g00D_N0isss3_jPVEqw}', 'LKL{g00D_N0isss3_BuGWtj}', 'LKL{g00D_N0isss3_mJWPmx}', 'LKL{g00D_N0isss3_2zAryd}', 'LKL{g00D_N0isss3_rP5bah}', 'LKL{g00D_N0isss3_Z86HGm}', 'LKL{g00D_N0isss3_m08J5V}', 'LKL{g00D_N0isss3_hukANs}', 'LKL{g00D_N0isss3_P2KSOO}', 'LKL{g00D_N0isss3_aauXbW}', 'LKL{g00D_N0isss3_kZ6TBv}']
ab033f7fd0adcf79a8c4ffaaf0f55c56ae4ddd80 | 731 | py | Python | pybit/exceptions.py | Crypto-Currency-Projects/pybit | 7922348b8d96cd45a8427ad25889484369ecf344 | [
"MIT"
] | 1 | 2021-11-03T15:18:35.000Z | 2021-11-03T15:18:35.000Z | pybit/exceptions.py | Crypto-Currency-Projects/pybit | 7922348b8d96cd45a8427ad25889484369ecf344 | [
"MIT"
] | null | null | null | pybit/exceptions.py | Crypto-Currency-Projects/pybit | 7922348b8d96cd45a8427ad25889484369ecf344 | [
"MIT"
] | 1 | 2021-04-28T04:44:12.000Z | 2021-04-28T04:44:12.000Z | class FailedRequestError(Exception):
"""
Exception raised for failed requests.
Attributes:
message -- Explanation of the error.
status_code -- The code number returned.
"""
def __init__(self, message, status_code):
self.message = message
self.status_code = status_code
super().__init__(self.message)
class InvalidRequestError(Exception):
"""
Exception raised for returned Bybit errors.
Attributes:
message -- Explanation of the error.
status_code -- The code number returned.
"""
def __init__(self, message, status_code):
self.message = message
self.status_code = status_code
super().__init__(self.message)
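Both exception classes share the same shape; a minimal, self-contained sketch of how a caller might raise and catch one (the `fetch_balance` helper is hypothetical, not part of pybit, and the class is repeated so the snippet stands alone):

```python
class FailedRequestError(Exception):
    """Exception raised for failed requests (mirrors the class above)."""
    def __init__(self, message, status_code):
        self.message = message
        self.status_code = status_code
        super().__init__(self.message)


def fetch_balance():
    # Hypothetical stand-in for a real HTTP call; always fails in this sketch.
    raise FailedRequestError("Connection timed out", status_code=408)


try:
    fetch_balance()
except FailedRequestError as e:
    # Both the human-readable message and the numeric code are available.
    print(e.message, e.status_code)
```

Because `super().__init__(self.message)` is called, `str(e)` returns the message, so the exception also formats cleanly in tracebacks and logs.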
| 27.074074 | 48 | 0.651163 | 77 | 731 | 5.87013 | 0.311688 | 0.176991 | 0.132743 | 0.119469 | 0.70354 | 0.70354 | 0.70354 | 0.70354 | 0.70354 | 0.70354 | 0 | 0 | 0.261286 | 731 | 26 | 49 | 28.115385 | 0.837037 | 0.381669 | 0 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
dba24157dda956605db711a4c9fc65ec2361fc98 | 2,155 | py | Python | epytope/Data/pssms/smm/mat/A_68_23_9.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 7 | 2021-02-01T18:11:28.000Z | 2022-01-31T19:14:07.000Z | epytope/Data/pssms/smm/mat/A_68_23_9.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 22 | 2021-01-02T15:25:23.000Z | 2022-03-14T11:32:53.000Z | epytope/Data/pssms/smm/mat/A_68_23_9.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 4 | 2021-05-28T08:50:38.000Z | 2022-03-14T11:45:32.000Z | A_68_23_9 = {0: {'A': -0.009, 'C': -0.032, 'E': -0.07, 'D': 0.0, 'G': -0.009, 'F': -0.042, 'I': 0.0, 'H': 0.025, 'K': 0.112, 'M': -0.027, 'L': 0.129, 'N': 0.0, 'Q': 0.0, 'P': 0.0, 'S': -0.071, 'R': 0.005, 'T': 0.016, 'W': -0.041, 'V': 0.0, 'Y': 0.013}, 1: {'A': -0.161, 'C': 0.0, 'E': 0.243, 'D': 0.0, 'G': 0.267, 'F': 0.0, 'I': 0.312, 'H': 0.0, 'K': 0.0, 'M': -0.07, 'L': 0.238, 'N': 0.0, 'Q': -0.001, 'P': -0.015, 'S': 0.139, 'R': 0.258, 'T': -0.693, 'W': 0.0, 'V': -0.51, 'Y': -0.006}, 2: {'A': -0.022, 'C': 0.0, 'E': 0.0, 'D': -0.001, 'G': 0.014, 'F': -0.033, 'I': 0.029, 'H': -0.016, 'K': 0.0, 'M': -0.003, 'L': 0.02, 'N': 0.0, 'Q': 0.0, 'P': 0.012, 'S': 0.025, 'R': 0.0, 'T': -0.005, 'W': 0.01, 'V': -0.013, 'Y': -0.019}, 3: {'A': 0.06, 'C': 0.0, 'E': -0.071, 'D': -0.059, 'G': -0.07, 'F': -0.08, 'I': -0.036, 'H': 0.062, 'K': 0.088, 'M': -0.023, 'L': -0.011, 'N': 0.009, 'Q': 0.094, 'P': 0.149, 'S': 0.055, 'R': 0.01, 'T': 0.029, 'W': -0.027, 'V': -0.031, 'Y': -0.147}, 4: {'A': -0.0, 'C': 0.0, 'E': -0.0, 'D': -0.0, 'G': 0.0, 'F': -0.0, 'I': 0.0, 'H': -0.0, 'K': -0.0, 'M': 0.0, 'L': 0.0, 'N': -0.0, 'Q': 0.0, 'P': -0.0, 'S': 0.0, 'R': -0.0, 'T': 0.0, 'W': -0.0, 'V': -0.0, 'Y': -0.0}, 5: {'A': 0.0, 'C': 0.0, 'E': 0.0, 'D': -0.0, 'G': 0.0, 'F': 0.0, 'I': -0.0, 'H': -0.0, 'K': 0.0, 'M': -0.0, 'L': 0.0, 'N': 0.0, 'Q': -0.0, 'P': -0.0, 'S': 0.0, 'R': -0.0, 'T': -0.0, 'W': -0.0, 'V': -0.0, 'Y': 0.0}, 6: {'A': -0.194, 'C': 0.0, 'E': -0.045, 'D': -0.059, 'G': 0.066, 'F': 0.104, 'I': -0.167, 'H': -0.04, 'K': 0.0, 'M': -0.06, 'L': 0.26, 'N': -0.117, 'Q': 0.128, 'P': -0.172, 'S': 0.182, 'R': -0.036, 'T': -0.093, 'W': -0.019, 'V': 0.183, 'Y': 0.079}, 7: {'A': -0.004, 'C': -0.016, 'E': -0.025, 'D': -0.007, 'G': -0.07, 'F': 0.018, 'I': 0.017, 'H': -0.017, 'K': 0.009, 'M': 0.049, 'L': 0.043, 'N': 0.0, 'Q': 0.001, 'P': -0.037, 'S': 0.104, 'R': 0.011, 'T': -0.026, 'W': 0.0, 'V': 0.017, 'Y': -0.067}, 8: {'A': -0.032, 'C': 0.0, 
'E': 0.0, 'D': 0.0, 'G': 0.0, 'F': 0.071, 'I': 0.162, 'H': 0.0, 'K': 0.013, 'M': -0.02, 'L': 0.01, 'N': 0.0, 'Q': 0.0, 'P': 0.0, 'S': 0.0, 'R': 0.068, 'T': 0.0, 'W': -0.046, 'V': -0.06, 'Y': -0.166}, -1: {'con': 1.43775}} | 2,155 | 2,155 | 0.354524 | 557 | 2,155 | 1.366248 | 0.193896 | 0.194481 | 0.027595 | 0.036794 | 0.396846 | 0.266754 | 0.266754 | 0.266754 | 0.208936 | 0.208936 | 0 | 0.323793 | 0.173086 | 2,155 | 1 | 2,155 | 2,155 | 0.103255 | 0 | 0 | 0 | 0 | 0 | 0.084879 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
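A matrix of this shape is typically applied by summing, for each position of a 9-mer peptide, the entry for the residue observed there, then adding the `'con'` constant stored under key `-1`. That scoring convention is an assumption here (the actual epytope SMM code may scale or normalize differently); a toy 3-position sketch:

```python
# Toy position-specific scoring matrix: 3 positions plus a constant term,
# shaped like the A_68_23_9 dictionary above but with only two residues.
toy_pssm = {
    0: {'A': 0.1, 'G': -0.2},
    1: {'A': -0.5, 'G': 0.3},
    2: {'A': 0.0, 'G': 0.7},
    -1: {'con': 1.0},
}


def score_peptide(pssm, peptide):
    """Sum the per-position entries for each residue, then add the constant."""
    total = sum(pssm[i][aa] for i, aa in enumerate(peptide))
    return total + pssm[-1]['con']


score = score_peptide(toy_pssm, "AGG")  # 0.1 + 0.3 + 0.7 + 1.0
```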
dbba3937a50a7825512a04f16d6c5eac3d2e530b | 21 | py | Python | blesuite/replay/btsnoop/btsnoop/__init__.py | jreynders/BLESuite-1 | 1c3c15fc2d4e30c3f9c1a15e0268cae84685784b | [
"MIT"
] | 198 | 2016-08-04T05:45:38.000Z | 2022-02-17T08:30:58.000Z | blesuite/replay/btsnoop/btsnoop/__init__.py | jreynders/BLESuite-1 | 1c3c15fc2d4e30c3f9c1a15e0268cae84685784b | [
"MIT"
] | 13 | 2018-02-04T14:16:16.000Z | 2020-10-09T02:16:24.000Z | blesuite/replay/btsnoop/btsnoop/__init__.py | jreynders/BLESuite-1 | 1c3c15fc2d4e30c3f9c1a15e0268cae84685784b | [
"MIT"
] | 57 | 2016-08-08T04:24:04.000Z | 2022-01-24T08:43:02.000Z | from . import btsnoop | 21 | 21 | 0.809524 | 3 | 21 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 21 | 1 | 21 | 21 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
91acdaa9b641fa37cb32734abc3b596e6e494ffb | 287 | py | Python | aiger/__init__.py | sjunges/py-aiger | 8a971bc2ab983ee57115c64964d41135ca444048 | [
"MIT"
] | 19 | 2018-04-24T16:40:24.000Z | 2021-12-11T15:48:30.000Z | aiger/__init__.py | sjunges/py-aiger | 8a971bc2ab983ee57115c64964d41135ca444048 | [
"MIT"
] | 95 | 2018-05-09T00:13:53.000Z | 2021-03-03T22:26:02.000Z | aiger/__init__.py | sjunges/py-aiger | 8a971bc2ab983ee57115c64964d41135ca444048 | [
"MIT"
] | 5 | 2018-04-24T22:42:30.000Z | 2021-07-12T19:09:41.000Z | # flake8: noqa
from aiger.aig import AIG, to_aig
from aiger.common import and_gate, or_gate, bit_flipper, source, sink
from aiger.common import tee, empty, identity, delay, parity_gate
from aiger.parser import *
from aiger.expr import atom, atoms, ite, BoolExpr
from aiger.lazy import *
| 35.875 | 69 | 0.787456 | 47 | 287 | 4.702128 | 0.574468 | 0.244344 | 0.135747 | 0.190045 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004032 | 0.135889 | 287 | 7 | 70 | 41 | 0.887097 | 0.041812 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
91b1eb9b7bf0ca0e60d41a831b5874f4cd864e34 | 17 | py | Python | readbin/settings/local.example.py | asnelzin/readbin | 1b546f71955cf5753d63aaf7d7fda0d466fc1332 | [
"MIT"
] | null | null | null | readbin/settings/local.example.py | asnelzin/readbin | 1b546f71955cf5753d63aaf7d7fda0d466fc1332 | [
"MIT"
] | null | null | null | readbin/settings/local.example.py | asnelzin/readbin | 1b546f71955cf5753d63aaf7d7fda0d466fc1332 | [
"MIT"
] | null | null | null | from dev import * | 17 | 17 | 0.764706 | 3 | 17 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 17 | 1 | 17 | 17 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
91bf4650e62d672a4ce1360897e332b5c639afe0 | 254 | py | Python | python/federatedml/ensemble/boosting/boosting_core/__init__.py | hubert-he/FATE | 6758e150bd7ca7d6f788f9a7a8c8aea7e6500363 | [
"Apache-2.0"
] | 3,787 | 2019-08-30T04:55:10.000Z | 2022-03-31T23:30:07.000Z | python/federatedml/ensemble/boosting/boosting_core/__init__.py | JavaGreenHands/FATE | ea1e94b6be50c70c354d1861093187e523af32f2 | [
"Apache-2.0"
] | 1,439 | 2019-08-29T16:35:52.000Z | 2022-03-31T11:55:31.000Z | python/federatedml/ensemble/boosting/boosting_core/__init__.py | JavaGreenHands/FATE | ea1e94b6be50c70c354d1861093187e523af32f2 | [
"Apache-2.0"
] | 1,179 | 2019-08-29T16:18:32.000Z | 2022-03-31T12:55:38.000Z | from federatedml.ensemble.boosting.boosting_core.boosting import Boosting
from federatedml.ensemble.boosting.boosting_core.hetero_boosting import HeteroBoostingGuest, HeteroBoostingHost
__all__ = ["Boosting", "HeteroBoostingGuest", "HeteroBoostingHost"] | 63.5 | 111 | 0.866142 | 24 | 254 | 8.875 | 0.416667 | 0.140845 | 0.215962 | 0.29108 | 0.403756 | 0.403756 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055118 | 254 | 4 | 112 | 63.5 | 0.8875 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
91d1dc8c2e5db3f6200ebe0a47389b37dab1c440 | 35 | py | Python | dvcli/__main__.py | gdcc/dvcli | 25aabb31db631208411d3cdfefcabe13201bbe9a | [
"Apache-2.0"
] | 2 | 2020-02-08T12:59:31.000Z | 2020-02-10T15:22:29.000Z | dvcli/__main__.py | GlobalDataverseCommunityConsortium/dvcli | 25aabb31db631208411d3cdfefcabe13201bbe9a | [
"Apache-2.0"
] | 4 | 2020-02-08T13:13:01.000Z | 2020-07-22T20:17:46.000Z | dvcli/__main__.py | GlobalDataverseCommunityConsortium/dvcli | 25aabb31db631208411d3cdfefcabe13201bbe9a | [
"Apache-2.0"
] | null | null | null | import dvcli.cli
dvcli.cli.main()
| 8.75 | 16 | 0.742857 | 6 | 35 | 4.333333 | 0.666667 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 3 | 17 | 11.666667 | 0.83871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
37d194e701a0ee6320debc0b91b8af97a9e48721 | 85,688 | py | Python | notebooks/Tests_IDE.py | CFARS/TACT | 1b2bbf1f9d0a45cff232ec447286419faac66b58 | [
"BSD-3-Clause"
] | 1 | 2022-03-23T11:50:53.000Z | 2022-03-23T11:50:53.000Z | notebooks/Tests_IDE.py | CFARS/TACT | 1b2bbf1f9d0a45cff232ec447286419faac66b58 | [
"BSD-3-Clause"
] | 4 | 2021-12-18T04:01:41.000Z | 2022-03-10T16:13:18.000Z | notebooks/Tests_IDE.py | CFARS/TACT | 1b2bbf1f9d0a45cff232ec447286419faac66b58 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Thu Dec 30 11:25:35 2021
@author: AHBL
"""
import os

import numpy as np
import pandas as pd
import openpyxl

from TACT.readers.config import Config
from TACT.readers.data import Data
"""parser get_input_files"""
example_path = "C:/GitHub/site_suitability_tool/Example/"
config = Config(
input_filename=example_path+"example_project.csv",
config_file=example_path+"configuration_example_project.xlsx",
rtd_files='',
results_file=example_path+'Test_IDE_results.xlsx',
save_model_location='',
time_test_flag=False,
)
config.outpath_dir = os.path.dirname(config.results_file)
config.outpath_file = os.path.basename(config.results_file)
input_filename = config.input_filename
config_file = config.config_file
rtd_files = config.rtd_files
results_filename = config.results_file
saveModel = config.save_model_location
timetestFlag = config.time_test_flag
globalModel = config.global_model
"""config object assignments"""
outpath_dir = config.outpath_dir
outpath_file = config.outpath_file
"""metadata parser"""
config.get_site_metadata()
siteMetadata = config.site_metadata
config.get_filtering_metadata()
filterMetadata = config.config_metadata
config.get_adjustments_metadata()
adjustmentsMetadata = config.adjustments_metadata
RSDtype = config.RSDtype
extrap_metadata = config.extrap_metadata
extrapolation_type = config.extrapolation_type
"""data object assignments"""
data = Data(input_filename=example_path+"example_project.csv",
config_file=example_path+"configuration_example_project.xlsx")
data.get_inputdata()
data.get_refTI_bins()
data.check_for_alphaConfig()
#%%
"""
config_object
- site_suitability_tool/readers/config_file.py
- parameters
-
- object includes
- config_file
- input_filename
- rtd_files
- results_filename
- saveModel
- time_test_flag
- global_model
-
- siteMetaData
- filterMetaData
- adjustmentsMetaData
- RSDtype
- extrap_metadata
- extrapolation_type
data_object
- config_object
- inputdata
- Timestamps
- a
- lab_a
- RSD_alphaFlag
- Ht_1_rsd
- Ht_2_rsd
"""
print ('%%%%%%%%%%%%%%%%%%%%%%%%% Processing Data %%%%%%%%%%%%%%%%%%%%%%%%%%%%%')
# -------------------------------
# special handling for data types
# -------------------------------
stabilityFlag = False
if RSDtype['Selection'][0:4] == 'Wind':
stabilityFlag = True
if RSDtype['Selection']=='ZX':
stabilityFlag = True
TI_computed = inputdata['RSD_SD']/inputdata['RSD_WS']
RepTI_computed = TI_computed + 1.28 * inputdata['RSD_SD']
inputdata = inputdata.rename(columns={'RSD_TI':'RSD_TI_instrument'})
inputdata = inputdata.rename(columns={'RSD_RepTI':'RSD_RepTI_instrument'})
inputdata['RSD_TI'] = TI_computed
inputdata['RSD_RepTI'] = RepTI_computed
elif RSDtype['Selection']=='Triton':
print ('RSD type is Triton; note that the uncorrected TI output is instrument corrected')
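The two quantities computed in the `'ZX'` branch above — TI as the ratio of wind-speed standard deviation to mean wind speed, and representative TI as TI plus 1.28 times the standard deviation — can be sketched without the DataFrame machinery (illustrative only, not the TACT implementation):

```python
def turbulence_intensity(ws_sd, ws_mean):
    """TI is the wind-speed standard deviation divided by the mean wind speed."""
    return ws_sd / ws_mean


def representative_ti(ws_sd, ws_mean, k=1.28):
    """Representative TI adds k times the SD to the TI, as in the script above."""
    return turbulence_intensity(ws_sd, ws_mean) + k * ws_sd


ti = turbulence_intensity(1.2, 8.0)    # 1.2 / 8.0 = 0.15
rep_ti = representative_ti(1.2, 8.0)   # 0.15 + 1.28 * 1.2
```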
# ------------------------
# Baseline Results
# ------------------------
# Get all regressions available
reg_results = get_all_regressions(inputdata, title = 'Full comparison')
sensor, height = get_phaseiii_metadata(config_file)
stabilityClass_tke, stabilityMetric_tke, regimeBreakdown_tke = calculate_stability_TKE(inputdata)
cup_alphaFlag, stabilityClass_ane, stabilityMetric_ane, regimeBreakdown_ane, Ht_1_ane, Ht_2_ane, stabilityClass_rsd, stabilityMetric_rsd, regimeBreakdown_rsd = calculate_stability_alpha(inputdata, config_file, RSD_alphaFlag, Ht_1_rsd, Ht_2_rsd)
#------------------------
# Time Sensitivity Analysis
#------------------------
TimeTestA = pd.DataFrame()
TimeTestB = pd.DataFrame()
TimeTestC = pd.DataFrame()
if timetestFlag == True:
# A) increase % of test-train split -- check for convergence -- basic metrics recorded for the baseline and for every adjustment
splitList = np.linspace(0.0, 100.0, num = 20, endpoint =False)
print ('Testing model generation time period sensitivity...% of data')
TimeTestA_adjustments_df = {}
TimeTestA_baseline_df = pd.DataFrame()
for s in splitList[1:]:
print (str(str(s) + '%'))
inputdata_test = train_test_split(s,inputdata.copy())
TimeTestA_baseline_df, TimeTestA_adjustments_df = QuickMetrics(inputdata_test,TimeTestA_baseline_df,TimeTestA_adjustments_df,str(100-s))
# B) incrementally Add days to training set sequentially -- check for convergence
numberofObsinOneDay = 144
numberofDaysInTest = int(round(len(inputdata)/numberofObsinOneDay))
print ('Testing model generation time period sensitivity...days to train model')
print ('Number of days in the study ' + str(numberofDaysInTest))
TimeTestB_adjustments_df = {}
TimeTestB_baseline_df = pd.DataFrame()
for i in range(0,numberofDaysInTest):
print (str(str(i) + 'days'))
windowEnd = (i+1)*(numberofObsinOneDay)
inputdata_test = train_test_split(i,inputdata.copy(), stepOverride = [0,windowEnd])
TimeTestB_baseline_df, TimeTestB_adjustments_df = QuickMetrics(inputdata_test,TimeTestB_baseline_df, TimeTestB_adjustments_df,str(numberofDaysInTest-i))
# C) If experiment is greater than 3 months, slide a 6 week window (1 week step)
if len(inputdata) > (numberofObsinOneDay*90): # check to see if experiment is greater than 3 months
print ('Testing model generation time period sensitivity...6 week window pick')
windowStart = 0
windowEnd = (numberofObsinOneDay*42)
TimeTestC_adjustments_df = {}
TimeTestC_baseline_df = pd.DataFrame()
while windowEnd < len(inputdata):
print (str('After observation #' + str(windowStart) + ' ' + 'Before observation #' + str(windowEnd)))
windowStart += numberofObsinOneDay*7
windowEnd = windowStart + (numberofObsinOneDay*42)
inputdata_test = train_test_split(i,inputdata.copy(), stepOverride = [windowStart,windowEnd])
TimeTestC_baseline_df, TimeTestC_adjustments_df = QuickMetrics(inputdata_test, TimeTestC_baseline_df, TimeTestC_adjustments_df,
str('After_' + str(windowStart) + '_' + 'Before_' + str(windowEnd)))
else:
TimeTestA_baseline_df = pd.DataFrame()
TimeTestB_baseline_df = pd.DataFrame()
TimeTestC_baseline_df = pd.DataFrame()
TimeTestA_adjustments_df = {}
TimeTestB_adjustments_df = {}
TimeTestC_adjustments_df = {}
#-----------------------
# Test - Train split
#-----------------------
# random 80-20 split
inputdata = train_test_split(80.0, inputdata.copy())
inputdata_train = inputdata[inputdata['split'] == True].copy().join(Timestamps)
inputdata_test = inputdata[inputdata['split'] == False].copy().join(Timestamps)
timestamp_train = inputdata_train['Timestamp']
timestamp_test = inputdata_test['Timestamp']
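The script's `train_test_split` is a TACT helper; `simple_train_test_split` below is a hypothetical stand-in sketching what a percentage-based random split does (tagging each row as train or test), not the real implementation:

```python
import random


def simple_train_test_split(train_pct, n_rows, seed=0):
    """Randomly tag each of n_rows indices as train (True) or test (False)."""
    rng = random.Random(seed)
    n_train = round(n_rows * train_pct / 100.0)
    train_idx = set(rng.sample(range(n_rows), n_train))
    return [i in train_idx for i in range(n_rows)]


# An 80-20 split over 100 rows: 80 tagged for training, 20 left for testing.
split = simple_train_test_split(80.0, 100)
```

With a fixed seed the split is reproducible, which matters when comparing adjustment methods on the same held-out set.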
#-----------------------------
# stability class subset lists
#-----------------------------
# get reg_results by stability class: list of df's for each height
reg_results_class1 = []
reg_results_class2 = []
reg_results_class3 = []
reg_results_class4 = []
reg_results_class5 = []
reg_results_class1_alpha = {}
reg_results_class2_alpha = {}
reg_results_class3_alpha = {}
reg_results_class4_alpha = {}
reg_results_class5_alpha = {}
if RSDtype['Selection'][0:4] == 'Wind' or 'ZX' in RSDtype['Selection']:
inputdata_class1 = []
inputdata_class2 = []
inputdata_class3 = []
inputdata_class4 = []
inputdata_class5 = []
RSD_h = []
Alldata_inputdata = inputdata.copy()
for h in stabilityClass_tke.columns.to_list():
RSD_h.append(h)
inputdata_class1.append(Alldata_inputdata[Alldata_inputdata[h] == 1])
inputdata_class2.append(Alldata_inputdata[Alldata_inputdata[h] == 2])
inputdata_class3.append(Alldata_inputdata[Alldata_inputdata[h] == 3])
inputdata_class4.append(Alldata_inputdata[Alldata_inputdata[h] == 4])
inputdata_class5.append(Alldata_inputdata[Alldata_inputdata[h] == 5])
All_class_data = [inputdata_class1,inputdata_class2, inputdata_class3,
inputdata_class4, inputdata_class5]
All_class_data_clean = [inputdata_class1, inputdata_class2, inputdata_class3,
inputdata_class4, inputdata_class5]
for h in RSD_h:
idx = RSD_h.index(h)
df = inputdata_class1[idx]
reg_results_class1.append(get_all_regressions(df, title = str('TKE_stability_' + h + 'class1')))
df = inputdata_class2[idx]
reg_results_class2.append(get_all_regressions(df, title = str('TKE_stability_' + h + 'class2')))
df = inputdata_class3[idx]
reg_results_class3.append(get_all_regressions(df, title = str('TKE_stability_' + h + 'class3')))
df = inputdata_class4[idx]
reg_results_class4.append(get_all_regressions(df, title = str('TKE_stability_' + h + 'class4')))
df = inputdata_class5[idx]
reg_results_class5.append(get_all_regressions(df, title = str('TKE_stability_' + h + 'class5')))
if RSD_alphaFlag:
del inputdata_class1, inputdata_class2, inputdata_class3, inputdata_class4, inputdata_class5
Alldata_inputdata = inputdata.copy()
colName = stabilityClass_rsd.name
Alldata_inputdata[colName] = stabilityClass_rsd.values
inputdata_class1=Alldata_inputdata[Alldata_inputdata[stabilityClass_rsd.name] == 1.0]
inputdata_class2=Alldata_inputdata[Alldata_inputdata[stabilityClass_rsd.name] == 2.0]
inputdata_class3=Alldata_inputdata[Alldata_inputdata[stabilityClass_rsd.name] == 3.0]
inputdata_class4=Alldata_inputdata[Alldata_inputdata[stabilityClass_rsd.name] == 4.0]
inputdata_class5=Alldata_inputdata[Alldata_inputdata[stabilityClass_rsd.name] == 5.0]
All_class_data_alpha_RSD = [inputdata_class1,inputdata_class2, inputdata_class3,
inputdata_class4, inputdata_class5]
All_class_data_alpha_RSD_clean = [inputdata_class1.copy(),inputdata_class2.copy(), inputdata_class3.copy(),
inputdata_class4.copy(), inputdata_class5.copy()]
reg_results_class1_alpha['RSD'] = get_all_regressions(inputdata_class1, title = str('alpha_stability_RSD' + 'class1'))
reg_results_class2_alpha['RSD'] = get_all_regressions(inputdata_class2, title = str('alpha_stability_RSD' + 'class2'))
reg_results_class3_alpha['RSD'] = get_all_regressions(inputdata_class3, title = str('alpha_stability_RSD' + 'class3'))
reg_results_class4_alpha['RSD'] = get_all_regressions(inputdata_class4, title = str('alpha_stability_RSD' + 'class4'))
reg_results_class5_alpha['RSD'] = get_all_regressions(inputdata_class5, title = str('alpha_stability_RSD' + 'class5'))
if cup_alphaFlag:
del inputdata_class1, inputdata_class2, inputdata_class3, inputdata_class4, inputdata_class5
Alldata_inputdata = inputdata.copy()
colName = stabilityClass_ane.name
Alldata_inputdata[colName] = stabilityClass_ane.values
inputdata_class1 = Alldata_inputdata[Alldata_inputdata[stabilityClass_ane.name] == 1.0]
inputdata_class2 = Alldata_inputdata[Alldata_inputdata[stabilityClass_ane.name] == 2.0]
inputdata_class3 = Alldata_inputdata[Alldata_inputdata[stabilityClass_ane.name] == 3.0]
inputdata_class4 = Alldata_inputdata[Alldata_inputdata[stabilityClass_ane.name] == 4.0]
inputdata_class5 = Alldata_inputdata[Alldata_inputdata[stabilityClass_ane.name] == 5.0]
All_class_data_alpha_Ane = [inputdata_class1,inputdata_class2, inputdata_class3,
inputdata_class4, inputdata_class5]
All_class_data_alpha_Ane_clean = [inputdata_class1.copy(),inputdata_class2.copy(), inputdata_class3.copy(),
inputdata_class4.copy(), inputdata_class5.copy()]
reg_results_class1_alpha['Ane'] = get_all_regressions(inputdata_class1, title = str('alpha_stability_Ane' + 'class1'))
reg_results_class2_alpha['Ane'] = get_all_regressions(inputdata_class2, title = str('alpha_stability_Ane' + 'class2'))
reg_results_class3_alpha['Ane'] = get_all_regressions(inputdata_class3, title = str('alpha_stability_Ane' + 'class3'))
reg_results_class4_alpha['Ane'] = get_all_regressions(inputdata_class4, title = str('alpha_stability_Ane' + 'class4'))
reg_results_class5_alpha['Ane'] = get_all_regressions(inputdata_class5, title = str('alpha_stability_Ane' + 'class5'))
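The per-class subsetting above repeats the same filter once per stability class; as an illustrative sketch (not TACT code), records carrying a 1–5 class label can be grouped in a single pass:

```python
def partition_by_class(records, n_classes=5):
    """Group records into per-class lists keyed by their 'class' label (1..n)."""
    buckets = {c: [] for c in range(1, n_classes + 1)}
    for rec in records:
        buckets[rec['class']].append(rec)
    return buckets


records = [
    {'class': 1, 'ti': 0.10},
    {'class': 3, 'ti': 0.22},
    {'class': 1, 'ti': 0.31},
]
buckets = partition_by_class(records)
# buckets[1] holds two records, buckets[3] one; classes 2, 4, 5 stay empty.
```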
# ------------------------
# TI Adjustments
# ------------------------
baseResultsLists = initialize_resultsLists('')
# get number of observations in each bin
count_1mps, count_05mps = get_count_per_WSbin(inputdata, 'RSD_WS')
inputdata_train = inputdata[inputdata['split'] == True].copy().join(Timestamps)
inputdata_test = inputdata[inputdata['split'] == False].copy().join(Timestamps)
timestamp_train = inputdata_train['Timestamp']
timestamp_test = inputdata_test['Timestamp']
count_1mps_train, count_05mps_train = get_count_per_WSbin(inputdata_train, 'RSD_WS')
count_1mps_test, count_05mps_test = get_count_per_WSbin(inputdata_test, 'RSD_WS')
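`get_count_per_WSbin` is a TACT helper; a hypothetical stand-in showing how observation counts per wind-speed bin (here with 1 m/s and 0.5 m/s widths) could be tallied:

```python
from collections import Counter


def count_per_ws_bin(ws_values, bin_width):
    """Count observations falling in each wind-speed bin of the given width.

    Bin index i covers speeds in [i * bin_width, (i + 1) * bin_width).
    """
    return Counter(int(ws // bin_width) for ws in ws_values)


counts_1mps = count_per_ws_bin([3.2, 3.8, 4.1, 7.9], 1.0)
# bin 3 has two observations; bins 4 and 7 have one each.
```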
if RSDtype['Selection'][0:4] == 'Wind' or 'ZX' in RSDtype['Selection']:
primary_c = [h for h in RSD_h if 'Ht' not in h]
primary_idx = RSD_h.index(primary_c[0])
ResultsLists_stability = initialize_resultsLists('stability_')
if cup_alphaFlag:
ResultsLists_stability_alpha_Ane = initialize_resultsLists('stability_alpha_Ane')
if RSD_alphaFlag:
ResultsLists_stability_alpha_RSD = initialize_resultsLists('stability_alpha_RSD')
name_1mps_tke = []
name_1mps_alpha_Ane = []
name_1mps_alpha_RSD = []
name_05mps_tke = []
name_05mps_alpha_Ane = []
name_05mps_alpha_RSD = []
count_1mps_tke = []
count_1mps_alpha_Ane = []
count_1mps_alpha_RSD = []
count_05mps_tke = []
count_05mps_alpha_Ane = []
count_05mps_alpha_RSD = []
for c in range(0,len(All_class_data)):
name_1mps_tke.append(str('count_1mps_class_' + str(c) + '_tke'))
name_1mps_alpha_Ane.append(str('count_1mps_class_' + str(c) + '_alpha_Ane'))
name_1mps_alpha_RSD.append(str('count_1mps_class_' + str(c) + '_alpha_RSD'))
name_05mps_tke.append(str('count_05mps_class_' + str(c) + '_tke'))
name_05mps_alpha_Ane.append(str('count_05mps_class_' + str(c) + '_alpha_Ane'))
name_05mps_alpha_RSD.append(str('count_05mps_class_' + str(c) + '_alpha_RSD'))
try:
c_1mps_tke, c_05mps_tke = get_count_per_WSbin(All_class_data[c][primary_idx], 'RSD_WS')
count_1mps_tke.append(c_1mps_tke)
count_05mps_tke.append(c_05mps_tke)
except:
count_1mps_tke.append(None)
count_05mps_tke.append(None)
try:
c_1mps_alpha_Ane, c_05mps_alpha_Ane = get_count_per_WSbin(All_class_data_alpha_Ane[c], 'RSD_WS')
count_1mps_alpha_Ane.append(c_1mps_alpha_Ane)
count_05mps_alpha_Ane.append(c_05mps_alpha_Ane)
except:
count_1mps_alpha_Ane.append(None)
count_05mps_alpha_Ane.append(None)
try:
c_1mps_alpha_RSD, c_05mps_alpha_RSD = get_count_per_WSbin(All_class_data_alpha_RSD[c], 'RSD_WS')
count_1mps_alpha_RSD.append(c_1mps_alpha_RSD)
count_05mps_alpha_RSD.append(c_05mps_alpha_RSD)
except:
count_1mps_alpha_RSD.append(None)
count_05mps_alpha_RSD.append(None)
# initialize 10 minute output
TI_10minuteAdjusted = pd.DataFrame()
for method in adjustmentsMetadata:
# ************************************ #
# Site Specific Simple Adjustment (SS-S)
if method != 'SS-S':
pass
elif method == 'SS-S' and adjustmentsMetadata['SS-S'] == False:
pass
else:
print ('Applying Adjustment Method: SS-S')
inputdata_adj, lm_adj, m, c = perform_SS_S_adjustment(inputdata.copy())
print("SS-S: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS-S'
adjustmentName = 'SS_S'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: SS-S by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_SS_S_adjustment(item[primary_idx].copy())
print("SS-S: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-S' + '_TKE_' + 'class_' + str(className))
adjustmentName = str('SS-S'+ '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-S by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
print (str('class ' + str(className)))
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_SS_S_adjustment(item.copy())
print ("SS-S: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-S' + '_' + 'class_' + str(className))
adjustmentName = str('SS-S' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-S by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_SS_S_adjustment(item.copy())
print ("SS-S: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-S' + '_alphaCup_' + 'class_' + str(className))
adjustmentName = str('SS-S' + '_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ********************************************** #
# Site Specific Simple + Filter Adjustment (SS-SF)
if method != 'SS-SF':
pass
elif method == 'SS-SF' and adjustmentsMetadata['SS-SF'] == False:
pass
else:
print ('Applying Adjustment Method: SS-SF')
inputdata_adj, lm_adj, m, c = perform_SS_SF_adjustment(inputdata.copy())
print("SS-SF: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS-SF'
adjustmentName = 'SS_SF'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind' or 'ZX' in RSDtype['Selection']:
print ('Applying Adjustment Method: SS-SF by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_SS_SF_adjustment(item[primary_idx].copy())
print("SS-SF: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-SF' + '_' + 'class_' + str(className))
adjustmentName = str('SS_SF' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-SF by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_SS_SF_adjustment(item.copy())
print ("SS-SF: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-SF' + '_' + 'class_' + str(className))
adjustmentName = str('SS_SF' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-SF by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_SS_SF_adjustment(item.copy())
print ("SS-SF: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-SF' + '_' + 'class_' + str(className))
adjustmentName = str('SS_SF' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ************************************ #
# Site Specific Simple Adjustment (SS-SS) combining stability classes adjusted differently
if method != 'SS-SS':
pass
elif method == 'SS-SS' and adjustmentsMetadata['SS-SS'] == False:
pass
elif RSDtype['Selection'][0:4] != 'Wind' and 'ZX' not in RSDtype['Selection']:
pass
else:
print ('Applying Adjustment Method: SS-SS')
inputdata_adj, lm_adj, m, c = perform_SS_SS_adjustment(inputdata.copy(),All_class_data,primary_idx)
print("SS-SS: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS-SS'
adjustmentName = 'SS_SS'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: SS-SS by stability class (TKE). SAME as Baseline')
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
print("SS-SS: y = " + str(m) + " * x + " + str(c))
adjustmentName = str('SS_SS' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-SS by stability class Alpha w/ RSD. SAME as Baseline')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
print ("SS-SS: y = " + str(m) + " * x + " + str(c))
adjustmentName = str('SS_SS' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-SS by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
print ("SS-SS: y = " + str(m) + " * x + " + str(c))
emptyclassFlag = False
adjustmentName = str('SS_SS' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ******************************************* #
# Site Specific WindSpeed Adjustment (SS-WS)
if method != 'SS-WS':
pass
elif method == 'SS-WS' and adjustmentsMetadata['SS-WS'] == False:
pass
else:
print ('Applying Adjustment Method: SS-WS')
inputdata_adj, lm_adj, m, c = perform_SS_WS_adjustment(inputdata.copy())
print("SS-WS: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS-WS'
adjustmentName = 'SS_WS'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind' or 'ZX' in RSDtype['Selection']:
print ('Applying Adjustment Method: SS-WS by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_SS_WS_adjustment(item[primary_idx].copy())
print("SS-WS: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-WS' + '_' + 'class_' + str(className))
adjustmentName = str('SS_WS' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-WS by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_SS_WS_adjustment(item.copy())
print ("SS-WS: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-WS' + '_' + 'class_' + str(className))
adjustmentName = str('SS_WS' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-WS by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_SS_WS_adjustment(item.copy())
print ("SS-WS: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-WS' + '_' + 'class_' + str(className))
emptyclassFlag = False
adjustmentName = str('SS_WS' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ******************************************* #
# Site Specific Comprehensive Adjustment (SS-WS-Std)
if method != 'SS-WS-Std':
pass
elif method == 'SS-WS-Std' and adjustmentsMetadata['SS-WS-Std'] == False:
pass
else:
print ('Applying Adjustment Method: SS-WS-Std')
inputdata_adj, lm_adj, m, c = perform_SS_WS_Std_adjustment(inputdata.copy())
print("SS-WS-Std: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS-WS-Std'
adjustmentName = 'SS_WS_Std'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind' or 'ZX' in RSDtype['Selection']:
print ('Applying Adjustment Method: SS-WS-Std by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_SS_WS_Std_adjustment(item[primary_idx].copy())
print("SS-WS-Std: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-WS-Std' + '_' + 'class_' + str(className))
adjustmentName = str('SS_WS_Std' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-WS-Std by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_SS_WS_Std_adjustment(item.copy())
print ("SS-WS-Std: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-WS-Std' + '_' + 'class_' + str(className))
adjustmentName = str('SS_WS_Std' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-WS-Std by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_SS_WS_Std_adjustment(item.copy())
print ("SS-WS-Std: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-WS-Std' + '_' + 'class_' + str(className))
emptyclassFlag = False
adjustmentName = str('SS_WS_Std' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# **************************************************************** #
# Site Specific LTERRA for WC 1HZ Data Adjustment (SS-LTERRA-WC-1HZ)
if method != 'SS-LTERRA-WC-1HZ':
pass
elif method == 'SS-LTERRA-WC-1HZ' and adjustmentsMetadata['SS-LTERRA-WC-1HZ'] == False:
pass
else:
print ('Applying Adjustment Method: SS-LTERRA-WC-1HZ')
# ******************************************************************* #
# Site Specific LTERRA WC Machine Learning Adjustment (SS-LTERRA-MLa)
# Random Forest Regression with no ancillary columns
if method != 'SS-LTERRA-MLa':
pass
elif method == 'SS-LTERRA-MLa' and adjustmentsMetadata['SS-LTERRA-MLa'] == False:
pass
else:
print ('Applying Adjustment Method: SS-LTERRA-MLa')
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_ML_adjustment(inputdata.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS_LTERRA_MLa'
adjustmentName = 'SS_LTERRA_MLa'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: SS-LTERRA MLa by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_ML_adjustment(item[primary_idx].copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLa' + '_' + 'class_' + str(className))
adjustmentName = str('SS_LTERRA_MLa' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-LTERRA MLa by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_ML_adjustment(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLa' + '_' + 'class_' + str(className))
adjustmentName = str('SS_LTERRA_MLa' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-LTERRA_MLa by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_ML_adjustment(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLa' + '_' + 'class_' + str(className))
emptyclassFlag = False
adjustmentName = str('SS_LTERRA_MLa' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ************************************************************************************ #
# Site Specific LTERRA WC (w/ stability) Machine Learning Adjustment (SS-LTERRA_MLc)
if method != 'SS-LTERRA-MLc':
pass
elif method == 'SS-LTERRA-MLc' and adjustmentsMetadata['SS-LTERRA-MLc'] == False:
pass
else:
print ('Applying Adjustment Method: SS-LTERRA-MLc')
all_trainX_cols = ['x_train_TI', 'x_train_TKE','x_train_WS','x_train_DIR','x_train_Hour']
all_trainY_cols = ['y_train']
all_testX_cols = ['x_test_TI','x_test_TKE','x_test_WS','x_test_DIR','x_test_Hour']
all_testY_cols = ['y_test']
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(inputdata.copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS_LTERRA_MLc'
adjustmentName = 'SS_LTERRA_MLc'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj, Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: SS-LTERRA_MLc by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(item[primary_idx].copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLc' + '_' + 'class_' + str(className))
adjustmentName = str('SS_LTERRA_MLc' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-LTERRA_MLc by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(item.copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLc' + '_' + 'class_' + str(className))
adjustmentName = str('SS_LTERRA_MLc' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-LTERRA_MLc by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(item.copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLc' + '_' + 'class_' + str(className))
emptyclassFlag = False
adjustmentName = str('SS_LTERRA_MLc' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# *********************** #
# Site Specific SS-LTERRA-MLb
if method != 'SS-LTERRA-MLb':
pass
elif method == 'SS-LTERRA-MLb' and adjustmentsMetadata['SS-LTERRA-MLb'] == False:
pass
else:
print ('Applying Adjustment Method: SS-LTERRA-MLb')
all_trainX_cols = ['x_train_TI', 'x_train_TKE']
all_trainY_cols = ['y_train']
all_testX_cols = ['x_test_TI','x_test_TKE']
all_testY_cols = ['y_test']
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(inputdata.copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS_LTERRA_MLb'
adjustmentName = 'SS_LTERRA_MLb'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj, Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: SS-LTERRA_MLb by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(item[primary_idx].copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLb' + '_' + 'class_' + str(className))
adjustmentName = str('SS_LTERRA_MLb' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-LTERRA_MLb by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(item.copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLb' + '_' + 'class_' + str(className))
adjustmentName = str('SS_LTERRA_MLb' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-LTERRA_MLb by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_SS_LTERRA_S_ML_adjustment(item.copy(),all_trainX_cols,all_trainY_cols,all_testX_cols,all_testY_cols)
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS_LTERRA_MLb' + '_' + 'class_' + str(className))
emptyclassFlag = False
adjustmentName = str('SS_LTERRA_MLb' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# *********************** #
# TI Extrapolation (TI-Ext)
if method != 'TI-Extrap':
pass
elif method == 'TI-Extrap' and adjustmentsMetadata['TI-Extrap'] == False:
pass
else:
print ('Found enough data to perform extrapolation comparison')
blockPrint()
# Get extrapolation height
height_extrap = float(extrap_metadata['height'][extrap_metadata['type'] == 'extrap'])
# Extrapolate
inputdata_adj, lm_adj, shearTimeseries= perform_TI_extrapolation(inputdata.copy(), extrap_metadata,
extrapolation_type, height)
adjustmentName = 'TI_EXTRAP'
lm_adj['adjustment'] = adjustmentName
inputdataEXTRAP = inputdata_adj.copy()
inputdataEXTRAP, baseResultsLists = extrap_configResult(inputdataEXTRAP, baseResultsLists, method,lm_adj)
if RSDtype['Selection'][0:4] == 'Wind':
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, shearTimeseries= perform_TI_extrapolation(item[primary_idx].copy(), extrap_metadata,
extrapolation_type, height)
lm_adj['adjustment'] = str('TI_EXT_class1' + '_TKE_' + 'class_' + str(className))
inputdataEXTRAP = inputdata_adj.copy()
inputdataEXTRAP, ResultsLists_class = extrap_configResult(inputdataEXTRAP, ResultsLists_class,
method, lm_adj, appendString = 'class_')
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if cup_alphaFlag:
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, shearTimeseries= perform_TI_extrapolation(item.copy(), extrap_metadata,
extrapolation_type, height)
lm_adj['adjustment'] = str('TI_Ane_class1' + '_alphaCup_' + 'class_' + str(className))
inputdataEXTRAP = inputdata_adj.copy()
inputdataEXTRAP, ResultsLists_class_alpha_Ane = extrap_configResult(inputdataEXTRAP,
ResultsLists_class_alpha_Ane, method,
lm_adj, appendString = 'class_alpha_Ane')
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane,
ResultsLists_class_alpha_Ane, 'alpha_Ane')
if RSD_alphaFlag:
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, shearTimeseries= perform_TI_extrapolation(item.copy(), extrap_metadata,
extrapolation_type, height)
lm_adj['adjustment'] = str('TI_RSD_class1' + '_alphaRSD_' + 'class_' + str(className))
inputdataEXTRAP = inputdata_adj.copy()
inputdataEXTRAP, ResultsLists_class_alpha_RSD = extrap_configResult(inputdataEXTRAP,
ResultsLists_class_alpha_RSD, method,
lm_adj, appendString = 'class_alpha_RSD')
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD,
ResultsLists_class_alpha_RSD, 'alpha_RSD')
# Add extra info to meta data and reformat
if extrapolation_type == 'simple':
desc = 'No truth measurement at extrapolation height'
else:
desc = 'Truth measurement available at extrapolation height'
extrap_metadata = (extrap_metadata
.append({'type': np.nan, 'height': np.nan, 'num': np.nan},
ignore_index=True)
.append(pd.DataFrame([['extrapolation type', extrapolation_type, desc]],
columns=extrap_metadata.columns))
.rename(columns={'type': 'Type',
'height': 'Height (m)',
'num': 'Comparison Height Number'}))
enablePrint()
# ************************************************** #
# Histogram Matching
if method != 'SS-Match':
pass
elif method == 'SS-Match' and adjustmentsMetadata['SS-Match'] == False:
pass
else:
print ('Applying Match algorithm: SS-Match')
inputdata_adj, lm_adj = perform_match(inputdata.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS-Match'
adjustmentName = 'SS_Match'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: SS-Match by stability class (TKE)')
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj = perform_match(item[primary_idx].copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-Match' + '_' + 'class_' + str(className))
adjustmentName = str('SS_Match' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-Match by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj = perform_match(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-Match' + '_' + 'class_' + str(className))
adjustmentName = str('SS_Match' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-Match by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj = perform_match(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-Match' + '_' + 'class_' + str(className))
adjustmentName = str('SS_Match' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ************************************************** #
# Histogram Matching Input Corrected
if method != 'SS-Match2':
pass
elif method == 'SS-Match2' and adjustmentsMetadata['SS-Match2'] == False:
pass
else:
print ('Applying input match algorithm: SS-Match2')
inputdata_adj, lm_adj = perform_match_input(inputdata.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'SS-Match2'
adjustmentName = 'SS_Match2'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: SS-Match2 by stability class (TKE)')
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj = perform_match_input(item[primary_idx].copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-Match2' + '_' + 'class_' + str(className))
adjustmentName = str('SS_Match2' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: SS-Match2 by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj = perform_match_input(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-Match2' + '_' + 'class_' + str(className))
adjustmentName = str('SS_Match2' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: SS-Match2 by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj = perform_match_input(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('SS-Match2' + '_' + 'class_' + str(className))
adjustmentName = str('SS_Match2' + '_alphaCup_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ************************************************** #
# Global Simple Phase II mean Linear Regressions (G-Sa) + project
'''
RSD_TI = .984993 * RSD_TI + .087916
'''
if method != 'G-Sa':
pass
elif method == 'G-Sa' and adjustmentsMetadata['G-Sa'] == False:
pass
else:
print ('Applying Adjustment Method: G-Sa')
override = False
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(inputdata.copy(),override)
print("G-Sa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'G-Sa'
adjustmentName = 'G_Sa'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: G-Sa by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(item[primary_idx].copy(),override)
print("G-Sa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-Sa' + '_TKE_' + 'class_' + str(className))
adjustmentName = str('G-Sa' + '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: G-Sa by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(item.copy(),override)
print ("G-Sa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-Sa' + '_' + 'class_' + str(className))
adjustmentName = str('G-Sa' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: G-Sa by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(item.copy(), override)
print("G-Sa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-Sa' + '_alphaCup_' + 'class_' + str(className))
adjustmentName = str('G-Sa' + '_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
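The G-Sa family above repeatedly applies the linear map y = m*x + c to a TI series, either fitting m and c or taking them from an `override` pair. A minimal standalone sketch of that pattern (hypothetical helper; the project's `perform_G_Sa_adjustment` also rebuilds TI columns and regression stats, which this omits):

```python
import numpy as np

def linear_adjust(rsd_ti, ref_ti=None, override=None):
    """Apply y = m*x + c to an RSD TI series.

    `override` supplies a fixed [m, c] pair (as in the G-SFa block);
    otherwise m and c come from a least-squares fit of ref_ti on rsd_ti.
    """
    x = np.asarray(rsd_ti, dtype=float)
    if override is not None:
        m, c = override
    else:
        m, c = np.polyfit(x, np.asarray(ref_ti, dtype=float), 1)
    return m * x + c, m, c

adjusted, m, c = linear_adjust([0.10, 0.12, 0.15], override=[0.7086, 0.0225])
print("G-SFa: y = " + str(m) + " * x + " + str(c))
```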
# ******************************************************** #
# Global Simple w/filter Phase II Linear Regressions (G-SFa) + project
# Check these values, but for WC m = 0.7086 and c = 0.0225
if method != 'G-SFa':
pass
elif method == 'G-SFa' and adjustmentsMetadata['G-SFa'] == False:
pass
elif RSDtype['Selection'][0:4] != 'Wind':
pass
else:
print ('Applying Adjustment Method: G-SFa')
override = [0.7086, 0.0225]
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(inputdata.copy(),override)
print("G-SFa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'G-SFa'
adjustmentName = 'G_SFa'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: G-SFa by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(item[primary_idx].copy(),override)
print("G-SFa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-SFa' + '_TKE_' + 'class_' + str(className))
adjustmentName = str('G-SFa'+ '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsList_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: G-SFa by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(item.copy(), override)
print("G-SFa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-SFa' + '_' + 'class_' + str(className))
adjustmentName = str('G-SFa' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: G-SFa by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_G_Sa_adjustment(item.copy(), override)
print("G-SFa: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-SFa' + '_alphaCup_' + 'class_' + str(className))
adjustmentName = str('G-SFa' + '_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ************************************************ #
# Global Standard Deviation and WS adjustment (G-SFc)
if method != 'G-SFc':
pass
elif method == 'G-SFc' and adjustmentsMetadata['G-SFc'] == False:
pass
elif RSDtype['Selection'][0:4] != 'Wind':
pass
else:
print ('Applying Adjustment Method: G-SFc')
inputdata_adj, lm_adj, m, c = perform_G_SFc_adjustment(inputdata.copy())
print("G-SFc: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'G-SFc'
adjustmentName = 'G_SFc'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: G-SFc by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
inputdata_adj, lm_adj, m, c = perform_G_SFc_adjustment(item[primary_idx].copy())
print("G-SFc: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-SFc' + '_TKE_' + 'class_' + str(className))
adjustmentName = str('G-SFc'+ '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsList_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: G-SFc by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
inputdata_adj, lm_adj, m, c = perform_G_SFc_adjustment(item.copy())
print("G-SFc: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-SFc' + '_' + 'class_' + str(className))
adjustmentName = str('G-SFc' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: G-SFc by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
inputdata_adj, lm_adj, m, c = perform_G_SFc_adjustment(item.copy())
print("G-SFc: y = " + str(m) + " * x + " + str(c))
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-SFc' + '_alphaCup_' + 'class_' + str(className))
adjustmentName = str('G-SFc' + '_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ************************ #
# Global Comprehensive (G-C)
'''
based on empirical calibrations by EON
'''
if method != 'G-C':
pass
elif method == 'G-C' and adjustmentsMetadata['G-C'] == False:
pass
else:
print ('Applying Adjustment Method: G-C')
inputdata_adj, lm_adj, m, c = perform_G_C_adjustment(inputdata.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = 'G-C'
adjustmentName = 'G_C'
baseResultsLists = populate_resultsLists(baseResultsLists, '', adjustmentName, lm_adj, inputdata_adj,
Timestamps, method)
TI_10minuteAdjusted = record_TIadj(adjustmentName,inputdata_adj,Timestamps, method, TI_10minuteAdjusted, emptyclassFlag=False)
if RSDtype['Selection'][0:4] == 'Wind':
print ('Applying Adjustment Method: G-C by stability class (TKE)')
# stability subset output for primary height (all classes)
ResultsLists_class = initialize_resultsLists('class_')
className = 1
for item in All_class_data:
print (str('class ' + str(className)))
inputdata_adj, lm_adj, m, c = perform_G_C_adjustment(item[primary_idx].copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-C' + '_TKE_' + 'class_' + str(className))
adjustmentName = str('G-C'+ '_TKE_' + str(className))
ResultsLists_class = populate_resultsLists(ResultsLists_class, 'class_', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsList_stability = populate_resultsLists_stability(ResultsLists_stability, ResultsLists_class, '')
if RSD_alphaFlag:
print ('Applying Adjustment Method: G-C by stability class Alpha w/ RSD')
ResultsLists_class_alpha_RSD = initialize_resultsLists('class_alpha_RSD')
className = 1
for item in All_class_data_alpha_RSD:
print (str('class ' + str(className)))
inputdata_adj, lm_adj, m, c = perform_G_C_adjustment(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-C' + '_' + 'class_' + str(className))
adjustmentName = str('G-C' + '_alphaRSD_' + str(className))
ResultsLists_class_alpha_RSD = populate_resultsLists(ResultsLists_class_alpha_RSD, 'class_alpha_RSD', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_RSD = populate_resultsLists_stability(ResultsLists_stability_alpha_RSD, ResultsLists_class_alpha_RSD, 'alpha_RSD')
if cup_alphaFlag:
print ('Applying adjustment Method: G-C by stability class Alpha w/cup')
ResultsLists_class_alpha_Ane = initialize_resultsLists('class_alpha_Ane')
className = 1
for item in All_class_data_alpha_Ane:
print (str('class ' + str(className)))
inputdata_adj, lm_adj, m, c = perform_G_C_adjustment(item.copy())
lm_adj['sensor'] = sensor
lm_adj['height'] = height
lm_adj['adjustment'] = str('G-C' + '_alphaCup_' + 'class_' + str(className))
adjustmentName = str('G-C' + '_' + str(className))
ResultsLists_class_alpha_Ane = populate_resultsLists(ResultsLists_class_alpha_Ane, 'class_alpha_Ane', adjustmentName, lm_adj,
inputdata_adj, Timestamps, method)
className += 1
ResultsLists_stability_alpha_Ane = populate_resultsLists_stability(ResultsLists_stability_alpha_Ane, ResultsLists_class_alpha_Ane, 'alpha_Ane')
# ************************ #
# Global Comprehensive (G-Match)
if method != 'G-Match':
pass
elif method == 'G-Match' and adjustmentsMetadata['G-Match'] == False:
pass
else:
print ('Applying Adjustment Method: G-Match')
# ************************ #
# Global Comprehensive (G-Ref-S)
if method != 'G-Ref-S':
pass
elif method == 'G-Ref-S' and adjustmentsMetadata['G-Ref-S'] == False:
pass
else:
print ('Applying Adjustment Method: G-Ref-S')
# ************************ #
# Global Comprehensive (G-Ref-Sf)
if method != 'G-Ref-Sf':
pass
elif method == 'G-Ref-Sf' and adjustmentsMetadata['G-Ref-Sf'] == False:
pass
else:
print ('Applying Adjustment Method: G-Ref-Sf')
# ************************ #
# Global Comprehensive (G-Ref-SS)
if method != 'G-Ref-SS':
pass
elif method == 'G-Ref-SS' and adjustmentsMetadata['G-Ref-SS'] == False:
pass
else:
print ('Applying Adjustment Method: G-Ref-SS')
# ************************ #
# Global Comprehensive (G-Ref-SS-S)
if method != 'G-Ref-SS-S':
pass
elif method == 'G-Ref-SS-S' and adjustmentsMetadata['G-Ref-SS-S'] == False:
pass
else:
print ('Applying Adjustment Method: G-Ref-SS-S')
# ************************ #
# Global Comprehensive (G-Ref-WS-Std)
if method != 'G-Ref-WS-Std':
pass
elif method == 'G-Ref-WS-Std' and adjustmentsMetadata['G-Ref-WS-Std'] == False:
pass
else:
print ('Applying Adjustment Method: G-Ref-WS-Std')
# ***************************************** #
# Global LTERRA WC 1Hz Data (G-LTERRA_WC_1Hz)
if method != 'G-LTERRA_WC_1Hz':
pass
elif method == 'G-LTERRA_WC_1Hz' and adjustmentsMetadata['G-LTERRA_WC_1Hz'] == False:
pass
else:
print ('Applying Adjustment Method: G-LTERRA_WC_1Hz')
# ************************************************ #
# Global LTERRA ZX Machine Learning (G-LTERRA_ZX_ML)
if method != 'G-LTERRA_ZX_ML':
pass
elif adjustmentsMetadata['G-LTERRA_ZX_ML'] == False:
pass
else:
print ('Applying Adjustment Method: G-LTERRA_ZX_ML')
# ************************************************ #
# Global LTERRA WC Machine Learning (G-LTERRA_WC_ML)
if method != 'G-LTERRA_WC_ML':
pass
elif adjustmentsMetadata['G-LTERRA_WC_ML'] == False:
pass
else:
print ('Applying Adjustment Method: G-LTERRA_WC_ML')
# ************************************************** #
# Global LTERRA WC w/Stability 1Hz (G-LTERRA_WC_S_1Hz)
if method != 'G-LTERRA_WC_S_1Hz':
pass
elif method == 'G-LTERRA_WC_S_1Hz' and adjustmentsMetadata['G-LTERRA_WC_S_1Hz'] == False:
pass
else:
print ('Applying Adjustment Method: G-LTERRA_WC_S_1Hz')
# ************************************************************** #
# Global LTERRA WC w/Stability Machine Learning (G-LTERRA_WC_S_ML)
if method != 'G-LTERRA_WC_S_ML':
pass
elif method == 'G-LTERRA_WC_S_ML' and adjustmentsMetadata['G-LTERRA_WC_S_ML'] == False:
pass
else:
print ('Applying Adjustment Method: G-LTERRA_WC_S_ML')
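Every method block above gates itself with the same `if/elif/pass` chain: skip unless `method` names this adjustment, the adjustment is enabled in `adjustmentsMetadata`, and (for some methods) the RSD selection starts with 'Wind'. An equivalent single-predicate sketch (hypothetical helper, not part of the project API):

```python
def adjustment_enabled(method, name, adjustments_metadata, rsd_selection,
                       needs_wind=False):
    """True only when `name` is the requested method, is enabled in the
    metadata dict, and (if required) the RSD selection starts with 'Wind'."""
    if method != name:
        return False
    if not adjustments_metadata.get(name, False):
        return False
    if needs_wind and rsd_selection[0:4] != 'Wind':
        return False
    return True

meta = {'G-SFa': True, 'G-SFc': False}
print(adjustment_enabled('G-SFa', 'G-SFa', meta, 'WindCube', needs_wind=True))  # True
print(adjustment_enabled('G-SFc', 'G-SFc', meta, 'WindCube'))                   # False
```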
if RSD_alphaFlag:
pass
else:
ResultsLists_stability_alpha_RSD = ResultsList_stability
if cup_alphaFlag:
pass
else:
ResultsLists_stability_alpha_Ane = ResultsList_stability
if RSDtype['Selection'][0:4] != 'Wind':
reg_results_class1 = np.nan
reg_results_class2 = np.nan
reg_results_class3 = np.nan
reg_results_class4 = np.nan
reg_results_class5 = np.nan
TI_MBEList_stability = np.nan
TI_DiffList_stability = np.nan
TI_DiffRefBinsList_stability = np.nan
TI_RMSEList_stability = np.nan
RepTI_MBEList_stability = np.nan
RepTI_DiffList_stability = np.nan
RepTI_DiffRefBinsList_stability = np.nan
RepTI_RMSEList_stability = np.nan
rep_TI_results_1mps_List_stability = np.nan
rep_TI_results_05mps_List_stability = np.nan
TIBinList_stability = np.nan
TIRefBinList_stability = np.nan
total_StatsList_stability = np.nan
belownominal_statsList_stability = np.nan
abovenominal_statsList_stability = np.nan
lm_adjList_stability = np.nan
adjustmentTagList_stability = np.nan
Distibution_statsList_stability = np.nan
sampleTestsLists_stability = np.nan
# Write 10 minute Adjusted data to a csv file
outpath_dir = os.path.dirname(results_filename)
outpath_file = os.path.basename(results_filename)
outpath_file = str('TI_10minuteAdjusted_' + outpath_file.split('.xlsx')[0] + '.csv')
out_dir = os.path.join(outpath_dir,outpath_file)
TI_10minuteAdjusted.to_csv(out_dir)
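The CSV writer above derives its filename from `results_filename` by prefixing the basename with `TI_10minuteAdjusted_` and swapping the `.xlsx` suffix for `.csv`. The same derivation in isolation (the input path is made up):

```python
import os

results_filename = os.path.join('results', 'site_A.xlsx')  # hypothetical input
outpath_dir = os.path.dirname(results_filename)
outpath_file = os.path.basename(results_filename)
outpath_file = 'TI_10minuteAdjusted_' + outpath_file.split('.xlsx')[0] + '.csv'
out_path = os.path.join(outpath_dir, outpath_file)
print(out_path)  # results/TI_10minuteAdjusted_site_A.csv on POSIX systems
```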
write_all_resultstofile(reg_results, baseResultsLists, count_1mps, count_05mps, count_1mps_train, count_05mps_train,
count_1mps_test, count_05mps_test, name_1mps_tke, name_1mps_alpha_Ane, name_1mps_alpha_RSD,
name_05mps_tke, name_05mps_alpha_Ane, name_05mps_alpha_RSD, count_05mps_tke, count_05mps_alpha_Ane, count_05mps_alpha_RSD,
count_1mps_tke, count_1mps_alpha_Ane, count_1mps_alpha_RSD,results_filename, siteMetadata, filterMetadata,
Timestamps,timestamp_train,timestamp_test,regimeBreakdown_tke, regimeBreakdown_ane, regimeBreakdown_rsd,
Ht_1_ane, Ht_2_ane, extrap_metadata, reg_results_class1, reg_results_class2, reg_results_class3,
reg_results_class4, reg_results_class5,reg_results_class1_alpha, reg_results_class2_alpha, reg_results_class3_alpha,
reg_results_class4_alpha, reg_results_class5_alpha, Ht_1_rsd, Ht_2_rsd, ResultsLists_stability, ResultsLists_stability_alpha_RSD,
ResultsLists_stability_alpha_Ane, stabilityFlag, cup_alphaFlag, RSD_alphaFlag, TimeTestA_baseline_df, TimeTestB_baseline_df,
TimeTestC_baseline_df,TimeTestA_adjustments_df,TimeTestB_adjustments_df,TimeTestC_adjustments_df)
53481025b49fb6f705e4c00b2eac06c755bfbeb3 | 410 | py | Python | stonesoup/sensormanager/__init__.py | n0wis/Stone-Soup | 84d14a597d6e0450eb4f2972d7087368ae768cd8 | [
"MIT"
] | 1 | 2021-12-02T00:17:21.000Z | 2021-12-02T00:17:21.000Z | stonesoup/sensormanager/__init__.py | n0wis/Stone-Soup | 84d14a597d6e0450eb4f2972d7087368ae768cd8 | [
"MIT"
] | null | null | null | stonesoup/sensormanager/__init__.py | n0wis/Stone-Soup | 84d14a597d6e0450eb4f2972d7087368ae768cd8 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from .base import SensorManager, RandomSensorManager, BruteForceSensorManager
from .optimise import _OptimizeSensorManager, OptimizeBruteSensorManager, \
OptimizeBasinHoppingSensorManager
__all__ = ['SensorManager', 'RandomSensorManager', 'BruteForceSensorManager',
'_OptimizeSensorManager', 'OptimizeBruteSensorManager',
'OptimizeBasinHoppingSensorManager']
725e53797370f0a5b727453d511fee8c64b51775 | 229 | py | Python | mlrun/frameworks/onnx/__init__.py | hayesgb/mlrun | 9a8b469b37d7d74f2d04dc956b2966f88fe4e890 | [
"Apache-2.0"
] | null | null | null | mlrun/frameworks/onnx/__init__.py | hayesgb/mlrun | 9a8b469b37d7d74f2d04dc956b2966f88fe4e890 | [
"Apache-2.0"
] | null | null | null | mlrun/frameworks/onnx/__init__.py | hayesgb/mlrun | 9a8b469b37d7d74f2d04dc956b2966f88fe4e890 | [
"Apache-2.0"
] | null | null | null | # flake8: noqa - this is until we take care of the F401 violations with respect to __all__ & sphinx
from mlrun.frameworks.onnx.model_handler import ONNXModelHandler
from mlrun.frameworks.onnx.model_server import ONNXModelServer
72681b5fed4c36c3fc5b0e3587b61fcc54ea2bc9 | 205 | py | Python | app/utils/__init__.py | ta4tsering/pyrrha-bo | d5afbe4b37d4d2ad5b5bb4129b1dccaeb50c9b17 | [
"MIT"
] | 1 | 2020-08-30T04:36:25.000Z | 2020-08-30T04:36:25.000Z | app/utils/__init__.py | ta4tsering/pyrrha-bo | d5afbe4b37d4d2ad5b5bb4129b1dccaeb50c9b17 | [
"MIT"
] | null | null | null | app/utils/__init__.py | ta4tsering/pyrrha-bo | d5afbe4b37d4d2ad5b5bb4129b1dccaeb50c9b17 | [
"MIT"
] | 1 | 2020-08-30T04:33:07.000Z | 2020-08-30T04:33:07.000Z | from .pagination import int_or
from .tsv import StringDictReader
from .forms import string_to_none
class PyrrhaError(Exception):
"""Raised when errors specific to this application occur."""
pass
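A minimal, self-contained use of the exception defined above (the class is restated for runnability; the error message is made up):

```python
class PyrrhaError(Exception):
    """Raised when errors specific to this application occur."""

try:
    raise PyrrhaError("unable to parse TSV header")
except PyrrhaError as exc:
    caught = str(exc)
print(caught)  # unable to parse TSV header
```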
729f7072199d2bf5b4f014bf6e45c79c7f25b1b7 | 58 | py | Python | python/code_challenges/depth_first/test_depth_first.py | dina-fouad/data-structures-and-algorithms | 53204f1c6d841ffe00849ff5f0fd0cd0469b6469 | [
"MIT"
] | null | null | null | python/code_challenges/depth_first/test_depth_first.py | dina-fouad/data-structures-and-algorithms | 53204f1c6d841ffe00849ff5f0fd0cd0469b6469 | [
"MIT"
] | 9 | 2021-08-01T19:29:35.000Z | 2021-09-05T19:58:03.000Z | python/code_challenges/depth_first/test_depth_first.py | dina-fouad/data-structures-and-algorithms | 53204f1c6d841ffe00849ff5f0fd0cd0469b6469 | [
"MIT"
] | null | null | null | from depth_first.depth_first import graphs
import pytest
72c589668c22b19c061a7ea2fffc4554eb28a24c | 104,347 | py | Python | librespot/proto/playlist4_external_pb2.py | JeffmeisterJ/librespot-python | 0e0e1db65aa40262bd13479b97f81ae8c29ae049 | [
"Apache-2.0"
] | null | null | null | librespot/proto/playlist4_external_pb2.py | JeffmeisterJ/librespot-python | 0e0e1db65aa40262bd13479b97f81ae8c29ae049 | [
"Apache-2.0"
] | null | null | null | librespot/proto/playlist4_external_pb2.py | JeffmeisterJ/librespot-python | 0e0e1db65aa40262bd13479b97f81ae8c29ae049 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: playlist4_external.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='playlist4_external.proto',
package='spotify.playlist4.proto',
syntax='proto2',
serialized_options=
b'\n\025com.spotify.playlist4B\021Playlist4ApiProtoH\002',
create_key=_descriptor._internal_create_key,
serialized_pb=
b'\n\x18playlist4_external.proto\x12\x17spotify.playlist4.proto\"P\n\x04Item\x12\x0b\n\x03uri\x18\x01 \x02(\t\x12;\n\nattributes\x18\x02 \x01(\x0b\x32\'.spotify.playlist4.proto.ItemAttributes\"\x94\x01\n\x08MetaItem\x12\x10\n\x08revision\x18\x01 \x01(\x0c\x12;\n\nattributes\x18\x02 \x01(\x0b\x32\'.spotify.playlist4.proto.ListAttributes\x12\x0e\n\x06length\x18\x03 \x01(\x05\x12\x11\n\ttimestamp\x18\x04 \x01(\x03\x12\x16\n\x0eowner_username\x18\x05 \x01(\t\"\x90\x01\n\tListItems\x12\x0b\n\x03pos\x18\x01 \x02(\x05\x12\x11\n\ttruncated\x18\x02 \x02(\x08\x12,\n\x05items\x18\x03 \x03(\x0b\x32\x1d.spotify.playlist4.proto.Item\x12\x35\n\nmeta_items\x18\x04 \x03(\x0b\x32!.spotify.playlist4.proto.MetaItem\"1\n\x13\x46ormatListAttribute\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t\"\xf6\x01\n\x0eListAttributes\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x02 \x01(\t\x12\x0f\n\x07picture\x18\x03 \x01(\x0c\x12\x15\n\rcollaborative\x18\x04 \x01(\x08\x12\x13\n\x0bpl3_version\x18\x05 \x01(\t\x12\x18\n\x10\x64\x65leted_by_owner\x18\x06 \x01(\x08\x12\x11\n\tclient_id\x18\n \x01(\t\x12\x0e\n\x06\x66ormat\x18\x0b \x01(\t\x12G\n\x11\x66ormat_attributes\x18\x0c \x03(\x0b\x32,.spotify.playlist4.proto.FormatListAttribute\"\xb0\x01\n\x0eItemAttributes\x12\x10\n\x08\x61\x64\x64\x65\x64_by\x18\x01 \x01(\t\x12\x11\n\ttimestamp\x18\x02 \x01(\x03\x12\x0f\n\x07seen_at\x18\t \x01(\x03\x12\x0e\n\x06public\x18\n \x01(\x08\x12G\n\x11\x66ormat_attributes\x18\x0b \x03(\x0b\x32,.spotify.playlist4.proto.FormatListAttribute\x12\x0f\n\x07item_id\x18\x0c \x01(\x0c\"l\n\x03\x41\x64\x64\x12\x12\n\nfrom_index\x18\x01 \x01(\x05\x12,\n\x05items\x18\x02 \x03(\x0b\x32\x1d.spotify.playlist4.proto.Item\x12\x10\n\x08\x61\x64\x64_last\x18\x04 \x01(\x08\x12\x11\n\tadd_first\x18\x05 \x01(\x08\"m\n\x03Rem\x12\x12\n\nfrom_index\x18\x01 \x01(\x05\x12\x0e\n\x06length\x18\x02 \x01(\x05\x12,\n\x05items\x18\x03 
\x03(\x0b\x32\x1d.spotify.playlist4.proto.Item\x12\x14\n\x0citems_as_key\x18\x07 \x01(\x08\";\n\x03Mov\x12\x12\n\nfrom_index\x18\x01 \x02(\x05\x12\x0e\n\x06length\x18\x02 \x02(\x05\x12\x10\n\x08to_index\x18\x03 \x02(\x05\"\x93\x01\n\x1aItemAttributesPartialState\x12\x37\n\x06values\x18\x01 \x02(\x0b\x32\'.spotify.playlist4.proto.ItemAttributes\x12<\n\x08no_value\x18\x02 \x03(\x0e\x32*.spotify.playlist4.proto.ItemAttributeKind\"\x93\x01\n\x1aListAttributesPartialState\x12\x37\n\x06values\x18\x01 \x02(\x0b\x32\'.spotify.playlist4.proto.ListAttributes\x12<\n\x08no_value\x18\x02 \x03(\x0e\x32*.spotify.playlist4.proto.ListAttributeKind\"\xbf\x01\n\x14UpdateItemAttributes\x12\r\n\x05index\x18\x01 \x02(\x05\x12K\n\x0enew_attributes\x18\x02 \x02(\x0b\x32\x33.spotify.playlist4.proto.ItemAttributesPartialState\x12K\n\x0eold_attributes\x18\x03 \x01(\x0b\x32\x33.spotify.playlist4.proto.ItemAttributesPartialState\"\xb0\x01\n\x14UpdateListAttributes\x12K\n\x0enew_attributes\x18\x01 \x02(\x0b\x32\x33.spotify.playlist4.proto.ListAttributesPartialState\x12K\n\x0eold_attributes\x18\x02 \x01(\x0b\x32\x33.spotify.playlist4.proto.ListAttributesPartialState\"\xc0\x03\n\x02Op\x12.\n\x04kind\x18\x01 \x02(\x0e\x32 .spotify.playlist4.proto.Op.Kind\x12)\n\x03\x61\x64\x64\x18\x02 \x01(\x0b\x32\x1c.spotify.playlist4.proto.Add\x12)\n\x03rem\x18\x03 \x01(\x0b\x32\x1c.spotify.playlist4.proto.Rem\x12)\n\x03mov\x18\x04 \x01(\x0b\x32\x1c.spotify.playlist4.proto.Mov\x12M\n\x16update_item_attributes\x18\x05 \x01(\x0b\x32-.spotify.playlist4.proto.UpdateItemAttributes\x12M\n\x16update_list_attributes\x18\x06 \x01(\x0b\x32-.spotify.playlist4.proto.UpdateListAttributes\"k\n\x04Kind\x12\x10\n\x0cKIND_UNKNOWN\x10\x00\x12\x07\n\x03\x41\x44\x44\x10\x02\x12\x07\n\x03REM\x10\x03\x12\x07\n\x03MOV\x10\x04\x12\x1a\n\x16UPDATE_ITEM_ATTRIBUTES\x10\x05\x12\x1a\n\x16UPDATE_LIST_ATTRIBUTES\x10\x06\"2\n\x06OpList\x12(\n\x03ops\x18\x01 
\x03(\x0b\x32\x1b.spotify.playlist4.proto.Op\"\xd5\x01\n\nChangeInfo\x12\x0c\n\x04user\x18\x01 \x01(\t\x12\x11\n\ttimestamp\x18\x02 \x01(\x03\x12\r\n\x05\x61\x64min\x18\x03 \x01(\x08\x12\x0c\n\x04undo\x18\x04 \x01(\x08\x12\x0c\n\x04redo\x18\x05 \x01(\x08\x12\r\n\x05merge\x18\x06 \x01(\x08\x12\x12\n\ncompressed\x18\x07 \x01(\x08\x12\x11\n\tmigration\x18\x08 \x01(\x08\x12\x10\n\x08split_id\x18\t \x01(\x05\x12\x33\n\x06source\x18\n \x01(\x0b\x32#.spotify.playlist4.proto.SourceInfo\"\xe8\x01\n\nSourceInfo\x12:\n\x06\x63lient\x18\x01 \x01(\x0e\x32*.spotify.playlist4.proto.SourceInfo.Client\x12\x0b\n\x03\x61pp\x18\x03 \x01(\t\x12\x0e\n\x06source\x18\x04 \x01(\t\x12\x0f\n\x07version\x18\x05 \x01(\t\"p\n\x06\x43lient\x12\x12\n\x0e\x43LIENT_UNKNOWN\x10\x00\x12\x11\n\rNATIVE_HERMES\x10\x01\x12\n\n\x06\x43LIENT\x10\x02\x12\n\n\x06PYTHON\x10\x03\x12\x08\n\x04JAVA\x10\x04\x12\r\n\tWEBPLAYER\x10\x05\x12\x0e\n\nLIBSPOTIFY\x10\x06\"z\n\x05\x44\x65lta\x12\x14\n\x0c\x62\x61se_version\x18\x01 \x01(\x0c\x12(\n\x03ops\x18\x02 \x03(\x0b\x32\x1b.spotify.playlist4.proto.Op\x12\x31\n\x04info\x18\x04 \x01(\x0b\x32#.spotify.playlist4.proto.ChangeInfo\"\\\n\x04\x44iff\x12\x15\n\rfrom_revision\x18\x01 \x02(\x0c\x12(\n\x03ops\x18\x02 \x03(\x0b\x32\x1b.spotify.playlist4.proto.Op\x12\x13\n\x0bto_revision\x18\x03 \x02(\x0c\"\xa0\x01\n\x0bListChanges\x12\x15\n\rbase_revision\x18\x01 \x01(\x0c\x12.\n\x06\x64\x65ltas\x18\x02 \x03(\x0b\x32\x1e.spotify.playlist4.proto.Delta\x12 \n\x18want_resulting_revisions\x18\x03 \x01(\x08\x12\x18\n\x10want_sync_result\x18\x04 \x01(\x08\x12\x0e\n\x06nonces\x18\x06 \x03(\x03\"\x8f\x03\n\x13SelectedListContent\x12\x10\n\x08revision\x18\x01 \x01(\x0c\x12\x0e\n\x06length\x18\x02 \x01(\x05\x12;\n\nattributes\x18\x03 \x01(\x0b\x32\'.spotify.playlist4.proto.ListAttributes\x12\x34\n\x08\x63ontents\x18\x05 \x01(\x0b\x32\".spotify.playlist4.proto.ListItems\x12+\n\x04\x64iff\x18\x06 \x01(\x0b\x32\x1d.spotify.playlist4.proto.Diff\x12\x32\n\x0bsync_result\x18\x07 
\x01(\x0b\x32\x1d.spotify.playlist4.proto.Diff\x12\x1b\n\x13resulting_revisions\x18\x08 \x03(\x0c\x12\x16\n\x0emultiple_heads\x18\t \x01(\x08\x12\x12\n\nup_to_date\x18\n \x01(\x08\x12\x0e\n\x06nonces\x18\x0e \x03(\x03\x12\x11\n\ttimestamp\x18\x0f \x01(\x03\x12\x16\n\x0eowner_username\x18\x10 \x01(\t\"0\n\x0f\x43reateListReply\x12\x0b\n\x03uri\x18\x01 \x02(\x0c\x12\x10\n\x08revision\x18\x02 \x01(\x0c\",\n\x0bModifyReply\x12\x0b\n\x03uri\x18\x01 \x02(\x0c\x12\x10\n\x08revision\x18\x02 \x01(\x0c\" \n\x10SubscribeRequest\x12\x0c\n\x04uris\x18\x01 \x03(\x0c\"\"\n\x12UnsubscribeRequest\x12\x0c\n\x04uris\x18\x01 \x03(\x0c\"\x80\x01\n\x18PlaylistModificationInfo\x12\x0b\n\x03uri\x18\x01 \x01(\x0c\x12\x14\n\x0cnew_revision\x18\x02 \x01(\x0c\x12\x17\n\x0fparent_revision\x18\x03 \x01(\x0c\x12(\n\x03ops\x18\x04 \x03(\x0b\x32\x1b.spotify.playlist4.proto.Op*\xe6\x01\n\x11ListAttributeKind\x12\x10\n\x0cLIST_UNKNOWN\x10\x00\x12\r\n\tLIST_NAME\x10\x01\x12\x14\n\x10LIST_DESCRIPTION\x10\x02\x12\x10\n\x0cLIST_PICTURE\x10\x03\x12\x16\n\x12LIST_COLLABORATIVE\x10\x04\x12\x14\n\x10LIST_PL3_VERSION\x10\x05\x12\x19\n\x15LIST_DELETED_BY_OWNER\x10\x06\x12\x12\n\x0eLIST_CLIENT_ID\x10\n\x12\x0f\n\x0bLIST_FORMAT\x10\x0b\x12\x1a\n\x16LIST_FORMAT_ATTRIBUTES\x10\x0c*\x98\x01\n\x11ItemAttributeKind\x12\x10\n\x0cITEM_UNKNOWN\x10\x00\x12\x11\n\rITEM_ADDED_BY\x10\x01\x12\x12\n\x0eITEM_TIMESTAMP\x10\x02\x12\x10\n\x0cITEM_SEEN_AT\x10\t\x12\x0f\n\x0bITEM_PUBLIC\x10\n\x12\x1a\n\x16ITEM_FORMAT_ATTRIBUTES\x10\x0b\x12\x0b\n\x07ITEM_ID\x10\x0c\x42,\n\x15\x63om.spotify.playlist4B\x11Playlist4ApiProtoH\x02'
)
_LISTATTRIBUTEKIND = _descriptor.EnumDescriptor(
name='ListAttributeKind',
full_name='spotify.playlist4.proto.ListAttributeKind',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='LIST_UNKNOWN',
index=0,
number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_NAME',
index=1,
number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_DESCRIPTION',
index=2,
number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_PICTURE',
index=3,
number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_COLLABORATIVE',
index=4,
number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_PL3_VERSION',
index=5,
number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_DELETED_BY_OWNER',
index=6,
number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_CLIENT_ID',
index=7,
number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_FORMAT',
index=8,
number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIST_FORMAT_ATTRIBUTES',
index=9,
number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=3902,
serialized_end=4132,
)
_sym_db.RegisterEnumDescriptor(_LISTATTRIBUTEKIND)
ListAttributeKind = enum_type_wrapper.EnumTypeWrapper(_LISTATTRIBUTEKIND)
_ITEMATTRIBUTEKIND = _descriptor.EnumDescriptor(
name='ItemAttributeKind',
full_name='spotify.playlist4.proto.ItemAttributeKind',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='ITEM_UNKNOWN',
index=0,
number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ITEM_ADDED_BY',
index=1,
number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ITEM_TIMESTAMP',
index=2,
number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ITEM_SEEN_AT',
index=3,
number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ITEM_PUBLIC',
index=4,
number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ITEM_FORMAT_ATTRIBUTES',
index=5,
number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ITEM_ID',
index=6,
number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=4135,
serialized_end=4287,
)
_sym_db.RegisterEnumDescriptor(_ITEMATTRIBUTEKIND)
ItemAttributeKind = enum_type_wrapper.EnumTypeWrapper(_ITEMATTRIBUTEKIND)
LIST_UNKNOWN = 0
LIST_NAME = 1
LIST_DESCRIPTION = 2
LIST_PICTURE = 3
LIST_COLLABORATIVE = 4
LIST_PL3_VERSION = 5
LIST_DELETED_BY_OWNER = 6
LIST_CLIENT_ID = 10
LIST_FORMAT = 11
LIST_FORMAT_ATTRIBUTES = 12
ITEM_UNKNOWN = 0
ITEM_ADDED_BY = 1
ITEM_TIMESTAMP = 2
ITEM_SEEN_AT = 9
ITEM_PUBLIC = 10
ITEM_FORMAT_ATTRIBUTES = 11
ITEM_ID = 12
_OP_KIND = _descriptor.EnumDescriptor(
name='Kind',
full_name='spotify.playlist4.proto.Op.Kind',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='KIND_UNKNOWN',
index=0,
number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ADD',
index=1,
number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='REM',
index=2,
number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MOV',
index=3,
number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='UPDATE_ITEM_ATTRIBUTES',
index=4,
number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='UPDATE_LIST_ATTRIBUTES',
index=5,
number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2209,
serialized_end=2316,
)
_sym_db.RegisterEnumDescriptor(_OP_KIND)
_SOURCEINFO_CLIENT = _descriptor.EnumDescriptor(
name='Client',
full_name='spotify.playlist4.proto.SourceInfo.Client',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='CLIENT_UNKNOWN',
index=0,
number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NATIVE_HERMES',
index=1,
number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CLIENT',
index=2,
number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PYTHON',
index=3,
number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='JAVA',
index=4,
number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WEBPLAYER',
index=5,
number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LIBSPOTIFY',
index=6,
number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2707,
serialized_end=2819,
)
_sym_db.RegisterEnumDescriptor(_SOURCEINFO_CLIENT)
_ITEM = _descriptor.Descriptor(
name='Item',
full_name='spotify.playlist4.proto.Item',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='uri',
full_name='spotify.playlist4.proto.Item.uri',
index=0,
number=1,
type=9,
cpp_type=9,
label=2,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='attributes',
full_name='spotify.playlist4.proto.Item.attributes',
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=53,
serialized_end=133,
)
_METAITEM = _descriptor.Descriptor(
name='MetaItem',
full_name='spotify.playlist4.proto.MetaItem',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='revision',
full_name='spotify.playlist4.proto.MetaItem.revision',
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='attributes',
full_name='spotify.playlist4.proto.MetaItem.attributes',
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='length',
full_name='spotify.playlist4.proto.MetaItem.length',
index=2,
number=3,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp',
full_name='spotify.playlist4.proto.MetaItem.timestamp',
index=3,
number=4,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='owner_username',
full_name='spotify.playlist4.proto.MetaItem.owner_username',
index=4,
number=5,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=136,
serialized_end=284,
)
_LISTITEMS = _descriptor.Descriptor(
name='ListItems',
full_name='spotify.playlist4.proto.ListItems',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='pos',
full_name='spotify.playlist4.proto.ListItems.pos',
index=0,
number=1,
type=5,
cpp_type=1,
label=2,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='truncated',
full_name='spotify.playlist4.proto.ListItems.truncated',
index=1,
number=2,
type=8,
cpp_type=7,
label=2,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='items',
full_name='spotify.playlist4.proto.ListItems.items',
index=2,
number=3,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='meta_items',
full_name='spotify.playlist4.proto.ListItems.meta_items',
index=3,
number=4,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=287,
serialized_end=431,
)
_FORMATLISTATTRIBUTE = _descriptor.Descriptor(
name='FormatListAttribute',
full_name='spotify.playlist4.proto.FormatListAttribute',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key',
full_name='spotify.playlist4.proto.FormatListAttribute.key',
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value',
full_name='spotify.playlist4.proto.FormatListAttribute.value',
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=433,
serialized_end=482,
)
_LISTATTRIBUTES = _descriptor.Descriptor(
name='ListAttributes',
full_name='spotify.playlist4.proto.ListAttributes',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='name',
full_name='spotify.playlist4.proto.ListAttributes.name',
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='description',
full_name='spotify.playlist4.proto.ListAttributes.description',
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='picture',
full_name='spotify.playlist4.proto.ListAttributes.picture',
index=2,
number=3,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='collaborative',
full_name='spotify.playlist4.proto.ListAttributes.collaborative',
index=3,
number=4,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='pl3_version',
full_name='spotify.playlist4.proto.ListAttributes.pl3_version',
index=4,
number=5,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='deleted_by_owner',
full_name='spotify.playlist4.proto.ListAttributes.deleted_by_owner',
index=5,
number=6,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='client_id',
full_name='spotify.playlist4.proto.ListAttributes.client_id',
index=6,
number=10,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='format',
full_name='spotify.playlist4.proto.ListAttributes.format',
index=7,
number=11,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='format_attributes',
full_name='spotify.playlist4.proto.ListAttributes.format_attributes',
index=8,
number=12,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=485,
serialized_end=731,
)
_ITEMATTRIBUTES = _descriptor.Descriptor(
name='ItemAttributes',
full_name='spotify.playlist4.proto.ItemAttributes',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='added_by',
full_name='spotify.playlist4.proto.ItemAttributes.added_by',
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp',
full_name='spotify.playlist4.proto.ItemAttributes.timestamp',
index=1,
number=2,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='seen_at',
full_name='spotify.playlist4.proto.ItemAttributes.seen_at',
index=2,
number=9,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='public',
full_name='spotify.playlist4.proto.ItemAttributes.public',
index=3,
number=10,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='format_attributes',
full_name='spotify.playlist4.proto.ItemAttributes.format_attributes',
index=4,
number=11,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='item_id',
full_name='spotify.playlist4.proto.ItemAttributes.item_id',
index=5,
number=12,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=734,
serialized_end=910,
)
_ADD = _descriptor.Descriptor(
name='Add',
full_name='spotify.playlist4.proto.Add',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='from_index',
full_name='spotify.playlist4.proto.Add.from_index',
index=0,
number=1,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='items',
full_name='spotify.playlist4.proto.Add.items',
index=1,
number=2,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='add_last',
full_name='spotify.playlist4.proto.Add.add_last',
index=2,
number=4,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='add_first',
full_name='spotify.playlist4.proto.Add.add_first',
index=3,
number=5,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=912,
serialized_end=1020,
)
_REM = _descriptor.Descriptor(
name='Rem',
full_name='spotify.playlist4.proto.Rem',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='from_index',
full_name='spotify.playlist4.proto.Rem.from_index',
index=0,
number=1,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='length',
full_name='spotify.playlist4.proto.Rem.length',
index=1,
number=2,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='items',
full_name='spotify.playlist4.proto.Rem.items',
index=2,
number=3,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='items_as_key',
full_name='spotify.playlist4.proto.Rem.items_as_key',
index=3,
number=7,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=1022,
serialized_end=1131,
)
_MOV = _descriptor.Descriptor(
name='Mov',
full_name='spotify.playlist4.proto.Mov',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='from_index',
full_name='spotify.playlist4.proto.Mov.from_index',
index=0,
number=1,
type=5,
cpp_type=1,
label=2,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='length',
full_name='spotify.playlist4.proto.Mov.length',
index=1,
number=2,
type=5,
cpp_type=1,
label=2,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='to_index',
full_name='spotify.playlist4.proto.Mov.to_index',
index=2,
number=3,
type=5,
cpp_type=1,
label=2,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=1133,
serialized_end=1192,
)
_ITEMATTRIBUTESPARTIALSTATE = _descriptor.Descriptor(
name='ItemAttributesPartialState',
full_name='spotify.playlist4.proto.ItemAttributesPartialState',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='values',
full_name='spotify.playlist4.proto.ItemAttributesPartialState.values',
index=0,
number=1,
type=11,
cpp_type=10,
label=2,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_value',
full_name='spotify.playlist4.proto.ItemAttributesPartialState.no_value',
index=1,
number=2,
type=14,
cpp_type=8,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=1195,
serialized_end=1342,
)
_LISTATTRIBUTESPARTIALSTATE = _descriptor.Descriptor(
name='ListAttributesPartialState',
full_name='spotify.playlist4.proto.ListAttributesPartialState',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='values',
full_name='spotify.playlist4.proto.ListAttributesPartialState.values',
index=0,
number=1,
type=11,
cpp_type=10,
label=2,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_value',
full_name='spotify.playlist4.proto.ListAttributesPartialState.no_value',
index=1,
number=2,
type=14,
cpp_type=8,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=1345,
serialized_end=1492,
)
_UPDATEITEMATTRIBUTES = _descriptor.Descriptor(
name='UpdateItemAttributes',
full_name='spotify.playlist4.proto.UpdateItemAttributes',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='index',
full_name='spotify.playlist4.proto.UpdateItemAttributes.index',
index=0,
number=1,
type=5,
cpp_type=1,
label=2,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='new_attributes',
full_name='spotify.playlist4.proto.UpdateItemAttributes.new_attributes',
index=1,
number=2,
type=11,
cpp_type=10,
label=2,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='old_attributes',
full_name='spotify.playlist4.proto.UpdateItemAttributes.old_attributes',
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=1495,
serialized_end=1686,
)
_UPDATELISTATTRIBUTES = _descriptor.Descriptor(
name='UpdateListAttributes',
full_name='spotify.playlist4.proto.UpdateListAttributes',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='new_attributes',
full_name='spotify.playlist4.proto.UpdateListAttributes.new_attributes',
index=0,
number=1,
type=11,
cpp_type=10,
label=2,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='old_attributes',
full_name='spotify.playlist4.proto.UpdateListAttributes.old_attributes',
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=1689,
serialized_end=1865,
)
_OP = _descriptor.Descriptor(
name='Op',
full_name='spotify.playlist4.proto.Op',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='kind',
full_name='spotify.playlist4.proto.Op.kind',
index=0,
number=1,
type=14,
cpp_type=8,
label=2,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='add',
full_name='spotify.playlist4.proto.Op.add',
index=1,
number=2,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='rem',
full_name='spotify.playlist4.proto.Op.rem',
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mov',
full_name='spotify.playlist4.proto.Op.mov',
index=3,
number=4,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='update_item_attributes',
full_name='spotify.playlist4.proto.Op.update_item_attributes',
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='update_list_attributes',
full_name='spotify.playlist4.proto.Op.update_list_attributes',
index=5,
number=6,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[
_OP_KIND,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=1868,
serialized_end=2316,
)
_OPLIST = _descriptor.Descriptor(
name='OpList',
full_name='spotify.playlist4.proto.OpList',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='ops',
full_name='spotify.playlist4.proto.OpList.ops',
index=0,
number=1,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=2318,
serialized_end=2368,
)
_CHANGEINFO = _descriptor.Descriptor(
name='ChangeInfo',
full_name='spotify.playlist4.proto.ChangeInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user',
full_name='spotify.playlist4.proto.ChangeInfo.user',
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp',
full_name='spotify.playlist4.proto.ChangeInfo.timestamp',
index=1,
number=2,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='admin',
full_name='spotify.playlist4.proto.ChangeInfo.admin',
index=2,
number=3,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='undo',
full_name='spotify.playlist4.proto.ChangeInfo.undo',
index=3,
number=4,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='redo',
full_name='spotify.playlist4.proto.ChangeInfo.redo',
index=4,
number=5,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='merge',
full_name='spotify.playlist4.proto.ChangeInfo.merge',
index=5,
number=6,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='compressed',
full_name='spotify.playlist4.proto.ChangeInfo.compressed',
index=6,
number=7,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='migration',
full_name='spotify.playlist4.proto.ChangeInfo.migration',
index=7,
number=8,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='split_id',
full_name='spotify.playlist4.proto.ChangeInfo.split_id',
index=8,
number=9,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='source',
full_name='spotify.playlist4.proto.ChangeInfo.source',
index=9,
number=10,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=2371,
serialized_end=2584,
)
_SOURCEINFO = _descriptor.Descriptor(
name='SourceInfo',
full_name='spotify.playlist4.proto.SourceInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='client',
full_name='spotify.playlist4.proto.SourceInfo.client',
index=0,
number=1,
type=14,
cpp_type=8,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='app',
full_name='spotify.playlist4.proto.SourceInfo.app',
index=1,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='source',
full_name='spotify.playlist4.proto.SourceInfo.source',
index=2,
number=4,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='version',
full_name='spotify.playlist4.proto.SourceInfo.version',
index=3,
number=5,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[
_SOURCEINFO_CLIENT,
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=2587,
serialized_end=2819,
)
_DELTA = _descriptor.Descriptor(
name='Delta',
full_name='spotify.playlist4.proto.Delta',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='base_version',
full_name='spotify.playlist4.proto.Delta.base_version',
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ops',
full_name='spotify.playlist4.proto.Delta.ops',
index=1,
number=2,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='info',
full_name='spotify.playlist4.proto.Delta.info',
index=2,
number=4,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=2821,
serialized_end=2943,
)
_DIFF = _descriptor.Descriptor(
name='Diff',
full_name='spotify.playlist4.proto.Diff',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='from_revision',
full_name='spotify.playlist4.proto.Diff.from_revision',
index=0,
number=1,
type=12,
cpp_type=9,
label=2,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ops',
full_name='spotify.playlist4.proto.Diff.ops',
index=1,
number=2,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='to_revision',
full_name='spotify.playlist4.proto.Diff.to_revision',
index=2,
number=3,
type=12,
cpp_type=9,
label=2,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=2945,
serialized_end=3037,
)
_LISTCHANGES = _descriptor.Descriptor(
name='ListChanges',
full_name='spotify.playlist4.proto.ListChanges',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='base_revision',
full_name='spotify.playlist4.proto.ListChanges.base_revision',
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='deltas',
full_name='spotify.playlist4.proto.ListChanges.deltas',
index=1,
number=2,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='want_resulting_revisions',
full_name=
'spotify.playlist4.proto.ListChanges.want_resulting_revisions',
index=2,
number=3,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='want_sync_result',
full_name='spotify.playlist4.proto.ListChanges.want_sync_result',
index=3,
number=4,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='nonces',
full_name='spotify.playlist4.proto.ListChanges.nonces',
index=4,
number=6,
type=3,
cpp_type=2,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=3040,
serialized_end=3200,
)
_SELECTEDLISTCONTENT = _descriptor.Descriptor(
name='SelectedListContent',
full_name='spotify.playlist4.proto.SelectedListContent',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='revision',
full_name='spotify.playlist4.proto.SelectedListContent.revision',
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='length',
full_name='spotify.playlist4.proto.SelectedListContent.length',
index=1,
number=2,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='attributes',
full_name='spotify.playlist4.proto.SelectedListContent.attributes',
index=2,
number=3,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='contents',
full_name='spotify.playlist4.proto.SelectedListContent.contents',
index=3,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='diff',
full_name='spotify.playlist4.proto.SelectedListContent.diff',
index=4,
number=6,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='sync_result',
full_name='spotify.playlist4.proto.SelectedListContent.sync_result',
index=5,
number=7,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='resulting_revisions',
full_name=
'spotify.playlist4.proto.SelectedListContent.resulting_revisions',
index=6,
number=8,
type=12,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='multiple_heads',
full_name=
'spotify.playlist4.proto.SelectedListContent.multiple_heads',
index=7,
number=9,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='up_to_date',
full_name='spotify.playlist4.proto.SelectedListContent.up_to_date',
index=8,
number=10,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='nonces',
full_name='spotify.playlist4.proto.SelectedListContent.nonces',
index=9,
number=14,
type=3,
cpp_type=2,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timestamp',
full_name='spotify.playlist4.proto.SelectedListContent.timestamp',
index=10,
number=15,
type=3,
cpp_type=2,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='owner_username',
full_name=
'spotify.playlist4.proto.SelectedListContent.owner_username',
index=11,
number=16,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode('utf-8'),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=3203,
serialized_end=3602,
)
_CREATELISTREPLY = _descriptor.Descriptor(
name='CreateListReply',
full_name='spotify.playlist4.proto.CreateListReply',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='uri',
full_name='spotify.playlist4.proto.CreateListReply.uri',
index=0,
number=1,
type=12,
cpp_type=9,
label=2,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='revision',
full_name='spotify.playlist4.proto.CreateListReply.revision',
index=1,
number=2,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=3604,
serialized_end=3652,
)
_MODIFYREPLY = _descriptor.Descriptor(
name='ModifyReply',
full_name='spotify.playlist4.proto.ModifyReply',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='uri',
full_name='spotify.playlist4.proto.ModifyReply.uri',
index=0,
number=1,
type=12,
cpp_type=9,
label=2,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='revision',
full_name='spotify.playlist4.proto.ModifyReply.revision',
index=1,
number=2,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=3654,
serialized_end=3698,
)
_SUBSCRIBEREQUEST = _descriptor.Descriptor(
name='SubscribeRequest',
full_name='spotify.playlist4.proto.SubscribeRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='uris',
full_name='spotify.playlist4.proto.SubscribeRequest.uris',
index=0,
number=1,
type=12,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=3700,
serialized_end=3732,
)
_UNSUBSCRIBEREQUEST = _descriptor.Descriptor(
name='UnsubscribeRequest',
full_name='spotify.playlist4.proto.UnsubscribeRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='uris',
full_name='spotify.playlist4.proto.UnsubscribeRequest.uris',
index=0,
number=1,
type=12,
cpp_type=9,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=3734,
serialized_end=3768,
)
_PLAYLISTMODIFICATIONINFO = _descriptor.Descriptor(
name='PlaylistModificationInfo',
full_name='spotify.playlist4.proto.PlaylistModificationInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='uri',
full_name='spotify.playlist4.proto.PlaylistModificationInfo.uri',
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='new_revision',
full_name=
'spotify.playlist4.proto.PlaylistModificationInfo.new_revision',
index=1,
number=2,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='parent_revision',
full_name=
'spotify.playlist4.proto.PlaylistModificationInfo.parent_revision',
index=2,
number=3,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ops',
full_name='spotify.playlist4.proto.PlaylistModificationInfo.ops',
index=3,
number=4,
type=11,
cpp_type=10,
label=3,
has_default_value=False,
default_value=[],
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[],
serialized_start=3771,
serialized_end=3899,
)
_ITEM.fields_by_name['attributes'].message_type = _ITEMATTRIBUTES
_METAITEM.fields_by_name['attributes'].message_type = _LISTATTRIBUTES
_LISTITEMS.fields_by_name['items'].message_type = _ITEM
_LISTITEMS.fields_by_name['meta_items'].message_type = _METAITEM
_LISTATTRIBUTES.fields_by_name[
'format_attributes'].message_type = _FORMATLISTATTRIBUTE
_ITEMATTRIBUTES.fields_by_name[
'format_attributes'].message_type = _FORMATLISTATTRIBUTE
_ADD.fields_by_name['items'].message_type = _ITEM
_REM.fields_by_name['items'].message_type = _ITEM
_ITEMATTRIBUTESPARTIALSTATE.fields_by_name[
'values'].message_type = _ITEMATTRIBUTES
_ITEMATTRIBUTESPARTIALSTATE.fields_by_name[
'no_value'].enum_type = _ITEMATTRIBUTEKIND
_LISTATTRIBUTESPARTIALSTATE.fields_by_name[
'values'].message_type = _LISTATTRIBUTES
_LISTATTRIBUTESPARTIALSTATE.fields_by_name[
'no_value'].enum_type = _LISTATTRIBUTEKIND
_UPDATEITEMATTRIBUTES.fields_by_name[
'new_attributes'].message_type = _ITEMATTRIBUTESPARTIALSTATE
_UPDATEITEMATTRIBUTES.fields_by_name[
'old_attributes'].message_type = _ITEMATTRIBUTESPARTIALSTATE
_UPDATELISTATTRIBUTES.fields_by_name[
'new_attributes'].message_type = _LISTATTRIBUTESPARTIALSTATE
_UPDATELISTATTRIBUTES.fields_by_name[
'old_attributes'].message_type = _LISTATTRIBUTESPARTIALSTATE
_OP.fields_by_name['kind'].enum_type = _OP_KIND
_OP.fields_by_name['add'].message_type = _ADD
_OP.fields_by_name['rem'].message_type = _REM
_OP.fields_by_name['mov'].message_type = _MOV
_OP.fields_by_name[
'update_item_attributes'].message_type = _UPDATEITEMATTRIBUTES
_OP.fields_by_name[
'update_list_attributes'].message_type = _UPDATELISTATTRIBUTES
_OP_KIND.containing_type = _OP
_OPLIST.fields_by_name['ops'].message_type = _OP
_CHANGEINFO.fields_by_name['source'].message_type = _SOURCEINFO
_SOURCEINFO.fields_by_name['client'].enum_type = _SOURCEINFO_CLIENT
_SOURCEINFO_CLIENT.containing_type = _SOURCEINFO
_DELTA.fields_by_name['ops'].message_type = _OP
_DELTA.fields_by_name['info'].message_type = _CHANGEINFO
_DIFF.fields_by_name['ops'].message_type = _OP
_LISTCHANGES.fields_by_name['deltas'].message_type = _DELTA
_SELECTEDLISTCONTENT.fields_by_name[
'attributes'].message_type = _LISTATTRIBUTES
_SELECTEDLISTCONTENT.fields_by_name['contents'].message_type = _LISTITEMS
_SELECTEDLISTCONTENT.fields_by_name['diff'].message_type = _DIFF
_SELECTEDLISTCONTENT.fields_by_name['sync_result'].message_type = _DIFF
_PLAYLISTMODIFICATIONINFO.fields_by_name['ops'].message_type = _OP
DESCRIPTOR.message_types_by_name['Item'] = _ITEM
DESCRIPTOR.message_types_by_name['MetaItem'] = _METAITEM
DESCRIPTOR.message_types_by_name['ListItems'] = _LISTITEMS
DESCRIPTOR.message_types_by_name['FormatListAttribute'] = _FORMATLISTATTRIBUTE
DESCRIPTOR.message_types_by_name['ListAttributes'] = _LISTATTRIBUTES
DESCRIPTOR.message_types_by_name['ItemAttributes'] = _ITEMATTRIBUTES
DESCRIPTOR.message_types_by_name['Add'] = _ADD
DESCRIPTOR.message_types_by_name['Rem'] = _REM
DESCRIPTOR.message_types_by_name['Mov'] = _MOV
DESCRIPTOR.message_types_by_name[
'ItemAttributesPartialState'] = _ITEMATTRIBUTESPARTIALSTATE
DESCRIPTOR.message_types_by_name[
'ListAttributesPartialState'] = _LISTATTRIBUTESPARTIALSTATE
DESCRIPTOR.message_types_by_name[
'UpdateItemAttributes'] = _UPDATEITEMATTRIBUTES
DESCRIPTOR.message_types_by_name[
'UpdateListAttributes'] = _UPDATELISTATTRIBUTES
DESCRIPTOR.message_types_by_name['Op'] = _OP
DESCRIPTOR.message_types_by_name['OpList'] = _OPLIST
DESCRIPTOR.message_types_by_name['ChangeInfo'] = _CHANGEINFO
DESCRIPTOR.message_types_by_name['SourceInfo'] = _SOURCEINFO
DESCRIPTOR.message_types_by_name['Delta'] = _DELTA
DESCRIPTOR.message_types_by_name['Diff'] = _DIFF
DESCRIPTOR.message_types_by_name['ListChanges'] = _LISTCHANGES
DESCRIPTOR.message_types_by_name['SelectedListContent'] = _SELECTEDLISTCONTENT
DESCRIPTOR.message_types_by_name['CreateListReply'] = _CREATELISTREPLY
DESCRIPTOR.message_types_by_name['ModifyReply'] = _MODIFYREPLY
DESCRIPTOR.message_types_by_name['SubscribeRequest'] = _SUBSCRIBEREQUEST
DESCRIPTOR.message_types_by_name['UnsubscribeRequest'] = _UNSUBSCRIBEREQUEST
DESCRIPTOR.message_types_by_name[
'PlaylistModificationInfo'] = _PLAYLISTMODIFICATIONINFO
DESCRIPTOR.enum_types_by_name['ListAttributeKind'] = _LISTATTRIBUTEKIND
DESCRIPTOR.enum_types_by_name['ItemAttributeKind'] = _ITEMATTRIBUTEKIND
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
Item = _reflection.GeneratedProtocolMessageType(
'Item',
(_message.Message, ),
{
'DESCRIPTOR': _ITEM,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.Item)
})
_sym_db.RegisterMessage(Item)
MetaItem = _reflection.GeneratedProtocolMessageType(
'MetaItem',
(_message.Message, ),
{
'DESCRIPTOR': _METAITEM,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.MetaItem)
})
_sym_db.RegisterMessage(MetaItem)
ListItems = _reflection.GeneratedProtocolMessageType(
'ListItems',
(_message.Message, ),
{
'DESCRIPTOR': _LISTITEMS,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ListItems)
})
_sym_db.RegisterMessage(ListItems)
FormatListAttribute = _reflection.GeneratedProtocolMessageType(
'FormatListAttribute',
(_message.Message, ),
{
'DESCRIPTOR': _FORMATLISTATTRIBUTE,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.FormatListAttribute)
})
_sym_db.RegisterMessage(FormatListAttribute)
ListAttributes = _reflection.GeneratedProtocolMessageType(
'ListAttributes',
(_message.Message, ),
{
'DESCRIPTOR': _LISTATTRIBUTES,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ListAttributes)
})
_sym_db.RegisterMessage(ListAttributes)
ItemAttributes = _reflection.GeneratedProtocolMessageType(
'ItemAttributes',
(_message.Message, ),
{
'DESCRIPTOR': _ITEMATTRIBUTES,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ItemAttributes)
})
_sym_db.RegisterMessage(ItemAttributes)
Add = _reflection.GeneratedProtocolMessageType(
'Add',
(_message.Message, ),
{
'DESCRIPTOR': _ADD,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.Add)
})
_sym_db.RegisterMessage(Add)
Rem = _reflection.GeneratedProtocolMessageType(
'Rem',
(_message.Message, ),
{
'DESCRIPTOR': _REM,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.Rem)
})
_sym_db.RegisterMessage(Rem)
Mov = _reflection.GeneratedProtocolMessageType(
'Mov',
(_message.Message, ),
{
'DESCRIPTOR': _MOV,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.Mov)
})
_sym_db.RegisterMessage(Mov)
ItemAttributesPartialState = _reflection.GeneratedProtocolMessageType(
'ItemAttributesPartialState',
(_message.Message, ),
{
'DESCRIPTOR': _ITEMATTRIBUTESPARTIALSTATE,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ItemAttributesPartialState)
})
_sym_db.RegisterMessage(ItemAttributesPartialState)
ListAttributesPartialState = _reflection.GeneratedProtocolMessageType(
'ListAttributesPartialState',
(_message.Message, ),
{
'DESCRIPTOR': _LISTATTRIBUTESPARTIALSTATE,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ListAttributesPartialState)
})
_sym_db.RegisterMessage(ListAttributesPartialState)
UpdateItemAttributes = _reflection.GeneratedProtocolMessageType(
'UpdateItemAttributes',
(_message.Message, ),
{
'DESCRIPTOR': _UPDATEITEMATTRIBUTES,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.UpdateItemAttributes)
})
_sym_db.RegisterMessage(UpdateItemAttributes)
UpdateListAttributes = _reflection.GeneratedProtocolMessageType(
'UpdateListAttributes',
(_message.Message, ),
{
'DESCRIPTOR': _UPDATELISTATTRIBUTES,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.UpdateListAttributes)
})
_sym_db.RegisterMessage(UpdateListAttributes)
Op = _reflection.GeneratedProtocolMessageType(
'Op',
(_message.Message, ),
{
'DESCRIPTOR': _OP,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.Op)
})
_sym_db.RegisterMessage(Op)
OpList = _reflection.GeneratedProtocolMessageType(
'OpList',
(_message.Message, ),
{
'DESCRIPTOR': _OPLIST,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.OpList)
})
_sym_db.RegisterMessage(OpList)
ChangeInfo = _reflection.GeneratedProtocolMessageType(
'ChangeInfo',
(_message.Message, ),
{
'DESCRIPTOR': _CHANGEINFO,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ChangeInfo)
})
_sym_db.RegisterMessage(ChangeInfo)
SourceInfo = _reflection.GeneratedProtocolMessageType(
'SourceInfo',
(_message.Message, ),
{
'DESCRIPTOR': _SOURCEINFO,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.SourceInfo)
})
_sym_db.RegisterMessage(SourceInfo)
Delta = _reflection.GeneratedProtocolMessageType(
'Delta',
(_message.Message, ),
{
'DESCRIPTOR': _DELTA,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.Delta)
})
_sym_db.RegisterMessage(Delta)
Diff = _reflection.GeneratedProtocolMessageType(
'Diff',
(_message.Message, ),
{
'DESCRIPTOR': _DIFF,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.Diff)
})
_sym_db.RegisterMessage(Diff)
ListChanges = _reflection.GeneratedProtocolMessageType(
'ListChanges',
(_message.Message, ),
{
'DESCRIPTOR': _LISTCHANGES,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ListChanges)
})
_sym_db.RegisterMessage(ListChanges)
SelectedListContent = _reflection.GeneratedProtocolMessageType(
'SelectedListContent',
(_message.Message, ),
{
'DESCRIPTOR': _SELECTEDLISTCONTENT,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.SelectedListContent)
})
_sym_db.RegisterMessage(SelectedListContent)
CreateListReply = _reflection.GeneratedProtocolMessageType(
'CreateListReply',
(_message.Message, ),
{
'DESCRIPTOR': _CREATELISTREPLY,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.CreateListReply)
})
_sym_db.RegisterMessage(CreateListReply)
ModifyReply = _reflection.GeneratedProtocolMessageType(
'ModifyReply',
(_message.Message, ),
{
'DESCRIPTOR': _MODIFYREPLY,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.ModifyReply)
})
_sym_db.RegisterMessage(ModifyReply)
SubscribeRequest = _reflection.GeneratedProtocolMessageType(
'SubscribeRequest',
(_message.Message, ),
{
'DESCRIPTOR': _SUBSCRIBEREQUEST,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.SubscribeRequest)
})
_sym_db.RegisterMessage(SubscribeRequest)
UnsubscribeRequest = _reflection.GeneratedProtocolMessageType(
'UnsubscribeRequest',
(_message.Message, ),
{
'DESCRIPTOR': _UNSUBSCRIBEREQUEST,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.UnsubscribeRequest)
})
_sym_db.RegisterMessage(UnsubscribeRequest)
PlaylistModificationInfo = _reflection.GeneratedProtocolMessageType(
'PlaylistModificationInfo',
(_message.Message, ),
{
'DESCRIPTOR': _PLAYLISTMODIFICATIONINFO,
'__module__': 'playlist4_external_pb2'
# @@protoc_insertion_point(class_scope:spotify.playlist4.proto.PlaylistModificationInfo)
})
_sym_db.RegisterMessage(PlaylistModificationInfo)
DESCRIPTOR._options = None
# @@protoc_insertion_point(module_scope)
| 34.066928 | 7,415 | 0.606103 | 10,434 | 104,347 | 5.739218 | 0.04217 | 0.048895 | 0.084081 | 0.073493 | 0.785515 | 0.759197 | 0.701184 | 0.664279 | 0.655294 | 0.648615 | 0 | 0.040578 | 0.297383 | 104,347 | 3,062 | 7,416 | 34.078054 | 0.776202 | 0.020844 | 0 | 0.796565 | 1 | 0.003368 | 0.143475 | 0.113789 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.001684 | 0 | 0.001684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
72d3e01b930ae464ceb3638840f9b8c358e041e2 | 867 | py | Python | source/libs/name_validator/test_name.py | eaxr/elasticsearch-cli | 4eae551d8099f02a30443749eae09efbb636ea57 | [
"MIT"
] | 4 | 2019-07-13T13:29:31.000Z | 2021-04-01T10:05:25.000Z | source/libs/name_validator/test_name.py | eaxr/elasticsearch-cli | 4eae551d8099f02a30443749eae09efbb636ea57 | [
"MIT"
] | null | null | null | source/libs/name_validator/test_name.py | eaxr/elasticsearch-cli | 4eae551d8099f02a30443749eae09efbb636ea57 | [
"MIT"
] | 4 | 2019-08-21T15:59:37.000Z | 2021-04-01T11:48:41.000Z | from .name_valid import name_validator
def test_name_validator_all():
    assert name_validator('all') == 'all'

def test_name_validator_pri():
    assert name_validator('primaries') == 'primaries'

def test_name_validator_new_pri():
    assert name_validator('new_primaries') == 'new_primaries'

def test_name_validator_disable():
    assert name_validator('disable') == 'null'

def test_name_validator_enable():
    assert name_validator('enable') == 'all'

def test_name_validator_string():
    assert name_validator('test') == 'Bad status value'

def test_name_validator_none():
    assert name_validator(None) == 'Bad status value'

def test_name_validator_obj():
    assert name_validator({}) == 'Bad status value'

def test_name_validator_arr():
    assert name_validator([]) == 'Bad status value'

def test_name_validator_int():
    assert name_validator(123) == 'Bad status value'
| 20.642857 | 63 | 0.72203 | 107 | 867 | 5.523364 | 0.224299 | 0.285956 | 0.270728 | 0.115059 | 0.274112 | 0.274112 | 0.172589 | 0.172589 | 0.172589 | 0 | 0 | 0.004093 | 0.154556 | 867 | 41 | 64 | 21.146341 | 0.802183 | 0 | 0 | 0 | 0 | 0 | 0.177624 | 0 | 0 | 0 | 0 | 0 | 0.47619 | 1 | 0.47619 | true | 0 | 0.047619 | 0 | 0.52381 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
72d5e417e66cd05869e503de5ad74006b12049e9 | 134 | py | Python | tests/missing_data/test_missing_data_air_passengers_DiscardRow_Mean.py | shaido987/pyaf | b9afd089557bed6b90b246d3712c481ae26a1957 | [
"BSD-3-Clause"
] | 377 | 2016-10-13T20:52:44.000Z | 2022-03-29T18:04:14.000Z | tests/missing_data/test_missing_data_air_passengers_DiscardRow_Mean.py | ysdede/pyaf | b5541b8249d5a1cfdc01f27fdfd99b6580ed680b | [
"BSD-3-Clause"
] | 160 | 2016-10-13T16:11:53.000Z | 2022-03-28T04:21:34.000Z | tests/missing_data/test_missing_data_air_passengers_DiscardRow_Mean.py | ysdede/pyaf | b5541b8249d5a1cfdc01f27fdfd99b6580ed680b | [
"BSD-3-Clause"
] | 63 | 2017-03-09T14:51:18.000Z | 2022-03-27T20:52:57.000Z | import tests.missing_data.test_missing_data_air_passengers_generic as gen
gen.test_air_passengers_missing_data('DiscardRow', 'Mean')
| 33.5 | 73 | 0.873134 | 20 | 134 | 5.35 | 0.6 | 0.308411 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052239 | 134 | 3 | 74 | 44.666667 | 0.84252 | 0 | 0 | 0 | 0 | 0 | 0.104478 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 6 |
72fcf6ddcb511b287bf1779d8fe6b26d4e961e9a | 214 | py | Python | tests/performance/datasets/__init__.py | sarahmish/RDT | ca639dc1eeae5f1bb9f78e8b163659680ce627e3 | [
"MIT"
] | 1 | 2021-07-29T20:38:26.000Z | 2021-07-29T20:38:26.000Z | tests/performance/datasets/__init__.py | sarahmish/RDT | ca639dc1eeae5f1bb9f78e8b163659680ce627e3 | [
"MIT"
] | null | null | null | tests/performance/datasets/__init__.py | sarahmish/RDT | ca639dc1eeae5f1bb9f78e8b163659680ce627e3 | [
"MIT"
] | null | null | null | from tests.performance.datasets.base import BaseDatasetGenerator
from tests.performance.datasets.numerical import RandomNumericalGenerator
__all__ = [
'BaseDatasetGenerator',
'RandomNumericalGenerator',
]
| 26.75 | 73 | 0.82243 | 17 | 214 | 10.117647 | 0.588235 | 0.104651 | 0.232558 | 0.325581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107477 | 214 | 7 | 74 | 30.571429 | 0.900524 | 0 | 0 | 0 | 0 | 0 | 0.205607 | 0.11215 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 1 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
f41fd2c4542878bf9f59f46e199991bdf9832653 | 168 | py | Python | interactive/dashboard/import.py | mattldawson/music-box-interactive | 6b2610b4f0f255f0e78e23628dc7ba6cc844d0f4 | [
"Apache-2.0"
] | 4 | 2020-07-20T17:18:51.000Z | 2022-03-17T19:08:24.000Z | interactive/dashboard/import.py | mattldawson/music-box-interactive | 6b2610b4f0f255f0e78e23628dc7ba6cc844d0f4 | [
"Apache-2.0"
] | 174 | 2020-07-30T16:47:53.000Z | 2022-03-29T17:54:54.000Z | interactive/dashboard/import.py | mattldawson/music-box-interactive | 6b2610b4f0f255f0e78e23628dc7ba6cc844d0f4 | [
"Apache-2.0"
] | 1 | 2021-01-05T22:42:55.000Z | 2021-01-05T22:42:55.000Z | import json
# with open('/Users/simonthomas/music-box-interactive/interactive/dashboard/static/config/my_config.json') as f:
# data = json.loads(f.read())
#
#
| 24 | 112 | 0.708333 | 23 | 168 | 5.130435 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130952 | 168 | 6 | 113 | 28 | 0.808219 | 0.869048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f42e22df2302765de18ac04f6e6758919e0166f5 | 36 | py | Python | pstraw/__init__.py | MJJBennett/pstraw | 1bc7f4d3ffa8f52534b790dfa8c9993277203b37 | [
"MIT"
] | null | null | null | pstraw/__init__.py | MJJBennett/pstraw | 1bc7f4d3ffa8f52534b790dfa8c9993277203b37 | [
"MIT"
] | null | null | null | pstraw/__init__.py | MJJBennett/pstraw | 1bc7f4d3ffa8f52534b790dfa8c9993277203b37 | [
"MIT"
] | null | null | null | from .pstraw import get, post, Poll
| 18 | 35 | 0.75 | 6 | 36 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 36 | 1 | 36 | 36 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f44c302fc61a4f5221ac1bfc01b542c400294507 | 199 | py | Python | psono/administration/tests/__init__.py | dirigeant/psono-server | a18c5b3c4d8bbbe4ecf1615b210d99fb77752205 | [
"Apache-2.0",
"CC0-1.0"
] | null | null | null | psono/administration/tests/__init__.py | dirigeant/psono-server | a18c5b3c4d8bbbe4ecf1615b210d99fb77752205 | [
"Apache-2.0",
"CC0-1.0"
] | null | null | null | psono/administration/tests/__init__.py | dirigeant/psono-server | a18c5b3c4d8bbbe4ecf1615b210d99fb77752205 | [
"Apache-2.0",
"CC0-1.0"
] | null | null | null | from .duo import *
from .ga import *
from .yubikey import *
from .user import *
from .session import *
from .info import *
from .group import *
from .membership import *
from .recovery_code import *
| 19.9 | 28 | 0.728643 | 28 | 199 | 5.142857 | 0.428571 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180905 | 199 | 9 | 29 | 22.111111 | 0.883436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f45499d421bd9a2271ad9917d9894b9125f0c578 | 40 | py | Python | Python_Bootcamp/src/MyMainPackage/some main myprogramm.py | sudhirln92/Udemy_Courses_ML | 2226177430a54e75541c4ac1e76e9eb2620cdae4 | [
"MIT"
] | null | null | null | Python_Bootcamp/src/MyMainPackage/some main myprogramm.py | sudhirln92/Udemy_Courses_ML | 2226177430a54e75541c4ac1e76e9eb2620cdae4 | [
"MIT"
] | null | null | null | Python_Bootcamp/src/MyMainPackage/some main myprogramm.py | sudhirln92/Udemy_Courses_ML | 2226177430a54e75541c4ac1e76e9eb2620cdae4 | [
"MIT"
] | 1 | 2018-02-18T14:49:44.000Z | 2018-02-18T14:49:44.000Z | from mymodule import my_func
my_func() | 13.333333 | 29 | 0.8 | 7 | 40 | 4.285714 | 0.714286 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 40 | 3 | 30 | 13.333333 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
fe2e833ea90596a5ae75f0341206c15578e8608f | 60 | py | Python | blog_01/__init__.py | Noah-Smith-wgp/blog_01 | ab5fd3c3b2f47a939c2215de0509658bfa6abec0 | [
"MIT"
] | null | null | null | blog_01/__init__.py | Noah-Smith-wgp/blog_01 | ab5fd3c3b2f47a939c2215de0509658bfa6abec0 | [
"MIT"
] | 7 | 2021-01-06T20:08:25.000Z | 2022-02-27T09:54:45.000Z | hsx_01/meiduo_mall/meiduo_mall/__init__.py | hsx9527/test01 | 06cd5886af8707dd3c18da7b96cb0192b5c7e861 | [
"MIT"
] | null | null | null | from pymysql import install_as_MySQLdb
install_as_MySQLdb()
| 20 | 38 | 0.883333 | 9 | 60 | 5.444444 | 0.666667 | 0.367347 | 0.653061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 60 | 2 | 39 | 30 | 0.890909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
fe4529f015583dbb8f029693fbf67f0b33c4102d | 474 | py | Python | thumbor_extras/detectors/tests/conftest.py | imaus10/thumbor_extras | f58180c20b158944c428287bdc36715454ac88ea | [
"MIT"
] | null | null | null | thumbor_extras/detectors/tests/conftest.py | imaus10/thumbor_extras | f58180c20b158944c428287bdc36715454ac88ea | [
"MIT"
] | 1 | 2022-01-24T01:58:52.000Z | 2022-01-24T01:58:52.000Z | thumbor_extras/detectors/tests/conftest.py | imaus10/thumbor_extras | f58180c20b158944c428287bdc36715454ac88ea | [
"MIT"
] | null | null | null | import pytest
from thumbor_extras.detectors.tests.helpers import github_image_fixture
@pytest.fixture
def face_image_context():
return github_image_fixture('Giunchedi%252C_Filippo_January_2015_01.jpg')
@pytest.fixture
def gray_face_image_context():
return github_image_fixture('Giunchedi%252C_Filippo_January_2015_01-grayscale.jpg')
@pytest.fixture
def cmyk_face_image_context():
return github_image_fixture('Giunchedi%252C_Filippo_January_2015_01-cmyk.jpg')
| 31.6 | 87 | 0.843882 | 67 | 474 | 5.537313 | 0.373134 | 0.118598 | 0.19407 | 0.177898 | 0.590297 | 0.590297 | 0.590297 | 0.590297 | 0.590297 | 0.590297 | 0 | 0.061644 | 0.075949 | 474 | 14 | 88 | 33.857143 | 0.785388 | 0 | 0 | 0.272727 | 0 | 0 | 0.297468 | 0.297468 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | true | 0 | 0.181818 | 0.272727 | 0.727273 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
fe4da77ddeb659dabc2f7d09609bca9104bebcea | 9,423 | py | Python | 17B-162/HI/imaging/check_flux_recovery.py | e-koch/VLA_Lband | 8fca7b2de0b88ce5c5011b34bf3936c69338d0b0 | [
"MIT"
] | 1 | 2021-03-08T23:19:12.000Z | 2021-03-08T23:19:12.000Z | 17B-162/HI/imaging/check_flux_recovery.py | e-koch/VLA_Lband | 8fca7b2de0b88ce5c5011b34bf3936c69338d0b0 | [
"MIT"
] | null | null | null | 17B-162/HI/imaging/check_flux_recovery.py | e-koch/VLA_Lband | 8fca7b2de0b88ce5c5011b34bf3936c69338d0b0 | [
"MIT"
] | null | null | null |
'''
Script to compare the flux recovered in the interferometer (VLA) cube, the
single-dish (GBT) cube, and (optionally) the feathered cube.
'''
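As a minimal sketch, the per-channel comparison this script performs can be reproduced with plain numpy arrays standing in for the SpectralCube objects (the shapes and values below are invented for illustration):

```python
import numpy as np

# Toy stand-ins for the interferometer (VLA) and single-dish (GBT) cubes;
# axes are (channel, y, x) and values are flux densities in Jy.
vla_cube = np.full((4, 8, 8), 0.5)
gbt_cube = np.ones((4, 8, 8))

# Primary-beam mask: keep only pixels where the beam response exceeds the cutoff.
pb_plane = np.ones((8, 8))
mask = pb_plane > 0.5

# Total flux per channel inside the mask, then the per-channel recovery ratio.
vla_profile = vla_cube[:, mask].sum(axis=1)
gbt_profile = gbt_cube[:, mask].sum(axis=1)
ratio = vla_profile / gbt_profile
```

The real script delegates this summation to `cube_analysis.feather_cubes.flux_recovery`, which additionally handles spectral-axis checks and parallel chunking.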
import numpy as np
from spectral_cube import (SpectralCube, OneDSpectrum,)
from spectral_cube.lower_dimensional_structures import \
(VaryingResolutionOneDSpectrum)
import os
import astropy.io.fits as fits
import astropy.units as u
import matplotlib.pyplot as plt
from pandas import DataFrame
from cube_analysis.feather_cubes import flux_recovery
from paths import (seventeenB_HI_data_02kms_path,
seventeenB_HI_data_02kms_wGBT_path,
seventeenB_HI_data_1kms_path,
seventeenB_HI_data_1kms_wGBT_path,
data_path, allfigs_path, paper1_tables_path)
from constants import pb_lim, hi_mass_conversion_Jy, distance
from plotting_styles import default_figure, onecolumn_figure
# Set which of the cubes to use
run_gbt_02kms = False
run_gbt_1kms = True
num_cores = 2
chunk = 8
if run_gbt_02kms:
# Load the non-pb masked cube
vla_cube = SpectralCube.read(seventeenB_HI_data_02kms_path("M33_14B_17B_HI_contsub_width_02kms.image.pbcor.fits"))
gbt_path = os.path.join(data_path, "GBT")
gbt_name = os.path.join(gbt_path, "17B-162_items/m33_gbt_vlsr_highres_Tmb_17B162_02kms_spectralregrid.fits")
gbt_cube = SpectralCube.read(gbt_name)
feathered_cube = SpectralCube.read(seventeenB_HI_data_02kms_wGBT_path("M33_14B_17B_HI_contsub_width_02kms.image.pbcor.GBT_feathered.fits"))
pbcov = SpectralCube.read(seventeenB_HI_data_02kms_path("M33_14B_17B_HI_contsub_width_02kms.pb.fits"))
mask = pbcov[0].value > pb_lim
total_vla_profile, total_gbt_profile = \
flux_recovery(vla_cube, gbt_cube, mask=mask, num_cores=num_cores,
chunk=chunk)
total_feathered_profile, total_gbt_profile = \
flux_recovery(feathered_cube, gbt_cube, mask=mask, num_cores=num_cores,
chunk=chunk)
vel_axis = vla_cube.spectral_axis.to(u.km / u.s).value
onecolumn_figure()
# Plot the ratio of high-res to GBT total flux per channel
plt.plot(vel_axis, total_feathered_profile / total_gbt_profile,
label='VLA + GBT')
plt.plot(vel_axis, total_vla_profile / total_gbt_profile, label="VLA",
linestyle='--')
# plt.axhline(1, zorder=-1, linestyle='--', color='b', alpha=0.5)
plt.ylim([0.15, 1.5])
plt.legend(frameon=True)
plt.grid(True)
plt.ylabel("VLA-to-GBT Flux Ratio")
plt.xlabel("Velocity (km / s)")
plt.tight_layout()
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_flux_recovery_ratio.png"))
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_flux_recovery_ratio.pdf"))
plt.close()
# Plot the total spectra
plt.plot(vel_axis, total_gbt_profile,
label='GBT')
plt.plot(vel_axis, total_vla_profile, label="VLA",
linestyle='--')
plt.plot(vel_axis, total_feathered_profile,
label='VLA + GBT', linestyle=":")
plt.legend(frameon=True)
plt.grid(True)
plt.ylim([-3, 70])
plt.ylabel("Total Flux (Jy)")
plt.xlabel("Velocity (km / s)")
plt.tight_layout()
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_flux_recovery.png"))
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_flux_recovery.pdf"))
plt.close()
# We've summed up most of the data already. How about a mass estimate?
chan_width = np.abs(vel_axis[1] - vel_axis[0]) * u.km / u.s
vla_total_flux = np.sum(total_vla_profile) * chan_width
vla_mass = hi_mass_conversion_Jy * distance.to(u.Mpc)**2 * vla_total_flux
feathered_total_flux = np.sum(total_feathered_profile) * chan_width
feathered_mass = hi_mass_conversion_Jy * distance.to(u.Mpc)**2 * \
feathered_total_flux
gbt_total_flux = np.sum(total_gbt_profile) * chan_width
gbt_mass = hi_mass_conversion_Jy * distance.to(u.Mpc)**2 * gbt_total_flux
print("VLA HI Total Mass: {}".format(vla_mass))
print("GBT HI Total Mass: {}".format(gbt_mass))
print("VLA + GBT HI Total Mass: {}".format(feathered_mass))
df = DataFrame({"VLA Mass": [vla_mass.value],
"GBT Mass": [gbt_mass.value],
"VLA+GBT Mass": [feathered_mass.value]})
df.to_csv(seventeenB_HI_data_02kms_wGBT_path("tables/hi_masses_nomask.csv",
no_check=True))
if run_gbt_1kms:
# Load the non-pb masked cube
vla_cube = SpectralCube.read(seventeenB_HI_data_1kms_path("M33_14B_17B_HI_contsub_width_1kms.image.pbcor.fits"))
gbt_path = os.path.join(data_path, "GBT")
gbt_name = os.path.join(gbt_path, "17B-162_items/m33_gbt_vlsr_highres_Tmb_17B162_1kms_spectralregrid.fits")
gbt_cube = SpectralCube.read(gbt_name)
feathered_cube = SpectralCube.read(seventeenB_HI_data_1kms_wGBT_path("M33_14B_17B_HI_contsub_width_1kms.image.pbcor.GBT_feathered.fits"))
pbcov = SpectralCube.read(seventeenB_HI_data_1kms_path("M33_14B_17B_HI_contsub_width_1kms.pb.fits"))
mask = pbcov[0].value > pb_lim
total_vla_profile, total_gbt_profile = \
flux_recovery(vla_cube, gbt_cube, mask=mask, num_cores=num_cores,
chunk=chunk, spec_check_kwargs={'rtol': 0.03},
verbose=False)
total_feathered_profile, total_gbt_profile = \
flux_recovery(feathered_cube, gbt_cube, mask=mask, num_cores=num_cores,
chunk=chunk, spec_check_kwargs={'rtol': 0.03},
verbose=False)
vel_axis = vla_cube.spectral_axis.to(u.km / u.s).value
onecolumn_figure()
# Plot the ratio of high-res to GBT total flux per channel
plt.plot(vel_axis, total_feathered_profile / total_gbt_profile,
label='VLA + GBT')
plt.plot(vel_axis, total_vla_profile / total_gbt_profile, label="VLA",
linestyle='--')
# plt.axhline(1, zorder=-1, linestyle='--', color='b', alpha=0.5)
plt.ylim([0.15, 1.5])
plt.legend(frameon=True)
plt.grid(True)
plt.ylabel("VLA-to-GBT Flux Ratio")
plt.xlabel("Velocity (km / s)")
plt.tight_layout()
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_1kms_flux_recovery_ratio.png"))
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_1kms_flux_recovery_ratio.pdf"))
plt.close()
# Plot the total spectra
plt.plot(vel_axis, total_gbt_profile,
label='GBT')
plt.plot(vel_axis, total_vla_profile, label="VLA",
linestyle='--')
plt.plot(vel_axis, total_feathered_profile,
label='VLA + GBT', linestyle=":")
plt.legend(frameon=True)
plt.grid(True)
plt.ylim([-3, 70])
plt.ylabel("Total Flux (Jy)")
plt.xlabel("Velocity (km / s)")
plt.tight_layout()
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_1kms_flux_recovery.png"))
plt.savefig(allfigs_path("Imaging/vla_gbt_17B_1kms_flux_recovery.pdf"))
plt.close()
# We've summed up most of the data already. How about a mass estimate?
chan_width = np.abs(vel_axis[1] - vel_axis[0]) * u.km / u.s
vla_total_flux = np.sum(total_vla_profile) * chan_width
vla_mass = hi_mass_conversion_Jy * distance.to(u.Mpc)**2 * vla_total_flux
feathered_total_flux = np.sum(total_feathered_profile) * chan_width
feathered_mass = hi_mass_conversion_Jy * distance.to(u.Mpc)**2 * \
feathered_total_flux
gbt_total_flux = np.sum(total_gbt_profile) * chan_width
gbt_mass = hi_mass_conversion_Jy * distance.to(u.Mpc)**2 * gbt_total_flux
print("VLA HI Total Mass: {}".format(vla_mass))
print("GBT HI Total Mass: {}".format(gbt_mass))
print("VLA + GBT HI Total Mass: {}".format(feathered_mass))
df = DataFrame({"VLA Mass": [vla_mass.value],
"GBT Mass": [gbt_mass.value],
"VLA+GBT Mass": [feathered_mass.value]})
out_folder = seventeenB_HI_data_1kms_wGBT_path("tables/",
no_check=True)
if not os.path.exists(out_folder):
os.mkdir(out_folder)
df.to_csv(seventeenB_HI_data_1kms_wGBT_path("tables/hi_masses_nomask_1kms.csv",
no_check=True))
# Save the spectra, too
spec = vla_cube[:, 0, 0]
VRODS = VaryingResolutionOneDSpectrum
vla_spec = VRODS(total_vla_profile,
unit=u.Jy, wcs=spec.wcs,
meta=spec.meta,
beams=vla_cube.beams if hasattr(vla_cube, 'beams') else None)
vla_spec.write(seventeenB_HI_data_1kms_path("M33_14B_17B_HI_contsub_width_1kms.image.pbcor.total_flux_spec.fits", no_check=True))
spec = feathered_cube[:, 0, 0]
vla_feath_spec = VRODS(total_feathered_profile,
unit=u.Jy, wcs=spec.wcs,
meta=spec.meta,
beams=vla_cube.beams if hasattr(vla_cube, 'beams') else None)
vla_feath_spec.write(seventeenB_HI_data_1kms_wGBT_path("M33_14B_17B_HI_contsub_width_1kms.image.pbcor.GBT_feathered.total_flux_spec.fits", no_check=True))
spec = gbt_cube[:, 0, 0]
gbt_spec = OneDSpectrum(total_gbt_profile,
unit=u.Jy, wcs=spec.wcs,
meta=spec.meta,
beam=gbt_cube.beam)
gbt_spec.write(os.path.join(gbt_path,
"17B-162_items/m33_gbt_vlsr_highres_Tmb_17B162_1kms_spectralregrid.total_flux_spec.fits"))
default_figure()
| 40.44206 | 158 | 0.677491 | 1,362 | 9,423 | 4.353157 | 0.137298 | 0.028841 | 0.040479 | 0.023613 | 0.843481 | 0.830832 | 0.805195 | 0.792882 | 0.781413 | 0.781413 | 0 | 0.026695 | 0.212883 | 9,423 | 232 | 159 | 40.616379 | 0.772684 | 0.068237 | 0 | 0.654762 | 0 | 0 | 0.172831 | 0.123858 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fe76b6c1cfb6c666ec3477fc3a40e2bad382895b | 3,783 | py | Python | ssm_spline/tests/test_basics.py | cunningham-lab/monotonic-flds | 57c70b0f1e18bd3329a948cf4ebfc22242f97207 | [
"BSD-3-Clause"
] | 1 | 2020-01-22T00:35:34.000Z | 2020-01-22T00:35:34.000Z | ssm_spline/tests/test_basics.py | cunningham-lab/monotonic-flds | 57c70b0f1e18bd3329a948cf4ebfc22242f97207 | [
"BSD-3-Clause"
] | null | null | null | ssm_spline/tests/test_basics.py | cunningham-lab/monotonic-flds | 57c70b0f1e18bd3329a948cf4ebfc22242f97207 | [
"BSD-3-Clause"
] | 1 | 2019-12-29T23:42:49.000Z | 2019-12-29T23:42:49.000Z | import autograd.numpy as np
import autograd.numpy.random as npr
from ssm.models import HMM
def test_hmm_likelihood(T=500, K=5, D=2):
# Create a true HMM
A = npr.rand(K, K)
A /= A.sum(axis=1, keepdims=True)
A = 0.75 * np.eye(K) + 0.25 * A
C = npr.randn(K, D)
sigma = 0.01
# Sample from the true HMM
z = np.zeros(T, dtype=int)
y = np.zeros((T, D))
for t in range(T):
if t > 0:
z[t] = np.random.choice(K, p=A[z[t-1]])
y[t] = C[z[t]] + np.sqrt(sigma) * npr.randn(D)
# Compare to pyhsmm answer
from pyhsmm.models import HMM as OldHMM
from pyhsmm.basic.distributions import Gaussian
hmm = OldHMM([Gaussian(mu=C[k], sigma=sigma * np.eye(D)) for k in range(K)],
trans_matrix=A,
init_state_distn="uniform")
true_lkhd = hmm.log_likelihood(y)
# Make an HMM with these parameters
hmm = HMM(K, D, observations="gaussian")
hmm.transitions.log_Ps = np.log(A)
hmm.observations.mus = C
hmm.observations.inv_sigmas = np.log(sigma) * np.ones((K, D))
test_lkhd = hmm.log_probability(y)
assert np.allclose(true_lkhd, test_lkhd)
def test_big_hmm_likelihood(T=50000, K=50, D=50):
test_hmm_likelihood(T=T, K=K, D=D)
def test_expectations(T=1000, K=20, D=2):
# Create a true HMM
A = npr.rand(K, K)
A /= A.sum(axis=1, keepdims=True)
A = 0.75 * np.eye(K) + 0.25 * A
C = npr.randn(K, D)
sigma = 0.01
# Sample from the true HMM
z = np.zeros(T, dtype=int)
y = np.zeros((T, D))
for t in range(T):
if t > 0:
z[t] = np.random.choice(K, p=A[z[t-1]])
y[t] = C[z[t]] + np.sqrt(sigma) * npr.randn(D)
# Compare to pyhsmm answer
from pyhsmm.models import HMM as OldHMM
from pyhsmm.basic.distributions import Gaussian
hmm = OldHMM([Gaussian(mu=C[k], sigma=sigma * np.eye(D)) for k in range(K)],
trans_matrix=A,
init_state_distn="uniform")
hmm.add_data(y)
states = hmm.states_list.pop()
states.E_step()
true_Ez = states.expected_states
true_E_trans = states.expected_transcounts
# Make an HMM with these parameters
hmm = HMM(K, D, observations="gaussian")
hmm.transitions.log_Ps = np.log(A)
hmm.observations.mus = C
hmm.observations.inv_sigmas = np.log(sigma) * np.ones((K, D))
test_Ez, test_Ezzp1, _ = hmm.expected_states(y)
test_E_trans = test_Ezzp1.sum(0)
print(true_E_trans.round(3))
print(test_E_trans.round(3))
assert np.allclose(true_Ez, test_Ez)
assert np.allclose(true_E_trans, test_E_trans)
def test_viterbi(T=1000, K=20, D=2):
# Create a true HMM
A = npr.rand(K, K)
A /= A.sum(axis=1, keepdims=True)
A = 0.75 * np.eye(K) + 0.25 * A
C = npr.randn(K, D)
sigma = 0.01
# Sample from the true HMM
z = np.zeros(T, dtype=int)
y = np.zeros((T, D))
for t in range(T):
if t > 0:
z[t] = np.random.choice(K, p=A[z[t-1]])
y[t] = C[z[t]] + np.sqrt(sigma) * npr.randn(D)
# Compare to pyhsmm answer
from pyhsmm.models import HMM as OldHMM
from pyhsmm.basic.distributions import Gaussian
hmm = OldHMM([Gaussian(mu=C[k], sigma=sigma * np.eye(D)) for k in range(K)],
trans_matrix=A,
init_state_distn="uniform")
hmm.add_data(y)
states = hmm.states_list.pop()
states.Viterbi()
z_star = states.stateseq
# Make an HMM with these parameters
hmm = HMM(K, D, observations="gaussian")
hmm.transitions.log_Ps = np.log(A)
hmm.observations.mus = C
hmm.observations.inv_sigmas = np.log(sigma) * np.ones((K, D))
z_star2 = hmm.most_likely_states(y)
print(z_star)
print(z_star2)
assert np.allclose(z_star, z_star2) | 30.756098 | 80 | 0.606661 | 643 | 3,783 | 3.468118 | 0.167963 | 0.008969 | 0.021525 | 0.012108 | 0.733184 | 0.733184 | 0.733184 | 0.733184 | 0.733184 | 0.733184 | 0 | 0.025451 | 0.252181 | 3,783 | 123 | 81 | 30.756098 | 0.762814 | 0.080624 | 0 | 0.711111 | 0 | 0 | 0.012983 | 0 | 0 | 0 | 0 | 0 | 0.044444 | 1 | 0.044444 | false | 0 | 0.1 | 0 | 0.144444 | 0.044444 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
feabd7ef9f83006e312a8b610160b37edbdf10f3 | 32,916 | py | Python | win/devkit/other/pymel/extras/completion/py/pymel/internal/pmcmds.py | leegoonz/Maya-devkit | b81fe799b58e854e4ef16435426d60446e975871 | [
"ADSL"
] | 10 | 2018-03-30T16:09:02.000Z | 2021-12-07T07:29:19.000Z | win/devkit/other/pymel/extras/completion/py/pymel/internal/pmcmds.py | leegoonz/Maya-devkit | b81fe799b58e854e4ef16435426d60446e975871 | [
"ADSL"
] | null | null | null | win/devkit/other/pymel/extras/completion/py/pymel/internal/pmcmds.py | leegoonz/Maya-devkit | b81fe799b58e854e4ef16435426d60446e975871 | [
"ADSL"
] | 9 | 2018-06-02T09:18:49.000Z | 2021-12-20T09:24:35.000Z | """
There are a number of pymel objects which must be converted to a "mel-friendly"
representation. For example, in versions prior to 2009, some mel commands (e.g., getAttr) which expect
string arguments will simply reject custom classes, even if they have a valid string representation.
Another example is mel's matrix inputs, which expect a flat list of 16 floats, while pymel's Matrix has a more typical
4x4 representation.
If you're having compatibility issues with your custom classes when passing them to maya.cmds,
simply add a __melobject__ function that returns a mel-friendly result and pass them to pymel's wrapped commands.
The wrapped commands in this module are the starting point for any other pymel customizations.
"""
import pymel.util as util
import sys
import pymel.versions as versions
import warnings
import maya
import os
from maya.cmds import *
from exceptions import ValueError as objectErrorType
def unloadPlugin(*args, **kwargs):
pass
def polyBevel(*args, **kwargs):
pass
def polyAverageVertex(*args, **kwargs):
pass
def treeLister(*args, **kwargs):
pass
def eval(*args, **kwargs):
"""
Dynamic library stub function
"""
pass
def expression(*args, **kwargs):
pass
def polyAppendVertex(*args, **kwargs):
pass
def lattice(*args, **kwargs):
pass
def polyAppend(*args, **kwargs):
pass
def reverseSurface(*args, **kwargs):
pass
def menu(*args, **kwargs):
pass
def polyTriangulate(*args, **kwargs):
pass
def blendShapePanel(*args, **kwargs):
pass
def dR_modeVert(*args, **kwargs):
pass
def bakeSimulation(*args, **kwargs):
pass
def manipMoveContext(*args, **kwargs):
pass
def dR_modeEdge(*args, **kwargs):
pass
def instance(*args, **kwargs):
pass
def dR_loadRecentFile1(*args, **kwargs):
pass
def keyframe(*args, **kwargs):
pass
def dR_gridSnapPress(*args, **kwargs):
pass
def dR_defLightTGL(*args, **kwargs):
pass
def soundControl(*args, **kwargs):
pass
def createRenderLayer(*args, **kwargs):
pass
def extrude(*args, **kwargs):
pass
def addAttr(*args, **kwargs):
pass
def manipRotateContext(*args, **kwargs):
pass
def dR_slideOff(*args, **kwargs):
pass
def devicePanel(*args, **kwargs):
pass
def scaleConstraint(*args, **kwargs):
pass
def dR_setRelaxAffectsInterior(*args, **kwargs):
pass
def defaultLightListCheckBox(*args, **kwargs):
pass
def saveImage(*args, **kwargs):
pass
def checkBox(*args, **kwargs):
pass
def modelPanel(*args, **kwargs):
pass
def dR_curveSnapPress(*args, **kwargs):
pass
def dR_setRelaxAffectsAll(*args, **kwargs):
pass
def dR_coordSpaceCustom(*args, **kwargs):
pass
def dR_setExtendLoop(*args, **kwargs):
pass
def polySplitRing(*args, **kwargs):
pass
def layoutDialog(*args, **kwargs):
pass
def dR_cameraToPoly(*args, **kwargs):
pass
def dR_selectPress(*args, **kwargs):
pass
def nexConnectContext(*args, **kwargs):
pass
def dR_bevelRelease(*args, **kwargs):
pass
def polyLayoutUV(*args, **kwargs):
pass
def dR_selectTool(*args, **kwargs):
pass
def referenceQuery(*args, **kwargs):
pass
def polyExtrudeVertex(*args, **kwargs):
pass
def polyReduce(*args, **kwargs):
pass
def visor(*args, **kwargs):
pass
def pointConstraint(*args, **kwargs):
pass
def clipSchedulerOutliner(*args, **kwargs):
pass
def polyCollapseEdge(*args, **kwargs):
pass
def polyCrease(*args, **kwargs):
pass
def intSliderGrp(*args, **kwargs):
pass
def nexQuadDrawContext(*args, **kwargs):
pass
def sequenceManager(*args, **kwargs):
pass
def setAttr(*args, **kwargs):
pass
def dR_viewBack(*args, **kwargs):
pass
def polyWedgeFace(*args, **kwargs):
pass
def scriptedPanelType(*args, **kwargs):
pass
def closeCurve(*args, **kwargs):
pass
def dR_tweakPress(*args, **kwargs):
pass
def paramDimension(*args, **kwargs):
pass
def dR_selConstraintEdgeRing(*args, **kwargs):
pass
def listConnections(*args, **kwargs):
pass
def projectCurve(*args, **kwargs):
pass
def addAllWrappedCmds():
pass
def removeWrappedCmd(cmdname):
pass
def iconTextCheckBox(*args, **kwargs):
pass
def radioButtonGrp(*args, **kwargs):
pass
def workspace(*args, **kwargs):
pass
def polyMapSewMove(*args, **kwargs):
pass
def smoothCurve(*args, **kwargs):
pass
def polyOptUvs(*args, **kwargs):
pass
def polyMoveVertex(*args, **kwargs):
pass
def _testDecorator(function):
pass
def window(*args, **kwargs):
pass
def canvas(*args, **kwargs):
pass
def frameLayout(*args, **kwargs):
pass
def imagePlane(*args, **kwargs):
pass
def image(*args, **kwargs):
pass
def button(*args, **kwargs):
pass
def poleVectorConstraint(*args, **kwargs):
pass
def webBrowser(*args, **kwargs):
pass
def blendTwoAttr(*args, **kwargs):
pass
def ikSystem(*args, **kwargs):
pass
def partition(*args, **kwargs):
pass
def dR_slideEdge(*args, **kwargs):
pass
def psdChannelOutliner(*args, **kwargs):
"""
Dynamic library stub function
"""
pass
def dR_setRelaxAffectsBorders(*args, **kwargs):
pass
def dR_softSelDistanceTypeVolume(*args, **kwargs):
pass
def dR_setExtendBorder(*args, **kwargs):
pass
def dR_selectModeTweakMarquee(*args, **kwargs):
pass
def gradientControl(*args, **kwargs):
pass
def rebuildSurface(*args, **kwargs):
pass
def intSlider(*args, **kwargs):
pass
def dR_selectInvert(*args, **kwargs):
pass
def toolPropertyWindow(*args, **kwargs):
pass
def dR_selConstraintEdgeLoop(*args, **kwargs):
pass
def toolCollection(*args, **kwargs):
pass
def textFieldButtonGrp(*args, **kwargs):
pass
def wire(*args, **kwargs):
pass
def group(*args, **kwargs):
pass
def globalStitch(*args, **kwargs):
pass
def viewManip(*args, **kwargs):
pass
def subdiv(*args, **kwargs):
pass
def floatSliderButtonGrp(*args, **kwargs):
pass
def getPanel(*args, **kwargs):
pass
def findKeyframe(*args, **kwargs):
pass
def lsUI(*args, **kwargs):
pass
def timeWarp(*args, **kwargs):
pass
def addWrappedCmd(cmdname, cmd=None):
pass
def normalConstraint(*args, **kwargs):
pass
def flow(*args, **kwargs):
pass
def progressBar(*args, **kwargs):
pass
def dR_rotateTweakTool(*args, **kwargs):
pass
def dR_quadDrawTool(*args, **kwargs):
pass
def floatSlider(*args, **kwargs):
pass
def dR_outlinerTGL(*args, **kwargs):
pass
def dR_objectBackfaceTGL(*args, **kwargs):
pass
def floatField(*args, **kwargs):
pass
def subdMapCut(*args, **kwargs):
pass
def textField(*args, **kwargs):
pass
def subdLayoutUV(*args, **kwargs):
pass
def treeView(*args, **kwargs):
pass
def event(*args, **kwargs):
pass
def nurbsPlane(*args, **kwargs):
pass
def surface(*args, **kwargs):
pass
def optionMenu(*args, **kwargs):
pass
def artUserPaintCtx(*args, **kwargs):
pass
def cutKey(*args, **kwargs):
pass
def filletCurve(*args, **kwargs):
pass
def separator(*args, **kwargs):
pass
def artAttrCtx(*args, **kwargs):
pass
def revolve(*args, **kwargs):
pass
def menuBarLayout(*args, **kwargs):
pass
def arcLengthDimension(*args, **kwargs):
pass
def manipScaleContext(*args, **kwargs):
pass
def animLayer(*args, **kwargs):
pass
def keyframeStats(*args, **kwargs):
pass
def dR_modeMulti(*args, **kwargs):
pass
def polySplit(*args, **kwargs):
pass
def dR_loadRecentFile2(*args, **kwargs):
pass
def keyframeOutliner(*args, **kwargs):
pass
def dropoffLocator(*args, **kwargs):
pass
def cylinder(*args, **kwargs):
pass
def dR_gridSnapRelease(*args, **kwargs):
pass
def ls(*args, **kwargs):
pass
def snapKey(*args, **kwargs):
pass
def dR_extrudeRelease(*args, **kwargs):
pass
def ambientLight(*args, **kwargs):
pass
def dR_disableTexturesTGL(*args, **kwargs):
pass
def dR_customPivotToolPress(*args, **kwargs):
pass
def container(*args, **kwargs):
pass
def condition(*args, **kwargs):
pass
def emitter(*args, **kwargs):
pass
def nodeTreeLister(*args, **kwargs):
pass
def shadingGeometryRelCtx(*args, **kwargs):
pass
def polyHelix(*args, **kwargs):
pass
def dR_multiCutRelease(*args, **kwargs):
pass
def mute(*args, **kwargs):
pass
def nodeOutliner(*args, **kwargs):
pass
def scaleKey(*args, **kwargs):
pass
def delete(*args, **kwargs):
pass
def nodeIconButton(*args, **kwargs):
pass
def dR_curveSnapRelease(*args, **kwargs):
pass
def rigidSolver(*args, **kwargs):
pass
def listAnimatable(*args, **kwargs):
pass
def dR_coordSpaceLocal(*args, **kwargs):
pass
def dR_convertSelectionToEdge(*args, **kwargs):
pass
def dR_conform(*args, **kwargs):
pass
def dR_bevelTool(*args, **kwargs):
pass
def dR_quadDrawRelease(*args, **kwargs):
pass
def addDynamic(*args, **kwargs):
pass
def dR_activeHandleYZ(*args, **kwargs):
pass
def dR_activeHandleX(*args, **kwargs):
pass
def dR_quadDrawPress(*args, **kwargs):
pass
def polyDelVertex(*args, **kwargs):
pass
def dR_pointSnapRelease(*args, **kwargs):
pass
def polyPlatonicSolid(*args, **kwargs):
pass
def picture(*args, **kwargs):
pass
def dR_visorTGL(*args, **kwargs):
pass
def attrFieldSliderGrp(*args, **kwargs):
pass
def selectKeyframe(*args, **kwargs):
pass
def dR_viewLightsTGL(*args, **kwargs):
pass
def scrollField(*args, **kwargs):
pass
def cmdScrollFieldReporter(*args, **kwargs):
pass
def dR_tweakRelease(*args, **kwargs):
pass
def channelBox(*args, **kwargs):
pass
def dR_targetWeldRelease(*args, **kwargs):
pass
def dR_objectEdgesOnlyTGL(*args, **kwargs):
pass
def outlinerPanel(*args, **kwargs):
pass
def fontDialog(*args, **kwargs):
pass
def dR_showOptions(*args, **kwargs):
pass
def torus(*args, **kwargs):
pass
def hyperGraph(*args, **kwargs):
pass
def polyPlane(*args, **kwargs):
pass
def dR_multiCutTool(*args, **kwargs):
pass
def polyCopyUV(*args, **kwargs):
pass
def roundConstantRadius(*args, **kwargs):
pass
def polyMoveEdge(*args, **kwargs):
pass
def help(*args, **kwargs):
pass
def polyChipOff(*args, **kwargs):
pass
def polyMapDel(*args, **kwargs):
pass
def dR_mtkToolTGL(*args, **kwargs):
pass
def cameraSet(*args, **kwargs):
pass
def bufferCurve(*args, **kwargs):
pass
def rename(*args, **kwargs):
pass
def dR_setExtendEdge(*args, **kwargs):
pass
def dR_selectModeDisableTweakMarquee(*args, **kwargs):
pass
def rangeControl(*args, **kwargs):
pass
def instancer(*args, **kwargs):
pass
def uiTemplate(*args, **kwargs):
pass
def ikSolver(*args, **kwargs):
pass
def sphere(*args, **kwargs):
pass
def iconTextScrollList(*args, **kwargs):
pass
def turbulence(*args, **kwargs):
pass
def fluidEmitter(*args, **kwargs):
pass
def currentTime(*args, **kwargs):
pass
def floatFieldGrp(*args, **kwargs):
pass
def createNode(*args, **kwargs):
pass
def nurbsToSubdiv(*args, **kwargs):
pass
def distanceDimension(*args, **kwargs):
pass
def fitBspline(*args, **kwargs):
pass
def spotLightPreviewPort(*args, **kwargs):
pass
def dR_safeFrameTGL(*args, **kwargs):
pass
def paneLayout(*args, **kwargs):
pass
def dR_renderGlobalsTGL(*args, **kwargs):
pass
def copyKey(*args, **kwargs):
pass
def choice(*args, **kwargs):
pass
def dR_preferencesTGL(*args, **kwargs):
pass
def dR_overlayAppendMeshTGL(*args, **kwargs):
pass
def tangentConstraint(*args, **kwargs):
pass
def toolButton(*args, **kwargs):
pass
def pairBlend(*args, **kwargs):
pass
def dR_mtkPanelTGL(*args, **kwargs):
pass
def text(*args, **kwargs):
pass
def outlinerEditor(*args, **kwargs):
pass
def symbolButton(*args, **kwargs):
pass
def exportEdits(*args, **kwargs):
pass
def panel(*args, **kwargs):
pass
def spaceLocator(*args, **kwargs):
pass
def tabLayout(*args, **kwargs):
pass
def attachCurve(*args, **kwargs):
pass
def softMod(*args, **kwargs):
pass
def optionMenuGrp(*args, **kwargs):
pass
def assembly(*args, **kwargs):
pass
def assignCommand(*args, **kwargs):
pass
def arclen(*args, **kwargs):
pass
def preloadRefEd(*args, **kwargs):
pass
def dR_moveRelease(*args, **kwargs):
pass
def keyingGroup(*args, **kwargs):
pass
def dR_modeObject(*args, **kwargs):
pass
def offsetSurface(*args, **kwargs):
pass
def dR_loadRecentFile3(*args, **kwargs):
pass
def dR_paintPress(*args, **kwargs):
pass
def timePort(*args, **kwargs):
pass
def dR_hypergraphTGL(*args, **kwargs):
pass
def jointLattice(*args, **kwargs):
pass
def dR_extrudeTool(*args, **kwargs):
pass
def objectTypeUI(*args, **kwargs):
"""
    This command returns the type of a UI element, such as a button, slider, etc.
Flags:
- isType : i (unicode) [create]
Returns true|false if the object is of the specified type.
- listAll : la (bool) [create]
Returns a list of all known UI commands and their respective types. Each entry contains three strings which are the
command name, ui type and class name. Note that the class name is internal and is subject to change.
- superClasses : sc (bool) [create]
Returns a list of the names of all super classes for the given object. Note that all class names are internal and are
subject to change. Flag can have multiple arguments, passed either as a tuple or a list.
Derived from mel command `maya.cmds.objectTypeUI`
"""
pass
def curve(*args, **kwargs):
pass
def dR_customPivotToolRelease(*args, **kwargs):
pass
def skinCluster(*args, **kwargs):
pass
def cone(*args, **kwargs):
pass
def polyRemesh(*args, **kwargs):
pass
def commandLogging(*args, **kwargs):
pass
def colorSliderButtonGrp(*args, **kwargs):
pass
def menuEditor(*args, **kwargs):
pass
def detachSurface(*args, **kwargs):
pass
def loadPlugin(*args, **kwargs):
pass
def polyUVRectangle(*args, **kwargs):
pass
def dR_coordSpaceObject(*args, **kwargs):
pass
def polyStraightenUVBorder(*args, **kwargs):
pass
def dR_convertSelectionToFace(*args, **kwargs):
pass
def textScrollList(*args, **kwargs):
pass
def dR_connectPress(*args, **kwargs):
pass
def polySoftEdge(*args, **kwargs):
pass
def dR_bridgePress(*args, **kwargs):
pass
def polyMapCut(*args, **kwargs):
pass
def polySewEdge(*args, **kwargs):
pass
def dR_activeHandleZ(*args, **kwargs):
pass
def dR_movePress(*args, **kwargs):
pass
def polyFlipUV(*args, **kwargs):
pass
def colorEditor(*args, **kwargs):
pass
def polyPyramid(*args, **kwargs):
pass
def falloffCurve(*args, **kwargs):
pass
def polyCylinder(*args, **kwargs):
pass
def bezierCurveToNurbs(*args, **kwargs):
pass
def uniform(*args, **kwargs):
pass
def nexTRSContext(*args, **kwargs):
pass
def renderer(*args, **kwargs):
pass
def batchRender(*args, **kwargs):
pass
def renderWindowEditor(*args, **kwargs):
pass
def connectDynamic(*args, **kwargs):
pass
def dR_loadRecentFile4(*args, **kwargs):
pass
def polyFlipEdge(*args, **kwargs):
pass
def dR_wireframeSmoothTGL(*args, **kwargs):
pass
def dR_viewPersp(*args, **kwargs):
pass
def parentConstraint(*args, **kwargs):
pass
def select(*args, **kwargs):
pass
def dR_viewFront(*args, **kwargs):
pass
def dR_vertLockSelected(*args, **kwargs):
pass
def checkBoxGrp(*args, **kwargs):
pass
def lsThroughFilter(*args, **kwargs):
pass
def dR_softSelToolTGL(*args, **kwargs):
pass
def radioCollection(*args, **kwargs):
pass
def iconTextRadioCollection(*args, **kwargs):
pass
def keyTangent(*args, **kwargs):
pass
def polyProjectCurve(*args, **kwargs):
pass
def hyperPanel(*args, **kwargs):
pass
def loft(*args, **kwargs):
pass
def polyUnite(*args, **kwargs):
pass
def dR_graphEditorTGL(*args, **kwargs):
pass
def hudSliderButton(*args, **kwargs):
pass
def hotkeyEditorPanel(*args, **kwargs):
pass
def polyNormal(*args, **kwargs):
pass
def hotBox(*args, **kwargs):
pass
def helpLine(*args, **kwargs):
pass
def dR_extrudePress(*args, **kwargs):
pass
def hardwareRenderPanel(*args, **kwargs):
pass
def cameraView(*args, **kwargs):
pass
def orientConstraint(*args, **kwargs):
pass
def polyBridgeEdge(*args, **kwargs):
pass
def dR_edgedFacesTGL(*args, **kwargs):
pass
def boneLattice(*args, **kwargs):
pass
def blendShape(*args, **kwargs):
pass
def dR_slideSurface(*args, **kwargs):
pass
def dR_showAbout(*args, **kwargs):
pass
def popupMenu(*args, **kwargs):
pass
def closeSurface(*args, **kwargs):
pass
def dynGlobals(*args, **kwargs):
pass
def dR_selectRelease(*args, **kwargs):
pass
def attrControlGrp(*args, **kwargs):
pass
def dR_selectModeHybrid(*args, **kwargs):
pass
def intField(*args, **kwargs):
pass
def dR_selConstraintElement(*args, **kwargs):
pass
def radioMenuItemCollection(*args, **kwargs):
pass
def insertKnotCurve(*args, **kwargs):
pass
def optionVar(*args, **kwargs):
pass
def dR_scaleTweakTool(*args, **kwargs):
pass
def volumeAxis(*args, **kwargs):
pass
def cteEditor(*args, **kwargs):
"""
Dynamic library stub function
"""
pass
def iconTextStaticLabel(*args, **kwargs):
pass
def untrim(*args, **kwargs):
pass
def floatScrollBar(*args, **kwargs):
pass
def subdMapSewMove(*args, **kwargs):
pass
def filter(*args, **kwargs):
pass
def stroke(*args, **kwargs):
pass
def nurbsCube(*args, **kwargs):
pass
def annotate(*args, **kwargs):
pass
def confirmDialog(*args, **kwargs):
pass
def angleBetween(*args, **kwargs):
pass
def nodeType(*args, **kwargs):
pass
def dR_scalePress(*args, **kwargs):
pass
def polyTransfer(*args, **kwargs):
pass
def dR_renderLastTGL(*args, **kwargs):
pass
def componentBox(*args, **kwargs):
pass
def dR_quadDrawClearDots(*args, **kwargs):
pass
def geometryConstraint(*args, **kwargs):
pass
def dR_objectHideTGL(*args, **kwargs):
pass
def dR_activeHandleXY(*args, **kwargs):
pass
def trim(*args, **kwargs):
pass
def showManipCtx(*args, **kwargs):
pass
def colorSliderGrp(*args, **kwargs):
pass
def animCurveEditor(*args, **kwargs):
pass
def promptDialog(*args, **kwargs):
pass
def file(*args, **kwargs):
pass
def palettePort(*args, **kwargs):
pass
def symbolCheckBox(*args, **kwargs):
pass
def colorIndexSliderGrp(*args, **kwargs):
pass
def commandLine(*args, **kwargs):
pass
def shelfLayout(*args, **kwargs):
pass
def shelfButton(*args, **kwargs):
pass
def offsetCurve(*args, **kwargs):
pass
def disconnectAttr(*args, **kwargs):
pass
def pluginInfo(*args, **kwargs):
pass
def menuItem(*args, **kwargs):
pass
def arrayMapper(*args, **kwargs):
pass
def shadingLightRelCtx(*args, **kwargs):
pass
def dR_moveTweakTool(*args, **kwargs):
pass
def repeatLast(*args, **kwargs):
pass
def dR_modePoly(*args, **kwargs):
pass
def dR_hypershadeTGL(*args, **kwargs):
pass
def deltaMush(*args, **kwargs):
pass
def joint(*args, **kwargs):
pass
def curveIntersect(*args, **kwargs):
pass
def dR_extrudeBevelPress(*args, **kwargs):
pass
def dR_cycleCustomCameras(*args, **kwargs):
pass
def control(*args, **kwargs):
pass
def flexor(*args, **kwargs):
pass
def setParent(*args, **kwargs):
pass
def textFieldGrp(*args, **kwargs):
pass
def commandPort(*args, **kwargs):
pass
def colorInputWidgetGrp(*args, **kwargs):
pass
def shelfTabLayout(*args, **kwargs):
pass
def deformer(*args, **kwargs):
pass
def scriptJob(*args, **kwargs):
pass
def nBase(*args, **kwargs):
pass
def sceneEditor(*args, **kwargs):
pass
def scale(*args, **kwargs):
pass
def movOut(*args, **kwargs):
pass
def listRelatives(*args, **kwargs):
pass
def rowColumnLayout(*args, **kwargs):
pass
def referenceEdit(*args, **kwargs):
pass
def dR_coordSpaceWorld(*args, **kwargs):
pass
def dR_convertSelectionToUV(*args, **kwargs):
pass
def lightList(*args, **kwargs):
pass
def layeredTexturePort(*args, **kwargs):
pass
def dR_connectRelease(*args, **kwargs):
pass
def polySphere(*args, **kwargs):
pass
def dR_bridgeRelease(*args, **kwargs):
pass
def switchTable(*args, **kwargs):
pass
def polyInstallAction(*args, **kwargs):
pass
def dR_activeHandleXYZ(*args, **kwargs):
pass
def runTimeCommand(*args, **kwargs):
pass
def polyExtrudeEdge(*args, **kwargs):
pass
def listHistory(*args, **kwargs):
pass
def cluster(*args, **kwargs):
pass
def dR_customPivotTool(*args, **kwargs):
pass
def bevel(*args, **kwargs):
pass
def messageLine(*args, **kwargs):
pass
def bakeResults(*args, **kwargs):
pass
def dR_createCameraFromView(*args, **kwargs):
pass
def listAttr(*args, **kwargs):
pass
def dR_viewRight(*args, **kwargs):
pass
def attrEnumOptionMenu(*args, **kwargs):
pass
def dR_vertSelectLocked(*args, **kwargs):
pass
def createDisplayLayer(*args, **kwargs):
pass
def hyperShade(*args, **kwargs):
pass
def polyPoke(*args, **kwargs):
pass
def polyPinUV(*args, **kwargs):
pass
def polyNormalPerVertex(*args, **kwargs):
pass
def layerButton(*args, **kwargs):
pass
def polyCone(*args, **kwargs):
pass
def polyMoveFacetUV(*args, **kwargs):
pass
def polyMergeUV(*args, **kwargs):
pass
def extendCurve(*args, **kwargs):
pass
def polySmooth(*args, **kwargs):
pass
def polyCBoolOp(*args, **kwargs):
pass
def editMetadata(*args, **kwargs):
pass
def polyBevel3(*args, **kwargs):
pass
def dR_autoWeldTGL(*args, **kwargs):
pass
def dR_softSelDistanceTypeGlobal(*args, **kwargs):
pass
def polyTorus(*args, **kwargs):
pass
def scrollLayout(*args, **kwargs):
pass
def dR_showHelp(*args, **kwargs):
pass
def polySeparate(*args, **kwargs):
pass
def cmdShell(*args, **kwargs):
pass
def dR_selectSimilar(*args, **kwargs):
pass
def character(*args, **kwargs):
pass
def dR_selectModeMarquee(*args, **kwargs):
pass
def intFieldGrp(*args, **kwargs):
pass
def formLayout(*args, **kwargs):
pass
def dR_selConstraintOff(*args, **kwargs):
pass
def quit(*args, **kwargs):
pass
def rampColorPort(*args, **kwargs):
pass
def insertKnotSurface(*args, **kwargs):
pass
def dR_selConstraintAngle(*args, **kwargs):
pass
def grid(*args, **kwargs):
pass
def boundary(*args, **kwargs):
pass
def gravity(*args, **kwargs):
pass
def ikHandle(*args, **kwargs):
pass
def transferAttributes(*args, **kwargs):
pass
def snapshot(*args, **kwargs):
pass
def flowLayout(*args, **kwargs):
pass
def polyQuad(*args, **kwargs):
pass
def flagTest(*args, **kwargs):
pass
def nurbsSquare(*args, **kwargs):
pass
def dynPaintEditor(*args, **kwargs):
pass
def polyDuplicateEdge(*args, **kwargs):
pass
def alignCurve(*args, **kwargs):
pass
def selectionConnection(*args, **kwargs):
pass
def getCmdName(inFunc):
"""
    Use in place of inFunc.__name__ when inFunc could be a maya.cmds command;
    handles stub functions.
"""
pass
def nonLinear(*args, **kwargs):
pass
def dockControl(*args, **kwargs):
pass
def aimConstraint(*args, **kwargs):
pass
def polyToSubdiv(*args, **kwargs):
pass
def dR_scaleRelease(*args, **kwargs):
pass
def adskAssetListUI(*args, **kwargs):
pass
def polyDelFacet(*args, **kwargs):
pass
def dR_rotatePress(*args, **kwargs):
pass
def polyDelEdge(*args, **kwargs):
pass
def dR_paintRelease(*args, **kwargs):
pass
def dR_objectTemplateTGL(*args, **kwargs):
pass
def polyCut(*args, **kwargs):
pass
def userCtx(*args, **kwargs):
pass
def sets(*args, **kwargs):
pass
def spotLight(*args, **kwargs):
pass
def fileBrowserDialog(*args, **kwargs):
pass
def extendSurface(*args, **kwargs):
pass
def circle(*args, **kwargs):
pass
def detachCurve(*args, **kwargs):
pass
def swatchDisplayPort(*args, **kwargs):
pass
def pasteKey(*args, **kwargs):
pass
def nexMultiCutContext(*args, **kwargs):
pass
def directionalLight(*args, **kwargs):
pass
def menuSet(*args, **kwargs):
pass
def draggerContext(*args, **kwargs):
pass
def art3dPaintCtx(*args, **kwargs):
pass
def reverseCurve(*args, **kwargs):
pass
def characterMap(*args, **kwargs):
pass
def dR_modeUV(*args, **kwargs):
pass
def namespaceInfo(*args, **kwargs):
pass
def dR_lockSelTGL(*args, **kwargs):
pass
def nameField(*args, **kwargs):
pass
def dR_increaseManipSize(*args, **kwargs):
pass
def dR_gridAllTGL(*args, **kwargs):
pass
def jointCluster(*args, **kwargs):
pass
def parent(*args, **kwargs):
pass
def dR_extrudeBevelRelease(*args, **kwargs):
pass
def dR_decreaseManipSize(*args, **kwargs):
pass
def nParticle(*args, **kwargs):
pass
def floatSliderGrp(*args, **kwargs):
pass
def connectAttr(*args, **kwargs):
pass
def attrFieldGrp(*args, **kwargs):
pass
def columnLayout(*args, **kwargs):
pass
def attrEnumOptionMenuGrp(*args, **kwargs):
pass
def drag(*args, **kwargs):
pass
def shot(*args, **kwargs):
pass
def shadingNode(*args, **kwargs):
pass
def dR_viewLeft(*args, **kwargs):
pass
def dR_viewXrayTGL(*args, **kwargs):
pass
def dR_viewGridTGL(*args, **kwargs):
pass
def attrColorSliderGrp(*args, **kwargs):
pass
def attrNavigationControlGrp(*args, **kwargs):
pass
def prepareRender(*args, **kwargs):
pass
def dR_viewBottom(*args, **kwargs):
pass
def cmdScrollFieldExecuter(*args, **kwargs):
pass
def listSets(*args, **kwargs):
pass
def rowLayout(*args, **kwargs):
pass
def sculpt(*args, **kwargs):
pass
def modelEditor(*args, **kwargs):
pass
def scriptCtx(*args, **kwargs):
pass
def dagPose(*args, **kwargs):
pass
def getAttr(*args, **kwargs):
pass
def rotate(*args, **kwargs):
pass
def dR_convertSelectionToVertex(*args, **kwargs):
pass
def polySplitEdge(*args, **kwargs):
pass
def dR_connectTool(*args, **kwargs):
pass
def layout(*args, **kwargs):
pass
def dR_bridgeTool(*args, **kwargs):
pass
def scriptedPanel(*args, **kwargs):
pass
def dR_bevelPress(*args, **kwargs):
pass
def dR_activeHandleXZ(*args, **kwargs):
pass
def polySelectConstraintMonitor(*args, **kwargs):
pass
def dR_targetWeldTool(*args, **kwargs):
pass
def pointOnPolyConstraint(*args, **kwargs):
pass
def dR_targetWeldPress(*args, **kwargs):
pass
def polyEditEdgeFlow(*args, **kwargs):
pass
def plane(*args, **kwargs):
pass
def dR_symmetryTGL(*args, **kwargs):
pass
def polyCube(*args, **kwargs):
pass
def dR_symmetrize(*args, **kwargs):
pass
def bevelPlus(*args, **kwargs):
pass
def dR_softSelStickyRelease(*args, **kwargs):
pass
def projectTangent(*args, **kwargs):
pass
def nameCommand(*args, **kwargs):
pass
def dR_softSelStickyPress(*args, **kwargs):
pass
def attributeMenu(*args, **kwargs):
pass
def dR_viewTop(*args, **kwargs):
pass
def selectKey(*args, **kwargs):
pass
def dR_softSelDistanceTypeSurface(*args, **kwargs):
pass
def layeredShaderPort(*args, **kwargs):
pass
def rigidBody(*args, **kwargs):
pass
def attachSurface(*args, **kwargs):
pass
def dR_vertUnlockAll(*args, **kwargs):
pass
def dR_timeConfigTGL(*args, **kwargs):
pass
def iconTextRadioButton(*args, **kwargs):
pass
def textureDeformer(*args, **kwargs):
pass
def iconTextButton(*args, **kwargs):
pass
def dR_activeHandleY(*args, **kwargs):
pass
def polyPrimitive(*args, **kwargs):
pass
def polyPipe(*args, **kwargs):
pass
def polyPrism(*args, **kwargs):
pass
def polyNormalizeUV(*args, **kwargs):
pass
def polyConnectComponents(*args, **kwargs):
pass
def polyMoveUV(*args, **kwargs):
pass
def polyColorPerVertex(*args, **kwargs):
pass
def simplify(*args, **kwargs):
pass
def headsUpDisplay(*args, **kwargs):
pass
def polyMergeEdge(*args, **kwargs):
pass
def hwReflectionMap(*args, **kwargs):
pass
def dR_viewJointsTGL(*args, **kwargs):
pass
def callbacks(*args, **kwargs):
pass
def hudSlider(*args, **kwargs):
pass
def hudButton(*args, **kwargs):
pass
def pointPosition(*args, **kwargs):
pass
def pointLight(*args, **kwargs):
pass
def dR_softSelDistanceTypeObject(*args, **kwargs):
pass
def particle(*args, **kwargs):
pass
def dR_setRelaxAffectsAuto(*args, **kwargs):
pass
def reference(*args, **kwargs):
pass
def dR_selectModeRaycast(*args, **kwargs):
pass
def hotkey(*args, **kwargs):
pass
def rebuildCurve(*args, **kwargs):
pass
def intScrollBar(*args, **kwargs):
pass
def hotkeyCheck(*args, **kwargs):
pass
def dR_selectAll(*args, **kwargs):
pass
def toolBar(*args, **kwargs):
pass
def dR_selConstraintBorder(*args, **kwargs):
pass
def radioButton(*args, **kwargs):
pass
def gridLayout(*args, **kwargs):
pass
def imageWindowEditor(*args, **kwargs):
pass
def vortex(*args, **kwargs):
pass
def gradientControlNoAttr(*args, **kwargs):
pass
def polyColorMod(*args, **kwargs):
pass
def polyColorDel(*args, **kwargs):
pass
def exclusiveLightCheckBox(*args, **kwargs):
pass
def textureWindow(*args, **kwargs):
pass
def floatSlider2(*args, **kwargs):
pass
def getClassification(*args, **kwargs):
pass
def move(*args, **kwargs):
pass
def nurbsCurveToBezier(*args, **kwargs):
pass
def subdCleanTopology(*args, **kwargs):
pass
def fileInfo(*args, **kwargs):
pass
def polyCloseBorder(*args, **kwargs):
pass
def nodeEditor(*args, **kwargs):
pass
def duplicate(*args, **kwargs):
pass
def spring(*args, **kwargs):
pass
def alignSurface(*args, **kwargs):
pass
def getMelRepresentation(args, recursionLimit=None, maintainDicts=True):
"""
    Returns a list containing each element of the iterable 'args' converted to a mel-friendly representation.
:Parameters:
recursionLimit : int or None
If an element of args is itself iterable, recursionLimit specifies the depth to which iterable elements
will recursively search for objects to convert; if ``recursionLimit==0``, only the elements
of args itself will be searched for PyNodes - if it is 1, iterables within args will have getMelRepresentation called
on them, etc. If recursionLimit==None, then there is no limit to recursion depth.
maintainDicts : bool
In general, all iterables will be converted to tuples in the returned copy - however, if maintainDicts==True,
then iterables for which ``util.isMapping()`` returns True will be returned as dicts.
"""
pass
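The docstring above describes a recursive traversal over iterables; the following is a self-contained sketch of that traversal (hypothetical — `str` stands in for pymel's real PyNode-to-MEL conversion, and `mel_representation_sketch` is not part of the stub API):

```python
def mel_representation_sketch(args, recursionLimit=None, maintainDicts=True):
    # Illustrative only: mirrors the recursion rules in the docstring above,
    # using str() as a stand-in for the real per-element MEL conversion.
    def descend(depth):
        return recursionLimit is None or depth < recursionLimit

    def convert(obj, depth):
        if maintainDicts and isinstance(obj, dict):
            # Mappings stay dict-shaped when maintainDicts is True.
            if descend(depth):
                return {k: convert(v, depth + 1) for k, v in obj.items()}
            return obj
        if isinstance(obj, (dict, list, tuple)):
            # Other iterables become tuples, recursing up to the limit.
            if descend(depth):
                return tuple(convert(v, depth + 1) for v in obj)
            return tuple(obj)
        return str(obj)  # stand-in for the per-element conversion

    return [convert(a, 0) for a in args]
```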
def air(*args, **kwargs):
pass
def scriptTable(*args, **kwargs):
pass
def polyMapSew(*args, **kwargs):
pass
def geomBind(*args, **kwargs):
pass
def dR_rotateRelease(*args, **kwargs):
pass
def addPP(*args, **kwargs):
pass
def camera(*args, **kwargs):
pass
def dR_pointSnapPress(*args, **kwargs):
pass
def dR_objectXrayTGL(*args, **kwargs):
pass
def polyBoolOp(*args, **kwargs):
pass
def timeControl(*args, **kwargs):
pass
def polyBlindData(*args, **kwargs):
pass
def dR_nexTool(*args, **kwargs):
pass
def dR_multiCutPress(*args, **kwargs):
pass
objectErrorReg = None
| 12.501329 | 130 | 0.641633 | 3,762 | 32,916 | 5.570175 | 0.220362 | 0.212789 | 0.419566 | 0.508661 | 0.152517 | 0.011978 | 0.006156 | 0.006156 | 0 | 0 | 0 | 0.000661 | 0.218769 | 32,916 | 2,632 | 131 | 12.506079 | 0.814233 | 0.078624 | 0 | 0.496498 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.496498 | false | 0.496498 | 0.006226 | 0 | 0.502724 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
22a55f9f496401a3d75e8f5777a4a0b0ce2719bf | 61 | py | Python | reddata/__init__.py | WMDA/social-media-analysis | 2ab7d4c1c2be1dfb8ba8820fb013ae1be2ffbc4e | [
"MIT"
] | 1 | 2020-05-25T14:10:33.000Z | 2020-05-25T14:10:33.000Z | reddata/__init__.py | WMDA/social-media-analysis | 2ab7d4c1c2be1dfb8ba8820fb013ae1be2ffbc4e | [
"MIT"
] | 18 | 2020-05-27T18:21:12.000Z | 2020-06-13T16:56:47.000Z | reddata/__init__.py | WMDA/social-media-analysis | 2ab7d4c1c2be1dfb8ba8820fb013ae1be2ffbc4e | [
"MIT"
] | null | null | null | from reddata.function import *
from reddata.reddit import *
| 15.25 | 30 | 0.786885 | 8 | 61 | 6 | 0.625 | 0.458333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147541 | 61 | 3 | 31 | 20.333333 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
22cdedf320f9a28fb9a273784eb04ae4dcade850 | 5,731 | py | Python | reconstruction/common/mytorch/loss/test_loss.py | khammernik/sigmanet | 6eb8dbd1ee350bb9baee60eb254080f7d660bbc5 | [
"MIT"
] | 50 | 2019-12-21T01:09:28.000Z | 2022-01-25T03:40:06.000Z | reconstruction/common/mytorch/loss/test_loss.py | cq615/sigmanet | 6eb8dbd1ee350bb9baee60eb254080f7d660bbc5 | [
"MIT"
] | 6 | 2020-02-12T04:34:32.000Z | 2022-03-12T00:08:55.000Z | reconstruction/common/mytorch/loss/test_loss.py | cq615/sigmanet | 6eb8dbd1ee350bb9baee60eb254080f7d660bbc5 | [
"MIT"
] | 15 | 2019-12-21T02:27:28.000Z | 2021-08-16T11:36:49.000Z | """
Copyright (c) 2019 Imperial College London.
Copyright (c) Facebook, Inc. and its affiliates.
This source code is licensed under the MIT license found in the
LICENSE file in the root directory of this source tree.
"""
import numpy as np
import pytest
import torch
import common.mytorch.loss as loss
from common import utils
from common.evaluate import mse, nmse, psnr, ssim
def create_input_pair(shape, sigma=0.1):
    input = np.arange(np.prod(shape)).reshape(shape).astype(float)
input /= np.max(input)
input2 = input + np.random.normal(0, sigma, input.shape)
input = torch.from_numpy(input).float()
input2 = torch.from_numpy(input2).float()
return input, input2
@pytest.mark.parametrize('shape', [
[10, 32, 32],
[10, 64, 64],
])
def test_psnr(shape):
input, input2 = create_input_pair(shape, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = loss.psnr(input, input2, batch=False).item()
err_numpy = psnr(input_numpy, input2_numpy)
assert np.allclose(err, err_numpy)
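The PSNR tests compare `loss.psnr` against the numpy reference in `common.evaluate`. As a minimal sketch of the quantity under test (the exact data-range convention of `common.evaluate.psnr` is an assumption here — the ground truth's peak value is used):

```python
import numpy as np

def psnr_sketch(gt, pred):
    # PSNR = 10 * log10(data_range**2 / MSE); using the ground truth's
    # peak value as the data range is an assumption mirroring common
    # fastMRI-style evaluation code.
    mse_val = np.mean((gt - pred) ** 2)
    return 10 * np.log10(gt.max() ** 2 / mse_val)
```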
def test_psnr_batch():
shape4d = [4, 6, 32, 32]
input, input2 = create_input_pair(shape4d, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = loss.psnr(input, input2, batch=True).item()
err_numpy = 0
for i in range(shape4d[0]):
err_curr = psnr(input_numpy[i], input2_numpy[i])
err_numpy += err_curr
err_numpy /= shape4d[0]
assert np.allclose(err, err_numpy)
shape5d = [4, 6, 1, 32, 32]
input, input2 = create_input_pair(shape5d, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
    err = loss.psnr(input, input2).item()
err_numpy = 0
for i in range(shape5d[0]):
err_numpy += psnr(input_numpy[i][:,0], input2_numpy[i][:,0])
err_numpy /= shape5d[0]
assert np.allclose(err, err_numpy)
@pytest.mark.parametrize('shape', [
[5, 320, 320],
[10, 64, 64],
])
def test_ssim(shape):
input, input2 = create_input_pair(shape, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
torch_ssim = loss.SSIM(win_size=7, device='cpu')
err = torch_ssim(input.unsqueeze(1), input2.unsqueeze(1)).item()
err_numpy = ssim(input_numpy, input2_numpy)
assert abs(err - err_numpy) < 1e-4
def test_ssim_batch():
shape4d = [4, 6, 96, 96]
input, input2 = create_input_pair(shape4d, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
torch_ssim = loss.SSIM(win_size=7, device='cpu')
    # one data-range entry per reshaped slice: batch i's max repeated for
    # each of its 6 slices, keeping slice/range alignment after the reshape
    data_range = input.view(4, -1).max(1)[0].repeat_interleave(6)
err = torch_ssim(
input.reshape(24, 1, 96, 96),
input2.reshape(24, 1, 96, 96),
data_range=data_range,
).item()
err_numpy = 0
for i in range(shape4d[0]):
err_curr = ssim(input_numpy[i], input2_numpy[i])
err_numpy += err_curr
err_numpy /= shape4d[0]
assert abs(err - err_numpy) < 1e-4
@pytest.mark.parametrize('shape', [
[10, 32, 32],
[10, 64, 64],
[4, 6, 32, 32],
[4, 6, 1, 32, 32],
])
def test_mse(shape):
input, input2 = create_input_pair(shape, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = torch.nn.functional.mse_loss(input, input2).item()
err_numpy = mse(input_numpy, input2_numpy)
assert np.allclose(err, err_numpy)
def test_mse_batch():
shape4d = [4, 6, 32, 32]
input, input2 = create_input_pair(shape4d, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = torch.nn.functional.mse_loss(input, input2).item()
err_numpy = 0
for i in range(shape4d[0]):
err_curr = mse(input_numpy[i], input2_numpy[i])
err_numpy += err_curr
err_numpy /= shape4d[0]
assert np.allclose(err, err_numpy)
shape5d = [4, 6, 1, 32, 32]
input, input2 = create_input_pair(shape5d, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = torch.nn.functional.mse_loss(input, input2).item()
err_numpy = 0
for i in range(shape5d[0]):
err_numpy += mse(input_numpy[i][:,0], input2_numpy[i][:,0])
err_numpy /= shape5d[0]
assert np.allclose(err, err_numpy)
@pytest.mark.parametrize('shape', [
[10, 32, 32],
[10, 64, 64],
[4, 6, 32, 32],
[4, 6, 1, 32, 32],
])
def test_nmse(shape):
input, input2 = create_input_pair(shape, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = loss.nmse(input, input2, batch=False).item()
err_numpy = nmse(input_numpy, input2_numpy)
assert np.allclose(err, err_numpy)
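For reference, the normalized MSE that the last two tests exercise reduces to a one-liner; normalizing by the ground truth's energy is the usual convention and is assumed here to match `common.evaluate.nmse`:

```python
import numpy as np

def nmse_sketch(gt, pred):
    # Normalized MSE: squared-error energy relative to the ground truth's
    # energy, i.e. ||gt - pred||^2 / ||gt||^2.
    return np.linalg.norm(gt - pred) ** 2 / np.linalg.norm(gt) ** 2
```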
def test_nmse_batch():
shape4d = [4, 6, 32, 32]
input, input2 = create_input_pair(shape4d, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = loss.nmse(input, input2).item()
err_numpy = 0
for i in range(shape4d[0]):
err_curr = nmse(input_numpy[i], input2_numpy[i])
err_numpy += err_curr
err_numpy /= shape4d[0]
assert np.allclose(err, err_numpy)
shape5d = [4, 6, 1, 32, 32]
input, input2 = create_input_pair(shape5d, sigma=0.1)
input_numpy = utils.torch_to_numpy(input)
input2_numpy = utils.torch_to_numpy(input2)
err = loss.nmse(input, input2).item()
err_numpy = 0
for i in range(shape5d[0]):
err_numpy += nmse(input_numpy[i][:,0], input2_numpy[i][:,0])
err_numpy /= shape5d[0]
assert np.allclose(err, err_numpy)
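The tests above compare `torch.nn.functional.mse_loss` and the project's `loss.nmse` against NumPy helpers `mse` and `nmse` defined elsewhere in the suite. As context, here is a minimal self-contained NumPy sketch of what such reference helpers typically look like; the function names mirror the test code, but the exact normalization used by the repo's `nmse` is an assumption (total squared error divided by the reference signal's energy is one common convention).

```python
import numpy as np


def mse(x, y):
    # Mean squared error over all elements.
    return np.mean((x - y) ** 2)


def nmse(x, y):
    # Normalized MSE: total squared error divided by the energy of the
    # reference signal y (assumed convention; the repo's helper may differ).
    return np.sum(np.abs(x - y) ** 2) / np.sum(np.abs(y) ** 2)


x = np.full((4, 8, 8), 2.0)
y = np.full((4, 8, 8), 1.0)
print(mse(x, y))   # 1.0
print(nmse(x, y))  # 1.0
```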
# file: census_similarity/commands/__init__.py (18F/census-similarity, CC0-1.0)
import logging
from .cluster_by_field import cluster_by_field # noqa
from .group_by import group_by # noqa
from .lookup import lookup # noqa
logging.basicConfig(level=logging.DEBUG)
# file: medimodule/Liver/__init__.py (daeun02/MI2RLNet, Apache-2.0)
from .module import LiverSegmentation
# file: base/test-show-scope/func-args-4.py (jpolitz/lambda-py-paper, Apache-2.0)
x = 7


def f(y):
    return x
# file: tests/unit/test_client.py (charlesvardeman/smartcontainers_cfv, Apache-2.0)
import pytest
import os
# file: recognition/ocr/angle/__init__.py (kalenforn/MMVA, MIT)
from .angleHandler import AngleHandler, LABEL_MAP_DICT, ROTAE_MAP_DICT
# file: Modulo10/operaciones.py (chemross/practica3, Apache-2.0)
def suma(a, b):
    return a + b


def resta(a, b):
    return a - b


def multiplicacion(a, b):
    return a * b


def division(a, b):
    return a / b
# file: tests/swig/python/model/test_subenvironment.py (mondus/FLAMEGPU2_dev, MIT)
import pytest
from unittest import TestCase
from pyflamegpu import *
class ExitAlways(pyflamegpu.HostFunctionConditionCallback):
    def __init__(self):
        super().__init__()

    def run(self, FLAMEGPU):
        return pyflamegpu.EXIT


class SubEnvironmentDescriptionTest(TestCase):

    def test_InvalidNames(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newPropertyFloat("a", 0)
        m2.Environment().newPropertyArrayFloat("a2", [0] * 2)
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newPropertyFloat("b", 0)
        m.Environment().newPropertyArrayFloat("b2", [0] * 2)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("c", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("c", "b2")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a", "c")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a2", "c")
        senv.mapProperty("a2", "b2")
        senv.mapProperty("a", "b")

    def test_TypesDoNotMatch(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newPropertyFloat("a", 0)
        m2.Environment().newPropertyArrayInt("a2", [0] * 2)
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newPropertyUInt("b", 0)
        m.Environment().newPropertyArrayFloat("b2", [0] * 2)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a2", "b2")

    def test_ElementsDoNotMatch(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newPropertyFloat("a", 0)
        m2.Environment().newPropertyArrayFloat("a2", [0] * 2)
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newPropertyFloat("b", 0)
        m.Environment().newPropertyArrayFloat("b2", [0] * 2)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a", "b2")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a2", "b")
        senv.mapProperty("a2", "b2")
        senv.mapProperty("a", "b")

    def test_IsConstWrong(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newPropertyFloat("a", 0)
        m2.Environment().newPropertyArrayFloat("a2", [0] * 2)
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newPropertyFloat("b", 0, True)
        m.Environment().newPropertyArrayFloat("b2", [0] * 2, True)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a2", "b2")

    def test_AlreadyBound(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newPropertyFloat("a", 0)
        m2.Environment().newPropertyArrayFloat("a2", [0.0] * 2)
        m2.Environment().newPropertyFloat("a_", 0)
        m2.Environment().newPropertyArrayFloat("a2_", [0] * 2)
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newPropertyFloat("b", 0)
        m.Environment().newPropertyArrayFloat("b2", [0] * 2)
        m.Environment().newPropertyFloat("b_", 0)
        m.Environment().newPropertyArrayFloat("b2_", [0] * 2)
        # Missing exit condition
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        senv.mapProperty("a", "b")
        senv.mapProperty("a2", "b2")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a", "b_")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a2", "b2_")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a_", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapProperty("a2_", "b2")

    def test_Macro_InvalidNames(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newMacroPropertyFloat("a")
        m2.Environment().newMacroPropertyFloat("a2", 2, 3, 4, 5)
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newMacroPropertyFloat("b")
        m.Environment().newMacroPropertyFloat("b2", 2, 3, 4, 5)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("c", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("c", "b2")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a", "c")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a2", "c")
        senv.mapMacroProperty("a2", "b2")
        senv.mapMacroProperty("a", "b")

    def test_Macro_TypesDoNotMatch(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newMacroPropertyFloat("a")
        m2.Environment().newMacroPropertyInt("a2")
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newMacroPropertyUInt("b")
        m.Environment().newMacroPropertyFloat("b2", 2, 3, 4)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a2", "b2")

    def test_Macro_DimensionsDoNotMatch(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newMacroPropertyFloat("a", 4, 3, 2, 1)
        m2.Environment().newMacroPropertyFloat("a2", 1, 2, 3, 4)
        m2.Environment().newMacroPropertyFloat("a3", 1, 2, 3)
        m2.Environment().newMacroPropertyFloat("a4", 2, 3, 4)
        m2.Environment().newMacroPropertyFloat("a5")
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newMacroPropertyFloat("b", 4, 3, 2, 1)
        m.Environment().newMacroPropertyFloat("b2", 1, 2, 3, 4)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a", "b2")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a", "b3")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a", "b4")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a", "b5")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a2", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a3", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a4", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a5", "b")
        senv.mapMacroProperty("a2", "b2")
        senv.mapMacroProperty("a", "b")

    def test_Macro_AlreadyBound(self):
        m2 = pyflamegpu.ModelDescription("sub")
        # Define SubModel
        exitcdn = ExitAlways()
        m2.addExitConditionCallback(exitcdn)
        m2.Environment().newMacroPropertyFloat("a")
        m2.Environment().newMacroPropertyFloat("a2", 2)
        m2.Environment().newMacroPropertyFloat("a_")
        m2.Environment().newMacroPropertyFloat("a2_", 2)
        m = pyflamegpu.ModelDescription("host")
        # Define Model
        m.Environment().newMacroPropertyFloat("b")
        m.Environment().newMacroPropertyFloat("b2", 2)
        m.Environment().newMacroPropertyFloat("b_")
        m.Environment().newMacroPropertyFloat("b2_", 2)
        sm = m.newSubModel("sub", m2)
        senv = sm.SubEnvironment()
        senv.mapMacroProperty("a", "b")
        senv.mapMacroProperty("a2", "b2")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a", "b_")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a2", "b2_")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a_", "b")
        with pytest.raises(pyflamegpu.FLAMEGPURuntimeException) as e:  # exception::InvalidEnvProperty exception
            senv.mapMacroProperty("a2_", "b2")
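Every negative case in the tests above relies on `pytest.raises` as a context manager: the block passes only if the expected exception is raised inside it. A minimal self-contained illustration of the pattern (the `divide` helper is hypothetical, chosen only to trigger a standard exception):

```python
import pytest


def divide(a, b):
    # Hypothetical helper used only to demonstrate the pattern.
    return a / b


def test_divide_by_zero():
    # pytest.raises fails the test if no ZeroDivisionError occurs inside
    # the block; excinfo exposes the captured exception afterwards.
    with pytest.raises(ZeroDivisionError) as excinfo:
        divide(1, 0)
    assert "division" in str(excinfo.value)
```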
# file: restraintlib/lib/deoxyribose_pyrimidine.py (mkowiel/restraintlib, BSD-3-Clause)
DEOXYRIBOSE_PYRIMIDINE_PDB_CODES = ['DC', 'DT', 'DU']
DEOXYRIBOSE_PYRIMIDINE_ALL_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_CHI_GAMMA_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_CHI_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_BASE_FUNC_OF_TORSION_CHI_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_CONFORMATION_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_SUGAR_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_CHI_CONFORMATION_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONFORMATION_FUNC_OF_TAU_MAX_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_GAMMA_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_ALL_FUNC_OF_TORSION_CHI_PDB_CODES = DEOXYRIBOSE_PYRIMIDINE_PDB_CODES
DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES = {
    "C1'": "C1'",
    "C1*": "C1'",
    "C2": "C2",
    "C2'": "C2'",
    "C2*": "C2'",
    "C3'": "C3'",
    "C3*": "C3'",
    "C4'": "C4'",
    "C4*": "C4'",
    "C5'": "C5'",
    "C5*": "C5'",
    "C6": "C6",
    "N1": "N1",
    "O3'": "O3'",
    "O3*": "O3'",
    "O4'": "O4'",
    "O4*": "O4'",
    "O5'": "O5'",
    "O5*": "O5'",
    "P": "P"
}
DEOXYRIBOSE_PYRIMIDINE_ALL_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_CHI_GAMMA_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_CHI_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_BASE_FUNC_OF_TORSION_CHI_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_CONFORMATION_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_SUGAR_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_CHI_CONFORMATION_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONFORMATION_FUNC_OF_TAU_MAX_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_GAMMA_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_ALL_FUNC_OF_TORSION_CHI_ATOM_NAMES = DEOXYRIBOSE_PYRIMIDINE_ATOM_NAMES
DEOXYRIBOSE_PYRIMIDINE_ATOM_RES = {
    "C1'": 0,
    "C2": 0,
    "C2'": 0,
    "C3'": 0,
    "C4'": 0,
    "C5'": 0,
    "C6": 0,
    "N1": 0,
    "O3'": 0,
    "O4'": 0,
    "O5'": 0
}
DEOXYRIBOSE_PYRIMIDINE_ALL_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_CHI_GAMMA_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_CHI_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_BASE_FUNC_OF_TORSION_CHI_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_CONFORMATION_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_SUGAR_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_CHI_CONFORMATION_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONFORMATION_FUNC_OF_TAU_MAX_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_GAMMA_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_ALL_FUNC_OF_TORSION_CHI_ATOM_RES = DEOXYRIBOSE_PYRIMIDINE_ATOM_RES
DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION = [
    ("C1'", "C2'", 2.0, 0, 0),
    ("C2'", "C3'", 2.0, 0, 0),
    ("C3'", "C4'", 2.0, 0, 0),
    ("C4'", "O4'", 2.0, 0, 0),
    ("C1'", "O4'", 2.0, 0, 0),
    ("C3'", "O3'", 2.0, 0, 0),
    ("C4'", "C5'", 2.0, 0, 0),
    ("C5'", "O5'", 2.0, 0, 0),
    ("C1'", 'N1', 2.0, 0, 0),
    ("O5'", 'P', 2.5, 0, 0),
    ("O3'", 'P', 2.5, 0, 1)
]
DEOXYRIBOSE_PYRIMIDINE_ALL_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_CHI_GAMMA_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_CHI_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_BASE_FUNC_OF_TORSION_CHI_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_CONFORMATION_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_SUGAR_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_CHI_CONFORMATION_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONFORMATION_FUNC_OF_TAU_MAX_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_GAMMA_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_ALL_FUNC_OF_TORSION_CHI_REQUIRED_CONDITION = DEOXYRIBOSE_PYRIMIDINE_REQUIRED_CONDITION
DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE = {
    'measure': 'euclidean_angles',
    'restraint_names': ["aC4'C5'O5'", "aC4'C3'O3'", "aN1C1'C2'", "aC1'N1C2", "aC1'N1C6", "aN1C1'O4'", "aC2'C1'O4'", "aC2'C3'O3'", "aC1'C2'C3'", "aC2'C3'C4'", "aC3'C4'O4'", "aC1'O4'C4'", "aC3'C4'C5'", "aC5'C4'O4'"]
}
DEOXYRIBOSE_PYRIMIDINE_ALL_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CHI_GAMMA_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CHI_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_BASE_FUNC_OF_TORSION_CHI_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CONFORMATION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_SUGAR_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CHI_CONFORMATION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONFORMATION_FUNC_OF_TAU_MAX_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_GAMMA_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_ALL_FUNC_OF_TORSION_CHI_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE = {
    'measure': 'euclidean_angles',
    'restraint_names': ["tO4'C1'N1C2", "tC3'C4'C5'O5'", "pC1'C2'C3'C4'O4'"]
}
DEOXYRIBOSE_PYRIMIDINE_ALL_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CHI_GAMMA_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CHI_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_BASE_FUNC_OF_TORSION_CHI_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CONFORMATION_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_CHI_CONFORMATION_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONFORMATION_FUNC_OF_TAU_MAX_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_GAMMA_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_ALL_FUNC_OF_TORSION_CHI_CONDITION_DISTANCE_MEASURE = DEOXYRIBOSE_PYRIMIDINE_CONDITION_DISTANCE_MEASURE
DEOXYRIBOSE_PYRIMIDINE_ALL_RESTRAINTS = [
    {
        'conditions': [],
        'name': 'deoxyribose_pyrimidine==All=All',
        'restraints': [['dist', "dC1'C2'", ["C1'", "C2'"], 1.525, 0.012], ['dist', "dC2'C3'", ["C2'", "C3'"], 1.523, 0.011]]
    }
]
DEOXYRIBOSE_PYRIMIDINE_CHI_GAMMA_RESTRAINTS = [
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 180, 22.5], ['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], 60, 8.75]],
'name': 'deoxyribose_pyrimidine==Chi=anti__Gamma=gauche+',
'restraints': [['angle', "aC4'C5'O5'", ["C4'", "C5'", "O5'"], 110.6, 1.9]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 180, 22.5], ['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], -60, 8.75]],
'name': 'deoxyribose_pyrimidine==Chi=anti__Gamma=gauche-',
'restraints': [['angle', "aC4'C5'O5'", ["C4'", "C5'", "O5'"], 109.6, 1.8]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 180, 22.5], ['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], 180, 21.25]],
'name': 'deoxyribose_pyrimidine==Chi=anti__Gamma=trans',
'restraints': [['angle', "aC4'C5'O5'", ["C4'", "C5'", "O5'"], 110.2, 1.9]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 0, 22.5], ['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], 60, 8.75]],
'name': 'deoxyribose_pyrimidine==Chi=syn__Gamma=gauche+',
'restraints': [['angle', "aC4'C5'O5'", ["C4'", "C5'", "O5'"], 112.5, 1.9]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 0, 22.5], ['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], -60, 8.75]],
'name': 'deoxyribose_pyrimidine==Chi=syn__Gamma=gauche-',
'restraints': [['angle', "aC4'C5'O5'", ["C4'", "C5'", "O5'"], 111.0, 0.9]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 0, 22.5], ['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], 180, 21.25]],
'name': 'deoxyribose_pyrimidine==Chi=syn__Gamma=trans',
'restraints': [['angle', "aC4'C5'O5'", ["C4'", "C5'", "O5'"], 110.5, 2.3]]
}
]
DEOXYRIBOSE_PYRIMIDINE_CHI_RESTRAINTS = [
    {
        'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 180, 22.5]],
        'name': 'deoxyribose_pyrimidine==Chi=anti',
        'restraints': [['angle', "aC4'C3'O3'", ["C4'", "C3'", "O3'"], 110.7, 2.3]]
    },
    {
        'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 0, 22.5]],
        'name': 'deoxyribose_pyrimidine==Chi=syn',
        'restraints': [['angle', "aC4'C3'O3'", ["C4'", "C3'", "O3'"], 109.8, 2.1]]
    }
]
DEOXYRIBOSE_PYRIMIDINE_BASE_FUNC_OF_TORSION_CHI_RESTRAINTS = [
{
'conditions': [],
'name': 'deoxyribose_pyrimidine==Base=pyrimidine',
'restraints': [ ['angle', "aN1C1'C2'", ['N1', "C1'", "C2'"], None, None, None, None, "pyrimidine-N1-C1'-C2' or N9-C1'-C2'.pickle", ['torsion_chi', ["O4'", "C1'", 'N1', 'C2']]],
['angle', "aC1'N1C2", ["C1'", 'N1', 'C2'], None, None, None, None, "pyrimidine-C1'-N1-C2 or C1'-N9-C4.pickle", ['torsion_chi', ["O4'", "C1'", 'N1', 'C2']]],
['angle', "aC1'N1C6", ["C1'", 'N1', 'C6'], None, None, None, None, "pyrimidine-C1'-N1-C6 or C1'-N9-C8.pickle", ['torsion_chi', ["O4'", "C1'", 'N1', 'C2']]],
['angle', "aN1C1'O4'", ['N1', "C1'", "O4'"], None, None, None, None, "pyrimidine-N1-C1'-O4' or N9-C1'-O4'.pickle", ['torsion_chi', ["O4'", "C1'", 'N1', 'C2']]]]
}
]
DEOXYRIBOSE_PYRIMIDINE_CONFORMATION_RESTRAINTS = [
{
'conditions': [['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 162, 4.5]],
'name': "deoxyribose_pyrimidine==Conformation=C2'-endo",
'restraints': [['dist', "dC3'C4'", ["C3'", "C4'"], 1.527, 0.01], ['angle', "aC2'C1'O4'", ["C2'", "C1'", "O4'"], 106.0, 0.8]]
},
{
'conditions': [['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 18, 4.5]],
'name': "deoxyribose_pyrimidine==Conformation=C3'-endo",
'restraints': [['dist', "dC3'C4'", ["C3'", "C4'"], 1.52, 0.009], ['angle', "aC2'C1'O4'", ["C2'", "C1'", "O4'"], 107.3, 0.6]]
},
{
'conditions': [],
'name': 'deoxyribose_pyrimidine==Conformation=Other',
'restraints': [['dist', "dC3'C4'", ["C3'", "C4'"], 1.531, 0.009], ['angle', "aC2'C1'O4'", ["C2'", "C1'", "O4'"], 106.2, 1.3]]
}
]
DEOXYRIBOSE_PYRIMIDINE_SUGAR_RESTRAINTS = [
    {
        'conditions': [],
        'name': 'deoxyribose_pyrimidine==Sugar=deoxyribose',
        'restraints': [['dist', "dC4'O4'", ["C4'", "O4'"], 1.445, 0.009]]
    }
]
DEOXYRIBOSE_PYRIMIDINE_CHI_CONFORMATION_RESTRAINTS = [
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 180, 22.5], ['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 162, 4.5]],
'name': "deoxyribose_pyrimidine==Chi=anti__Conformation=C2'-endo",
'restraints': [['angle', "aC2'C3'O3'", ["C2'", "C3'", "O3'"], 109.4, 2.4]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 180, 22.5], ['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 18, 4.5]],
'name': "deoxyribose_pyrimidine==Chi=anti__Conformation=C3'-endo",
'restraints': [['angle', "aC2'C3'O3'", ["C2'", "C3'", "O3'"], 113.4, 2.1]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 180, 22.5]],
'name': 'deoxyribose_pyrimidine==Chi=anti__Conformation=Other',
'restraints': [['angle', "aC2'C3'O3'", ["C2'", "C3'", "O3'"], 111.9, 2.5]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 0, 22.5], ['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 162, 4.5]],
'name': "deoxyribose_pyrimidine==Chi=syn__Conformation=C2'-endo",
'restraints': [['angle', "aC2'C3'O3'", ["C2'", "C3'", "O3'"], 110.1, 2.2]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 0, 22.5], ['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 18, 4.5]],
'name': "deoxyribose_pyrimidine==Chi=syn__Conformation=C3'-endo",
'restraints': [['angle', "aC2'C3'O3'", ["C2'", "C3'", "O3'"], 114.2, 0.9]]
},
{
'conditions': [['torsion', "tO4'C1'N1C2", ["O4'", "C1'", 'N1', 'C2'], 0, 22.5]],
'name': 'deoxyribose_pyrimidine==Chi=syn__Conformation=Other',
'restraints': [['angle', "aC2'C3'O3'", ["C2'", "C3'", "O3'"], 113.0, 1.7]]
}
]
DEOXYRIBOSE_PYRIMIDINE_SUGAR_CONFORMATION_FUNC_OF_TAU_MAX_RESTRAINTS = [
{
'conditions': [['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 162, 4.5]],
'name': "deoxyribose_pyrimidine==Sugar=deoxyribose__Conformation=C2'-endo",
'restraints': [ ['angle', "aC1'C2'C3'", ["C1'", "C2'", "C3'"], None, None, None, None, "deoxyribose-C2'-endo-C1'-C2'-C3'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC2'C3'C4'", ["C2'", "C3'", "C4'"], None, None, None, None, "deoxyribose-C2'-endo-C2'-C3'-C4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC3'C4'O4'", ["C3'", "C4'", "O4'"], None, None, None, None, "deoxyribose-C2'-endo-C3'-C4'-O4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC1'O4'C4'", ["C1'", "O4'", "C4'"], None, None, None, None, "deoxyribose-C2'-endo-C1'-O4'-C4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]]]
},
{
'conditions': [['pseudorotation', "pC1'C2'C3'C4'O4'", ["C1'", "C2'", "C3'", "C4'", "O4'"], 18, 4.5]],
'name': "deoxyribose_pyrimidine==Sugar=deoxyribose__Conformation=C3'-endo",
'restraints': [ ['angle', "aC1'C2'C3'", ["C1'", "C2'", "C3'"], None, None, None, None, "deoxyribose-C3'-endo-C1'-C2'-C3'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC2'C3'C4'", ["C2'", "C3'", "C4'"], None, None, None, None, "deoxyribose-C3'-endo-C2'-C3'-C4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC3'C4'O4'", ["C3'", "C4'", "O4'"], None, None, None, None, "deoxyribose-C3'-endo-C3'-C4'-O4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC1'O4'C4'", ["C1'", "O4'", "C4'"], None, None, None, None, "deoxyribose-C3'-endo-C1'-O4'-C4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]]]
},
{
'conditions': [],
'name': 'deoxyribose_pyrimidine==Sugar=deoxyribose__Conformation=Other',
'restraints': [ ['angle', "aC1'C2'C3'", ["C1'", "C2'", "C3'"], None, None, None, None, "deoxyribose-Other-C1'-C2'-C3'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC2'C3'C4'", ["C2'", "C3'", "C4'"], None, None, None, None, "deoxyribose-Other-C2'-C3'-C4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC3'C4'O4'", ["C3'", "C4'", "O4'"], None, None, None, None, "deoxyribose-Other-C3'-C4'-O4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]],
['angle', "aC1'O4'C4'", ["C1'", "O4'", "C4'"], None, None, None, None, "deoxyribose-Other-C1'-O4'-C4'.pickle", ['tau_max', ["C1'", "C2'", "C3'", "C4'", "O4'"]]]]
}
]
DEOXYRIBOSE_PYRIMIDINE_GAMMA_RESTRAINTS = [
{
'conditions': [['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], 60, 8.75]],
'name': 'deoxyribose_pyrimidine==Gamma=gauche+',
'restraints': [['dist', "dC4'C5'", ["C4'", "C5'"], 1.508, 0.009], ['angle', "aC3'C4'C5'", ["C3'", "C4'", "C5'"], 115.7, 1.2], ['angle', "aC5'C4'O4'", ["C5'", "C4'", "O4'"], 109.4, 1.0]]
},
{
'conditions': [['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], -60, 8.75]],
'name': 'deoxyribose_pyrimidine==Gamma=gauche-',
'restraints': [['dist', "dC4'C5'", ["C4'", "C5'"], 1.518, 0.009], ['angle', "aC3'C4'C5'", ["C3'", "C4'", "C5'"], 114.5, 1.2], ['angle', "aC5'C4'O4'", ["C5'", "C4'", "O4'"], 107.8, 0.9]]
},
{
'conditions': [['torsion', "tC3'C4'C5'O5'", ["C3'", "C4'", "C5'", "O5'"], 180, 21.25]],
'name': 'deoxyribose_pyrimidine==Gamma=trans',
'restraints': [['dist', "dC4'C5'", ["C4'", "C5'"], 1.509, 0.01], ['angle', "aC3'C4'C5'", ["C3'", "C4'", "C5'"], 113.8, 1.3], ['angle', "aC5'C4'O4'", ["C5'", "C4'", "O4'"], 109.9, 1.2]]
}
]
DEOXYRIBOSE_PYRIMIDINE_ALL_FUNC_OF_TORSION_CHI_RESTRAINTS = [
{
'conditions': [],
'name': 'deoxyribose_pyrimidine==All=All',
'restraints': [ ['dist', "dC1'N1", ["C1'", 'N1'], None, None, None, None, "All-C1'-N1 or C1'-N9.pickle", ['torsion_chi', ["O4'", "C1'", 'N1', 'C2']]],
['dist', "dC1'O4'", ["C1'", "O4'"], None, None, None, None, "All-C1'-O4'.pickle", ['torsion_chi', ["O4'", "C1'", 'N1', 'C2']]]]
}
] | 62.244898 | 213 | 0.621475 | 2,293 | 18,300 | 4.666376 | 0.048408 | 0.319907 | 0.097196 | 0.134579 | 0.91 | 0.90028 | 0.877664 | 0.84 | 0.77486 | 0.578037 | 0 | 0.080126 | 0.146831 | 18,300 | 294 | 214 | 62.244898 | 0.605201 | 0 | 0 | 0.050179 | 0 | 0 | 0.302716 | 0.094476 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
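The restraint records above pair optional `conditions` (torsion or pseudorotation angles with a target and tolerance) with the `restraints` to apply; entries with empty conditions act as catch-all fallbacks. A minimal, self-contained sketch of how such a record might be selected; the circular-tolerance matching below is an assumption for illustration, not taken from this library:

```python
def angle_matches(value, target, tolerance):
    # circular difference so e.g. 179 and -179 are 2 degrees apart
    diff = abs(value - target) % 360
    return min(diff, 360 - diff) <= tolerance

def select_record(records, measured):
    # return the first record whose conditions all hold;
    # a record with an empty condition list always matches
    for record in records:
        if all(angle_matches(measured[name], target, tol)
               for _kind, name, _atoms, target, tol in record["conditions"]):
            return record
    return None

records = [
    {"conditions": [["torsion", "chi", [], 180, 22.5]], "name": "anti"},
    {"conditions": [], "name": "Other"},
]
print(select_record(records, {"chi": 170})["name"])  # anti
print(select_record(records, {"chi": 60})["name"])   # Other
```

This mirrors the shape of the data only; the real library may weight or combine conditions differently.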
601f422c28e56eae0d42060f9c306c76ba639ddc | 5,749 | py | Python | tests/test_connectors.py | RA333/cert-issuer | ea1d4c4697ffc336f11e491685e7cb31e6515fd6 | [
"MIT"
] | null | null | null | tests/test_connectors.py | RA333/cert-issuer | ea1d4c4697ffc336f11e491685e7cb31e6515fd6 | [
"MIT"
] | null | null | null | tests/test_connectors.py | RA333/cert-issuer | ea1d4c4697ffc336f11e491685e7cb31e6515fd6 | [
"MIT"
] | null | null | null | import unittest
import bitcoin
from bitcoin.core import COutPoint, lx, x, CScript
from bitcoin.core.script import OP_EQUALVERIFY, OP_CHECKSIG, OP_DUP, OP_HASH160
from bitcoin.wallet import P2PKHBitcoinAddress
from mock import patch
from cert_issuer.connectors import BitcoindConnector
from cert_issuer.helpers import hexlify
TESTNET_TX = '010000000137e6a590428144e64cf008beb6e3193efee5a1a4ddfbbd48d10a12025b88c23c00000000fd5d0100473044022024959a1439e7e364c32f012a7e46dfa2d8cfa036ccdf230e9b3642fb9cdd4341022048292d0dbed226fadeae36b20627b50a3351456f164cae2923dba897995843c701483045022100e6dbcfb4ae35322e5c05688a6afcb144ab347654c217c9a3b2e963c2447418e702205cb639b549c7a9eace7d59ff2ce7c23167d60a214e6c9c011ce93317063850de014cc95241048aa0d470b7a9328889c84ef0291ed30346986e22558e80c3ae06199391eae21308a00cdcfb34febc0ea9c80dfd16b01f26c7ec67593cb8ab474aca8fa1d7029d4104cf54956634c4d0bdaf00e6b1871c089b7a892d0fecc077f03b91e8d4d146861b0a4fdd237891a9819c878984d4b123f6fe92d9bbc05873a1bb4fe510145bf369410471843c33b2971e4944c73d4500abd6f61f7edf9ec919c408cbe12a6c9132d2cb8ebed8253322760d5ec6081165e0ab68900683de503f1544f03816d47fec699a53aeffffffff09d29e91010000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f8727ed19190000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f874eda33320000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f879db467640000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f87a43d23030000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f87497b46060000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f87d29e91010000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f8793f68c0c0000000017a9145629021f7668d4ec310ac5e99701a6d6cf95eb8f8716400e00000000001976a9146efcf883b4b6f9997be9a0600f6c095fe2bd2d9288ac00000000'
MAINNET_TX = '0100000001ce379123234bc9662f3f00f2a9c59d5420fc9f9d5e1fd8881b8666e8c9def133000000006a473044022032d2d9c2a67d90eb5ea32d9a5e935b46080d4c62a1d53265555c78775e8f6f2102205c3469593995b9b76f8d24aa4285a50b72ca71661ca021cd219883f1a8f14abe012103704cf7aa5e4152639617d0b3f8bcd302e231bbda13b468cba1b12aa7be14f3b3ffffffff07be0a0000000000001976a91464799d48941b0fbfdb4a7ee6340840fb2eb5c2c388acbe0a0000000000001976a914c615ecb52f6e877df0621f4b36bdb25410ec22c388acbe0a0000000000001976a9144e9862ff1c4041b7d083fe30cf5f68f7bedb321b88acbe0a0000000000001976a914413df7bf4a41f2e8a1366fcf7352885e6c88964b88acbe0a0000000000001976a914fabc1ff527531581b4a4c58f13bd088e274122bc88acbb810000000000001976a914fcbe34aa288a91eab1f0fe93353997ec6aa3594088ac0000000000000000226a2068f3ede17fdb67ffd4a5164b5687a71f9fbb68da803b803935720f2aa38f772800000000'
def mock_listunspent(self, addrs):
print('mock listunspent')
output1 = {'outpoint': COutPoint(lx('34eb81bc0d1a822369f75174fd4916b1ec490d8fbcba33168e820cc78a52f608'), 0),
'confirmations': 62952, 'address': P2PKHBitcoinAddress('mz7poFND7hVGRtPWjiZizcCnjf6wEDWjjT'),
'spendable': False, 'amount': 49000000, 'solvable': False, 'scriptPubKey': CScript(
[OP_DUP, OP_HASH160, x('cc0a909c4c83068be8b45d69b60a6f09c2be0fda'), OP_EQUALVERIFY, OP_CHECKSIG]),
'account': ''}
output2 = {'address': P2PKHBitcoinAddress('mz7poFND7hVGRtPWjiZizcCnjf6wEDWjjT'), 'amount': 2750, 'account': '',
'spendable': False, 'solvable': False, 'confirmations': 62932,
'outpoint': COutPoint(lx('6773785b4dc5d2cced67d26fc0820329307a8e10dfaef50d506924984387bf0b'), 1),
'scriptPubKey': CScript(
[OP_DUP, OP_HASH160, x('cc0a909c4c83068be8b45d69b60a6f09c2be0fda'), OP_EQUALVERIFY,
OP_CHECKSIG])}
output3 = {'address': P2PKHBitcoinAddress('mz7poFND7hVGRtPWjiZizcCnjf6wEDWjjT'), 'amount': 2750, 'account': '',
'spendable': False, 'solvable': False, 'confirmations': 62932,
'outpoint': COutPoint(lx('6773785b4dc5d2cced67d26fc0820329307a8e10dfaef50d506924984387bf0b'), 5),
'scriptPubKey': CScript(
[OP_DUP, OP_HASH160, x('cc0a909c4c83068be8b45d69b60a6f09c2be0fda'), OP_EQUALVERIFY,
OP_CHECKSIG])}
unspent_outputs = [output1, output2, output3]
return unspent_outputs
def mock_init(self,
service_url=None,
service_port=None,
btc_conf_file=None,
timeout=0,
**kwargs):
pass
def mock_del(self):
pass
def mock_broadcast(self, transaction):
return lx('b59bef6934d043ec2b6c3be7e853b3492e9f493b3559b3bd69864283c122b257')
@patch('bitcoin.rpc.Proxy.__init__', mock_init)
@patch('bitcoin.rpc.Proxy.__del__', mock_del)
@patch('bitcoin.rpc.Proxy.listunspent', mock_listunspent)
class TestConnectors(unittest.TestCase):
def test_bitcoind_connector_spendables(self):
bitcoin.SelectParams('testnet')
bc = BitcoindConnector('XTN')
spendables = bc.spendables_for_address('mz7poFND7hVGRtPWjiZizcCnjf6wEDWjjT')
        self.assertEqual(len(spendables), 3)
        self.assertEqual(hexlify(spendables[0].tx_hash),
                         '08f6528ac70c828e1633babc8f0d49ecb11649fd7451f76923821a0dbc81eb34')
        self.assertEqual(spendables[0].coin_value, 49000000)
        self.assertEqual(spendables[1].coin_value, 2750)
        self.assertEqual(spendables[2].coin_value, 2750)
# TODO: this test isn't calling the bitcoin RPC proxy because of the changed configuration. This will most likely
# need to be different in the open source. Fix this test and connectors.
#def test_get_balance(self):
# bitcoin.SelectParams('testnet')
# connector = ServiceProviderConnector('XTN', 'na')
# balance = connector.get_balance('mz7poFND7hVGRtPWjiZizcCnjf6wEDWjjT')
# self.assertEquals(balance, 49005500)
if __name__ == '__main__':
unittest.main()
| 70.109756 | 1,399 | 0.813359 | 324 | 5,749 | 14.243827 | 0.391975 | 0.020802 | 0.013001 | 0.019935 | 0.151896 | 0.151896 | 0.151896 | 0.151896 | 0.151896 | 0.151896 | 0 | 0.372596 | 0.122804 | 5,749 | 81 | 1,400 | 70.975309 | 0.542534 | 0.071317 | 0 | 0.172414 | 0 | 0 | 0.580004 | 0.534234 | 0 | 1 | 0 | 0.012346 | 0.086207 | 1 | 0.086207 | false | 0.034483 | 0.137931 | 0.017241 | 0.275862 | 0.017241 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
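The tests above replace `bitcoin.rpc.Proxy` methods by stacking `@patch` decorators on the test class, so every test method sees the mocked RPC calls. The same pattern can be reproduced with only the standard library; the `Proxy` class here is a stand-in, not the real bitcoin RPC proxy:

```python
import unittest
from unittest.mock import patch

class Proxy:
    def listunspent(self):
        raise RuntimeError("would hit a real bitcoind node")

def mock_listunspent(self):
    return [{"amount": 2750}, {"amount": 49000000}]

# applied at class level, patch decorates every method whose name starts with "test"
@patch.object(Proxy, "listunspent", mock_listunspent)
class TestProxy(unittest.TestCase):
    def test_listunspent(self):
        outputs = Proxy().listunspent()
        self.assertEqual(len(outputs), 2)
        self.assertEqual(outputs[1]["amount"], 49000000)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestProxy)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Because an explicit `new` object is passed to `patch.object`, the test methods keep their original signatures instead of receiving a mock argument.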
6032c9f197915a001f2d95bc4649d9be19f7f99a | 26 | py | Python | getpwd/__init__.py | rafpyprog/pyssword | 2fe139ecc2bd3d145c1d90fb37086d540904c270 | [
"MIT"
] | 2 | 2017-03-16T18:24:25.000Z | 2021-01-06T14:32:36.000Z | getpwd/__init__.py | rafpyprog/pyssword | 2fe139ecc2bd3d145c1d90fb37086d540904c270 | [
"MIT"
] | null | null | null | getpwd/__init__.py | rafpyprog/pyssword | 2fe139ecc2bd3d145c1d90fb37086d540904c270 | [
"MIT"
] | null | null | null | from .getpwd import getpwd | 26 | 26 | 0.846154 | 4 | 26 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
605b0e3bbaedaf91cb113d521f948309dd227a20 | 11,203 | py | Python | copyrightDetection/vectorize.py | DilaraG831/CopyrightDetection | 25f5e008a7e0b8b4ac093c6f8d05d6a88c858767 | [
"CNRI-Python"
] | null | null | null | copyrightDetection/vectorize.py | DilaraG831/CopyrightDetection | 25f5e008a7e0b8b4ac093c6f8d05d6a88c858767 | [
"CNRI-Python"
] | 7 | 2022-03-17T07:11:08.000Z | 2022-03-28T10:19:58.000Z | copyrightDetection/vectorize.py | goekDil96/CopyrightDetection | 1815e8046ba13261f52c6ce1ddf215388a4e73b4 | [
"CNRI-Python"
] | null | null | null | from copyrightDetection.preprocessing import PreprocessData
import pandas as pd
import json
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from copyrightDetection.extractData import getCorpusOneFile
pathToFile = "C:/Users/dilGoe/Desktop/CopyrightDetection/dataClean/dataCLEANsmall.json"
class VectorizeData(PreprocessData):
    stopwords = ["copy", "tar", "fileutils", "file", "value", "byte", "sizeof", "systemd", "mediatracker",
                 "note", "basicspinner", "argtypemessages", "override", "messagetracker", "except",
                 "standardstate", "str", "msgstr", "string", "integer", "arrann", "arrprimitive", "arrstr",
                 "stringtokenizer", "vector", "attributes", "class", "exec", "example", "txt", "endif", "if", "while",
                 "nativeconstructoraccessorimpl", "delegatingconstructoraccessorimpl", "by"]
def __init__(self, filepath, wordClassification=None, replaceCopy=True, replaceNum=True, replacePunct=True, replaceShort=True, useNER=False, ngram_range=(1,1), min_df=0.0, max_df=1.0):
PreprocessData.__init__(self, filepath, wordClassification=wordClassification, replaceCopy=replaceCopy, replaceNum=replaceNum, replacePunct=replacePunct, replaceShort=replaceShort, useNER=useNER)
self.ngram_range = ngram_range
self.min_df = min_df
self.max_df = max_df
# Binary Term Frequency
def vecBTF(self) -> pd.DataFrame:
        # close the file handle after loading instead of leaking it
        with open(self.filepath) as fObj:
            data = json.load(fObj)
# prepare dict
dataClean = {"isCopyrightStatement": []}
copyrightStatements = []
corpus = []
for copySt in data:
dataClean["isCopyrightStatement"].append(abs(1-data[copySt]))
copyrightStatement, dataCleanOneString = self.preprocess(copyrightStatement=copySt)
for key in dataCleanOneString.keys():
if dataClean.get(key):
dataClean[key].append(dataCleanOneString[key][0])
else:
dataClean[key] = dataCleanOneString[key]
corpus.append(copyrightStatement)
copyrightStatements.append(copySt)
CountVec = CountVectorizer(binary=True,
stop_words=self.stopwords,
ngram_range=self.ngram_range,
min_df=self.min_df,
max_df=self.max_df,
tokenizer=self.wordClassification
)
#transform
X = CountVec.fit_transform(corpus)
# create dataframe
df = pd.DataFrame(X.toarray(), columns=CountVec.get_feature_names_out(), index=copyrightStatements)
for key in dataClean:
df[key] = dataClean[key]
self.vocabularyBTF = CountVec.get_feature_names_out()
return df
def vecBTFPrediction(self, filepath: str) -> pd.DataFrame:
        with open(filepath) as fObj:
            data = json.load(fObj)
# prepare dict
dataClean = {"isCopyrightStatement": []}
copyrightStatements = []
corpus = []
for copySt in data:
dataClean["isCopyrightStatement"].append(abs(1-data[copySt]))
copyrightStatement, dataCleanOneString = self.preprocess(copyrightStatement=copySt)
for key in dataCleanOneString.keys():
if dataClean.get(key):
dataClean[key].append(dataCleanOneString[key][0])
else:
dataClean[key] = dataCleanOneString[key]
corpus.append(copyrightStatement)
copyrightStatements.append(copySt)
CountVec = CountVectorizer(binary=True,
ngram_range=self.ngram_range,
vocabulary=self.vocabularyBTF,
tokenizer=self.wordClassification
)
#transform
X = CountVec.fit_transform(corpus)
# create dataframe
df = pd.DataFrame(X.toarray(), columns=CountVec.get_feature_names_out(), index=copyrightStatements)
for key in dataClean:
df[key] = dataClean[key]
return df
# Bag of words
def vecBOW(self) -> pd.DataFrame:
        with open(self.filepath) as fObj:
            data = json.load(fObj)
# prepare dict
dataClean = {"isCopyrightStatement": []}
copyrightStatements = []
corpus = []
for copySt in data:
dataClean["isCopyrightStatement"].append(abs(1-data[copySt]))
copyrightStatement, dataCleanOneString = self.preprocess(copyrightStatement=copySt)
for key in dataCleanOneString.keys():
if dataClean.get(key):
dataClean[key].append(dataCleanOneString[key][0])
else:
dataClean[key] = dataCleanOneString[key]
corpus.append(copyrightStatement)
copyrightStatements.append(copySt)
CountVec = CountVectorizer(binary=False,
stop_words=self.stopwords,
ngram_range=self.ngram_range,
min_df=self.min_df,
max_df=self.max_df,
tokenizer=self.wordClassification
)
#transform
X = CountVec.fit_transform(corpus)
# create dataframe
df = pd.DataFrame(X.toarray(), columns=CountVec.get_feature_names_out(), index=copyrightStatements)
for key in dataClean:
df[key] = dataClean[key]
self.vocabularyBOW = CountVec.get_feature_names_out()
return df
def vecBOWPrediction(self, filepath: str) -> pd.DataFrame:
        with open(filepath) as fObj:
            data = json.load(fObj)
# prepare dict
dataClean = {"isCopyrightStatement": []}
copyrightStatements = []
corpus = []
for copySt in data:
dataClean["isCopyrightStatement"].append(abs(1-data[copySt]))
copyrightStatement, dataCleanOneString = self.preprocess(copyrightStatement=copySt)
for key in dataCleanOneString.keys():
if dataClean.get(key):
dataClean[key].append(dataCleanOneString[key][0])
else:
dataClean[key] = dataCleanOneString[key]
corpus.append(copyrightStatement)
copyrightStatements.append(copySt)
CountVec = CountVectorizer(binary=False,
ngram_range=self.ngram_range,
vocabulary=self.vocabularyBOW,
tokenizer=self.wordClassification
)
#transform
X = CountVec.fit_transform(corpus)
# create dataframe
df = pd.DataFrame(X.toarray(), columns=CountVec.get_feature_names_out(), index=copyrightStatements)
for key in dataClean:
df[key] = dataClean[key]
return df
# Term Frequency - Inverse Document Frequency
def vecTfIdf(self) -> pd.DataFrame:
        with open(self.filepath) as fObj:
            data = json.load(fObj)
# prepare dict
dataClean = {"isCopyrightStatement": []}
copyrightStatements = []
corpus = []
for copySt in data:
dataClean["isCopyrightStatement"].append(abs(1-data[copySt]))
copyrightStatement, dataCleanOneString = self.preprocess(copyrightStatement=copySt)
for key in dataCleanOneString.keys():
if dataClean.get(key):
dataClean[key].append(dataCleanOneString[key][0])
else:
dataClean[key] = dataCleanOneString[key]
corpus.append(copyrightStatement)
copyrightStatements.append(copySt)
CountVec = TfidfVectorizer(ngram_range=self.ngram_range,
stop_words=self.stopwords,
min_df=self.min_df,
max_df=self.max_df,
tokenizer=self.wordClassification
)
#transform
X = CountVec.fit_transform(corpus)
# create dataframe
df = pd.DataFrame(X.toarray(), columns=CountVec.get_feature_names_out(), index=copyrightStatements)
for key in dataClean:
df[key] = dataClean[key]
self.vocabulary = CountVec.get_feature_names_out()
return df
def vecTfIdfPrediction(self, filepath: str) -> pd.DataFrame:
        with open(filepath) as fObj:
            data = json.load(fObj)
# prepare dict
dataClean = {"isCopyrightStatement": []}
copyrightStatements = []
corpus = []
for copySt in data:
dataClean["isCopyrightStatement"].append(abs(1-data[copySt]))
copyrightStatement, dataCleanOneString = self.preprocess(copyrightStatement=copySt)
for key in dataCleanOneString.keys():
if dataClean.get(key):
dataClean[key].append(dataCleanOneString[key][0])
else:
dataClean[key] = dataCleanOneString[key]
corpus.append(copyrightStatement)
copyrightStatements.append(copySt)
CountVec = TfidfVectorizer(ngram_range=self.ngram_range,
vocabulary=self.vocabulary,
tokenizer=self.wordClassification
)
#transform
X = CountVec.fit_transform(corpus)
# create dataframe
df = pd.DataFrame(X.toarray(), columns=CountVec.get_feature_names_out(), index=copyrightStatements)
for key in dataClean:
df[key] = dataClean[key]
return df
def vecTfIdfPredictionOnFile(self, filepath: str) -> pd.DataFrame:
data = getCorpusOneFile(filepath=filepath)
# prepare dict
dataClean = {}
copyrightStatements = []
corpus = []
for copySt in data:
copyrightStatement, dataCleanOneString = self.preprocess(copyrightStatement=copySt)
for key in dataCleanOneString.keys():
if dataClean.get(key):
dataClean[key].append(dataCleanOneString[key][0])
else:
dataClean[key] = dataCleanOneString[key]
corpus.append(copyrightStatement)
copyrightStatements.append(copySt)
CountVec = TfidfVectorizer(ngram_range=self.ngram_range,
vocabulary=self.vocabulary,
tokenizer=self.wordClassification
)
#transform
X = CountVec.fit_transform(corpus)
# create dataframe
df = pd.DataFrame(X.toarray(), columns=CountVec.get_feature_names_out(), index=copyrightStatements)
for key in dataClean:
df[key] = dataClean[key]
return df
| 39.586572 | 203 | 0.573953 | 940 | 11,203 | 6.751064 | 0.143617 | 0.03971 | 0.017649 | 0.036243 | 0.798613 | 0.794516 | 0.788213 | 0.776237 | 0.758746 | 0.758746 | 0 | 0.002554 | 0.336071 | 11,203 | 282 | 204 | 39.72695 | 0.850632 | 0.031331 | 0 | 0.812808 | 0 | 0 | 0.056895 | 0.012376 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039409 | false | 0 | 0.024631 | 0 | 0.108374 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
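The three training vectorizers above differ only in how token counts are weighted: binary presence (`vecBTF`), raw counts (`vecBOW`), and tf-idf (`vecTfIdf`). A dependency-free sketch of the three weightings on a toy corpus; this is an illustration of the idea, not the sklearn implementation the class delegates to (the idf smoothing shown is one common variant):

```python
import math
from collections import Counter

corpus = [["copyright", "acme", "copyright"], ["hello", "world"]]
vocab = sorted({w for doc in corpus for w in doc})
N = len(corpus)
# document frequency: in how many documents each term appears
df = {t: sum(t in doc for doc in corpus) for t in vocab}

def btf(doc):    # binary term frequency: 1 if the term occurs at all
    return [int(t in doc) for t in vocab]

def bow(doc):    # bag of words: raw occurrence counts
    c = Counter(doc)
    return [c[t] for t in vocab]

def tfidf(doc):  # counts scaled by smoothed inverse document frequency
    c = Counter(doc)
    return [c[t] * (math.log((1 + N) / (1 + df[t])) + 1) for t in vocab]

print(vocab)           # ['acme', 'copyright', 'hello', 'world']
print(btf(corpus[0]))  # [1, 1, 0, 0]
print(bow(corpus[0]))  # [1, 2, 0, 0]
```

Terms unique to one document get a larger idf weight than terms shared across the corpus, which is why tf-idf tends to down-weight words like "copyright" that appear in nearly every statement.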
6090a1faf704b0be5f89ced2d3b159b5304552a8 | 49 | py | Python | pythonicqt/models/__init__.py | Digirolamo/pythonicqt | 5744f907f01767fdbcb2da7cc264ecaca9040bad | [
"MIT"
] | 1 | 2015-08-17T03:06:53.000Z | 2015-08-17T03:06:53.000Z | pythonicqt/models/__init__.py | Digirolamo/pythonicqt | 5744f907f01767fdbcb2da7cc264ecaca9040bad | [
"MIT"
] | null | null | null | pythonicqt/models/__init__.py | Digirolamo/pythonicqt | 5744f907f01767fdbcb2da7cc264ecaca9040bad | [
"MIT"
] | null | null | null | from pythonicqt.models.listmodel import ListModel | 49 | 49 | 0.897959 | 6 | 49 | 7.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061224 | 49 | 1 | 49 | 49 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
60a083332380044c190762a693798432bc0980ca | 273 | py | Python | proto-inter/templates/Python/_create_package.py | glad4enkonm/micron | d935dab7e20c65eb4cc0bb9f5d2696a814052e7d | [
"MIT"
] | null | null | null | proto-inter/templates/Python/_create_package.py | glad4enkonm/micron | d935dab7e20c65eb4cc0bb9f5d2696a814052e7d | [
"MIT"
] | 3 | 2020-03-31T10:25:21.000Z | 2021-06-02T00:46:37.000Z | proto-inter/templates/Python/_create_package.py | glad4enkonm/micron | d935dab7e20c65eb4cc0bb9f5d2696a814052e7d | [
"MIT"
] | null | null | null | import os
# Create gRPC classes
os.system('python -m grpc_tools.protoc --proto_path=../ ../<%= package %>.proto --python_out=./broadcast-<%= package %> --grpc_python_out=./broadcast-<%= package %>')
# Create a python package
os.system('python setup.py sdist bdist_wheel') | 39 | 166 | 0.703297 | 37 | 273 | 5.027027 | 0.567568 | 0.086022 | 0.150538 | 0.268817 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106227 | 273 | 7 | 167 | 39 | 0.762295 | 0.157509 | 0 | 0 | 0 | 0.333333 | 0.815789 | 0.267544 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
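`os.system` above only returns an exit status and silently continues on failure. A sketch of the same two build steps using `subprocess.run`, which raises `CalledProcessError` on a non-zero exit; the command strings would be the template's own, and the demo command below is a harmless stand-in since `protoc` is not available here:

```python
import subprocess
import sys

def run(args):
    # check=True raises CalledProcessError instead of returning a status code
    return subprocess.run(args, check=True)

# e.g. run([sys.executable, "-m", "grpc_tools.protoc", ...]) for the protoc step;
# a stand-in command demonstrates the helper
result = run([sys.executable, "-c", "print('ok')"])
print(result.returncode)  # 0
```

Passing the command as a list also avoids shell quoting issues with the `<%= package %>` substitutions.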
60a3d5a8126e704e23a6e9887ba6af5954ffcc4e | 15,787 | py | Python | tests/components/unifiprotect/test_binary_sensor.py | MrDelik/core | 93a66cc357b226389967668441000498a10453bb | [
"Apache-2.0"
] | 3 | 2021-11-22T22:37:43.000Z | 2022-03-17T00:55:28.000Z | tests/components/unifiprotect/test_binary_sensor.py | MrDelik/core | 93a66cc357b226389967668441000498a10453bb | [
"Apache-2.0"
] | 1,016 | 2019-06-18T21:27:47.000Z | 2020-03-06T11:09:58.000Z | tests/components/unifiprotect/test_binary_sensor.py | MrDelik/core | 93a66cc357b226389967668441000498a10453bb | [
"Apache-2.0"
] | 3 | 2022-01-02T18:49:54.000Z | 2022-01-25T02:03:54.000Z | """Test the UniFi Protect binary_sensor platform."""
# pylint: disable=protected-access
from __future__ import annotations
from copy import copy
from datetime import datetime, timedelta
from unittest.mock import Mock
import pytest
from pyunifiprotect.data import Camera, Event, EventType, Light, MountType, Sensor
from pyunifiprotect.data.nvr import EventMetadata
from homeassistant.components.binary_sensor import BinarySensorDeviceClass
from homeassistant.components.unifiprotect.binary_sensor import (
CAMERA_SENSORS,
LIGHT_SENSORS,
MOTION_SENSORS,
SENSE_SENSORS,
)
from homeassistant.components.unifiprotect.const import (
ATTR_EVENT_SCORE,
DEFAULT_ATTRIBUTION,
)
from homeassistant.const import (
ATTR_ATTRIBUTION,
ATTR_DEVICE_CLASS,
STATE_OFF,
STATE_ON,
STATE_UNAVAILABLE,
Platform,
)
from homeassistant.core import HomeAssistant
from homeassistant.helpers import entity_registry as er
from .conftest import (
MockEntityFixture,
assert_entity_counts,
ids_from_device_description,
)
@pytest.fixture(name="camera")
async def camera_fixture(
hass: HomeAssistant,
mock_entry: MockEntityFixture,
mock_camera: Camera,
now: datetime,
):
"""Fixture for a single camera for testing the binary_sensor platform."""
# disable pydantic validation so mocking can happen
Camera.__config__.validate_assignment = False
camera_obj = mock_camera.copy(deep=True)
camera_obj._api = mock_entry.api
camera_obj.channels[0]._api = mock_entry.api
camera_obj.channels[1]._api = mock_entry.api
camera_obj.channels[2]._api = mock_entry.api
camera_obj.name = "Test Camera"
camera_obj.feature_flags.has_chime = True
camera_obj.last_ring = now - timedelta(hours=1)
camera_obj.is_dark = False
camera_obj.is_motion_detected = False
mock_entry.api.bootstrap.reset_objects()
mock_entry.api.bootstrap.nvr.system_info.storage.devices = []
mock_entry.api.bootstrap.cameras = {
camera_obj.id: camera_obj,
}
await hass.config_entries.async_setup(mock_entry.entry.entry_id)
await hass.async_block_till_done()
assert_entity_counts(hass, Platform.BINARY_SENSOR, 3, 3)
yield camera_obj
Camera.__config__.validate_assignment = True
@pytest.fixture(name="light")
async def light_fixture(
hass: HomeAssistant, mock_entry: MockEntityFixture, mock_light: Light, now: datetime
):
"""Fixture for a single light for testing the binary_sensor platform."""
# disable pydantic validation so mocking can happen
Light.__config__.validate_assignment = False
light_obj = mock_light.copy(deep=True)
light_obj._api = mock_entry.api
light_obj.name = "Test Light"
light_obj.is_dark = False
light_obj.is_pir_motion_detected = False
light_obj.last_motion = now - timedelta(hours=1)
mock_entry.api.bootstrap.reset_objects()
mock_entry.api.bootstrap.nvr.system_info.storage.devices = []
mock_entry.api.bootstrap.lights = {
light_obj.id: light_obj,
}
await hass.config_entries.async_setup(mock_entry.entry.entry_id)
await hass.async_block_till_done()
assert_entity_counts(hass, Platform.BINARY_SENSOR, 2, 2)
yield light_obj
Light.__config__.validate_assignment = True
@pytest.fixture(name="camera_none")
async def camera_none_fixture(
hass: HomeAssistant, mock_entry: MockEntityFixture, mock_camera: Camera
):
"""Fixture for a single camera for testing the binary_sensor platform."""
# disable pydantic validation so mocking can happen
Camera.__config__.validate_assignment = False
camera_obj = mock_camera.copy(deep=True)
camera_obj._api = mock_entry.api
camera_obj.channels[0]._api = mock_entry.api
camera_obj.channels[1]._api = mock_entry.api
camera_obj.channels[2]._api = mock_entry.api
camera_obj.name = "Test Camera"
camera_obj.feature_flags.has_chime = False
camera_obj.is_dark = False
camera_obj.is_motion_detected = False
mock_entry.api.bootstrap.reset_objects()
mock_entry.api.bootstrap.nvr.system_info.storage.devices = []
mock_entry.api.bootstrap.cameras = {
camera_obj.id: camera_obj,
}
await hass.config_entries.async_setup(mock_entry.entry.entry_id)
await hass.async_block_till_done()
assert_entity_counts(hass, Platform.BINARY_SENSOR, 2, 2)
yield camera_obj
Camera.__config__.validate_assignment = True
@pytest.fixture(name="sensor")
async def sensor_fixture(
hass: HomeAssistant,
mock_entry: MockEntityFixture,
mock_sensor: Sensor,
now: datetime,
):
"""Fixture for a single sensor for testing the binary_sensor platform."""
# disable pydantic validation so mocking can happen
Sensor.__config__.validate_assignment = False
sensor_obj = mock_sensor.copy(deep=True)
sensor_obj._api = mock_entry.api
sensor_obj.name = "Test Sensor"
sensor_obj.mount_type = MountType.DOOR
sensor_obj.is_opened = False
sensor_obj.battery_status.is_low = False
sensor_obj.is_motion_detected = False
sensor_obj.alarm_settings.is_enabled = True
sensor_obj.motion_detected_at = now - timedelta(hours=1)
sensor_obj.open_status_changed_at = now - timedelta(hours=1)
sensor_obj.alarm_triggered_at = now - timedelta(hours=1)
sensor_obj.tampering_detected_at = now - timedelta(hours=1)
mock_entry.api.bootstrap.reset_objects()
mock_entry.api.bootstrap.nvr.system_info.storage.devices = []
mock_entry.api.bootstrap.sensors = {
sensor_obj.id: sensor_obj,
}
await hass.config_entries.async_setup(mock_entry.entry.entry_id)
await hass.async_block_till_done()
assert_entity_counts(hass, Platform.BINARY_SENSOR, 4, 4)
yield sensor_obj
Sensor.__config__.validate_assignment = True
@pytest.fixture(name="sensor_none")
async def sensor_none_fixture(
hass: HomeAssistant,
mock_entry: MockEntityFixture,
mock_sensor: Sensor,
now: datetime,
):
"""Fixture for a single sensor for testing the binary_sensor platform."""
# disable pydantic validation so mocking can happen
Sensor.__config__.validate_assignment = False
sensor_obj = mock_sensor.copy(deep=True)
sensor_obj._api = mock_entry.api
sensor_obj.name = "Test Sensor"
sensor_obj.mount_type = MountType.LEAK
sensor_obj.battery_status.is_low = False
sensor_obj.alarm_settings.is_enabled = False
sensor_obj.tampering_detected_at = now - timedelta(hours=1)
mock_entry.api.bootstrap.reset_objects()
mock_entry.api.bootstrap.nvr.system_info.storage.devices = []
mock_entry.api.bootstrap.sensors = {
sensor_obj.id: sensor_obj,
}
await hass.config_entries.async_setup(mock_entry.entry.entry_id)
await hass.async_block_till_done()
assert_entity_counts(hass, Platform.BINARY_SENSOR, 4, 4)
yield sensor_obj
Sensor.__config__.validate_assignment = True
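Each fixture above disables pydantic assignment validation before building the mock object and re-enables it after the `yield`; if a test body raises, the flag can be left off for later tests. A stdlib sketch of an exception-safe variant of that guard (the `FakeModelConfig` class is hypothetical, standing in for a model's `__config__`):

```python
from contextlib import contextmanager

class FakeModelConfig:
    validate_assignment = True

@contextmanager
def validation_disabled(config):
    # turn validation off for the duration of the block,
    # restoring it even if the body raises
    config.validate_assignment = False
    try:
        yield config
    finally:
        config.validate_assignment = True

try:
    with validation_disabled(FakeModelConfig):
        assert FakeModelConfig.validate_assignment is False
        raise RuntimeError("test body failed")
except RuntimeError:
    pass
print(FakeModelConfig.validate_assignment)  # True
```

The same effect could be had in the fixtures by wrapping the `yield` in `try`/`finally`.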
async def test_binary_sensor_setup_light(
hass: HomeAssistant, light: Light, now: datetime
):
"""Test binary_sensor entity setup for light devices."""
entity_registry = er.async_get(hass)
for description in LIGHT_SENSORS:
unique_id, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, light, description
)
entity = entity_registry.async_get(entity_id)
assert entity
assert entity.unique_id == unique_id
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
async def test_binary_sensor_setup_camera_all(
hass: HomeAssistant, camera: Camera, now: datetime
):
"""Test binary_sensor entity setup for camera devices (all features)."""
entity_registry = er.async_get(hass)
description = CAMERA_SENSORS[0]
unique_id, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, camera, description
)
entity = entity_registry.async_get(entity_id)
assert entity
assert entity.unique_id == unique_id
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
# Is Dark
description = CAMERA_SENSORS[1]
unique_id, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, camera, description
)
entity = entity_registry.async_get(entity_id)
assert entity
assert entity.unique_id == unique_id
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
# Motion
description = MOTION_SENSORS[0]
unique_id, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, camera, description
)
entity = entity_registry.async_get(entity_id)
assert entity
assert entity.unique_id == unique_id
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
assert state.attributes[ATTR_EVENT_SCORE] == 0
async def test_binary_sensor_setup_camera_none(
hass: HomeAssistant,
camera_none: Camera,
):
"""Test binary_sensor entity setup for camera devices (no features)."""
entity_registry = er.async_get(hass)
description = CAMERA_SENSORS[1]
unique_id, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, camera_none, description
)
entity = entity_registry.async_get(entity_id)
assert entity
assert entity.unique_id == unique_id
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
async def test_binary_sensor_setup_sensor(
hass: HomeAssistant, sensor: Sensor, now: datetime
):
"""Test binary_sensor entity setup for sensor devices."""
entity_registry = er.async_get(hass)
for description in SENSE_SENSORS:
unique_id, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, sensor, description
)
entity = entity_registry.async_get(entity_id)
assert entity
assert entity.unique_id == unique_id
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_OFF
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
async def test_binary_sensor_setup_sensor_none(
hass: HomeAssistant, sensor_none: Sensor
):
"""Test binary_sensor entity setup for sensor with most sensors disabled."""
entity_registry = er.async_get(hass)
expected = [
STATE_UNAVAILABLE,
STATE_OFF,
STATE_UNAVAILABLE,
STATE_OFF,
]
for index, description in enumerate(SENSE_SENSORS):
unique_id, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, sensor_none, description
)
entity = entity_registry.async_get(entity_id)
assert entity
assert entity.unique_id == unique_id
state = hass.states.get(entity_id)
assert state
assert state.state == expected[index]
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
async def test_binary_sensor_update_motion(
hass: HomeAssistant, mock_entry: MockEntityFixture, camera: Camera, now: datetime
):
"""Test binary_sensor motion entity."""
_, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, camera, MOTION_SENSORS[0]
)
event = Event(
id="test_event_id",
type=EventType.MOTION,
start=now - timedelta(seconds=1),
end=None,
score=100,
smart_detect_types=[],
smart_detect_event_ids=[],
camera_id=camera.id,
)
new_bootstrap = copy(mock_entry.api.bootstrap)
new_camera = camera.copy()
new_camera.is_motion_detected = True
new_camera.last_motion_event_id = event.id
mock_msg = Mock()
mock_msg.changed_data = {}
mock_msg.new_obj = new_camera
new_bootstrap.cameras = {new_camera.id: new_camera}
new_bootstrap.events = {event.id: event}
mock_entry.api.bootstrap = new_bootstrap
mock_entry.api.ws_subscription(mock_msg)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_ON
assert state.attributes[ATTR_ATTRIBUTION] == DEFAULT_ATTRIBUTION
assert state.attributes[ATTR_EVENT_SCORE] == 100
async def test_binary_sensor_update_light_motion(
hass: HomeAssistant, mock_entry: MockEntityFixture, light: Light, now: datetime
):
"""Test binary_sensor motion entity."""
_, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, light, LIGHT_SENSORS[1]
)
event_metadata = EventMetadata(light_id=light.id)
event = Event(
id="test_event_id",
type=EventType.MOTION_LIGHT,
start=now - timedelta(seconds=1),
end=None,
score=100,
smart_detect_types=[],
smart_detect_event_ids=[],
metadata=event_metadata,
api=mock_entry.api,
)
new_bootstrap = copy(mock_entry.api.bootstrap)
new_light = light.copy()
new_light.is_pir_motion_detected = True
new_light.last_motion_event_id = event.id
mock_msg = Mock()
mock_msg.changed_data = {}
mock_msg.new_obj = event
new_bootstrap.lights = {new_light.id: new_light}
new_bootstrap.events = {event.id: event}
mock_entry.api.bootstrap = new_bootstrap
mock_entry.api.ws_subscription(mock_msg)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state
assert state.state == STATE_ON
async def test_binary_sensor_update_mount_type_window(
hass: HomeAssistant, mock_entry: MockEntityFixture, sensor: Sensor
):
"""Test binary_sensor motion entity."""
_, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, sensor, SENSE_SENSORS[0]
)
state = hass.states.get(entity_id)
assert state
assert state.attributes[ATTR_DEVICE_CLASS] == BinarySensorDeviceClass.DOOR.value
new_bootstrap = copy(mock_entry.api.bootstrap)
new_sensor = sensor.copy()
new_sensor.mount_type = MountType.WINDOW
mock_msg = Mock()
mock_msg.changed_data = {}
mock_msg.new_obj = new_sensor
new_bootstrap.sensors = {new_sensor.id: new_sensor}
mock_entry.api.bootstrap = new_bootstrap
mock_entry.api.ws_subscription(mock_msg)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state
assert state.attributes[ATTR_DEVICE_CLASS] == BinarySensorDeviceClass.WINDOW.value
async def test_binary_sensor_update_mount_type_garage(
hass: HomeAssistant, mock_entry: MockEntityFixture, sensor: Sensor
):
"""Test binary_sensor motion entity."""
_, entity_id = ids_from_device_description(
Platform.BINARY_SENSOR, sensor, SENSE_SENSORS[0]
)
state = hass.states.get(entity_id)
assert state
assert state.attributes[ATTR_DEVICE_CLASS] == BinarySensorDeviceClass.DOOR.value
new_bootstrap = copy(mock_entry.api.bootstrap)
new_sensor = sensor.copy()
new_sensor.mount_type = MountType.GARAGE
mock_msg = Mock()
mock_msg.changed_data = {}
mock_msg.new_obj = new_sensor
new_bootstrap.sensors = {new_sensor.id: new_sensor}
mock_entry.api.bootstrap = new_bootstrap
mock_entry.api.ws_subscription(mock_msg)
await hass.async_block_till_done()
state = hass.states.get(entity_id)
assert state
assert (
state.attributes[ATTR_DEVICE_CLASS] == BinarySensorDeviceClass.GARAGE_DOOR.value
)
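The update tests above all build fake websocket messages the same way: a `Mock` given exactly the two attributes the integration reads. A minimal standalone illustration of that pattern (`make_ws_message` is an illustrative helper, not part of the test suite):

```python
from unittest.mock import Mock

def make_ws_message(new_obj, changed_data=None):
    # Mimic the shape the ws_subscription callback expects:
    # a message exposing .changed_data and .new_obj.
    msg = Mock()
    msg.changed_data = changed_data or {}
    msg.new_obj = new_obj
    return msg

msg = make_ws_message({"id": "cam1", "is_motion_detected": True})
assert msg.changed_data == {}
assert msg.new_obj["is_motion_detected"] is True
```

Because `Mock` auto-creates attributes, only the fields the code under test actually touches need to be set explicitly.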
# ---- src/util/data_input/reader/__init__.py (repo: JasonYuJjyu/unsupFlownet, MIT) ----
from Png import *
from PngFlow import *
# ---- tests/test_common.py (repo: akaterra/udb.py, MIT) ----
# -*- coding: utf-8 -*-
import pytest
from udb_py.common import *
@pytest.mark.common
def test_should_format_positive_int_value():
i = TYPE_FORMAT_MAPPERS[int](1)
assert i == u'\x02\x01\x00\x00\x00\x00\x00\x00\x00\x01'
@pytest.mark.common
def test_should_compare_formatted_positive_int():
i = TYPE_FORMAT_MAPPERS[int](1)
j = TYPE_FORMAT_MAPPERS[int](2)
k = TYPE_FORMAT_MAPPERS[int](0)
assert i < j
assert i > k
@pytest.mark.common
def test_should_format_negative_int_value():
i = TYPE_FORMAT_MAPPERS[int](-1)
assert i == u'\x02\x00\xff\xff\xff\xff\xff\xff\xff\xff'
@pytest.mark.common
def test_should_compare_formatted_negative_int():
i = TYPE_FORMAT_MAPPERS[int](-1)
j = TYPE_FORMAT_MAPPERS[int](-2)
k = TYPE_FORMAT_MAPPERS[int](0)
assert i > j
assert i < k
@pytest.mark.common
def test_should_format_positive_float_value_with_decimal():
i = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](1.9)
assert i == u'\x02\x01\x00\x00\x00\x00\x00\x00\x00\x01\x02\x01\x0c}q;IÙÿ\x80'
@pytest.mark.common
def test_should_compare_formatted_positive_float_value_with_decimal():
i = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](1.9)
j = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](1.91)
k = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](1.89)
l = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[int](1)
m = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[int](2)
assert i < j
assert i > k
assert i > l
assert i < m
@pytest.mark.common
def test_should_format_positive_float_value_without_decimal():
i = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](1)
assert i == u'\x02\x01\x00\x00\x00\x00\x00\x00\x00\x01\x02\x01\x00\x00\x00\x00\x00\x00\x00\x00'
@pytest.mark.common
def test_should_format_negative_float_value_with_decimal():
i = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](-1.9)
assert i == u'\x02\x00ÿÿÿÿÿÿÿÿ\x02\x00ó\x82\x8eĶ&\x00\x80'
@pytest.mark.common
def test_should_compare_formatted_negative_float_value_with_decimal():
i = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](-1.9)
j = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](-1.91)
k = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](-1.89)
l = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[int](-1)
m = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[int](-2)
assert i > j
assert i < k
assert i < l
assert i > m
@pytest.mark.common
def test_should_format_negative_float_value_without_decimal():
i = TYPE_FORMAT_MAPPER_INT_AS_FLOAT[float](-1)
assert i == u'\x02\x00\xff\xff\xff\xff\xff\xff\xff\xff\x02\x01\x00\x00\x00\x00\x00\x00\x00\x00'
@pytest.mark.common
def test_should_format_true_value():
i = TYPE_FORMAT_MAPPERS[bool](True)
assert i == u'\x01\x01'
@pytest.mark.common
def test_should_format_false_value():
i = TYPE_FORMAT_MAPPERS[bool](False)
assert i == u'\x01\x00'
@pytest.mark.common
def test_should_format_none_value():
i = TYPE_FORMAT_MAPPERS[type(None)](None)
assert i == u'\x00'
@pytest.mark.common
def test_should_format_string_value():
i = TYPE_FORMAT_MAPPERS[str]('a')
assert i == u'\x03a'
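The ordering assertions above rely on a general trick: a fixed-width big-endian encoding with a sign/offset adjustment compares lexicographically in the same order as the numbers themselves. A standalone sketch of the idea (`int_key` is illustrative, not udb_py's actual mapper):

```python
import struct

def int_key(n: int) -> bytes:
    # Offset-binary: shift the signed 64-bit range onto unsigned values,
    # then pack big-endian so byte-wise comparison matches numeric order.
    return struct.pack(">Q", n + 2**63)

assert int_key(-2) < int_key(-1) < int_key(0) < int_key(1) < int_key(2)
```

This is why the tests can compare encoded values directly with `<` and `>` instead of decoding them first.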
# ---- ivy_mech_tests/test_position/test_coordinate_conversions.py (repo: unifyai/mech, Apache-2.0) ----
"""
Collection of tests for co-ordinate conversion functions
"""
# global
import ivy_mech
import numpy as np
# local
from ivy_mech_tests.test_position.position_data import PositionTestData
ptd = PositionTestData()
def test_polar_to_cartesian_coords(dev_str, call):
assert np.allclose(call(ivy_mech.polar_to_cartesian_coords, ptd.polar_coords), ptd.cartesian_coords, atol=1e-6)
assert np.allclose(call(ivy_mech.polar_to_cartesian_coords, ptd.batched_polar_coords)[0],
ptd.cartesian_coords, atol=1e-6)
def test_cartesian_to_polar_coords(dev_str, call):
assert np.allclose(call(ivy_mech.cartesian_to_polar_coords, ptd.cartesian_coords), ptd.polar_coords, atol=1e-6)
assert np.allclose(call(ivy_mech.cartesian_to_polar_coords, ptd.batched_cartesian_coords)[0],
ptd.polar_coords, atol=1e-6)
def test_cartesian_to_polar_coords_and_back(dev_str, call):
assert np.allclose(call(ivy_mech.polar_to_cartesian_coords,
call(ivy_mech.cartesian_to_polar_coords, ptd.cartesian_coords)),
ptd.cartesian_coords, atol=1e-6)
assert np.allclose(call(ivy_mech.polar_to_cartesian_coords,
call(ivy_mech.cartesian_to_polar_coords, ptd.batched_cartesian_coords))[0],
ptd.cartesian_coords, atol=1e-6)
def test_polar_to_cartesian_coords_and_back(dev_str, call):
assert np.allclose(call(ivy_mech.cartesian_to_polar_coords,
call(ivy_mech.polar_to_cartesian_coords, ptd.polar_coords)),
ptd.polar_coords, atol=1e-6)
assert np.allclose(call(ivy_mech.cartesian_to_polar_coords,
call(ivy_mech.polar_to_cartesian_coords, ptd.batched_polar_coords))[0],
ptd.polar_coords, atol=1e-6)
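The round-trip property these tests assert can be reproduced with plain NumPy. The `(r, theta)` argument order below is an assumption for illustration; ivy_mech's own coordinate convention may differ:

```python
import numpy as np

def polar_to_cartesian(r, theta):
    return r * np.cos(theta), r * np.sin(theta)

def cartesian_to_polar(x, y):
    # hypot/arctan2 are the numerically robust inverses.
    return np.hypot(x, y), np.arctan2(y, x)

x, y = polar_to_cartesian(2.0, np.pi / 3)
r, theta = cartesian_to_polar(x, y)
assert np.allclose([r, theta], [2.0, np.pi / 3], atol=1e-6)
```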
# ---- python/bitcoin_101/create_your_own_network.py (repo: fcracker79/python_bitcoin_101, MIT) ----
from pprint import pprint
from bitcoin_101 import bitprint
from bitcoin_101.bitcoin_client import BitcoinClient
# make start
if __name__ == '__main__':
bitprint(BitcoinClient().getinfo())
# show keys_regtest.py
# ---- pynitrokey/nethsm/client/api/default_api.py (repo: fayrlight/pynitrokey, Apache-2.0/MIT) ----
"""
NetHSM
All endpoints expect exactly the specified JSON. Additional properties will cause a Bad Request Error (400). All HTTP errors contain a JSON structure with an explanation of type string. All <a href=\"https://tools.ietf.org/html/rfc4648#section-4\">base64</a> encoded values are Big Endian. # noqa: E501
The version of the OpenAPI document: v1
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from pynitrokey.nethsm.client.api_client import ApiClient, Endpoint as _Endpoint
from pynitrokey.nethsm.client.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from pynitrokey.nethsm.client.model.backup_passphrase_config import BackupPassphraseConfig
from pynitrokey.nethsm.client.model.decrypt_data import DecryptData
from pynitrokey.nethsm.client.model.decrypt_request_data import DecryptRequestData
from pynitrokey.nethsm.client.model.distinguished_name import DistinguishedName
from pynitrokey.nethsm.client.model.health_state_data import HealthStateData
from pynitrokey.nethsm.client.model.info_data import InfoData
from pynitrokey.nethsm.client.model.key_generate_request_data import KeyGenerateRequestData
from pynitrokey.nethsm.client.model.key_item import KeyItem
from pynitrokey.nethsm.client.model.key_list import KeyList
from pynitrokey.nethsm.client.model.key_mechanism import KeyMechanism
from pynitrokey.nethsm.client.model.logging_config import LoggingConfig
from pynitrokey.nethsm.client.model.network_config import NetworkConfig
from pynitrokey.nethsm.client.model.private_key import PrivateKey
from pynitrokey.nethsm.client.model.provision_request_data import ProvisionRequestData
from pynitrokey.nethsm.client.model.public_key import PublicKey
from pynitrokey.nethsm.client.model.random_data import RandomData
from pynitrokey.nethsm.client.model.random_request_data import RandomRequestData
from pynitrokey.nethsm.client.model.sign_data import SignData
from pynitrokey.nethsm.client.model.sign_request_data import SignRequestData
from pynitrokey.nethsm.client.model.system_info import SystemInfo
from pynitrokey.nethsm.client.model.system_update_data import SystemUpdateData
from pynitrokey.nethsm.client.model.time_config import TimeConfig
from pynitrokey.nethsm.client.model.unattended_boot_config import UnattendedBootConfig
from pynitrokey.nethsm.client.model.unlock_passphrase_config import UnlockPassphraseConfig
from pynitrokey.nethsm.client.model.unlock_request_data import UnlockRequestData
from pynitrokey.nethsm.client.model.user_data import UserData
from pynitrokey.nethsm.client.model.user_list import UserList
from pynitrokey.nethsm.client.model.user_passphrase_post_data import UserPassphrasePostData
from pynitrokey.nethsm.client.model.user_post_data import UserPostData
class DefaultApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def __config_backup_passphrase_put(
self,
**kwargs
):
"""config_backup_passphrase_put # noqa: E501
Update the backup passphrase. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_backup_passphrase_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (BackupPassphraseConfig): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_backup_passphrase_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/config/backup-passphrase',
'operation_id': 'config_backup_passphrase_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(BackupPassphraseConfig,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_backup_passphrase_put
)
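Every generated wrapper in this class normalizes its keyword arguments identically before dispatching to `call_with_http_info`. Isolated from the client, the pattern is plain dict defaulting; `apply_call_defaults` below is an illustrative sketch, not part of the generated code:

```python
def apply_call_defaults(**kwargs):
    # Same defaults the generated methods fill in one by one.
    defaults = {
        'async_req': False,
        '_return_http_data_only': True,
        '_preload_content': True,
        '_request_timeout': None,
        '_check_input_type': True,
        '_check_return_type': True,
    }
    for key, value in defaults.items():
        kwargs[key] = kwargs.get(key, value)
    # _host_index defaults to None unless the caller pins a server.
    kwargs['_host_index'] = kwargs.get('_host_index')
    return kwargs

out = apply_call_defaults(body={'passphrase': 'x'}, async_req=True)
assert out['async_req'] is True
assert out['_preload_content'] is True
```

Caller-supplied values always win; only missing keys receive the defaults.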
def __config_logging_get(
self,
**kwargs
):
"""config_logging_get # noqa: E501
Get logging configuration. Protocol is always syslog over UDP. Configurable are IP adress and port, log level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_logging_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
LoggingConfig
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_logging_get = _Endpoint(
settings={
'response_type': (LoggingConfig,),
'auth': [
'basic'
],
'endpoint_path': '/config/logging',
'operation_id': 'config_logging_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__config_logging_get
)
def __config_logging_put(
self,
**kwargs
):
"""config_logging_put # noqa: E501
Configure log level and destination. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_logging_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (LoggingConfig): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_logging_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/config/logging',
'operation_id': 'config_logging_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(LoggingConfig,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_logging_put
)
def __config_network_get(
self,
**kwargs
):
"""config_network_get # noqa: E501
Get network configuration. IP address, netmask, router. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_network_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
NetworkConfig
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_network_get = _Endpoint(
settings={
'response_type': (NetworkConfig,),
'auth': [
'basic'
],
'endpoint_path': '/config/network',
'operation_id': 'config_network_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__config_network_get
)
def __config_network_put(
self,
**kwargs
):
"""config_network_put # noqa: E501
Configure network. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_network_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (NetworkConfig): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_network_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/config/network',
'operation_id': 'config_network_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(NetworkConfig,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_network_put
)
def __config_time_get(
self,
**kwargs
):
"""config_time_get # noqa: E501
Get system time. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_time_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
TimeConfig
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_time_get = _Endpoint(
settings={
'response_type': (TimeConfig,),
'auth': [
'basic'
],
'endpoint_path': '/config/time',
'operation_id': 'config_time_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__config_time_get
)
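Each generated wrapper above repeats the same option-defaulting boilerplate (a chain of `kwargs.get(...)` calls) before delegating to `call_with_http_info`. The pattern can be expressed once as a helper; a minimal sketch, where the helper name `apply_default_call_options` is an illustration and not part of the generated client:

```python
def apply_default_call_options(kwargs):
    """Fill in the standard per-call options, mirroring the
    kwargs.get(...) defaulting done in each generated wrapper."""
    defaults = {
        'async_req': False,
        '_return_http_data_only': True,
        '_preload_content': True,
        '_request_timeout': None,
        '_check_input_type': True,
        '_check_return_type': True,
        '_host_index': None,
    }
    for key, value in defaults.items():
        # setdefault keeps any value the caller passed explicitly
        kwargs.setdefault(key, value)
    return kwargs

# caller-supplied values win; missing options get the documented defaults
opts = apply_default_call_options({'async_req': True})
```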
def __config_time_put(
self,
**kwargs
):
"""config_time_put # noqa: E501
Configure system time. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_time_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (TimeConfig): [optional]
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_time_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/config/time',
'operation_id': 'config_time_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(TimeConfig,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_time_put
)
def __config_tls_cert_pem_get(
self,
**kwargs
):
"""config_tls_cert_pem_get # noqa: E501
Get the certificate for NetHSM's HTTPS API. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_tls_cert_pem_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
str
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_tls_cert_pem_get = _Endpoint(
settings={
'response_type': (str,),
'auth': [
'basic'
],
'endpoint_path': '/config/tls/cert.pem',
'operation_id': 'config_tls_cert_pem_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/x-pem-file'
],
'content_type': [],
},
api_client=api_client,
callable=__config_tls_cert_pem_get
)
def __config_tls_cert_pem_put(
self,
**kwargs
):
"""config_tls_cert_pem_put # noqa: E501
Set the certificate for NetHSM's HTTPS API, e.g. to replace the self-signed initial certificate. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_tls_cert_pem_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (str): [optional]
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_tls_cert_pem_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/config/tls/cert.pem',
'operation_id': 'config_tls_cert_pem_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(str,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_tls_cert_pem_put
)
def __config_tls_csr_pem_put(
self,
**kwargs
):
"""config_tls_csr_pem_put # noqa: E501
Get a NetHSM certificate signing request, e.g. to replace the self-signed initial certificate. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_tls_csr_pem_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (DistinguishedName): [optional]
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
str
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_tls_csr_pem_put = _Endpoint(
settings={
'response_type': (str,),
'auth': [
'basic'
],
'endpoint_path': '/config/tls/csr.pem',
'operation_id': 'config_tls_csr_pem_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(DistinguishedName,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_tls_csr_pem_put
)
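The `_request_timeout` keyword documented above accepts either a single number (total request timeout) or a `(connection, read)` pair. A sketch of how such a value can be normalized into explicit connect/read components; this is illustrative only, since the generated client hands the value straight through to urllib3:

```python
def normalize_request_timeout(request_timeout):
    """Return a (connect, read) tuple from the _request_timeout
    convention: None, a single number, or a 2-tuple."""
    if request_timeout is None:
        return (None, None)  # no timeout configured
    if isinstance(request_timeout, (int, float)):
        # a single number applies to both phases
        return (request_timeout, request_timeout)
    connect, read = request_timeout  # assume a (connection, read) pair
    return (connect, read)
```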
def __config_tls_public_pem_get(
self,
**kwargs
):
"""config_tls_public_pem_get # noqa: E501
Get the public key for NetHSM's HTTPS API. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_tls_public_pem_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
str
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_tls_public_pem_get = _Endpoint(
settings={
'response_type': (str,),
'auth': [
'basic'
],
'endpoint_path': '/config/tls/public.pem',
'operation_id': 'config_tls_public_pem_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/x-pem-file'
],
'content_type': [],
},
api_client=api_client,
callable=__config_tls_public_pem_get
)
def __config_unattended_boot_get(
self,
**kwargs
):
"""config_unattended_boot_get # noqa: E501
Read unattended boot configuration: is it on or off? # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_unattended_boot_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
UnattendedBootConfig
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_unattended_boot_get = _Endpoint(
settings={
'response_type': (UnattendedBootConfig,),
'auth': [
'basic'
],
'endpoint_path': '/config/unattended-boot',
'operation_id': 'config_unattended_boot_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__config_unattended_boot_get
)
def __config_unattended_boot_put(
self,
**kwargs
):
"""config_unattended_boot_put # noqa: E501
Configure unattended boot: switch it on or off (flip the switch). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_unattended_boot_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (UnattendedBootConfig): [optional]
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_unattended_boot_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/config/unattended-boot',
'operation_id': 'config_unattended_boot_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(UnattendedBootConfig,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_unattended_boot_put
)
def __config_unlock_passphrase_put(
self,
**kwargs
):
"""config_unlock_passphrase_put # noqa: E501
Update the unlock passphrase. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.config_unlock_passphrase_put(async_req=True)
>>> result = thread.get()
Keyword Args:
body (UnlockPassphraseConfig): [optional]
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.config_unlock_passphrase_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/config/unlock-passphrase',
'operation_id': 'config_unlock_passphrase_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(UnlockPassphraseConfig,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__config_unlock_passphrase_put
)
def __health_alive_get(
self,
**kwargs
):
"""health_alive_get # noqa: E501
Retrieve whether NetHSM is alive (powered up). This corresponds to the state <i>Locked</i> or <i>Unprovisioned</i>. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.health_alive_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.health_alive_get = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/health/alive',
'operation_id': 'health_alive_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__health_alive_get
)
def __health_ready_get(
self,
**kwargs
):
"""health_ready_get # noqa: E501
Retrieve whether NetHSM is alive and ready to take traffic. This corresponds to the state <i>Operational</i>. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.health_ready_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.health_ready_get = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/health/ready',
'operation_id': 'health_ready_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__health_ready_get
)
def __health_state_get(
self,
**kwargs
):
"""health_state_get # noqa: E501
Retrieve the state of NetHSM. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.health_state_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
HealthStateData
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.health_state_get = _Endpoint(
settings={
'response_type': (HealthStateData,),
'auth': [],
'endpoint_path': '/health/state',
'operation_id': 'health_state_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__health_state_get
)
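The three health endpoints above split liveness (`/health/alive`), readiness (`/health/ready`), and a JSON state report (`/health/state`). A small sketch of combining the first two into one readiness check, with the HTTP probe injected as a callable so the logic is testable without a device; it assumes status 200 signals success, as is conventional, while the exact codes are defined by the NetHSM API specification:

```python
def is_operational(get_status):
    """Return True when the NetHSM reports it is ready to take traffic.

    get_status: callable mapping an endpoint path to an HTTP status
    code (injected so the logic can be exercised without a device).
    """
    if get_status('/health/alive') != 200:
        return False  # not powered up or not reachable at all
    # alive but possibly still Locked/Unprovisioned; readiness decides
    return get_status('/health/ready') == 200

# usage with a stub in place of real HTTP requests:
statuses = {'/health/alive': 200, '/health/ready': 200}
is_operational(statuses.get)
```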
def __info_get(
self,
**kwargs
):
"""info_get # noqa: E501
Information about the vendor and product. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.info_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
InfoData
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.info_get = _Endpoint(
settings={
'response_type': (InfoData,),
'auth': [],
'endpoint_path': '/info',
'operation_id': 'info_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__info_get
)
def __keys_generate_post(
self,
**kwargs
):
"""keys_generate_post # noqa: E501
Generate a pair of public and private keys and store them in NetHSM. The KeyID parameter is optional and will be generated by NetHSM if not present. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_generate_post(async_req=True)
>>> result = thread.get()
Keyword Args:
body (KeyGenerateRequestData): [optional]
_return_http_data_only (bool): return the response data only, without
the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.keys_generate_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/keys/generate',
'operation_id': 'keys_generate_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(KeyGenerateRequestData,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__keys_generate_post
)
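The `keys_generate_post` endpoint takes an optional JSON body (`KeyGenerateRequestData`). A rough sketch of how the settings above translate into a request; the `'type'` and `'length'` field names below are assumptions for illustration, not taken from the model:

```python
import json

def build_generate_request(body=None):
    # POST /keys/generate; the optional body parameter is located in the
    # request body (location_map) and serialized as JSON (content_type).
    headers = {'Content-Type': 'application/json'}
    payload = json.dumps(body) if body is not None else None
    return ('POST', '/keys/generate', headers, payload)

# 'type' and 'length' are hypothetical field names for illustration only.
req = build_generate_request({'type': 'RSA', 'length': 2048})
```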
def __keys_get(
self,
**kwargs
):
"""keys_get # noqa: E501
Get a list of the identifiers of all keys currently stored in NetHSM. The data for each individual key must be requested separately. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
KeyList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.keys_get = _Endpoint(
settings={
'response_type': (KeyList,),
'auth': [
'basic'
],
'endpoint_path': '/keys',
'operation_id': 'keys_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__keys_get
)
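All `'auth': ['basic']` endpoints in this file, such as `keys_get` above, use standard HTTP Basic authentication. A self-contained sketch of the Authorization header that scheme implies (the credentials are placeholders):

```python
import base64

def basic_auth_header(username, password):
    # The 'basic' auth scheme in the endpoint settings translates to a
    # base64-encoded "user:password" Authorization header (RFC 7617).
    token = base64.b64encode(f'{username}:{password}'.encode()).decode()
    return {'Authorization': f'Basic {token}'}

hdr = basic_auth_header('admin', 'secret')
```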
def __keys_key_id_cert_delete(
self,
key_id,
**kwargs
):
"""keys_key_id_cert_delete # noqa: E501
Delete the certificate. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_cert_delete(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_cert_delete = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}/cert',
'operation_id': 'keys_key_id_cert_delete',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'key_id',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__keys_key_id_cert_delete
)
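The `attribute_map` and `location_map` of `keys_key_id_cert_delete` rename the Python-level `key_id` argument to the `KeyID` placeholder in the path template. A minimal sketch of that substitution step:

```python
def expand_path(template, params):
    # Substitute each {Name} placeholder in the endpoint_path template with
    # the corresponding path parameter value.
    for name, value in params.items():
        template = template.replace('{' + name + '}', value)
    return template

# 'signing-key-1' is a made-up key identifier for illustration.
path = expand_path('/keys/{KeyID}/cert', {'KeyID': 'signing-key-1'})
```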
def __keys_key_id_cert_get(
self,
key_id,
**kwargs
):
"""keys_key_id_cert_get # noqa: E501
Retrieve stored certificate. The content-type header will display the media type of the stored data. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_cert_get(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
str
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_cert_get = _Endpoint(
settings={
'response_type': (str,),
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}/cert',
'operation_id': 'keys_key_id_cert_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'key_id',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__keys_key_id_cert_get
)
def __keys_key_id_cert_put(
self,
key_id,
**kwargs
):
"""keys_key_id_cert_put # noqa: E501
Store a certificate. Maximum size 1 MB. The content-type header provides the media type. Only application/json, application/x-pem-file, application/x-x509-ca-cert, application/octet-stream, text/plain and application/pgp-keys are allowed. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_cert_put(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
body (str): [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_cert_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}/cert',
'operation_id': 'keys_key_id_cert_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'key_id',
'body',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
'body':
(str,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__keys_key_id_cert_put
)
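The `keys_key_id_cert_put` docstring states a maximum certificate size of 1 MB. A small client-side guard one might apply before uploading (assuming the limit is a binary megabyte; the server is authoritative either way):

```python
MAX_CERT_SIZE = 1024 * 1024  # documented 1 MB limit, assumed binary

def check_cert_size(cert_data: bytes) -> bool:
    # Reject certificates over the documented maximum before sending them.
    return len(cert_data) <= MAX_CERT_SIZE

ok = check_cert_size(b'-----BEGIN CERTIFICATE-----\n...')
```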
def __keys_key_id_csr_pem_post(
self,
key_id,
**kwargs
):
"""keys_key_id_csr_pem_post # noqa: E501
Retrieve a certificate signing request in PEM format. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_csr_pem_post(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
body (DistinguishedName): [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
str
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_csr_pem_post = _Endpoint(
settings={
'response_type': (str,),
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}/csr.pem',
'operation_id': 'keys_key_id_csr_pem_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'key_id',
'body',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
'body':
(DistinguishedName,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__keys_key_id_csr_pem_post
)
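The `keys_key_id_csr_pem_post` body is a `DistinguishedName` serialized as JSON. A sketch of such a body; the `commonName`/`countryName` attribute names are assumptions here — consult the `DistinguishedName` model for the names your NetHSM version actually accepts:

```python
import json

# Hypothetical subject fields for illustration only.
dn = {'commonName': 'example.com', 'countryName': 'DE'}
body = json.dumps(dn)
```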
def __keys_key_id_decrypt_post(
self,
key_id,
**kwargs
):
"""keys_key_id_decrypt_post # noqa: E501
Decrypt an encrypted message with the secret key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_decrypt_post(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
body (DecryptRequestData): [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
DecryptData
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_decrypt_post = _Endpoint(
settings={
'response_type': (DecryptData,),
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}/decrypt',
'operation_id': 'keys_key_id_decrypt_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'key_id',
'body',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
'body':
(DecryptRequestData,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__keys_key_id_decrypt_post
)
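A `DecryptRequestData` body carries binary ciphertext, which must be base64-encoded for the JSON payload. A sketch under stated assumptions — the `'mode'` and `'encrypted'` field names are guesses for illustration, not taken from the model:

```python
import base64

def make_decrypt_body(ciphertext: bytes, mode: str) -> dict:
    # Binary data travels base64-encoded inside the JSON request body;
    # field names here are assumed, not verified against DecryptRequestData.
    return {'mode': mode, 'encrypted': base64.b64encode(ciphertext).decode()}

body = make_decrypt_body(b'\x00\x01', 'PKCS1')
```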
def __keys_key_id_delete(
self,
key_id,
**kwargs
):
"""keys_key_id_delete # noqa: E501
Delete a public/private key pair. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_delete(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_delete = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}',
'operation_id': 'keys_key_id_delete',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'key_id',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__keys_key_id_delete
)
def __keys_key_id_get(
self,
key_id,
**kwargs
):
"""keys_key_id_get # noqa: E501
Retrieve the public key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_get(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
PublicKey
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_get = _Endpoint(
settings={
'response_type': (PublicKey,),
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}',
'operation_id': 'keys_key_id_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'key_id',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__keys_key_id_get
)
def __keys_key_id_public_pem_get(
self,
key_id,
**kwargs
):
"""keys_key_id_public_pem_get # noqa: E501
Retrieve public key in PEM format. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_public_pem_get(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
str
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_public_pem_get = _Endpoint(
settings={
'response_type': (str,),
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}/public.pem',
'operation_id': 'keys_key_id_public_pem_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'key_id',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/x-pem-file'
],
'content_type': [],
},
api_client=api_client,
callable=__keys_key_id_public_pem_get
)
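Unlike the JSON endpoints, `keys_key_id_public_pem_get` advertises `application/x-pem-file` in its accept header, so the response body is a PEM document rather than JSON. A quick, illustrative sanity check on such a response:

```python
def is_pem_public_key(text: str) -> bool:
    # Minimal check that the returned document looks like a PEM public key.
    return text.strip().startswith('-----BEGIN PUBLIC KEY-----')

ok = is_pem_public_key('-----BEGIN PUBLIC KEY-----\nMIIB\n-----END PUBLIC KEY-----')
```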
def __keys_key_id_put(
self,
key_id,
**kwargs
):
"""keys_key_id_put # noqa: E501
Import a private key into NetHSM and store it under the <i>KeyID</i> path. The public key will be automatically derived. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_put(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
body (PrivateKey): [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}',
'operation_id': 'keys_key_id_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'key_id',
'body',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
'body':
(PrivateKey,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__keys_key_id_put
)
def __keys_key_id_sign_post(
self,
key_id,
**kwargs
):
"""keys_key_id_sign_post # noqa: E501
Sign a message with the secret key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_key_id_sign_post(key_id, async_req=True)
>>> result = thread.get()
Args:
key_id (str):
Keyword Args:
body (SignRequestData): [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SignData
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['key_id'] = \
key_id
return self.call_with_http_info(**kwargs)
self.keys_key_id_sign_post = _Endpoint(
settings={
'response_type': (SignData,),
'auth': [
'basic'
],
'endpoint_path': '/keys/{KeyID}/sign',
'operation_id': 'keys_key_id_sign_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'key_id',
'body',
],
'required': [
'key_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'key_id':
(str,),
'body':
(SignRequestData,),
},
'attribute_map': {
'key_id': 'KeyID',
},
'location_map': {
'key_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__keys_key_id_sign_post
)
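The `SignRequestData` body for `keys_key_id_sign_post` likewise carries the message base64-encoded in JSON. A hedged sketch; `'mode'` and `'message'` are assumed field names, and `PSS_SHA256` is a placeholder mode, neither verified against the model:

```python
import base64

def make_sign_body(message: bytes, mode: str) -> dict:
    # The message bytes are base64-encoded for the JSON body; the signature
    # scheme is selected via the (assumed) 'mode' field.
    return {'mode': mode, 'message': base64.b64encode(message).decode()}

body = make_sign_body(b'hello', 'PSS_SHA256')
```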
def __keys_post(
self,
**kwargs
):
"""keys_post # noqa: E501
Import a private key into NetHSM and let NetHSM generate a KeyID. The public key will be automatically derived. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.keys_post(async_req=True)
>>> result = thread.get()
Keyword Args:
mechanisms ([KeyMechanism]): [optional]
body (PrivateKey): [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
KeyItem
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.keys_post = _Endpoint(
settings={
'response_type': (KeyItem,),
'auth': [
'basic'
],
'endpoint_path': '/keys',
'operation_id': 'keys_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'mechanisms',
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'mechanisms':
([KeyMechanism],),
'body':
(PrivateKey,),
},
'attribute_map': {
'mechanisms': 'mechanisms',
},
'location_map': {
'mechanisms': 'query',
'body': 'body',
},
'collection_format_map': {
'mechanisms': 'csv',
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__keys_post
)
def __lock_post(
self,
**kwargs
):
"""lock_post # noqa: E501
Brings an <i>Operational</i> NetHSM into the <i>Locked</i> state. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.lock_post(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.lock_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/lock',
'operation_id': 'lock_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__lock_post
)
def __metrics_get(
self,
**kwargs
):
"""metrics_get # noqa: E501
Get metrics. Precondition: NetHSM is <i>Operational</i> and a user with the <b>R-Metrics</b> role can be authenticated. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.metrics_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
{str: (bool, date, datetime, dict, float, int, list, str, none_type)}
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.metrics_get = _Endpoint(
settings={
'response_type': ({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'auth': [
'basic'
],
'endpoint_path': '/metrics',
'operation_id': 'metrics_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__metrics_get
)
def __provision_post(
self,
**kwargs
):
"""provision_post # noqa: E501
Initial provisioning, only available in the <i>Unprovisioned</i> state. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.provision_post(async_req=True)
>>> result = thread.get()
Keyword Args:
body (ProvisionRequestData): [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.provision_post = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/provision',
'operation_id': 'provision_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(ProvisionRequestData,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__provision_post
)
def __random_post(
self,
**kwargs
):
"""random_post # noqa: E501
Retrieve cryptographically strong random bytes from NetHSM. Precondition: NetHSM is <i>Operational</i> and a user with the <b>R-Operator</b> role can be authenticated. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.random_post(async_req=True)
>>> result = thread.get()
Keyword Args:
body (RandomRequestData): [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
RandomData
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.random_post = _Endpoint(
settings={
'response_type': (RandomData,),
'auth': [
'basic'
],
'endpoint_path': '/random',
'operation_id': 'random_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(RandomRequestData,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__random_post
)
def __system_backup_post(
self,
**kwargs
):
"""system_backup_post # noqa: E501
Back up the key store to a backup file. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_backup_post(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_backup_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/system/backup',
'operation_id': 'system_backup_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__system_backup_post
)
def __system_cancel_update_post(
self,
**kwargs
):
"""system_cancel_update_post # noqa: E501
Cancel update of NetHSM software. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_cancel_update_post(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_cancel_update_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/system/cancel-update',
'operation_id': 'system_cancel_update_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__system_cancel_update_post
)
def __system_commit_update_post(
self,
**kwargs
):
"""system_commit_update_post # noqa: E501
Commit update of NetHSM software. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_commit_update_post(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_commit_update_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/system/commit-update',
'operation_id': 'system_commit_update_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__system_commit_update_post
)
def __system_info_get(
self,
**kwargs
):
"""system_info_get # noqa: E501
Get detailed system information, including the firmware, system software, and hardware versions. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_info_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SystemInfo
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_info_get = _Endpoint(
settings={
'response_type': (SystemInfo,),
'auth': [
'basic'
],
'endpoint_path': '/system/info',
'operation_id': 'system_info_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__system_info_get
)
def __system_reboot_post(
self,
**kwargs
):
"""system_reboot_post # noqa: E501
Reboot NetHSM. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_reboot_post(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_reboot_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/system/reboot',
'operation_id': 'system_reboot_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__system_reboot_post
)
def __system_reset_post(
self,
**kwargs
):
"""system_reset_post # noqa: E501
Reset NetHSM to factory settings. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_reset_post(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_reset_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/system/reset',
'operation_id': 'system_reset_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__system_reset_post
)
def __system_restore_post(
self,
backup_passphrase,
system_time,
**kwargs
):
"""system_restore_post # noqa: E501
Restore the key store from a backup file. Only available in the <i>Unprovisioned</i> state. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_restore_post(backup_passphrase, system_time, async_req=True)
>>> result = thread.get()
Args:
backup_passphrase (str):
system_time (datetime):
Keyword Args:
body ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['backup_passphrase'] = \
backup_passphrase
kwargs['system_time'] = \
system_time
return self.call_with_http_info(**kwargs)
self.system_restore_post = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/system/restore',
'operation_id': 'system_restore_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'backup_passphrase',
'system_time',
'body',
],
'required': [
'backup_passphrase',
'system_time',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'backup_passphrase':
(str,),
'system_time':
(datetime,),
'body':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
},
'attribute_map': {
'backup_passphrase': 'backupPassphrase',
'system_time': 'systemTime',
},
'location_map': {
'backup_passphrase': 'query',
'system_time': 'query',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__system_restore_post
)
def __system_shutdown_post(
self,
**kwargs
):
"""system_shutdown_post # noqa: E501
Shut down NetHSM. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_shutdown_post(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_shutdown_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/system/shutdown',
'operation_id': 'system_shutdown_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__system_shutdown_post
)
def __system_update_post(
self,
**kwargs
):
"""system_update_post # noqa: E501
Update NetHSM software. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.system_update_post(async_req=True)
>>> result = thread.get()
Keyword Args:
body ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If a
single number is provided, it is the total request timeout. It can
also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SystemUpdateData
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.system_update_post = _Endpoint(
settings={
'response_type': (SystemUpdateData,),
'auth': [
'basic'
],
'endpoint_path': '/system/update',
'operation_id': 'system_update_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__system_update_post
)
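Each generated endpoint method above normalizes its keyword arguments with the same `kwargs.get` defaulting pattern before delegating to `call_with_http_info`. The pattern can be sketched standalone (the helper name `apply_call_defaults` is illustrative, not part of the generated client):

```python
def apply_call_defaults(kwargs):
    # Caller-supplied values win; otherwise the library defaults apply.
    # Note kwargs.get(key, default) never raises for a missing key, so
    # every call reaches call_with_http_info with a complete set of flags.
    defaults = {
        'async_req': False,
        '_return_http_data_only': True,
        '_preload_content': True,
        '_request_timeout': None,
        '_check_input_type': True,
        '_check_return_type': True,
        '_host_index': None,
    }
    for key, default in defaults.items():
        kwargs[key] = kwargs.get(key, default)
    return kwargs


print(apply_call_defaults({'async_req': True})['async_req'])   # True
print(apply_call_defaults({})['_return_http_data_only'])       # True
```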
def __unlock_post(
self,
**kwargs
):
"""unlock_post # noqa: E501
Brings a <i>Locked</i> NetHSM into <i>Operational</i> state. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.unlock_post(async_req=True)
>>> result = thread.get()
Keyword Args:
body (UnlockRequestData): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.unlock_post = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/unlock',
'operation_id': 'unlock_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(UnlockRequestData,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__unlock_post
)
def __users_get(
self,
**kwargs
):
"""users_get # noqa: E501
Get a list of all user ids that have accounts on NetHSM. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.users_get(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
UserList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.users_get = _Endpoint(
settings={
'response_type': (UserList,),
'auth': [
'basic'
],
'endpoint_path': '/users',
'operation_id': 'users_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
},
'attribute_map': {
},
'location_map': {
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__users_get
)
def __users_post(
self,
**kwargs
):
"""users_post # noqa: E501
Create a new user on NetHSM. The user-ID is generated by NetHSM. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.users_post(async_req=True)
>>> result = thread.get()
Keyword Args:
body (UserPostData): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.users_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/users',
'operation_id': 'users_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'body',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'body':
(UserPostData,),
},
'attribute_map': {
},
'location_map': {
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__users_post
)
def __users_user_id_delete(
self,
user_id,
**kwargs
):
"""users_user_id_delete # noqa: E501
Delete a user from keyfender. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.users_user_id_delete(user_id, async_req=True)
>>> result = thread.get()
Args:
user_id (str):
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['user_id'] = \
user_id
return self.call_with_http_info(**kwargs)
self.users_user_id_delete = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/users/{UserID}',
'operation_id': 'users_user_id_delete',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'user_id',
],
'required': [
'user_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'user_id':
(str,),
},
'attribute_map': {
'user_id': 'UserID',
},
'location_map': {
'user_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__users_user_id_delete
)
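The `attribute_map` and `location_map` entries above are what turn the python-style `user_id` argument into the `{UserID}` path segment at request time. A minimal sketch of that substitution (the function name `render_path` is hypothetical, not part of the generated client):

```python
def render_path(endpoint_path, attribute_map, location_map, params):
    # Substitute each path-located parameter into its {WireName} slot;
    # query/body-located parameters are left to other machinery.
    path = endpoint_path
    for py_name, wire_name in attribute_map.items():
        if location_map.get(py_name) == 'path':
            path = path.replace('{%s}' % wire_name, str(params[py_name]))
    return path


print(render_path('/users/{UserID}',
                  {'user_id': 'UserID'},
                  {'user_id': 'path'},
                  {'user_id': 'operator1'}))   # /users/operator1
```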
def __users_user_id_get(
self,
user_id,
**kwargs
):
"""users_user_id_get # noqa: E501
Get user info: name and role. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.users_user_id_get(user_id, async_req=True)
>>> result = thread.get()
Args:
user_id (str):
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
UserData
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['user_id'] = \
user_id
return self.call_with_http_info(**kwargs)
self.users_user_id_get = _Endpoint(
settings={
'response_type': (UserData,),
'auth': [
'basic'
],
'endpoint_path': '/users/{UserID}',
'operation_id': 'users_user_id_get',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'user_id',
],
'required': [
'user_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'user_id':
(str,),
},
'attribute_map': {
'user_id': 'UserID',
},
'location_map': {
'user_id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__users_user_id_get
)
def __users_user_id_passphrase_post(
self,
user_id,
**kwargs
):
"""users_user_id_passphrase_post # noqa: E501
Update the passphrase. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.users_user_id_passphrase_post(user_id, async_req=True)
>>> result = thread.get()
Args:
user_id (str):
Keyword Args:
body (UserPassphrasePostData): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['user_id'] = \
user_id
return self.call_with_http_info(**kwargs)
self.users_user_id_passphrase_post = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/users/{UserID}/passphrase',
'operation_id': 'users_user_id_passphrase_post',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'user_id',
'body',
],
'required': [
'user_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'user_id':
(str,),
'body':
(UserPassphrasePostData,),
},
'attribute_map': {
'user_id': 'UserID',
},
'location_map': {
'user_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__users_user_id_passphrase_post
)
def __users_user_id_put(
self,
user_id,
**kwargs
):
"""users_user_id_put # noqa: E501
Create a user on keyfender. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.users_user_id_put(user_id, async_req=True)
>>> result = thread.get()
Args:
user_id (str):
Keyword Args:
body (UserPostData): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['user_id'] = \
user_id
return self.call_with_http_info(**kwargs)
self.users_user_id_put = _Endpoint(
settings={
'response_type': None,
'auth': [
'basic'
],
'endpoint_path': '/users/{UserID}',
'operation_id': 'users_user_id_put',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'user_id',
'body',
],
'required': [
'user_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'user_id':
(str,),
'body':
(UserPostData,),
},
'attribute_map': {
'user_id': 'UserID',
},
'location_map': {
'user_id': 'path',
'body': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [
'application/json'
]
},
api_client=api_client,
callable=__users_user_id_put
)
from flask import jsonify
from app.api import bp
@bp.route('/ping', methods=["GET"])
def ping():
return jsonify("pong!!")
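The `@bp.route` decorator works by registering the decorated function in a rule table at import time. A stdlib-only sketch of that registration pattern (this toy `MiniBlueprint` illustrates the mechanism only; it is not Flask's actual implementation):

```python
class MiniBlueprint:
    """Toy stand-in for flask.Blueprint's route registration."""

    def __init__(self):
        self.rules = {}

    def route(self, rule, methods=("GET",)):
        # route(...) returns a decorator that records the view function
        # under each (rule, method) pair, then hands the function back.
        def decorator(func):
            for method in methods:
                self.rules[(rule, method)] = func
            return func
        return decorator


bp = MiniBlueprint()


@bp.route('/ping', methods=["GET"])
def ping():
    return "pong!!"


print(bp.rules[('/ping', 'GET')]())   # pong!!
```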
from __future__ import print_function
import numpy as np
import unittest
import discretize
from pymatsolver import Solver
MESHTYPES = ['uniformTensorMesh']
def getxBCyBC_CC(mesh, alpha, beta, gamma):
"""
This is a subfunction generating mixed-boundary condition:
.. math::
\nabla \cdot \vec{j} = -\nabla \cdot \vec{j}_s = q
\rho \vec{j} = -\nabla \phi
\alpha \phi + \beta \frac{\partial \phi}{\partial r} = \gamma \ at \ r
= \partial \Omega
xBC = f_1(\alpha, \beta, \gamma)
yBC = f(\alpha, \beta, \gamma)
Computes xBC and yBC for cell-centered discretizations
"""
if mesh.dim == 1: # 1D
if (len(alpha) != 2 or len(beta) != 2 or len(gamma) != 2):
raise Exception("Length of list, alpha should be 2")
fCCxm, fCCxp = mesh.cellBoundaryInd
nBC = fCCxm.sum()+fCCxp.sum()
h_xm, h_xp = mesh.gridCC[fCCxm], mesh.gridCC[fCCxp]
alpha_xm, beta_xm, gamma_xm = alpha[0], beta[0], gamma[0]
alpha_xp, beta_xp, gamma_xp = alpha[1], beta[1], gamma[1]
# h_xm, h_xp = mesh.gridCC[fCCxm], mesh.gridCC[fCCxp]
h_xm, h_xp = mesh.hx[0], mesh.hx[-1]
a_xm = gamma_xm/(0.5*alpha_xm-beta_xm/h_xm)
b_xm = (0.5*alpha_xm+beta_xm/h_xm)/(0.5*alpha_xm-beta_xm/h_xm)
a_xp = gamma_xp/(0.5*alpha_xp-beta_xp/h_xp)
b_xp = (0.5*alpha_xp+beta_xp/h_xp)/(0.5*alpha_xp-beta_xp/h_xp)
xBC_xm = 0.5*a_xm
xBC_xp = 0.5*a_xp/b_xp
yBC_xm = 0.5*(1.-b_xm)
yBC_xp = 0.5*(1.-1./b_xp)
xBC = np.r_[xBC_xm, xBC_xp]
yBC = np.r_[yBC_xm, yBC_xp]
elif mesh.dim == 2: # 2D
if (len(alpha) != 4 or len(beta) != 4 or len(gamma) != 4):
raise Exception("Length of list, alpha should be 4")
fxm, fxp, fym, fyp = mesh.faceBoundaryInd
nBC = fxm.sum()+fxp.sum()+fxm.sum()+fxp.sum()
alpha_xm, beta_xm, gamma_xm = alpha[0], beta[0], gamma[0]
alpha_xp, beta_xp, gamma_xp = alpha[1], beta[1], gamma[1]
alpha_ym, beta_ym, gamma_ym = alpha[2], beta[2], gamma[2]
alpha_yp, beta_yp, gamma_yp = alpha[3], beta[3], gamma[3]
# h_xm, h_xp = mesh.gridCC[fCCxm,0], mesh.gridCC[fCCxp,0]
# h_ym, h_yp = mesh.gridCC[fCCym,1], mesh.gridCC[fCCyp,1]
h_xm = mesh.hx[0]*np.ones_like(alpha_xm)
h_xp = mesh.hx[-1]*np.ones_like(alpha_xp)
h_ym = mesh.hy[0]*np.ones_like(alpha_ym)
h_yp = mesh.hy[-1]*np.ones_like(alpha_yp)
a_xm = gamma_xm/(0.5*alpha_xm-beta_xm/h_xm)
b_xm = (0.5*alpha_xm+beta_xm/h_xm)/(0.5*alpha_xm-beta_xm/h_xm)
a_xp = gamma_xp/(0.5*alpha_xp-beta_xp/h_xp)
b_xp = (0.5*alpha_xp+beta_xp/h_xp)/(0.5*alpha_xp-beta_xp/h_xp)
a_ym = gamma_ym/(0.5*alpha_ym-beta_ym/h_ym)
b_ym = (0.5*alpha_ym+beta_ym/h_ym)/(0.5*alpha_ym-beta_ym/h_ym)
a_yp = gamma_yp/(0.5*alpha_yp-beta_yp/h_yp)
b_yp = (0.5*alpha_yp+beta_yp/h_yp)/(0.5*alpha_yp-beta_yp/h_yp)
xBC_xm = 0.5*a_xm
xBC_xp = 0.5*a_xp/b_xp
yBC_xm = 0.5*(1.-b_xm)
yBC_xp = 0.5*(1.-1./b_xp)
xBC_ym = 0.5*a_ym
xBC_yp = 0.5*a_yp/b_yp
yBC_ym = 0.5*(1.-b_ym)
yBC_yp = 0.5*(1.-1./b_yp)
sortindsfx = np.argsort(np.r_[np.arange(mesh.nFx)[fxm],
np.arange(mesh.nFx)[fxp]])
sortindsfy = np.argsort(np.r_[np.arange(mesh.nFy)[fym],
np.arange(mesh.nFy)[fyp]])
xBC_x = np.r_[xBC_xm, xBC_xp][sortindsfx]
xBC_y = np.r_[xBC_ym, xBC_yp][sortindsfy]
yBC_x = np.r_[yBC_xm, yBC_xp][sortindsfx]
yBC_y = np.r_[yBC_ym, yBC_yp][sortindsfy]
xBC = np.r_[xBC_x, xBC_y]
yBC = np.r_[yBC_x, yBC_y]
elif mesh.dim == 3: # 3D
if (len(alpha) != 6 or len(beta) != 6 or len(gamma) != 6):
raise Exception("Length of list, alpha should be 6")
# fCCxm,fCCxp,fCCym,fCCyp,fCCzm,fCCzp = mesh.cellBoundaryInd
fxm, fxp, fym, fyp, fzm, fzp = mesh.faceBoundaryInd
nBC = fxm.sum()+fxp.sum()+fxm.sum()+fxp.sum()
alpha_xm, beta_xm, gamma_xm = alpha[0], beta[0], gamma[0]
alpha_xp, beta_xp, gamma_xp = alpha[1], beta[1], gamma[1]
alpha_ym, beta_ym, gamma_ym = alpha[2], beta[2], gamma[2]
alpha_yp, beta_yp, gamma_yp = alpha[3], beta[3], gamma[3]
alpha_zm, beta_zm, gamma_zm = alpha[4], beta[4], gamma[4]
alpha_zp, beta_zp, gamma_zp = alpha[5], beta[5], gamma[5]
# h_xm, h_xp = mesh.gridCC[fCCxm,0], mesh.gridCC[fCCxp,0]
# h_ym, h_yp = mesh.gridCC[fCCym,1], mesh.gridCC[fCCyp,1]
# h_zm, h_zp = mesh.gridCC[fCCzm,2], mesh.gridCC[fCCzp,2]
h_xm = mesh.hx[0]*np.ones_like(alpha_xm)
h_xp = mesh.hx[-1]*np.ones_like(alpha_xp)
h_ym = mesh.hy[0]*np.ones_like(alpha_ym)
h_yp = mesh.hy[-1]*np.ones_like(alpha_yp)
h_zm = mesh.hz[0]*np.ones_like(alpha_zm)
h_zp = mesh.hz[-1]*np.ones_like(alpha_zp)
a_xm = gamma_xm/(0.5*alpha_xm-beta_xm/h_xm)
b_xm = (0.5*alpha_xm+beta_xm/h_xm)/(0.5*alpha_xm-beta_xm/h_xm)
a_xp = gamma_xp/(0.5*alpha_xp-beta_xp/h_xp)
b_xp = (0.5*alpha_xp+beta_xp/h_xp)/(0.5*alpha_xp-beta_xp/h_xp)
a_ym = gamma_ym/(0.5*alpha_ym-beta_ym/h_ym)
b_ym = (0.5*alpha_ym+beta_ym/h_ym)/(0.5*alpha_ym-beta_ym/h_ym)
a_yp = gamma_yp/(0.5*alpha_yp-beta_yp/h_yp)
b_yp = (0.5*alpha_yp+beta_yp/h_yp)/(0.5*alpha_yp-beta_yp/h_yp)
a_zm = gamma_zm/(0.5*alpha_zm-beta_zm/h_zm)
b_zm = (0.5*alpha_zm+beta_zm/h_zm)/(0.5*alpha_zm-beta_zm/h_zm)
a_zp = gamma_zp/(0.5*alpha_zp-beta_zp/h_zp)
b_zp = (0.5*alpha_zp+beta_zp/h_zp)/(0.5*alpha_zp-beta_zp/h_zp)
xBC_xm = 0.5*a_xm
xBC_xp = 0.5*a_xp/b_xp
yBC_xm = 0.5*(1.-b_xm)
yBC_xp = 0.5*(1.-1./b_xp)
xBC_ym = 0.5*a_ym
xBC_yp = 0.5*a_yp/b_yp
yBC_ym = 0.5*(1.-b_ym)
yBC_yp = 0.5*(1.-1./b_yp)
xBC_zm = 0.5*a_zm
xBC_zp = 0.5*a_zp/b_zp
yBC_zm = 0.5*(1.-b_zm)
yBC_zp = 0.5*(1.-1./b_zp)
sortindsfx = np.argsort(np.r_[np.arange(mesh.nFx)[fxm],
np.arange(mesh.nFx)[fxp]])
sortindsfy = np.argsort(np.r_[np.arange(mesh.nFy)[fym],
np.arange(mesh.nFy)[fyp]])
sortindsfz = np.argsort(np.r_[np.arange(mesh.nFz)[fzm],
np.arange(mesh.nFz)[fzp]])
xBC_x = np.r_[xBC_xm, xBC_xp][sortindsfx]
xBC_y = np.r_[xBC_ym, xBC_yp][sortindsfy]
xBC_z = np.r_[xBC_zm, xBC_zp][sortindsfz]
yBC_x = np.r_[yBC_xm, yBC_xp][sortindsfx]
yBC_y = np.r_[yBC_ym, yBC_yp][sortindsfy]
yBC_z = np.r_[yBC_zm, yBC_zp][sortindsfz]
xBC = np.r_[xBC_x, xBC_y, xBC_z]
yBC = np.r_[yBC_x, yBC_y, yBC_z]
return xBC, yBC
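For reference, the coefficients above follow from discretizing the mixed condition at a boundary face using the first interior cell-centre value \(\phi_1\) and a ghost value \(\phi_g\) one cell width \(h\) across the face (a sketch of the derivation; the other faces are symmetric up to the outward-normal sign). At the x-minus boundary:

```latex
\alpha\,\frac{\phi_g + \phi_1}{2} + \beta\,\frac{\phi_1 - \phi_g}{h} = \gamma
\quad\Longrightarrow\quad
\phi_g = a - b\,\phi_1,\qquad
a = \frac{\gamma}{\tfrac{1}{2}\alpha - \beta/h},\qquad
b = \frac{\tfrac{1}{2}\alpha + \beta/h}{\tfrac{1}{2}\alpha - \beta/h},
```

so the face value \((\phi_g + \phi_1)/2 = \tfrac{1}{2}a + \tfrac{1}{2}(1-b)\,\phi_1\), which is exactly `xBC_xm = 0.5*a_xm` plus `yBC_xm = 0.5*(1.-b_xm)` times the interior value.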
class Test1D_InhomogeneousMixed(discretize.Tests.OrderTest):
name = "1D - Mixed"
meshTypes = MESHTYPES
meshDimension = 1
expectedOrders = 2
meshSizes = [4, 8, 16, 32]
def getError(self):
# Test function
def phi_fun(x): return np.cos(np.pi*x)
def j_fun(x): return np.pi*np.sin(np.pi*x)
def phi_deriv(x): return -j_fun(x)
def q_fun(x): return (np.pi**2)*np.cos(np.pi*x)
xc_ana = phi_fun(self.M.gridCC)
q_ana = q_fun(self.M.gridCC)
j_ana = j_fun(self.M.gridFx)
# Get boundary locations
vecN = self.M.vectorNx
vecC = self.M.vectorCCx
# Setup Mixed B.C (alpha, beta, gamma)
alpha_xm, alpha_xp = 1., 1.
beta_xm, beta_xp = 1., 1.
alpha = np.r_[alpha_xm, alpha_xp]
beta = np.r_[beta_xm, beta_xp]
vecN = self.M.vectorNx
vecC = self.M.vectorCCx
phi_bc = phi_fun(vecN[[0, -1]])
phi_deriv_bc = phi_deriv(vecN[[0, -1]])
gamma = alpha*phi_bc + beta*phi_deriv_bc
x_BC, y_BC = getxBCyBC_CC(self.M, alpha, beta, gamma)
sigma = np.ones(self.M.nC)
Mfrho = self.M.getFaceInnerProduct(1./sigma)
MfrhoI = self.M.getFaceInnerProduct(1./sigma, invMat=True)
V = discretize.utils.sdiag(self.M.vol)
Div = V*self.M.faceDiv
P_BC, B = self.M.getBCProjWF_simple()
q = q_fun(self.M.gridCC)
M = B*self.M.aveCC2F
G = Div.T - P_BC*discretize.utils.sdiag(y_BC)*M
# Mrhoj = D.T V phi + P_BC*discretize.utils.sdiag(y_BC)*M phi - P_BC*x_BC
rhs = V*q + Div*MfrhoI*P_BC*x_BC
A = Div*MfrhoI*G
if self.myTest == 'xc':
# TODO: fix the null space
Ainv = Solver(A)
xc = Ainv*rhs
err = np.linalg.norm((xc-xc_ana), np.inf)
else:
raise NotImplementedError
return err
def test_order(self):
print("==== Testing Mixed boudary conduction for CC-problem ====")
self.name = "1D"
self.myTest = 'xc'
self.orderTest()
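The order test above relies on the manufactured solution satisfying -phi'' = q exactly. That identity can be checked numerically with a quick finite-difference sweep (standalone sanity check, not part of the test suite):

```python
import numpy as np


def phi_fun(x):
    return np.cos(np.pi * x)


def q_fun(x):
    return (np.pi ** 2) * np.cos(np.pi * x)


h = 1e-3
x = np.linspace(0.1, 0.9, 101)
# Central second difference approximates phi''(x); -phi'' should equal q
# up to O(h^2) truncation error (~8e-6 here).
lap = (phi_fun(x - h) - 2.0 * phi_fun(x) + phi_fun(x + h)) / h ** 2
err = np.max(np.abs(-lap - q_fun(x)))
print(err < 1e-4)   # True
```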
class Test2D_InhomogeneousMixed(discretize.Tests.OrderTest):
name = "2D - Mixed"
meshTypes = MESHTYPES
meshDimension = 2
expectedOrders = 2
meshSizes = [4, 8, 16, 32]
def getError(self):
# Test function
def phi_fun(x):
return np.cos(np.pi*x[:, 0])*np.cos(np.pi*x[:, 1])
def j_funX(x):
return +np.pi*np.sin(np.pi*x[:, 0])*np.cos(np.pi*x[:, 1])
def j_funY(x):
return +np.pi*np.cos(np.pi*x[:, 0])*np.sin(np.pi*x[:, 1])
def phideriv_funX(x):
return -j_funX(x)
def phideriv_funY(x):
return -j_funY(x)
def q_fun(x):
return +2*(np.pi**2)*phi_fun(x)
xc_ana = phi_fun(self.M.gridCC)
q_ana = q_fun(self.M.gridCC)
jX_ana = j_funX(self.M.gridFx)
jY_ana = j_funY(self.M.gridFy)
j_ana = np.r_[jX_ana, jY_ana]
# Get boundary locations
fxm, fxp, fym, fyp = self.M.faceBoundaryInd
gBFxm = self.M.gridFx[fxm, :]
gBFxp = self.M.gridFx[fxp, :]
gBFym = self.M.gridFy[fym, :]
gBFyp = self.M.gridFy[fyp, :]
# Setup Mixed B.C (alpha, beta, gamma)
alpha_xm = np.ones_like(gBFxm[:, 0])
alpha_xp = np.ones_like(gBFxp[:, 0])
beta_xm = np.ones_like(gBFxm[:, 0])
beta_xp = np.ones_like(gBFxp[:, 0])
alpha_ym = np.ones_like(gBFym[:, 1])
alpha_yp = np.ones_like(gBFyp[:, 1])
beta_ym = np.ones_like(gBFym[:, 1])
beta_yp = np.ones_like(gBFyp[:, 1])
phi_bc_xm, phi_bc_xp = phi_fun(gBFxm), phi_fun(gBFxp)
phi_bc_ym, phi_bc_yp = phi_fun(gBFym), phi_fun(gBFyp)
phiderivX_bc_xm = phideriv_funX(gBFxm)
phiderivX_bc_xp = phideriv_funX(gBFxp)
phiderivY_bc_ym = phideriv_funY(gBFym)
phiderivY_bc_yp = phideriv_funY(gBFyp)
def gamma_fun(alpha, beta, phi, phi_deriv):
return alpha*phi + beta*phi_deriv
gamma_xm = gamma_fun(alpha_xm, beta_xm, phi_bc_xm, phiderivX_bc_xm)
gamma_xp = gamma_fun(alpha_xp, beta_xp, phi_bc_xp, phiderivX_bc_xp)
gamma_ym = gamma_fun(alpha_ym, beta_ym, phi_bc_ym, phiderivY_bc_ym)
gamma_yp = gamma_fun(alpha_yp, beta_yp, phi_bc_yp, phiderivY_bc_yp)
alpha = [alpha_xm, alpha_xp, alpha_ym, alpha_yp]
beta = [beta_xm, beta_xp, beta_ym, beta_yp]
gamma = [gamma_xm, gamma_xp, gamma_ym, gamma_yp]
x_BC, y_BC = getxBCyBC_CC(self.M, alpha, beta, gamma)
sigma = np.ones(self.M.nC)
Mfrho = self.M.getFaceInnerProduct(1./sigma)
MfrhoI = self.M.getFaceInnerProduct(1./sigma, invMat=True)
V = discretize.utils.sdiag(self.M.vol)
Div = V*self.M.faceDiv
P_BC, B = self.M.getBCProjWF_simple()
q = q_fun(self.M.gridCC)
M = B*self.M.aveCC2F
G = Div.T - P_BC*discretize.utils.sdiag(y_BC)*M
rhs = V*q + Div*MfrhoI*P_BC*x_BC
A = Div*MfrhoI*G
if self.myTest == 'xc':
Ainv = Solver(A)
xc = Ainv*rhs
err = np.linalg.norm((xc-xc_ana), np.inf)
else:
raise NotImplementedError
return err
def test_order(self):
print("==== Testing Mixed boudary conduction for CC-problem ====")
self.name = "2D"
self.myTest = 'xc'
self.orderTest()
class Test3D_InhomogeneousMixed(discretize.Tests.OrderTest):
name = "3D - Mixed"
meshTypes = MESHTYPES
meshDimension = 3
expectedOrders = 2
meshSizes = [4, 8, 16]
def getError(self):
# Test function
def phi_fun(x):
return (np.cos(np.pi*x[:, 0])*np.cos(np.pi*x[:, 1]) *
np.cos(np.pi*x[:, 2]))
def j_funX(x):
return (np.pi*np.sin(np.pi*x[:, 0])*np.cos(np.pi*x[:, 1]) *
np.cos(np.pi*x[:, 2]))
def j_funY(x):
return (np.pi*np.cos(np.pi*x[:, 0])*np.sin(np.pi*x[:, 1]) *
np.cos(np.pi*x[:, 2]))
def j_funZ(x):
return (np.pi*np.cos(np.pi*x[:, 0])*np.cos(np.pi*x[:, 1]) *
np.sin(np.pi*x[:, 2]))
def phideriv_funX(x): return -j_funX(x)
def phideriv_funY(x): return -j_funY(x)
def phideriv_funZ(x): return -j_funZ(x)
def q_fun(x): return 3*(np.pi**2)*phi_fun(x)
xc_ana = phi_fun(self.M.gridCC)
q_ana = q_fun(self.M.gridCC)
jX_ana = j_funX(self.M.gridFx)
jY_ana = j_funY(self.M.gridFy)
jZ_ana = j_funZ(self.M.gridFz)
j_ana = np.r_[jX_ana, jY_ana, jZ_ana]
# Get boundary locations
fxm, fxp, fym, fyp, fzm, fzp = self.M.faceBoundaryInd
gBFxm = self.M.gridFx[fxm, :]
gBFxp = self.M.gridFx[fxp, :]
gBFym = self.M.gridFy[fym, :]
gBFyp = self.M.gridFy[fyp, :]
gBFzm = self.M.gridFz[fzm, :]
gBFzp = self.M.gridFz[fzp, :]
# Setup Mixed B.C (alpha, beta, gamma)
alpha_xm = np.ones_like(gBFxm[:, 0])
alpha_xp = np.ones_like(gBFxp[:, 0])
beta_xm = np.ones_like(gBFxm[:, 0])
beta_xp = np.ones_like(gBFxp[:, 0])
alpha_ym = np.ones_like(gBFym[:, 1])
alpha_yp = np.ones_like(gBFyp[:, 1])
beta_ym = np.ones_like(gBFym[:, 1])
beta_yp = np.ones_like(gBFyp[:, 1])
alpha_zm = np.ones_like(gBFzm[:, 2])
alpha_zp = np.ones_like(gBFzp[:, 2])
beta_zm = np.ones_like(gBFzm[:, 2])
beta_zp = np.ones_like(gBFzp[:, 2])
phi_bc_xm, phi_bc_xp = phi_fun(gBFxm), phi_fun(gBFxp)
phi_bc_ym, phi_bc_yp = phi_fun(gBFym), phi_fun(gBFyp)
phi_bc_zm, phi_bc_zp = phi_fun(gBFzm), phi_fun(gBFzp)
phiderivX_bc_xm = phideriv_funX(gBFxm)
phiderivX_bc_xp = phideriv_funX(gBFxp)
phiderivY_bc_ym = phideriv_funY(gBFym)
phiderivY_bc_yp = phideriv_funY(gBFyp)
phiderivZ_bc_zm = phideriv_funZ(gBFzm)
phiderivZ_bc_zp = phideriv_funZ(gBFzp)
def gamma_fun(alpha, beta, phi, phi_deriv):
return alpha*phi + beta*phi_deriv
gamma_xm = gamma_fun(alpha_xm, beta_xm, phi_bc_xm, phiderivX_bc_xm)
gamma_xp = gamma_fun(alpha_xp, beta_xp, phi_bc_xp, phiderivX_bc_xp)
gamma_ym = gamma_fun(alpha_ym, beta_ym, phi_bc_ym, phiderivY_bc_ym)
gamma_yp = gamma_fun(alpha_yp, beta_yp, phi_bc_yp, phiderivY_bc_yp)
gamma_zm = gamma_fun(alpha_zm, beta_zm, phi_bc_zm, phiderivZ_bc_zm)
gamma_zp = gamma_fun(alpha_zp, beta_zp, phi_bc_zp, phiderivZ_bc_zp)
alpha = [alpha_xm, alpha_xp, alpha_ym, alpha_yp, alpha_zm, alpha_zp]
beta = [beta_xm, beta_xp, beta_ym, beta_yp, beta_zm, beta_zp]
gamma = [gamma_xm, gamma_xp, gamma_ym, gamma_yp, gamma_zm, gamma_zp]
x_BC, y_BC = getxBCyBC_CC(self.M, alpha, beta, gamma)
sigma = np.ones(self.M.nC)
Mfrho = self.M.getFaceInnerProduct(1./sigma)
MfrhoI = self.M.getFaceInnerProduct(1./sigma, invMat=True)
V = discretize.utils.sdiag(self.M.vol)
Div = V*self.M.faceDiv
P_BC, B = self.M.getBCProjWF_simple()
q = q_fun(self.M.gridCC)
M = B*self.M.aveCC2F
G = Div.T - P_BC*discretize.utils.sdiag(y_BC)*M
rhs = V*q + Div*MfrhoI*P_BC*x_BC
A = Div*MfrhoI*G
if self.myTest == 'xc':
# TODO: fix the null space
Ainv = Solver(A)
xc = Ainv*rhs
err = np.linalg.norm((xc-xc_ana), np.inf)
else:
NotImplementedError
return err
def test_order(self):
print("==== Testing Mixed boudary conduction for CC-problem ====")
self.name = "3D"
self.myTest = 'xc'
self.orderTest()
if __name__ == '__main__':
unittest.main()
| 35.862069 | 81 | 0.576502 | 2,793 | 16,640 | 3.185822 | 0.074472 | 0.013486 | 0.028321 | 0.015172 | 0.813216 | 0.786581 | 0.758373 | 0.749382 | 0.725332 | 0.685098 | 0 | 0.025377 | 0.275361 | 16,640 | 463 | 82 | 35.939525 | 0.712556 | 0.069591 | 0 | 0.669617 | 0 | 0 | 0.022221 | 0 | 0 | 0 | 0 | 0.00216 | 0 | 1 | 0.079646 | false | 0 | 0.014749 | 0.058997 | 0.19469 | 0.011799 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e0cd8fdf7b63bca58ddd0dd2a724fca91d1de4ef | 307 | py | Python | Practical 1/src/conjugate.py | n1snt/Linear-Algebra-using-Python | 1335a4969cce70f379d4a82bbffefe47703c6cf6 | [
"MIT"
] | 1 | 2021-03-06T06:59:19.000Z | 2021-03-06T06:59:19.000Z | Practical 1/src/conjugate.py | n1snt/Linear-Algebra-using-Python | 1335a4969cce70f379d4a82bbffefe47703c6cf6 | [
"MIT"
] | null | null | null | Practical 1/src/conjugate.py | n1snt/Linear-Algebra-using-Python | 1335a4969cce70f379d4a82bbffefe47703c6cf6 | [
"MIT"
] | null | null | null | a = 5+2j
b = 3+6j
c = 6-4j
d = 2+1j
print("Conjugate of the complex number 5+2j is : ", a.conjugate())
print("Conjugate of the complex number 3+6j is : ", b.conjugate())
print("Conjugate of the complex number 6-4j is : ", c.conjugate())
print("Conjugate of the complex number 2+1j is : ", d.conjugate())
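# Hedged addition (not part of the original script): conjugation distributes
# over addition and multiplication, which makes a quick self-check possible
# using the variables a and b defined above.
assert (a + b).conjugate() == a.conjugate() + b.conjugate()
assert (a * b).conjugate() == a.conjugate() * b.conjugate()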
| 34.111111 | 67 | 0.664495 | 56 | 307 | 3.642857 | 0.357143 | 0.27451 | 0.313725 | 0.372549 | 0.769608 | 0.769608 | 0.612745 | 0.411765 | 0 | 0 | 0 | 0.062992 | 0.172638 | 307 | 8 | 68 | 38.375 | 0.740157 | 0 | 0 | 0 | 0 | 0 | 0.553746 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
1cbcb083f08f405a68bcef9043ee68c54a2da479 | 22 | py | Python | src/formsettesthelpers/__init__.py | Raekkeri/django-formsettesthelpers | eb2e392c47dd4c21e56af301fb53bd5686fdc334 | [
"MIT"
] | 1 | 2015-01-04T15:06:56.000Z | 2015-01-04T15:06:56.000Z | src/formsettesthelpers/__init__.py | Raekkeri/django-formsettesthelpers | eb2e392c47dd4c21e56af301fb53bd5686fdc334 | [
"MIT"
] | null | null | null | src/formsettesthelpers/__init__.py | Raekkeri/django-formsettesthelpers | eb2e392c47dd4c21e56af301fb53bd5686fdc334 | [
"MIT"
] | null | null | null | from formset import *
| 11 | 21 | 0.772727 | 3 | 22 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 22 | 1 | 22 | 22 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1cd70297b19b768a2ff49ffad032b5a3fb003813 | 21,298 | py | Python | nailgun/nailgun/test/integration/test_master_node_settings_handler.py | Zipfer/fuel-web | c6c4032eb6e29474e2be0318349265bdb566454c | [
"Apache-2.0"
] | null | null | null | nailgun/nailgun/test/integration/test_master_node_settings_handler.py | Zipfer/fuel-web | c6c4032eb6e29474e2be0318349265bdb566454c | [
"Apache-2.0"
] | null | null | null | nailgun/nailgun/test/integration/test_master_node_settings_handler.py | Zipfer/fuel-web | c6c4032eb6e29474e2be0318349265bdb566454c | [
"Apache-2.0"
] | null | null | null | # Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import unittest2
from nailgun.test.base import BaseIntegrationTest
from nailgun.test.base import reverse
from nailgun.openstack.common import jsonutils
from nailgun import objects
from nailgun.statistics.installation_info import InstallationInfo
from nailgun.statistics.statsenderd import StatsSender
class TestMasterNodeSettingsHandler(BaseIntegrationTest):
@unittest2.skip('To be reworked')
def test_get_controller(self):
# should contain data that are defined in master_node_settings.yaml
# fixture file which is located in fixtures directory for nailgun
expected = {
"settings": {
"statistics": {
"send_anonymous_statistic": {
"type": "checkbox",
"value": True,
"label": "statistics.setting_labels."
"send_anonymous_statistic",
"weight": 10
},
"send_user_info": {
"type": "checkbox",
"value": True,
"label": "statistics.setting_labels.send_user_info",
"weight": 20,
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"name": {
"type": "text",
"value": "",
"label": "statistics.setting_labels.name",
"weight": 30,
"regex": {
"source": "\S",
"error": "statistics.errors.name"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"email": {
"type": "text",
"value": "",
"label": "statistics.setting_labels.email",
"weight": 40,
"regex": {
"source": "\S",
"error": "statistics.errors.email"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"company": {
"type": "text",
"value": "",
"label": "statistics.setting_labels.company",
"weight": 50,
"regex": {
"source": "\S",
"error": "statistics.errors.company"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"user_choice_saved": {
"type": "hidden",
"value": False
}
}
}
}
resp = self.app.get(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
)
self.assertEqual(resp.json_body, expected)
@unittest2.skip('To be reworked')
def test_put_controller(self):
data = {
"settings": {
"statistics": {
"send_anonymous_statistic": {
"type": "checkbox",
"value": False,
"label": "statistics.setting_labels."
"send_anonymous_statistic",
"weight": 10
},
"send_user_info": {
"type": "checkbox",
"value": True,
"label": "statistics.setting_labels.send_user_info",
"weight": 20,
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"name": {
"type": "text",
"value": "Some User",
"label": "statistics.setting_labels.name",
"weight": 30,
"regex": {
"source": "\S",
"error": "statistics.errors.name"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"email": {
"type": "text",
"value": "user@email.com",
"label": "statistics.setting_labels.email",
"weight": 40,
"regex": {
"source": "\S",
"error": "statistics.errors.email"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"company": {
"type": "text",
"value": "Some Company",
"label": "statistics.setting_labels.company",
"weight": 50,
"regex": {
"source": "\S",
"error": "statistics.errors.company"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"user_choice_saved": {
"type": "hidden",
"value": True
}
}
}
}
resp = self.app.put(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data)
)
self.assertEqual(200, resp.status_code)
self.assertEqual(resp.json_body, data)
settings_from_db = objects.MasterNodeSettings.get_one()
self.assertEqual(settings_from_db.settings, data["settings"])
@unittest2.skip('To be reworked')
def test_patch_controller(self):
data = {
"settings": {
"statistics": {
"company": {
"value": "Other Company"
}
}
}
}
expected = {
"settings": {
"statistics": {
"send_anonymous_statistic": {
"type": "checkbox",
"value": True,
"label": "statistics.setting_labels."
"send_anonymous_statistic",
"weight": 10
},
"send_user_info": {
"type": "checkbox",
"value": True,
"label": "statistics.setting_labels.send_user_info",
"weight": 20,
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"name": {
"type": "text",
"value": "",
"label": "statistics.setting_labels.name",
"weight": 30,
"regex": {
"source": "\S",
"error": "statistics.errors.name"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"email": {
"type": "text",
"value": "",
"label": "statistics.setting_labels.email",
"weight": 40,
"regex": {
"source": "\S",
"error": "statistics.errors.email"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"company": {
"type": "text",
"value": "Other Company",
"label": "statistics.setting_labels.company",
"weight": 50,
"regex": {
"source": "\S",
"error": "statistics.errors.company"
},
"restrictions": [
"fuel_settings:statistics."
"send_anonymous_statistic.value == false",
"fuel_settings:statistics."
"send_user_info.value == false",
{
"condition":
"not ('mirantis' in version:feature_groups)",
"action": "hide"
}
]
},
"user_choice_saved": {
"type": "hidden",
"value": False
}
}
}
}
resp = self.app.patch(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data)
)
self.assertEqual(200, resp.status_code)
self.assertEqual(resp.json_body, expected)
settings_from_db = objects.MasterNodeSettings.get_one()
self.assertEqual(settings_from_db.settings, expected["settings"])
def test_validation_error(self):
data = {
"settings": "I'm not an object, bro:)"
}
resp = self.app.put(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data),
expect_errors=True
)
self.assertEqual(400, resp.status_code)
self.assertIn("Failed validating", resp.body)
def test_not_found_error(self):
settings_from_db = objects.MasterNodeSettings.get_one()
self.db.delete(settings_from_db)
self.db.commit()
resp = self.app.get(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
expect_errors=True
)
self.assertEqual(404, resp.status_code)
self.assertIn("not found", resp.body)
def test_stats_sending_enabled(self):
self.assertEqual(StatsSender().must_send_stats(), False)
resp = self.app.get(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers)
self.assertEqual(200, resp.status_code)
data = resp.json_body
# emulate user confirmed settings in UI
data["settings"]["statistics"]["user_choice_saved"]["value"] = True
resp = self.app.put(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data)
)
self.assertEqual(200, resp.status_code)
self.assertEqual(StatsSender().must_send_stats(), True)
# emulate user disabled statistics sending
data["settings"]["statistics"]["send_anonymous_statistic"]["value"] = \
False
resp = self.app.put(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data)
)
self.assertEqual(200, resp.status_code)
self.assertEqual(StatsSender().must_send_stats(), False)
def test_user_contacts_info_disabled_while_not_confirmed_by_user(self):
self.assertDictEqual(
InstallationInfo().get_installation_info()['user_information'],
{'contact_info_provided': False})
def test_user_contacts_info_disabled_by_default(self):
resp = self.app.get(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers)
self.assertEqual(200, resp.status_code)
data = resp.json_body
# emulate user confirmed settings in UI
data["settings"]["statistics"]["user_choice_saved"]["value"] = True
resp = self.app.put(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data)
)
self.assertEqual(200, resp.status_code)
self.assertDictEqual(
InstallationInfo().get_installation_info()['user_information'],
{'contact_info_provided': False})
def test_user_contacts_info_enabled_by_user(self):
resp = self.app.get(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers)
self.assertEqual(200, resp.status_code)
data = resp.json_body
# emulate user enabled contact info sending to support team
data["settings"]["statistics"]["user_choice_saved"]["value"] = True
data["settings"]["statistics"]["send_user_info"]["value"] = \
True
name = "user"
email = "u@e.mail"
company = "user company"
data["settings"]["statistics"]["name"]["value"] = name
data["settings"]["statistics"]["email"]["value"] = email
data["settings"]["statistics"]["company"]["value"] = company
resp = self.app.put(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data)
)
self.assertEqual(200, resp.status_code)
self.assertDictEqual(
InstallationInfo().get_installation_info()['user_information'],
{
'contact_info_provided': True,
'name': name,
'email': email,
'company': company
}
)
def test_partial_user_contacts_info(self):
resp = self.app.get(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers)
self.assertEqual(200, resp.status_code)
data = resp.json_body
# emulate user enabled contact info sending to support team
data["settings"]["statistics"]["user_choice_saved"]["value"] = True
data["settings"]["statistics"]["send_user_info"]["value"] = \
True
name = "user"
email = "u@e.mail"
data["settings"]["statistics"]["name"]["value"] = name
data["settings"]["statistics"]["email"]["value"] = email
resp = self.app.put(
reverse("MasterNodeSettingsHandler"),
headers=self.default_headers,
params=jsonutils.dumps(data)
)
self.assertEqual(200, resp.status_code)
self.assertDictEqual(
InstallationInfo().get_installation_info()['user_information'],
{
'contact_info_provided': True,
'name': name,
'email': email,
'company': ''
}
)
def test_user_contacts_info_broken(self):
settings_from_db = objects.MasterNodeSettings.get_one()
settings = dict(settings_from_db.settings)
settings["statistics"] = None
settings_from_db.settings = settings
self.db.commit()
self.assertDictEqual(
InstallationInfo().get_installation_info()['user_information'],
{'contact_info_provided': False})
def test_installation_info_when_stats_info_deleted(self):
settings_from_db = objects.MasterNodeSettings.get_one()
self.db.delete(settings_from_db)
self.db.commit()
self.assertDictEqual(
InstallationInfo().get_installation_info()['user_information'],
{'contact_info_provided': False})
self.assertEqual(
InstallationInfo().get_installation_info()['master_node_uid'],
None)
| 39.587361 | 79 | 0.428021 | 1,489 | 21,298 | 5.92814 | 0.137005 | 0.07749 | 0.067294 | 0.061856 | 0.845927 | 0.807069 | 0.802991 | 0.777048 | 0.765039 | 0.765039 | 0 | 0.007204 | 0.472063 | 21,298 | 537 | 80 | 39.66108 | 0.777837 | 0.04423 | 0 | 0.698947 | 0 | 0 | 0.248033 | 0.122713 | 0 | 0 | 0 | 0 | 0.063158 | 1 | 0.025263 | false | 0 | 0.014737 | 0 | 0.042105 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e80cdec79e6b9dcedb7e207e947d21434bf72a9a | 28 | py | Python | dae_RelayBoard/__init__.py | jhonsay/dae-py-relay-controller | a472e64a9394cdcb53f5061926edb5d0bb0ca166 | [
"Unlicense"
] | null | null | null | dae_RelayBoard/__init__.py | jhonsay/dae-py-relay-controller | a472e64a9394cdcb53f5061926edb5d0bb0ca166 | [
"Unlicense"
] | null | null | null | dae_RelayBoard/__init__.py | jhonsay/dae-py-relay-controller | a472e64a9394cdcb53f5061926edb5d0bb0ca166 | [
"Unlicense"
] | null | null | null | from dae_RelayBoard import * | 28 | 28 | 0.857143 | 4 | 28 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 28 | 1 | 28 | 28 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e8181cc5f085ce06ff85dd7d78f7176792c3baf1 | 11,805 | py | Python | spiketoolkit/validation/validation_tools.py | Shawn-Guo-CN/spiketoolkit | 11e60f3cd80c135c62e27538a4e141115a7e27ad | [
"MIT"
] | null | null | null | spiketoolkit/validation/validation_tools.py | Shawn-Guo-CN/spiketoolkit | 11e60f3cd80c135c62e27538a4e141115a7e27ad | [
"MIT"
] | 5 | 2019-02-15T20:16:43.000Z | 2019-02-27T14:54:08.000Z | spiketoolkit/validation/validation_tools.py | Shawn-Guo-CN/spiketoolkit | 11e60f3cd80c135c62e27538a4e141115a7e27ad | [
"MIT"
] | 1 | 2019-02-15T14:40:54.000Z | 2019-02-15T14:40:54.000Z | from ..postprocessing.postprocessing_tools import _get_quality_metric_data, _get_pca_metric_data, \
_get_spike_times_clusters, _get_amp_metric_data
import spikeextractors as se
import numpy as np
def get_spike_times_metrics_data(sorting, sampling_frequency):
'''
Computes and returns the spike times in seconds and also returns
along with cluster_ids needed for quality metrics
Parameters
----------
sorting: SortingExtractor
The sorting extractor
sampling_frequency: float
The sampling frequency of the recording
Returns
-------
spike_times: numpy.ndarray (num_spikes x 0)
Spike times in frames
spike_clusters: numpy.ndarray (num_spikes x 0)
Cluster IDs for each spike time
'''
if not isinstance(sorting, se.SortingExtractor):
raise AttributeError()
if len(sorting.get_unit_ids()) == 0:
raise Exception("No units in the sorting result, can't compute any metric information.")
# spike times.npy and spike clusters.npy
spike_times, spike_clusters = _get_spike_times_clusters(sorting)
spike_times = np.squeeze((spike_times / sampling_frequency))
spike_clusters = np.squeeze(spike_clusters.astype(int))
return spike_times, spike_clusters
def get_pca_metric_data(recording, sorting, n_comp=3, ms_before=1., ms_after=2., dtype=None, max_spikes_per_unit=np.inf,
max_spikes_for_pca=np.inf, recompute_info=True, save_features_props=False,
verbose=False, seed=0):
'''
Computes and returns all data needed to compute all the quality metrics from SpikeMetrics
Parameters
----------
recording: RecordingExtractor
The recording extractor
sorting: SortingExtractor
The sorting extractor
n_comp: int
n_compFeatures in template-gui format
ms_before: float
Time period in ms to cut waveforms before the spike events
ms_after: float
Time period in ms to cut waveforms after the spike events
dtype: dtype
The numpy dtype of the waveforms
max_spikes_per_unit: int
The maximum number of spikes to extract per unit.
max_spikes_for_pca: int
The maximum number of spikes to use to compute PCA.
recompute_info: bool
If True, will always re-extract waveforms.
save_features_props: bool
If True, save all features and properties in the sorting extractor.
verbose: bool
If True output is verbose
seed: int
Random seed for reproducibility
Returns
-------
spike_times: numpy.ndarray (num_spikes x 0)
Spike times in frames
spike_clusters: numpy.ndarray (num_spikes x 0)
Cluster IDs for each spike time
pc_features: numpy.ndarray (num_spikes x num_pcs x num_channels)
Pre-computed PCs for blocks of channels around each spike
pc_feature_ind: numpy.ndarray (num_units x num_channels)
Channel indices of PCs for each unit
'''
if not isinstance(recording, se.RecordingExtractor) or not isinstance(sorting, se.SortingExtractor):
raise AttributeError()
if len(sorting.get_unit_ids()) == 0:
raise Exception("No units in the sorting result, can't compute any metric information.")
spike_times, spike_clusters, pc_features, pc_feature_ind = _get_pca_metric_data(recording, sorting, n_comp=n_comp,
ms_before=ms_before,
ms_after=ms_after,
dtype=dtype,
max_spikes_per_unit=
max_spikes_per_unit,
max_spikes_for_pca=
max_spikes_for_pca,
recompute_info=recompute_info,
save_features_props=
save_features_props,
verbose=verbose, seed=seed)
return np.squeeze(recording.frame_to_time(spike_times)), np.squeeze(spike_clusters), pc_features, pc_feature_ind
def get_amplitude_metric_data(recording, sorting, amp_method='absolute', amp_peak='both', amp_frames_before=3,
amp_frames_after=3, max_spikes_per_unit=np.inf, recompute_info=True,
save_features_props=False, seed=0):
'''
Computes and returns all data needed to compute all the quality metrics from SpikeMetrics
Parameters
----------
recording: RecordingExtractor
The recording extractor
sorting: SortingExtractor
The sorting extractor
dtype: dtype
The numpy dtype of the waveforms
amp_method: str
If 'absolute' (default), amplitudes are absolute amplitudes in uV are returned.
If 'relative', amplitudes are returned as ratios between waveform amplitudes and template amplitudes.
amp_peak: str
If maximum channel has to be found among negative peaks ('neg'), positive ('pos') or both ('both' - default)
amp_frames_before: int
Frames before peak to compute amplitude
amp_frames_after: int
Frames after peak to compute amplitude
max_spikes_per_unit: int
The maximum number of spikes to extract per unit.
recompute_info: bool
If True, will always re-extract waveforms.
save_features_props: bool
If True, save all features and properties in the sorting extractor.
verbose: bool
If True output is verbose
seed: int
Random seed for reproducibility
Returns
-------
spike_times: numpy.ndarray (num_spikes x 0)
Spike times in frames
spike_clusters: numpy.ndarray (num_spikes x 0)
Cluster IDs for each spike time
amplitudes: numpy.ndarray (num_spikes x 0)
Amplitude value for each spike time
'''
if not isinstance(recording, se.RecordingExtractor) or not isinstance(sorting, se.SortingExtractor):
raise AttributeError()
if len(sorting.get_unit_ids()) == 0:
raise Exception("No units in the sorting result, can't compute any metric information.")
spike_times, spike_clusters, amplitudes = _get_amp_metric_data(recording, sorting,
amp_method=amp_method,
amp_peak=amp_peak,
amp_frames_before=amp_frames_before,
amp_frames_after=amp_frames_after,
max_spikes_per_unit=max_spikes_per_unit,
save_features_props=save_features_props,
recompute_info=recompute_info,
seed=seed)
return np.squeeze(recording.frame_to_time(spike_times)), np.squeeze(spike_clusters), np.squeeze(amplitudes)
def get_all_metric_data(recording, sorting, n_comp=3, ms_before=1., ms_after=2., dtype=None, amp_method='absolute',
amp_peak='both', amp_frames_before=3, amp_frames_after=3, max_spikes_per_unit=np.inf,
max_spikes_for_pca=np.inf, recompute_info=True, save_features_props=False,
verbose=False, seed=0):
'''
Computes and returns all data needed to compute all the quality metrics from SpikeMetrics
Parameters
----------
recording: RecordingExtractor
The recording extractor
sorting: SortingExtractor
The sorting extractor
n_comp: int
n_compFeatures in template-gui format
ms_before: float
Time period in ms to cut waveforms before the spike events
ms_after: float
Time period in ms to cut waveforms after the spike events
dtype: dtype
The numpy dtype of the waveforms
amp_method: str
If 'absolute' (default), amplitudes are absolute amplitudes in uV are returned.
If 'relative', amplitudes are returned as ratios between waveform amplitudes and template amplitudes.
amp_peak: str
If maximum channel has to be found among negative peaks ('neg'), positive ('pos') or both ('both' - default)
amp_frames_before: int
Frames before peak to compute amplitude
amp_frames_after: int
Frames after peak to compute amplitude
max_spikes_per_unit: int
The maximum number of spikes to extract per unit.
max_spikes_for_pca: int
The maximum number of spikes to use to compute PCA.
recompute_info: bool
If True, will always re-extract waveforms.
save_features_props: bool
If True, save all features and properties in the sorting extractor.
verbose: bool
If True output is verbose
seed: int
Random seed for reproducibility
Returns
-------
spike_times: numpy.ndarray (num_spikes x 0)
Spike times in frames
spike_times_amps: numpy.ndarray (num_spikes x 0)
Spike times in frames for amplitudes
spike_times_pca: numpy.ndarray (num_spikes x 0)
Spike times in frames for pca
spike_clusters: numpy.ndarray (num_spikes x 0)
Cluster IDs for each spike time
spike_clusters_amps: numpy.ndarray (num_spikes x 0)
Cluster IDs for each spike time in amplitudes
spike_clusters_pca: numpy.ndarray (num_spikes x 0)
Cluster IDs for each spike time in pca
amplitudes: numpy.ndarray (num_spikes x 0)
Amplitude value for each spike time
pc_features: numpy.ndarray (num_spikes x num_pcs x num_channels)
Pre-computed PCs for blocks of channels around each spike
pc_feature_ind: numpy.ndarray (num_units x num_channels)
Channel indices of PCs for each unit
'''
if not isinstance(recording, se.RecordingExtractor) or not isinstance(sorting, se.SortingExtractor):
raise AttributeError()
if len(sorting.get_unit_ids()) == 0:
raise Exception("No units in the sorting result, can't compute any metric information.")
spike_times, spike_times_amps, spike_times_pca, spike_clusters, spike_clusters_amps, spike_clusters_pca, \
amplitudes, pc_features, pc_feature_ind = _get_quality_metric_data(
recording, sorting, n_comp=n_comp,
ms_before=ms_before, ms_after=ms_after,
dtype=dtype, amp_method=amp_method,
amp_peak=amp_peak,
amp_frames_before=amp_frames_before,
amp_frames_after=amp_frames_after,
max_spikes_per_unit=max_spikes_per_unit,
max_spikes_for_pca=max_spikes_for_pca,
recompute_info=recompute_info,
save_features_props=save_features_props,
verbose=verbose, seed=seed)
return np.squeeze(recording.frame_to_time(spike_times)), np.squeeze(recording.frame_to_time(spike_times_amps)), \
np.squeeze(recording.frame_to_time(spike_times_pca)), np.squeeze(spike_clusters), \
np.squeeze(spike_clusters_amps), np.squeeze(spike_clusters_pca), \
np.squeeze(amplitudes), pc_features, pc_feature_ind
| 46.660079 | 120 | 0.62482 | 1,429 | 11,805 | 4.930021 | 0.109867 | 0.044003 | 0.038325 | 0.047693 | 0.890987 | 0.881618 | 0.84741 | 0.838893 | 0.827395 | 0.822569 | 0 | 0.003857 | 0.319102 | 11,805 | 252 | 121 | 46.845238 | 0.872605 | 0.450064 | 0 | 0.453333 | 0 | 0 | 0.051064 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.053333 | false | 0 | 0.04 | 0 | 0.146667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e81a66fed3ca2dbb0c008fa052a0344b83bf0cb3 | 40 | py | Python | cride/users/serializers/__init__.py | devmiguelangel/cride | 8593eb76291ff8af46a93b5625304766f95b2347 | [
"MIT"
] | null | null | null | cride/users/serializers/__init__.py | devmiguelangel/cride | 8593eb76291ff8af46a93b5625304766f95b2347 | [
"MIT"
] | null | null | null | cride/users/serializers/__init__.py | devmiguelangel/cride | 8593eb76291ff8af46a93b5625304766f95b2347 | [
"MIT"
] | null | null | null | # User serializers
from .users import *
| 13.333333 | 20 | 0.75 | 5 | 40 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175 | 40 | 2 | 21 | 20 | 0.909091 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e82a84adab31d63d2fbdfff0a04424e862fd8bb5 | 20 | py | Python | solaris/raster/__init__.py | rbavery/solaris | 0d7bd1439a96c243d7810fcddf776b7e635a05ea | [
"Apache-2.0"
] | 367 | 2019-05-05T22:09:39.000Z | 2022-03-27T10:05:16.000Z | 3-SatShipAI/solaris/raster/__init__.py | Z-Zheng/SpaceNet_SAR_Buildings_Solutions | 6a9c3962d987d985384d0d41a187f5fbfadac82c | [
"Apache-2.0"
] | 396 | 2019-04-30T21:51:12.000Z | 2022-03-31T09:21:09.000Z | 3-SatShipAI/solaris/raster/__init__.py | Z-Zheng/SpaceNet_SAR_Buildings_Solutions | 6a9c3962d987d985384d0d41a187f5fbfadac82c | [
"Apache-2.0"
] | 120 | 2019-06-29T20:20:08.000Z | 2022-03-10T07:37:57.000Z | from . import image
| 10 | 19 | 0.75 | 3 | 20 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 20 | 1 | 20 | 20 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1c028b2de24b74487417246a6b4120e335e24362 | 43 | py | Python | src/lr_schedulers/__init__.py | data-sachez-2511/pl_classification | 637b85c58d723925ae4d3fce08db2842786c750a | [
"MIT"
] | 8 | 2021-11-13T19:21:36.000Z | 2021-12-14T15:00:16.000Z | src/lr_schedulers/__init__.py | data-sachez-2511/pl_classification | 637b85c58d723925ae4d3fce08db2842786c750a | [
"MIT"
] | 6 | 2021-11-11T13:05:21.000Z | 2021-12-13T09:59:05.000Z | src/lr_schedulers/__init__.py | data-sachez-2511/pl_classification | 637b85c58d723925ae4d3fce08db2842786c750a | [
"MIT"
] | null | null | null | from lr_schedulers.wrapper import Scheduler | 43 | 43 | 0.906977 | 6 | 43 | 6.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 43 | 1 | 43 | 43 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1c5d2f67a7faa40ec66fe41a1a9c471496826898 | 85 | py | Python | model/__init__.py | drah/UGATIT | 69c7f6d9887407f21c900a8bdf952e65cdcbc8a6 | [
"Apache-2.0"
] | null | null | null | model/__init__.py | drah/UGATIT | 69c7f6d9887407f21c900a8bdf952e65cdcbc8a6 | [
"Apache-2.0"
] | null | null | null | model/__init__.py | drah/UGATIT | 69c7f6d9887407f21c900a8bdf952e65cdcbc8a6 | [
"Apache-2.0"
] | null | null | null | from . import helpers
from . import modules
from . import factory
from . import model | 21.25 | 21 | 0.776471 | 12 | 85 | 5.5 | 0.5 | 0.606061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 85 | 4 | 22 | 21.25 | 0.942857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
98c67ce5625cfdb781f5cb1b11b2a5c8b4b0c7e3 | 142 | py | Python | api/ext_news/cron.py | sp-team-lutsk/docker_polls_group | e59aa4d9b0b703e31046460a87a8b14d9e3f3d1e | [
"MIT"
] | null | null | null | api/ext_news/cron.py | sp-team-lutsk/docker_polls_group | e59aa4d9b0b703e31046460a87a8b14d9e3f3d1e | [
"MIT"
] | 18 | 2019-06-25T14:37:52.000Z | 2019-10-25T21:18:44.000Z | api/ext_news/cron.py | sp-team-lutsk/cyber_security_web_portal | e59aa4d9b0b703e31046460a87a8b14d9e3f3d1e | [
"MIT"
] | 2 | 2019-06-25T12:43:44.000Z | 2019-08-21T08:36:44.000Z | from utils.views import news_subscription
from parsing import main
def CronMailing():
news_subscription()
def CronParse():
main()
| 12.909091 | 41 | 0.739437 | 17 | 142 | 6.058824 | 0.647059 | 0.31068 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.183099 | 142 | 10 | 42 | 14.2 | 0.887931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
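The `cron.py` file above defines two thin entry points (`CronMailing`, `CronParse`) that simply delegate to library functions so a scheduler can invoke them by name. A minimal framework-free sketch of that pattern — the `JOBS` registry, `register` decorator, and job names here are illustrative, not part of the original project:

```python
# Hypothetical registry mapping job names to zero-argument callables,
# mirroring how CronMailing/CronParse wrap library functions.
JOBS = {}

def register(name):
    """Decorator that records a callable under a job name."""
    def wrap(fn):
        JOBS[name] = fn
        return fn
    return wrap

@register("mailing")
def mailing_job():
    return "mailing done"

@register("parse")
def parse_job():
    return "parse done"

def run_job(name):
    """Look up a job by name and invoke it."""
    return JOBS[name]()
```

A scheduler only needs the string name, e.g. `run_job("mailing")`, which keeps the cron configuration decoupled from the library internals.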
c70acf0d76f55e0a16bd98ff3e266297fefe8d69 | 24 | py | Python | main.py | Fernandohf/NURBS | 6cd5db1ba7009e48d434e4e04b13fd29a8d26678 | [
"MIT"
] | 4 | 2019-03-19T17:56:06.000Z | 2021-03-28T11:09:01.000Z | main.py | Fernandohf/NURBS | 6cd5db1ba7009e48d434e4e04b13fd29a8d26678 | [
"MIT"
] | null | null | null | main.py | Fernandohf/NURBS | 6cd5db1ba7009e48d434e4e04b13fd29a8d26678 | [
"MIT"
] | 3 | 2019-03-19T17:56:06.000Z | 2020-11-17T08:50:33.000Z | import NURBS
import UI
| 6 | 12 | 0.791667 | 4 | 24 | 4.75 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208333 | 24 | 3 | 13 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c74f05073459c0bb58f387f9c6d002892b1d85c6 | 8,652 | py | Python | tests/serialization/test_complex_types_serialization.py | marcosschroh/-dataclasses-avroschema | 37f7ba34a9e53a5f1f45edae051e5821934934d3 | [
"MIT"
] | null | null | null | tests/serialization/test_complex_types_serialization.py | marcosschroh/-dataclasses-avroschema | 37f7ba34a9e53a5f1f45edae051e5821934934d3 | [
"MIT"
] | null | null | null | tests/serialization/test_complex_types_serialization.py | marcosschroh/-dataclasses-avroschema | 37f7ba34a9e53a5f1f45edae051e5821934934d3 | [
"MIT"
] | null | null | null | import dataclasses
import enum
import json
from dataclasses_avroschema import AvroModel
def test_complex_fields(user_advance_dataclass, color_enum):
data = {
"name": "juan",
"age": 20,
"pets": ["dog"],
"accounts": {"ing": 100},
"has_car": True,
"favorite_colors": color_enum.GREEN,
"md5": b"u00ffffffffffffx",
}
expected_data = {
"name": "juan",
"age": 20,
"pets": ["dog"],
"accounts": {"ing": 100},
"has_car": True,
"favorite_colors": color_enum.GREEN,
"md5": b"u00ffffffffffffx",
"country": "Argentina",
"address": None,
}
data_json = {
"name": "juan",
"age": 20,
"pets": ["dog"],
"accounts": {"ing": 100},
"favorite_colors": color_enum.GREEN.value,
"has_car": True,
"country": "Argentina",
"address": None,
"md5": "u00ffffffffffffx",
}
user = user_advance_dataclass(**data)
avro_binary = user.serialize()
avro_json = user.serialize(serialization_type="avro-json")
assert user_advance_dataclass.deserialize(avro_binary, create_instance=False) == expected_data
assert (
user_advance_dataclass.deserialize(avro_json, serialization_type="avro-json", create_instance=False)
== expected_data
)
assert user.to_json() == json.dumps(data_json)
assert user_advance_dataclass.deserialize(avro_binary) == user
assert user_advance_dataclass.deserialize(avro_json, serialization_type="avro-json") == user
def test_complex_fields_with_defaults(user_advance_with_defaults_dataclass, color_enum):
data = {
"name": "juan",
"age": 20,
}
expected_data = {
"name": "juan",
"age": 20,
"pets": ["dog", "cat"],
"accounts": {"key": 1},
"has_car": False,
"favorite_colors": color_enum.BLUE,
"country": "Argentina",
"address": None,
}
expected_json = {
"name": "juan",
"age": 20,
"pets": ["dog", "cat"],
"accounts": {"key": 1},
"has_car": False,
"favorite_colors": color_enum.BLUE.value,
"country": "Argentina",
"address": None,
}
user = user_advance_with_defaults_dataclass(**data)
expected_user = user_advance_with_defaults_dataclass(
name="juan",
age=20,
favorite_colors=color_enum.BLUE,
)
avro_binary = user.serialize()
avro_json = user.serialize(serialization_type="avro-json")
assert user_advance_with_defaults_dataclass.deserialize(avro_binary, create_instance=False) == expected_data
assert (
user_advance_with_defaults_dataclass.deserialize(
avro_json, serialization_type="avro-json", create_instance=False
)
== expected_data
)
assert user.to_json() == json.dumps(expected_json)
assert user_advance_with_defaults_dataclass.deserialize(avro_binary) == expected_user
assert user_advance_with_defaults_dataclass.deserialize(avro_json, serialization_type="avro-json") == expected_user
    # check that it is possible to continue doing serialization and deserialization operations
expected_user.favorite_colors = "YELLOW"
assert expected_user.serialize() == b"\x08juan(\x04\x06dog\x06cat\x00\x02\x06key\x02\x00\x00\x02\x12Argentina\x00"
assert user.avro_schema() == expected_user.avro_schema()
def test_complex_fields_with_enum(user_advance_dataclass_with_enum, color_enum):
data = {
"name": "juan",
"age": 20,
"pets": ["dog"],
"accounts": {"ing": 100},
"has_car": True,
"favorite_colors": color_enum.GREEN,
"md5": b"u00ffffffffffffx",
}
expected_data = {
"name": "juan",
"age": 20,
"pets": ["dog"],
"accounts": {"ing": 100},
"has_car": True,
"favorite_colors": color_enum.GREEN,
"md5": b"u00ffffffffffffx",
"country": "Argentina",
"address": None,
"user_type": None,
}
data_json = {
"name": "juan",
"age": 20,
"pets": ["dog"],
"accounts": {"ing": 100},
"favorite_colors": "GREEN",
"has_car": True,
"country": "Argentina",
"address": None,
"user_type": None,
"md5": "u00ffffffffffffx",
}
user = user_advance_dataclass_with_enum(**data)
avro_binary = user.serialize()
avro_json = user.serialize(serialization_type="avro-json")
assert user_advance_dataclass_with_enum.deserialize(avro_binary, create_instance=False) == expected_data
assert (
user_advance_dataclass_with_enum.deserialize(avro_json, serialization_type="avro-json", create_instance=False)
== expected_data
)
assert user.to_json() == json.dumps(data_json)
assert user_advance_dataclass_with_enum.deserialize(avro_binary) == user
assert user_advance_dataclass_with_enum.deserialize(avro_json, serialization_type="avro-json") == user
def test_complex_fields_with_defaults_with_enum(user_advance_with_defaults_dataclass_with_enum, color_enum):
data = {
"name": "juan",
"age": 20,
}
expected_data = {
"name": "juan",
"age": 20,
"pets": ["dog", "cat"],
"accounts": {"key": 1},
"has_car": False,
"favorite_colors": color_enum.BLUE,
"country": "Argentina",
"address": None,
"user_type": None,
}
data_json = {
"name": "juan",
"age": 20,
"pets": ["dog", "cat"],
"accounts": {"key": 1},
"has_car": False,
"favorite_colors": color_enum.BLUE.value,
"country": "Argentina",
"address": None,
"user_type": None,
}
user = user_advance_with_defaults_dataclass_with_enum(**data)
expected_user = user_advance_with_defaults_dataclass_with_enum(
name="juan",
age=20,
favorite_colors=color_enum.BLUE,
)
avro_binary = user.serialize()
avro_json = user.serialize(serialization_type="avro-json")
assert (
user_advance_with_defaults_dataclass_with_enum.deserialize(avro_binary, create_instance=False) == expected_data
)
assert (
user_advance_with_defaults_dataclass_with_enum.deserialize(
avro_json, serialization_type="avro-json", create_instance=False
)
== expected_data
)
assert user.to_json() == json.dumps(data_json)
assert user_advance_with_defaults_dataclass_with_enum.deserialize(avro_binary) == expected_user
assert (
user_advance_with_defaults_dataclass_with_enum.deserialize(avro_json, serialization_type="avro-json")
== expected_user
)
    # check that it is possible to continue doing serialization and deserialization operations
expected_user.favorite_colors = color_enum.YELLOW
assert (
expected_user.serialize() == b"\x08juan(\x04\x06dog\x06cat\x00\x02\x06key\x02\x00\x00\x02\x12Argentina\x00\x00"
)
assert user.avro_schema() == expected_user.avro_schema()
def test_enum_fields_with_inner_schema():
class InnerEnum(enum.Enum):
ONE = "One"
TWO = "Two"
class OuterEnum(enum.Enum):
THREE = "Three"
FOUR = "Four"
@dataclasses.dataclass
class InnerSchema(AvroModel):
"""Inner schema"""
some_val: str
enum_field1: InnerEnum
@dataclasses.dataclass
class OuterSchema(AvroModel):
"""Outer schema"""
some_val2: str
enum_field2: OuterEnum
inner_schema: InnerSchema
example = OuterSchema(
some_val2="val2",
enum_field2=OuterEnum.FOUR,
inner_schema=InnerSchema(some_val="val1", enum_field1=InnerEnum.TWO),
)
expected_data = {
"some_val2": "val2",
"enum_field2": OuterEnum.FOUR,
"inner_schema": {"some_val": "val1", "enum_field1": InnerEnum.TWO},
}
expected_json = {
"some_val2": "val2",
"enum_field2": "Four",
"inner_schema": {"some_val": "val1", "enum_field1": "Two"},
}
avro_binary = example.serialize()
avro_json = example.serialize(serialization_type="avro-json")
assert OuterSchema.deserialize(avro_binary, create_instance=False) == expected_data
assert OuterSchema.deserialize(avro_json, serialization_type="avro-json", create_instance=False) == expected_data
assert example.to_json() == json.dumps(expected_json)
assert OuterSchema.deserialize(avro_binary) == example
assert OuterSchema.deserialize(avro_json, serialization_type="avro-json") == example
| 30.464789 | 119 | 0.63338 | 948 | 8,652 | 5.471519 | 0.101266 | 0.04627 | 0.052439 | 0.072296 | 0.895701 | 0.877386 | 0.861384 | 0.808174 | 0.747831 | 0.705032 | 0 | 0.021251 | 0.238558 | 8,652 | 283 | 120 | 30.572438 | 0.76609 | 0.022885 | 0 | 0.597403 | 0 | 0.008658 | 0.158019 | 0.018242 | 0 | 0 | 0 | 0 | 0.125541 | 1 | 0.021645 | false | 0 | 0.017316 | 0 | 0.095238 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
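Every test in the file above exercises the same round-trip property: serializing a model and deserializing the payload (with or without `create_instance`) must reproduce the original data. A framework-free sketch of that property using only stdlib dataclasses and JSON — the `User` class, `serialize`, and `deserialize` here are illustrative stand-ins, not the fixtures or the Avro encodings used by the tests:

```python
import dataclasses
import json

@dataclasses.dataclass
class User:
    name: str
    age: int
    country: str = "Argentina"

def serialize(user):
    # JSON stands in for the avro binary / avro-json encodings in the tests.
    return json.dumps(dataclasses.asdict(user))

def deserialize(payload, create_instance=True):
    data = json.loads(payload)
    return User(**data) if create_instance else data

user = User(name="juan", age=20)
payload = serialize(user)

# Round trip as a plain dict (create_instance=False) and as an instance,
# with the field default filled in, just like the expected_data dicts above.
assert deserialize(payload, create_instance=False) == {"name": "juan", "age": 20, "country": "Argentina"}
assert deserialize(payload) == user
```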
c7623ba54da666904cc2752c52173e7ab6f519e4 | 11,037 | py | Python | products/views.py | stephane-lucienvauthier/foodstock-api | 6b886cd638b5f1a10774280845bc85fd37a6ca9d | [
"Apache-2.0"
] | null | null | null | products/views.py | stephane-lucienvauthier/foodstock-api | 6b886cd638b5f1a10774280845bc85fd37a6ca9d | [
"Apache-2.0"
] | null | null | null | products/views.py | stephane-lucienvauthier/foodstock-api | 6b886cd638b5f1a10774280845bc85fd37a6ca9d | [
"Apache-2.0"
] | null | null | null | """This module manages the views of the categories app."""
from rest_framework import generics, permissions, status
from rest_framework.views import APIView
from rest_framework.response import Response
from products.models import Product, Batch
from products.serializers import ProductSerializer, ProductUpdateSerializer, ProductListSerializer, BatchSerializer
class ProductView(APIView):
"""
This class manages the view to create and list the products.
Attributes:
permission_classes (list(Permissions)): The options to access at this resource.
serializer_class (Serializer): The serializer to bind the request and the response object.
Returns:
200: The list of products.
201: The product is created.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
        500: An error occurred while processing the request.
"""
permission_classes = [permissions.IsAuthenticated]
def get(self, request, format=None):
"""
Retrieve the product list with batches in stock.
Attributes:
request (Request): The request sent to the api.
format (NoneType): Always none, pass by Accept header.
Returns:
            200: The product list is returned.
            400: An error is detected on the request data.
            401: The user must be connected to access this resource.
            406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
products = Product.objects.filter(owner=request.user) # pylint: disable=no-member
serializer = ProductListSerializer(products, many=True)
return Response(serializer.data)
def post(self, request, format=None):
"""
Create a product.
Attributes:
request (Request): The request sent to the api.
format (NoneType): Always none, pass by Accept header.
Returns:
201: The product is created.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
serializer = ProductUpdateSerializer(data=request.data)
if serializer.is_valid():
serializer.save(owner=request.user)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class ProductDetail(APIView):
"""
This class manages the view to update and delete a product.
Attributes:
permission_classes (list(Permissions)): The options to access at this resource.
serializer_class (Serializer): The serializer to bind the request and the response object.
Returns:
200: The product is updated.
204: The product is deleted.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
        500: An error occurred while processing the request.
"""
permission_classes = [permissions.IsAuthenticated]
def get(self, request, pk, format=None):
"""
        Retrieve a product.
Attributes:
request (Request): The request sent to the api.
pk (int): The product identifier.
format (NoneType): Always none, pass by Accept header.
Returns:
            200: The product is returned.
            400: An error is detected on the request data.
            401: The user must be connected to access this resource.
            404: The product is not found.
            406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
try:
product = Product.objects.get(owner=request.user, pk=pk) # pylint: disable=no-member
except Exception:
return Response(status=status.HTTP_404_NOT_FOUND)
serializer = ProductSerializer(product)
return Response(serializer.data)
def put(self, request, pk, format=None):
"""
Update a product.
Attributes:
request (Request): The request sent to the api.
pk (int): The product identifier.
format (NoneType): Always none, pass by Accept header.
Returns:
200: The product is updated.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
product = Product.objects.get(owner=request.user, pk=pk) # pylint: disable=no-member
serializer = ProductUpdateSerializer(product, data=request.data)
if serializer.is_valid():
serializer.save(owner=request.user)
return Response(serializer.data)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, pk, format=None):
"""
Delete a product.
Attributes:
request (Request): The request sent to the api.
pk (int): The product identifier.
format (NoneType): Always none, pass by Accept header.
Returns:
204: The product is deleted.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
product = Product.objects.get(owner=request.user, pk=pk) # pylint: disable=no-member
product.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class BatchView(generics.ListCreateAPIView):
"""
This class manages the view to create and list the products.
Attributes:
permission_classes (list(Permissions)): The options to access at this resource.
serializer_class (Serializer): The serializer to bind the request and the response object.
Returns:
200: The list of products.
201: The product is created.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
500: An error was occured in the treatment of the request.
"""
permission_classes = [permissions.IsAuthenticated]
serializer_class = BatchSerializer
def get_queryset(self):
"""Return the category list of the owner."""
product = Product.objects.get(owner=self.request.user, pk=self.kwargs['pk']) # pylint: disable=no-member
return Batch.objects.filter(owner=self.request.user, product=product) # pylint: disable=no-member
def perform_create(self, serializer):
"""Add the owner of the category before create it."""
product = Product.objects.get(owner=self.request.user, pk=self.kwargs['pk']) # pylint: disable=no-member
serializer.save(owner=self.request.user, product=product)
class BatchDetail(APIView):
"""
This class manages the view to update and delete a batch.
Attributes:
permission_classes (list(Permissions)): The options to access at this resource.
"""
permission_classes = [permissions.IsAuthenticated]
def get(self, request, pk, ps, format=None):
"""
Retrieve a batch.
Attributes:
request (Request): The request sent to the api.
pk (int): The product identifier.
ps (int): the batch identifier.
format (NoneType): Always none, pass by Accept header.
Returns:
            200: The batch is returned.
            400: An error is detected on the request data.
            401: The user must be connected to access this resource.
            404: The batch is not found.
            406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
product = Product.objects.get(owner=request.user, pk=pk) # pylint: disable=no-member
try:
batch = Batch.objects.get(owner=request.user, product=product, pk=ps) # pylint: disable=no-member
except Exception:
return Response(status=status.HTTP_404_NOT_FOUND)
serializer = BatchSerializer(batch)
return Response(serializer.data)
def put(self, request, pk, ps, format=None):
"""
Update a batch.
Attributes:
request (Request): The request sent to the api.
pk (int): The product identifier.
ps (int): the batch identifier.
format (NoneType): Always none, pass by Accept header.
Returns:
            200: The batch is updated.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
product = Product.objects.get(owner=request.user, pk=pk) # pylint: disable=no-member
batch = Batch.objects.get(owner=request.user, product=product, pk=ps) # pylint: disable=no-member
serializer = BatchSerializer(batch, data=request.data)
if serializer.is_valid():
serializer.save(owner=request.user, product=product)
return Response(serializer.data)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, pk, ps, format=None):
"""
Delete a batch.
Attributes:
request (Request): The request sent to the api.
pk (int): The product identifier.
ps (int): the batch identifier.
format (NoneType): Always none, pass by Accept header.
Returns:
            204: The batch is deleted.
400: An error is detected on the request data.
401: The user must be connected to access this resource.
406: The response format is not acceptable by the server.
            500: An error occurred while processing the request.
"""
product = Product.objects.get(owner=request.user, pk=pk) # pylint: disable=no-member
batch = Batch.objects.get(owner=request.user, product=product, pk=ps) # pylint: disable=no-member
batch.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
| 41.02974 | 115 | 0.641026 | 1,369 | 11,037 | 5.136596 | 0.096421 | 0.046928 | 0.029579 | 0.038823 | 0.865188 | 0.848407 | 0.82992 | 0.828925 | 0.828925 | 0.792804 | 0 | 0.025108 | 0.285494 | 11,037 | 268 | 116 | 41.182836 | 0.866599 | 0.559482 | 0 | 0.521739 | 0 | 0 | 0.001051 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.144928 | false | 0 | 0.072464 | 0 | 0.550725 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
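Every detail view above repeats the same pattern: fetch the object scoped to the requesting owner, and answer 404 when it is missing. A framework-free sketch of that owner-scoped lookup — the in-memory `PRODUCTS` dict, the status constants, and `get_owned_product` are illustrative only, not the Django ORM calls the views actually use:

```python
# Sketch of the owner-scoped lookup repeated by the detail views:
# fetch an object belonging to the requesting user, or report 404.
HTTP_200_OK = 200
HTTP_404_NOT_FOUND = 404

# Keyed by (pk, owner) so one user can never see another user's rows,
# mirroring Product.objects.get(owner=request.user, pk=pk).
PRODUCTS = {
    (1, "alice"): {"id": 1, "name": "flour"},
    (2, "bob"): {"id": 2, "name": "sugar"},
}

def get_owned_product(owner, pk):
    """Return (status, payload); 404 when the product is absent or not owned."""
    product = PRODUCTS.get((pk, owner))
    if product is None:
        return HTTP_404_NOT_FOUND, None
    return HTTP_200_OK, product
```

Factoring this lookup into one helper is also how the repeated `try/except` blocks in the views could be collapsed.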
c7a4aea6607d6da0c0fd2beaf2fd3de120fa8c46 | 3,238 | py | Python | ticker/api.py | yusenjeng/invintdex | ea68477d248ca8bdb4d62fd09ee4008e01cb831c | [
"MIT"
] | null | null | null | ticker/api.py | yusenjeng/invintdex | ea68477d248ca8bdb4d62fd09ee4008e01cb831c | [
"MIT"
] | null | null | null | ticker/api.py | yusenjeng/invintdex | ea68477d248ca8bdb4d62fd09ee4008e01cb831c | [
"MIT"
] | null | null | null | from control import helper as Helper
from control.models import Quote, ETFQuote
def generateTBondsReturns(target, f, t):
tickers = ['TLT', 'IEF', 'SHY']
title1, series1, dates = generateReturnsSeries(tickers, f, t, True)
stream1 = Helper.toCsvString(title1, dates, series1)
quotes = ETFQuote.objects.filter(
ticker=target, date__range=[f, t]).order_by('date')
title2 = ['Date', target]
series2 = [[q.close for q in quotes]]
stream2 = Helper.toCsvString(title2, dates, series2)
return {
'title': 'Comparison of US Bonds',
        'subtitle': title1,
'subtitle2': title2,
'yLabel': 'Returns %',
'yLabel2': target,
'data1': stream1.getvalue(),
'data2': stream2.getvalue(),
'signals': [],
'annotation': 'B',
'from': dates[0],
'to': dates[-1],
'range': 0
}
def generateEmDevReturns(target, f, t):
tickers = ['EEM', 'EFA']
title1, series1, dates = generateReturnsSeries(tickers, f, t, True)
stream1 = Helper.toCsvString(title1, dates, series1)
quotes = ETFQuote.objects.filter(
ticker=target, date__range=[f, t]).order_by('date')
title2 = ['Date', target]
series2 = [[q.close for q in quotes]]
stream2 = Helper.toCsvString(title2, dates, series2)
return {
'title': 'Comparison of Emerging/Developed Market',
        'subtitle': title1,
'subtitle2': title2,
'yLabel': 'Returns %',
'yLabel2': target,
'data1': stream1.getvalue(),
'data2': stream2.getvalue(),
'signals': [],
'annotation': 'B',
'from': dates[0],
'to': dates[-1],
'range': 0
}
def generateCBondsReturns(target, f, t):
tickers = ['HYG', 'LQD']
title1, series1, dates = generateReturnsSeries(tickers, f, t, True)
stream1 = Helper.toCsvString(title1, dates, series1)
quotes = ETFQuote.objects.filter(
ticker=target, date__range=[f, t]).order_by('date')
title2 = ['Date', target]
series2 = [[q.close for q in quotes]]
stream2 = Helper.toCsvString(title2, dates, series2)
return {
'title': 'Comparison of Corporate Bonds',
        'subtitle': title1,
'subtitle2': title2,
'yLabel': 'Returns %',
'yLabel2': target,
'data1': stream1.getvalue(),
'data2': stream2.getvalue(),
'signals': [],
'annotation': 'B',
'from': dates[0],
'to': dates[-1],
'range': 0
}
def generateReturnsSeries(tickers, f, t, isETF=False):
titles = ['Date']
series = []
dates = []
for ticker in tickers:
titles.append(ticker)
if isETF:
quotes = ETFQuote.objects.filter(
ticker=ticker, date__range=[f, t]).order_by('date')
else:
quotes = Quote.objects.filter(
ticker=ticker, date__range=[f, t]).order_by('date')
first = quotes[0].adjclose
s = [(q.adjclose / first) * 100 for q in quotes]
series.append(s)
if len(dates) == 0:
dates = [q.date for q in quotes]
return titles, series, dates | 28.654867 | 71 | 0.565164 | 341 | 3,238 | 5.322581 | 0.237537 | 0.013223 | 0.052342 | 0.030303 | 0.741047 | 0.733333 | 0.733333 | 0.733333 | 0.733333 | 0.733333 | 0 | 0.029235 | 0.281655 | 3,238 | 113 | 72 | 28.654867 | 0.751075 | 0 | 0 | 0.684783 | 1 | 0 | 0.12967 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.021739 | 0 | 0.108696 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
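The core transformation in `generateReturnsSeries` above rescales each quote series so its first observation equals 100, which makes tickers with very different price levels directly comparable. Extracted as a standalone function (the name `normalize_returns` is illustrative):

```python
def normalize_returns(prices):
    """Rescale a price series so the first value maps to 100,
    as generateReturnsSeries does with (q.adjclose / first) * 100."""
    first = prices[0]
    return [(p / first) * 100 for p in prices]
```

For example, a series starting at 50 that rises to 75 reads as 100 → 150, i.e. a 50% return, regardless of the absolute price level.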
c7bce37f59e14659e2893b122faf1775fa1d9f44 | 143 | py | Python | example_snippets/multimenus_snippets/Snippets/NumPy/Pretty printing/Formatting functions for specific dtypes/Set formatter for `float` type.py | kuanpern/jupyterlab-snippets-multimenus | 477f51cfdbad7409eab45abe53cf774cd70f380c | [
"BSD-3-Clause"
] | null | null | null | example_snippets/multimenus_snippets/Snippets/NumPy/Pretty printing/Formatting functions for specific dtypes/Set formatter for `float` type.py | kuanpern/jupyterlab-snippets-multimenus | 477f51cfdbad7409eab45abe53cf774cd70f380c | [
"BSD-3-Clause"
] | null | null | null | example_snippets/multimenus_snippets/Snippets/NumPy/Pretty printing/Formatting functions for specific dtypes/Set formatter for `float` type.py | kuanpern/jupyterlab-snippets-multimenus | 477f51cfdbad7409eab45abe53cf774cd70f380c | [
"BSD-3-Clause"
] | 1 | 2021-02-04T04:51:48.000Z | 2021-02-04T04:51:48.000Z | import numpy as np

def format_float(x):
    return '{0:+0.2f}'.format(x)

# np.printoptions is a context manager that applies the formatter
# temporarily, restoring the previous print options on exit.
with np.printoptions(formatter={'float': format_float}):
print(np.random.random(10)-0.5) | 35.75 | 53 | 0.685315 | 23 | 143 | 4.173913 | 0.652174 | 0.229167 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054688 | 0.104895 | 143 | 4 | 54 | 35.75 | 0.695313 | 0 | 0 | 0 | 0 | 0 | 0.097222 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 6 |
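The same formatter can also be applied to a single array without touching the print options at all, via `np.array2string`. A small deterministic sketch (assuming NumPy is available; the example values are arbitrary):

```python
import numpy as np

def format_float(x):
    return '{0:+0.2f}'.format(x)

# array2string applies the formatter to this one call only; the
# 'float_kind' key covers all floating-point dtypes.
text = np.array2string(np.array([0.5, -0.25]),
                       formatter={'float_kind': format_float})
print(text)  # [+0.50 -0.25]
```

This is handy in tests or log lines where a global (or even scoped) change to print options would be too broad.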
1beb8f6c38c4fa79fb57217b6556d7fe3d8d619a | 23 | py | Python | src/test/python/testDataSetRepo/provider/library/dsalib/__init__.py | ninjapapa/SMV2 | 42cf9f176c3ec0bed61f66fbf859c18d97027dd6 | [
"Apache-2.0"
] | null | null | null | src/test/python/testDataSetRepo/provider/library/dsalib/__init__.py | ninjapapa/SMV2 | 42cf9f176c3ec0bed61f66fbf859c18d97027dd6 | [
"Apache-2.0"
] | 34 | 2022-02-26T04:27:34.000Z | 2022-03-29T23:05:47.000Z | src/test/python/testDataSetRepo/provider/library/dsalib/__init__.py | ninjapapa/SMV2 | 42cf9f176c3ec0bed61f66fbf859c18d97027dd6 | [
"Apache-2.0"
] | null | null | null | from dsalib.d import *
| 11.5 | 22 | 0.73913 | 4 | 23 | 4.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4045d135d97ebfe6f37e4c39294d397d8cb8755e | 21 | py | Python | vial-plugin/cash/__init__.py | baverman/vial-cash | f9fd11f9b7e5ef39433ed8632a347afe99997cb5 | [
"MIT"
] | 10 | 2021-02-26T13:00:22.000Z | 2022-03-31T11:38:28.000Z | vial-plugin/cash/__init__.py | baverman/vial-cash | f9fd11f9b7e5ef39433ed8632a347afe99997cb5 | [
"MIT"
] | 1 | 2021-08-03T10:21:12.000Z | 2021-08-03T10:21:12.000Z | images/ruleset-image-base/app/env.py | airspot-dev/krules | 0e402feef51c6189a163a62912480cfac0c438bb | [
"Apache-2.0"
] | null | null | null | def init():
pass
| 7 | 11 | 0.52381 | 3 | 21 | 3.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 21 | 2 | 12 | 10.5 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
406c94a9b95cbe2bfb301d08181a7fba23f8c98a | 457 | py | Python | python/super.py | zykzhang/practice | 180cdf41ae6397b1e71059fc32389cb305e8fea0 | [
"Apache-2.0"
] | null | null | null | python/super.py | zykzhang/practice | 180cdf41ae6397b1e71059fc32389cb305e8fea0 | [
"Apache-2.0"
] | null | null | null | python/super.py | zykzhang/practice | 180cdf41ae6397b1e71059fc32389cb305e8fea0 | [
"Apache-2.0"
] | 1 | 2017-12-26T17:10:15.000Z | 2017-12-26T17:10:15.000Z | # class A:
# def __init__(self):
# print "enter A"
# print "leave A"
# class B(A):
# def __init__(self):
# print "enter B"
# A.__init__(self)
# print "leave B"
class A(object):
def __init__(self):
print "enter A"
print "leave A"
class C(object):
def __init__(self):
print "enter C"
print "leave C"
class B(C):
def __init__(self):
print "enter B"
# A.__init__(self)
super(B, self).__init__()
print "leave B"
b = B()
| 13.848485 | 27 | 0.601751 | 71 | 457 | 3.422535 | 0.169014 | 0.230453 | 0.320988 | 0.329218 | 0.711934 | 0.711934 | 0.567901 | 0.567901 | 0.567901 | 0.312757 | 0 | 0 | 0.238512 | 457 | 32 | 28 | 14.28125 | 0.698276 | 0.378556 | 0 | 0.214286 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.428571 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
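The snippet above contrasts calling a parent's `__init__` directly with delegating through `super()`, which follows the class's method resolution order (MRO). The payoff appears with multiple inheritance: cooperative `super()` calls run each class in the MRO exactly once, in order. A Python 3 sketch (class names are illustrative) that records the call order instead of printing it:

```python
# Cooperative super() with a diamond hierarchy: each __init__ appends
# its class name, then delegates to the next class in the MRO.
class Base:
    def __init__(self, log):
        log.append("Base")

class Left(Base):
    def __init__(self, log):
        log.append("Left")
        super().__init__(log)

class Right(Base):
    def __init__(self, log):
        log.append("Right")
        super().__init__(log)

class Child(Left, Right):
    def __init__(self, log):
        log.append("Child")
        super().__init__(log)

calls = []
Child(calls)
print(calls)  # ['Child', 'Left', 'Right', 'Base']
```

Note that `Left.__init__`'s `super()` resolves to `Right`, not `Base` — the next class in `Child`'s MRO — which is exactly why direct `Base.__init__(self, ...)` calls would run `Base` twice here.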
4094edf07cef101942bf310c7f1736b7855ecbfe | 1,653 | py | Python | catkin_ws/build/vision_msgs/cmake/vision_msgs-genmsg-context.py | Colin1245/ROS-Theory-Application-Shenlan | 49986c83a2c73c7ab4310fd3f010e1b6bc0de786 | [
"Apache-2.0"
] | null | null | null | catkin_ws/build/vision_msgs/cmake/vision_msgs-genmsg-context.py | Colin1245/ROS-Theory-Application-Shenlan | 49986c83a2c73c7ab4310fd3f010e1b6bc0de786 | [
"Apache-2.0"
] | null | null | null | catkin_ws/build/vision_msgs/cmake/vision_msgs-genmsg-context.py | Colin1245/ROS-Theory-Application-Shenlan | 49986c83a2c73c7ab4310fd3f010e1b6bc0de786 | [
"Apache-2.0"
] | null | null | null | # generated from genmsg/cmake/pkg-genmsg.context.in
messages_str = "/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/BoundingBox2D.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/BoundingBox3D.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/Classification2D.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/Classification3D.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/Detection2DArray.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/Detection2D.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/Detection3DArray.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/Detection3D.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/ObjectHypothesis.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/ObjectHypothesisWithPose.msg;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg/VisionInfo.msg"
services_str = ""
pkg_name = "vision_msgs"
dependencies_str = "std_msgs;sensor_msgs;geometry_msgs"
langs = "gencpp;geneus;genlisp;gennodejs;genpy"
dep_include_paths_str = "vision_msgs;/home/lwh/Desktop/Learning/shenlanROS/catkin_ws/src/vision_msgs/msg;std_msgs;/opt/ros/melodic/share/std_msgs/cmake/../msg;sensor_msgs;/opt/ros/melodic/share/sensor_msgs/cmake/../msg;geometry_msgs;/opt/ros/melodic/share/geometry_msgs/cmake/../msg"
PYTHON_EXECUTABLE = "/usr/bin/python2"
package_has_static_sources = '' == 'TRUE'
genmsg_check_deps_script = "/opt/ros/melodic/share/genmsg/cmake/../../../lib/genmsg/genmsg_check_deps.py"
| 137.75 | 981 | 0.833031 | 246 | 1,653 | 5.390244 | 0.264228 | 0.105581 | 0.126697 | 0.199095 | 0.579186 | 0.529412 | 0.529412 | 0.529412 | 0.529412 | 0.529412 | 0 | 0.005559 | 0.020569 | 1,653 | 11 | 982 | 150.272727 | 0.813465 | 0.029643 | 0 | 0 | 1 | 0.222222 | 0.873283 | 0.853933 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
905c84815311912a57b96304a8f8e78afa51abdc | 664 | py | Python | DigitalInnovationOne/hello_django/core/views.py | Neto002/Python | 6892bc8446f216711949a982524867b17d973735 | [
"MIT"
] | null | null | null | DigitalInnovationOne/hello_django/core/views.py | Neto002/Python | 6892bc8446f216711949a982524867b17d973735 | [
"MIT"
] | null | null | null | DigitalInnovationOne/hello_django/core/views.py | Neto002/Python | 6892bc8446f216711949a982524867b17d973735 | [
"MIT"
] | null | null | null | from django.shortcuts import render, HttpResponse
# Create your views here.
def hello(request, nome, idade):
return HttpResponse(f'<h1>Hello {nome} de {idade} anos</h1>')
def soma(request, n1, n2):
return HttpResponse(f'<h1>A soma entre {n1} e {n2} é {n1+n2}</h1>')
def subtracao(request, n1, n2):
return HttpResponse(f'<h1>A subtração entre {n1} e {n2} é {n1-n2}</h1>')
def multiplicacao(request, n1, n2):
return HttpResponse(f'<h1>A multiplicação entre {n1} e {n2} é {n1*n2}</h1>')
def divisao(request, n1, n2):
return HttpResponse(f'<h1>A divisão entre {n1} e {n2} é {n1/n2:.2f}</h1>')
def home(request):
return HttpResponse('Oi') | 31.619048 | 80 | 0.667169 | 108 | 664 | 4.101852 | 0.333333 | 0.072235 | 0.214447 | 0.23702 | 0.467269 | 0.467269 | 0.467269 | 0.433409 | 0.13544 | 0 | 0 | 0.062724 | 0.159639 | 664 | 21 | 81 | 31.619048 | 0.731183 | 0.034639 | 0 | 0 | 0 | 0.307692 | 0.3625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.461538 | false | 0 | 0.076923 | 0.461538 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
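The arithmetic views in this sample only add correctly if the URL patterns capture `n1`/`n2` as integers (e.g. with Django's `<int:...>` converter); with string captures, `{n1+n2}` would concatenate. A self-contained sketch of that distinction, using a hypothetical `soma_body` helper that mirrors the f-string in `soma()` (no Django required):

```python
# soma_body is an assumed stand-in for the body of core/views.py soma().
def soma_body(n1, n2):
    return f'A soma entre {n1} e {n2} é {n1+n2}'

# With ints (what <int:n1>/<int:n2> converters would pass), + adds:
print(soma_body(2, 3))      # A soma entre 2 e 3 é 5
# With strings (e.g. <str:...> captures), + concatenates instead:
print(soma_body('2', '3'))  # A soma entre 2 e 3 é 23
```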
908f984f38a543120836260a1cc08d594d8b9d88 | 178 | py | Python | pystrf/__init__.py | choldgraf/pySTRF | 26e66febd5f2f84e195645cfe4432954173caf14 | [
"BSD-2-Clause"
] | 3 | 2018-05-21T21:22:04.000Z | 2021-09-17T08:12:01.000Z | pystrf/__init__.py | choldgraf/pySTRF | 26e66febd5f2f84e195645cfe4432954173caf14 | [
"BSD-2-Clause"
] | null | null | null | pystrf/__init__.py | choldgraf/pySTRF | 26e66febd5f2f84e195645cfe4432954173caf14 | [
"BSD-2-Clause"
] | null | null | null | """Helpers for STRF analysis"""
from .audio import *
from .fit import *
from .mps import *
from .preprocess import *
from .sig import *
from .stats import *
from .utils import *
| 19.777778 | 31 | 0.707865 | 25 | 178 | 5.04 | 0.52 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179775 | 178 | 8 | 32 | 22.25 | 0.863014 | 0.140449 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
90d040603c620ce31ee22f0c7f8f250f2031bac3 | 9,831 | py | Python | test_backup.py | nathants/backup | fe2fe885ce5d039df377921380e36f5c6c8b9499 | [
"MIT"
] | null | null | null | test_backup.py | nathants/backup | fe2fe885ce5d039df377921380e36f5c6c8b9499 | [
"MIT"
] | null | null | null | test_backup.py | nathants/backup | fe2fe885ce5d039df377921380e36f5c6c8b9499 | [
"MIT"
] | null | null | null | import shell as sh
import os
import re
import uuid
sh.set['stream'] = True
def scrub(text):
text = re.sub(r'\d{4}\-\d{2}\-\d{2}T\d{2}:\d{2}:\d{2}Z', 'DATE', text) # remove digits from date
text = re.sub(r'(\w{8})\w{120}', r'\1', text) # truncate blake2b
return text
def scrub2(text):
text = re.sub(r'\d{4}\-\d{2}\-\d{2}T\d{2}:\d{2}:\d{2}Z', 'DATE', text) # remove digits from date
text = re.sub(r'\w{128}', 'HASH', text) # remove blake2b
return text
def diff():
return [(kind, path, tar, blake2b, size) for x in scrub(sh.run('backup-diff')).splitlines() for kind, path, tar, blake2b, size in [x.split()]]
def additions():
return [(path, tar, blake2b, size) for x in scrub(sh.run('backup-additions')).splitlines() for path, tar, blake2b, size in [x.split()]]
def log():
return scrub2(sh.run('backup-log --pretty=%s index')).splitlines()
def commits():
return sh.run('backup-log --pretty=%H index').splitlines()
def index():
return [(path, tar, blake2b, size) for x in scrub(sh.run('backup-index')).splitlines() for path, tar, blake2b, size in [x.split()]]
def find(regex, revision='HEAD'):
return [(path, tar, blake2b, size) for x in scrub(sh.run(f"backup-find '{regex}' {revision}")).splitlines() for path, tar, blake2b, size in [x.split()]]
def restore(regex, revision='HEAD'):
return [(path, blake2b[:8]) for x in sh.run(f"backup-restore '{regex}' {revision}").splitlines() for path, blake2b in [x.split()]]
def test_basic():
with sh.tempdir():
uid = str(uuid.uuid4())
os.environ['BACKUP_RCLONE_REMOTE'] = os.environ['BACKUP_TEST_RCLONE_REMOTE']
os.environ['BACKUP_DESTINATION'] = os.environ['BACKUP_TEST_DESTINATION'] + '/' + uid
os.environ['BACKUP_STORAGE_CLASS'] = 'STANDARD_IA'
os.environ['BACKUP_CHUNK_MEGABYTES'] = '100'
os.environ['BACKUP_ROOT'] = os.getcwd()
for k, v in os.environ.items():
if k.startswith('BACKUP_'):
print(k, '=>', v)
##
sh.run('echo foo > bar.txt')
sh.run('backup-add')
assert diff() == [
('addition:', './bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
]
assert additions() == [
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
]
sh.run('backup-commit')
assert log() == [
'0000000000.DATE.tar.lz4.gpg.00000 HASH 1510',
'init',
]
assert index() == [
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
]
##
sh.run('echo foo > bar2.txt')
sh.run('backup-add')
assert diff() == [
('addition:', './bar2.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
]
assert additions() == [
('./bar2.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
]
sh.run('backup-commit')
assert log() == [
'index-only-update',
'0000000000.DATE.tar.lz4.gpg.00000 HASH 1510',
'init',
]
assert index() == [
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./bar2.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
]
##
sh.run('echo asdf > asdf.txt')
sh.run('backup-add')
assert diff() == [
('addition:', './asdf.txt', '0000000001.DATE.tar.lz4.gpg.00000', '36b807d5', '5')
]
assert additions() == [
('./asdf.txt', '0000000001.DATE.tar.lz4.gpg.00000', '36b807d5', '5')
]
sh.run('backup-commit')
assert log() == [
'0000000001.DATE.tar.lz4.gpg.00000 HASH 1513',
'index-only-update',
'0000000000.DATE.tar.lz4.gpg.00000 HASH 1510',
'init',
]
assert index() == [
('./asdf.txt', '0000000001.DATE.tar.lz4.gpg.00000', '36b807d5', '5'),
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./bar2.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
]
##
with sh.tempdir():
os.environ['BACKUP_ROOT'] = os.getcwd()
_ = find('.') # clone the repo with the first call to find()
assert [find('.', commit) for commit in commits()] == [find(r'\.txt$', commit) for commit in commits()] == [
[
('./asdf.txt', '0000000001.DATE.tar.lz4.gpg.00000', '36b807d5', '5'),
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./bar2.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
],
[
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./bar2.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
],
[
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
],
[],
]
assert [find('.*asdf.*', commit) for commit in commits()] == [
[
('./asdf.txt', '0000000001.DATE.tar.lz4.gpg.00000', '36b807d5', '5'),
],
[],
[],
[],
]
##
with sh.tempdir():
os.environ['BACKUP_ROOT'] = os.getcwd()
_ = find('.')
assert [restore('.', commit) for commit in commits()] == [
[
('./bar.txt', 'd202d795'),
('./bar2.txt', 'd202d795'),
('./asdf.txt', '36b807d5'),
],
[
('./bar.txt', 'd202d795'),
('./bar2.txt', 'd202d795'),
],
[
('./bar.txt', 'd202d795'),
],
[],
]
assert [restore(r'\./bar2\.txt$', commit) for commit in commits()] == [
[
('./bar2.txt', 'd202d795'),
],
[
('./bar2.txt', 'd202d795'),
],
[],
[],
]
assert sh.run('cat bar.txt') == 'foo'
assert sh.run('cat bar2.txt') == 'foo'
assert sh.run('cat asdf.txt') == 'asdf'
def test_symlink():
with sh.tempdir():
uid = str(uuid.uuid4())
os.environ['BACKUP_RCLONE_REMOTE'] = os.environ['BACKUP_TEST_RCLONE_REMOTE']
os.environ['BACKUP_DESTINATION'] = os.environ['BACKUP_TEST_DESTINATION'] + '/' + uid
os.environ['BACKUP_STORAGE_CLASS'] = 'STANDARD_IA'
os.environ['BACKUP_CHUNK_MEGABYTES'] = '100'
os.environ['BACKUP_ROOT'] = os.getcwd()
for k, v in os.environ.items():
if k.startswith('BACKUP_'):
print(k, '=>', v)
##
sh.run('echo foo > bar.txt')
sh.run('ln -s bar.txt link.txt')
sh.run('backup-add')
assert diff() == [
('addition:', './bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('addition:', './link.txt', 'symlink', './bar.txt', '0'),
]
assert additions() == [
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./link.txt', 'symlink', './bar.txt', '0'),
]
sh.run('backup-commit')
assert log() == [
'0000000000.DATE.tar.lz4.gpg.00000 HASH 1510',
'init',
]
assert index() == [
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./link.txt', 'symlink', './bar.txt', '0'),
]
##
sh.run('mkdir dir')
sh.run('cd dir && ln -s ../bar.txt link.txt')
sh.run('backup-add')
assert diff() == [
('addition:', './dir/link.txt', 'symlink', './bar.txt', '0'),
]
assert additions() == [
('./dir/link.txt', 'symlink', './bar.txt', '0'),
]
sh.run('backup-commit')
assert log() == [
'index-only-update',
'0000000000.DATE.tar.lz4.gpg.00000 HASH 1510',
'init',
]
assert index() == [
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./dir/link.txt', 'symlink', './bar.txt', '0'),
('./link.txt', 'symlink', './bar.txt', '0'),
]
##
with sh.tempdir():
os.environ['BACKUP_ROOT'] = os.getcwd()
_ = find('.') # clone the repo with the first call to find()
assert [find('.', commit) for commit in commits()] == [
[
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./dir/link.txt', 'symlink', './bar.txt', '0'),
('./link.txt', 'symlink', './bar.txt', '0'),
],
[
('./bar.txt', '0000000000.DATE.tar.lz4.gpg.00000', 'd202d795', '4'),
('./link.txt', 'symlink', './bar.txt', '0'),
],
[],
]
##
with sh.tempdir():
os.environ['BACKUP_ROOT'] = os.getcwd()
restore('.')
assert sh.run('find -printf "%y %p %l\n" | grep -v "\./\.backup" | grep -P "^(l|f)"').splitlines() == [
'l ./link.txt bar.txt',
'l ./dir/link.txt ../bar.txt',
'f ./bar.txt',
]
assert sh.run('cat link.txt') == 'foo'
assert sh.run('cat dir/link.txt') == 'foo'
assert sh.run('cat bar.txt') == 'foo'
| 39.641129 | 156 | 0.462517 | 1,087 | 9,831 | 4.147194 | 0.117755 | 0.047915 | 0.068767 | 0.089397 | 0.860248 | 0.824534 | 0.785936 | 0.760204 | 0.725599 | 0.725599 | 0 | 0.123023 | 0.331096 | 9,831 | 247 | 157 | 39.801619 | 0.5625 | 0.017191 | 0 | 0.610619 | 0 | 0.013274 | 0.338487 | 0.128567 | 0 | 0 | 0 | 0 | 0.141593 | 1 | 0.048673 | false | 0 | 0.017699 | 0.030973 | 0.106195 | 0.013274 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
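The `scrub`/`scrub2` helpers in test_backup.py above normalize nondeterministic output (ISO-8601 timestamps, 128-hex-char blake2b digests) so the assertions can compare literal strings. A self-contained sketch of the same idea; the sample input line is made up:

```python
import re

def scrub(text):
    # Replace ISO-8601 UTC timestamps with a stable 'DATE' token.
    text = re.sub(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z', 'DATE', text)
    # Truncate 128-char blake2b hex digests to their first 8 chars.
    text = re.sub(r'(\w{8})\w{120}', r'\1', text)
    return text

line = '0000000000.2021-03-04T05:06:07Z.tar ' + 'a' * 128 + ' 4'
print(scrub(line))  # 0000000000.DATE.tar aaaaaaaa 4
```

Normalizing before asserting keeps the expected values in the tests short and deterministic without hiding the fields that actually matter (paths, sizes, truncated digest prefixes).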
90db32ba1e1e5c8f6e58f0a8918625024dba8ad5 | 314 | py | Python | src/dc_federated/backend/__init__.py | cchanzl/dc-federated | 9a506f2e576f62048e2366fb52d28811190e01a7 | [
"Apache-2.0"
] | 8 | 2020-12-22T12:37:03.000Z | 2022-03-16T12:45:34.000Z | src/dc_federated/backend/__init__.py | cchanzl/dc-federated | 9a506f2e576f62048e2366fb52d28811190e01a7 | [
"Apache-2.0"
] | 3 | 2021-01-04T14:19:41.000Z | 2022-01-06T23:56:08.000Z | src/dc_federated/backend/__init__.py | cchanzl/dc-federated | 9a506f2e576f62048e2366fb52d28811190e01a7 | [
"Apache-2.0"
] | 5 | 2021-01-04T13:25:44.000Z | 2022-01-05T10:02:48.000Z | from dc_federated.backend.dcf_server import DCFServer
from dc_federated.backend.dcf_worker import DCFWorker
from dc_federated.backend._constants import GLOBAL_MODEL, \
GLOBAL_MODEL_VERSION, LAST_WORKER_MODEL_VERSION, WID_LEN
from dc_federated.backend.backend_utils import create_model_dict, is_valid_model_dict | 62.8 | 85 | 0.882166 | 47 | 314 | 5.468085 | 0.468085 | 0.093385 | 0.233463 | 0.342412 | 0.194553 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076433 | 314 | 5 | 85 | 62.8 | 0.886207 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.8 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
90f807e00b325de5aad57e29d0223b2ea0a94e01 | 86 | py | Python | __init__.py | WowlNAN/qpath | 45dd116780e946a8de199c7f5a85a57751f2f593 | [
"CC0-1.0"
] | 3 | 2020-07-19T04:16:24.000Z | 2020-09-21T04:50:43.000Z | __init__.py | WowlNAN/qpath | 45dd116780e946a8de199c7f5a85a57751f2f593 | [
"CC0-1.0"
] | null | null | null | __init__.py | WowlNAN/qpath | 45dd116780e946a8de199c7f5a85a57751f2f593 | [
"CC0-1.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Sun Jul 19 20:37:37 2020
@author: MRN_6
"""
| 9.555556 | 35 | 0.55814 | 15 | 86 | 3.133333 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208955 | 0.22093 | 86 | 8 | 36 | 10.75 | 0.492537 | 0.860465 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
90fbd8a92da9a74d3dfdc49125a36b8c939f2c91 | 43 | py | Python | ftpclient/logger/__init__.py | gusenov/ftp-client-py | 983ab42c1dbf526b9798ceccc9282ae2d9fa3cf7 | [
"MIT"
] | null | null | null | ftpclient/logger/__init__.py | gusenov/ftp-client-py | 983ab42c1dbf526b9798ceccc9282ae2d9fa3cf7 | [
"MIT"
] | null | null | null | ftpclient/logger/__init__.py | gusenov/ftp-client-py | 983ab42c1dbf526b9798ceccc9282ae2d9fa3cf7 | [
"MIT"
] | null | null | null | from ftpclient.logger.file_logger import *
| 21.5 | 42 | 0.837209 | 6 | 43 | 5.833333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 43 | 1 | 43 | 43 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |