hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
773ca315bb385e6ede2553d6bd9ebe22a2d0e6ae | 9,973 | py | Python | dingtalk/python/alibabacloud_dingtalk/project_integration_1_0/models.py | yndu13/dingtalk-sdk | 700fb7bb49c4d3167f84afc5fcb5e7aa5a09735f | [
"Apache-2.0"
] | 15 | 2020-08-27T04:10:26.000Z | 2022-03-07T06:25:42.000Z | dingtalk/python/alibabacloud_dingtalk/project_integration_1_0/models.py | yndu13/dingtalk-sdk | 700fb7bb49c4d3167f84afc5fcb5e7aa5a09735f | [
"Apache-2.0"
] | 1 | 2020-09-27T01:30:46.000Z | 2021-12-29T09:15:34.000Z | dingtalk/python/alibabacloud_dingtalk/project_integration_1_0/models.py | yndu13/dingtalk-sdk | 700fb7bb49c4d3167f84afc5fcb5e7aa5a09735f | [
"Apache-2.0"
] | 5 | 2020-08-27T04:07:44.000Z | 2021-12-03T02:55:20.000Z | # -*- coding: utf-8 -*-
# This file is auto-generated, don't edit it. Thanks.
from Tea.model import TeaModel
from typing import Dict
class SendInteractiveCardHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class SendInteractiveCardResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: dict = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
self.body = m.get('body')
return self
class UpdateInteractiveCardHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class UpdateInteractiveCardResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: dict = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
self.body = m.get('body')
return self
class SendSingleInteractiveCardHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class SendSingleInteractiveCardResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: dict = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
self.body = m.get('body')
return self
class AddAttendeeToEventGroupHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class AddAttendeeToEventGroupResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: dict = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
self.body = m.get('body')
return self
class CreateEventGroupHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class CreateEventGroupResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: dict = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
self.body = m.get('body')
return self
| 29.160819 | 84 | 0.589391 | 1,287 | 9,973 | 4.36519 | 0.045066 | 0.0445 | 0.0801 | 0.144179 | 0.931826 | 0.931826 | 0.931826 | 0.931826 | 0.931826 | 0.931826 | 0 | 0.000145 | 0.308734 | 9,973 | 341 | 85 | 29.246334 | 0.814766 | 0.00732 | 0 | 0.956679 | 1 | 0 | 0.08287 | 0.04093 | 0 | 0 | 0 | 0 | 0 | 1 | 0.144404 | false | 0.018051 | 0.00722 | 0 | 0.296029 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
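Every generated model in the file above follows the same serialization contract: optional constructor fields, a `to_map()` that first defers to the base class and then emits only the fields that were actually set, and a `from_map()` that mirrors it. A minimal, self-contained sketch of that round-trip; `TeaModelStub` and `InteractiveCardHeaders` are stand-ins introduced here for illustration, since the real base class lives in the `Tea` package:

```python
from typing import Dict, Optional


class TeaModelStub:
    """Stand-in for Tea.model.TeaModel (assumed unavailable here).

    Its to_map() returns None, so subclasses fall through to
    building their own dict, matching the generated code's pattern.
    """

    def to_map(self):
        return None


class InteractiveCardHeaders(TeaModelStub):
    """Mirrors the shape of the generated *Headers classes above."""

    def __init__(
        self,
        common_headers: Optional[Dict[str, str]] = None,
        x_acs_dingtalk_access_token: Optional[str] = None,
    ):
        self.common_headers = common_headers
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def to_map(self) -> dict:
        _map = super().to_map()
        if _map is not None:
            return _map
        result = {}
        # Only fields that were actually set appear in the wire map.
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: Optional[dict] = None):
        m = m or {}
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


# Round-trip: serialize to a wire-format dict, then rebuild from it.
headers = InteractiveCardHeaders(x_acs_dingtalk_access_token='token-123')
wire = headers.to_map()
rebuilt = InteractiveCardHeaders().from_map(wire)
```

Because unset fields are skipped in both directions, the `to_map()`/`from_map()` pair is lossless for any subset of fields that were populated.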
625ea86206c9e48abda671906df9e7c804591c74 | 178 | py | Python | src/generator/tests/expected_files/expected_main.py | dslab-epfl/svshi | 0b0f62931698bfd670dae3117078889d966688de | [
"MIT"
] | 1 | 2022-02-11T14:54:49.000Z | 2022-02-11T14:54:49.000Z | src/generator/tests/expected_files/expected_main.py | dslab-epfl/svshi | 0b0f62931698bfd670dae3117078889d966688de | [
"MIT"
] | null | null | null | src/generator/tests/expected_files/expected_main.py | dslab-epfl/svshi | 0b0f62931698bfd670dae3117078889d966688de | [
"MIT"
] | null | null | null | from instances import app_state, BINARY_SENSOR_INSTANCE_NAME, SWITCH_INSTANCE_NAME, TEMPERATURE_SENSOR_INSTANCE_NAME, HUMIDITY_SENSOR_INSTANCE_NAME, CO_TWO_SENSOR_INSTANCE_NAME
| 59.333333 | 176 | 0.91573 | 25 | 178 | 5.88 | 0.56 | 0.408163 | 0.489796 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05618 | 178 | 2 | 177 | 89 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
6263224c0b6791a70a0587fa660bee4b5840b8ba | 39,829 | py | Python | tests/iptables_test.py | drienyov/treadmill | ce21537cd9a2fdb0567ac2aa3de1afcb2f6861de | [
"Apache-2.0"
] | null | null | null | tests/iptables_test.py | drienyov/treadmill | ce21537cd9a2fdb0567ac2aa3de1afcb2f6861de | [
"Apache-2.0"
] | null | null | null | tests/iptables_test.py | drienyov/treadmill | ce21537cd9a2fdb0567ac2aa3de1afcb2f6861de | [
"Apache-2.0"
] | null | null | null | """Unit test for iptables - manipulating iptables rules.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import io
import os
import time
import unittest
import mock
# Disable W0611: Unused import
import tests.treadmill_test_skip_windows # pylint: disable=W0611
import treadmill
from treadmill import firewall
from treadmill import iptables
from treadmill import subproc
# Disable C0302: Too many lines
# pylint: disable=C0302
class IptablesTest(unittest.TestCase):
"""Mock test for treadmill.iptables."""
IPTABLES_STATE = os.path.join(
os.path.dirname(__file__),
'iptables_state.save'
)
IPTABLES_EMPTY_STATE = os.path.join(
os.path.dirname(__file__),
'iptables_empty_state.save'
)
IPTABLES_FILTER_STATE = os.path.join(
os.path.dirname(__file__),
'iptables_filter_state.save'
)
IPSET_STATE = os.path.join(
os.path.dirname(__file__),
'ipset_state.save'
)
NAT_TABLE_SAVE = os.path.join(
os.path.dirname(__file__),
'iptables_test_nat_table.save'
)
def setUp(self):
# Note: These two match the content of NAT_TABLE_SAVE
self.dnat_rules = set([
firewall.DNATRule(proto='udp',
dst_ip='172.31.81.67', dst_port=5002,
new_ip='192.168.1.13', new_port=8000),
firewall.DNATRule(proto='tcp',
dst_ip='172.31.81.67', dst_port=5000,
new_ip='192.168.0.11', new_port=8000),
firewall.DNATRule(proto='tcp',
dst_ip='172.31.81.67', dst_port=5003,
new_ip='192.168.1.13', new_port=22),
firewall.DNATRule(proto='tcp',
dst_ip='172.31.81.67', dst_port=5001,
new_ip='192.168.0.11', new_port=22),
])
self.snat_rules = set([
firewall.SNATRule(proto='udp',
src_ip='192.168.0.3', src_port=22,
new_ip='172.31.81.67', new_port=5001),
])
self.passthrough_rules = set([
firewall.PassThroughRule(src_ip='10.197.19.18',
dst_ip='192.168.3.2'),
firewall.PassThroughRule(src_ip='10.197.19.19',
dst_ip='192.168.2.2'),
])
with io.open(self.IPTABLES_STATE) as f:
self.iptables_state = f.read()
with io.open(self.IPTABLES_EMPTY_STATE) as f:
self.iptables_empty_state = f.read()
with io.open(self.IPTABLES_FILTER_STATE) as f:
self.iptables_filter_state = f.read()
with io.open(self.IPSET_STATE) as f:
self.ipset_state = f.read()
with io.open(self.NAT_TABLE_SAVE) as f:
self.nat_table_save = f.read()
@mock.patch('treadmill.iptables.flush_set', mock.Mock(set_spec=True))
@mock.patch('treadmill.iptables.add_ip_set', mock.Mock(set_spec=True))
@mock.patch('treadmill.iptables.ipset_restore', mock.Mock(set_spec=True))
@mock.patch('treadmill.iptables.create_chain', mock.Mock(set_spec=True))
@mock.patch('treadmill.iptables._iptables_restore',
mock.Mock(set_spec=True))
def test_initialize(self):
"""Test iptables initialization"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
# NOTE: keep this IP in sync with the tests' state file dumps
iptables.initialize('1.2.3.4')
treadmill.iptables.ipset_restore.assert_called_with(
self.ipset_state
)
treadmill.iptables._iptables_restore.assert_called_with(
self.iptables_state
)
@mock.patch('treadmill.iptables.create_chain', mock.Mock(set_spec=True))
@mock.patch('treadmill.iptables._iptables_restore',
mock.Mock(set_spec=True))
def test_filter_table_set(self):
"""Test filter table initialization.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
# NOTE: keep this sync with the tests' filter state file dump.
iptables.filter_table_set(['one rule', 'two rule'], ['other rule'])
treadmill.iptables.create_chain.assert_called_with(
'filter', 'TM_EXCEPTION_FILTER'
)
treadmill.iptables._iptables_restore.assert_called_with(
self.iptables_filter_state, noflush=True
)
@mock.patch('treadmill.subproc.invoke', mock.Mock(return_value=(0, '')))
def test_iptables_restore(self):
"""Test iptables-restore util"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables._iptables_restore('firewall_state', noflush=True)
treadmill.subproc.invoke.assert_called_with(
['iptables_restore', '--noflush'],
cmd_input='firewall_state',
use_except=True
)
@mock.patch('treadmill.subproc.invoke', mock.Mock(return_value=(0, '')))
def test_initialize_container(self):
"""Test iptables container initialization"""
iptables.initialize_container()
treadmill.subproc.invoke.assert_called_with(
['iptables_restore'],
cmd_input=self.iptables_empty_state,
use_except=True,
)
@mock.patch('treadmill.subproc.check_call', mock.Mock())
def test_add_raw_rule(self):
"""Test adding iptable rule."""
iptables.add_raw_rule('nat', 'OUTPUT', '-j FOO', safe=False)
treadmill.subproc.check_call.assert_called_with(
['iptables', '-t', 'nat', '-A', 'OUTPUT', '-j', 'FOO']
)
@mock.patch('treadmill.subproc.check_call', mock.Mock())
def test_delete_raw_rule(self):
"""Test deleting an iptable rule."""
iptables.delete_raw_rule('nat', 'OUTPUT', '-j FOO')
treadmill.subproc.check_call.assert_called_with(
['iptables', '-t', 'nat', '-D', 'OUTPUT', '-j', 'FOO']
)
treadmill.subproc.check_call.reset_mock()
treadmill.subproc.check_call.side_effect = (
treadmill.subproc.CalledProcessError(1, '1.4.7 style')
)
# Should not raise
iptables.delete_raw_rule('nat', 'OUTPUT', '-j FOO')
treadmill.subproc.check_call.reset_mock()
# Should not raise
treadmill.subproc.check_call.side_effect = (
treadmill.subproc.CalledProcessError(2, '1.4.21 style')
)
iptables.delete_raw_rule('nat', 'OUTPUT', '-j FOO')
treadmill.subproc.check_call.reset_mock()
treadmill.subproc.check_call.side_effect = (
treadmill.subproc.CalledProcessError(42, 'other error')
)
self.assertRaises(
treadmill.subproc.CalledProcessError,
iptables.delete_raw_rule,
'nat', 'OUTPUT', '-j FOO'
)
@mock.patch('treadmill.subproc.check_call', mock.Mock())
def test_add_raw_rule_safe(self):
"""Test adding iptable rule (safe)."""
treadmill.subproc.check_call.return_value = 0
iptables.add_raw_rule('nat', 'OUTPUT', '-j FOO', safe=True)
treadmill.subproc.check_call.assert_called_once_with(
['iptables', '-t', 'nat', '-C', 'OUTPUT', '-j', 'FOO']
)
# Rule does not exist.
treadmill.subproc.check_call.reset_mock()
treadmill.subproc.check_call.side_effect = [
subproc.CalledProcessError(1, ''),
0,
]
iptables.add_raw_rule('nat', 'OUTPUT', '-j FOO', safe=True)
treadmill.subproc.check_call.assert_has_calls([
mock.call(['iptables', '-t', 'nat', '-C', 'OUTPUT', '-j', 'FOO']),
mock.call(['iptables', '-t', 'nat', '-A', 'OUTPUT', '-j', 'FOO'])
])
# Unexpected iptables error while checking if the rule already exists.
treadmill.subproc.check_call.reset_mock()
treadmill.subproc.check_call.side_effect = \
subproc.CalledProcessError(3, '')
with self.assertRaises(subproc.CalledProcessError):
iptables.add_raw_rule('nat', 'OUTPUT', '-j FOO', safe=True)
@mock.patch('treadmill.iptables.add_raw_rule', mock.Mock())
def test_add_dnat_rule(self):
"""Test dnat rule addition."""
iptables.add_dnat_rule(
firewall.DNATRule(proto='tcp',
dst_ip='1.1.1.1', dst_port=123,
new_ip='2.2.2.2', new_port=345),
'SOME_RULE',
safe=True
)
treadmill.iptables.add_raw_rule.assert_called_with(
'nat', 'SOME_RULE',
('-s 0.0.0.0/0 -d 1.1.1.1 -p tcp -m tcp --dport 123'
' -j DNAT --to-destination 2.2.2.2:345'),
True
)
@mock.patch('treadmill.iptables.delete_raw_rule', mock.Mock())
def test_delete_dnat_rule(self):
"""Test dnat rule deletion."""
iptables.delete_dnat_rule(
firewall.DNATRule(proto='tcp',
dst_ip='1.1.1.1', dst_port=123,
new_ip='2.2.2.2', new_port=345),
'SOME_RULE'
)
treadmill.iptables.delete_raw_rule.assert_called_with(
'nat', 'SOME_RULE',
('-s 0.0.0.0/0 -d 1.1.1.1 -p tcp -m tcp --dport 123'
' -j DNAT --to-destination 2.2.2.2:345')
)
@mock.patch('treadmill.subproc.check_call', mock.Mock())
def test_delete_dnat_rule_nonexist(self):
"""Test dnat rule deleting when the rule does not exist."""
treadmill.subproc.check_call.side_effect = \
subproc.CalledProcessError(returncode=1, output='', cmd='')
iptables.delete_dnat_rule(
firewall.DNATRule(proto='tcp',
dst_ip='1.1.1.1', dst_port=123,
new_ip='2.2.2.2', new_port=345),
'SOME_RULE',
)
treadmill.subproc.check_call.assert_called_with([
'iptables', '-t', 'nat', '-D', 'SOME_RULE',
'-s', '0.0.0.0/0', '-d', '1.1.1.1', '-p', 'tcp', '-m', 'tcp',
'--dport', '123',
'-j', 'DNAT', '--to-destination', '2.2.2.2:345'])
@mock.patch('treadmill.iptables.add_raw_rule', mock.Mock())
def test_add_snat_rule(self):
"""Test snat rule addition."""
iptables.add_snat_rule(
firewall.SNATRule(proto='tcp',
src_ip='1.1.1.1', src_port=123,
new_ip='2.2.2.2', new_port=345),
'SOME_RULE',
safe=True
)
treadmill.iptables.add_raw_rule.assert_called_with(
'nat', 'SOME_RULE',
('-s 1.1.1.1 -d 0.0.0.0/0 -p tcp -m tcp --sport 123'
' -j SNAT --to-source 2.2.2.2:345'),
True
)
@mock.patch('treadmill.iptables.delete_raw_rule', mock.Mock())
def test_delete_snat_rule(self):
"""Test snat rule deletion."""
iptables.delete_snat_rule(
firewall.SNATRule(proto='tcp',
src_ip='1.1.1.1', src_port=123,
new_ip='2.2.2.2', new_port=345),
'SOME_RULE'
)
treadmill.iptables.delete_raw_rule.assert_called_with(
'nat', 'SOME_RULE',
('-s 1.1.1.1 -d 0.0.0.0/0 -p tcp -m tcp --sport 123'
' -j SNAT --to-source 2.2.2.2:345')
)
@mock.patch('treadmill.iptables.add_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_dnat_rules', mock.Mock())
def test_dnat_up_to_date(self):
"""Tests DNAT setup when configuration is up to date.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_dnat_rules.return_value = \
self.dnat_rules
iptables.configure_dnat_rules(
self.dnat_rules,
iptables.PREROUTING_DNAT
)
self.assertEqual(0, treadmill.iptables.add_dnat_rule.call_count)
self.assertEqual(0, treadmill.iptables.delete_dnat_rule.call_count)
@mock.patch('treadmill.iptables.add_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_dnat_rules', mock.Mock())
def test_dnat_missing_rule(self):
"""Tests DNAT setup when new rule needs to be created.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_dnat_rules.return_value = \
self.dnat_rules
desired_rules = (
self.dnat_rules |
set([
firewall.DNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
])
)
iptables.configure_dnat_rules(
desired_rules,
iptables.PREROUTING_DNAT
)
treadmill.iptables.add_dnat_rule.assert_called_with(
firewall.DNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
chain=iptables.PREROUTING_DNAT
)
self.assertEqual(0, treadmill.iptables.delete_dnat_rule.call_count)
@mock.patch('treadmill.iptables.add_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_dnat_rules', mock.Mock())
def test_dnat_extra_rule(self):
"""Tests DNAT setup when rule needs to be removed."""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_dnat_rules.return_value = (
self.dnat_rules |
set([
firewall.DNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
])
)
desired_rules = (
self.dnat_rules
)
iptables.configure_dnat_rules(
desired_rules,
iptables.PREROUTING_DNAT
)
self.assertEqual(0, treadmill.iptables.add_dnat_rule.call_count)
treadmill.iptables.delete_dnat_rule.assert_called_with(
firewall.DNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
chain=iptables.PREROUTING_DNAT,
)
@mock.patch('treadmill.subproc.check_output', mock.Mock())
def test__get_current_dnat_rules(self):
"""Test query DNAT/SNAT rules."""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.subproc.check_output.return_value = \
self.nat_table_save
rules = iptables._get_current_dnat_rules(iptables.PREROUTING_DNAT)
treadmill.subproc.check_output.assert_called_with(
['iptables',
'-t', 'nat', '-S', iptables.PREROUTING_DNAT]
)
self.assertEqual(set(rules), self.dnat_rules)
@mock.patch('treadmill.iptables.add_snat_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_snat_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_snat_rules', mock.Mock())
def test_snat_up_to_date(self):
"""Tests SNAT setup when configuration is up to date.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_snat_rules.return_value = \
self.snat_rules
iptables.configure_snat_rules(
self.snat_rules,
iptables.POSTROUTING_SNAT
)
self.assertEqual(0, treadmill.iptables.add_snat_rule.call_count)
self.assertEqual(0, treadmill.iptables.delete_snat_rule.call_count)
@mock.patch('treadmill.iptables.add_snat_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_snat_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_snat_rules', mock.Mock())
def test_snat_missing_rule(self):
"""Tests DNAT setup when new rule needs to be created.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_snat_rules.return_value = \
self.snat_rules
desired_rules = (
self.snat_rules |
set([
firewall.SNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
])
)
iptables.configure_snat_rules(
desired_rules,
iptables.POSTROUTING_SNAT
)
treadmill.iptables.add_snat_rule.assert_called_with(
firewall.SNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
chain=iptables.POSTROUTING_SNAT
)
self.assertEqual(0, treadmill.iptables.delete_snat_rule.call_count)
@mock.patch('treadmill.iptables.add_snat_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_snat_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_snat_rules', mock.Mock())
def test_snat_extra_rule(self):
"""Tests SNAT setup when rule needs to be removed.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_snat_rules.return_value = (
self.snat_rules |
set([
firewall.SNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
])
)
desired_rules = (
self.snat_rules
)
iptables.configure_snat_rules(
desired_rules,
iptables.POSTROUTING_SNAT
)
self.assertEqual(0, treadmill.iptables.add_snat_rule.call_count)
treadmill.iptables.delete_snat_rule.assert_called_with(
firewall.SNATRule('tcp',
'172.31.81.67', 5004,
'192.168.2.15', 22),
chain=iptables.POSTROUTING_SNAT
)
@mock.patch('treadmill.subproc.check_output', mock.Mock())
def test__get_current_snat_rules(self):
"""Test query DNAT/SNAT rules."""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.subproc.check_output.return_value = \
self.nat_table_save
rules = iptables._get_current_snat_rules(iptables.POSTROUTING_SNAT)
treadmill.subproc.check_output.assert_called_with(
['iptables',
'-t', 'nat', '-S', iptables.POSTROUTING_SNAT]
)
self.assertEqual(set(rules), self.snat_rules)
@mock.patch('treadmill.iptables.add_raw_rule', mock.Mock())
def test_add_passthrough_rule(self):
"""Test configure_passthrough."""
iptables.add_passthrough_rule(
firewall.PassThroughRule(src_ip='4.4.4.4', dst_ip='1.2.3.4'),
iptables.PREROUTING_PASSTHROUGH
)
treadmill.iptables.add_raw_rule.assert_called_with(
'nat', iptables.PREROUTING_PASSTHROUGH,
'-s 4.4.4.4 -j DNAT --to-destination 1.2.3.4',
safe=False
)
@mock.patch('treadmill.iptables.delete_raw_rule', mock.Mock())
def test_delete_passthrough_rule(self):
"""Test deletion of a passthrough rule"""
iptables.delete_passthrough_rule(
firewall.PassThroughRule(src_ip='4.4.4.4', dst_ip='1.2.3.4'),
iptables.PREROUTING_PASSTHROUGH
)
treadmill.iptables.delete_raw_rule.assert_called_with(
'nat', iptables.PREROUTING_PASSTHROUGH,
'-s 4.4.4.4 -j DNAT --to-destination 1.2.3.4'
)
@mock.patch('treadmill.iptables.delete_raw_rule', mock.Mock())
def test_delete_passthrough_rule2(self):
"""Test deletion of a passthrough rule (no conntrack data)"""
# Check that ret_code 1 from conntrack -D is treated as success.
iptables.delete_passthrough_rule(
firewall.PassThroughRule(src_ip='5.5.5.5', dst_ip='1.2.3.4'),
iptables.PREROUTING_PASSTHROUGH
)
treadmill.iptables.delete_raw_rule.assert_called_with(
'nat', iptables.PREROUTING_PASSTHROUGH,
'-s 5.5.5.5 -j DNAT --to-destination 1.2.3.4'
)
@mock.patch('treadmill.iptables.add_passthrough_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_passthrough_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_passthrough_rules',
mock.Mock())
def test_passthrough_up_to_date(self):
"""Tests PassThrough setup when configuration is up to date."""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_passthrough_rules.return_value = \
self.passthrough_rules
passthroughs = self.passthrough_rules
iptables.configure_passthrough_rules(
passthroughs,
iptables.PREROUTING_PASSTHROUGH
)
self.assertEqual(
0, treadmill.iptables.add_passthrough_rule.call_count
)
self.assertEqual(
0, treadmill.iptables.delete_passthrough_rule.call_count
)
@mock.patch('treadmill.iptables.add_passthrough_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_passthrough_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_passthrough_rules',
mock.Mock())
def test_passthrough_missing_rule(self):
"""Tests PassThrough setup when new rule needs to be created."""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_passthrough_rules.return_value = \
self.passthrough_rules
missing_rule = firewall.PassThroughRule(src_ip='10.197.19.20',
dst_ip='192.168.2.2')
passthroughs = self.passthrough_rules | set([missing_rule, ])
iptables.configure_passthrough_rules(
passthroughs,
iptables.PREROUTING_PASSTHROUGH
)
treadmill.iptables.add_passthrough_rule.assert_called_with(
missing_rule,
chain=iptables.PREROUTING_PASSTHROUGH
)
self.assertEqual(
0, treadmill.iptables.delete_passthrough_rule.call_count
)
@mock.patch('treadmill.iptables.add_passthrough_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_passthrough_rule', mock.Mock())
@mock.patch('treadmill.iptables._get_current_passthrough_rules',
mock.Mock())
def test_passthrough_extra_rule(self):
"""Tests PassThrough setup when rule needs to be removed."""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.iptables._get_current_passthrough_rules.return_value = \
self.passthrough_rules
extra_rule = firewall.PassThroughRule(src_ip='10.197.19.19',
dst_ip='192.168.2.2')
passthroughs = self.passthrough_rules - set([extra_rule, ])
iptables.configure_passthrough_rules(
passthroughs,
iptables.PREROUTING_PASSTHROUGH
)
self.assertEqual(
0, treadmill.iptables.add_passthrough_rule.call_count
)
treadmill.iptables.delete_passthrough_rule.assert_called_with(
extra_rule,
chain=iptables.PREROUTING_PASSTHROUGH
)
@mock.patch('treadmill.subproc.check_call', mock.Mock(autospec=True))
def test_flush_cnt_conntrack_table(self):
"""Test flushing container conntrack rules.
"""
treadmill.subproc.check_call.return_value = 0
treadmill.iptables.flush_cnt_conntrack_table(vip='5.5.5.5')
treadmill.subproc.check_call.assert_has_calls(
[
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--src-nat', '5.5.5.5'
]
),
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--dst-nat', '5.5.5.5'
]
),
],
any_order=True
)
treadmill.subproc.check_call.reset_mock()
treadmill.subproc.check_call.return_value = 1
treadmill.subproc.check_call.side_effect = \
subproc.CalledProcessError(returncode=1, cmd='failed conntrack')
treadmill.iptables.flush_cnt_conntrack_table('4.4.4.4')
treadmill.subproc.check_call.assert_has_calls(
[
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--src-nat', '4.4.4.4'
]
),
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--dst-nat', '4.4.4.4'
]
),
],
any_order=True
)
@mock.patch('treadmill.subproc.check_call', mock.Mock(autospec=True))
def test_flush_pt_conntrack_table(self):
"""Test flushing Passthrough conntrack rules.
"""
treadmill.subproc.check_call.return_value = 0
treadmill.iptables.flush_pt_conntrack_table('5.5.5.5')
treadmill.subproc.check_call.assert_has_calls(
[
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--orig-src', '5.5.5.5'
]
),
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--orig-dst', '5.5.5.5'
]
),
],
any_order=True
)
treadmill.subproc.check_call.reset_mock()
treadmill.subproc.check_call.return_value = 1
treadmill.subproc.check_call.side_effect = \
subproc.CalledProcessError(returncode=1, cmd='failed conntrack')
treadmill.iptables.flush_pt_conntrack_table('4.4.4.4')
treadmill.subproc.check_call.assert_has_calls(
[
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--orig-src', '4.4.4.4'
]
),
mock.call(
[
'conntrack',
'-D',
'--protonum', 'udp',
'--orig-dst', '4.4.4.4'
]
),
],
any_order=True
)
@mock.patch('treadmill.subproc.check_output', mock.Mock())
def test__get_current_pt_rules(self):
"""Test query passthrough rules."""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.subproc.check_output.return_value = \
self.nat_table_save
rules = iptables._get_current_passthrough_rules(
iptables.PREROUTING_PASSTHROUGH
)
treadmill.subproc.check_output.assert_called_with(
['iptables',
'-t', 'nat', '-S', iptables.PREROUTING_PASSTHROUGH]
)
self.assertEqual(set(rules), self.passthrough_rules)
@mock.patch('treadmill.iptables.add_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables.add_passthrough_rule', mock.Mock())
def test_add_rule(self):
"""Test generic addition of a rule"""
dnat_rule = self.dnat_rules.pop()
passthrough_rule = self.passthrough_rules.pop()
iptables.add_rule(dnat_rule, chain='TEST_CHAIN')
self.assertEqual(
0, treadmill.iptables.add_passthrough_rule.call_count
)
treadmill.iptables.add_dnat_rule.assert_called_with(
dnat_rule,
chain='TEST_CHAIN'
)
treadmill.iptables.add_passthrough_rule.reset_mock()
treadmill.iptables.add_dnat_rule.reset_mock()
iptables.add_rule(passthrough_rule, chain='TEST_CHAIN')
treadmill.iptables.add_passthrough_rule.assert_called_with(
passthrough_rule,
chain='TEST_CHAIN'
)
self.assertEqual(
0, treadmill.iptables.add_dnat_rule.call_count
)
@mock.patch('treadmill.iptables.delete_dnat_rule', mock.Mock())
@mock.patch('treadmill.iptables.delete_passthrough_rule', mock.Mock())
def test_delete_rule(self):
"""Test generic removal of a rule"""
dnat_rule = self.dnat_rules.pop()
passthrough_rule = self.passthrough_rules.pop()
iptables.delete_rule(dnat_rule, chain='TEST_CHAIN')
self.assertEqual(
0, treadmill.iptables.delete_passthrough_rule.call_count
)
treadmill.iptables.delete_dnat_rule.assert_called_with(
dnat_rule,
chain='TEST_CHAIN'
)
treadmill.iptables.delete_passthrough_rule.reset_mock()
treadmill.iptables.delete_dnat_rule.reset_mock()
iptables.delete_rule(passthrough_rule, chain='TEST_CHAIN')
treadmill.iptables.delete_passthrough_rule.assert_called_with(
passthrough_rule,
chain='TEST_CHAIN'
)
self.assertEqual(
0, treadmill.iptables.delete_dnat_rule.call_count
)
@mock.patch('time.sleep', mock.Mock(spec_set=True))
@mock.patch('treadmill.subproc.check_call', mock.Mock(spec_set=True))
def test__iptables(self):
"""Test iptables command invocation.
"""
# pylint: disable=protected-access
res = iptables._iptables('foo', 'bar', 'baz')
treadmill.subproc.check_call.assert_called_with(
['iptables', '-t', 'foo', 'bar', 'baz']
)
self.assertEqual(res, treadmill.subproc.check_call.return_value)
treadmill.subproc.check_call.reset_mock()
res = iptables._iptables(
'foo', 'bar', 'baz',
['-d', 'some_host', '-p', 'tcp']
)
treadmill.subproc.check_call.assert_called_with(
[
'iptables', '-t', 'foo', 'bar', 'baz',
'-d', 'some_host', '-p', 'tcp'
]
)
self.assertEqual(res, treadmill.subproc.check_call.return_value)
treadmill.subproc.check_call.reset_mock()
mock_res = mock.Mock(name='finally!')
treadmill.subproc.check_call.side_effect = [
treadmill.subproc.CalledProcessError(4, 'locked'),
mock_res,
]
res = iptables._iptables('foo', 'bar', 'baz')
treadmill.subproc.check_call.assert_called_with(
['iptables', '-t', 'foo', 'bar', 'baz']
)
time.sleep.assert_has_calls(
[mock.ANY] * 1
)
self.assertEqual(time.sleep.call_count, 1)
self.assertEqual(res, mock_res)
treadmill.subproc.check_call.reset_mock()
treadmill.subproc.check_call.side_effect = (
treadmill.subproc.CalledProcessError('not 4', 'something else')
)
self.assertRaises(
treadmill.subproc.CalledProcessError,
iptables._iptables,
'foo', 'bar', 'baz'
)
@mock.patch('time.sleep', mock.Mock(spec_set=True))
@mock.patch('treadmill.subproc.check_output', mock.Mock(spec_set=True))
def test__iptables_output(self):
"""Test iptables command invocation.
"""
# pylint: disable=protected-access
res = iptables._iptables_output('foo', 'bar', 'baz')
treadmill.subproc.check_output.assert_called_with(
['iptables', '-t', 'foo', 'bar', 'baz']
)
self.assertEqual(res, treadmill.subproc.check_output.return_value)
treadmill.subproc.check_output.reset_mock()
mock_res = mock.Mock(name='finally!')
treadmill.subproc.check_output.side_effect = [
treadmill.subproc.CalledProcessError(4, 'locked'),
treadmill.subproc.CalledProcessError(4, 'locked'),
mock_res,
]
res = iptables._iptables_output('foo', 'bar', 'baz')
treadmill.subproc.check_output.assert_called_with(
['iptables', '-t', 'foo', 'bar', 'baz']
)
time.sleep.assert_has_calls(
[mock.ANY] * 2
)
self.assertEqual(time.sleep.call_count, 2)
self.assertEqual(res, mock_res)
treadmill.subproc.check_output.reset_mock()
treadmill.subproc.check_output.side_effect = (
treadmill.subproc.CalledProcessError('not 4', 'something else')
)
self.assertRaises(
treadmill.subproc.CalledProcessError,
iptables._iptables_output,
'foo', 'bar', 'baz'
)
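The two tests above exercise a retry-on-lock behaviour: iptables exits with code 4 while the xtables lock is held, so the wrapper sleeps and retries on that exit code only, and re-raises any other error. A minimal sketch of such a wrapper (`run_iptables_with_retry` is a hypothetical helper, not the project's actual `_iptables` implementation):

```python
# Sketch only: illustrates the retry-on-exit-code-4 behaviour the tests
# assert; the real treadmill.iptables._iptables may differ in detail.
import subprocess
import time


def run_iptables_with_retry(args, run=subprocess.check_call,
                            sleep=time.sleep, retries=5):
    """Run iptables, retrying while exit code 4 (xtables lock held) occurs."""
    for attempt in range(retries):
        try:
            return run(['iptables'] + list(args))
        except subprocess.CalledProcessError as err:
            # re-raise immediately for any failure other than the lock code,
            # or once the retry budget is exhausted
            if err.returncode != 4 or attempt == retries - 1:
                raise
            sleep(1)  # back off before retrying
```

Injecting `run` and `sleep` keeps the sketch testable without touching the system, which is the same seam the tests above mock.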
@mock.patch('treadmill.subproc.invoke', mock.Mock(return_value=(0, '')))
def test__ipset(self):
"""Test ipset tool invocation.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
treadmill.subproc.invoke.return_value = (123, 'test data')
res = iptables._ipset('foo', 'bar', cmd_input='test')
treadmill.subproc.invoke.assert_called_with(
['ipset', 'foo', 'bar'],
cmd_input='test',
use_except=True
)
self.assertEqual(
res,
(123, 'test data')
)
@mock.patch('treadmill.iptables._ipset', mock.Mock())
def test_list_set(self):
"""Test listing set membership.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables._ipset.return_value = (
42,
"""
<ipset name="tm:prod-containers">
<type>hash:ip</type>
<header>
<family>inet</family>
<hashsize>1024</hashsize>
<maxelem>65536</maxelem>
<memsize>16520</memsize>
<references>3</references>
</header>
<members>
<member>192.168.0.2</member>
<member>192.168.0.7</member>
</members>
</ipset>
"""
)
res = iptables.list_set('tm:prod-containers')
iptables._ipset.assert_called_with(
'list', '-o', 'xml', 'tm:prod-containers'
)
self.assertEqual(
res,
['192.168.0.2', '192.168.0.7']
)
@mock.patch('treadmill.iptables._ipset', mock.Mock())
def test_init_set(self):
"""Test set initialization"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables.init_set('foo')
treadmill.iptables._ipset.assert_has_calls([
mock.call('-exist', 'create', 'foo', 'hash:ip'),
mock.call('flush', 'foo'),
])
@mock.patch('treadmill.iptables._ipset', mock.Mock())
def test_test_ip_set(self):
"""Test testing of IP in a given set"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables._ipset.return_value = (42, 'foo')
res = iptables.test_ip_set('foo', '1.2.3.4')
treadmill.iptables._ipset.assert_called_with(
'test', 'foo', '1.2.3.4', use_except=False,
)
self.assertFalse(res)
# Try with success now
iptables._ipset.reset_mock()
iptables._ipset.return_value = (0, 'bar')
res = iptables.test_ip_set('foo', '1.2.3.4')
self.assertTrue(res)
@mock.patch('treadmill.iptables._ipset', mock.Mock())
def test_add_ip_set(self):
"""Test addition of IP to a given set"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables.add_ip_set('foo', '1.2.3.4')
treadmill.iptables._ipset.assert_called_with(
'-exist', 'add', 'foo', '1.2.3.4'
)
@mock.patch('treadmill.iptables._ipset', mock.Mock())
def test_rm_ip_set(self):
"""Test removal of IP from a given set"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables.rm_ip_set('foo', '1.2.3.4')
treadmill.iptables._ipset.assert_called_with(
'-exist', 'del', 'foo', '1.2.3.4'
)
@mock.patch('treadmill.iptables._ipset', mock.Mock())
def test_swap_set(self):
"""Test swapping of two IPSets.
"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables.swap_set('from', 'to')
treadmill.iptables._ipset.assert_called_with(
'swap', 'from', 'to'
)
@mock.patch('treadmill.iptables._ipset', mock.Mock())
def test_ipset_restore(self):
"""Test the state restore functionality of IPSet"""
# Disable protected-access: Test access protected members.
# pylint: disable=protected-access
iptables.ipset_restore('Initial IPSet state')
treadmill.iptables._ipset.assert_called_with(
'-exist', 'restore', cmd_input='Initial IPSet state'
)
@mock.patch('treadmill.iptables.create_set', mock.Mock())
@mock.patch('treadmill.iptables.destroy_set', mock.Mock())
@mock.patch('treadmill.iptables.flush_set', mock.Mock())
@mock.patch('treadmill.iptables.ipset_restore', mock.Mock())
@mock.patch('treadmill.iptables.swap_set', mock.Mock())
def test_atomic_set(self):
"""Test atomic replacement of IPSet content.
"""
test_content = (x for x in ['a', 'b', 'c'])
iptables.atomic_set('target', test_content,
'some:type', foo='bar')
iptables.create_set.assert_called_with(
mock.ANY, set_type='some:type', foo='bar'
)
tmp_set = iptables.create_set.call_args[0][0]
iptables.ipset_restore.assert_called_with(
(
"add {tmp_set} a\n"
"add {tmp_set} b\n"
"add {tmp_set} c"
).format(tmp_set=tmp_set)
)
iptables.swap_set.assert_called_with(
'target', tmp_set
)
iptables.destroy_set.assert_called_with(
tmp_set
)
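test_atomic_set above verifies the classic temp-set-and-swap pattern for replacing an ipset's contents without a window where the set is empty: create a temporary set, fill it via restore, swap it with the target, then destroy the (now old) temporary set. A generic sketch of that pattern, with the callables standing in for the real ipset operations (names are illustrative):

```python
# Sketch of the atomic-replacement pattern exercised by test_atomic_set;
# create/restore/swap/destroy stand in for the real ipset calls.
import uuid


def atomic_replace(target, members, create, restore, swap, destroy):
    """Atomically replace target's members via a temporary set and a swap."""
    tmp = 'tmp-' + uuid.uuid4().hex
    create(tmp)
    # bulk-load the new members into the temporary set
    restore('\n'.join('add {} {}'.format(tmp, m) for m in members))
    # swap is atomic from the kernel's point of view
    swap(target, tmp)
    # the temporary name now holds the *old* contents; discard it
    destroy(tmp)
    return tmp
```

The swap step is what makes the replacement atomic; readers of `target` see either the old or the new membership, never a partially filled set.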
if __name__ == '__main__':
unittest.main()
import os
import random
import math
import warnings
import time
import numpy as np
import cv2
# This cv2 setting is needed since the transforms.Compose in torchvision
# doesn't work in multi-threaded mode when the cv2 is used. See here:
# https://github.com/pytorch/pytorch/issues/45198
# https://github.com/pytorch/pytorch/issues/15808
# https://github.com/pytorch/vision/issues/3756
cv2.setNumThreads(0)
import torch
import torch.nn.functional as F
from torch.utils.data import Dataset
from torchvision import transforms
from skimage.transform import rotate, resize
from skimage import io, color
from network import Network
import transforms as tr
class JellyfishDataset(Dataset):
"""Camera localization dataset for Jellyfish SLAM.
This is similar to `CamLocDataset` but uses train/test mapping in csv files
instead of symlinks (as has been done in the original implementation). Also
this is only for nodvi datasets, not suitable for any other formats.
"""
def __init__(self, map_file,
mode=1,
sparse=False,
augment=False,
aug_rotation=30,
aug_scale_min=0.66667,
aug_scale_max=1.5,
aug_contrast=0.1,
aug_brightness=0.1,
image_height=480):
'''Constructor.
Parameters:
map_file: CSV mapping file listing image paths, poses and calibration data (training or test).
mode:
0 = RGB only, load no initialization targets,
1 = RGB + ground truth scene coordinates, load or generate ground
truth scene coordinate targets
2 = RGB-D, load camera coordinates instead of scene coordinates
sparse: for mode = 1 (RGB+GT SC), load sparse initialization targets when True,
load dense depth maps and generate initialization targets when False
augment: Use random data augmentation, note: not supported for mode = 2 (RGB-D)
since pre-generated eye coordinates cannot be augmented
aug_rotation: Max 2D image rotation angle, sampled uniformly around 0, both directions
aug_scale_min: Lower limit of image scale factor for uniform sampling
aug_scale_max: Upper limit of image scale factor for uniform sampling
aug_contrast: Max relative scale factor for image contrast sampling,
e.g. 0.1 -> [0.9,1.1]
aug_brightness: Max relative scale factor for image brightness sampling,
e.g. 0.1 -> [0.9,1.1]
image_height: RGB images are rescaled to this maximum height
'''
self.init = (mode == 1)
self.sparse = sparse
self.eye = (mode == 2)
self.image_height = image_height
self.augment = augment
self.aug_rotation = aug_rotation
self.aug_scale_min = aug_scale_min
self.aug_scale_max = aug_scale_max
self.aug_contrast = aug_contrast
self.aug_brightness = aug_brightness
if self.eye and self.augment and \
(self.aug_rotation > 0 or self.aug_scale_min != 1 or self.aug_scale_max != 1):
print(
"WARNING: Check your augmentation settings. Camera coordinates will not be augmented.")
# read the mapping file
fp = open(map_file, 'r')
entries = [line.strip().split(',') for line in fp.readlines()]
fp.close()
# collect poses, timestamps, images and calibration data from `map_file`
print("Collecting {0:d} poses ...".format(len(entries)))
self.pose_data, Id = self.__get_poses__(entries)
print("{0:d} valid poses found.".format(len(Id)))
print("Collecting {0:d} timestamps ...".format(len(Id)))
self.timestamps = np.array(
[os.path.split(entries[i][0])[-1].split('.')[0] for i in Id])
print("Collecting {0:d} file paths ...".format(len(Id)))
self.rgb_files = np.array([entries[i][0] for i in Id])
print("Collecting {0:d} camera calibrations ...".format(len(Id)))
self.calibration_data = np.array(
[[float(v) for v in entries[i][8:-1]] for i in Id])
if len(self.rgb_files) != len(self.pose_data):
raise Exception('RGB file count does not match pose file count!')
if not sparse:
# create grid of 2D pixel positions when generating scene coordinates from depth
self.prediction_grid = np.zeros((2, math.ceil(5000 / Network.OUTPUT_SUBSAMPLE),
math.ceil(5000 / Network.OUTPUT_SUBSAMPLE)))
for x in range(0, self.prediction_grid.shape[2]):
for y in range(0, self.prediction_grid.shape[1]):
self.prediction_grid[0, y, x] = x * \
Network.OUTPUT_SUBSAMPLE
self.prediction_grid[1, y, x] = y * \
Network.OUTPUT_SUBSAMPLE
def __len__(self):
return len(self.rgb_files)
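The nested loop in `__init__` that fills `self.prediction_grid` can equivalently be written with `np.meshgrid`; a vectorised sketch (`make_prediction_grid` is an illustrative name, not part of the project), assuming the same ceil-division sizing:

```python
# Vectorised equivalent of the nested loop filling self.prediction_grid:
# grid[0, y, x] = x * subsample and grid[1, y, x] = y * subsample.
import numpy as np


def make_prediction_grid(max_size, subsample):
    """Grid of subsampled pixel positions, shape (2, n, n)."""
    n = -(-max_size // subsample)  # ceil division, as math.ceil above
    xs, ys = np.meshgrid(np.arange(n) * subsample, np.arange(n) * subsample)
    return np.stack([xs, ys]).astype(float)
```

With `np.meshgrid` in its default 'xy' indexing, `xs[y, x]` varies along x and `ys[y, x]` along y, matching the loop's assignment order.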
def __get_poses__(self, entries):
"""Get all the quarternions and translation values and return their corresponding
pose matrices.
Also return poses only when the translations are correct. Keep track of all the
indices with correct pose/translation.
"""
poses, valid_indices = [], []
for i, e in enumerate(entries):
extrinsics = [float(v) for v in e[1:8]]
# 0: q0 (qx), 1: q1 (qy), 2: q2 (qz), 3: q3 (qw),
# 4: x, 5: y, 6: z
q, p = np.array(extrinsics[0:4]), np.array(extrinsics[4:])
# compute pose with Rodrigues
pose = tr.compute_pose(p, q)
if pose is not None:
poses.append(pose)
valid_indices.append(i)
return np.array(poses), np.array(valid_indices)
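As a reference for what `tr.compute_pose` presumably produces from the (q, p) pairs above, here is a minimal quaternion-to-pose sketch using the standard (qx, qy, qz, qw) convention; the project's helper may differ in detail (e.g. going through a Rodrigues vector), so treat this as an illustration only:

```python
# Illustrative quaternion -> 4x4 pose conversion; quat_to_pose is a
# hypothetical name, not the project's tr.compute_pose.
import numpy as np


def quat_to_pose(t, q):
    """Pose from translation t = (x, y, z) and quaternion q = (qx, qy, qz, qw)."""
    qx, qy, qz, qw = q / np.linalg.norm(q)  # normalize defensively
    rot = np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)],
    ])
    pose = np.eye(4)
    pose[:3, :3] = rot
    pose[:3, 3] = t
    return pose
```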
def __getitem__(self, idx):
image = io.imread(self.rgb_files[idx])
if len(image.shape) < 3:
image = color.gray2rgb(image)
focal_length = float(self.calibration_data[idx][0])
# image will be normalized to standard height, adjust focal length as well
f_scale_factor = self.image_height / image.shape[0]
focal_length *= f_scale_factor
# pose = np.loadtxt(self.pose_files[idx])
pose = torch.from_numpy(self.pose_data[idx]).float()
# the Jellyfish dataset provides no scene coordinate targets
coords = 0
if self.augment:
scale_factor = random.uniform(
self.aug_scale_min, self.aug_scale_max)
# scale focal length
focal_length *= scale_factor
angle = random.uniform(-self.aug_rotation, self.aug_rotation)
# get the intrinsics and lens distortion
camera_intrinsics = self.calibration_data[idx][0:4]
distortion_coeffs = self.calibration_data[idx][4:]
# augment input image
pipeline = transforms.Compose([
transforms.Lambda(lambda img: tr.unfish(
img, camera_intrinsics, distortion_coeffs)),
transforms.Lambda(lambda img: tr.cambridgify(img)),
transforms.ToPILImage(),
transforms.Resize(int(self.image_height * scale_factor)),
transforms.Grayscale(),
transforms.ColorJitter(
brightness=self.aug_brightness,
contrast=self.aug_contrast),
transforms.ToTensor()
])
image = pipeline(image)
# rotate image
# image = tr.rotate(image, angle, 1, 'reflect')
#
# # rotate ground truth camera pose
# angle = angle * math.pi / 180
# pose_rot = torch.eye(4)
# pose_rot[0, 0] = math.cos(angle)
# pose_rot[0, 1] = -math.sin(angle)
# pose_rot[1, 0] = math.sin(angle)
# pose_rot[1, 1] = math.cos(angle)
# pose = torch.matmul(pose, pose_rot)
if self.init:
# note: JellyfishDataset never loads depth maps; this branch is kept
# for parity with CamLocDataset and would raise a NameError if dense
# targets (mode=1, sparse=False) were requested, since `depth` is
# undefined here
# rotate and scale depth maps
depth = resize(depth, image.shape[1:], order=0)
depth = rotate(depth, angle, order=0, mode='constant')
else:
pipeline = transforms.Compose([
transforms.ToPILImage(),
transforms.Resize(self.image_height),
transforms.Grayscale(),
transforms.ToTensor()
])
image = pipeline(image)
if self.init and not self.sparse:
# generate initialization targets from depth map
offsetX = int(Network.OUTPUT_SUBSAMPLE/2)
offsetY = int(Network.OUTPUT_SUBSAMPLE/2)
coords = torch.zeros((3,
math.ceil(
image.shape[1] / Network.OUTPUT_SUBSAMPLE),
math.ceil(image.shape[2] / Network.OUTPUT_SUBSAMPLE)))
# subsample to network output size
depth = depth[offsetY::Network.OUTPUT_SUBSAMPLE,
offsetX::Network.OUTPUT_SUBSAMPLE]
# construct x and y coordinates of camera coordinate
xy = self.prediction_grid[:,
:depth.shape[0], :depth.shape[1]].copy()
# add random pixel shift
xy[0] += offsetX
xy[1] += offsetY
# subtract principal point (assume image center)
xy[0] -= image.shape[2] / 2
xy[1] -= image.shape[1] / 2
# reproject
xy /= focal_length
xy[0] *= depth
xy[1] *= depth
# assemble camera coordinates tensor
eye = np.ndarray((4, depth.shape[0], depth.shape[1]))
eye[0:2] = xy
eye[2] = depth
eye[3] = 1
# eye to scene coordinates
sc = np.matmul(pose.numpy(), eye.reshape(4, -1))
sc = sc.reshape(4, depth.shape[0], depth.shape[1])
# mind pixels with invalid depth
sc[:, depth == 0] = 0
sc[:, depth > 1000] = 0
sc = torch.from_numpy(sc[0:3])
coords[:, :sc.shape[1], :sc.shape[2]] = sc
return image, pose, coords, focal_length, self.timestamps[idx], self.rgb_files[idx]
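The depth back-projection performed in `__getitem__` (shift by the principal point, divide by focal length, scale by depth) reduces, per pixel, to the pinhole inverse; a scalar sketch with an illustrative helper name:

```python
# Per-pixel form of the vectorised back-projection in __getitem__;
# backproject is an illustrative name, not a project function.
import numpy as np


def backproject(u, v, depth, focal_length, cx, cy):
    """Back-project pixel (u, v) with metric depth into camera coordinates."""
    x = (u - cx) / focal_length * depth
    y = (v - cy) / focal_length * depth
    return np.array([x, y, depth])
```

In the dataset code, (cx, cy) is assumed to be the image centre and the same operation is applied to the whole subsampled grid at once.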
class CamLocDatasetLite(Dataset):
"""Camera localization dataset for 7-scenes, 12-scenes and Cambrigde.
This is similar to `CamLocDataset` but uses train/test mapping in csv files
instead of symlinks (as has been done in the original implementation).
"""
def __init__(self, map_file,
mode=1,
sparse=False,
augment=False,
aug_rotation=30,
aug_scale_min=2/3,
aug_scale_max=3/2,
aug_contrast=0.1,
aug_brightness=0.1,
image_height=480):
'''Constructor.
Parameters:
map_file: CSV mapping file listing image paths, poses and calibration data (training or test).
mode:
0 = RGB only, load no initialization targets,
1 = RGB + ground truth scene coordinates, load or generate ground
truth scene coordinate targets
2 = RGB-D, load camera coordinates instead of scene coordinates
sparse: for mode = 1 (RGB+GT SC), load sparse initialization targets when True,
load dense depth maps and generate initialization targets when False
augment: Use random data augmentation, note: not supported for mode = 2 (RGB-D)
since pre-generated eye coordinates cannot be augmented
aug_rotation: Max 2D image rotation angle, sampled uniformly around 0, both directions
aug_scale_min: Lower limit of image scale factor for uniform sampling
aug_scale_max: Upper limit of image scale factor for uniform sampling
aug_contrast: Max relative scale factor for image contrast sampling,
e.g. 0.1 -> [0.9,1.1]
aug_brightness: Max relative scale factor for image brightness sampling,
e.g. 0.1 -> [0.9,1.1]
image_height: RGB images are rescaled to this maximum height
'''
self.init = (mode == 1)
self.sparse = sparse
self.eye = (mode == 2)
self.image_height = image_height
self.augment = augment
self.aug_rotation = aug_rotation
self.aug_scale_min = aug_scale_min
self.aug_scale_max = aug_scale_max
self.aug_contrast = aug_contrast
self.aug_brightness = aug_brightness
if self.eye and self.augment and \
(self.aug_rotation > 0 or self.aug_scale_min != 1 or self.aug_scale_max != 1):
print(
"WARNING: Check your augmentation settings. Camera coordinates will not be augmented.")
# read the mapping file
fp = open(map_file, 'r')
entries = [line.strip().split(',') for line in fp.readlines()]
fp.close()
self.rgb_files = [e[0] for e in entries]
self.pose_files = [e[1] for e in entries]
if self.sparse:
self.coord_files = [e[3] for e in entries]
elif self.eye:
self.coord_files = [e[4] for e in entries]
else:
self.coord_files = [e[2] for e in entries]
self.calibration_data = [e[-1] for e in entries]
if len(self.rgb_files) != len(self.pose_files):
raise Exception('RGB file count does not match pose file count!')
self.image_transform = transforms.Compose([
transforms.ToPILImage(),
transforms.Resize(self.image_height),
transforms.Grayscale(),
transforms.ToTensor(),
transforms.Normalize(
# statistics calculated over 7scenes training set, should generalize fairly well
mean=[0.4],
std=[0.25]
)
])
self.pose_transform = transforms.Compose([
transforms.ToTensor()
])
if not sparse:
# create grid of 2D pixel positions when generating scene coordinates from depth
self.prediction_grid = np.zeros((2, math.ceil(5000 / Network.OUTPUT_SUBSAMPLE),
math.ceil(5000 / Network.OUTPUT_SUBSAMPLE)))
for x in range(0, self.prediction_grid.shape[2]):
for y in range(0, self.prediction_grid.shape[1]):
self.prediction_grid[0, y, x] = x * \
Network.OUTPUT_SUBSAMPLE
self.prediction_grid[1, y, x] = y * \
Network.OUTPUT_SUBSAMPLE
def __len__(self):
return len(self.rgb_files)
def __getitem__(self, idx):
image = io.imread(self.rgb_files[idx])
if len(image.shape) < 3:
image = color.gray2rgb(image)
focal_length = float(self.calibration_data[idx])
# image will be normalized to standard height, adjust focal length as well
f_scale_factor = self.image_height / image.shape[0]
focal_length *= f_scale_factor
pose = np.loadtxt(self.pose_files[idx])
pose = torch.from_numpy(pose).float()
if self.init:
if self.sparse:
coords = torch.load(self.coord_files[idx])
else:
depth = io.imread(self.coord_files[idx])
depth = depth.astype(np.float64)
depth /= 1000 # from millimeters to meters
elif self.eye:
coords = torch.load(self.coord_files[idx])
else:
coords = 0
if self.augment:
scale_factor = random.uniform(
self.aug_scale_min, self.aug_scale_max)
angle = random.uniform(-self.aug_rotation, self.aug_rotation)
# augment input image
cur_image_transform = transforms.Compose([
transforms.ToPILImage(),
transforms.Resize(int(self.image_height * scale_factor)),
transforms.Grayscale(),
transforms.ColorJitter(
brightness=self.aug_brightness, contrast=self.aug_contrast),
transforms.ToTensor(),
transforms.Normalize(mean=[0.4], std=[0.25])
])
image = cur_image_transform(image)
# scale focal length
focal_length *= scale_factor
# rotate input image
def __rotate__(t, angle, order, mode='constant'):
# rotate input image
t = t.permute(1, 2, 0).numpy()
t = rotate(t, angle, order=order, mode=mode)
t = torch.from_numpy(t).permute(2, 0, 1).float()
return t
image = __rotate__(image, angle, 1, 'reflect')
if self.init:
if self.sparse:
# rotate and scale initialization targets
coords_w = math.ceil(image.size(
2) / Network.OUTPUT_SUBSAMPLE)
coords_h = math.ceil(image.size(
1) / Network.OUTPUT_SUBSAMPLE)
coords = F.interpolate(coords.unsqueeze(
0), size=(coords_h, coords_w))[0]
coords = __rotate__(coords, angle, 0)
else:
# rotate and scale depth maps
depth = resize(depth, image.shape[1:], order=0)
depth = rotate(depth, angle, order=0, mode='constant')
# rotate ground truth camera pose
angle = angle * math.pi / 180
pose_rot = torch.eye(4)
pose_rot[0, 0] = math.cos(angle)
pose_rot[0, 1] = -math.sin(angle)
pose_rot[1, 0] = math.sin(angle)
pose_rot[1, 1] = math.cos(angle)
pose = torch.matmul(pose, pose_rot)
else:
image = self.image_transform(image)
if self.init and not self.sparse:
# generate initialization targets from depth map
offsetX = int(Network.OUTPUT_SUBSAMPLE/2)
offsetY = int(Network.OUTPUT_SUBSAMPLE/2)
coords = torch.zeros((3,
math.ceil(
image.shape[1] / Network.OUTPUT_SUBSAMPLE),
math.ceil(image.shape[2] / Network.OUTPUT_SUBSAMPLE)))
# subsample to network output size
depth = depth[offsetY::Network.OUTPUT_SUBSAMPLE,
offsetX::Network.OUTPUT_SUBSAMPLE]
# construct x and y coordinates of camera coordinate
xy = self.prediction_grid[:,
:depth.shape[0], :depth.shape[1]].copy()
# add random pixel shift
xy[0] += offsetX
xy[1] += offsetY
# subtract principal point (assume image center)
xy[0] -= image.shape[2] / 2
xy[1] -= image.shape[1] / 2
# reproject
xy /= focal_length
xy[0] *= depth
xy[1] *= depth
# assemble camera coordinates tensor
eye = np.ndarray((4, depth.shape[0], depth.shape[1]))
eye[0:2] = xy
eye[2] = depth
eye[3] = 1
# eye to scene coordinates
sc = np.matmul(pose.numpy(), eye.reshape(4, -1))
sc = sc.reshape(4, depth.shape[0], depth.shape[1])
# mind pixels with invalid depth
sc[:, depth == 0] = 0
sc[:, depth > 1000] = 0
sc = torch.from_numpy(sc[0:3])
coords[:, :sc.shape[1], :sc.shape[2]] = sc
return image, pose, coords, focal_length, self.rgb_files[idx]
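When the image is rotated during augmentation, `__getitem__` post-multiplies the ground-truth pose by an in-plane (z-axis) rotation so pose and image stay consistent. The matrix it builds can be factored out as follows (illustrative helper, mirroring the `pose_rot` construction above):

```python
# In-plane rotation matrix matching the pose_rot built in __getitem__;
# pose_z_rotation is an illustrative name, not a project function.
import math

import numpy as np


def pose_z_rotation(angle_deg):
    """4x4 rotation about the camera z-axis by angle_deg degrees."""
    a = math.radians(angle_deg)
    rot = np.eye(4)
    rot[0, 0] = math.cos(a)
    rot[0, 1] = -math.sin(a)
    rot[1, 0] = math.sin(a)
    rot[1, 1] = math.cos(a)
    return rot
```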
class CamLocDataset(Dataset):
"""Camera localization dataset.
Access to image, calibration and ground truth data given a dataset directory.
"""
def __init__(self, root_dir,
mode=1,
sparse=False,
augment=False,
aug_rotation=30,
aug_scale_min=2/3,
aug_scale_max=3/2,
aug_contrast=0.1,
aug_brightness=0.1,
image_height=480):
'''Constructor.
Parameters:
root_dir: Folder of the data (training or test).
mode:
0 = RGB only, load no initialization targets,
1 = RGB + ground truth scene coordinates, load or generate ground
truth scene coordinate targets
2 = RGB-D, load camera coordinates instead of scene coordinates
sparse: for mode = 1 (RGB+GT SC), load sparse initialization targets when True,
load dense depth maps and generate initialization targets when False
augment: Use random data augmentation, note: not supported for mode = 2 (RGB-D)
since pre-generated eye coordinates cannot be augmented
aug_rotation: Max 2D image rotation angle, sampled uniformly around 0, both directions
aug_scale_min: Lower limit of image scale factor for uniform sampling
aug_scale_max: Upper limit of image scale factor for uniform sampling
aug_contrast: Max relative scale factor for image contrast sampling,
e.g. 0.1 -> [0.9,1.1]
aug_brightness: Max relative scale factor for image brightness sampling,
e.g. 0.1 -> [0.9,1.1]
image_height: RGB images are rescaled to this maximum height
'''
self.init = (mode == 1)
self.sparse = sparse
self.eye = (mode == 2)
self.image_height = image_height
self.augment = augment
self.aug_rotation = aug_rotation
self.aug_scale_min = aug_scale_min
self.aug_scale_max = aug_scale_max
self.aug_contrast = aug_contrast
self.aug_brightness = aug_brightness
if self.eye and self.augment and \
(self.aug_rotation > 0 or self.aug_scale_min != 1 or self.aug_scale_max != 1):
print(
"WARNING: Check your augmentation settings. Camera coordinates will not be augmented.")
rgb_dir = root_dir + '/rgb/'
pose_dir = root_dir + '/poses/'
calibration_dir = root_dir + '/calibration/'
if self.eye:
coord_dir = root_dir + '/eye/'
elif self.sparse:
coord_dir = root_dir + '/init/'
else:
coord_dir = root_dir + '/depth/'
self.rgb_files = os.listdir(rgb_dir)
self.rgb_files = [rgb_dir + f for f in self.rgb_files]
self.rgb_files.sort()
self.image_transform = transforms.Compose([
transforms.ToPILImage(),
transforms.Resize(self.image_height),
transforms.Grayscale(),
transforms.ToTensor(),
transforms.Normalize(
# statistics calculated over 7scenes training set, should generalize fairly well
mean=[0.4],
std=[0.25]
)
])
self.pose_files = os.listdir(pose_dir)
self.pose_files = [pose_dir + f for f in self.pose_files]
self.pose_files.sort()
self.pose_transform = transforms.Compose([
transforms.ToTensor()
])
self.calibration_files = os.listdir(calibration_dir)
self.calibration_files = [calibration_dir +
f for f in self.calibration_files]
self.calibration_files.sort()
if self.init or self.eye:
self.coord_files = os.listdir(coord_dir)
self.coord_files = [coord_dir + f for f in self.coord_files]
self.coord_files.sort()
if len(self.rgb_files) != len(self.pose_files):
raise Exception('RGB file count does not match pose file count!')
if not sparse:
# create grid of 2D pixel positions when generating scene coordinates from depth
self.prediction_grid = np.zeros((2, math.ceil(5000 / Network.OUTPUT_SUBSAMPLE),
math.ceil(5000 / Network.OUTPUT_SUBSAMPLE)))
for x in range(0, self.prediction_grid.shape[2]):
for y in range(0, self.prediction_grid.shape[1]):
self.prediction_grid[0, y, x] = x * \
Network.OUTPUT_SUBSAMPLE
self.prediction_grid[1, y, x] = y * \
Network.OUTPUT_SUBSAMPLE
def __len__(self):
return len(self.rgb_files)
def __getitem__(self, idx):
image = io.imread(self.rgb_files[idx])
if len(image.shape) < 3:
image = color.gray2rgb(image)
focal_length = float(np.loadtxt(self.calibration_files[idx]))
# image will be normalized to standard height, adjust focal length as well
f_scale_factor = self.image_height / image.shape[0]
focal_length *= f_scale_factor
pose = np.loadtxt(self.pose_files[idx])
pose = torch.from_numpy(pose).float()
if self.init:
if self.sparse:
coords = torch.load(self.coord_files[idx])
else:
depth = io.imread(self.coord_files[idx])
depth = depth.astype(np.float64)
depth /= 1000 # from millimeters to meters
elif self.eye:
coords = torch.load(self.coord_files[idx])
else:
coords = 0
if self.augment:
scale_factor = random.uniform(
self.aug_scale_min, self.aug_scale_max)
angle = random.uniform(-self.aug_rotation, self.aug_rotation)
# augment input image
cur_image_transform = transforms.Compose([
transforms.ToPILImage(),
transforms.Resize(int(self.image_height * scale_factor)),
transforms.Grayscale(),
transforms.ColorJitter(
brightness=self.aug_brightness, contrast=self.aug_contrast),
transforms.ToTensor(),
transforms.Normalize(mean=[0.4], std=[0.25])
])
image = cur_image_transform(image)
# scale focal length
focal_length *= scale_factor
# rotate input image
def my_rot(t, angle, order, mode='constant'):
t = t.permute(1, 2, 0).numpy()
t = rotate(t, angle, order=order, mode=mode)
t = torch.from_numpy(t).permute(2, 0, 1).float()
return t
image = my_rot(image, angle, 1, 'reflect')
if self.init:
if self.sparse:
# rotate and scale initialization targets
coords_w = math.ceil(image.size(
2) / Network.OUTPUT_SUBSAMPLE)
coords_h = math.ceil(image.size(
1) / Network.OUTPUT_SUBSAMPLE)
coords = F.interpolate(coords.unsqueeze(
0), size=(coords_h, coords_w))[0]
coords = my_rot(coords, angle, 0)
else:
# rotate and scale depth maps
depth = resize(depth, image.shape[1:], order=0)
depth = rotate(depth, angle, order=0, mode='constant')
# rotate ground truth camera pose
angle = angle * math.pi / 180
pose_rot = torch.eye(4)
pose_rot[0, 0] = math.cos(angle)
pose_rot[0, 1] = -math.sin(angle)
pose_rot[1, 0] = math.sin(angle)
pose_rot[1, 1] = math.cos(angle)
pose = torch.matmul(pose, pose_rot)
else:
image = self.image_transform(image)
if self.init and not self.sparse:
# generate initialization targets from depth map
offsetX = int(Network.OUTPUT_SUBSAMPLE/2)
offsetY = int(Network.OUTPUT_SUBSAMPLE/2)
coords = torch.zeros((3,
math.ceil(
image.shape[1] / Network.OUTPUT_SUBSAMPLE),
math.ceil(image.shape[2] / Network.OUTPUT_SUBSAMPLE)))
# subsample to network output size
depth = depth[offsetY::Network.OUTPUT_SUBSAMPLE,
offsetX::Network.OUTPUT_SUBSAMPLE]
# construct x and y coordinates of camera coordinate
xy = self.prediction_grid[:,
:depth.shape[0], :depth.shape[1]].copy()
# shift to the center of each output cell
xy[0] += offsetX
xy[1] += offsetY
# subtract principal point (assume image center)
xy[0] -= image.shape[2] / 2
xy[1] -= image.shape[1] / 2
# reproject
xy /= focal_length
xy[0] *= depth
xy[1] *= depth
# assemble camera coordinate tensor
eye = np.ndarray((4, depth.shape[0], depth.shape[1]))
eye[0:2] = xy
eye[2] = depth
eye[3] = 1
# eye to scene coordinates
sc = np.matmul(pose.numpy(), eye.reshape(4, -1))
sc = sc.reshape(4, depth.shape[0], depth.shape[1])
# mask out pixels with invalid or implausible depth
sc[:, depth == 0] = 0
sc[:, depth > 1000] = 0
sc = torch.from_numpy(sc[0:3])
coords[:, :sc.shape[1], :sc.shape[2]] = sc
return image, pose, coords, focal_length, self.rgb_files[idx]
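The loader's depth-to-scene-coordinate step above boils down to a pinhole back-projection followed by a rigid transform. Here is a minimal NumPy sketch of the same math; the function name and shapes are illustrative, not part of the dataset code:

```python
import numpy as np

def depth_to_scene_coords(depth, focal_length, pose):
    # Back-project a metric depth map to homogeneous camera coordinates,
    # assuming the principal point sits at the image center, then map the
    # points to scene (world) space with a 4x4 camera-to-world pose.
    h, w = depth.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    x = (xs - w / 2) / focal_length * depth
    y = (ys - h / 2) / focal_length * depth
    eye = np.stack([x, y, depth, np.ones_like(depth)])  # (4, h, w)
    sc = pose @ eye.reshape(4, -1)                      # rigid transform
    sc = sc.reshape(4, h, w)
    sc[:, depth == 0] = 0                               # mask invalid depth
    return sc[:3]
```

With an identity pose the scene coordinates equal the camera coordinates, which makes a convenient sanity check.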
# --- dataplaybook/__main__.py (kellerza/data-playbook, Apache-2.0) ---
"""Dataplaybook script."""
from dataplaybook.main import run_playbooks
def main():
"""Execute a playbook."""
run_playbooks(dataplaybook_cmd=True)
# --- src/genie/libs/parser/iosxr/tests/ShowMplsLabelTableDetail/cli/equal/golden_output_1_expected.py (balmasea/genieparser, Apache-2.0) ---
expected_output = {
'table': {
0: {
'label': {
0: {
'owner': {
'LSD(A)': {
'state': 'InUse',
'rewrite': 'Yes'
},
},
},
1: {
'owner': {
'LSD(A)': {
'state': 'InUse',
'rewrite': 'Yes'
},
},
},
2: {
'owner': {
'LSD(A)': {
'state': 'InUse',
'rewrite': 'Yes'
},
},
},
13: {
'owner': {
'LSD(A)': {
'state': 'InUse',
'rewrite': 'Yes'
},
},
},
16000: {
'owner': {
'ISIS(A):SR': {
'state': 'InUse',
'rewrite': 'No'
},
},
'label_type': {
'Lbl-blk SRGB': {
'vers': 0,
'start_label': 16000,
'size': 8000
},
},
},
24000: {
'owner': {
'ISIS(A):SR': {
'state': 'InUse',
'rewrite': 'Yes'
},
},
'label_type': {
'SR Adj Segment IPv4': {
'vers': 0,
'index': 0,
'type': 0,
'interface': 'Gi0/0/0/1',
'nh': '10.1.2.2'
},
},
},
24001: {
'owner': {
'ISIS(A):SR': {
'state': 'InUse',
'rewrite': 'Yes'
},
},
'label_type': {
'SR Adj Segment IPv4': {
'vers': 0,
'index': 2,
'type': 0,
'interface': 'Gi0/0/0/1',
'nh': '10.1.2.2'
},
},
},
24002: {
'owner': {
'ISIS(A):SR': {
'state': 'InUse',
'rewrite': 'Yes'
},
},
'label_type': {
'SR Adj Segment IPv4': {
'vers': 0,
'index': 1,
'type': 0,
'interface': 'Gi0/0/0/1',
'nh': '10.1.2.2'
},
},
},
24003: {
'owner': {
'ISIS(A):SR': {
'state': 'InUse',
'rewrite': 'Yes'
}
},
'label_type': {
'SR Adj Segment IPv4': {
'vers': 0,
'index': 3,
'type': 0,
'interface': 'Gi0/0/0/1',
'nh': '10.1.2.2'
},
},
},
},
},
},
}
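The golden dictionary above is nested table -> label -> owner -> state. A small helper showing how such a structure can be queried; the function is illustrative, not part of the test fixture:

```python
def labels_in_use(parsed):
    # Walk a parsed MPLS label table shaped like expected_output above
    # and collect the label ids that any owner reports as 'InUse'.
    found = []
    for tbl in parsed.get('table', {}).values():
        for label_id, entry in tbl.get('label', {}).items():
            owners = entry.get('owner', {}).values()
            if any(o.get('state') == 'InUse' for o in owners):
                found.append(label_id)
    return sorted(found)
```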
# --- kornia/morphology/open_close.py (NathanHowell/kornia, ECL-2.0 / Apache-2.0) ---
import torch
import torch.nn as nn
import torch.nn.functional as F
from kornia.morphology.basic_operators import dilation, erosion
# open
def open(tensor: torch.Tensor, kernel: torch.Tensor) -> torch.Tensor:
r"""Returns the opened image, (that means, dilation after an erosion) applying the same kernel in each channel.
The kernel must have 2 dimensions, each one defined by an odd number.
Args:
tensor (torch.Tensor): Image with shape :math:`(B, C, H, W)`.
kernel (torch.Tensor): Structuring element with shape :math:`(H, W)`.
Returns:
torch.Tensor: Opened image with shape :math:`(B, C, H, W)`.
Example:
>>> tensor = torch.rand(1, 3, 5, 5)
>>> kernel = torch.ones(3, 3)
>>> opened_img = open(tensor, kernel)
"""
if not isinstance(tensor, torch.Tensor):
raise TypeError("Input type is not a torch.Tensor. Got {}".format(
type(tensor)))
if len(tensor.shape) != 4:
raise ValueError("Input size must have 4 dimensions. Got {}".format(
tensor.dim()))
if not isinstance(kernel, torch.Tensor):
raise TypeError("Kernel type is not a torch.Tensor. Got {}".format(
type(kernel)))
if len(kernel.shape) != 2:
raise ValueError("Kernel size must have 2 dimensions. Got {}".format(
kernel.dim()))
return dilation(erosion(tensor, kernel), kernel)
# close
def close(tensor: torch.Tensor, kernel: torch.Tensor) -> torch.Tensor:
r"""Returns the closed image, (that means, erosion after a dilation) applying the same kernel in each channel.
The kernel must have 2 dimensions, each one defined by an odd number.
Args:
tensor (torch.Tensor): Image with shape :math:`(B, C, H, W)`.
kernel (torch.Tensor): Structuring element with shape :math:`(H, W)`.
Returns:
torch.Tensor: Closed image with shape :math:`(B, C, H, W)`.
Example:
>>> tensor = torch.rand(1, 3, 5, 5)
>>> kernel = torch.ones(3, 3)
>>> closed_img = close(tensor, kernel)
"""
if not isinstance(tensor, torch.Tensor):
raise TypeError("Input type is not a torch.Tensor. Got {}".format(
type(tensor)))
if len(tensor.shape) != 4:
raise ValueError("Input size must have 4 dimensions. Got {}".format(
tensor.dim()))
if not isinstance(kernel, torch.Tensor):
raise TypeError("Kernel type is not a torch.Tensor. Got {}".format(
type(kernel)))
if len(kernel.shape) != 2:
raise ValueError("Kernel size must have 2 dimensions. Got {}".format(
kernel.dim()))
return erosion(dilation(tensor, kernel), kernel)
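For intuition, here is a pure-NumPy sketch of the same open/close compositions on a single-channel image. This is a naive per-pixel loop, not kornia's batched implementation; the function names are illustrative:

```python
import numpy as np

def erode(img, kernel):
    # Flat (binary) erosion: each output pixel is the minimum of its
    # neighborhood under the structuring element, with zero padding.
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + kh, j:j + kw][kernel > 0].min()
    return out

def dilate(img, kernel):
    # Flat dilation: maximum over the neighborhood instead of minimum.
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + kh, j:j + kw][kernel > 0].max()
    return out

def opening(img, kernel):
    return dilate(erode(img, kernel), kernel)   # removes small specks

def closing(img, kernel):
    return erode(dilate(img, kernel), kernel)   # fills small holes
```

Opening with a 3x3 kernel deletes isolated pixels but leaves a solid 3x3 block intact.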
# --- labtronyx/gui/wx_views/__init__.py (protonyx/labtronyx, MIT) ---
try:
from .wx_base import *
from .wx_interfaces import *
from .wx_resources import *
from .wx_scripts import *
from .wx_manager import *
from .wx_main import *
except ImportError:
pass
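The star-import block above is wrapped in try/except so the package degrades gracefully when its GUI dependency is missing. The same guard pattern in a self-contained form; json stands in for the optional dependency and the names are illustrative:

```python
try:
    import json as _optional  # stand-in for an optional extra dependency
    HAS_OPTIONAL = True
except ImportError:  # dependency absent: disable the feature, don't crash
    _optional = None
    HAS_OPTIONAL = False

def encode(obj):
    # Fail loudly only when the optional feature is actually used.
    if not HAS_OPTIONAL:
        raise RuntimeError("optional dependency not installed")
    return _optional.dumps(obj)
```

Deferring the failure to the call site lets unrelated parts of the package keep working without the extra installed.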
# --- examples/sitl_client/democontroller1.py (GaloisInc/planning-synthesis, BSD-2-Clause) ---
class ExampleCtrl(object):
"""Mealy transducer.
Internal states are integers, the current state
is stored in the attribute "state".
To take a transition, call method "move".
The names of input variables are stored in the
attribute "input_vars".
Automatically generated by tulip.dumpsmach on 2015-08-13 05:18:57 UTC
To learn more about TuLiP, visit http://tulip-control.org
"""
def __init__(self):
self.state = 52
self.input_vars = ['env2']
def move(self, env2):
"""Given inputs, take move and return outputs.
@rtype: dict
@return: dictionary with keys of the output variable names:
['loc', 'stage']
"""
output = dict()
if self.state == 0:
if (env2 == 0):
self.state = 0
output["loc"] = 16
output["stage"] = 1
elif (env2 == 1):
self.state = 33
output["loc"] = 17
output["stage"] = 1
elif (env2 == 2):
self.state = 34
output["loc"] = 17
output["stage"] = 1
else:
self._error(env2)
elif self.state == 1:
if (env2 == 0):
self.state = 0
output["loc"] = 16
output["stage"] = 1
elif (env2 == 1):
self.state = 33
output["loc"] = 17
output["stage"] = 1
elif (env2 == 2):
self.state = 34
output["loc"] = 17
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 2:
if (env2 == 0):
self.state = 0
output["loc"] = 16
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 2):
self.state = 34
output["loc"] = 17
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 1):
self.state = 33
output["loc"] = 17
output["stage"] = 1
else:
self._error(env2)
elif self.state == 3:
if (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 2):
self.state = 2
output["loc"] = 16
output["stage"] = 1
elif (env2 == 1):
self.state = 1
output["loc"] = 16
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 4:
if (env2 == 6):
self.state = 40
output["loc"] = 16
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 2):
self.state = 2
output["loc"] = 16
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 5:
if (env2 == 6):
self.state = 40
output["loc"] = 16
output["stage"] = 1
elif (env2 == 7):
self.state = 5
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 6:
if (env2 == 0):
self.state = 8
output["loc"] = 20
output["stage"] = 1
elif (env2 == 1):
self.state = 18
output["loc"] = 21
output["stage"] = 1
elif (env2 == 2):
self.state = 19
output["loc"] = 21
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 7:
if (env2 == 0):
self.state = 8
output["loc"] = 20
output["stage"] = 1
elif (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 1):
self.state = 18
output["loc"] = 21
output["stage"] = 1
elif (env2 == 2):
self.state = 19
output["loc"] = 21
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 8:
if (env2 == 0):
self.state = 8
output["loc"] = 20
output["stage"] = 1
elif (env2 == 1):
self.state = 18
output["loc"] = 21
output["stage"] = 1
elif (env2 == 2):
self.state = 19
output["loc"] = 21
output["stage"] = 1
else:
self._error(env2)
elif self.state == 9:
if (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 7):
self.state = 20
output["loc"] = 20
output["stage"] = 1
elif (env2 == 5):
self.state = 22
output["loc"] = 20
output["stage"] = 1
elif (env2 == 6):
self.state = 23
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 10:
if (env2 == 7):
self.state = 10
output["loc"] = 20
output["stage"] = 2
elif (env2 == 6):
self.state = 11
output["loc"] = 20
output["stage"] = 2
elif (env2 == 5):
self.state = 12
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 11:
if (env2 == 7):
self.state = 10
output["loc"] = 20
output["stage"] = 2
elif (env2 == 6):
self.state = 11
output["loc"] = 20
output["stage"] = 2
elif (env2 == 5):
self.state = 12
output["loc"] = 20
output["stage"] = 2
elif (env2 == 4):
self.state = 13
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 12:
if (env2 == 7):
self.state = 10
output["loc"] = 20
output["stage"] = 2
elif (env2 == 6):
self.state = 11
output["loc"] = 20
output["stage"] = 2
elif (env2 == 5):
self.state = 12
output["loc"] = 20
output["stage"] = 2
elif (env2 == 4):
self.state = 13
output["loc"] = 20
output["stage"] = 2
elif (env2 == 3):
self.state = 14
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 13:
if (env2 == 2):
self.state = 16
output["loc"] = 20
output["stage"] = 2
elif (env2 == 6):
self.state = 11
output["loc"] = 20
output["stage"] = 2
elif (env2 == 5):
self.state = 12
output["loc"] = 20
output["stage"] = 2
elif (env2 == 4):
self.state = 13
output["loc"] = 20
output["stage"] = 2
elif (env2 == 3):
self.state = 14
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 14:
if (env2 == 2):
self.state = 16
output["loc"] = 20
output["stage"] = 2
elif (env2 == 5):
self.state = 12
output["loc"] = 20
output["stage"] = 2
elif (env2 == 4):
self.state = 13
output["loc"] = 20
output["stage"] = 2
elif (env2 == 3):
self.state = 14
output["loc"] = 20
output["stage"] = 2
elif (env2 == 1):
self.state = 15
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 15:
if (env2 == 2):
self.state = 16
output["loc"] = 20
output["stage"] = 2
elif (env2 == 0):
self.state = 17
output["loc"] = 20
output["stage"] = 2
elif (env2 == 3):
self.state = 14
output["loc"] = 20
output["stage"] = 2
elif (env2 == 1):
self.state = 15
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 16:
if (env2 == 2):
self.state = 16
output["loc"] = 20
output["stage"] = 2
elif (env2 == 0):
self.state = 17
output["loc"] = 20
output["stage"] = 2
elif (env2 == 4):
self.state = 13
output["loc"] = 20
output["stage"] = 2
elif (env2 == 3):
self.state = 14
output["loc"] = 20
output["stage"] = 2
elif (env2 == 1):
self.state = 15
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 17:
if (env2 == 2):
self.state = 16
output["loc"] = 20
output["stage"] = 2
elif (env2 == 0):
self.state = 17
output["loc"] = 20
output["stage"] = 2
elif (env2 == 1):
self.state = 15
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 18:
if (env2 == 2):
self.state = 16
output["loc"] = 20
output["stage"] = 2
elif (env2 == 0):
self.state = 17
output["loc"] = 20
output["stage"] = 2
elif (env2 == 3):
self.state = 14
output["loc"] = 20
output["stage"] = 2
elif (env2 == 1):
self.state = 15
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 19:
if (env2 == 2):
self.state = 16
output["loc"] = 20
output["stage"] = 2
elif (env2 == 0):
self.state = 17
output["loc"] = 20
output["stage"] = 2
elif (env2 == 4):
self.state = 13
output["loc"] = 20
output["stage"] = 2
elif (env2 == 3):
self.state = 14
output["loc"] = 20
output["stage"] = 2
elif (env2 == 1):
self.state = 15
output["loc"] = 20
output["stage"] = 2
else:
self._error(env2)
elif self.state == 20:
if (env2 == 7):
self.state = 20
output["loc"] = 20
output["stage"] = 1
elif (env2 == 5):
self.state = 22
output["loc"] = 20
output["stage"] = 1
elif (env2 == 6):
self.state = 23
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 21:
if (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 1):
self.state = 18
output["loc"] = 21
output["stage"] = 1
elif (env2 == 2):
self.state = 19
output["loc"] = 21
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
elif (env2 == 5):
self.state = 22
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 22:
if (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 7):
self.state = 20
output["loc"] = 20
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
elif (env2 == 5):
self.state = 22
output["loc"] = 20
output["stage"] = 1
elif (env2 == 6):
self.state = 23
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 23:
if (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 7):
self.state = 20
output["loc"] = 20
output["stage"] = 1
elif (env2 == 5):
self.state = 22
output["loc"] = 20
output["stage"] = 1
elif (env2 == 6):
self.state = 23
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 24:
if (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 2):
self.state = 19
output["loc"] = 21
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
elif (env2 == 5):
self.state = 22
output["loc"] = 20
output["stage"] = 1
elif (env2 == 6):
self.state = 23
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 25:
if (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 7):
self.state = 20
output["loc"] = 20
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
elif (env2 == 5):
self.state = 22
output["loc"] = 20
output["stage"] = 1
elif (env2 == 6):
self.state = 23
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 26:
if (env2 == 6):
self.state = 9
output["loc"] = 19
output["stage"] = 1
elif (env2 == 4):
self.state = 26
output["loc"] = 18
output["stage"] = 1
elif (env2 == 3):
self.state = 27
output["loc"] = 18
output["stage"] = 1
elif (env2 == 2):
self.state = 29
output["loc"] = 19
output["stage"] = 1
elif (env2 == 5):
self.state = 25
output["loc"] = 19
output["stage"] = 1
else:
self._error(env2)
elif self.state == 27:
if (env2 == 5):
self.state = 25
output["loc"] = 19
output["stage"] = 1
elif (env2 == 4):
self.state = 26
output["loc"] = 18
output["stage"] = 1
elif (env2 == 3):
self.state = 27
output["loc"] = 18
output["stage"] = 1
elif (env2 == 1):
self.state = 28
output["loc"] = 19
output["stage"] = 1
elif (env2 == 2):
self.state = 29
output["loc"] = 19
output["stage"] = 1
else:
self._error(env2)
elif self.state == 28:
if (env2 == 0):
self.state = 8
output["loc"] = 20
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
elif (env2 == 1):
self.state = 6
output["loc"] = 20
output["stage"] = 1
elif (env2 == 2):
self.state = 7
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 29:
if (env2 == 0):
self.state = 8
output["loc"] = 20
output["stage"] = 1
elif (env2 == 4):
self.state = 24
output["loc"] = 20
output["stage"] = 1
elif (env2 == 3):
self.state = 21
output["loc"] = 20
output["stage"] = 1
elif (env2 == 1):
self.state = 6
output["loc"] = 20
output["stage"] = 1
elif (env2 == 2):
self.state = 7
output["loc"] = 20
output["stage"] = 1
else:
self._error(env2)
elif self.state == 30:
if (env2 == 0):
self.state = 32
output["loc"] = 18
output["stage"] = 1
elif (env2 == 3):
self.state = 27
output["loc"] = 18
output["stage"] = 1
elif (env2 == 1):
self.state = 28
output["loc"] = 19
output["stage"] = 1
elif (env2 == 2):
self.state = 29
output["loc"] = 19
output["stage"] = 1
else:
self._error(env2)
elif self.state == 31:
if (env2 == 0):
self.state = 32
output["loc"] = 18
output["stage"] = 1
elif (env2 == 4):
self.state = 26
output["loc"] = 18
output["stage"] = 1
elif (env2 == 3):
self.state = 27
output["loc"] = 18
output["stage"] = 1
elif (env2 == 1):
self.state = 28
output["loc"] = 19
output["stage"] = 1
elif (env2 == 2):
self.state = 29
output["loc"] = 19
output["stage"] = 1
else:
self._error(env2)
elif self.state == 32:
if (env2 == 0):
self.state = 32
output["loc"] = 18
output["stage"] = 1
elif (env2 == 1):
self.state = 28
output["loc"] = 19
output["stage"] = 1
elif (env2 == 2):
self.state = 29
output["loc"] = 19
output["stage"] = 1
else:
self._error(env2)
elif self.state == 33:
if (env2 == 0):
self.state = 32
output["loc"] = 18
output["stage"] = 1
elif (env2 == 1):
self.state = 30
output["loc"] = 18
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 2):
self.state = 31
output["loc"] = 18
output["stage"] = 1
else:
self._error(env2)
elif self.state == 34:
if (env2 == 0):
self.state = 32
output["loc"] = 18
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 1):
self.state = 30
output["loc"] = 18
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 2):
self.state = 31
output["loc"] = 18
output["stage"] = 1
else:
self._error(env2)
elif self.state == 35:
if (env2 == 6):
self.state = 40
output["loc"] = 16
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 7):
self.state = 37
output["loc"] = 0
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 36:
if (env2 == 6):
self.state = 40
output["loc"] = 16
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 7):
self.state = 37
output["loc"] = 0
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 37:
if (env2 == 5):
self.state = 35
output["loc"] = 8
output["stage"] = 1
elif (env2 == 6):
self.state = 36
output["loc"] = 8
output["stage"] = 1
elif (env2 == 7):
self.state = 37
output["loc"] = 0
output["stage"] = 1
else:
self._error(env2)
elif self.state == 38:
if (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 2):
self.state = 34
output["loc"] = 17
output["stage"] = 1
elif (env2 == 1):
self.state = 33
output["loc"] = 17
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 39:
if (env2 == 6):
self.state = 40
output["loc"] = 16
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 7):
self.state = 5
output["loc"] = 16
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 40:
if (env2 == 6):
self.state = 40
output["loc"] = 16
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 7):
self.state = 5
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
else:
self._error(env2)
elif self.state == 41:
if (env2 == 6):
self.state = 40
output["loc"] = 16
output["stage"] = 1
elif (env2 == 4):
self.state = 41
output["loc"] = 16
output["stage"] = 1
elif (env2 == 2):
self.state = 34
output["loc"] = 17
output["stage"] = 1
elif (env2 == 3):
self.state = 38
output["loc"] = 16
output["stage"] = 1
elif (env2 == 5):
self.state = 39
output["loc"] = 16
output["stage"] = 1
            else:
                self._error(env2)
        elif self.state == 42:
            if (env2 == 6):
                self.state = 40
                output["loc"] = 16
                output["stage"] = 1
            elif (env2 == 4):
                self.state = 41
                output["loc"] = 16
                output["stage"] = 1
            elif (env2 == 7):
                self.state = 37
                output["loc"] = 0
                output["stage"] = 1
            elif (env2 == 3):
                self.state = 38
                output["loc"] = 16
                output["stage"] = 1
            elif (env2 == 5):
                self.state = 39
                output["loc"] = 16
                output["stage"] = 1
            else:
                self._error(env2)
        elif self.state == 43:
            if (env2 == 6):
                self.state = 40
                output["loc"] = 16
                output["stage"] = 1
            elif (env2 == 4):
                self.state = 41
                output["loc"] = 16
                output["stage"] = 1
            elif (env2 == 7):
                self.state = 37
                output["loc"] = 0
                output["stage"] = 1
            elif (env2 == 5):
                self.state = 39
                output["loc"] = 16
                output["stage"] = 1
            else:
                self._error(env2)
        elif self.state == 44:
            if (env2 == 1):
                self.state = 48
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 0):
                self.state = 44
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 2):
                self.state = 46
                output["loc"] = 0
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 45:
            if (env2 == 6):
                self.state = 43
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 5):
                self.state = 42
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 3):
                self.state = 3
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 4):
                self.state = 4
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 2):
                self.state = 46
                output["loc"] = 0
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 46:
            if (env2 == 1):
                self.state = 48
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 0):
                self.state = 44
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 3):
                self.state = 3
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 4):
                self.state = 4
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 2):
                self.state = 46
                output["loc"] = 0
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 47:
            if (env2 == 6):
                self.state = 43
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 5):
                self.state = 42
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 7):
                self.state = 51
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 4):
                self.state = 4
                output["loc"] = 8
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 48:
            if (env2 == 1):
                self.state = 48
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 3):
                self.state = 3
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 0):
                self.state = 44
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 2):
                self.state = 46
                output["loc"] = 0
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 49:
            if (env2 == 7):
                self.state = 51
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 5):
                self.state = 42
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 3):
                self.state = 3
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 4):
                self.state = 4
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 6):
                self.state = 43
                output["loc"] = 8
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 50:
            if (env2 == 1):
                self.state = 48
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 5):
                self.state = 42
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 3):
                self.state = 3
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 4):
                self.state = 4
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 2):
                self.state = 46
                output["loc"] = 0
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 51:
            if (env2 == 6):
                self.state = 43
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 5):
                self.state = 42
                output["loc"] = 8
                output["stage"] = 0
            elif (env2 == 7):
                self.state = 51
                output["loc"] = 0
                output["stage"] = 0
            else:
                self._error(env2)
        elif self.state == 52:
            if (env2 == 0):
                self.state = 44
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 4):
                self.state = 45
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 2):
                self.state = 46
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 6):
                self.state = 47
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 1):
                self.state = 48
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 5):
                self.state = 49
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 3):
                self.state = 50
                output["loc"] = 0
                output["stage"] = 0
            elif (env2 == 7):
                self.state = 51
                output["loc"] = 0
                output["stage"] = 0
            else:
                self._error(env2)
        else:
            raise Exception("Unrecognized internal state: " + str(self.state))
        return output

    def _error(self, env2):
        raise ValueError("Unrecognized input: " + (
            "env2 = {env2}; ").format(
                env2=env2))
| 26.821797 | 78 | 0.334203 | 3,340 | 36,424 | 3.626647 | 0.038024 | 0.213985 | 0.146619 | 0.150582 | 0.940973 | 0.93767 | 0.93767 | 0.93767 | 0.928754 | 0.928754 | 0 | 0.103541 | 0.542609 | 36,424 | 1,357 | 79 | 26.841562 | 0.623529 | 0.013288 | 0 | 0.933877 | 0 | 0 | 0.053872 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.002717 | false | 0 | 0 | 0 | 0.004529 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
b25dcfcd42b3078c6314d3f144cc067792c9509a | 929 | py | Python | tests/test_resource_urls.py | calind/zipa | d2f3572454456aafe952a911b4881f4862b3730c | [
"Apache-2.0"
] | null | null | null | tests/test_resource_urls.py | calind/zipa | d2f3572454456aafe952a911b4881f4862b3730c | [
"Apache-2.0"
] | null | null | null | tests/test_resource_urls.py | calind/zipa | d2f3572454456aafe952a911b4881f4862b3730c | [
"Apache-2.0"
] | null | null | null | class TestResourceUrls:
    def test_naked_domain(self):
        from zipa import api_test_com as t
        assert t._get_url() == 'https://api.test.com/'

    def test_with_version_in_url(self):
        from zipa import api_test_com__v1 as t
        assert t._get_url() == 'https://api.test.com/v1/'

    def test_with_deep_path(self):
        from zipa import test_com__api_v1 as t
        assert t._get_url() == 'https://test.com/api/v1/'

    def test_simple_method_with_naked_domain(self):
        from zipa import api_test_com as t
        assert t.res._get_url() == 'https://api.test.com/res'

    def test_simple_method_with_version_in_url(self):
        from zipa import api_test_com__v1 as t
        assert t.res._get_url() == 'https://api.test.com/v1/res'

    def test_simple_method_with_deep_path(self):
        from zipa import test_com__api_v1 as t
        assert t.res._get_url() == 'https://test.com/api/v1/res'
| 37.16 | 64 | 0.673843 | 155 | 929 | 3.670968 | 0.174194 | 0.147627 | 0.140598 | 0.189807 | 0.920914 | 0.880492 | 0.803163 | 0.753954 | 0.734622 | 0.710018 | 0 | 0.010959 | 0.214209 | 929 | 24 | 65 | 38.708333 | 0.768493 | 0 | 0 | 0.315789 | 0 | 0 | 0.158235 | 0 | 0 | 0 | 0 | 0 | 0.315789 | 1 | 0.315789 | false | 0 | 0.315789 | 0 | 0.684211 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
a273844fa7784e0de919abfa0f5d6c5e8537c308 | 6,601 | py | Python | reinforcement_learning_dataset.py | YunqiuXu/H-KGA | 694a36baf9e51ffb97be269d8182a2b906eb0da5 | [
"MIT"
] | 2 | 2021-10-01T00:41:40.000Z | 2022-02-17T03:43:13.000Z | reinforcement_learning_dataset.py | YunqiuXu/H-KGA | 694a36baf9e51ffb97be269d8182a2b906eb0da5 | [
"MIT"
] | null | null | null | reinforcement_learning_dataset.py | YunqiuXu/H-KGA | 694a36baf9e51ffb97be269d8182a2b906eb0da5 | [
"MIT"
] | 1 | 2021-10-19T13:47:53.000Z | 2021-10-19T13:47:53.000Z | import os
import glob
import gym
import textworld.gym
def get_training_game_env_1level100game(data_dir, difficulty_level, requested_infos, max_episode_steps, batch_size):
    assert difficulty_level in [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
    training_size = 100
    game_file_names = []
    game_path = data_dir + "/train_" + str(training_size) + "/difficulty_level_" + str(difficulty_level)
    if os.path.isdir(game_path):
        game_file_names += glob.glob(os.path.join(game_path, "*.z8"))
    else:
        game_file_names.append(game_path)
    game_file_names_touse = sorted(game_file_names)
    env_id = textworld.gym.register_games(game_file_names_touse, request_infos=requested_infos,
                                          max_episode_steps=max_episode_steps, batch_size=batch_size,
                                          name="training", asynchronous=True, auto_reset=False)
    env = gym.make(env_id)
    num_game = len(game_file_names_touse)
    print("Training: level {}, {} games".format(difficulty_level, num_game))
    return env, num_game


def get_training_game_env_1level25game(data_dir, difficulty_level, requested_infos, max_episode_steps, batch_size):
    assert difficulty_level in [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
    training_size = 100
    game_file_names = []
    game_path = data_dir + "/train_" + str(training_size) + "/difficulty_level_" + str(difficulty_level)
    if os.path.isdir(game_path):
        game_file_names += glob.glob(os.path.join(game_path, "*.z8"))
    else:
        game_file_names.append(game_path)
    game_file_names_touse = sorted(game_file_names)[:25]
    env_id = textworld.gym.register_games(game_file_names_touse, request_infos=requested_infos,
                                          max_episode_steps=max_episode_steps, batch_size=batch_size,
                                          name="training", asynchronous=True, auto_reset=False)
    env = gym.make(env_id)
    num_game = len(game_file_names_touse)
    print("Training: level {}, {} games".format(difficulty_level, num_game))
    return env, num_game


def get_training_game_env(data_dir, difficulty_level, training_size, requested_infos, max_episode_steps, batch_size):
    assert difficulty_level in [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
    assert training_size in [1, 20, 100]
    game_file_names = []
    game_path = data_dir + "/train_" + str(training_size) + "/difficulty_level_" + str(difficulty_level)
    if os.path.isdir(game_path):
        game_file_names += glob.glob(os.path.join(game_path, "*.z8"))
    else:
        game_file_names.append(game_path)
    env_id = textworld.gym.register_games(sorted(game_file_names), request_infos=requested_infos,
                                          max_episode_steps=max_episode_steps, batch_size=batch_size,
                                          name="training", asynchronous=True, auto_reset=False)
    env = gym.make(env_id)
    num_game = len(game_file_names)
    return env, num_game


def get_training_game_env_4level25game(data_dir, requested_infos, max_episode_steps, batch_size):
    """
    Choose 25 games per level
    """
    training_size = 100
    difficulty_level_list = [3, 7, 5, 9]
    game_file_names_list = []
    print("===== Building training env on 4 levels, 25 games per level ======")
    for difficulty_level in difficulty_level_list:
        game_file_names = []
        game_path = data_dir + "/train_" + str(training_size) + "/difficulty_level_" + str(difficulty_level)
        if os.path.isdir(game_path):
            game_file_names += glob.glob(os.path.join(game_path, "*.z8"))
        else:
            game_file_names.append(game_path)
        game_file_names_touse = sorted(game_file_names)[:25]
        print("Level {}, {} games".format(difficulty_level, len(game_file_names_touse)))
        game_file_names_list.extend(game_file_names_touse)
    print("Totally {} games".format(len(game_file_names_list)))
    env_id = textworld.gym.register_games(sorted(game_file_names_list), request_infos=requested_infos,
                                          max_episode_steps=max_episode_steps, batch_size=batch_size,
                                          name="training", asynchronous=True, auto_reset=False)
    env = gym.make(env_id)
    num_game = len(game_file_names_list)
    return env, num_game


def get_training_game_env_4level100game(data_dir, requested_infos, max_episode_steps, batch_size):
    """
    Choose 100 games per level
    """
    training_size = 100
    difficulty_level_list = [3, 7, 5, 9]
    game_file_names_list = []
    print("===== Building training env on 4 levels, 100 games per level ======")
    for difficulty_level in difficulty_level_list:
        game_file_names = []
        game_path = data_dir + "/train_" + str(training_size) + "/difficulty_level_" + str(difficulty_level)
        if os.path.isdir(game_path):
            game_file_names += glob.glob(os.path.join(game_path, "*.z8"))
        else:
            game_file_names.append(game_path)
        game_file_names_touse = sorted(game_file_names)
        print("Level {}, {} games".format(difficulty_level, len(game_file_names_touse)))
        game_file_names_list.extend(game_file_names_touse)
    print("Totally {} games".format(len(game_file_names_list)))
    env_id = textworld.gym.register_games(sorted(game_file_names_list), request_infos=requested_infos,
                                          max_episode_steps=max_episode_steps, batch_size=batch_size,
                                          name="training", asynchronous=True, auto_reset=False)
    env = gym.make(env_id)
    num_game = len(game_file_names_list)
    return env, num_game


def get_evaluation_game_env(data_dir, difficulty_level, requested_infos, max_episode_steps, batch_size, valid_or_test="valid"):
    assert valid_or_test in ["valid", "test"]
    assert difficulty_level in [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
    game_file_names = []
    game_path = data_dir + "/" + valid_or_test + "/difficulty_level_" + str(difficulty_level)
    if os.path.isdir(game_path):
        game_file_names += glob.glob(os.path.join(game_path, "*.z8"))
    else:
        game_file_names.append(game_path)
    env_id = textworld.gym.register_games(sorted(game_file_names), request_infos=requested_infos,
                                          max_episode_steps=max_episode_steps, batch_size=batch_size,
                                          name="eval", asynchronous=True, auto_reset=False)
    env = gym.make(env_id)
    num_game = len(game_file_names)
    return env, num_game
| 51.570313 | 127 | 0.667475 | 904 | 6,601 | 4.488938 | 0.095133 | 0.094628 | 0.15377 | 0.070971 | 0.948497 | 0.943322 | 0.936175 | 0.929276 | 0.929276 | 0.920404 | 0 | 0.025465 | 0.226632 | 6,601 | 127 | 128 | 51.976378 | 0.769442 | 0.007878 | 0 | 0.857143 | 0 | 0 | 0.074102 | 0 | 0 | 0 | 0 | 0 | 0.053571 | 1 | 0.053571 | false | 0 | 0.035714 | 0 | 0.142857 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a28c397694aaab1313638a9ae26d325f1974bfd1 | 7,146 | py | Python | script.py | KolodziejczykWaldemar/CRS2_parallel | ec2d09818adc866668160454da3114a06e936673 | [
"MIT"
] | null | null | null | script.py | KolodziejczykWaldemar/CRS2_parallel | ec2d09818adc866668160454da3114a06e936673 | [
"MIT"
] | null | null | null | script.py | KolodziejczykWaldemar/CRS2_parallel | ec2d09818adc866668160454da3114a06e936673 | [
"MIT"
] | null | null | null | from crs2_shared_memory import CRS2_multi
from crs3_shared_memory import CRS3_multi
from crs2_local_memory import CRS2_local
from crs3_local_memory import CRS3_local
from crs2 import CRS2
from crs3 import CRS3
if __name__ == '__main__':
    ns = [2, 3]
    cpus = [2, 3]
    ex_num = 2
    func_nums = [1, 2]
# For CRS2 sequential:
# for fun_num in func_nums:
# for n in ns:
# result_file = open("CRS2_{}_{}.txt".format(fun_num, n), "a")
# for ex in range(ex_num):
# iter_time, rand_time, full_time, result = CRS2(FUNCTION_NUMBER=fun_num, vec_len=n)
# result_str = 'Experiment: {} \n' \
# 'Function number: {} \n' \
# 'Vector dimensionality: {}\n' \
# 'Iteration time: \n{:.4f} s\n' \
# 'Randomization time: \n{:.4f} s\n' \
# 'Full time: \n{:.4f} s\n' \
# 'Result: \n{} \n\n\n'.format(ex + 1, fun_num, n, iter_time, rand_time, full_time, result)
#
# result_file.write(result_str)
# result_file.close()
#
#
# # For CRS3 sequential:
# for fun_num in func_nums:
# for n in ns:
# result_file = open("CRS3_{}_{}.txt".format(fun_num, n), "a")
# for ex in range(ex_num):
# iter_time, rand_time, full_time, result = CRS3(FUNCTION_NUMBER=fun_num, vec_len=n)
# result_str = 'Experiment: {} \n' \
# 'Function number: {} \n' \
# 'Vector dimensionality: {}\n' \
# 'Iteration time: \n{:.4f} s\n' \
# 'Randomization time: \n{:.4f} s\n' \
# 'Full time: \n{:.4f} s\n' \
# 'Result: \n{} \n\n\n'.format(ex + 1, fun_num, n, iter_time, rand_time, full_time, result)
#
# result_file.write(result_str)
# result_file.close()
#
#
#
# # For CRS2 shared memory:
# for fun_num in func_nums:
# for n in ns:
# for cpu in cpus:
# result_file = open("CRS2SM_{}_{}_{}.txt".format(fun_num, n, cpu), "a")
# for ex in range(ex_num):
# iter_time, rand_time, full_time, result = CRS2_multi(cpu_num=cpu,
# FUNCTION_NUMBER=fun_num,
# vec_len=n)
# result_str = 'Experiment: {} \n' \
# 'Function number: {} \n' \
# 'Vector dimensionality: {}\n' \
# 'CPU number: {} \n' \
# 'Iteration time: \n{:.4f} s\n' \
# 'Randomization time: \n{:.4f} s\n' \
# 'Full time: \n{:.4f} s\n' \
# 'Result: \n{:.4f} \n\n\n'.format(ex+1, fun_num, n, cpu, iter_time,
# rand_time, full_time, result)
#
# result_file.write(result_str)
# result_file.close()
#
#
# # For CRS3 shared memory:
# for fun_num in func_nums:
# for n in ns:
# for cpu in cpus:
# result_file = open("CRS3SM_{}_{}_{}.txt".format(fun_num, n, cpu), "a")
# for ex in range(ex_num):
# iter_time, rand_time, full_time, result = CRS3_multi(cpu_num=cpu,
# FUNCTION_NUMBER=fun_num,
# vec_len=n)
# result_str = 'Experiment: {} \n' \
# 'Function number: {} \n' \
# 'Vector dimensionality: {}\n' \
# 'CPU number: {} \n' \
# 'Iteration time: \n{:.4f} s\n' \
# 'Randomization time: \n{:.4f} s\n' \
# 'Full time: \n{:.4f} s\n' \
# 'Result: \n{} \n\n\n'.format(ex + 1, fun_num, n, cpu, iter_time,
# rand_time, full_time, result)
#
# result_file.write(result_str)
# result_file.close()
#
    # For CRS2 local memory:
    for fun_num in func_nums:
        for n in ns:
            for cpu in cpus:
                result_file = open("CRS2LM_{}_{}_{}.txt".format(fun_num, n, cpu), "a")
                for ex in range(ex_num):
                    iter_time, rand_time, full_time, result = CRS2_local(cpu_num=cpu,
                                                                         FUNCTION_NUMBER=fun_num,
                                                                         vec_len=n)
                    result_str = 'Experiment: {} \n' \
                                 'Function number: {} \n' \
                                 'Vector dimensionality: {}\n' \
                                 'CPU number: {} \n' \
                                 'Iteration time: \n{:.4f} s\n' \
                                 'Randomization time: \n{:.4f} s\n' \
                                 'Full time: \n{:.4f} s\n' \
                                 'Result: \n{:.4f} \n\n\n'.format(ex+1, fun_num, n, cpu, iter_time,
                                                                  rand_time, full_time, result)

                    result_file.write(result_str)
                result_file.close()

    # For CRS3 local memory:
    for fun_num in func_nums:
        for n in ns:
            for cpu in cpus:
                result_file = open("CRS3LM_{}_{}_{}.txt".format(fun_num, n, cpu), "a")
                for ex in range(ex_num):
                    iter_time, rand_time, full_time, result = CRS3_local(cpu_num=cpu,
                                                                         FUNCTION_NUMBER=fun_num,
                                                                         vec_len=n)
                    result_str = 'Experiment: {} \n' \
                                 'Function number: {} \n' \
                                 'Vector dimensionality: {}\n' \
                                 'CPU number: {} \n' \
                                 'Iteration time: \n{:.4f} s\n' \
                                 'Randomization time: \n{:.4f} s\n' \
                                 'Full time: \n{:.4f} s\n' \
                                 'Result: \n{:.4f} \n\n\n'.format(ex+1, fun_num, n, cpu, iter_time,
                                                                  rand_time, full_time, result)

                    result_file.write(result_str)
                result_file.close()
| 49.972028 | 121 | 0.381192 | 699 | 7,146 | 3.668097 | 0.078684 | 0.056162 | 0.049142 | 0.056162 | 0.902886 | 0.902886 | 0.902886 | 0.902886 | 0.902886 | 0.902886 | 0 | 0.018054 | 0.503918 | 7,146 | 142 | 122 | 50.323944 | 0.705219 | 0.547159 | 0 | 0.693878 | 0 | 0 | 0.13541 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.122449 | 0 | 0.122449 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a298da0f824cd46ecb74ca76d80cffa3e53b4419 | 8,911 | py | Python | tests/test_parallel.py | GBillotey/Fractalshades | e100b12db031f016bf1a8a1f4fad9ca1c64a0302 | [
"MIT"
] | null | null | null | tests/test_parallel.py | GBillotey/Fractalshades | e100b12db031f016bf1a8a1f4fad9ca1c64a0302 | [
"MIT"
] | 1 | 2021-11-01T14:55:57.000Z | 2021-11-01T14:55:57.000Z | tests/test_parallel.py | GBillotey/Fractalshades | e100b12db031f016bf1a8a1f4fad9ca1c64a0302 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import unittest
import concurrent.futures
import os
import numpy as np
import numba
import fractalshades as fs
import fractalshades.parallel_lock
import test_config
"""
A few Experiments with numba & GIL releasing in multi-threading.
"""
# @numba.njit("(intp[:], float64[:])", nogil=True)
@numba.njit(nogil=True)
def lock_and_work(locks, arr):
    failtimes = 0
    for _ in range(10000):
        for i in range(locks.size):
            # get lock pointer
            lock_ptr = locks[i:]
            # try to lock and do some work
            if fs.parallel_lock.try_lock(lock_ptr):
                arr[i] += 1
                # unlock
                fs.parallel_lock.unlock(lock_ptr)
                break
        else:
            # count number of times it failed to do work
            failtimes += 1
    return failtimes
@numba.njit(nogil=True)
def busywait_and_work(locks, arr):
    failtimes = 0
    for _ in range(10000):
        for i in range(locks.size):
            # get lock pointer
            lock_ptr = locks[i:]
            # busywait to lock and do some work
            while not(fs.parallel_lock.try_lock(lock_ptr)):
                pass
            arr[i] += 1
            fs.parallel_lock.unlock(lock_ptr)
            break
        else:
            # count number of times it failed to do work
            failtimes += 1
    return failtimes
@numba.njit(nogil=False)
def lock_and_work_with_GIL(locks, arr):
    failtimes = 0
    for _ in range(10000):
        for i in range(locks.size):
            # get lock pointer
            lock_ptr = locks[i:]
            # try to lock and do some work
            if fs.parallel_lock.try_lock(lock_ptr):
                arr[i] += 1
                # unlock
                fs.parallel_lock.unlock(lock_ptr)
                break
        else:
            # count number of times it failed to do work
            failtimes += 1
    return failtimes


@numba.njit(nogil=True)
def lock_and_work_nested(locks, arr):
    return lock_and_work_with_GIL(locks, arr)


def python_lock_and_work(locks, arr):
    # just passing through
    return lock_and_work(locks, arr)


def python_lock_and_work_nested(locks, arr):
    # just passing through
    return lock_and_work_nested(locks, arr)
class Test_parallel(unittest.TestCase):

    def test_lock(self):
        # calling a numba nogil=True jitted func
        # -> GIL is released (as expected)
        locks = np.zeros(3, dtype=np.intp)
        values = np.zeros(3, dtype=np.float64)
        assert lock_and_work(locks, values) == 0
        assert np.sum(values) == 10000.
        with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()
                                                   ) as threadpool:
            futures = (threadpool.submit(lock_and_work, locks, values)
                       for _ in range(8))
            lock_failed = 0
            for fut in concurrent.futures.as_completed(futures):
                res = fut.result()
                lock_failed += res
                # print('failed to lock {0} times'.format(res))
        print('total failed to lock:', lock_failed)
        print(np.sum(values))
        assert np.sum(values) + lock_failed == 90000.

    def test_lock_python(self):
        # calling a numba nogil=True jitted func through a python func
        # -> GIL is released
        locks = np.zeros(3, dtype=np.intp)
        values = np.zeros(3, dtype=np.float64)
        assert lock_and_work(locks, values) == 0
        assert np.sum(values) == 10000.
        with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()
                                                   ) as threadpool:
            futures = (threadpool.submit(python_lock_and_work, locks, values)
                       for _ in range(8))
            lock_failed = 0
            for fut in concurrent.futures.as_completed(futures):
                res = fut.result()
                lock_failed += res
                # print('failed to lock {0} times'.format(res))
        print('total failed to lock:', lock_failed)
        print(np.sum(values))
        assert np.sum(values) + lock_failed == 90000.
    def test_lock_GIL(self):
        # Calling a numba function compiled with nogil=False
        # -> GIL is not released (as expected)
        locks = np.zeros(3, dtype=np.intp)
        values = np.zeros(3, dtype=np.float64)
        assert lock_and_work(locks, values) == 0
        assert np.sum(values) == 10000.
        with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()
                                                   ) as threadpool:
            futures = (threadpool.submit(lock_and_work_with_GIL, locks, values)
                       for _ in range(8))
            lock_failed = 0
            for fut in concurrent.futures.as_completed(futures):
                res = fut.result()
                lock_failed += res
                # print('failed to lock {0} times'.format(res))
        print('total failed to lock:', lock_failed)
        print(np.sum(values))
        assert lock_failed == 0.
        assert np.sum(values) == 90000.

    def test_lock_nested(self):
        # Calling nested :
        # - a nogil=True numba jitted function
        # - which calls a nogil=False numba jitted
        # -> GIL is released
        locks = np.zeros(3, dtype=np.intp)
        values = np.zeros(3, dtype=np.float64)
        assert lock_and_work(locks, values) == 0
        assert np.sum(values) == 10000.
        with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()
                                                   ) as threadpool:
            futures = (threadpool.submit(lock_and_work_nested, locks, values)
                       for _ in range(8))
            lock_failed = 0
            for fut in concurrent.futures.as_completed(futures):
                res = fut.result()
                lock_failed += res
                # print('failed to lock {0} times'.format(res))
        print('total failed to lock:', lock_failed)
        print(np.sum(values))
        assert np.sum(values) + lock_failed == 90000.
    def test_lock_python_nested(self):
        # Calling nested :
        # - pure python function
        # - which calls a nogil=True numba jitted
        # - which calls a nogil=False numba jitted
        # -> GIL is released
        locks = np.zeros(3, dtype=np.intp)
        values = np.zeros(3, dtype=np.float64)
        assert lock_and_work(locks, values) == 0
        assert np.sum(values) == 10000.
        with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()
                                                   ) as threadpool:
            futures = (threadpool.submit(python_lock_and_work_nested,
                                         locks, values)
                       for _ in range(8))
            lock_failed = 0
            for fut in concurrent.futures.as_completed(futures):
                res = fut.result()
                lock_failed += res
                # print('failed to lock {0} times'.format(res))
        print('total failed to lock:', lock_failed)
        print(np.sum(values))
        assert np.sum(values) + lock_failed == 90000.

    def test_lock2(self):
        # Syntax for in place mod
        locks = np.zeros(3, dtype=np.intp)
        values = np.zeros(3, dtype=np.float64)
        with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()
                                                   ) as threadpool:
            futures = (
                threadpool.submit(lock_and_work, locks, values)
                for _ in range(8)
            )
            for fut in concurrent.futures.as_completed(futures):
                fut.result()
        print(np.sum(values))
    def test_busy_lock(self):
        # calling a numba nogil=True jitted func
        # -> GIL is released (as expected)
        locks = np.zeros(3, dtype=np.intp)
        values = np.zeros(3, dtype=np.float64)
        assert lock_and_work(locks, values) == 0
        assert np.sum(values) == 10000.
        with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()
                                                   ) as threadpool:
            futures = (threadpool.submit(busywait_and_work, locks, values)
                       for _ in range(8))
            lock_failed = 0
            for fut in concurrent.futures.as_completed(futures):
                res = fut.result()
                lock_failed += res
                # print('failed to lock {0} times'.format(res))
        print('total failed to lock:', lock_failed)
        print(np.sum(values))
        assert np.sum(values) + lock_failed == 90000.
if __name__ == '__main__':
    # main()
    full_test = True
    runner = unittest.TextTestRunner(verbosity=2)
    if full_test:
        runner.run(test_config.suite([Test_parallel]))
    else:
        suite = unittest.TestSuite()
        suite.addTest(Test_parallel("test_busy_lock"))
        runner.run(suite)
| 35.644 | 80 | 0.570643 | 1,090 | 8,911 | 4.515596 | 0.112844 | 0.048761 | 0.044697 | 0.036977 | 0.850467 | 0.832995 | 0.820805 | 0.796424 | 0.77306 | 0.748273 | 0 | 0.024226 | 0.332959 | 8,911 | 249 | 81 | 35.787149 | 0.803836 | 0.144092 | 0 | 0.708571 | 0 | 0 | 0.019702 | 0 | 0 | 0 | 0 | 0 | 0.108571 | 1 | 0.074286 | false | 0.005714 | 0.045714 | 0.017143 | 0.16 | 0.074286 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a299724e51a4c474e1f168f772e2e30ae01f6497 | 393 | py | Python | iridauploader/parsers/exceptions/__init__.py | COMBAT-SARS-COV-2/irida-uploader | b9d04d187d6a5a9fdcaef5b27135965ffac99db0 | [
"Apache-2.0"
] | null | null | null | iridauploader/parsers/exceptions/__init__.py | COMBAT-SARS-COV-2/irida-uploader | b9d04d187d6a5a9fdcaef5b27135965ffac99db0 | [
"Apache-2.0"
] | null | null | null | iridauploader/parsers/exceptions/__init__.py | COMBAT-SARS-COV-2/irida-uploader | b9d04d187d6a5a9fdcaef5b27135965ffac99db0 | [
"Apache-2.0"
] | null | null | null | from iridauploader.parsers.exceptions.directory_error import DirectoryError
from iridauploader.parsers.exceptions.sample_sheet_error import SampleSheetError
from iridauploader.parsers.exceptions.sequence_file_error import SequenceFileError
from iridauploader.parsers.exceptions.validation_error import ValidationError
from iridauploader.parsers.exceptions.file_size_error import FileSizeError
| 65.5 | 82 | 0.910941 | 43 | 393 | 8.139535 | 0.418605 | 0.242857 | 0.342857 | 0.485714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050891 | 393 | 5 | 83 | 78.6 | 0.938338 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a2a1f597e2d8ced0d4158f18340f4ca53e2b8e8d | 5,316 | py | Python | helmRKPR.py | pysg/pyther | 6a47fc41533cc50bc64134e42ddd3ed8d54d75c7 | [
"MIT"
] | 9 | 2017-07-10T19:21:35.000Z | 2022-01-24T16:41:34.000Z | helmRKPR.py | NERD-cpu/pyther | 6a47fc41533cc50bc64134e42ddd3ed8d54d75c7 | [
"MIT"
] | 1 | 2017-05-28T01:45:00.000Z | 2018-01-08T14:54:31.000Z | helmRKPR.py | NERD-cpu/pyther | 6a47fc41533cc50bc64134e42ddd3ed8d54d75c7 | [
"MIT"
] | 3 | 2017-08-18T18:47:21.000Z | 2021-03-01T02:25:24.000Z | #HelmRKPR
# SUBROUTINE HelmRKPR(nco,NDE,NTD,rn,V,T,Ar,ArV,ArTV,ArV2,Arn,ArVn,ArTn,Arn2)
# IMPLICIT DOUBLE PRECISION (A-H,O-Z)
# PARAMETER (RGAS=0.08314472d0)
# dimension rn(nco),Arn(nco),ArVn(nco),ArTn(nco),Arn2(nco,nco)
# dimension dBi(nco),dBij(nco,nco),dD1i(nco),dD1ij(nco,nco)
# dimension dDi(nco),dDij(nco,nco),dDiT(nco)
dimension aij(nco,nco),daijdT(nco,nco),daijdT2(nco,nco)
COMMON /rule/ncomb
nc=nco
TOTN = sum(rn)
call DELTAnder(nc,rn,D1,dD1i,dD1ij)
D2=(1-D1)/(1+D1)
! Comparison to test and debug cubic mixing rules
! rn=[0.65,0.35]
! T=460.0d0
! call Bnder(nc,rn,Bmix,dBi,dBij)
! call Bcubicnder(nc,rn,Bmix,dBi,dBij)
! call DandTnder(NTD,nc,T,rn,D,dDi,dDiT,dDij,dDdT,dDdT2)
! call DCubicandTnder(NTD,nc,T,rn,D,dDi,dDiT,dDij,dDdT,dDdT2)
if(ncomb.lt.2)then
call Bnder(nc,rn,Bmix,dBi,dBij)
call DandTnder(NTD,nc,T,rn,D,dDi,dDiT,dDij,dDdT,dDdT2)
else
! call Bcubicnder(nc,rn,Bmix,dBi,dBij)
! call DCubicandTnder(NTD,nc,T,rn,D,dDi,dDiT,dDij,dDdT,dDdT2)
end if
! The f's and g's used here are for Ar, not F (reduced Ar) ***********
! This requires to multiply by R all g, f and its derivatives as defined by Mollerup ****
f=log((V+D1*Bmix)/(V+D2*Bmix))/Bmix/(D1-D2)
g=RGAS*log(1-Bmix/V)
fv=-1/((V+D1*Bmix)*(V+D2*Bmix))
fB=-(f+V*fv)/Bmix
gv=RGAS*Bmix/(V*(V-Bmix))
fv2=(-1/(V+D1*Bmix)**2+1/(V+D2*Bmix)**2)/Bmix/(D1-D2)
gv2=RGAS*(1/V**2-1/(V-Bmix)**2)
! DERIVATIVES OF f WITH RESPECT TO DELTA1
auxD2=(1+2/(1+D1)**2)
fD1=(1/(V+D1*Bmix)+2/(V+D2*Bmix)/(1+D1)**2)-f*auxD2
fD1=fD1/(D1-D2)
fBD1=-(fB*auxD2+D1/(V+D1*Bmix)**2+2*D2/(V+D2*Bmix)**2/(1+D1)**2)
fBD1=fBD1/(D1-D2)
fVD1=-(fV*auxD2+1/(V+D1*Bmix)**2+2/(V+D2*Bmix)**2/(1+D1)**2)/(D1-D2)
fD1D1=4*(f-1/(V+D2*Bmix))/(1+D1)**3+Bmix*(-1/(V+D1*Bmix)**2+ &
4/(V+D2*Bmix)**2/(1+D1)**4)-2*fD1*(1+2/(1+D1)**2)
fD1D1=fD1D1/(D1-D2)
! Reduced Helmholtz Energy and derivatives
Ar=-TOTN*g*T-D*f
ArV=-TOTN*gv*T-D*fv
ArV2=-TOTN*gv2*T-D*fv2
AUX=RGAS*T/(V-Bmix)
FFB=TOTN*AUX-D*fB
FFBV=-TOTN*AUX/(V-Bmix)+D*(2*fv+V*fv2)/Bmix
FFBB=TOTN*AUX/(V-Bmix)-D*(2*f+4*V*fv+V**2*fv2)/Bmix**2
do i=1,nc
Arn(i)=-g*T+FFB*dBi(i)-f*dDi(i)-D*fD1*dD1i(i)
ArVn(i)=-gv*T+FFBV*dBi(i)-fv*dDi(i)-D*fVD1*dD1i(i)
IF (NDE.EQ.2) THEN
do j=1,i
Arn2(i,j)=AUX*(dBi(i)+dBi(j))-fB*(dBi(i)*dDi(j)+dBi(j)*dDi(i)) &
+FFB*dBij(i,j)+FFBB*dBi(i)*dBi(j)-f*dDij(i,j)
Arn2(i,j)=Arn2(i,j)-D*fBD1*(dBi(i)*dD1i(j)+dBi(j)*dD1i(i)) &
-fD1*(dDi(i)*dD1i(j)+dDi(j)*dD1i(i)) &
-D*fD1*dD1ij(i,j)-D*fD1D1*dD1i(i)*dD1i(j)
Arn2(j,i)=Arn2(i,j)
end do
END IF
end do
! TEMPERATURE DERIVATIVES
IF (NTD.EQ.1) THEN
ArT=-TOTN*g-dDdT*f
ArTV=-TOTN*gv-dDdT*fV
ArTT=-dDdT2*f
do i=1,nc
ArTn(i)=-g+(TOTN*AUX/T-dDdT*fB)*dBi(i)-f*dDiT(i)-dDdT*fD1*dD1i(i)
end do
END IF
end
SUBROUTINE HelmRKPR(nco,NDE,NTD,rn,V,T,Ar,ArV,ArTV,ArV2,Arn,ArVn,ArTn,Arn2)
IMPLICIT DOUBLE PRECISION (A-H,O-Z)
PARAMETER (RGAS=0.08314472d0)
dimension rn(nco),Arn(nco),ArVn(nco),ArTn(nco),Arn2(nco,nco)
dimension dBi(nco),dBij(nco,nco),dD1i(nco),dD1ij(nco,nco)
dimension dDi(nco),dDij(nco,nco),dDiT(nco)
dimension aij(nco,nco),daijdT(nco,nco),daijdT2(nco,nco)
COMMON /rule/ncomb
nc=nco
TOTN = sum(rn)
call DELTAnder(nc,rn,D1,dD1i,dD1ij)
D2=(1-D1)/(1+D1)
if(ncomb.lt.2)then
call Bnder(nc,rn,Bmix,dBi,dBij)
call DandTnder(NTD,nc,T,rn,D,dDi,dDiT,dDij,dDdT,dDdT2)
else
! call Bcubicnder(nc,rn,Bmix,dBi,dBij)
! call DCubicandTnder(NTD,nc,T,rn,D,dDi,dDiT,dDij,dDdT,dDdT2)
end if
! The f's and g's used here are for Ar, not F (reduced Ar) ***********
! This requires to multiply by R all g, f and its derivatives as defined by Mollerup ****
f=log((V+D1*Bmix)/(V+D2*Bmix))/Bmix/(D1-D2)
g=RGAS*log(1-Bmix/V)
fv=-1/((V+D1*Bmix)*(V+D2*Bmix))
fB=-(f+V*fv)/Bmix
gv=RGAS*Bmix/(V*(V-Bmix))
fv2=(-1/(V+D1*Bmix)**2+1/(V+D2*Bmix)**2)/Bmix/(D1-D2)
gv2=RGAS*(1/V**2-1/(V-Bmix)**2)
! DERIVATIVES OF f WITH RESPECT TO DELTA1
auxD2=(1+2/(1+D1)**2)
fD1=(1/(V+D1*Bmix)+2/(V+D2*Bmix)/(1+D1)**2)-f*auxD2
fD1=fD1/(D1-D2)
fBD1=-(fB*auxD2+D1/(V+D1*Bmix)**2+2*D2/(V+D2*Bmix)**2/(1+D1)**2)
fBD1=fBD1/(D1-D2)
fVD1=-(fV*auxD2+1/(V+D1*Bmix)**2+2/(V+D2*Bmix)**2/(1+D1)**2)/(D1-D2)
fD1D1=4*(f-1/(V+D2*Bmix))/(1+D1)**3+Bmix*(-1/(V+D1*Bmix)**2+ &
4/(V+D2*Bmix)**2/(1+D1)**4)-2*fD1*(1+2/(1+D1)**2)
fD1D1=fD1D1/(D1-D2)
! Reduced Helmholtz Energy and derivatives
Ar=-TOTN*g*T-D*f
ArV=-TOTN*gv*T-D*fv
ArV2=-TOTN*gv2*T-D*fv2
AUX=RGAS*T/(V-Bmix)
FFB=TOTN*AUX-D*fB
FFBV=-TOTN*AUX/(V-Bmix)+D*(2*fv+V*fv2)/Bmix
FFBB=TOTN*AUX/(V-Bmix)-D*(2*f+4*V*fv+V**2*fv2)/Bmix**2
do i=1,nc
Arn(i)=-g*T+FFB*dBi(i)-f*dDi(i)-D*fD1*dD1i(i)
ArVn(i)=-gv*T+FFBV*dBi(i)-fv*dDi(i)-D*fVD1*dD1i(i)
IF (NDE.EQ.2) THEN
do j=1,i
Arn2(i,j)=AUX*(dBi(i)+dBi(j))-fB*(dBi(i)*dDi(j)+dBi(j)*dDi(i)) &
+FFB*dBij(i,j)+FFBB*dBi(i)*dBi(j)-f*dDij(i,j)
Arn2(i,j)=Arn2(i,j)-D*fBD1*(dBi(i)*dD1i(j)+dBi(j)*dD1i(i)) &
-fD1*(dDi(i)*dD1i(j)+dDi(j)*dD1i(i)) &
-D*fD1*dD1ij(i,j)-D*fD1D1*dD1i(i)*dD1i(j)
Arn2(j,i)=Arn2(i,j)
end do
END IF
end do
! TEMPERATURE DERIVATIVES
IF (NTD.EQ.1) THEN
ArT=-TOTN*g-dDdT*f
ArTV=-TOTN*gv-dDdT*fV
ArTT=-dDdT2*f
do i=1,nc
ArTn(i)=-g+(TOTN*AUX/T-dDdT*fB)*dBi(i)-f*dDiT(i)-dDdT*fD1*dD1i(i)
end do
END IF
end
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------
# pylint: disable=line-too-long
from knack.arguments import CLIArgumentType
from azure.cli.core.commands.parameters import (
tags_type,
get_enum_type,
resource_group_name_type,
get_location_type
)
from azure.cli.core.commands.validators import (
get_default_location_from_resource_group,
validate_file_or_dict
)
from azext_fidalgo.action import (
AddParameters,
AddGitHub,
AddImageReference
)
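The `AddParameters`, `AddGitHub`, and `AddImageReference` actions imported here are defined in the extension's `action.py`, which is not part of this excerpt. Generated azext action classes usually collect repeated `key=value` tokens into a list of dicts; a minimal sketch of that pattern (the class name and exact behavior are assumptions, not the extension's actual code):

```python
import argparse

class AddParametersSketch(argparse._AppendAction):
    """Hypothetical stand-in for AddParameters: each use of the option
    turns its 'key=value' tokens into one dict and appends it."""

    def __call__(self, parser, namespace, values, option_string=None):
        item = {}
        for token in values:
            key, sep, value = token.partition('=')
            if not sep:
                raise argparse.ArgumentError(
                    self, 'expected key=value, got {!r}'.format(token))
            item[key] = value
        super().__call__(parser, namespace, item, option_string)

parser = argparse.ArgumentParser()
parser.add_argument('--parameters', action=AddParametersSketch, nargs='+')
ns = parser.parse_args(['--parameters', 'name=region', 'type=string'])
assert ns.parameters == [{'name': 'region', 'type': 'string'}]
```

In the real command, `c.argument('parameters', action=AddParameters, nargs='+')` wires this parsing into the `fidalgo admin catalog-item create` call below.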
def load_arguments(self, _):
with self.argument_context('fidalgo dev project list') as c:
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('filter_', options_list=['--filter'], type=str, help='An OData $filter clause to apply to the '
'operation.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev pool list') as c:
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('filter_', options_list=['--filter'], type=str, help='An OData $filter clause to apply to the '
'operation.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev pool show') as c:
c.argument('pool_name', options_list=['--name', '-n', '--pool-name'], type=str, help='The name of a pool of '
'virtual machines.')
c.argument('project_name', options_list=['--project-name', '--project'], type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine list') as c:
c.argument('filter_', options_list=['--filter'], type=str, help='An OData $filter clause to apply to the '
'operation.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine show') as c:
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('virtual_machine_name', options_list=['--name', '-n', '--virtual-machine-name'], type=str,
help='The name of a virtual machine.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine create') as c:
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('virtual_machine_name', options_list=['--name', '-n', '--virtual-machine-name'], type=str,
help='The name of a virtual machine.')
c.argument('pool_name', type=str, help='The name of the virtual machine pool this machine belongs to.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine delete') as c:
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('virtual_machine_name', options_list=['--name', '-n', '--virtual-machine-name'], type=str,
help='The name of a virtual machine.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine assign') as c:
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('virtual_machine_name', options_list=['--name', '-n', '--virtual-machine-name'], type=str,
help='The name of a virtual machine.')
c.argument('new_owner', type=str, help='Identifier of new owner')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine get-remote-connection') as c:
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('virtual_machine_name', options_list=['--name', '-n', '--virtual-machine-name'], type=str,
help='The name of a virtual machine.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine start') as c:
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('virtual_machine_name', options_list=['--name', '-n', '--virtual-machine-name'], type=str,
help='The name of a virtual machine.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev virtual-machine stop') as c:
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('user_id', type=str, help='The id of the user. If value is \'me\', the identity is taken from the '
'authentication context')
c.argument('virtual_machine_name', options_list=['--name', '-n', '--virtual-machine-name'], type=str,
help='The name of a virtual machine.')
c.argument('dev_center', type=str, help='The Fidalgo DevCenter.')
c.argument('fidalgo_dns_suffix', type=str, help='Optional DevCenter DNS suffix')
with self.argument_context('fidalgo dev environment list') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('filter_', options_list=['--filter'], type=str, help='An OData $filter clause to apply to the '
'operation.')
with self.argument_context('fidalgo dev environment show') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.')
with self.argument_context('fidalgo dev environment create') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.')
c.argument('description', type=str, help='Description of the Environment.')
c.argument('catalog_item_name', type=str, help='Name of the catalog item.')
c.argument('deployment_parameters', type=validate_file_or_dict, help='Deployment parameters passed to catalog '
'item. Expected value: json-string/json-file/@json-file.')
c.argument('environment_type', type=str, help='Environment type.')
c.argument('owner', type=str, help='Identifier of the owner of this Environment.')
c.argument('tags', tags_type)
with self.argument_context('fidalgo dev environment update') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.')
c.argument('description', type=str, help='Description of the Environment.')
c.argument('catalog_item_name', type=str, help='Name of the catalog item.')
c.argument('deployment_parameters', type=validate_file_or_dict, help='Deployment parameters passed to catalog '
'item. Expected value: json-string/json-file/@json-file.')
with self.argument_context('fidalgo dev environment delete') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.')
with self.argument_context('fidalgo dev environment deploy') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.')
c.argument('parameters', type=validate_file_or_dict, help='Deployment parameters. Expected value: '
'json-string/json-file/@json-file.')
with self.argument_context('fidalgo dev deployment list') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('environment_name', type=str, help='The name of the environment.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('filter_', options_list=['--filter'], type=str, help='An OData $filter clause to apply to the '
'operation.')
with self.argument_context('fidalgo dev catalog-item list') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('filter_', options_list=['--filter'], type=str, help='An OData $filter clause to apply to the '
'operation.')
with self.argument_context('fidalgo dev environment-type list') as c:
c.argument('dev_center', type=str, help='The DevCenter to operate on.')
c.argument('fidalgo_dns_suffix', type=str, help='The DNS suffix used as the base for all fidalgo requests.')
c.argument('project_name', type=str, help='The Fidalgo Project upon which to execute operations.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
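Every data-plane `list` command above takes the same `--top`/`--filter` pair, which map onto OData `$top` and `$filter` query parameters. A small illustration of that mapping (the actual request construction happens in the generated SDK layer, not in this file):

```python
from urllib.parse import urlencode

def list_query(top=None, filter_=None):
    """Build the OData query string that --top/--filter translate to."""
    params = {}
    if top is not None:
        params['$top'] = top          # cap the number of returned resources
    if filter_ is not None:
        params['$filter'] = filter_   # OData filter clause
    return urlencode(params)

q = list_query(top=10, filter_="name eq 'dev'")
# '$' and quotes are percent-encoded; e.g. q contains '%24top=10'
assert '%24top=10' in q and '%24filter=' in q
```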
# control plane
with self.argument_context('fidalgo admin dev-center list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin dev-center show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', options_list=['--name', '-n', '--dev-center-name'], type=str, help='The name of '
'the devcenter.', id_part='name')
with self.argument_context('fidalgo admin dev-center create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', options_list=['--name', '-n', '--dev-center-name'], type=str, help='The name of '
'the devcenter.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('identity_type', arg_type=get_enum_type(['SystemAssigned', 'UserAssigned',
'SystemAssigned, UserAssigned', 'None']),
help='The type of identity used for the resource. The type \'SystemAssigned, UserAssigned\' '
'includes both an implicitly created identity and a user assigned identity. The type \'None\' will '
'remove any identities from the resource.', required=False, arg_group='Identity')
c.argument('user_assigned_identity', type=str, help='The user identity '
'associated with the resource. The user identity references will be an ARM resource id '
'in the form: \'/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microso'
'ft.ManagedIdentity/userAssignedIdentities/{identityName}\'. ', arg_group='Identity')
with self.argument_context('fidalgo admin dev-center update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', options_list=['--name', '-n', '--dev-center-name'], type=str, help='The name of '
'the devcenter.', id_part='name')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('identity_type', arg_type=get_enum_type(['SystemAssigned', 'UserAssigned',
'SystemAssigned, UserAssigned', 'None']),
help='The type of identity used for the resource. The type \'SystemAssigned, UserAssigned\' '
'includes both an implicitly created identity and a user assigned identity. The type \'None\' will '
'remove any identities from the resource.', required=False, arg_group='Identity')
c.argument('user_assigned_identities', type=validate_file_or_dict, help='The list of user identities '
'associated with the resource. The user identity dictionary key references will be ARM resource ids '
'in the form: \'/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microso'
'ft.ManagedIdentity/userAssignedIdentities/{identityName}\'. Expected value: '
'json-string/json-file/@json-file.', arg_group='Identity')
with self.argument_context('fidalgo admin dev-center delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', options_list=['--name', '-n', '--dev-center-name'], type=str, help='The name of '
'the devcenter.', id_part='name')
with self.argument_context('fidalgo admin dev-center attach-network') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', options_list=['--name', '-n', '--dev-center-name'], type=str, help='The name of '
'the devcenter.', id_part='name')
c.argument('network_connection_id', type=str, help='Resource id of a Network Settings resource')
with self.argument_context('fidalgo admin dev-center detach-network') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', options_list=['--name', '-n', '--dev-center-name'], type=str, help='The name of '
'the devcenter.', id_part='name')
c.argument('network_connection_id', type=str, help='Resource id of a Network Settings resource')
with self.argument_context('fidalgo admin dev-center wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', options_list=['--name', '-n', '--dev-center-name'], type=str, help='The name of '
'the devcenter.', id_part='name')
with self.argument_context('fidalgo admin project list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin project show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', options_list=['--name', '-n', '--project-name'], type=str, help='The name of the '
'project.', id_part='name')
with self.argument_context('fidalgo admin project create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', options_list=['--name', '-n', '--project-name'], type=str, help='The name of the '
'project.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('dev_center_id', type=str, help='Resource Id of an associated DevCenter')
c.argument('description', type=str, help='Description of the project.')
with self.argument_context('fidalgo admin project update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', options_list=['--name', '-n', '--project-name'], type=str, help='The name of the '
'project.', id_part='name')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('dev_center_id', type=str, help='Resource Id of an associated DevCenter')
c.argument('description', type=str, help='Description of the project.')
with self.argument_context('fidalgo admin project delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', options_list=['--name', '-n', '--project-name'], type=str, help='The name of the '
'project.', id_part='name')
with self.argument_context('fidalgo admin project wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', options_list=['--name', '-n', '--project-name'], type=str, help='The name of the '
'project.', id_part='name')
with self.argument_context('fidalgo admin attached-network list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
with self.argument_context('fidalgo admin attached-network show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('attached_network_connection_name', options_list=['--name', '-n', '--attached-network-connection-name'], type=str, help='The name of the attached NetworkConnection.',
id_part='child_name_1')
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
with self.argument_context('fidalgo admin attached-network create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('attached_network_connection_name', options_list=['--name', '-n', '--attached-network-connection-name'], type=str, help='The name of the attached NetworkConnection.')
c.argument('network_connection_resource_id', type=str, help='The resource ID of the NetworkConnection you want '
'to attach.')
with self.argument_context('fidalgo admin attached-network update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('attached_network_connection_name', options_list=['--name', '-n', '--attached-network-connection-name'], type=str, help='The name of the attached NetworkConnection.',
id_part='child_name_1')
c.argument('network_connection_resource_id', type=str, help='The resource ID of the NetworkConnection you want '
'to attach.')
with self.argument_context('fidalgo admin attached-network delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('attached_network_connection_name', options_list=['--name', '-n', '--attached-network-connection-name'], type=str, help='The name of the attached NetworkConnection.',
id_part='child_name_1')
with self.argument_context('fidalgo admin attached-network wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('attached_network_connection_name', type=str, help='The name of the attached NetworkConnection.',
id_part='child_name_1')
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
with self.argument_context('fidalgo admin environment list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin environment show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.', id_part='child_name_1')
with self.argument_context('fidalgo admin environment create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('description', type=str, help='Description of the Environment.')
c.argument('catalog_item_name', type=str, help='Name of the catalog item.')
c.argument('template_uri', type=str, help='Uri of a template used to deploy resources to the environment.')
c.argument('deployment_parameters', type=validate_file_or_dict, help='Deployment parameters passed to catalog '
'item. Expected value: json-string/json-file/@json-file.')
c.argument('environment_type', type=str, help='Environment type.')
with self.argument_context('fidalgo admin environment update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.', id_part='child_name_1')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('description', type=str, help='Description of the Environment.')
c.argument('catalog_item_name', type=str, help='Name of the catalog item.')
c.argument('template_uri', type=str, help='Uri of a template used to deploy resources to the environment.')
c.argument('deployment_parameters', type=validate_file_or_dict, help='Deployment parameters passed to catalog '
'item. Expected value: json-string/json-file/@json-file.')
with self.argument_context('fidalgo admin environment delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.', id_part='child_name_1')
with self.argument_context('fidalgo admin environment deploy') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.', id_part='child_name_1')
c.argument('parameters', type=validate_file_or_dict, help='Deployment parameters passed to catalog item. '
'Expected value: json-string/json-file/@json-file.')
with self.argument_context('fidalgo admin environment wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('environment_name', options_list=['--name', '-n', '--environment-name'], type=str, help='The name '
'of the environment.', id_part='child_name_1')
with self.argument_context('fidalgo admin deployment list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.')
c.argument('environment_name', type=str, help='The name of the environment.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin environment-type list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
with self.argument_context('fidalgo admin environment-type show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('environment_type_name', options_list=['--name', '-n', '--environment-type-name'], type=str,
help='The name of the environment type.', id_part='child_name_1')
with self.argument_context('fidalgo admin environment-type create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('environment_type_name', options_list=['--name', '-n', '--environment-type-name'], type=str,
help='The name of the environment type.')
c.argument('tags', tags_type)
c.argument('description', type=str, help='Description of the environment type.')
with self.argument_context('fidalgo admin environment-type update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('environment_type_name', options_list=['--name', '-n', '--environment-type-name'], type=str,
help='The name of the environment type.', id_part='child_name_1')
c.argument('tags', tags_type)
c.argument('description', type=str, help='Description of the environment type.')
with self.argument_context('fidalgo admin environment-type delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('environment_type_name', options_list=['--name', '-n', '--environment-type-name'], type=str,
help='The name of the environment type.', id_part='child_name_1')
with self.argument_context('fidalgo admin environment-type wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('environment_type_name', options_list=['--name', '-n', '--environment-type-name'], type=str,
help='The name of the environment type.', id_part='child_name_1')
with self.argument_context('fidalgo admin catalog-item list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('catalog_name', type=str, help='The name of the Catalog.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
c.argument('project_name', type=str, help='The name of the project.')
with self.argument_context('fidalgo admin catalog-item show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', type=str, help='The name of the Catalog.', id_part='child_name_1')
c.argument('catalog_item_name', options_list=['--name', '-n', '--catalog-item-name'], type=str, help='The name '
'of the catalog item.', id_part='child_name_2')
with self.argument_context('fidalgo admin catalog-item create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('catalog_name', type=str, help='The name of the Catalog.')
c.argument('catalog_item_name', options_list=['--name', '-n', '--catalog-item-name'], type=str, help='The name '
'of the catalog item.')
c.argument('description', type=str, help='Description of the catalog item.')
c.argument('template_path', type=str, help='Path to the catalog item entrypoint file.', arg_group='Engine')
c.argument('parameters', action=AddParameters, nargs='+', help='Parameters that can be provided to the catalog '
'item.', arg_group='Engine')
with self.argument_context('fidalgo admin catalog-item update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', type=str, help='The name of the Catalog.', id_part='child_name_1')
c.argument('catalog_item_name', options_list=['--name', '-n', '--catalog-item-name'], type=str, help='The name '
'of the catalog item.', id_part='child_name_2')
c.argument('tags', tags_type)
c.argument('description', type=str, help='Description of the catalog item.')
with self.argument_context('fidalgo admin catalog-item delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', type=str, help='The name of the Catalog.', id_part='child_name_1')
c.argument('catalog_item_name', options_list=['--name', '-n', '--catalog-item-name'], type=str, help='The name '
'of the catalog item.', id_part='child_name_2')
with self.argument_context('fidalgo admin gallery list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin gallery show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('gallery_name', options_list=['--name', '-n', '--gallery-name'], type=str, help='The name of the '
'gallery.', id_part='child_name_1')
with self.argument_context('fidalgo admin gallery create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('gallery_name', options_list=['--name', '-n', '--gallery-name'], type=str, help='The name of the '
'gallery.')
c.argument('gallery_resource_id', type=str, help='The resource ID of the backing Azure Compute Gallery.')
with self.argument_context('fidalgo admin gallery update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('gallery_name', options_list=['--name', '-n', '--gallery-name'], type=str, help='The name of the '
'gallery.', id_part='child_name_1')
c.argument('gallery_resource_id', type=str, help='The resource ID of the backing Azure Compute Gallery.')
c.ignore('body')
with self.argument_context('fidalgo admin gallery delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('gallery_name', options_list=['--name', '-n', '--gallery-name'], type=str, help='The name of the '
'gallery.', id_part='child_name_1')
with self.argument_context('fidalgo admin gallery wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('gallery_name', options_list=['--name', '-n', '--gallery-name'], type=str, help='The name of the '
'gallery.', id_part='child_name_1')
with self.argument_context('fidalgo admin image list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('gallery_name', type=str, help='The name of the gallery.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin image show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('gallery_name', type=str, help='The name of the gallery.', id_part='child_name_1')
c.argument('image_name', options_list=['--name', '-n', '--image-name'], type=str,
help='The name of the image.', id_part='child_name_2')
with self.argument_context('fidalgo admin image-version list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('gallery_name', type=str, help='The name of the gallery.')
c.argument('image_name', type=str, help='The name of the image.')
with self.argument_context('fidalgo admin image-version show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('gallery_name', type=str, help='The name of the gallery.', id_part='child_name_1')
c.argument('image_name', type=str, help='The name of the image.', id_part='child_name_2')
c.argument('version_name', type=str, help='The version of the image.', id_part='child_name_3')
with self.argument_context('fidalgo admin catalog list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin catalog show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', options_list=['--name', '-n', '--catalog-name'], type=str, help='The name of the '
'Catalog.', id_part='child_name_1')
with self.argument_context('fidalgo admin catalog create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('catalog_name', options_list=['--name', '-n', '--catalog-name'], type=str, help='The name of the '
'Catalog.')
c.argument('git_hub', action=AddGitHub, nargs='+', help='Properties for a GitHub catalog type.')
c.argument('ado_git', action=AddGitHub, nargs='+', help='Properties for an Azure DevOps catalog type.')
with self.argument_context('fidalgo admin catalog update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', options_list=['--name', '-n', '--catalog-name'], type=str, help='The name of the '
'Catalog.', id_part='child_name_1')
c.argument('tags', tags_type)
c.argument('git_hub', action=AddGitHub, nargs='+', help='Properties for a GitHub catalog type.')
c.argument('ado_git', action=AddGitHub, nargs='+', help='Properties for an Azure DevOps catalog type.')
with self.argument_context('fidalgo admin catalog delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', options_list=['--name', '-n', '--catalog-name'], type=str, help='The name of the '
'Catalog.', id_part='child_name_1')
with self.argument_context('fidalgo admin catalog sync') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', options_list=['--name', '-n', '--catalog-name'], type=str, help='The name of the '
'Catalog.', id_part='child_name_1')
with self.argument_context('fidalgo admin catalog wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('catalog_name', options_list=['--name', '-n', '--catalog-name'], type=str, help='The name of the '
'Catalog.', id_part='child_name_1')
with self.argument_context('fidalgo admin mapping list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin mapping show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('mapping_name', options_list=['--name', '-n', '--mapping-name'], type=str, help='Mapping name.',
id_part='child_name_1')
with self.argument_context('fidalgo admin mapping create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('mapping_name', options_list=['--name', '-n', '--mapping-name'], type=str, help='Mapping name.')
c.argument('mapped_subscription_id', type=str, help='Id of a subscription that the environment type will be '
'mapped to. The environment\'s resources will be deployed into this subscription.')
c.argument('environment_type', type=str, help='Environment type (e.g. Dev/Test)')
c.argument('project_id', type=str, help='Resource Id of a project that this mapping is associated with.')
with self.argument_context('fidalgo admin mapping update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('mapping_name', options_list=['--name', '-n', '--mapping-name'], type=str, help='Mapping name.',
id_part='child_name_1')
c.argument('mapped_subscription_id', type=str, help='Id of a subscription that the environment type will be '
'mapped to. The environment\'s resources will be deployed into this subscription.')
with self.argument_context('fidalgo admin devbox-definition list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin devbox-definition show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('dev_box_definition_name', options_list=['--name', '-n', '--devbox-definition-name'], type=str,
help='The name of the Dev Box definition.', id_part='child_name_1')
with self.argument_context('fidalgo admin devbox-definition create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.')
c.argument('dev_box_definition_name', options_list=['--name', '-n', '--devbox-definition-name'], type=str, help='The name of the Dev Box definition.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('image_reference', action=AddImageReference, nargs='+', help='Image reference information.')
c.argument('sku_name', type=str, help='The name of the SKU.', arg_group='Sku')
with self.argument_context('fidalgo admin devbox-definition update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('dev_box_definition_name', options_list=['--name', '-n', '--devbox-definition-name'], type=str, help='The name of the Dev Box definition.',
id_part='child_name_1')
c.argument('tags', tags_type)
c.argument('image_reference', action=AddImageReference, nargs='+', help='Image reference information.')
c.argument('sku_name', type=str, help='The name of the SKU.', arg_group='Sku')
with self.argument_context('fidalgo admin devbox-definition delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('dev_box_definition_name', options_list=['--name', '-n', '--devbox-definition-name'], type=str,
help='The name of the Dev Box definition.', id_part='child_name_1')
with self.argument_context('fidalgo admin devbox-definition wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('dev_box_definition_name', options_list=['--name', '-n', '--devbox-definition-name'], type=str,
help='The name of the Dev Box definition.', id_part='child_name_1')
with self.argument_context('fidalgo admin mapping delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('dev_center_name', type=str, help='The name of the devcenter.', id_part='name')
c.argument('mapping_name', options_list=['--name', '-n', '--mapping-name'], type=str, help='Mapping name.',
id_part='child_name_1')
with self.argument_context('fidalgo admin operation-statuses show') as c:
c.argument('location', arg_type=get_location_type(self.cli_ctx))
c.argument('operation_id', type=str, help='The ID of an ongoing async operation')
with self.argument_context('fidalgo admin sku list') as c:
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin pool list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.')
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin pool show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('pool_name', options_list=['--name', '-n', '--pool-name'], type=str, help='Name of the pool.',
id_part='child_name_1')
with self.argument_context('fidalgo admin pool create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.')
c.argument('pool_name', options_list=['--name', '-n', '--pool-name'], type=str, help='Name of the pool.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
# c.argument('machine_definition_id', type=str, help='Resource Id of a Machine Definition')
c.argument('dev_box_definition_name', options_list=['--devbox-definition-name'], type=str, help='Name of a Dev Box definition in the parent Project of this Pool.')
# c.argument('network_settings_id', type=str, help='Resource Id of a Network Settings resource')
c.argument('network_connection_name', type=str, help='Name of a Network Connection in the parent Project of this Pool.')
# c.argument('sku_name', type=str, required=False, help='The name of the SKU - this is optional and can be used to override the SKU defined in the Dev Box Definition. (not currently used)', arg_group='Sku')
with self.argument_context('fidalgo admin pool update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('pool_name', options_list=['--name', '-n', '--pool-name'], type=str, help='Name of the pool.',
id_part='child_name_1')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
# c.argument('machine_definition_id', type=str, help='Resource Id of a Machine Definition')
c.argument('dev_box_definition_name', options_list=['--devbox-definition-name'], type=str, help='Name of a Dev Box definition in the parent Project of this Pool.')
# c.argument('network_settings_id', type=str, help='Resource Id of a Network Settings resource')
c.argument('network_connection_name', type=str, help='Name of a Network Connection in the parent Project of this Pool.')
# c.argument('sku_name', type=str, help='The name of the SKU - this is optional and can be used to override the SKU defined in the Dev Box Definition. (not currently used)', arg_group='Sku')
with self.argument_context('fidalgo admin pool delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('pool_name', options_list=['--name', '-n', '--pool-name'], type=str, help='Name of the pool.',
id_part='child_name_1')
with self.argument_context('fidalgo admin pool wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('project_name', type=str, help='The name of the project.', id_part='name')
c.argument('pool_name', options_list=['--name', '-n', '--pool-name'], type=str, help='Name of the pool.',
id_part='child_name_1')
with self.argument_context('fidalgo admin machine-definition list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin machine-definition show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('machine_definition_name', options_list=['--name', '-n', '--machine-definition-name'], type=str,
help='The name of the machine definition.', id_part='name')
with self.argument_context('fidalgo admin machine-definition create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('machine_definition_name', options_list=['--name', '-n', '--machine-definition-name'], type=str,
help='The name of the machine definition.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('image_reference', action=AddImageReference, nargs='+', help='Image reference information.')
with self.argument_context('fidalgo admin machine-definition update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('machine_definition_name', options_list=['--name', '-n', '--machine-definition-name'], type=str,
help='The name of the machine definition.', id_part='name')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('image_reference', action=AddImageReference, nargs='+', help='Image reference information.')
with self.argument_context('fidalgo admin machine-definition delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('machine_definition_name', options_list=['--name', '-n', '--machine-definition-name'], type=str,
help='The name of the machine definition.', id_part='name')
with self.argument_context('fidalgo admin machine-definition wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('machine_definition_name', options_list=['--name', '-n', '--machine-definition-name'], type=str,
help='The name of the machine definition.', id_part='name')
with self.argument_context('fidalgo admin network-setting list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('top', type=int, help='The maximum number of resources to return from the operation. Example: '
'\'$top=10\'.')
with self.argument_context('fidalgo admin network-setting show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('network_setting_name', options_list=['--name', '-n', '--network-setting-name'], type=str,
help='Name of the Network Settings that can be applied to a Pool.', id_part='name')
with self.argument_context('fidalgo admin network-setting create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('network_setting_name', options_list=['--name', '-n', '--network-setting-name'], type=str,
help='Name of the Network Settings that can be applied to a Pool.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('subnet_id', type=str, help='The subnet to attach Virtual Machines to')
c.argument('networking_resource_group_id', type=str, help='[deprecated] Target resource group id for NICs to be placed. '
'Required format: \'/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}\'')
c.argument('domain_name', type=str, help='Active Directory domain name')
c.argument('organization_unit', type=str, help='Active Directory domain Organization Unit (OU)')
c.argument('domain_username', type=str, help='The username of an Active Directory account (user or service '
'account) that has permissions to create computer objects in Active Directory. Required format: '
'admin@contoso.com.')
c.argument('domain_password', type=str, help='The password for the account used to join domain')
c.argument('networking_resource_group_name', type=str, help='The name for the managed resource group where NICs will be '
'placed.')
c.argument('domain_join_type', arg_type=get_enum_type(['HybridAzureADJoin', 'AzureADJoin']), help='AAD Join '
'type.')
with self.argument_context('fidalgo admin network-setting update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('network_setting_name', options_list=['--name', '-n', '--network-setting-name'], type=str,
help='Name of the Network Settings that can be applied to a Pool.', id_part='name')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('subnet_id', type=str, help='The subnet to attach Virtual Machines to')
c.argument('networking_resource_group_id', type=str, help='[deprecated] Target resource group id for NICs to be placed. '
'Required format: \'/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}\'')
c.argument('domain_name', type=str, help='Active Directory domain name')
c.argument('organization_unit', type=str, help='Active Directory domain Organization Unit (OU)')
c.argument('domain_username', type=str, help='The username of an Active Directory account (user or service '
'account) that has permissions to create computer objects in Active Directory. Required format: '
'admin@contoso.com.')
c.argument('domain_password', type=str, help='The password for the account used to join domain')
with self.argument_context('fidalgo admin network-setting delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('network_setting_name', options_list=['--name', '-n', '--network-setting-name'], type=str,
help='Name of the Network Settings that can be applied to a Pool.', id_part='name')
with self.argument_context('fidalgo admin network-setting show-health-detail') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('network_setting_name', options_list=['--name', '-n', '--network-setting-name'], type=str,
help='Name of the Network Settings that can be applied to a Pool.', id_part='name')
with self.argument_context('fidalgo admin network-setting wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('network_setting_name', options_list=['--name', '-n', '--network-setting-name'], type=str,
help='Name of the Network Settings that can be applied to a Pool.', id_part='name')
####################################################################################################
### ###
### Functions for database handling (fetching, ...) ###
### Author: Manuel Cordova (EPFL) ###
### Last modified: 03.09.2021 ###
### ###
####################################################################################################
# Import libraries
import numpy as np
import os
import subprocess as sbp
import time
# Import local libraries
import graph as gr
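# Illustration only (not part of the module; all names hypothetical): the search
# in fetch_entries() below repeatedly truncates the "arborescence" (the list of
# per-depth graph hashes) until at least N_min database rows match the joined
# hash prefix. This standalone sketch mimics that back-off with plain strings,
# standing in for the grep call against the CSV database.

```python
def match_with_backoff(arb, rows, n_min):
    """Return (depth, matching rows), truncating `arb` until >= n_min rows match.

    `arb` is a list of per-depth hash strings; `rows` are CSV-like lines whose
    leading columns are the joined arborescence. Falls back to all rows at depth 0.
    """
    while arb:
        key = ",".join(arb)
        # Keep only the database rows whose hash columns start with this prefix
        matches = [r for r in rows if r.startswith(key)]
        if len(matches) >= n_min:
            return len(arb), matches
        arb = arb[:-1]  # reduce the graph depth by one and retry
    return 0, rows
```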
def fetch_entries(db_root, elem, atoms, envs, Gs, max_w, N_min=10, nei_elem=None, exclude=None, verbose=False):
"""
Find the database entries corresponding to each graph, with a minimum number of instances.
Also retrieve the crystal identifier and index of the atom associated with each database entry.
Inputs: - db_root Root directory of the database
- elem Element of the central nodes of the graphs
- atoms List of atoms in the molecule
- envs Environment of each graph (first coordination shell)
- Gs List of graphs to fetch the database for
- max_w Maximum depth
- N_min Minimum number of entries in the database required
- nei_elem "None" if we only want to retrieve the shifts of the central atom,
otherwise element of the neighbour to extract shift distributions from
- exclude List of crystal identifiers to exclude
- verbose Whether additional information about the search should be printed
Outputs: - all_shifts List of predicted shifts for each graph
- all_errs List of prediction errors for each graph
- ws List of maximum depth for each graph
- labels List of graph labels
- all_crysts List of the crystals corresponding to the shifts extracted
- all_inds List of the indices of the atoms corresponding to the shifts extracted
- hashes List of hashes identifying the graphs
"""
# Initialize arrays
all_shifts = []
all_errs = []
ws = []
labels = []
all_crysts = []
all_inds = []
hashes = []
# Get the directory
db_dir = db_root + elem + "/"
# Loop over each graph
for i, (G, env) in enumerate(zip(Gs, envs)):
start = time.time()
# Get the number of neighbouring elements in the environment
num_nei = 0
if nei_elem is not None:
nei_elems = env.split("-")
num_nei = nei_elems.count(nei_elem)
# Get the directory
db_dir = db_root + elem + "-" + nei_elem + "/"
# Check if database directory exists
if not os.path.exists(db_dir):
raise ValueError("Directory does not exist: {}".format(db_dir))
# If there are neighbours that correspond to the element, extract the 2D shifts
if num_nei > 0:
if not os.path.exists(db_dir + env + ".csv"):
raise ValueError("File does not exist: {}".format(db_dir + env + ".csv"))
# Loop over all neighbours
for j in range(1, len(nei_elems)+1):
if G.nodes[j]["elem"] == nei_elem:
this_w = max_w
# Generate arborescence (array of hashes)
arb = []
for w in range(2, max_w+1):
cut_G = gr.cut_graph(G, w)
cut_G.nodes[j]["elem"] = "Z"
arb.append(gr.generate_hash(cut_G))
# If the arborescence was already found before, get the corresponding shifts directly
if ",".join(arb) in hashes:
h_ind = hashes.index(",".join(arb))
hashes.append(",".join(arb))
this_w = ws[h_ind]
# Append the array of shifts and errors for this distribution
all_shifts.append(all_shifts[h_ind])
all_errs.append(all_errs[h_ind])
ws.append(this_w)
labels.append("{}{}-{}{}".format(elem, i+1, nei_elem, atoms[:G.nodes[j]["ind"]].count(nei_elem)+1))
all_inds.append(all_inds[h_ind])
all_crysts.append(all_crysts[h_ind])
# Otherwise, search through the database
else:
hashes.append(",".join(arb))
# Initialize array of shifts and errors
shifts = []
errs = []
inds = []
crysts = []
# Get the entries of the corresponding graph
p = sbp.Popen(["grep", ",".join(arb), db_dir + env + ".csv"], stdout=sbp.PIPE)
out, err = p.communicate()
out = out.decode("UTF-8")
for l in out.split("\n"):
if len(l) > 0:
tmp = l.split(",")
if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
crysts.append(tmp[0])
inds.append([int(tmp[1]), int(tmp[4])])
shifts.append([float(tmp[2]), float(tmp[5])])
errs.append([float(tmp[3]), float(tmp[6])])
# If there is not enough entries, reduce the depth and try again
while len(shifts) < N_min:
if verbose:
print(" w = {}: {} instances are not enough, reducing graph depth...".format(this_w, len(shifts)))
shifts = []
errs = []
inds = []
crysts = []
# Update the depth and the corresponding arborescence
this_w -= 1
arb = arb[:-1]
# Get the entries of the corresponding graph
p = sbp.Popen(["grep", ",".join(arb), db_dir + env + ".csv"], stdout=sbp.PIPE)
out, err = p.communicate()
out = out.decode("UTF-8")
for l in out.split("\n"):
if len(l) > 0:
tmp = l.split(",")
if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
shifts.append([float(tmp[2]), float(tmp[5])])
errs.append([float(tmp[3]), float(tmp[6])])
inds.append([int(tmp[1]), int(tmp[4])])
crysts.append(tmp[0])
# Append the array of shifts and errors for this distribution
all_shifts.append(np.array(shifts))
all_errs.append(np.array(errs))
ws.append(this_w)
labels.append("{}{}-{}{}".format(elem, i+1, nei_elem, atoms[:G.nodes[j]["ind"]].count(nei_elem)+1))
all_inds.append(inds)
all_crysts.append(crysts)
stop = time.time()
print(" Graph {}/{} found. w = {}, {} instances. Time elapsed: {:.2f} s".format(i+1, len(Gs), this_w, len(all_shifts[-1]), stop-start))
# If the neighbouring element is not set, extract the 1D shifts
elif nei_elem is None:
this_w = max_w
# Generate arborescence (array of hashes)
arb = []
for w in range(2, max_w+1):
cut_G = gr.cut_graph(G, w)
arb.append(gr.generate_hash(cut_G))
# If the arborescence was already found, reuse the previously extracted shifts to save time
if ",".join(arb) in hashes:
h_ind = hashes.index(",".join(arb))
hashes.append(",".join(arb))
this_w = ws[h_ind]
# Append the array of shifts and errors for this distribution
all_shifts.append(all_shifts[h_ind])
all_errs.append(all_errs[h_ind])
ws.append(this_w)
labels.append("{}{}".format(elem, i+1))
all_inds.append(all_inds[h_ind])
all_crysts.append(all_crysts[h_ind])
else:
hashes.append(",".join(arb))
# Initialize array of shifts, errors, crystal structures and atomic indices
shifts = []
errs = []
crysts = []
inds = []
# Get the entries of the corresponding graph
p = sbp.Popen(["grep", ",".join(arb), db_dir + env + ".csv"], stdout=sbp.PIPE)
out, err = p.communicate()
out = out.decode("UTF-8")
for l in out.split("\n"):
if len(l) > 0:
tmp = l.split(",")
if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
crysts.append(tmp[0])
inds.append(int(tmp[1]))
shifts.append(float(tmp[2]))
errs.append(float(tmp[3]))
# If there is not enough entries, reduce the depth and try again
while len(shifts) < N_min:
if verbose:
print(" w = {}: {} instances are not enough, reducing graph depth...".format(this_w, len(shifts)))
shifts = []
errs = []
inds = []
crysts = []
# Reduce the depth and update the arborescence
this_w -= 1
arb = arb[:-1]
# Get the entries of the corresponding graph
p = sbp.Popen(["grep", ",".join(arb), db_dir + env + ".csv"], stdout=sbp.PIPE)
out, err = p.communicate()
out = out.decode("UTF-8")
for l in out.split("\n"):
if len(l) > 0:
tmp = l.split(",")
if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
crysts.append(tmp[0])
inds.append(int(tmp[1]))
shifts.append(float(tmp[2]))
errs.append(float(tmp[3]))
# Append the array of shifts and error for this distribution
all_shifts.append(np.array(shifts))
all_errs.append(np.array(errs))
ws.append(this_w)
labels.append("{}{}".format(elem, i+1))
all_inds.append(inds)
all_crysts.append(crysts)
stop = time.time()
print(" Graph {}/{} found. w = {}, {} instances. Time elapsed: {:.2f} s".format(i+1, len(Gs), this_w, len(all_shifts[-1]), stop-start))
else:
print(" Graph {}/{} has no neighbouring {}.".format(i+1, len(Gs), nei_elem))
return all_shifts, all_errs, ws, labels, all_crysts, all_inds, hashes
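# Illustration only (not part of the module; names hypothetical): each line of
# grep output in fetch_entries() above is split on commas into a crystal
# identifier, atom index, shift and error, skipping excluded crystals and the
# CSV header row. This standalone sketch mirrors that parsing for the 1D case.

```python
def parse_1d_rows(out, exclude=None):
    """Parse grep output into parallel lists of crystals, indices, shifts, errors."""
    crysts, inds, shifts, errs = [], [], [], []
    for l in out.split("\n"):
        if len(l) == 0:
            continue
        tmp = l.split(",")
        # Skip the header row and any explicitly excluded crystal structures
        if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
            crysts.append(tmp[0])
            inds.append(int(tmp[1]))
            shifts.append(float(tmp[2]))
            errs.append(float(tmp[3]))
    return crysts, inds, shifts, errs
```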
def fetch_entries_from_hashes(db_root, elem, envs, Hs, max_w, N_min=10, nei_elem=None, exclude=None, verbose=False):
"""
Find the database entries corresponding to each graph, with a minimum number of instances.
Also retrieve the crystal identifier and index of the atom associated with each database entry.
Inputs: - db_root Root directory of the database
- elem Element of the central nodes of the graphs
- envs Environment of each graph (first coordination shell)
- Hs List of graph hashes to fetch the database for
- max_w Maximum depth
- N_min Minimum number of entries for each graph
- nei_elem "None" if we only want to retrieve the shifts of the central atom,
otherwise element of the neighbour to extract shift distributions from
- exclude List of crystal identifiers to exclude
- verbose Whether additional information about the search should be printed
Outputs: - all_shifts List of predicted shifts for each graph
- all_errs List of prediction errors for each graph
- ws List of maximum depth for each graph
- all_crysts List of the crystals corresponding to the shifts extracted
- all_inds List of the indices of the atoms corresponding to the shifts extracted
"""
    # Initialize arrays
    all_shifts = []
    all_errs = []
    ws = []
    labels = []
    all_crysts = []
    all_inds = []
    hashes = []
    # Get the directory
    elem_dir = db_root + elem + "/"
    # Loop over each graph
    for i, (H, env) in enumerate(zip(Hs, envs)):
        start = time.time()
        # Get the number of neighbouring elements in the environment
        num_nei = 0
        if nei_elem is not None:
            nei_elems = env.split("-")
            num_nei = nei_elems.count(nei_elem)
            # Get the directory
            db_dir = db_root + elem + "-" + nei_elem + "/"
            if not os.path.exists(db_dir):
                raise ValueError("Directory does not exist: {}".format(db_dir))
        # If there are neighbours that correspond to the element, extract the 2D shifts
        if num_nei > 0:
            if not os.path.exists(db_dir + env + ".csv"):
                raise ValueError("File does not exist: {}".format(db_dir + env + ".csv"))
            this_w = max_w
            arb = H.split(",")
            # If the arborescence was already found before, get the corresponding shifts directly
            if ",".join(arb) in hashes:
                h_ind = hashes.index(",".join(arb))
                hashes.append(",".join(arb))
                this_w = ws[h_ind]
                # Append the array of shifts and errors for this distribution
                all_shifts.append(all_shifts[h_ind])
                all_errs.append(all_errs[h_ind])
                ws.append(this_w)
                labels.append("{}{}-{}{}".format(elem, i+1, nei_elem, 0))
                all_inds.append(all_inds[h_ind])
                all_crysts.append(all_crysts[h_ind])
            # Otherwise, search through the database
            else:
                hashes.append(",".join(arb))
                # Initialize arrays of shifts and errors
                shifts = []
                errs = []
                inds = []
                crysts = []
                # Get the entries of the corresponding graph
                p = sbp.Popen(["grep", ",".join(arb), db_dir + env + ".csv"], stdout=sbp.PIPE)
                out, err = p.communicate()
                out = out.decode("UTF-8")
                for l in out.split("\n"):
                    if len(l) > 0:
                        tmp = l.split(",")
                        if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
                            crysts.append(tmp[0])
                            inds.append([int(tmp[1]), int(tmp[4])])
                            shifts.append([float(tmp[2]), float(tmp[5])])
                            errs.append([float(tmp[3]), float(tmp[6])])
                # If there are not enough entries, reduce the depth and try again
                while len(shifts) < N_min:
                    if verbose:
                        print(" w = {}: {} instances are not enough, reducing graph depth...".format(this_w, len(shifts)))
                    shifts = []
                    errs = []
                    inds = []
                    crysts = []
                    # Update the depth and the corresponding arborescence
                    this_w -= 1
                    arb = arb[:-1]
                    # Get the entries of the corresponding graph
                    p = sbp.Popen(["grep", ",".join(arb), db_dir + env + ".csv"], stdout=sbp.PIPE)
                    out, err = p.communicate()
                    out = out.decode("UTF-8")
                    for l in out.split("\n"):
                        if len(l) > 0:
                            tmp = l.split(",")
                            if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
                                crysts.append(tmp[0])
                                inds.append([int(tmp[1]), int(tmp[4])])
                                shifts.append([float(tmp[2]), float(tmp[5])])
                                errs.append([float(tmp[3]), float(tmp[6])])
                # Append the array of shifts and errors for this distribution
                all_shifts.append(np.array(shifts))
                all_errs.append(np.array(errs))
                ws.append(this_w)
                labels.append("{}{}-{}{}".format(elem, i+1, nei_elem, 0))
                all_inds.append(inds)
                all_crysts.append(crysts)
            stop = time.time()
            print(" Graph {}/{} found. w = {}, {} instances. Time elapsed: {:.2f} s".format(i+1, len(Hs), this_w, len(all_shifts[-1]), stop-start))
        # If the neighbouring element is not set, extract the 1D shifts
        elif nei_elem is None:
            this_w = max_w
            # Generate arborescence (array of hashes)
            arb = H.split(",")
            if ",".join(arb) in hashes:
                h_ind = hashes.index(",".join(arb))
                hashes.append(",".join(arb))
                this_w = ws[h_ind]
                # Append the array of shifts and errors for this distribution
                all_shifts.append(all_shifts[h_ind])
                all_errs.append(all_errs[h_ind])
                ws.append(this_w)
                labels.append("{}{}".format(elem, i+1))
                all_inds.append(all_inds[h_ind])
                all_crysts.append(all_crysts[h_ind])
            else:
                hashes.append(",".join(arb))
                # Initialize arrays of shifts and errors
                shifts = []
                errs = []
                inds = []
                crysts = []
                # Get the entries of the corresponding graph
                p = sbp.Popen(["grep", ",".join(arb), elem_dir + env + ".csv"], stdout=sbp.PIPE)
                out, err = p.communicate()
                out = out.decode("UTF-8")
                for l in out.split("\n"):
                    if len(l) > 0:
                        tmp = l.split(",")
                        if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
                            crysts.append(tmp[0])
                            inds.append(int(tmp[1]))
                            shifts.append(float(tmp[2]))
                            errs.append(float(tmp[3]))
                # If there are not enough entries, reduce the depth and try again
                while len(shifts) < N_min:
                    if verbose:
                        print(" w = {}: {} instances are not enough, reducing graph depth...".format(this_w, len(shifts)))
                    shifts = []
                    errs = []
                    inds = []
                    crysts = []
                    # Update the depth and the corresponding arborescence
                    this_w -= 1
                    arb = arb[:-1]
                    # Get the entries of the corresponding graph
                    p = sbp.Popen(["grep", ",".join(arb), elem_dir + env + ".csv"], stdout=sbp.PIPE)
                    out, err = p.communicate()
                    out = out.decode("UTF-8")
                    for l in out.split("\n"):
                        if len(l) > 0:
                            tmp = l.split(",")
                            if (exclude is None or tmp[0] not in exclude) and tmp[0] != "crystal":
                                crysts.append(tmp[0])
                                inds.append(int(tmp[1]))
                                shifts.append(float(tmp[2]))
                                errs.append(float(tmp[3]))
                # Append the array of shifts and errors for this distribution
                all_shifts.append(np.array(shifts))
                all_errs.append(np.array(errs))
                ws.append(this_w)
                labels.append("{}{}".format(elem, i+1))
                all_inds.append(inds)
                all_crysts.append(crysts)
            stop = time.time()
            print(" Graph {}/{} found. w = {}, {} instances. Time elapsed: {:.2f} s".format(i+1, len(Hs), this_w, len(all_shifts[-1]), stop-start))
        else:
            print(" Graph {}/{} has no neighbouring {}.".format(i+1, len(Hs), nei_elem))
    return all_shifts, all_errs, ws, labels, all_crysts, all_inds
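# The depth-reduction retry used by both functions above (shrink the
# arborescence until at least N_min rows match) can be isolated in a small
# sketch. Everything below is a hypothetical illustration, not part of the
# original module: `entries_by_depth` stands in for the grep results at
# each graph depth.

```python
def entries_with_min_instances(entries_by_depth, max_w, n_min):
    """Mimic the `while len(shifts) < N_min` loops above: start at depth
    max_w and reduce the depth until at least n_min matching entries are
    found (or depth 1 is reached)."""
    w = max_w
    entries = entries_by_depth.get(w, [])
    while len(entries) < n_min and w > 1:
        w -= 1  # reduce the graph depth, like `this_w -= 1; arb = arb[:-1]`
        entries = entries_by_depth.get(w, [])
    return w, entries
```

With fewer matches at larger depths, the search falls back to a shallower graph, which trades specificity of the matched environment for statistics.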
# --- pythonmomo/start_momo.py (repo: pythonmomo/pythonmomo, license: MIT) ---
import pythonmomo


def return_name():
    return pythonmomo.name
# --- tools/mo/unit_tests/mo/front/HSigmoid_fusion_test.py (repo: ryanloney/openvino-1, license: Apache-2.0) ---
# Copyright (C) 2018-2022 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

import unittest

from openvino.tools.mo.front.HSigmoid_fusion import HSigmoidWithClamp, HSigmoidWithMinMax, HSigmoidWithReluDiv, \
    HSigmoidWithReluMul
from openvino.tools.mo.front.common.partial_infer.utils import float_array
from openvino.tools.mo.utils.ir_engine.compare_graphs import compare_graphs
from unit_tests.utils.graph import build_graph, const, regular_op, result, build_graph_with_edge_attrs

ref_nodes = {**regular_op('input', {'type': 'Parameter'}),
             **regular_op('hsigmoid', {'type': 'HSigmoid', 'name': 'final_mul'}),
             **result('result')
             }
ref_edges = [('input', 'hsigmoid'), ('hsigmoid', 'result')]
class HSigmoidWithClampTest(unittest.TestCase):
    nodes = {
        **regular_op('input', {'type': 'Parameter'}),
        **regular_op('add', {'op': 'Add'}),
        **regular_op('relu6', {'op': 'Clamp'}),
        **regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
        **const('const_0', float_array([0.0])),
        **const('const_3', float_array([3.0])),
        **const('const_6', float_array([6.0])),
        **const('const_1_6', float_array([1.0 / 6.0])),
        **result('result'),
    }

    edges = [('input', 'add', {'in': 0, 'out': 0}),
             ('const_3', 'add', {'in': 1, 'out': 0}),
             ('add', 'relu6', {'in': 0, 'out': 0}),
             ('const_0', 'relu6', {'in': 1, 'out': 0}),
             ('const_6', 'relu6', {'in': 2, 'out': 0}),
             ('relu6', 'mul_2', {'in': 1, 'out': 0}),
             ('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
             ('mul_2', 'result', {'in': 0, 'out': 0})]

    def test_hsigmoid_with_clamp(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {})
        graph_ref = build_graph(ref_nodes, ref_edges)
        graph.stage = 'front'

        HSigmoidWithClamp().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
        self.assertTrue(len(graph.get_op_nodes(name='final_mul')) == 1 and
                        graph.get_op_nodes(name='final_mul')[0].op == 'HSigmoid')

    def test_hsigmoid_with_clamp_wrong_constant(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {'const_0': {'value': float_array([0.00001])}})
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithClamp().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)

    def test_hsigmoid_with_clamp_different_tensors(self):
        graph = build_graph_with_edge_attrs({
            **regular_op('input', {'type': 'Parameter'}),
            **regular_op('input_2', {'type': 'Parameter'}),
            **regular_op('add', {'op': 'Add'}),
            **regular_op('relu6', {'op': 'Clamp'}),
            **regular_op('mul', {'op': 'Mul'}),
            **regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
            **const('const_0', float_array([0.0])),
            **const('const_3', float_array([3.0])),
            **const('const_6', float_array([6.0])),
            **const('const_1_6', float_array([1.0 / 6.0])),
            **result('result'),
        }, [('input', 'mul', {'in': 0, 'out': 0}),
            ('input_2', 'add', {'in': 0, 'out': 0}),
            ('const_3', 'add', {'in': 1, 'out': 0}),
            ('add', 'relu6', {'in': 0, 'out': 0}),
            ('const_0', 'relu6', {'in': 1, 'out': 0}),
            ('const_6', 'relu6', {'in': 2, 'out': 0}),
            ('relu6', 'mul', {'in': 1, 'out': 0}),
            ('mul', 'mul_2', {'in': 0, 'out': 0}),
            ('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
            ('mul_2', 'result', {'in': 0, 'out': 0})])
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithClamp().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
class HSigmoidWithMinMaxTest(unittest.TestCase):
    nodes = {
        **regular_op('input', {'type': 'Parameter'}),
        **regular_op('add', {'op': 'Add'}),
        **regular_op('max', {'op': 'Maximum'}),
        **regular_op('min', {'op': 'Minimum'}),
        **regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
        **const('const_0', float_array([0.0])),
        **const('const_3', float_array([3.0])),
        **const('const_6', float_array([6.0])),
        **const('const_1_6', float_array([1.0 / 6.0])),
        **result('result'),
    }

    edges = [('input', 'add', {'in': 0, 'out': 0}),
             ('const_3', 'add', {'in': 1, 'out': 0}),
             ('add', 'max', {'in': 0, 'out': 0}),
             ('const_0', 'max', {'in': 1, 'out': 0}),
             ('max', 'min', {'in': 0, 'out': 0}),
             ('const_6', 'min', {'in': 1, 'out': 0}),
             ('min', 'mul_2', {'in': 0, 'out': 0}),
             ('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
             ('mul_2', 'result', {'in': 0, 'out': 0})]

    def test_hsigmoid_with_min_max(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {})
        graph_ref = build_graph(ref_nodes, ref_edges)
        graph.stage = 'front'

        HSigmoidWithMinMax().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
        self.assertTrue(len(graph.get_op_nodes(name='final_mul')) == 1 and
                        graph.get_op_nodes(name='final_mul')[0].op == 'HSigmoid')

    def test_hsigmoid_with_min_max_wrong_constant(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {'const_0': {'value': float_array([0.00001])}})
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithMinMax().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)

    def test_hsigmoid_with_min_max_different_tensors(self):
        graph = build_graph_with_edge_attrs({
            **regular_op('input', {'type': 'Parameter'}),
            **regular_op('input_2', {'type': 'Parameter'}),
            **regular_op('add', {'op': 'Add'}),
            **regular_op('max', {'op': 'Maximum'}),
            **regular_op('min', {'op': 'Minimum'}),
            **regular_op('mul', {'op': 'Mul'}),
            **regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
            **const('const_0', float_array([0.0])),
            **const('const_3', float_array([3.0])),
            **const('const_6', float_array([6.0])),
            **const('const_1_6', float_array([1.0 / 6.0])),
            **result('result'),
        }, [('input_2', 'mul', {'in': 1, 'out': 0}),
            ('input', 'add', {'in': 0, 'out': 0}),
            ('const_3', 'add', {'in': 1, 'out': 0}),
            ('add', 'max', {'in': 0, 'out': 0}),
            ('const_0', 'max', {'in': 1, 'out': 0}),
            ('max', 'min', {'in': 0, 'out': 0}),
            ('const_6', 'min', {'in': 1, 'out': 0}),
            ('min', 'mul', {'in': 0, 'out': 0}),
            ('mul', 'mul_2', {'in': 0, 'out': 0}),
            ('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
            ('mul_2', 'result', {'in': 0, 'out': 0})])
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithMinMax().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
class HSigmoidWithReluDivTest(unittest.TestCase):
    nodes = {
        **regular_op('input', {'type': 'Parameter'}),
        **regular_op('add', {'op': 'Add'}),
        **regular_op('relu', {'op': 'ReLU'}),
        **regular_op('min', {'op': 'Minimum'}),
        **regular_op('div', {'op': 'Div', 'name': 'final_div'}),
        **const('add_const', float_array([3.0])),
        **const('min_const', float_array([6.0])),
        **const('div_const', float_array([6.0])),
        **result('result'),
    }

    edges = [('input', 'add', {'in': 0, 'out': 0}),
             ('add_const', 'add', {'in': 1, 'out': 0}),
             ('add', 'relu', {'in': 0, 'out': 0}),
             ('relu', 'min', {'in': 0, 'out': 0}),
             ('min_const', 'min', {'in': 1, 'out': 0}),
             ('min', 'div', {'in': 0, 'out': 0}),
             ('div_const', 'div', {'in': 1, 'out': 0}),
             ('div', 'result', {'in': 0, 'out': 0})]

    def test_hsigmoid_with_relu_div(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {})
        graph_ref = build_graph(ref_nodes, ref_edges)
        graph.stage = 'front'

        HSigmoidWithReluDiv().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
        self.assertTrue(len(graph.get_op_nodes(name='final_div')) == 1 and
                        graph.get_op_nodes(name='final_div')[0].op == 'HSigmoid')
        self.assertTrue(graph.get_op_nodes(name='final_div')[0].out_nodes()[0].node == 'result')

    def test_hsigmoid_with_relu_div_wrong_constant(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {'add_const': {'value': float_array([0.00001])}})
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithReluDiv().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)

    def test_hsigmoid_with_relu_div_different_tensors(self):
        graph = build_graph_with_edge_attrs({
            **regular_op('input', {'type': 'Parameter'}),
            **regular_op('input_2', {'type': 'Parameter'}),
            **regular_op('add', {'op': 'Add'}),
            **regular_op('max', {'op': 'Maximum'}),
            **regular_op('min', {'op': 'Minimum'}),
            **regular_op('mul', {'op': 'Mul'}),
            **regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
            **const('const_0', float_array([0.0])),
            **const('const_3', float_array([3.0])),
            **const('const_6', float_array([6.0])),
            **const('const_1_6', float_array([1.0 / 6.0])),
            **result('result'),
        }, [('input_2', 'mul', {'in': 1, 'out': 0}),
            ('input', 'add', {'in': 0, 'out': 0}),
            ('const_3', 'add', {'in': 1, 'out': 0}),
            ('add', 'max', {'in': 0, 'out': 0}),
            ('const_0', 'max', {'in': 1, 'out': 0}),
            ('max', 'min', {'in': 0, 'out': 0}),
            ('const_6', 'min', {'in': 1, 'out': 0}),
            ('min', 'mul', {'in': 0, 'out': 0}),
            ('mul', 'mul_2', {'in': 0, 'out': 0}),
            ('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
            ('mul_2', 'result', {'in': 0, 'out': 0})])
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithReluDiv().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
class HSigmoidWithReluMulTest(unittest.TestCase):
    nodes = {
        **regular_op('input', {'type': 'Parameter'}),
        **regular_op('add', {'op': 'Add'}),
        **regular_op('relu', {'op': 'ReLU'}),
        **regular_op('min', {'op': 'Minimum'}),
        **regular_op('mul', {'op': 'Mul', 'name': 'final_mul'}),
        **const('add_const', float_array([3.0])),
        **const('min_const', float_array([6.0])),
        **const('mul_const', float_array([1.0 / 6.0])),
        **result('result'),
    }

    edges = [('input', 'add', {'in': 0, 'out': 0}),
             ('add_const', 'add', {'in': 1, 'out': 0}),
             ('add', 'relu', {'in': 0, 'out': 0}),
             ('relu', 'min', {'in': 0, 'out': 0}),
             ('min_const', 'min', {'in': 1, 'out': 0}),
             ('min', 'mul', {'in': 0, 'out': 0}),
             ('mul_const', 'mul', {'in': 1, 'out': 0}),
             ('mul', 'result', {'in': 0, 'out': 0})]

    def test_hsigmoid_with_relu_mul(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {})
        graph_ref = build_graph(ref_nodes, ref_edges)
        graph.stage = 'front'

        HSigmoidWithReluMul().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
        self.assertTrue(len(graph.get_op_nodes(name='final_mul')) == 1 and
                        graph.get_op_nodes(name='final_mul')[0].op == 'HSigmoid')
        self.assertTrue(graph.get_op_nodes(name='final_mul')[0].out_nodes()[0].node == 'result')

    def test_hsigmoid_with_relu_mul_wrong_constant(self):
        graph = build_graph_with_edge_attrs(self.nodes, self.edges, {'add_const': {'value': float_array([0.00001])}})
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithReluMul().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)

    def test_hsigmoid_with_relu_mul_different_tensors(self):
        graph = build_graph_with_edge_attrs({
            **regular_op('input', {'type': 'Parameter'}),
            **regular_op('input_2', {'type': 'Parameter'}),
            **regular_op('add', {'op': 'Add'}),
            **regular_op('max', {'op': 'Maximum'}),
            **regular_op('min', {'op': 'Minimum'}),
            **regular_op('mul', {'op': 'Mul'}),
            **regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
            **const('const_0', float_array([0.0])),
            **const('const_3', float_array([3.0])),
            **const('const_6', float_array([6.0])),
            **const('const_1_6', float_array([1.0 / 6.0])),
            **result('result'),
        }, [('input_2', 'mul', {'in': 1, 'out': 0}),
            ('input', 'add', {'in': 0, 'out': 0}),
            ('const_3', 'add', {'in': 1, 'out': 0}),
            ('add', 'max', {'in': 0, 'out': 0}),
            ('const_0', 'max', {'in': 1, 'out': 0}),
            ('max', 'min', {'in': 0, 'out': 0}),
            ('const_6', 'min', {'in': 1, 'out': 0}),
            ('min', 'mul', {'in': 0, 'out': 0}),
            ('mul', 'mul_2', {'in': 0, 'out': 0}),
            ('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
            ('mul_2', 'result', {'in': 0, 'out': 0})])
        graph_ref = graph.copy()
        graph.stage = 'front'

        HSigmoidWithReluMul().find_and_replace_pattern(graph)

        (flag, resp) = compare_graphs(graph, graph_ref, 'result')
        self.assertTrue(flag, resp)
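# All four fusion passes above rewrite the same arithmetic into a single
# HSigmoid op. As a reference for the constants used in the test graphs
# (3, 6, 1/6), a scalar sketch of the function the patterns compute (an
# illustration added here, not part of the test suite):

```python
def hsigmoid(x):
    # HSigmoid(x) = min(max(x + 3, 0), 6) / 6, i.e. ReLU6(x + 3) * (1/6);
    # the Clamp, Min/Max, ReLU+Div and ReLU+Mul graphs all compute this.
    return min(max(x + 3.0, 0.0), 6.0) / 6.0
```

The `wrong_constant` tests above confirm that perturbing any of these constants (e.g. 0 -> 0.00001) prevents the fusion from firing.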
# --- frappe_util_configs/install/__init__.py (repo: leam-tech/frappe_util_configs, license: MIT) ---
from jinja2 import Environment, PackageLoader


def get_jinja_env():
    return Environment(loader=PackageLoader("frappe_util_configs.install"))
# --- simpleredial/dataloader/bert_ft_compare_dataloader.py (repo: gmftbyGMFTBY/SimpleReDial-v1, license: MIT) ---
from header import *
from .utils import *
from .util_func import *
class BERTFTCompDataset(Dataset):
    def __init__(self, vocab, path, **args):
        self.args = args
        self.vocab = vocab
        self.vocab.add_tokens(['[EOS]'])
        self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
        self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
        self.eos = self.vocab.convert_tokens_to_ids('[EOS]')
        self.cls = self.vocab.convert_tokens_to_ids('[CLS]')
        self.topk = args['gray_cand_num']
        self.num_labels = args['num_labels']

        suffix = args['tokenizer'].replace('/', '_')
        self.pp_path = f'{os.path.splitext(path)[0]}_ft_comp_plus_{suffix}.pt'
        if os.path.exists(self.pp_path):
            if self.args['mode'] == 'train':
                self.data, self.responses = torch.load(self.pp_path)
            else:
                self.data = torch.load(self.pp_path)
            print(f'[!] load preprocessed file from {self.pp_path}')
            return None
        self.data = []
        if self.args['mode'] == 'train':
            # path = f'{os.path.split(path)[0]}/train_dpr_gray.txt'
            # data = read_dpr_gray(path)
            path = f'{os.path.split(path)[0]}/train_bm25_gray.txt'
            data = read_bm25_hard_negative(path)
            responses, response_overlap = [], set()
            for item in tqdm(data):
                context, response, candidates = item['q'], item['r'], item['nr']
                ids = self.vocab.batch_encode_plus(context + [response], add_special_tokens=False)['input_ids']
                cids = []
                sids, cache = [], 0
                for u in ids[:-1]:
                    cids.extend(u + [self.eos])
                    sids.extend([cache] * (len(u) + 1))
                    cache = 1 if cache == 0 else 0
                sids.pop()
                cids.pop()
                if len(cids) == 0:
                    # an empty sequence would raise an exception
                    continue
                rids = ids[-1]
                responses.append(rids)
                if response not in response_overlap:
                    responses.append(rids)
                    response_overlap.add(response)
                self.data.append({
                    'context': cids,
                    'sids': sids,
                    'response': rids,
                    'candidates': candidates,
                })
            self.responses = responses
        else:
            # if args['dataset'] in ['ubuntu'] and args['mode'] == 'valid':
            data = read_text_data_utterances(path, lang=self.args['lang'])
            # too many validation samples, just sample 1000
            # data = data[:10000]
            for i in tqdm(range(0, len(data), 10)):
                batch = data[i:i+10]
                responses = [b[1][-1] for b in batch]
                context = batch[0][1][:-1]
                self.data.append({
                    'label': [b[0] for b in batch],
                    'context': context,
                    'responses': responses,
                })
    def __len__(self):
        return len(self.data)
    def _packup(self, cids, rids1, rids2, sids=None):
        cids_, rids1_, rids2_ = deepcopy(cids), deepcopy(rids1), deepcopy(rids2)
        sids_ = deepcopy(sids)
        truncate_pair_two_candidates(
            cids_, rids1_, rids2_,
            self.args['max_len'],
            sids=sids_,
        )
        other_speaker = 0 if sids[-1] == 1 else 1
        ids = [self.cls] + cids_ + [self.sep] + rids1_ + [self.sep] + rids2_ + [self.sep]
        sids_ = [sids_[0]] + sids_ + [sids_[-1]] + [other_speaker] * (len(rids1_) + len(rids2_) + 2)
        tids = [0] * (len(cids_) + 2) + [1] * (len(rids1_) + 1) + [0] * (len(rids2_) + 1)
        # if label == 0:
        #     token_labels = [-100] * (len(cids_) + 2) + [0] * (len(rids1_) + 1) + [1] * (len(rids2_) + 1)
        # else:
        #     token_labels = [-100] * (len(cids_) + 2) + [1] * (len(rids1_) + 1) + [0] * (len(rids2_) + 1)
        # assert len(sids_) == len(ids) == len(token_labels)
        assert len(sids_) == len(ids)
        return ids, tids, sids_
    def __getitem__(self, i):
        bundle = self.data[i]
        if self.args['mode'] == 'train':
            cids, rids = bundle['context'], bundle['response']
            speaker_ids = bundle['sids']
            if self.args['no_hard_negative']:
                hrids = random.sample(self.responses, self.topk)
            else:
                if self.topk > len(bundle['candidates']):
                    candidates = bundle['candidates']
                    if candidates:
                        hrids = self.vocab.batch_encode_plus(candidates, add_special_tokens=False)['input_ids']
                    else:
                        hrids = []
                    hrids += random.sample(self.responses, self.topk - len(candidates))
                else:
                    candidates = random.sample(bundle['candidates'], self.topk)
                    hrids = self.vocab.batch_encode_plus(candidates, add_special_tokens=False)['input_ids']

            # context session hard negative samples
            # candidates = bundle['context_session']
            # hrids2 = self.vocab.batch_encode_plus(candidates, add_special_tokens=False)['input_ids']
            # if self.topk > len(candidates):
            #     hrids2 += random.sample(self.responses, self.topk - len(hrids2))

            ids, sids, tids, label, token_label = [], [], [], [], []
            # label 0/1: positive vs. easy negative
            for _ in range(self.topk):
                e = random.choice(self.responses)
                if random.random() > 0.5:
                    ids_, tids_, sids_ = self._packup(cids, rids, e, sids=speaker_ids)
                    l = 1
                else:
                    ids_, tids_, sids_ = self._packup(cids, e, rids, sids=speaker_ids)
                    l = 0
                ids.append(ids_)
                sids.append(sids_)
                tids.append(tids_)
                label.append(l)
            # label 0/1: positive vs. bm25 hard negative
            for _ in range(self.topk):
                h = random.choice(hrids)
                if random.random() > 0.5:
                    ids_, tids_, sids_ = self._packup(cids, rids, h, sids=speaker_ids)
                    l = 1
                else:
                    ids_, tids_, sids_ = self._packup(cids, h, rids, sids=speaker_ids)
                    l = 0
                ids.append(ids_)
                sids.append(sids_)
                tids.append(tids_)
                label.append(l)
            # label 0/1: hard negative from the session
            # for h in hrids2:
            #     if random.random() > 0.5:
            #         ids_, tids_, sids_ = self._packup(cids, rids, h, sids=speaker_ids)
            #         l = 1
            #     else:
            #         ids_, tids_, sids_ = self._packup(cids, h, rids, sids=speaker_ids)
            #         l = 0
            #     ids.append(ids_)
            #     sids.append(sids_)
            #     tids.append(tids_)
            #     label.append(l)

            # whole samples
            ids = [torch.LongTensor(i) for i in ids]
            sids = [torch.LongTensor(i) for i in sids]
            tids = [torch.LongTensor(i) for i in tids]
            return ids, sids, tids, label
        else:
            # test
            return bundle['context'], bundle['responses'], bundle['label']
    def save(self):
        if self.args['mode'] == 'train':
            data = torch.save((self.data, self.responses), self.pp_path)
        else:
            data = torch.save(self.data, self.pp_path)
        print(f'[!] save preprocessed dataset into {self.pp_path}')
    def collate(self, batch):
        if self.args['mode'] == 'train':
            ids, sids, tids, label = [], [], [], []
            for b in batch:
                ids.extend(b[0])
                sids.extend(b[1])
                tids.extend(b[2])
                label.extend(b[3])
            label = torch.LongTensor(label)
            return {
                'ids': ids,
                'sids': sids,
                'tids': tids,
                'label': label
            }
        else:
            # test or valid set
            assert len(batch) == 1
            return {
                'context': batch[0][0],
                'responses': batch[0][1],
                'label': batch[0][2],
            }
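# The comparative encoding built by `_packup` above concatenates the context
# with two candidate responses and marks the first candidate span with
# token-type 1. A stripped-down sketch of that layout (hypothetical ids;
# truncation and speaker ids are omitted):

```python
def pack_pair(cls_id, sep_id, ctx, r1, r2):
    # [CLS] ctx [SEP] r1 [SEP] r2 [SEP]; token-type ids are 0 over the
    # context and r2 spans and 1 over the r1 span, mirroring _packup
    ids = [cls_id] + ctx + [sep_id] + r1 + [sep_id] + r2 + [sep_id]
    tids = [0] * (len(ctx) + 2) + [1] * (len(r1) + 1) + [0] * (len(r2) + 1)
    assert len(ids) == len(tids)
    return ids, tids
```

Because the model only sees which segment carries token-type 1, `__getitem__` randomly swaps the two candidates and flips the label, so the comparison is order-invariant in expectation.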
class BERTFTCompEvaluationDataset(Dataset):

    '''Compare the evaluation results of the generated responses from two systems'''

    def __init__(self, vocab, path, **args):
        self.args = args
        self.vocab = vocab
        self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
        self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
        # 22335: bm25+BERT-FP; 22336: dual-bert
        ports = args['file_tags'].split(',')
        path1 = f'{os.path.split(path)[0]}/test_api_pipeline_{ports[0]}_log.txt'
        path2 = f'{os.path.split(path)[0]}/test_api_pipeline_{ports[1]}_log.txt'
        print(f'[!] load file from:\n {path1}\n {path2}')

        data1 = read_text_data_from_log_file(path1, lang=args['lang'])
        data2 = read_text_data_from_log_file(path2, lang=args['lang'])
        self.data = []
        for (ctx1, res1), (ctx2, res2) in zip(data1, data2):
            assert ctx1 == ctx2
            self.data.append({
                'context': [i.strip() for i in ctx1.split(' [SEP] ')],
                'responses': [res1, res2]
            })

    def __len__(self):
        return len(self.data)

    def __getitem__(self, i):
        return self.data[i]

    def collate(self, batch):
        assert len(batch) == 1
        return batch[0]
class BERTFTCompMultiDataset(Dataset):

    def __init__(self, vocab, path, **args):
        self.args = args
        self.vocab = vocab
        self.vocab.add_tokens(['[EOS]'])
        self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
        self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
        self.eos = self.vocab.convert_tokens_to_ids('[EOS]')
        self.cls = self.vocab.convert_tokens_to_ids('[CLS]')
        self.topk = args['gray_cand_num']
        self.compare_set_size = args['compare_set_size']

        suffix = args['tokenizer'].replace('/', '_')
        self.pp_path = f'{os.path.splitext(path)[0]}_ft_comp_multi_{suffix}.pt'
        if os.path.exists(self.pp_path):
            if self.args['mode'] == 'train':
                self.data, self.responses = torch.load(self.pp_path)
            else:
                self.data = torch.load(self.pp_path)
            print(f'[!] load preprocessed file from {self.pp_path}')
            return None
        self.data = []
        if self.args['mode'] == 'train':
            path = f'{os.path.split(path)[0]}/train_bm25_gray.txt'
            data = read_bm25_hard_negative(path)
            responses, response_overlap = [], set()
            for item in tqdm(data):
                context, response, candidates = item['q'], item['r'], item['nr']
                ids = self.vocab.batch_encode_plus(context + [response], add_special_tokens=False)['input_ids']
                cids = []
                sids, cache = [], 0
                for u in ids[:-1]:
                    cids.extend(u + [self.eos])
                    sids.extend([cache] * (len(u) + 1))
                    cache = 1 if cache == 0 else 0
                sids.pop()
                cids.pop()
                if self.args['no_inner_session_negative'] is False:
                    candidates += context
                if len(cids) == 0:
                    continue
                rids = ids[-1]
                responses.append(rids)
                if response not in response_overlap:
                    responses.append(rids)
                    response_overlap.add(response)
                self.data.append({
                    'context': cids,
                    'sids': sids,
                    'response': rids,
                    'candidates': candidates,
                })
            self.responses = responses
        else:
            data = read_text_data_utterances(path, lang=self.args['lang'])
            for i in tqdm(range(0, len(data), 10)):
                batch = data[i:i+10]
                responses = [b[1][-1] for b in batch]
                context = batch[0][1][:-1]
                self.data.append({
                    'label': [b[0] for b in batch],
                    'context': context,
                    'responses': responses,
                })
def __len__(self):
return len(self.data)
def _packup(self, cids, sids, rids, label):
ctx_max_length, res_max_length = self.args['ctx_max_length'], self.args['res_max_length']
num = len(rids)
# length limitation
rids = [i[:(res_max_length-2)] for i in rids]
cids = cids[-(ctx_max_length-2):]
sids = sids[-(ctx_max_length-2):]
cids_ = [self.cls] + cids + [self.sep]
sids_ = [sids[0]] + sids + [sids[-1]]
tids_ = [0] * (len(cids) + 2)
lids_ = [-100] * (len(cids) + 2)
other_speaker = 0 if sids[-1] == 1 else 1
tcache = 1
# concatenation
for idx, (r, l) in enumerate(zip(rids, label)):
# [unused1] ~ [unused10]
cids_ += [idx + 1] + r + [self.sep]
sids_ += [other_speaker] * (len(r) + 2)
tids_ += [tcache] * (len(r) + 2)
lids_ += [l] + [-100] * (len(r) + 1)
# tcache = 0 if tcache == 1 else 1
assert len(cids_) == len(sids_) == len(tids_) == len(lids_)
return cids_, sids_, tids_, lids_
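A minimal sketch of the _packup layout above (special-token ids are hypothetical): each candidate is prefixed with an [unusedK] marker id (idx + 1), and lids carries the candidate's label only at that marker slot, with -100 everywhere else:

```python
CLS, SEP = 101, 102  # hypothetical [CLS]/[SEP] ids

def packup(cids, sids, rids, label):
    """Concatenate context and candidates the same way as _packup."""
    cids_ = [CLS] + cids + [SEP]
    sids_ = [sids[0]] + sids + [sids[-1]]
    tids_ = [0] * (len(cids) + 2)
    lids_ = [-100] * (len(cids) + 2)
    other = 1 - sids[-1]
    for idx, (r, l) in enumerate(zip(rids, label)):
        cids_ += [idx + 1] + r + [SEP]        # [unused{idx+1}] marker, tokens, [SEP]
        sids_ += [other] * (len(r) + 2)
        tids_ += [1] * (len(r) + 2)
        lids_ += [l] + [-100] * (len(r) + 1)  # label sits on the marker slot only
    return cids_, sids_, tids_, lids_

ids, sp, tt, ll = packup([7, 8], [0, 0], [[9], [10, 11]], [1, 0])
```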
def __getitem__(self, i):
bundle = self.data[i]
if self.args['mode'] == 'train':
cids, rids, sids = deepcopy(bundle['context']), deepcopy(bundle['response']), deepcopy(bundle['sids'])
if self.args['no_hard_negative']:
hrids = random.sample(self.responses, self.topk)
else:
candidates = random.sample(
bundle['candidates'], self.topk
)
hrids = self.vocab.batch_encode_plus(candidates, add_special_tokens=False)['input_ids']
rids = [rids] + random.sample(hrids, self.topk) + random.sample(self.responses, self.compare_set_size - self.topk - 1)
random_idx = list(range(self.compare_set_size))
random.shuffle(random_idx)
label = [1] + [0] * (self.compare_set_size - 1)
label = [label[i] for i in random_idx]
cls_label = random_idx.index(0)
rids = [rids[i] for i in random_idx]
ids, sids, tids, lids = self._packup(cids, sids, rids, label)
ids = torch.LongTensor(ids)
sids = torch.LongTensor(sids)
tids = torch.LongTensor(tids)
lids = torch.LongTensor(lids)
return ids, sids, tids, lids, cls_label
else:
# test
return bundle['context'], bundle['responses'], bundle['label']
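The label shuffling in __getitem__ keeps two views of the gold position in sync: a per-slot 0/1 vector (label) and a single class index (cls_label). A quick check of that invariant:

```python
import random

def shuffle_labels(compare_set_size, rng=random):
    """Reproduce the shuffle logic above: after permuting the candidate
    slots, cls_label is wherever the gold response (index 0) landed."""
    random_idx = list(range(compare_set_size))
    rng.shuffle(random_idx)
    label = [1] + [0] * (compare_set_size - 1)
    label = [label[i] for i in random_idx]
    cls_label = random_idx.index(0)
    return label, cls_label

label, cls_label = shuffle_labels(8)
assert label[cls_label] == 1 and sum(label) == 1
```

Because rids is reordered with the same random_idx, rids[cls_label] is always the gold response.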
def save(self):
if self.args['mode'] == 'train':
torch.save((self.data, self.responses), self.pp_path)
else:
torch.save(self.data, self.pp_path)
print(f'[!] save preprocessed dataset into {self.pp_path}')
def collate(self, batch):
if self.args['mode'] == 'train':
ids, sids, tids, lids, label = [], [], [], [], []
for a, b, c, d, e in batch:
ids.append(a)
sids.append(b)
tids.append(c)
lids.append(d)
label.append(e)
ids = pad_sequence(ids, batch_first=True, padding_value=self.pad)
sids = pad_sequence(sids, batch_first=True, padding_value=self.pad)
tids = pad_sequence(tids, batch_first=True, padding_value=self.pad)
lids = pad_sequence(lids, batch_first=True, padding_value=-100)
label = torch.LongTensor(label)
mask = generate_mask(ids)
ids, sids, tids, lids, mask, label = to_cuda(ids, sids, tids, lids, mask, label)
return {
'ids': ids,
'sids': sids,
'tids': tids,
'lids': lids,
'mask': mask,
'label': label,
}
else:
# test or valid set
assert len(batch) == 1
return {
'context': batch[0][0],
'responses': batch[0][1],
'label': batch[0][2],
}
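The train-mode collate relies on pad_sequence and generate_mask; their combined effect can be sketched without torch (this is a simplification: the mask here is built from sequence lengths rather than by generate_mask):

```python
PAD = 0  # hypothetical [PAD] id

def pad_batch(seqs, pad=PAD):
    """Right-pad every sequence to the batch max length and build an
    attention mask with 1 on real tokens and 0 on padding."""
    max_len = max(len(s) for s in seqs)
    padded = [s + [pad] * (max_len - len(s)) for s in seqs]
    mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in seqs]
    return padded, mask

padded, mask = pad_batch([[5, 6, 7], [8, 9]])
# padded: [[5, 6, 7], [8, 9, 0]]; mask: [[1, 1, 1], [1, 1, 0]]
```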
class BERTFTCompMultiCLSDataset(Dataset):
def __init__(self, vocab, path, **args):
self.args = args
self.vocab = vocab
self.vocab.add_tokens(['[EOS]'])
self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
self.eos = self.vocab.convert_tokens_to_ids('[EOS]')
self.cls = self.vocab.convert_tokens_to_ids('[CLS]')
self.topk = args['gray_cand_num']
self.compare_set_size = args['compare_set_size']
suffix = args['tokenizer'].replace('/', '_')
self.pp_path = f'{os.path.splitext(path)[0]}_ft_comp_multi_{suffix}.pt'
if os.path.exists(self.pp_path):
if self.args['mode'] == 'train':
self.data, self.responses = torch.load(self.pp_path)
else:
self.data = torch.load(self.pp_path)
print(f'[!] load preprocessed file from {self.pp_path}')
return None
self.data = []
if self.args['mode'] == 'train':
path = f'{os.path.split(path)[0]}/train_bm25_gray.txt'
data = read_bm25_hard_negative(path)
responses, response_overlap = [], set()
for item in tqdm(data):
context, response, candidates = item['q'], item['r'], item['nr']
ids = self.vocab.batch_encode_plus(context + [response], add_special_tokens=False)['input_ids']
cids = []
sids, cache = [], 0
for u in ids[:-1]:
cids.extend(u + [self.eos])
sids.extend([cache] * (len(u) + 1))
cache = 1 if cache == 0 else 0
sids.pop()
cids.pop()
if self.args['no_inner_session_negative'] is False:
candidates += context
if len(cids) == 0:
continue
rids = ids[-1]
# append each unique response once; the overlap set guards against duplicates
if response not in response_overlap:
responses.append(rids)
response_overlap.add(response)
self.data.append({
'context': cids,
'sids': sids,
'response': rids,
'candidates': candidates,
})
self.responses = responses
else:
data = read_text_data_utterances(path, lang=self.args['lang'])
for i in tqdm(range(0, len(data), 10)):
batch = data[i:i+10]
responses = [b[1][-1] for b in batch]
context = batch[0][1][:-1]
self.data.append({
'label': [b[0] for b in batch],
'context': context,
'responses': responses,
})
def __len__(self):
return len(self.data)
def _packup(self, cids, sids, rids):
ctx_max_length, res_max_length = self.args['ctx_max_length'], self.args['res_max_length']
# length limitation
rids = [i[:(res_max_length-2)] for i in rids]
cids = cids[-(ctx_max_length-2):]
sids = sids[-(ctx_max_length-2):]
cids_ = [self.cls] + cids + [self.sep]
sids_ = [sids[0]] + sids + [sids[-1]]
tids_ = [0] * (len(cids) + 2)
other_speaker = 0 if sids[-1] == 1 else 1
tcache = 1
# concatenation
for idx, r in enumerate(rids):
# [unused1] ~ [unused10]
cids_ += [idx + 1] + r + [self.sep]
sids_ += [other_speaker] * (len(r) + 2)
tids_ += [tcache] * (len(r) + 2)
tcache = 0 if tcache == 1 else 1
assert len(cids_) == len(sids_) == len(tids_)
return cids_, sids_, tids_
def __getitem__(self, i):
bundle = self.data[i]
if self.args['mode'] == 'train':
cids, rids, sids = deepcopy(bundle['context']), deepcopy(bundle['response']), deepcopy(bundle['sids'])
if self.args['no_hard_negative']:
hrids = random.sample(self.responses, self.topk)
else:
candidates = random.sample(
bundle['candidates'], self.topk
)
hrids = self.vocab.batch_encode_plus(candidates, add_special_tokens=False)['input_ids']
rids = [rids] + random.sample(hrids, self.topk) + random.sample(self.responses, self.compare_set_size - self.topk - 1)
random_idx = list(range(self.compare_set_size))
random.shuffle(random_idx)
label = random_idx.index(0)
rids = [rids[i] for i in random_idx]
ids, sids, tids = self._packup(cids, sids, rids)
ids = torch.LongTensor(ids)
sids = torch.LongTensor(sids)
tids = torch.LongTensor(tids)
return ids, sids, tids, label
else:
# test
return bundle['context'], bundle['responses'], bundle['label']
def save(self):
if self.args['mode'] == 'train':
torch.save((self.data, self.responses), self.pp_path)
else:
torch.save(self.data, self.pp_path)
print(f'[!] save preprocessed dataset into {self.pp_path}')
def collate(self, batch):
if self.args['mode'] == 'train':
ids, sids, tids, label = [], [], [], []
for a, b, c, d in batch:
ids.append(a)
sids.append(b)
tids.append(c)
label.append(d)
ids = pad_sequence(ids, batch_first=True, padding_value=self.pad)
sids = pad_sequence(sids, batch_first=True, padding_value=self.pad)
tids = pad_sequence(tids, batch_first=True, padding_value=self.pad)
label = torch.LongTensor(label)
mask = generate_mask(ids)
ids, sids, tids, label, mask = to_cuda(ids, sids, tids, label, mask)
return {
'ids': ids,
'sids': sids,
'tids': tids,
'label': label,
'mask': mask,
}
else:
# test or valid set
assert len(batch) == 1
return {
'context': batch[0][0],
'responses': batch[0][1],
'label': batch[0][2],
}
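Unlike the first class, this CLS variant's _packup flips tcache after every candidate, so the token-type id alternates 1, 0, 1, ... across the candidate segments. A compact sketch of that pattern (segment sizes only):

```python
def candidate_token_types(candidate_lens):
    """Token-type ids for the candidate segments: each candidate
    contributes len + 2 positions ([unusedK] marker, tokens, [SEP]),
    with the type flipping per candidate."""
    tids, tcache = [], 1
    for n in candidate_lens:
        tids += [tcache] * (n + 2)
        tcache = 1 - tcache
    return tids

assert candidate_token_types([1, 2]) == [1, 1, 1, 0, 0, 0, 0]
```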
class BERTFTCompTokenDataset(Dataset):
def __init__(self, vocab, path, **args):
self.args = args
self.vocab = vocab
self.vocab.add_tokens(['[EOS]'])
self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
self.eos = self.vocab.convert_tokens_to_ids('[EOS]')
self.cls = self.vocab.convert_tokens_to_ids('[CLS]')
self.topk = args['gray_cand_num']
suffix = args['tokenizer'].replace('/', '_')
self.pp_path = f'{os.path.splitext(path)[0]}_ft_comp_token_{suffix}.pt'
if os.path.exists(self.pp_path):
if self.args['mode'] == 'train':
self.data, self.responses = torch.load(self.pp_path)
else:
self.data = torch.load(self.pp_path)
print(f'[!] load preprocessed file from {self.pp_path}')
return None
self.data = []
if self.args['mode'] == 'train':
path = f'{os.path.split(path)[0]}/train_bm25_gray.txt'
data = read_bm25_hard_negative(path)
responses, response_overlap = [], set()
for item in tqdm(data):
context, response, candidates = item['q'], item['r'], item['nr']
ids = self.vocab.batch_encode_plus(context + [response], add_special_tokens=False)['input_ids']
cids = []
sids, cache = [], 0
for u in ids[:-1]:
cids.extend(u + [self.eos])
sids.extend([cache] * (len(u) + 1))
cache = 1 if cache == 0 else 0
sids.pop()
cids.pop()
if len(cids) == 0:
continue
rids = ids[-1]
# append each unique response once; the overlap set guards against duplicates
if response not in response_overlap:
responses.append(rids)
response_overlap.add(response)
self.data.append({
'context': cids,
'sids': sids,
'response': rids,
'candidates': candidates,
})
self.responses = responses
else:
data = read_text_data_utterances(path, lang=self.args['lang'])
for i in tqdm(range(0, len(data), 10)):
batch = data[i:i+10]
responses = [b[1][-1] for b in batch]
context = batch[0][1][:-1]
self.data.append({
'label': [b[0] for b in batch],
'context': context,
'responses': responses,
})
def __len__(self):
return len(self.data)
def _packup(self, cids, sids, rids1, rids2, label1, label2):
cids_, sids_, rids1_, rids2_ = deepcopy(cids), deepcopy(sids), deepcopy(rids1), deepcopy(rids2)
truncate_pair_two_candidates(
cids_, rids1_, rids2_,
self.args['max_len'],
sids=sids_,
)
other_speaker = 0 if sids_[-1] == 1 else 1
cids__ = [self.cls] + cids_ + [self.sep] + [1] + rids1_ + [self.sep] + [2] + rids2_ + [self.sep]
sids__ = [sids_[0]] + sids_ + [sids_[-1]] + [other_speaker] * (len(rids1_) + len(rids2_) + 4)
tids__ = [0] * (len(cids_) + 2) + [1] * (len(rids1_) + 2) + [0] * (len(rids2_) + 2)
tlids__ = [-100] * (len(cids_) + 2) + [label1] + [-100] * (len(rids1_) + 1) + [label2] + [-100] * (len(rids2_) + 1)
assert len(tids__) == len(sids__) == len(cids__) == len(tlids__)
return cids__, tids__, sids__, tlids__
def __getitem__(self, i):
bundle = self.data[i]
if self.args['mode'] == 'train':
cids, rids = bundle['context'], bundle['response']
speaker_ids = bundle['sids']
if self.args['no_hard_negative']:
hrids = random.sample(self.responses, self.topk)
else:
if self.topk > len(bundle['candidates']):
candidates = bundle['candidates']
if candidates:
hrids = self.vocab.batch_encode_plus(candidates, add_special_tokens=False)['input_ids']
else:
hrids = []
hrids += random.sample(self.responses, self.topk - len(candidates))
else:
candidates = random.sample(bundle['candidates'], self.topk)
hrids = self.vocab.batch_encode_plus(candidates, add_special_tokens=False)['input_ids']
ids, sids, tids, tlids = [], [], [], []
# positive vs. easy negative
for _ in range(self.topk):
e = random.choice(self.responses)
if random.random() > 0.5:
ids_, tids_, sids_, tlids_ = self._packup(cids, speaker_ids, rids, e, 1, 0)
else:
ids_, tids_, sids_, tlids_ = self._packup(cids, speaker_ids, e, rids, 0, 1)
ids.append(ids_)
sids.append(sids_)
tids.append(tids_)
tlids.append(tlids_)
# positive vs. bm25 hard negative
for _ in range(self.topk):
h = random.choice(hrids)
if random.random() > 0.5:
ids_, tids_, sids_, tlids_ = self._packup(cids, speaker_ids, rids, h, 1, 0)
else:
ids_, tids_, sids_, tlids_ = self._packup(cids, speaker_ids, h, rids, 0, 1)
ids.append(ids_)
sids.append(sids_)
tids.append(tids_)
tlids.append(tlids_)
# easy neg vs. easy neg.
for _ in range(self.topk):
e1, e2 = random.sample(self.responses, 2)
ids_, tids_, sids_, tlids_ = self._packup(cids, speaker_ids, e1, e2, 0, 0)
ids.append(ids_)
sids.append(sids_)
tids.append(tids_)
tlids.append(tlids_)
# whole samples
ids = [torch.LongTensor(i) for i in ids]
sids = [torch.LongTensor(i) for i in sids]
tids = [torch.LongTensor(i) for i in tids]
tlids = [torch.LongTensor(i) for i in tlids]
return ids, sids, tids, tlids
else:
# test
return bundle['context'], bundle['responses'], bundle['label']
def save(self):
if self.args['mode'] == 'train':
torch.save((self.data, self.responses), self.pp_path)
else:
torch.save(self.data, self.pp_path)
print(f'[!] save preprocessed dataset into {self.pp_path}')
def collate(self, batch):
if self.args['mode'] == 'train':
ids, sids, tids, tlids = [], [], [], []
for b in batch:
ids.extend(b[0])
sids.extend(b[1])
tids.extend(b[2])
tlids.extend(b[3])
return {
'ids': ids,
'sids': sids,
'tids': tids,
'tlids': tlids,
}
else:
# test or valid set
assert len(batch) == 1
return {
'context': batch[0][0],
'responses': batch[0][1],
'label': batch[0][2],
}
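The token-level dataset builds pairwise comparisons by placing the true response on a random side of the pair; a sketch of that coin flip, with the invariant that whichever side holds the positive is the side labeled 1:

```python
import random

def make_pair(pos, neg, rng=random):
    """Return (first, second, label_first, label_second), mirroring the
    random.random() > 0.5 branch in __getitem__."""
    if rng.random() > 0.5:
        return pos, neg, 1, 0
    return neg, pos, 0, 1

a, b, la, lb = make_pair('gold', 'hard_negative')
assert {la, lb} == {0, 1}
assert (a == 'gold') == (la == 1)
```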
# coding: utf-8
"""
Ory Kratos
Welcome to the ORY Kratos HTTP API documentation! # noqa: E501
The version of the OpenAPI document: v0.4.6-alpha.1
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from ory_kratos_client.api_client import ApiClient
from ory_kratos_client.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class PublicApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def complete_self_service_browser_recovery_link_strategy_flow(self, **kwargs): # noqa: E501
"""Complete the browser-based recovery flow using a recovery link # noqa: E501
> This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos Account Recovery Documentation](../self-service/flows/password-reset-account-recovery). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_recovery_link_strategy_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.complete_self_service_browser_recovery_link_strategy_flow_with_http_info(**kwargs) # noqa: E501
def complete_self_service_browser_recovery_link_strategy_flow_with_http_info(self, **kwargs): # noqa: E501
"""Complete the browser-based recovery flow using a recovery link # noqa: E501
> This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos Account Recovery Documentation](../self-service/flows/password-reset-account-recovery). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_recovery_link_strategy_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method complete_self_service_browser_recovery_link_strategy_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/recovery/link', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
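The generated *_with_http_info methods all share the same keyword-argument validation preamble; here is a dependency-free sketch of that pattern (the names below are illustrative, not part of the Kratos SDK):

```python
def validate_kwargs(method_name, kwargs, extra_params=()):
    """Reject any keyword the generated method does not declare,
    mirroring the all_params check in the generated client."""
    all_params = list(extra_params) + [
        'async_req',
        '_return_http_data_only',
        '_preload_content',
        '_request_timeout',
    ]
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s' to method %s"
                % (key, method_name)
            )

validate_kwargs('complete_flow', {'async_req': True})  # accepted
```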
def complete_self_service_browser_settings_oidc_settings_flow(self, **kwargs): # noqa: E501
"""Complete the browser-based settings flow for the OpenID Connect strategy # noqa: E501
This endpoint completes a browser-based settings flow. This is usually achieved by POSTing data to this endpoint. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_settings_oidc_settings_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.complete_self_service_browser_settings_oidc_settings_flow_with_http_info(**kwargs) # noqa: E501
def complete_self_service_browser_settings_oidc_settings_flow_with_http_info(self, **kwargs): # noqa: E501
"""Complete the browser-based settings flow for the OpenID Connect strategy # noqa: E501
This endpoint completes a browser-based settings flow. This is usually achieved by POSTing data to this endpoint. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_settings_oidc_settings_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method complete_self_service_browser_settings_oidc_settings_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/registration/strategies/oidc/settings/connections', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def complete_self_service_browser_settings_password_strategy_flow(self, **kwargs): # noqa: E501
"""Complete the browser-based settings flow for the password strategy # noqa: E501
This endpoint completes a browser-based settings flow. This is usually achieved by POSTing data to this endpoint. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_settings_password_strategy_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.complete_self_service_browser_settings_password_strategy_flow_with_http_info(**kwargs) # noqa: E501
def complete_self_service_browser_settings_password_strategy_flow_with_http_info(self, **kwargs): # noqa: E501
"""Complete the browser-based settings flow for the password strategy # noqa: E501
This endpoint completes a browser-based settings flow. This is usually achieved by POSTing data to this endpoint. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_settings_password_strategy_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method complete_self_service_browser_settings_password_strategy_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/settings/strategies/password', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def complete_self_service_browser_settings_profile_strategy_flow(self, request, body, **kwargs): # noqa: E501
"""Complete the browser-based settings flow for profile data # noqa: E501
This endpoint completes a browser-based settings flow. This is usually achieved by POSTing data to this endpoint. If the provided profile data is valid against the Identity's Traits JSON Schema, the data will be updated and the browser redirected to `url.settings_ui` for further steps. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_settings_profile_strategy_flow(request, body, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the request ID. (required)
:param CompleteSelfServiceBrowserSettingsStrategyProfileFlowPayload body: (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.complete_self_service_browser_settings_profile_strategy_flow_with_http_info(request, body, **kwargs) # noqa: E501
def complete_self_service_browser_settings_profile_strategy_flow_with_http_info(self, request, body, **kwargs): # noqa: E501
"""Complete the browser-based settings flow for profile data # noqa: E501
This endpoint completes a browser-based settings flow. This is usually achieved by POSTing data to this endpoint. If the provided profile data is valid against the Identity's Traits JSON Schema, the data will be updated and the browser redirected to `url.settings_ui` for further steps. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_settings_profile_strategy_flow_with_http_info(request, body, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the request ID. (required)
:param CompleteSelfServiceBrowserSettingsStrategyProfileFlowPayload body: (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'request',
'body'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method complete_self_service_browser_settings_profile_strategy_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in local_var_params or # noqa: E501
local_var_params['request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `request` when calling `complete_self_service_browser_settings_profile_strategy_flow`") # noqa: E501
# verify the required parameter 'body' is set
if self.api_client.client_side_validation and ('body' not in local_var_params or # noqa: E501
local_var_params['body'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `body` when calling `complete_self_service_browser_settings_profile_strategy_flow`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'request' in local_var_params and local_var_params['request'] is not None: # noqa: E501
query_params.append(('request', local_var_params['request'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/x-www-form-urlencoded']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/settings/strategies/profile', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def complete_self_service_browser_verification_flow(self, request, via, **kwargs): # noqa: E501
"""Complete the browser-based verification flows # noqa: E501
This endpoint completes a browser-based verification flow. This is usually achieved by POSTing data to this endpoint. If the provided data is valid against the Identity's Traits JSON Schema, the data will be updated and the browser redirected to `url.settings_ui` for further steps. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_verification_flow(request, via, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/verify?request=abcde`). (required)
:param str via: What to verify. Currently only \"email\" is supported. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.complete_self_service_browser_verification_flow_with_http_info(request, via, **kwargs) # noqa: E501
def complete_self_service_browser_verification_flow_with_http_info(self, request, via, **kwargs): # noqa: E501
"""Complete the browser-based verification flows # noqa: E501
This endpoint completes a browser-based verification flow. This is usually achieved by POSTing data to this endpoint. If the provided data is valid against the Identity's Traits JSON Schema, the data will be updated and the browser redirected to `url.settings_ui` for further steps. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.complete_self_service_browser_verification_flow_with_http_info(request, via, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/verify?request=abcde`). (required)
:param str via: What to verify. Currently only \"email\" is supported. (required)
:param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'request',
'via'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method complete_self_service_browser_verification_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in local_var_params or # noqa: E501
local_var_params['request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `request` when calling `complete_self_service_browser_verification_flow`") # noqa: E501
# verify the required parameter 'via' is set
if self.api_client.client_side_validation and ('via' not in local_var_params or # noqa: E501
local_var_params['via'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `via` when calling `complete_self_service_browser_verification_flow`") # noqa: E501
collection_formats = {}
path_params = {}
if 'via' in local_var_params:
path_params['via'] = local_var_params['via'] # noqa: E501
query_params = []
if 'request' in local_var_params and local_var_params['request'] is not None: # noqa: E501
query_params.append(('request', local_var_params['request'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/verification/{via}/complete', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_schema(self, id, **kwargs): # noqa: E501
"""get_schema # noqa: E501
Get a traits schema definition # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_schema(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: ID must be set to the ID of the schema you want to get (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_schema_with_http_info(id, **kwargs) # noqa: E501
def get_schema_with_http_info(self, id, **kwargs): # noqa: E501
"""get_schema # noqa: E501
Get a traits schema definition # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_schema_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str id: ID must be set to the ID of the schema you want to get (required)
:param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(object, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_schema" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `get_schema`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/schemas/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_self_service_browser_login_request(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based login user flows # noqa: E501
This endpoint returns a login request's context with, for example, error details and other information. When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for CSRF to work. To prevent token scanning attacks, the public endpoint does not return 404 status codes. More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_login_request(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Login Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/login?request=abcde`). (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: LoginRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_self_service_browser_login_request_with_http_info(request, **kwargs) # noqa: E501
def get_self_service_browser_login_request_with_http_info(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based login user flows # noqa: E501
This endpoint returns a login request's context with, for example, error details and other information. When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for CSRF to work. To prevent token scanning attacks, the public endpoint does not return 404 status codes. More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_login_request_with_http_info(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Login Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/login?request=abcde`). (required)
:param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(LoginRequest, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'request'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_self_service_browser_login_request" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in local_var_params or # noqa: E501
local_var_params['request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `request` when calling `get_self_service_browser_login_request`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'request' in local_var_params and local_var_params['request'] is not None: # noqa: E501
query_params.append(('request', local_var_params['request'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/requests/login', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='LoginRequest', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_self_service_browser_recovery_request(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based recovery flows # noqa: E501
When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for checking the auth session. To prevent scanning attacks, the public endpoint does not return 404 status codes but instead 403 or 500. More information can be found at [ORY Kratos Account Recovery Documentation](../self-service/flows/password-reset-account-recovery). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_recovery_request(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Recovery Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/recover?request=abcde`). (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: RecoveryRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_self_service_browser_recovery_request_with_http_info(request, **kwargs) # noqa: E501
def get_self_service_browser_recovery_request_with_http_info(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based recovery flows # noqa: E501
When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for checking the auth session. To prevent scanning attacks, the public endpoint does not return 404 status codes but instead 403 or 500. More information can be found at [ORY Kratos Account Recovery Documentation](../self-service/flows/password-reset-account-recovery). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_recovery_request_with_http_info(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Recovery Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/recover?request=abcde`). (required)
:param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(RecoveryRequest, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'request'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_self_service_browser_recovery_request" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in local_var_params or # noqa: E501
local_var_params['request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `request` when calling `get_self_service_browser_recovery_request`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'request' in local_var_params and local_var_params['request'] is not None: # noqa: E501
query_params.append(('request', local_var_params['request'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/requests/recovery', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RecoveryRequest', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_self_service_browser_registration_request(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based registration user flows # noqa: E501
This endpoint returns a registration request's context with, for example, error details and other information. When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for CSRF to work. To prevent token scanning attacks, the public endpoint does not return 404 status codes. More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_registration_request(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Registration Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/registration?request=abcde`). (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: RegistrationRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_self_service_browser_registration_request_with_http_info(request, **kwargs) # noqa: E501
def get_self_service_browser_registration_request_with_http_info(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based registration user flows # noqa: E501
This endpoint returns a registration request's context with, for example, error details and other information. When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for CSRF to work. To prevent token scanning attacks, the public endpoint does not return 404 status codes. More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_registration_request_with_http_info(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Registration Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/registration?request=abcde`). (required)
:param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(RegistrationRequest, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'request'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_self_service_browser_registration_request" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in local_var_params or # noqa: E501
local_var_params['request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `request` when calling `get_self_service_browser_registration_request`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'request' in local_var_params and local_var_params['request'] is not None: # noqa: E501
query_params.append(('request', local_var_params['request'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/requests/registration', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RegistrationRequest', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_self_service_browser_settings_request(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based settings flows # noqa: E501
When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for checking the auth session. To prevent scanning attacks, the public endpoint does not return 404 status codes but instead 403 or 500. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_settings_request(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Settings Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/settings?request=abcde`). (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: SettingsRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_self_service_browser_settings_request_with_http_info(request, **kwargs) # noqa: E501
def get_self_service_browser_settings_request_with_http_info(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based settings flows # noqa: E501
When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for checking the auth session. To prevent scanning attacks, the public endpoint does not return 404 status codes but instead 403 or 500. More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_browser_settings_request_with_http_info(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Settings Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/settings?request=abcde`). (required)
:param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(SettingsRequest, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'request'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_self_service_browser_settings_request" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in local_var_params or # noqa: E501
local_var_params['request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `request` when calling `get_self_service_browser_settings_request`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'request' in local_var_params and local_var_params['request'] is not None: # noqa: E501
query_params.append(('request', local_var_params['request'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/requests/settings', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SettingsRequest', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_self_service_error(self, **kwargs): # noqa: E501
"""Get user-facing self-service errors # noqa: E501
This endpoint returns the error associated with a user-facing self-service error. When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for CSRF to work. To prevent token scanning attacks, the public endpoint does not return 404 status codes. More information can be found at [ORY Kratos User-Facing Error Documentation](https://www.ory.sh/docs/kratos/self-service/flows/user-facing-errors). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_error(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str error:
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: ErrorContainer
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_self_service_error_with_http_info(**kwargs) # noqa: E501
def get_self_service_error_with_http_info(self, **kwargs): # noqa: E501
"""Get user-facing self-service errors # noqa: E501
This endpoint returns the error associated with a user-facing self-service error. When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for CSRF to work. To prevent token scanning attacks, the public endpoint does not return 404 status codes. More information can be found at [ORY Kratos User-Facing Error Documentation](https://www.ory.sh/docs/kratos/self-service/flows/user-facing-errors). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_error_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str error:
:param _return_http_data_only: response data without HTTP status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(ErrorContainer, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'error'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_self_service_error" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'error' in local_var_params and local_var_params['error'] is not None: # noqa: E501
query_params.append(('error', local_var_params['error'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/errors', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ErrorContainer', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
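The sync/async calling convention documented in the docstrings above can be sketched with a toy stand-in. `FakeApi` and `_do_request` are hypothetical names for illustration only; the real generated client returns a pool handle whose value is fetched with `.get()` rather than a `Future` with `.result()`.

```python
from concurrent.futures import ThreadPoolExecutor

class FakeApi:
    """Toy stand-in for the generated client; not the real ApiClient."""

    def __init__(self):
        self._pool = ThreadPoolExecutor(max_workers=1)

    def get_self_service_error(self, **kwargs):
        if kwargs.pop("async_req", False):
            # Async mode hands back a handle immediately; block on it
            # later to obtain the response value.
            return self._pool.submit(self._do_request, **kwargs)
        # Sync mode (the default) performs the request inline.
        return self._do_request(**kwargs)

    def _do_request(self, error=None):
        # Placeholder for the actual HTTP round trip.
        return {"id": error or "unknown"}

api = FakeApi()
sync_result = api.get_self_service_error(error="abc")
handle = api.get_self_service_error(error="abc", async_req=True)
async_result = handle.result()
```

Both paths yield the same payload; only the point at which the caller blocks differs.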
def get_self_service_verification_request(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based verification flows # noqa: E501
When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for checking the auth session. To prevent scanning attacks, the public endpoint does not return 404 status codes but instead 403 or 500. More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_verification_request(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/verify?request=abcde`). (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: VerificationRequest
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_self_service_verification_request_with_http_info(request, **kwargs) # noqa: E501
def get_self_service_verification_request_with_http_info(self, request, **kwargs): # noqa: E501
"""Get the request context of browser-based verification flows # noqa: E501
When accessing this endpoint through ORY Kratos' Public API, ensure that cookies are set as they are required for checking the auth session. To prevent scanning attacks, the public endpoint does not return 404 status codes but instead 403 or 500. More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_self_service_verification_request_with_http_info(request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str request: Request is the Request ID. The value for this parameter comes from the `request` URL query parameter sent to your application (e.g. `/verify?request=abcde`). (required)
:param _return_http_data_only: response data without HTTP status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(VerificationRequest, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'request'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_self_service_verification_request" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in local_var_params or # noqa: E501
local_var_params['request'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `request` when calling `get_self_service_verification_request`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'request' in local_var_params and local_var_params['request'] is not None: # noqa: E501
query_params.append(('request', local_var_params['request'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/requests/verification', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='VerificationRequest', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
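The client-side required-parameter guard above runs before any network I/O. A minimal sketch of that check (`check_required` is a hypothetical helper; the real code inlines the test and raises the client's own `ApiValueError`):

```python
class ApiValueError(ValueError):
    """Stand-in for the client's ApiValueError exception type."""

def check_required(local_var_params, name, method):
    # Fail fast, before any HTTP request, if a required parameter is
    # missing or explicitly None -- mirroring *_with_http_info methods.
    if name not in local_var_params or local_var_params[name] is None:
        raise ApiValueError(
            "Missing the required parameter `%s` when calling `%s`" % (name, method)
        )

# A supplied value passes silently:
check_required({"request": "abcde"}, "request",
               "get_self_service_verification_request")
```

Passing `None` (or omitting the key) raises `ApiValueError` with the method name embedded in the message.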
def initialize_self_service_browser_login_flow(self, **kwargs): # noqa: E501
"""Initialize browser-based login user flow # noqa: E501
This endpoint initializes a browser-based user login flow. Once initialized, the browser will be redirected to `selfservice.flows.login.ui_url` with the request ID set as a query parameter. If a valid user session exists already, the browser will be redirected to `urls.default_redirect_url`. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_login_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param bool refresh: Refresh a login session. If set to true, this will refresh an existing login session by asking the user to sign in again. This will reset the authenticated_at time of the session.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.initialize_self_service_browser_login_flow_with_http_info(**kwargs) # noqa: E501
def initialize_self_service_browser_login_flow_with_http_info(self, **kwargs): # noqa: E501
"""Initialize browser-based login user flow # noqa: E501
This endpoint initializes a browser-based user login flow. Once initialized, the browser will be redirected to `selfservice.flows.login.ui_url` with the request ID set as a query parameter. If a valid user session exists already, the browser will be redirected to `urls.default_redirect_url`. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_login_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param bool refresh: Refresh a login session. If set to true, this will refresh an existing login session by asking the user to sign in again. This will reset the authenticated_at time of the session.
:param _return_http_data_only: response data without HTTP status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'refresh'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method initialize_self_service_browser_login_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'refresh' in local_var_params and local_var_params['refresh'] is not None: # noqa: E501
query_params.append(('refresh', local_var_params['refresh'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/login', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
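The query-parameter assembly above skips parameters that were not supplied, which is why an omitted `refresh` leaves the URL bare. A sketch of that behavior (`build_query_string` is a hypothetical helper; the real client passes the pair list to its `call_api` machinery):

```python
from urllib.parse import urlencode

def build_query_string(local_var_params, names):
    # Only parameters that are present and non-None become query pairs,
    # matching the `is not None` guard in the generated methods.
    pairs = [(name, local_var_params[name]) for name in names
             if local_var_params.get(name) is not None]
    return urlencode(pairs)
```

So `build_query_string({"refresh": True}, ["refresh"])` yields a `refresh=True` query string, while a `None` value yields an empty one.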
def initialize_self_service_browser_logout_flow(self, **kwargs): # noqa: E501
"""Initialize Browser-Based Logout User Flow # noqa: E501
This endpoint initializes a logout flow. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). On successful logout, the browser will be redirected (HTTP 302 Found) to `urls.default_return_to`. More information can be found at [ORY Kratos User Logout Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-logout). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_logout_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.initialize_self_service_browser_logout_flow_with_http_info(**kwargs) # noqa: E501
def initialize_self_service_browser_logout_flow_with_http_info(self, **kwargs): # noqa: E501
"""Initialize Browser-Based Logout User Flow # noqa: E501
This endpoint initializes a logout flow. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). On successful logout, the browser will be redirected (HTTP 302 Found) to `urls.default_return_to`. More information can be found at [ORY Kratos User Logout Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-logout). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_logout_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without HTTP status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method initialize_self_service_browser_logout_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/logout', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
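Every method above builds its `Accept` header via `select_header_accept`. A plausible minimal version of that helper, assuming the common OpenAPI-generator behavior of joining the declared media types (the real implementation may differ in detail):

```python
def select_header_accept(accepts):
    # Join the acceptable media types into a single Accept header value;
    # return None when the endpoint declares no response content types.
    if not accepts:
        return None
    return ", ".join(accepts)
```

For these endpoints the list is always `['application/json']`, so the header is simply `application/json`.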
def initialize_self_service_browser_registration_flow(self, **kwargs): # noqa: E501
"""Initialize browser-based registration user flow # noqa: E501
This endpoint initializes a browser-based user registration flow. Once initialized, the browser will be redirected to `selfservice.flows.registration.ui_url` with the request ID set as a query parameter. If a valid user session exists already, the browser will be redirected to `urls.default_redirect_url`. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_registration_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.initialize_self_service_browser_registration_flow_with_http_info(**kwargs) # noqa: E501
def initialize_self_service_browser_registration_flow_with_http_info(self, **kwargs): # noqa: E501
"""Initialize browser-based registration user flow # noqa: E501
This endpoint initializes a browser-based user registration flow. Once initialized, the browser will be redirected to `selfservice.flows.registration.ui_url` with the request ID set as a query parameter. If a valid user session exists already, the browser will be redirected to `urls.default_redirect_url`. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos User Login and User Registration Documentation](https://www.ory.sh/docs/next/kratos/self-service/flows/user-login-user-registration). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_registration_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without HTTP status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method initialize_self_service_browser_registration_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/registration', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
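The loop over `kwargs` in each method above rejects unrecognized keyword arguments so that a typo fails loudly instead of being silently dropped. A sketch of that pattern (`reject_unknown_kwargs` is a hypothetical helper; the real code inlines the loop and raises the client's `ApiTypeError`):

```python
class ApiTypeError(TypeError):
    """Stand-in for the client's ApiTypeError exception type."""

def reject_unknown_kwargs(kwargs, all_params, method):
    # A misspelling such as `asynch_req=True` raises here instead of
    # being ignored, which is safer for an HTTP client wrapper.
    for key in kwargs:
        if key not in all_params:
            raise ApiTypeError(
                "Got an unexpected keyword argument '%s' to method %s" % (key, method)
            )

allowed = ['async_req', '_return_http_data_only',
           '_preload_content', '_request_timeout']
# A recognized kwarg passes silently:
reject_unknown_kwargs({"async_req": True}, allowed,
                      "initialize_self_service_browser_registration_flow")
```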
def initialize_self_service_browser_verification_flow(self, via, **kwargs): # noqa: E501
"""Initialize browser-based verification flow # noqa: E501
This endpoint initializes a browser-based verification flow. Once initialized, the browser will be redirected to `selfservice.flows.settings.ui_url` with the request ID set as a query parameter. If no valid user session exists, a login flow will be initialized. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_verification_flow(via, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str via: What to verify. Currently only \"email\" is supported. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.initialize_self_service_browser_verification_flow_with_http_info(via, **kwargs) # noqa: E501
def initialize_self_service_browser_verification_flow_with_http_info(self, via, **kwargs): # noqa: E501
"""Initialize browser-based verification flow # noqa: E501
This endpoint initializes a browser-based verification flow. Once initialized, the browser will be redirected to `selfservice.flows.settings.ui_url` with the request ID set as a query parameter. If no valid user session exists, a login flow will be initialized. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_browser_verification_flow_with_http_info(via, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str via: What to verify. Currently only \"email\" is supported. (required)
:param _return_http_data_only: response data without HTTP status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'via'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method initialize_self_service_browser_verification_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'via' is set
if self.api_client.client_side_validation and ('via' not in local_var_params or # noqa: E501
local_var_params['via'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `via` when calling `initialize_self_service_browser_verification_flow`") # noqa: E501
collection_formats = {}
path_params = {}
if 'via' in local_var_params:
path_params['via'] = local_var_params['via'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/verification/init/{via}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
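Unlike the previous endpoints, this one carries a path parameter: `via` is substituted into the `{via}` placeholder of the URL template. A sketch of that substitution (`render_path` is a hypothetical helper; the real client performs it inside `call_api`, presumably with URL quoting along these lines):

```python
from urllib.parse import quote

def render_path(template, path_params):
    # Replace each {name} placeholder with its URL-quoted value, so a
    # value containing reserved characters cannot corrupt the path.
    for name, value in path_params.items():
        template = template.replace("{%s}" % name, quote(str(value), safe=""))
    return template
```

With `path_params = {'via': 'email'}` the template `/self-service/browser/flows/verification/init/{via}` becomes `/self-service/browser/flows/verification/init/email`.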
def initialize_self_service_recovery_flow(self, **kwargs): # noqa: E501
"""Initialize browser-based account recovery flow # noqa: E501
This endpoint initializes a browser-based account recovery flow. Once initialized, the browser will be redirected to `selfservice.flows.recovery.ui_url` with the request ID set as a query parameter. If a valid user session exists, the request is aborted. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos Account Recovery Documentation](../self-service/flows/password-reset-account-recovery). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_recovery_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.initialize_self_service_recovery_flow_with_http_info(**kwargs) # noqa: E501
def initialize_self_service_recovery_flow_with_http_info(self, **kwargs): # noqa: E501
"""Initialize browser-based account recovery flow # noqa: E501
This endpoint initializes a browser-based account recovery flow. Once initialized, the browser will be redirected to `selfservice.flows.recovery.ui_url` with the request ID set as a query parameter. If a valid user session exists, the request is aborted. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos Account Recovery Documentation](../self-service/flows/password-reset-account-recovery). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_recovery_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without HTTP status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method initialize_self_service_recovery_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/recovery', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def initialize_self_service_settings_flow(self, **kwargs): # noqa: E501
"""Initialize browser-based settings flow # noqa: E501
This endpoint initializes a browser-based settings flow. Once initialized, the browser will be redirected to `selfservice.flows.settings.ui_url` with the request ID set as a query parameter. If no valid user session exists, a login flow will be initialized. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_settings_flow(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.initialize_self_service_settings_flow_with_http_info(**kwargs) # noqa: E501
def initialize_self_service_settings_flow_with_http_info(self, **kwargs): # noqa: E501
"""Initialize browser-based settings flow # noqa: E501
This endpoint initializes a browser-based settings flow. Once initialized, the browser will be redirected to `selfservice.flows.settings.ui_url` with the request ID set as a query parameter. If no valid user session exists, a login flow will be initialized. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...). More information can be found at [ORY Kratos User Settings & Profile Management Documentation](../self-service/flows/user-settings). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.initialize_self_service_settings_flow_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
        :param _return_http_data_only: response data only, without status code
                                       and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method initialize_self_service_settings_flow" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/settings', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def self_service_browser_verify(self, code, via, **kwargs): # noqa: E501
"""Complete the browser-based verification flows # noqa: E501
This endpoint completes a browser-based verification flow. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.self_service_browser_verify(code, via, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str code: (required)
        :param str via: What to verify. Currently only \"email\" is supported. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.self_service_browser_verify_with_http_info(code, via, **kwargs) # noqa: E501
def self_service_browser_verify_with_http_info(self, code, via, **kwargs): # noqa: E501
"""Complete the browser-based verification flows # noqa: E501
This endpoint completes a browser-based verification flow. > This endpoint is NOT INTENDED for API clients and only works with browsers (Chrome, Firefox, ...) and HTML Forms. More information can be found at [ORY Kratos Email and Phone Verification Documentation](https://www.ory.sh/docs/kratos/selfservice/flows/verify-email-account-activation). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.self_service_browser_verify_with_http_info(code, via, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str code: (required)
        :param str via: What to verify. Currently only \"email\" is supported. (required)
        :param _return_http_data_only: response data only, without status code
                                       and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'code',
'via'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method self_service_browser_verify" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'code' is set
if self.api_client.client_side_validation and ('code' not in local_var_params or # noqa: E501
local_var_params['code'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `code` when calling `self_service_browser_verify`") # noqa: E501
# verify the required parameter 'via' is set
if self.api_client.client_side_validation and ('via' not in local_var_params or # noqa: E501
local_var_params['via'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `via` when calling `self_service_browser_verify`") # noqa: E501
collection_formats = {}
path_params = {}
if 'code' in local_var_params:
path_params['code'] = local_var_params['code'] # noqa: E501
if 'via' in local_var_params:
path_params['via'] = local_var_params['via'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/self-service/browser/flows/verification/{via}/confirm/{code}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
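The `{via}` and `{code}` placeholders in the endpoint path above are filled in from `path_params` inside `ApiClient.call_api`. Conceptually this is plain template substitution — the helper below is an illustration only (the real client also URL-encodes the values), with a hypothetical name:

```python
# Illustration only: how the verification endpoint path is assembled.
# The generated client performs this substitution (plus URL encoding)
# inside ApiClient.call_api; build_verify_path is a hypothetical helper.
VERIFY_PATH = '/self-service/browser/flows/verification/{via}/confirm/{code}'

def build_verify_path(code, via='email'):
    # 'email' is currently the only supported value for `via`.
    return VERIFY_PATH.format(via=via, code=code)

print(build_verify_path('abc123'))
```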
def whoami(self, **kwargs): # noqa: E501
"""Check who the current HTTP session belongs to # noqa: E501
        Uses the HTTP headers in the GET request to determine (e.g. by checking the cookies) who is authenticated. Returns a session object in the body, or 401 if the credentials are invalid or no credentials were sent. Additionally, when the request is successful, it adds the user ID to the 'X-Kratos-Authenticated-Identity-Id' header in the response. This endpoint is useful for reverse proxies and API gateways.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.whoami(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Session
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.whoami_with_http_info(**kwargs) # noqa: E501
def whoami_with_http_info(self, **kwargs): # noqa: E501
"""Check who the current HTTP session belongs to # noqa: E501
        Uses the HTTP headers in the GET request to determine (e.g. by checking the cookies) who is authenticated. Returns a session object in the body, or 401 if the credentials are invalid or no credentials were sent. Additionally, when the request is successful, it adds the user ID to the 'X-Kratos-Authenticated-Identity-Id' header in the response. This endpoint is useful for reverse proxies and API gateways.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.whoami_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
        :param _return_http_data_only: response data only, without status code
                                       and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(Session, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method whoami" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/sessions/whoami', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Session', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
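When `async_req=True`, generated clients of this style typically hand the call to a `multiprocessing.pool.ThreadPool` and return the `AsyncResult`, whose `.get()` blocks until the HTTP call finishes — which is why the docstrings above show `thread = api.whoami(async_req=True)` followed by `thread.get()`. A minimal stdlib sketch of that mechanism, with the real HTTP request replaced by a stand-in function:

```python
from multiprocessing.pool import ThreadPool

def call_api(path):
    # Stand-in for the real HTTP request; returns a fake session payload.
    return {"path": path, "active": True}

pool = ThreadPool(1)
# async_req=True analogue: submit the call and get an AsyncResult handle back
thread = pool.apply_async(call_api, ('/sessions/whoami',))
result = thread.get()  # blocks until the worker thread finishes
pool.close()
pool.join()
```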
# --- src/pyinsect/documentModel/representations/__init__.py (vishalbelsare/PyINSECT, Apache-2.0) ---
from pyinsect.documentModel.representations.DocumentNGramGaussNormGraph import (
DocumentNGramGaussNormGraph,
)
from pyinsect.documentModel.representations.DocumentNGramSymWinGraph import (
DocumentNGramSymWinGraph,
)
# --- sampi/sdes/__init__.py (Jonathan-Lindbloom/sampi, MIT) ---
from sampi.sdes.base import StochDiffEq
from sampi.sdes.util import draw_brownian
import sampi.sdes.catalog
# --- Jumpscale/clients/openvcloud/test_OpenvCloudClientFactory.py (threefoldtech/JumpscaleX, Apache-2.0) ---
import pytest
import unittest
import sys
from unittest import mock
from Jumpscale import j
class TestOpencCloudClientFactory(unittest.TestCase):
@pytest.mark.ssh_factory
@mock.patch("Jumpscale.clients.openvcloud.Account.Account")
def test_machine_create_name_empty(self, account):
from Jumpscale.clients.openvcloud.Space import Space
with pytest.raises(RuntimeError):
model = dict()
model["id"] = 123
space = Space(account=account, model=model)
space.machine_create(name=" ", sshkeyname="auto_0")
@pytest.mark.skip(reason="test need to be reviewed")
@pytest.mark.ssh_factory
@mock.patch("Jumpscale.clients.openvcloud.Account.Account")
@mock.patch("Jumpscale.clients.openvcloud.Machine.Machine")
def test_machine_create_image_find_id_called(self, account, machine):
from Jumpscale.clients.openvcloud.Space import Space
model = dict()
model["id"] = 123
space = Space(account=account, model=model)
space.account.client.api.cloudapi.machines.list = mock.MagicMock(return_value=[{"name": "dummy"}])
space.image_find_id = mock.MagicMock()
space._node_set = mock.MagicMock()
space.machine_create(name="dummy", sshkeyname="auto_0", image="imageName", sizeId="sizeID")
space.image_find_id.assert_called_with("imageName")
@pytest.mark.skip(reason="test need to be reviewed")
@pytest.mark.ssh_factory
@mock.patch("Jumpscale.clients.openvcloud.Account")
@mock.patch("Jumpscale.clients.openvcloud.Space.image_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.size_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.machines")
def test_machine_create_size_id(self, account, a, b, c, d):
from Jumpscale.clients.openvcloud import Space
model = dict()
model["id"] = 123
space = Space(account=account, model=model)
space.machine_create(name="dummy", memsize=5, vcpus=5, sshkeyname="auto_0", image="imageName")
space.size_find_id.assert_called_with(5, 5)
@pytest.mark.skip(reason="test need to be reviewed")
@pytest.mark.ssh_factory
@mock.patch("Jumpscale.clients.openvcloud.Account")
@mock.patch("Jumpscale.clients.openvcloud.Space.image_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.size_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.machines")
def test_machine_create_configure_machine(self, account, a, b, c, d):
from Jumpscale.clients.openvcloud import Space
model = dict()
model["id"] = 123
space = Space(account=account, model=model)
space.machine_create(name="dummy", sshkeyname="auto_0", sshkeypath="auto_0Path")
space.configure_machine.assert_called_with(
machine=space.machines["dummy"], name="dummy", sshkeyname="auto_0", sshkey_path="auto_0Path"
)
@pytest.mark.skip(reason="test need to be reviewed")
@pytest.mark.ssh_factory
@mock.patch("Jumpscale.clients.openvcloud.Account")
@mock.patch("Jumpscale.clients.openvcloud.Space.image_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.size_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.machines")
@mock.patch("Jumpscale.clients.openvcloud.Space.createPortForward")
@mock.patch("Jumpscale.clients.openvcloud.Space._authorizeSSH")
def test_machine_create_create_port_forward(self, account, a, b, c, e, d):
from Jumpscale.clients.openvcloud import Space
model = dict()
model["id"] = 123
space = Space(account=account, model=model)
space.machine_create(name="dummy", sshkeyname="auto_0", sshkeypath="auto_0Path")
machine = space.machines["dummy"]
space.createPortForward.assert_called_with(machine)
@pytest.mark.skip(reason="test need to be reviewed")
@pytest.mark.ssh_factory
@mock.patch("Jumpscale.clients.openvcloud.Account")
@mock.patch("Jumpscale.clients.openvcloud.Space.image_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.size_find_id")
@mock.patch("Jumpscale.clients.openvcloud.Space.machines")
@mock.patch("Jumpscale.clients.openvcloud.Space.createPortForward")
@mock.patch("Jumpscale.clients.openvcloud.Space._authorizeSSH")
def test_machine_create_authorize_ssh(self, account, a, b, c, e, d):
from Jumpscale.clients.openvcloud import Space
model = dict()
model["id"] = 123
space = Space(account=account, model=model)
space.machine_create(name="dummy", sshkeyname="auto_0", sshkeypath="auto_0Path")
machine = space.machines["dummy"]
space._authorizeSSH.assert_called_with(machine=machine, sshkeyname="auto_0", sshkey_path="auto_0Path")
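The tests above follow one pattern throughout: swap a collaborator for a `mock.MagicMock`, invoke the code under test, then verify the call with `assert_called_with`. A self-contained illustration of that pattern with stdlib `unittest.mock` — the `Space` class here is a hypothetical stand-in, not the real Jumpscale one:

```python
from unittest import mock

class Space:
    """Hypothetical stand-in for the real openvcloud Space."""
    def image_find_id(self, name):
        raise RuntimeError("would hit the cloud API")

    def machine_create(self, name, image):
        # Delegates to the collaborator we will stub out in the test.
        return self.image_find_id(image)

space = Space()
space.image_find_id = mock.MagicMock(return_value=42)  # stub the API call

assert space.machine_create(name="dummy", image="imageName") == 42
space.image_find_id.assert_called_with("imageName")  # verifies the argument
```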
# --- tests/unit/utils/test_source_utils/test_import_python_file_when_already_loaded/some_module.py (dumpmemory/zenml, Apache-2.0) ---
def some_func():
return
# --- Inheritance/mix_it_06E/project/parking_mall/level3.py (MNikov/Python-OOP-October-2020, MIT) ---
# from Inheritance.mix_it_06E.project.parking_mall.parking_mall import ParkingMall
from project.parking_mall.parking_mall import ParkingMall
class Level3(ParkingMall):
def __init__(self):
super().__init__(80)
# --- tests/conftest.py (Semantic-Health/deep-patient-cohorts, MIT) ---
import pytest
from deep_patient_cohorts.noisy_labeler import NoisyLabeler
@pytest.fixture
def noisy_labeler():
return NoisyLabeler()
# --- shared/mimicry_db/__init__.py (RebeccaClarkson/mimicry-db, FSFAP) ---
from mimicry_db.patient import Patient; Patient
from mimicry_db.admission import Admission; Admission
from mimicry_db.diagnosis import Diagnosis; Diagnosis
from mimicry_db.diagnosis_description import DiagnosisDescription; DiagnosisDescription
# --- configs/cbnet/htc_cbv2_swin_base_patch4_window7_mstrain_400-1400_giou_4conv1f_adamw_20e_coco.py (ace19-dev/CBNetV2, Apache-2.0) ---
_base_ = 'htc_cbv2_swin_base_patch4_window7_mstrain_400-1400_adamw_20e_coco.py'
model = dict(
roi_head=dict(
bbox_head=[
dict(
type='ConvFCBBoxHead',
num_shared_convs=4,
num_shared_fcs=1,
in_channels=256,
conv_out_channels=256,
fc_out_channels=1024,
roi_feat_size=7,
num_classes=4,
bbox_coder=dict(
type='DeltaXYWHBBoxCoder',
target_means=[0., 0., 0., 0.],
target_stds=[0.1, 0.1, 0.2, 0.2]),
reg_class_agnostic=True,
reg_decoded_bbox=True,
norm_cfg=dict(type='BN', requires_grad=True),
loss_cls=dict(
type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
loss_bbox=dict(type='GIoULoss', loss_weight=10.0)),
dict(
type='ConvFCBBoxHead',
num_shared_convs=4,
num_shared_fcs=1,
in_channels=256,
conv_out_channels=256,
fc_out_channels=1024,
roi_feat_size=7,
num_classes=4,
bbox_coder=dict(
type='DeltaXYWHBBoxCoder',
target_means=[0., 0., 0., 0.],
target_stds=[0.05, 0.05, 0.1, 0.1]),
reg_class_agnostic=True,
reg_decoded_bbox=True,
norm_cfg=dict(type='BN', requires_grad=True),
loss_cls=dict(
type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
loss_bbox=dict(type='GIoULoss', loss_weight=10.0)),
dict(
type='ConvFCBBoxHead',
num_shared_convs=4,
num_shared_fcs=1,
in_channels=256,
conv_out_channels=256,
fc_out_channels=1024,
roi_feat_size=7,
num_classes=4,
bbox_coder=dict(
type='DeltaXYWHBBoxCoder',
target_means=[0., 0., 0., 0.],
target_stds=[0.033, 0.033, 0.067, 0.067]),
reg_class_agnostic=True,
reg_decoded_bbox=True,
norm_cfg=dict(type='BN', requires_grad=True),
loss_cls=dict(
type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
loss_bbox=dict(type='GIoULoss', loss_weight=10.0))
]
)
)
# --- nrk/utils/Equalizers.py (Slimmerd/DiscordBotPY, Apache-2.0) ---
class Equalizers:
def flat(self):
eq = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
return eq
def boost(self):
eq = [
-0.075,
1.0,
1.0,
1.0,
1.0,
1.0,
1.0,
0.0,
0.0,
0.0,
0.0,
0.0,
1.0,
1.0,
1.0,
]
return eq
def metal(self):
eq = [
0.0,
0.1,
0.1,
0.15,
0.13,
0.1,
0.0,
0.125,
0.175,
0.175,
0.125,
0.125,
0.1,
0.075,
0.0,
]
return eq
def piano(self):
eq = [
-0.25,
-0.25,
-0.125,
0.0,
0.25,
0.25,
0.0,
-0.25,
-0.25,
0.0,
0.0,
0.5,
0.25,
-0.025,
        ]
        return eq
# --- smartversion/foo.py (rackjoobee/smartversion, MIT) ---
def test_usage():
l = []
for i in range(20):
l.append(Version(i, i))
for i in range(20):
assert(l[i] == Version(i, i))
v1 = Version(0)
v2 = Version(0)
assert(v1 == v2)
assert(hash(v1) == hash(v2))
v1 = Version(1, 2, '3-rc6')
v2 = Version(1, 2, 4)
assert(v1 != v2)
assert(hash(v1) != hash(v2))
v2 = Version(1, 2, '3-rc6')
assert(v1 == v2)
assert(hash(v1) == hash(v2))
d = {}
for i in range(20):
v = Version(i, i, i)
d[v] = i
for i in range(20):
v = Version(i, i, i)
assert(d[v] == i)
v1 = Version(0, 1, 2)
v2 = Version(0, 1, 3)
s = set([v1, v2])
assert(len(s) == 2)
v2 = Version(0, 1, 2)
s = set([v1, v2])
assert(len(s) == 1)
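test_usage above is really exercising the `__eq__`/`__hash__` contract: versions that compare equal must hash equal, so duplicates collapse in sets and dicts. A toy sketch of that contract, independent of smartversion's actual `Version` (which additionally handles patch strings, rc ordering, and so on):

```python
import functools

@functools.total_ordering
class ToyVersion:
    """Toy illustration of the eq/hash/ordering contract tested above."""
    def __init__(self, major=0, minor=0, patch=0):
        self.key = (major, minor, patch)
    def __eq__(self, other):
        return self.key == other.key
    def __lt__(self, other):
        return self.key < other.key
    def __hash__(self):
        return hash(self.key)  # equal objects must hash equal

assert ToyVersion(1, 2) == ToyVersion(1, 2, 0)
assert hash(ToyVersion(1, 2)) == hash(ToyVersion(1, 2, 0))
assert len({ToyVersion(0, 1, 2), ToyVersion(0, 1, 2)}) == 1
assert ToyVersion(1, 9, 9) < ToyVersion(2, 0, 0)
```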
def test_compare():
compare = lambda l: l[0] == l[1]
# TODO check both semver and pep404 comparisons (and interactions)
# TODO check semantics where default is wrong
# Equalities
v1 = Version(major=0)
v2 = Version() # Implied 0
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1)
v2 = Version('foo', 1)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1)
v2 = Version('foo', 1, 0)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1)
v2 = Version('foo', 1, 1)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1)
v2 = Version('foo', 1, 1, 0)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1, 1)
v2 = Version('foo', 1, 1, 1)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1, '1')
v2 = Version('foo', 1, 1, '1')
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1, '1rc')
v2 = Version('foo', 1, 1, '1rc')
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1, '1rc1')
v2 = Version('foo', 1, 1, '1rc1')
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1, 'rc1')
v2 = Version('foo', 1, 1, 'rc1')
assert(v1 == v2)
assert(v2 == v1)
v1 = Version('foo', 1, 1, 'rc')
v2 = Version('foo', 1, 1, 'rc')
assert(v1 == v2)
assert(v2 == v1)
# extra_str shouldn't affect comparison
v1 = Version('foo', 1, extra_str = 'asd')
v2 = Version('foo', 1, extra_str = 'lkj')
assert(v1 == v2)
assert(v2 == v1)
# nor release_date
v1 = Version('foo', 1, release_date = date(2000, 1, 1))
v2 = Version('foo', 1, release_date = date(2000, 1, 2))
assert(v1 == v2)
assert(v2 == v1)
# nor build_meta
v1 = Version('foo', 1, build_meta = 'lkasdjf')
v2 = Version('foo', 1, build_meta = ';KFJDAa')
assert(v1 == v2)
assert(v2 == v1)
# Inequalities
v1 = Version(major=0)
v2 = Version(major=1)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version(major=0, minor=0)
v2 = Version(major=0, minor=1)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version(major=0, minor=0, patch=0)
v2 = Version(major=0, minor=0, patch=1)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version('foo', have_clue=True)
v2 = Version('bar', have_clue=True)
#assert(v1 != v2)
#assert(v2 != v1)
v1 = Version('foo', 1)
v2 = Version('foo', 2)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version('foo', 1, 1)
v2 = Version('foo', 1, 2)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version('foo', 1, 1, 1)
v2 = Version('foo', 1, 1, 2)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version('foo', 1, 1, '1test')
v2 = Version('foo', 1, 1, '1frob')
assert(v1 != v2)
assert(v2 != v1)
v1 = Version('foo', 1, 1, '1test5')
v2 = Version('foo', 1, 1, '1test6')
assert(v1 != v2)
assert(v2 != v1)
v1 = Version('foo', 1, 1, 'test5')
v2 = Version('foo', 1, 1, 'test6')
assert(v1 != v2)
assert(v2 != v1)
v1 = Version('foo', 1, 1, 'test')
v2 = Version('foo', 1, 1, 'frob')
assert(v1 != v2)
assert(v2 != v1)
# Ordering
v1 = Version(major=0)
v2 = Version(major=1)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1)
v2 = Version('foo', 2)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, None)
v2 = Version('foo', 1, 1)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1)
v2 = Version('foo', 1, 2)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 2)
v2 = Version('foo', 2, 1)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, 1)
v2 = Version('foo', 1, 1, 2)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, 2)
v2 = Version('foo', 1, 2, 1)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 2, 1)
v2 = Version('foo', 2, 1, 1)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, 'rc0')
v2 = Version('foo', 1, 1, 'rc1')
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, '1-rc1')
v2 = Version('foo', 1, 1, 1)
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, 'rc0')
v2 = Version('foo', 1, 1, '1rc0')
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, '1rc0')
v2 = Version('foo', 1, 1, '1rc1')
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, '1rc1')
v2 = Version('foo', 1, 1, '2rc0')
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, '1rc1')
v2 = Version('foo', 1, 2, '1rc1')
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 2, '1rc1')
v2 = Version('foo', 2, 1, '1rc1')
assert(v1 < v2)
assert(v2 > v1)
v1 = Version('foo', 1, 1, 'rc1')
v2 = Version('foo', 1, 1)
assert(v1 < v2)
assert(v2 > v1)
def test_compare_typical():
# Normal usage
v1 = Version.parse('linux-2.4.6')
v2 = Version.parse('linux-2.4.8')
v3 = Version.parse('linux-2.4.10-rc1') # rc < version proper
v4 = Version.parse('linux-2.4.10')
v5 = Version.parse('linux-2.6.27.10')
v6 = Version.parse('linux-2.6.27.11')
v7 = Version.parse('linux-3.0')
v8 = Version.parse('linux-3.0.1')
v9 = Version.parse('linux-3.0.65')
v10 = Version.parse('linux-3.0.65.1')
v11 = Version.parse('linux-4.x')
v12 = Version.parse('linux-5.X')
assert(v1 < v2)
assert(v2 > v1)
assert(v1 != v2)
assert(v2 != v1)
# Make sure an rc compares ABOVE a lower version that has no patch_str
assert(v2 < v3)
assert(v3 > v2)
assert(v2 != v3)
assert(v3 != v2)
assert(v3 < v4)
assert(v4 > v3)
assert(v3 != v4)
assert(v4 != v3)
assert(v4 < v5)
assert(v5 > v4)
assert(v4 != v5)
assert(v5 != v4)
assert(v5 < v6)
assert(v6 > v5)
assert(v5 != v6)
assert(v6 != v5)
assert(v6 < v7)
assert(v7 > v6)
assert(v6 != v7)
assert(v7 != v6)
assert(v7 < v8)
assert(v8 > v7)
assert(v7 != v8)
assert(v8 != v7)
assert(v8 < v9)
assert(v9 > v8)
assert(v8 != v9)
assert(v9 != v8)
assert(v9 < v10)
assert(v10 > v9)
assert(v9 != v10)
assert(v10 != v9)
assert(v10 < v11)
assert(v11 > v10)
assert(v10 != v11)
assert(v11 != v10)
assert(v11 < v12)
assert(v12 > v11)
assert(v11 != v12)
assert(v12 != v11)
l = sorted([v12, v11, v10, v9, v8, v7, v6, v5, v4, v3, v2, v1])
for i in range(10):
assert(l[0] == v1)
assert(l[1] == v2)
assert(l[2] == v3)
assert(l[3] == v4)
assert(l[4] == v5)
assert(l[5] == v6)
assert(l[6] == v7)
assert(l[7] == v8)
assert(l[8] == v9)
assert(l[9] == v10)
assert(l[10] == v11)
assert(l[11] == v12)
random.shuffle(l)
l.sort()
# TODO compare for
# 1.71 (decimal) < 1.8
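# For the TODO above: with decimal-style minors, '1.71' should sort below
# '1.8' (0.71 < 0.8), not above it as component-wise ints (71 > 8).
# Standalone sketch of a decimal-aware key (hypothetical helper, not part
# of the Version class under test):
from fractions import Fraction
def _decimal_key(s):
    whole, _, frac = s.partition('.')
    return (int(whole), Fraction(int(frac or 0), 10 ** max(len(frac), 1)))
assert(_decimal_key('1.71') < _decimal_key('1.8'))
assert(_decimal_key('1.8') == _decimal_key('1.80'))
assert(_decimal_key('1.9') < _decimal_key('2.0'))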
def test_wildcards():
for char in ['x', 'X', '*']:
# Equality
v1 = Version(major=char)
v2 = Version(major=1)
assert(v1 == v2)
assert(v2 == v1)
# wildcard implies wildcard for any lesser fields
v1 = Version(major=char)
v2 = Version(major=1, minor=1)
assert(v1 == v2)
assert(v2 == v1)
v2 = Version(major=1, minor=1, patch='1.1')
assert(v1 == v2)
assert(v2 == v1)
v1 = Version(major=char, minor=char)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version(major=0, minor=char)
v2 = Version(major=0, minor=0)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version(major=0, minor=0, patch=char)
v2 = Version(major=0, minor=0, patch=0)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version(major=0, minor=char, patch=0)
v2 = Version(major=0, minor=9, patch=0)
assert(v1 == v2)
assert(v2 == v1)
v2 = Version(major=0, minor=1, patch=0)
assert(v1 == v2)
assert(v2 == v1)
v2 = Version(major=0, minor=char, patch=0)
assert(v1 == v2)
assert(v2 == v1)
v1 = Version(major=0, minor=0, patch=char)
v2 = Version(major=0, minor=0, patch='1.5')
assert(v1 == v2)
assert(v2 == v1)
v1 = Version(major=0, minor=0, patch='1.'+char)
v2 = Version(major=0, minor=0, patch='1.5')
assert(v1 == v2)
assert(v2 == v1)
v1 = Version(major=0, minor=0, patch=char+'.5')
v2 = Version(major=0, minor=0, patch='1.5')
assert(v1 == v2)
assert(v2 == v1)
# Inequality
v1 = Version(major=0, minor=char)
v2 = Version(major=1, minor=1)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version(major=0, minor=0, patch=char)
v2 = Version(major=0, minor=1, patch=char)
assert(v1 != v2)
assert(v2 != v1)
v1 = Version(major=0, minor=0, patch='1.'+char)
v2 = Version(major=0, minor=0, patch='2.5')
assert(v1 != v2)
assert(v2 != v1)
v1 = Version(major=0, minor=0, patch=char+'.4')
v2 = Version(major=0, minor=0, patch='1.5')
assert(v1 != v2)
assert(v2 != v1)
# TODO a few more edge cases
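# The wildcard behaviour above reads as: a wildcard field matches any value
# and implies wildcards for all lesser fields. Standalone component-wise
# sketch of that rule (hypothetical helper, not the real __eq__):
WILDCARDS = ('x', 'X', '*')
def _wild_eq(a, b):
    for x, y in zip(a, b):
        if x in WILDCARDS or y in WILDCARDS:
            return True  # wildcard swallows this field and all lesser ones
        if x != y:
            return False
    return True
assert(_wild_eq(('*',), (1, 1)))
assert(_wild_eq((0, 'x', 0), (0, 9, 0)))
assert(not _wild_eq((0, 'x'), (1, 1)))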
def test_methods():
class MyDate(date):
@staticmethod
def today():
return date(2000, 2, 1)
Version.date_class = MyDate
#### .age()
v1 = Version('foo', 1, 0, release_date=date(2000, 1, 1))
assert(v1.age() == timedelta(31))
assert(v1.age_human() == '1 month, 1 day')
v2 = Version('foo', 1, 0, release_date=date(2000, 2, 1))
assert(v2.age() == timedelta(0))
assert(v2.age_human() == '0 days')
v3 = Version('foo', 1, 0)
assert(v3.age() > timedelta(10988))
assert(v3.age_human() == 'none')
v4 = Version('foo', 1, 0, release_date=date(1970, 1, 1))
assert(v4.age() == timedelta(10988))
assert(v4.age_human() == '30 years, 1 month, 8 days')
# dup: same version and release_date as v1
v5 = Version('foo', 1, 0, release_date=date(2000, 1, 1))
assert(v5.age_human() == '1 month, 1 day')
l = sorted([v2, v1, v5, v4, v3], key=lambda x: x.age())
assert(l[0] == v2)
assert(l[1] == v1)
assert(l[2] == v5)
assert(l[3] == v4)
assert(l[4] == v3)
v1 = Version('foo', 1, 1, release_date=date(1999, 2, 1))
assert(v1.is_older_than('1m'))
assert(v1.is_older_than('12m4d'))
assert(v1.is_older_than('1y') is False)
assert(v1.is_newer_than('2y'))
assert(v1.is_newer_than('1y1d'))
assert(v1.is_newer_than('1y') is False)
assert(v1.is_newer_than('12m4d') is False)
v1 = Version('foo', 1, 1, release_date=date(1999, 2, 1))
v2 = Version('foo', 1, 1, release_date=date(1999, 3, 1))
assert(v2.is_newer_than(v1))
assert(v1.is_older_than(v2))
assert(v1.is_older_than(date(1999, 2, 2)))
assert(v1.is_older_than(date(1999, 2, 1)) is False)
assert(v1.is_newer_than(date(1998, 2, 2)))
assert(v1.is_newer_than(date(1999, 2, 1)) is False)
v1 = Version('foo', 1, 1, release_date=date(2000, 2, 1))
assert(v1.is_older_than('0d') is False)
v1.release_date = date(2000, 1, 31)
assert(v1.is_older_than('0d'))
assert(v1.is_newer_than('1d') is False)
assert(v1.is_older_than('1d') is False)
v1.release_date = date(2000, 1, 1)
assert(v1.is_older_than('1m'))
assert(v1.is_older_than('30d'))
assert(v1.is_older_than('31d') is False)
assert(v1.is_newer_than('32d'))
assert(v1.is_newer_than('31d') is False)
v1.release_date = date(1999, 2, 1)
v2 = Version('foo', 1, 1, release_date=date(1999, 2, 1))
assert(v1.is_older_than(v2) is False)
assert(v1.is_newer_than(v2) is False)
assert(v2.is_older_than(v1) is False)
assert(v2.is_newer_than(v1) is False)
v2.release_date = date(1999, 2, 2)
assert(v1.is_older_than(v2))
assert(v2.is_older_than(v1) is False)
assert(v2.is_newer_than(v1))
assert(v1.is_newer_than(v2) is False)
v1 = Version('foo', 1, 1, release_date=date(1990, 2, 1))
assert(v1.is_older_than('10y2d') is False)
assert(v1.is_newer_than('10y2d') is False)
assert(v1.is_newer_than('10y3d'))
assert(v1.is_older_than('10y1d'))
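# The duration strings used above ('10y2d', '12m4d', ...) are consistent
# with fixed conversions of 1y = 365 days and 1m = 30 days (a reading of
# these asserts, not taken from the implementation). Standalone sketch:
import re
from datetime import timedelta
def _parse_duration(s):
    units = {'y': 365, 'm': 30, 'd': 1}
    days = sum(int(n) * units[u] for n, u in re.findall(r'(\d+)([ymd])', s))
    return timedelta(days=days)
assert(_parse_duration('10y2d') == timedelta(days=3652))
assert(_parse_duration('12m4d') == timedelta(days=364))
assert(_parse_duration('1y1d') == timedelta(days=366))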
def test_parse_version():
pass # TODO
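# Until the TODO is filled in, a standalone sketch of the kind of split
# this test might exercise: pulling name/major/minor/patch out of a simple
# 'name-X.Y.Z' banner with a bare regex (illustrative only; the real
# Version.parse() handles far more shapes than this).
import re
def _split_banner(s):
    m = re.match(r'(?P<name>[A-Za-z][\w ]*?)[-_/ ]'
                 r'(?P<major>\d+)(?:\.(?P<minor>\d+))?(?:\.(?P<patch>\d+))?$', s)
    if m is None:
        return None
    g = m.groupdict()
    to_int = lambda x: int(x) if x is not None else None
    return (g['name'], int(g['major']), to_int(g['minor']), to_int(g['patch']))
assert(_split_banner('linux-3.0.77') == ('linux', 3, 0, 77))
assert(_split_banner('OpenSSH_4.3') == ('OpenSSH', 4, 3, None))
assert(_split_banner('Apache/2') == ('Apache', 2, None, None))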
def test_version_range():
# Check Version.parse() handles ranges - should it? TODO
v1 = Version(0, 1, 2)
v2 = Version(0, 1, 8)
# Empty
vr = VersionRange()
# Bad positional
assert_raises(VersionInitError, VersionRange, v1)
# Bad KW
assert_raises(VersionInitError, VersionRange, start=v1)
assert_raises(VersionInitError, VersionRange, end=v1)
# Positional
vr = VersionRange(v1, v2)
assert(type(vr.start) is Version)
assert(type(vr.end) is Version)
assert(vr.start == v1)
assert(vr.end == v2)
# Kw
vr = VersionRange(start=v1, end=v2)
assert(type(vr.start) is Version)
assert(type(vr.end) is Version)
assert(vr.start == v1)
assert(vr.end == v2)
# Contains
v1 = Version(1)
v2 = Version(7)
vr = VersionRange(v1, v2)
assert(Version(0) not in vr)
assert(Version(0, 9) not in vr)
assert(Version(0, 9, 99) not in vr)
assert(Version(1) in vr)
assert(Version(1, 0) in vr)
assert(Version(1, 0, 0) in vr)
assert(Version(1, 0, 1) in vr)
assert(Version(1, 1, 0) in vr)
assert(Version(2) in vr)
assert(Version(6) in vr)
assert(Version(6, 9) in vr)
assert(Version(6, 99, 999) in vr)
assert(Version(7) in vr)
assert(Version(7, 0) in vr)
assert(Version(7, 0, 0) in vr)
assert(Version(7, 0, 1) not in vr)
assert(Version(7, 1, 0) not in vr)
assert(Version(8) not in vr)
v1 = Version(0, 1, 2)
v2 = Version(0, 1, 8)
vr = VersionRange(v1, v2)
assert(Version(0) not in vr)
assert(Version(0, 1) not in vr)
assert(Version(0, 1, 1) not in vr)
assert(Version(0, 1, 2) in vr)
assert(Version(0, 1, 3) in vr)
assert(Version(0, 1, 7) in vr) # TODO fractional semantics
assert(Version(0, 1, 8) in vr)
assert(Version(0, 1, 9) not in vr)
assert(Version(0, 2) not in vr)
assert(Version(1) not in vr)
v1 = Version(1, 345)
v2 = Version(1, 456)
vr = VersionRange(v1, v2)
assert(Version(0) not in vr)
assert(Version(1) not in vr)
assert(Version(1, 0) not in vr)
assert(Version(1, 111) not in vr)
assert(Version(1, 344) not in vr)
assert(Version(1, 344, 0) not in vr)
assert(Version(1, 344, 1) not in vr)
assert(Version(1, 345) in vr)
assert(Version(1, 345, 0) in vr)
assert(Version(1, 345, 1) in vr)
assert(Version(1, 346) in vr)
assert(Version(1, 400) in vr)
assert(Version(1, 455) in vr)
assert(Version(1, 456) in vr)
assert(Version(1, 456, 0) in vr)
assert(Version(1, 456, 1) not in vr)
assert(Version(1, 457) not in vr)
assert(Version(1, 457, 0) not in vr)
assert(Version(1, 99999) not in vr)
assert(Version(2) not in vr)
# TODO epoch, prerelease, beta etc
# TODO esp rc/alpha/beta edge cases
# TODO few parse here too
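# The containment rules exercised above behave like inclusive bounds with
# missing fields padded to zero: Version(7) bounds the range at 7.0.0, so
# 7.0.0 is in and 7.0.1 is out. Standalone tuple-based sketch of that rule
# (hypothetical; not the VersionRange implementation):
def _pad(t, n=3):
    return tuple(t) + (0,) * (n - len(t))
def _in_range(v, start, end):
    return _pad(start) <= _pad(v) <= _pad(end)
assert(_in_range((1,), (1,), (7,)))
assert(_in_range((6, 99, 999), (1,), (7,)))
assert(_in_range((7, 0, 0), (1,), (7,)))
assert(not _in_range((7, 0, 1), (1,), (7,)))
assert(not _in_range((0, 9, 99), (1,), (7,)))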
def test_parse():
# TODO SVN/cvs/rcs version numbers?
####
#### In-the-wild / generally semver-compatible tests
####
assert_raises(VersionParseError, Version.parse, '')
s = 'OpenSSH'
assert_raises(VersionParseError, Version.parse, s)
s = 'OpenSSH 4'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.name == 'OpenSSH')
assert(type(v.major) is int)
assert(v.major == 4)
assert(v.minor is None)
assert(v.patch is None)
assert(v.name_sep == ' ')
assert(v.version_sep is None)
assert(str(v) == s)
s = 'OpenSSH_4'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(v.minor is None)
assert(v.patch is None)
assert(v.name_sep == '_')
assert(v.version_sep is None)
assert(str(v) == s)
s = 'OpenSSH-4'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(v.minor is None)
assert(v.patch is None)
assert(v.name_sep == '-')
assert(v.version_sep is None)
assert(str(v) == s)
s = 'OpenSSH_4.3'
v = Version.parse(s)
assert(type(v.name_clean) in STR_TYPES)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) is int)
assert(v.minor == 3)
assert(v.patch is None)
assert(v.patch_str is None)
assert(v.name_sep == '_')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH_4.3'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.name_clean) in STR_TYPES)
assert(v.major == 4)
assert(type(v.major) is int)
assert(v.minor == 3)
assert(type(v.minor) is int)
assert(v.patch is None)
assert(v.patch_str is None)
assert(v.name_sep == '_')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH-4.3'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.major == 4)
assert(v.minor == 3)
assert(v.patch is None)
assert(v.name_sep == '-')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'Cisco-1.25'
v = Version.parse(s)
assert(v.name_clean == 'cisco')
assert(v.name == 'Cisco')
assert(v.major == 1)
assert(v.minor == 25)
assert(type(v.minor) is int)
assert(v.patch is None)
assert(v.name_sep == '-')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH_6.2'
v = Version.parse(s)
assert(type(v.name_clean) in STR_TYPES)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 6)
assert(type(v.minor) is int)
assert(v.minor == 2)
assert(str(v) == s)
s = 'OpenSSH_6.2p5'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.major == 6)
assert(v.minor == 2)
assert(type(v.patch) is int)
assert(v.patch == 5)
assert(type(v.patch1) is int)
assert(v.patch1 == 5)
assert(v.patch_str == 'p')
assert(v.patch2 is None)
assert(str(v) == s)
s = 'OpenSSH_5.5p1 Debian-6+squeeze2'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.major == 5)
assert(v.minor == 5)
assert(v.patch == 1)
assert(v.patch_str == 'p')
assert(str(v) == s)
s = 'OpenSSH_4.3-HipServ'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.major == 4)
assert(v.minor == 3)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '-HipServ')
assert(str(v) == s)
#s = ''
#v = Version.parse(s)
#assert(str(v) == s)
# Perl format
s = 'Quux 1.12_15'
v = Version.parse(s)
assert(v._perl_version_fmt)
assert(v.name_clean == 'quux')
assert(v.name == 'Quux')
assert(type(v.major) is int)
assert(v.major == 1)
assert(type(v.minor) is int)
assert(v.minor == 12)
assert(type(v.patch) is int)
assert(v.patch == 15)
assert(type(v.patch1) is int)
assert(v.patch1 == 15)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(str(v) == s)
s = 'Quux 1.12_15_99'
v = Version.parse(s)
assert(v._perl_version_fmt)
assert(v.name_clean == 'quux')
assert(v.name == 'Quux')
assert(type(v.major) is int)
assert(v.major == 1)
assert(type(v.minor) is int)
assert(v.minor == 12)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '15_99')
assert(type(v.patch1) is int)
assert(v.patch1 == 15)
assert(type(v.patch2) is int)
assert(v.patch2 == 99)
assert(v.patch_str == '_')
assert(str(v) == s)
s = 'Quux 1_12_15'
v = Version.parse(s)
assert(v._perl_version_fmt)
assert(v.name_clean == 'quux')
assert(v.name == 'Quux')
assert(v.major == 1)
assert(v.minor == 12)
assert(v.patch == 15)
assert(v.patch1 == 15)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(v.version_sep == '_')
assert(str(v) == s)
s = 'Quux_1_12_15'
v = Version.parse(s)
assert(v._perl_version_fmt)
assert(v.name_clean == 'quux')
assert(v.name == 'Quux')
assert(v.major == 1)
assert(v.minor == 12)
assert(v.patch == 15)
assert(v.patch1 == 15)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(v.version_sep == '_')
assert(v.name_sep == '_')
assert(str(v) == s)
# Whitespace cleanups
s = 'openssh\t1.2.2p1'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.name == 'openssh')
assert(v.major == 1)
assert(v.minor == 2)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '2p1')
assert(v.patch1 == 2)
assert(v.patch2 == 1)
assert(v.patch_str == 'p')
assert(str(v) == 'openssh 1.2.2p1')
s = '\t openssh 1.2.1pre18 \t'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.name == 'openssh')
assert(v.major == 1)
assert(v.minor == 2)
assert(v.patch == '1pre18')
assert(v.patch1 == 1)
assert(v.patch2 == 18)
assert(v.patch_str == 'pre')
assert(str(v) == 'openssh 1.2.1pre18')
# ...
s = 'openssh 4.2.p1'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.name == 'openssh')
assert(v.major == 4)
assert(v.minor == 2)
assert(not v._openssh_moved_p)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == 'p1')
assert(v.patch1 is None)
assert(type(v.patch2) is int)
assert(v.patch2 == 1)
assert(type(v.patch_str) in STR_TYPES)
assert(v.patch_str == 'p')
assert(str(v) == s)
s = 'ARRIS_0.44_01'
v = Version.parse(s)
assert(v.name_clean == 'arris')
assert(v.major == 0)
assert(v.minor == 44)
assert(v.zero_prefixes['patch'] == 1)
assert(type(v.patch) is int)
assert(v.patch == 1)
assert(v.zero_prefixes['patch1'] == 1)
assert(type(v.patch1) is int)
assert(v.patch1 == 1)
assert(v._perl_version_fmt)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'ProFTPD1.3.3'
v = Version.parse(s)
assert(v.name_clean == 'proftpd')
assert(v.major == 1)
assert(v.minor == 3)
assert(v.patch == 3)
assert(v.patch1 == 3)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'linux-3.0.77'
v = Version.parse(s)
assert(v.name_clean == 'linux')
assert(v.name == 'linux')
assert(v.major == 3)
assert(v.minor == 0)
assert(type(v.patch) is int)
assert(v.patch == 77)
assert(type(v.patch1) is int)
assert(v.patch1 == 77)
assert(str(v) == s)
s = 'linux-2.6.27.10'
v = Version.parse(s)
assert(v.name_clean == 'linux')
assert(v.major == 2)
assert(v.minor == 6)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '27.10')
assert(type(v.patch1) is int)
assert(v.patch1 == 27)
assert(v.patch2 == 10)
assert(v.patch_str == '.')
assert(str(v) == s)
s = 'OpenSSH 5.5p1 Debian 6+squeeze4 (protocol 2.0)'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(v.name == 'OpenSSH')
assert(v.major == 5)
assert(v.minor == 5)
assert(v.patch == 1)
assert(v.patch1 == 1)
assert(v.patch2 is None)
assert(v.patch_str == 'p')
assert(str(v) == s)
s = 'lighttpd 1.4.23'
v = Version.parse(s)
assert(v.name_clean == 'lighttpd')
assert(v.major == 1)
assert(v.minor == 4)
assert(type(v.patch) is int)
assert(v.patch == 23)
assert(type(v.patch1) is int)
assert(v.patch1 == 23)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(str(v) == s)
s = 'ProFTPD 1.3.3a'
v = Version.parse(s)
assert(v.name_clean == 'proftpd')
assert(v.major == 1)
assert(v.minor == 3)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '3a')
assert(v.patch1 == 3)
assert(type(v.patch1) is int)
assert(v.patch_str == 'a')
assert(str(v) == s)
s = 'BetaFTPD 0.0.8pre17'
v = Version.parse(s)
assert(v.name_clean == 'betaftpd')
assert(v.major == 0)
assert(v.minor == 0)
assert(v.patch == '8pre17')
assert(v.patch1 == 8)
assert(v.patch2 == 17)
assert(v.patch_str == 'pre')
assert(str(v) == s)
s = 'linux-2.6.0-rc1'
v = Version.parse(s)
assert(v.name_clean == 'linux')
assert(v.major == 2)
assert(v.minor == 6)
assert(v.patch == '0-rc1')
assert(v.patch1 == 0)
assert(v.patch2 == 1)
assert(v.patch_str == '-rc')
assert(str(v) == s)
s = 'linux-2.6.0-test4'
v = Version.parse(s)
assert(v.name_clean == 'linux')
assert(v.major == 2)
assert(v.minor == 6)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-test4')
assert(v.patch1 == 0)
assert(v.patch2 == 4)
assert(v.patch_str == '-test')
assert(str(v) == s)
s = 'Gene6 FTP Server v3.10.0'
v = Version.parse(s)
assert(v.name_clean == 'gene6ftpserver')
assert(v.name == 'Gene6 FTP Server')
assert(v.major == 3)
assert(v.minor == 10)
assert(v.patch == 0)
assert(v.patch1 == 0)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(str(v) == s)
s = 'IdeaWebServer httpd v0.70'
v = Version.parse(s)
assert(v.name_clean == 'ideawebserverhttpd')
assert(v.name == 'IdeaWebServer httpd')
assert(v.major == 0)
assert(v.minor == 70)
assert(v.patch is None)
assert(str(v) == s)
s = 'Multicraft 1.8.2 FTP server'
v = Version.parse(s)
assert(v.name_clean == 'multicraft')
assert(v.major == 1)
assert(v.minor == 8)
assert(v.patch == 2)
assert(v.patch1 == 2)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'ProFTPD 1.3.3g Server'
v = Version.parse(s)
assert(v.name_clean == 'proftpd')
assert(v.name == 'ProFTPD')
assert(v.major == 1)
assert(v.minor == 3)
assert(v.patch == '3g')
assert(v.patch1 == 3)
assert(v.patch_str == 'g')
assert(v.patch2 is None)
assert(str(v) == s)
s = 'Loxone FTP 5.66.4.23'
v = Version.parse(s)
assert(v.name_clean == 'loxoneftp')
assert(v.name == 'Loxone FTP')
assert(v.major == 5)
assert(v.minor == 66)
assert(v.patch == '4.23')
assert(v.patch1 == 4)
assert(v.patch2 == 23)
assert(v.patch_str == '.')
assert(str(v) == s)
s = 'Exim smtpd 4.X'
v = Version.parse(s)
assert(v.name_clean == 'eximsmtpd')
assert(v.name == 'Exim smtpd')
assert(v.major == 4)
assert(v.minor == 'X')
assert(v.patch is None)
assert(str(v) == s)
s = 'MikroTik router ftpd 5.7'
v = Version.parse(s)
assert(v.name_clean == 'mikrotikrouterftpd')
assert(v.name == 'MikroTik router ftpd')
assert(v.major == 5)
assert(v.minor == 7)
assert(v.patch is None)
assert(str(v) == s)
s = 'Dropbear sshd 0.51' # TODO
v = Version.parse(s)
assert(str(v) == s)
s = 'RapidLogic httpd 1.1' # TODO
v = Version.parse(s)
assert(str(v) == s)
s = 'MySQL 5.0.91-log'
v = Version.parse(s)
assert(v.name_clean == 'mysql')
assert(v.major == 5)
assert(v.minor == 0)
assert(v.patch == '91-log')
assert(v.patch1 == 91)
assert(v.patch_str == '-log')
assert(v.patch2 is None)
assert(str(v) == s)
s = 'Foowizard 12.1.99900.0'
v = Version.parse(s)
assert(v.name_clean == 'foowizard')
assert(v.major == 12)
assert(v.minor == 1)
assert(v.patch == '99900.0')
assert(v.patch1 == 99900)
assert(v.patch2 == 0)
assert(v.patch_str == '.')
assert(str(v) == s)
s = 'Task Manager Pro 2.0245'
v = Version.parse(s)
assert(v.name_clean == 'taskmanagerpro')
assert(v.name == 'Task Manager Pro')
assert(v.major == 2)
assert(v.minor == 245)
assert(v.patch is None)
assert(v.zero_prefixes['minor'] == 1)
assert(str(v) == s)
s = 'Internet Download Helper 8.22.0.1234'
v = Version.parse(s)
assert(v.name_clean == 'internetdownloadhelper')
assert(v.major == 8)
assert(v.minor == 22)
assert(v.patch == '0.1234')
assert(v.patch1 == 0)
assert(v.patch2 == 1234)
assert(v.patch_str == '.')
assert(str(v) == s)
s = 'Apache/2'
v = Version.parse(s)
assert(v.name_clean == 'apache')
assert(v.major == 2)
assert(v.minor is None)
assert(v.patch is None)
assert(str(v) == s)
s = 'Linux/2.x'
v = Version.parse(s)
assert(v.name_clean == 'linux')
assert(v.major == 2)
assert(v.minor == 'x')
assert(v.patch is None)
assert(str(v) == s)
s = 'PHP/5.2.9-1'
v = Version.parse(s)
assert(v.name_clean == 'php')
assert(v.major == 5)
assert(v.minor == 2)
assert(v.patch == '9-1')
assert(v.patch1 == 9)
assert(v.patch2 == 1)
assert(v.patch_str == '-')
assert(str(v) == s)
s = 'lighttpd/1.4.28-devel-4979'
v = Version.parse(s)
assert(v.name_clean == 'lighttpd')
assert(v.major == 1)
assert(v.minor == 4)
assert(v.patch == '28-devel-4979')
assert(v.patch1 == 28)
assert(v.patch2 == 4979)
assert(v.patch_str == '-devel-')
assert(str(v) == s)
s = 'Apache/2.2.18'
v = Version.parse(s)
assert(v.name_clean == 'apache')
assert(v.name == 'Apache')
assert(v.major == 2)
assert(v.minor == 2)
assert(v.patch == 18)
assert(v.patch1 == 18)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'Apache/2_2_18'
v = Version.parse(s)
assert(v.name_clean == 'apache')
assert(v.name == 'Apache')
assert(v.major == 2)
assert(v.minor == 2)
assert(type(v.patch) is int)
assert(v.patch == 18)
assert(type(v.patch1) is int)
assert(v.patch1 == 18)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'Microsoft-IIS/6.0'
v = Version.parse(s)
assert(v.name_clean == 'microsoft-iis')
assert(v.name == 'Microsoft-IIS')
assert(v.major == 6)
assert(v.minor == 0)
assert(v.patch is None)
assert(str(v) == s)
s = 'Virata-EmWeb/R6_0_1'
v = Version.parse(s)
assert(v.name_clean == 'virata-emweb')
assert(v.major == 6)
assert(v.minor == 0)
assert(type(v.patch) is int)
assert(v.patch == 1)
assert(type(v.patch1) is int)
assert(v.patch1 == 1)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'IdeaWebServer/v0.80'
v = Version.parse(s)
assert(v.name_clean == 'ideawebserver')
assert(v.name == 'IdeaWebServer')
assert(v.major == 0)
assert(v.minor == 80)
assert(v.patch is None)
assert(str(v) == s)
s = 'mod_ssl/2.2.18'
v = Version.parse(s)
assert(v.name_clean == 'mod_ssl')
assert(v.major == 2)
assert(v.minor == 2)
assert(v.patch == 18)
assert(v.patch_str is None)
assert(str(v) == s)
s = 'Gemtek/0.899'
v = Version.parse(s)
assert(v.name_clean == 'gemtek')
assert(v.major == 0)
assert(v.minor == 899)
assert(v.patch is None)
assert(str(v) == s)
s = 'OpenSSL/1.0.0-fips'
v = Version.parse(s)
assert(v.name_clean == 'openssl')
assert(v.major == 1)
assert(v.minor == 0)
assert(v.patch == '0-fips')
assert(v.patch1 == 0)
assert(v.patch_str == '-fips')
assert(v.patch2 is None)
assert(str(v) == s)
s = 'KM-MFP-http/V0.0.1'
v = Version.parse(s)
assert(v.name_clean == 'km-mfp-http')
assert(v.name == 'KM-MFP-http')
assert(v.major == 0)
assert(v.minor == 0)
assert(v.patch == 1)
assert(str(v) == s)
s = 'PHP/4.4.4-8+etch6'
v = Version.parse(s)
assert(v.name_clean == 'php')
assert(v.major == 4)
assert(v.minor == 4)
assert(v.patch == '4-8')
assert(v.patch1 == 4)
assert(v.patch2 == 8)
assert(v.patch_str == '-')
assert(v.build_meta == 'etch6')
assert(str(v) == s)
s = 'IP_SHARER WEB 1.0'
v = Version.parse(s)
assert(v.name_clean == 'ip_sharerweb')
assert(v.name == 'IP_SHARER WEB')
assert(v.major == 1)
assert(v.minor == 0)
assert(v.patch is None)
assert(str(v) == s)
s = 'mod_auth_passthrough/2.1'
v = Version.parse(s)
assert(v.name_clean == 'mod_auth_passthrough')
assert(v.name == 'mod_auth_passthrough')
assert(v.major == 2)
assert(v.minor == 1)
assert(v.patch is None)
assert(str(v) == s)
s = 'FrontPage/5.0.2.2635'
v = Version.parse(s)
assert(v.name_clean == 'frontpage')
assert(v.major == 5)
assert(v.minor == 0)
assert(v.patch == '2.2635')
assert(v.patch1 == 2)
assert(v.patch2 == 2635)
assert(v.patch_str == '.')
assert(str(v) == s)
s = 'OpenSSL/0.9.8r'
v = Version.parse(s)
assert(v.name_clean == 'openssl')
assert(v.major == 0)
assert(v.minor == 9)
assert(v.patch == '8r')
assert(v.patch1 == 8)
assert(v.patch_str == 'r')
assert(v.patch2 is None)
assert(str(v) == s)
s = 'sendmail.8.14.7'
v = Version.parse(s)
assert(v.name_clean == 'sendmail')
assert(v.name == 'sendmail')
assert(v.major == 8)
assert(v.minor == 14)
assert(type(v.patch) is int)
assert(v.patch == 7)
assert(type(v.patch1) is int)
assert(v.patch1 == 7)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(str(v) == s)
s = 'Mercury POP3 server 1.48'
v = Version.parse(s)
assert(v.name_clean == 'mercurypop3server')
assert(v.name == 'Mercury POP3 server')
assert(v.major == 1)
assert(v.minor == 48)
assert(v.patch is None)
assert(str(v) == s)
s = 'Squid http proxy 3.0.STABLE20'
v = Version.parse(s)
assert(v.name_clean == 'squidhttpproxy')
assert(v.name == 'Squid http proxy')
assert(v.major == 3)
assert(v.minor == 0)
assert(v.patch == 'STABLE20')
assert(v.patch1 is None)
assert(v.patch_str == 'STABLE')
assert(v.patch2 == 20)
assert(str(v) == s)
s = 'mod_apreq2-20090110/2.7.1'
v = Version.parse(s)
assert(v.name_clean == 'mod_apreq2')
assert(v.name == 'mod_apreq2')
assert(v.major == 2)
assert(v.minor == 7)
assert(v.patch == 1)
assert(v.patch1 == 1)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(v.release_date == date(2009, 1, 10))
assert(str(v) == s)
s = 'mini_httpd/1.19 19dec2003'
v = Version.parse(s)
assert(v.name_clean == 'mini_httpd')
assert(v.major == 1)
assert(v.minor == 19)
assert(v.patch is None)
assert(v.release_date == date(2003, 12, 19))
assert(str(v) == s)
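# Release dates above come embedded in several shapes: '20090110' inside
# mod_apreq2's name, '19dec2003' after mini_httpd's version, '16.06.2010'
# in the LANCOM banners below. Standalone sketch that tries a few strptime
# formats in turn (illustrative; the real date handling in Version.parse()
# may differ):
from datetime import datetime, date
def _try_date(token):
    for fmt in ('%Y%m%d', '%d%b%Y', '%d.%m.%Y'):
        try:
            return datetime.strptime(token, fmt).date()
        except ValueError:
            pass
    return None
assert(_try_date('20090110') == date(2009, 1, 10))
assert(_try_date('19dec2003') == date(2003, 12, 19))
assert(_try_date('16.06.2010') == date(2010, 6, 16))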
s = 'Allegro-Software-RomPager/4.34'
v = Version.parse(s)
assert(v.name_clean == 'allegro-software-rompager')
assert(v.major == 4)
assert(v.minor == 34)
assert(v.patch is None)
assert(str(v) == s)
s = 'Foobar 8.00.162'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.name == 'Foobar')
assert(v.major == 8)
assert(v.minor == 0)
assert(v.zero_prefixes['minor'] == 1)
assert(v.patch == 162)
assert(v.patch1 == 162)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'Foobar 8.00.0162'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.name == 'Foobar')
assert(v.major == 8)
assert(v.minor == 0)
assert(v.zero_prefixes['minor'] == 1)
assert(v.patch == 162)
assert(v.zero_prefixes['patch'] == 1)
assert(v.patch1 == 162)
assert(v.zero_prefixes['patch1'] == 1)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'LANCOM 1611+ 8.0.162'
v = Version.parse(s)
assert(v.name_clean == 'lancom1611+')
assert(v.name == 'LANCOM 1611+')
assert(v.major == 8)
assert(v.minor == 0)
assert(v.patch == 162)
assert(v.patch1 == 162)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'LANCOM 1611+ 8.00.162'
v = Version.parse(s)
assert(v.name_clean == 'lancom1611+')
assert(v.name == 'LANCOM 1611+')
assert(v.major == 8)
assert(v.minor == 0)
assert(v.zero_prefixes['minor'] == 1)
assert(v.patch == 162)
assert(v.patch1 == 162)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'LANCOM 1611+ 8.00.0162 / 16.06.2010'
v = Version.parse(s)
assert(v.name_clean == 'lancom1611+')
assert(v.name == 'LANCOM 1611+')
assert(v.major == 8)
assert(v.minor == 0)
assert(v.zero_prefixes['minor'] == 1)
assert(v.patch == 162)
assert(v.zero_prefixes['patch'] == 1)
assert(v.patch1 == 162)
assert(v.zero_prefixes['patch1'] == 1)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(v.release_date == date(2010, 6, 16))
assert(str(v) == s)
s = 'OpenSSL/0.9.8e-fips-rhel5'
v = Version.parse(s)
assert(v.name_clean == 'openssl')
assert(v.major == 0)
assert(v.minor == 9)
assert(v.patch == '8e-fips-rhel5')
assert(v.patch1 == 8)
assert(v.patch_str == 'e-fips-rhel')
assert(v.patch2 == 5) # Even though that's not ideal
assert(str(v) == s)
s = 'Sun-ONE-ASP/4.0.3'
v = Version.parse(s)
assert(v.name_clean == 'sun-one-asp')
assert(v.major == 4)
assert(v.minor == 0)
assert(v.patch == 3)
assert(v.patch1 == 3)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'thttpd/2.23beta1 26may2002'
v = Version.parse(s)
assert(v.name_clean == 'thttpd')
assert(v.major == 2)
assert(v.minor == 23)
assert(v.patch == 'beta1')
assert(v.patch1 is None)
assert(v.patch2 == 1)
assert(v.patch_str == 'beta')
assert(v.release_date == date(2002, 5, 26))
assert(str(v) == s)
# Test arbitrary number of zero prefixes
# TODO and interaction with strings in major/minor
s = 'Foobar 0'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 0)
assert(v.zero_prefixes['major'] == 0)
assert(v.minor is None)
assert(str(v) == s)
s = 'Foobar 03'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.zero_prefixes['major'] == 1)
assert(v.minor is None)
assert(str(v) == s)
s = 'Foobar 003'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.zero_prefixes['major'] == 2)
assert(v.minor is None)
assert(str(v) == s)
s = 'Foobar 00000000003'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.zero_prefixes['major'] == 10)
assert(v.minor is None)
assert(str(v) == s)
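# The zero_prefixes bookkeeping above lets '00000000003' parse to the int 3
# while still printing back verbatim. Standalone sketch of the
# count-and-restore idea (hypothetical helper names):
def _strip_zeros(s):
    stripped = s.lstrip('0') or '0'
    return int(stripped), len(s) - len(stripped)
def _restore_zeros(value, zeros):
    return '0' * zeros + str(value)
assert(_strip_zeros('00000000003') == (3, 10))
assert(_strip_zeros('0') == (0, 0))
assert(_restore_zeros(3, 10) == '00000000003')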
s = 'Foobar 3.01'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.minor == 1)
assert(v.zero_prefixes['minor'] == 1)
assert(str(v) == s)
s = 'Foobar 3.001'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.minor == 1)
assert(v.zero_prefixes['minor'] == 2)
assert(str(v) == s)
s = 'Foobar 3.00000000001'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.minor == 1)
assert(v.zero_prefixes['minor'] == 10)
assert(str(v) == s)
s = 'Foobar 03.01'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.zero_prefixes['major'] == 1)
assert(v.minor == 1)
assert(v.zero_prefixes['minor'] == 1)
assert(str(v) == s)
s = 'Foobar 000003.01'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.zero_prefixes['major'] == 5)
assert(v.minor == 1)
assert(v.zero_prefixes['minor'] == 1)
assert(str(v) == s)
s = 'Foobar 000003.00000000001'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.zero_prefixes['major'] == 5)
assert(v.minor == 1)
assert(v.zero_prefixes['minor'] == 10)
assert(str(v) == s)
s = 'Foobar 3.1.01'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.minor == 1)
assert(v.patch == 1)
assert(v.zero_prefixes['patch'] == 1)
assert(v.patch1 == 1)
assert(v.zero_prefixes['patch1'] == 1)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(str(v) == s)
s = 'Foobar 3.1.1-01'
v = Version.parse(s)
assert(v.name_clean == 'foobar')
assert(v.major == 3)
assert(v.minor == 1)
assert(v.patch == '1-01')
assert(v.patch1 == 1)
assert(v.patch2 == 1)
assert(v.patch_str == '-')
assert(str(v) == s)
s = 'Foobar 3 (FB3-DX) 1.90.26'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'ProTools Basic Edition 5.0 Build 11'
v = Version.parse(s)
assert(v.name_clean == 'protoolsbasicedition')
assert(v.name == 'ProTools Basic Edition')
assert(v.major == 5)
assert(v.minor == 0)
assert(v.patch is None)
assert(v.extra_str == 'Build 11')
assert(str(v) == s)
s = 'Fiddlesticks 2.0 Beta 2'
v = Version.parse(s)
assert(v.name_clean == 'fiddlesticks')
assert(v.name == 'Fiddlesticks')
assert(v.major == 2)
assert(v.minor == 0)
assert(v.patch is None)
assert(v.extra_str == 'Beta 2')
assert(str(v) == s)
s = 'IDA 5.19.1.1387.2314'
#v = Version.parse(s)
#assert(v.name_clean == 'ida')
#assert(v.name == 'IDA')
#assert(v.major == 5)
#assert(v.minor == 19)
#assert(str(v) == s)
# TODO: versions with more than three numeric components
# s = 'IDA 5.19.1.1387.2314.0'
# v = Version.parse(s)
# assert(str(v) == s)
#
# s = 'IDA 5.19.1.1387.2314.0.1352135'
# v = Version.parse(s)
# assert(str(v) == s)
s = 'Cyrus POP3 v2.2.13-Debian-2.2.13-14+lenny3 server'
v = Version.parse(s)
assert(v.name_clean == 'cyruspop3')
assert(v.name == 'Cyrus POP3')
assert(v.major == 2)
assert(v.minor == 2)
assert(v.patch == '13-Debian-2.2.13-14')
assert(v.patch1 == 13)
assert(v.patch2 == 14)
assert(v.patch_str == '-Debian-2.2.13-')
assert(v.build_meta == 'lenny3')
assert(v.extra_str == ' server')
assert(str(v) == s)
s = 'POP MDaemon 9.0.4'
v = Version.parse(s)
assert(v.name_clean == 'popmdaemon')
assert(v.name == 'POP MDaemon')
assert(v.major == 9)
assert(v.minor == 0)
assert(v.patch == 4)
assert(v.patch1 == 4)
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(str(v) == s)
s = 'POP3 Bigfoot v1.0 server'
v = Version.parse(s)
assert(v.name_clean == 'pop3bigfoot')
assert(v.name == 'POP3 Bigfoot')
assert(v.major == 1)
assert(v.minor == 0)
assert(v.patch is None)
assert(v.extra_str == ' server')
assert(str(v) == s)
s = 'IMail 8.05 4000-1'
v = Version.parse(s)
assert(v.name_clean == 'imail')
assert(v.name == 'IMail')
assert(v.major == 8)
assert(v.minor == 5)
assert(v.zero_prefixes['minor'] == 1)
assert(v.patch is None)
assert(v.extra_str == '4000-1') # won't include ' ' if it's the name_sep
assert(str(v) == s)
s = 'IdeaPop3Server v0.80'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'example.example Cyrus POP3 v2.3.7-Invoca-RPM-2.3.7-12.el5_7.2 server'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Qpopper (version 4.0.5)'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'POP3 on WinWebMail [3.8.1.3]'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Microsoft Exchange Server 2003 POP3 <A6><F8><AA>A<BE><B9><AA><A9><A5><BB> 6.5.7638.1 (example.local)'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Microsoft Windows POP3 Service Version 1.0'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Microsoft Exchange 2000 POP3 server version 6.0.6249.0 (example.example.org)'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'X1 NT-POP3 Server mail.example.org (IMail 8.03 304911-2)'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'RaidenMAILD POP3 service v2205'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Lotus Notes POP3 server version Release 8.5.3'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'X1 NT-POP3 Server example.org (IMail 9.23 64609-2757)'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'IceWarp 10.3.5 RHEL5 POP3'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'XMail 1.27 POP3 Server'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Intoto Http Server v1.0'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Apache/2.2.15 (CentOS)'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'mod_gzip/1.3.26.1a'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'mod_perl/1.29'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'DIR-600 Ver 2.11'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'mini_httpd/1.19 19dec2003'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Embedthis-Appweb/3.3.1'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Boa/0.94.14rc21'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'MailEnable-HTTP/5.0'
#v = Version.parse(s)
#assert(str(v) == s)
# TODO: Zope's nested version string is not handled yet
#s = 'Zope/(Zope 2.11.4-final, python 2.5.4, linux2) ZServer/1.1'
s = 'Mathopd/1.5p6'
v = Version.parse(s)
#assert(str(v) == s)
# lenny15 should be treated as build meta and rest
# in extra_str
s = 'PHP/5.2.6-1+lenny15 with Suhosin-Patch'
v = Version.parse(s)
#assert(v.name_clean == 'php')
#assert(v.name == 'PHP')
#assert(v.major == 5)
#assert(v.minor == 2)
#assert(v.patch == '6-1+lenny15')
#assert(v.patch1 == 6)
#assert(v.patch2 == 1)
#assert(v.patch_str == '-')
#assert(str(v) == s)
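# A minimal, self-contained sketch (independent of the Version API) of the
# split the commented assertions above expect: in 'PHP/5.2.6-1+lenny15' the
# '+' separates the build metadata ('lenny15') and the patch keeps '6-1'.
_raw = 'PHP/5.2.6-1+lenny15'
_name, _, _ver = _raw.partition('/')           # 'PHP', '5.2.6-1+lenny15'
_ver, _, _build_meta = _ver.partition('+')     # '5.2.6-1', 'lenny15'
_parts = _ver.split('.', 2)                    # ['5', '2', '6-1']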
s = 'squid/2.7.STABLE9'
v = Version.parse(s)
assert(v.name_clean == 'squid')
assert(v.name == 'squid')
assert(v.major == 2)
assert(v.minor == 7)
assert(v.patch == 'STABLE9')
assert(v.patch1 is None)
assert(v.patch2 == 9)
assert(v.patch_str == 'STABLE')
assert(str(v) == s)
s = 'lighttpd/1.4.26-devel-6243M'
v = Version.parse(s)
assert(v.name_clean == 'lighttpd')
assert(v.name == 'lighttpd')
assert(v.major == 1)
assert(v.minor == 4)
assert(v.patch == '26-devel-6243M')
assert(v.patch1 == 26)
assert(v.patch2 is None)
assert(v.patch_str == '-devel-6243M')
assert(str(v) == s)
s = 'Winstone Servlet Engine v0.9.10'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'PHP/5.3.10-1ubuntu3.11'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'F6D4630-4-v2/1.0'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'thttpd/2.25b 29dec2003'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Jetty(8.y.z-SNAPSHOT)'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'Microsoft-HTTPAPI/2.0'
#v = Version.parse(s)
#assert(str(v) == s)
#s = 'GlassFish Server Open Source Edition 4.0'
#v = Version.parse(s)
#assert(v.name_clean == 'glassfishserveropensourceedition') # ergh
#assert(v.name == 'GlassFish Server Open Source Edition')
#assert(v.major == 4)
#assert(v.minor == 0)
#assert(v.patch is None)
#assert(str(v) == s)
#s = 'David-WebBox/11.00a (0717)'
#v = Version.parse(s)
#assert(v.name_clean == 'david-webbox')
#assert(v.name == 'David-WebBox')
#assert(v.major == 11)
#assert(v.minor == 0)
#assert(v.patch == 'a')
#assert(v.patch1 is None)
#assert(v.patch2 is None)
#assert(v.patch_str == 'a')
#assert(v.extra_str == ' (0717)')
#assert(str(v) == s)
s = 'VOD server/4.9.0.01 (Unix)'
v = Version.parse(s)
# TODO extra str
#assert(str(v) == s)
s = 'Boa/0.94.13-20100727-114000'
v = Version.parse(s)
assert(v.name_clean == 'boa')
assert(v.name == 'Boa')
assert(v.major == 0)
assert(v.minor == 94)
assert(v.patch == '13-114000')
assert(v.patch1 == 13)
assert(v.patch2 == 114000)
assert(v.patch_str == '-')
assert(str(v) == s)
s = 'distccd v1 ((GNU) 4.2.4 (Ubuntu 4.2.4-1ubuntu4))'
v = Version.parse(s)
#assert(v.name_clean == 'distccdv1')
#assert(v.name == 'distccd v1')
#assert(v.major == 4)
#assert(v.minor == 2)
#assert(v.patch == 4)
#assert(v.patch1 == 4)
#assert(v.patch_str is None)
#assert(v.patch2 is None)
#assert(str(v) == s)
s = 'ISC BIND 9.4.2'
v = Version.parse(s)
assert(v.name_clean == 'iscbind')
assert(v.name == 'ISC BIND')
assert(v.major == 9)
assert(v.minor == 4)
assert(v.patch == 2)
assert(v.patch1 == 2)
assert(v.patch_str is None)
assert(v.patch2 is None)
assert(str(v) == s)
s = 'Apache Tomcat/Coyote JSP engine 1.1' # TODO
#v = Version.parse(s)
#assert(str(v) == s)
s = 'FreeBSD 8.2-release-p7'
v = Version.parse(s)
assert(v.name_clean == 'freebsd')
assert(v.name == 'FreeBSD')
assert(v.major == 8)
assert(v.minor == 2)
assert(v.patch == '-release-p7')
assert(v.patch1 is None)
assert(v.patch2 == 7)
assert(v.patch_str == '-release-p')
assert(str(v) == s)
s = 'Java SE 7u45'
v = Version.parse(s)
assert(v.name_clean == 'javase')
assert(v.name == 'Java SE')
assert(v.major == 7)
assert(v.minor == 45)
assert(v.patch is None)
assert(v.version_sep == 'u')
assert(str(v) == s)
s = 'blag 29999.50000'
v = Version.parse(s)
assert(v.name_clean == 'blag')
assert(v.major == 29999)
assert(v.minor == 50000)
assert(v.patch is None)
assert(str(v) == s)
s = 'blag 19999.00071'
v = Version.parse(s)
assert(v.name_clean == 'blag')
assert(v.major == 19999)
assert(v.minor == 71)
assert(v.zero_prefixes['minor'] == 3)
assert(v.patch is None)
assert(str(v) == s)
s = 'blorg 1:4.8.2-1ubuntu6'
#v = Version.parse(s)
#assert(v.name_clean == 'blorg')
#assert(v.epoch == 1)
#assert(v.major == 4)
#assert(v.minor == 8)
#assert(v.patch == '2-1ubuntu6')
#assert(v.patch1 == 2)
#assert(v.patch2 == 6) # TODO?
#assert(str(v) == s)
s = 'foo 0.08-2'
#v = Version.parse(s)
#assert(v.name_clean == 'foo')
#assert(v.major == 0)
#assert(v.minor == 8)
#assert(v.zero_prefixes['minor'] == 1)
#assert(v.patch == '2')
#assert(v.patch1 == 2)
#assert(str(v) == s)
s = 'libalgorithm-perl 0.08-2'
#v = Version.parse(s)
#assert(v.name_clean == 'libalgorithm-perl')
#assert(v.major == 0)
#assert(v.minor == 8)
#assert(v.zero_prefixes['minor'] == 1)
#assert(v.patch == '2')
#assert(v.patch1 == 2)
#assert(str(v) == s)
s = 'foo 1.19.02-3'
v = Version.parse(s)
assert(v.name_clean == 'foo')
assert(v.major == 1)
assert(v.minor == 19)
assert(v.patch == '02-3')
assert(v.patch1 == 2)
assert(v.zero_prefixes['patch1'] == 1)
assert(v.patch2 == 3)
assert(str(v) == s)
s = 'foo 0.5.1+14.04.20140409-0ubuntu1'
v = Version.parse(s)
assert(v.name_clean == 'foo')
assert(v.major == 0)
assert(v.minor == 5)
assert(v.patch == 1)
assert(v.patch1 == 1)
assert(v.patch2 is None)
assert(v.build_meta == '14.04.20140409-0ubuntu1')
assert(v.release_date == date(2014, 4, 9))
assert(str(v) == s)
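# Sketch (not the Version API) of the date extraction asserted above: the
# build meta '14.04.20140409-0ubuntu1' embeds a yyyymmdd stamp that maps to
# date(2014, 4, 9). The 8-digit-run heuristic is an assumption for this demo.
import re
from datetime import datetime
_meta = '14.04.20140409-0ubuntu1'
_stamp = re.search(r'(?<!\d)(\d{8})(?!\d)', _meta).group(1)  # '20140409'
_release_date = datetime.strptime(_stamp, '%Y%m%d').date()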
s = 'foo 1.1svn5547-1'
#v = Version.parse(s)
#assert(v.name_clean == 'foo')
#assert(v.major == 1)
#assert(v.minor == 1)
#TODO
#assert(str(v) == s)
s = 'foo 2.10.1-2build1'
v = Version.parse(s)
assert(v.name_clean == 'foo')
assert(v.major == 2)
assert(v.minor == 10)
assert(v.patch == '1-2build1')
assert(v.patch1 == 1)
assert(v.patch2 == 1) #TODO correct? patch2 == 2?
assert(str(v) == s)
# TODO Debian/Ubuntu package version strings, e.g.:
#s = '2:0.142.2389+git956c8d8-2'
#s = '6:9.13-0ubuntu0.14.04.1'
#s = '2014.01.13-1'
#s = 'libdirectfb-1.2-9'
#s = '0.15.1b-8ubuntu1'
#s = '2.1.4-0ubuntu14.04.1'
# TODO list all versions in apt repo
#s = ''
#v = Version.parse(s)
#assert(str(v) == s)
####
#### Software with a digit in the protocol (e.g. POP3)
####
# TODO RPC?
#s = '2 (RPC #100000)'
#v = Version.parse(s)
#assert(str(v) == s)
# TODO
#s = '2-4 (RPC #100003)'
#assert(str(v) == s)
s = 'VNC (protocol 3.3)'
v = Version.parse(s)
assert(str(v) == s)
# legit case
s = '(sshd version 1.2.3)'
v = Version.parse(s)
assert(str(v) == s)
s = 'Apache Jserv (Protocol v1.3)'
v = Version.parse(s)
assert(str(v) == s)
s = 'Apache Tomcat|Coyote JSP engine 1.1'
v = Version.parse(s)
assert(str(v) == s)
s = 'Ruby DRb RMI (Ruby 1.8; path |usr|lib|ruby|1.8|drb)'
v = Version.parse(s)
# TODO parens
#assert(v.name == 'Ruby DRb RMI')
#assert(v.name_clean == 'rubydrbrmi')
#assert(v.major == 1)
#assert(v.minor == 8)
#assert(v.patch is None)
#assert(v.extra_str is not None)
#assert(v.extra_str[0] == ';')
#assert(str(v) == s)
####
#### Apple NumVersion # TODO
####
####
#### distutils LooseVersion
####
# 1.5.2b2
# 3.10a
# 3.4j
# 1996.07.12
# 3.2.pl0
# 3.1.1.6
# 2g6
# 11g
# 0.960923
# 2.2beta29
# 1.13++
# 5.5.kw
# 2.0b1pl0
####
#### distutils StrictVersion
####
# 0.4 0.4.0 (these two are equivalent)
# 0.4.1
# 0.5a1
# 0.5b3
# 0.5
# 0.9.6
# 1.0
# 1.0.4a3
# 1.0.4b1
# 1.0.4
####
#### setuptools parse_version
####
# just make sure we do a superset of that (close already)
####
#### Misc
####
#1 (first draft)
#0.1.alphadev
#2008-03-29_r219
# TODO NB http://legacy.python.org/dev/peps/pep-0386/
# -> caveats of existing systems
# (try to solve these issues)
# (and address these points in FAQ/readme/whatever)
####
#### Misc Tests
####
# TODO see http://en.wikipedia.org/wiki/Software_versioning
#s = ''
#v = Version.parse(s)
#assert(str(v) == s)
def test_parse_pep440():
####
#### PEP440 Tests
####
# Various trivial cases are handled earlier
# Format:
# epoch, release, pre-release, post-release, dev-release
# All numeric components are parsed as ints, not text strings.
s = '0'
v = Version.parse(s)
assert(v.name is None)
assert(len(v.release) == 1)
assert(v.release == (0,))
assert(str(v) == s)
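# The PEP 440 cases in this function follow the shape
# [epoch:]release[pre][.postN][.devN]. A simplified stand-alone regex
# (a sketch, not the Version API, and far looser than the full spec)
# capturing those segments:
import re
_PEP440 = re.compile(
    r'^(?:(?P<epoch>\d+):)?'
    r'(?P<release>\d+(?:\.\d+)*)'
    r'(?P<pre>(?:a|b|c|rc)\d+)?'
    r'(?:\.post(?P<post>\d+))?'
    r'(?:\.dev(?P<dev>\d+))?$'
)
_m = _PEP440.match('5:1.0rc2.post456')
# epoch '5', release '1.0', pre 'rc2', post '456'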
# TODO: pre-, post-, and dev-release handling; cases pending:
# s = 'frob 1.0a1'
# s = 'frob 1.0b2'
# s = 'frob 1.0c1'
# s = 'frob 1.0rc1'
# s = 'frob 1.0.dev1'
# s = 'frob 1.0.post1'
# s = 'frob 1.0.dev456'
# s = 'frob 1.0a2.dev456'
# s = 'frob 1.0a12.dev456'
# s = 'frob 1.0a12'
# s = 'frob 1.0b2.post345.dev456'
# s = 'frob 1.0b2.post345'
# s = 'frob 1.0c1.dev456'
# s = 'frob 1.0.post456.dev34'
# s = 'frob 1.0.post456'
# v = Version.parse(s)
# assert(str(v) == s)
# Epoch
# (implicit epoch is 0)
# TODO see http://legacy.python.org/dev/peps/pep-0440/
# for version matching
#s = 'frob 5:1'
#s = 'frob 5:1.0'
#s = 'frob 5:1.0a1'
#s = 'frob 5:1.0.dev1'
#s = 'frob 5:1.0.post1.dev456'
#s = 'frob 5:1.0rc2.post456'
# Local version identifier
#s = 'frob 1-1'
#s = 'frob 1.0-1'
#s = 'frob 1.0a1-1'
#s = 'frob 1.0a1-1.1'
#s = 'frob 1.0.dev1-1.1'
#s = 'frob 1.1a.post345.dev789-88.99'
#s = 'frob 1000:1.1a.post345.dev789-88.99.11.22'
# Date-as-version
s = '2012.04'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'frob 2012.4'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'frob 2012.4.1'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'frob 2012.04.01'
#v = Version.parse(s)
#assert(str(v) == s)
s = 'frob 20120401'
#v = Version.parse(s)
#assert(v.release_date == date(2012, 4, 1))
def test_parse_semver():
####
#### Semver-specific tests (Semver 2.0.0)
####
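# The cases below exercise the SemVer 2.0.0 grammar
# MAJOR.MINOR.PATCH[-prerelease][+build]. A stand-alone sketch of that split
# (not the Version API; the official grammar is stricter, e.g. it forbids
# leading zeros in numeric identifiers):
import re
_SEMVER = re.compile(
    r'^(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)'
    r'(?:-(?P<pre>[0-9A-Za-z.-]+))?'
    r'(?:\+(?P<build>[0-9A-Za-z.-]+))?$'
)
_m = _SEMVER.match('1.0.0-beta+exp.sha.5114f85')
# major '1', minor '0', patch '0', pre 'beta', build 'exp.sha.5114f85'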
s = '0.0.0' # legal according to semver 2.0.0
v = Version.parse(s)
assert(v.name is None)
assert(v.name_clean is None)
assert(type(v.major) is int)
assert(v.major == 0)
assert(type(v.minor) is int)
assert(v.minor == 0)
assert(type(v.patch) is int)
assert(v.patch == 0)
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(str(v) == s)
s = '0.0.1'
v = Version.parse(s)
assert(v.name is None)
assert(v.name_clean is None)
assert(v.major == 0)
assert(v.minor == 0)
assert(v.patch == 1)
assert(v.patch1 == 1)
assert(str(v) == s)
s = '0.0.99'
v = Version.parse(s)
assert(v.name is None)
assert(v.name_clean is None)
assert(v.major == 0)
assert(v.minor == 0)
assert(v.patch == 99)
assert(v.patch1 == 99)
assert(str(v) == s)
s = '0.0.99999999'
v = Version.parse(s)
assert(v.name is None)
assert(v.name_clean is None)
assert(type(v.major) is int)
assert(v.major == 0)
assert(type(v.minor) is int)
assert(v.minor == 0)
assert(type(v.patch) is int)
assert(v.patch == 99999999)
assert(type(v.patch1) is int)
assert(v.patch1 == 99999999)
assert(str(v) == s)
s = '0.999999999.0'
v = Version.parse(s)
assert(v.name is None)
assert(v.name_clean is None)
assert(v.major == 0)
assert(v.minor == 999999999)
assert(v.patch == 0)
assert(v.patch1 == 0)
assert(str(v) == s)
s = '99999999.0.0'
v = Version.parse(s)
assert(v.name is None)
assert(v.name_clean is None)
assert(v.major == 99999999)
assert(v.minor == 0)
assert(v.patch == 0)
assert(v.patch1 == 0)
assert(str(v) == s)
s = '1.0.0-alpha'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-alpha')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(v.patch2 is None)
assert(v.patch_str == '-alpha')
assert(str(v) == s)
s = '1.0.0-alpha1'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-alpha1')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(type(v.patch2) is int)
assert(v.patch2 == 1)
assert(v.patch_str == '-alpha')
assert(str(v) == s)
s = '1.0.0-alpha.1'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-alpha.1')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(type(v.patch2) is int)
assert(v.patch2 == 1)
assert(v.patch_str == '-alpha.')
assert(str(v) == s)
s = '1.0.0-1'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-1')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(type(v.patch2) is int)
assert(v.patch2 == 1)
assert(v.patch_str == '-')
assert(str(v) == s)
s = '1.0.0-0.1'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-0.1')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(type(v.patch2) is int)
assert(v.patch2 == 1)
assert(v.patch_str == '-0.')
assert(str(v) == s)
s = '1.0.0-1-2-3-4-5'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-1-2-3-4-5')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(type(v.patch2) is int)
assert(v.patch2 == 5)
assert(v.patch_str == '-1-2-3-4-')
assert(str(v) == s)
s = '1.0.0-1-0-1-0-1'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-1-0-1-0-1')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(type(v.patch2) is int)
assert(v.patch2 == 1)
assert(v.patch_str == '-1-0-1-0-')
assert(str(v) == s)
s = '1.0.0-1a2b3c'
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0-1a2b3c')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(v.patch2 is None)
assert(v.patch_str == '-1a2b3c')
assert(str(v) == s)
s = '1.0.0--' # legal: '-' is a valid pre-release identifier
v = Version.parse(s)
assert(v.major == 1)
assert(v.minor == 0)
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '0--')
assert(type(v.patch1) is int)
assert(v.patch1 == 0)
assert(v.patch2 is None)
assert(v.patch_str == '--')
assert(str(v) == s)
s = '1.0.0-a.b' # TODO: purely alphabetic identifiers; the current interface gets this wrong
v = Version.parse(s)
#assert(v.major == 1)
#assert(v.minor == 0)
#assert(type(v.patch) in STR_TYPES)
#assert(v.patch == '0-a.b')
#assert(type(v.patch1) in STR_TYPES)
#assert(v.patch1 == 'a')
#assert(v.patch2 == 'b')
#assert(v.patch_str == '-a.b')
#assert(str(v) == s)
s = '1.0.0-A.B'
v = Version.parse(s)
assert(v.name is None)
assert(v.major == 1)
assert(v.minor == 0)
assert(v.patch == '0-A.B')
assert(v.patch1 == 0)
assert(v.patch2 is None)
assert(v.patch_str == '-A.B')
assert(str(v) == s)
s = '1.0.0-a-.b-.c-.d-'
v = Version.parse(s)
assert(v.name is None)
assert(v.major == 1)
assert(v.minor == 0)
assert(v.patch == '0-a-.b-.c-.d-')
assert(v.patch1 == 0)
assert(v.patch2 is None)
assert(str(v) == s)
s = '1.0.0-0.33.44.55'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-999.fiddlesticks.whoomp-there-it-is.ohyeah'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5.-'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5.z'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5.0'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+20130313144700'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-beta+exp.sha.5114f85'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+alpha'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+alpha1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+alpha.1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+0.1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+1-2-3-4-5'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+1-0-1-0-1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+1a2b3c'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+-' # legal: '-' is a valid build identifier
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+a.b'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+A.B'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+a-.b-.c-.d-'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+0.33.44.55'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+999.fiddlesticks.whoomp-there-it-is.ohyeah'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+a.2.X.5'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+a.2.X.5.-'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+a.2.X.5.z'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0+a.2.X.5.0'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-alpha+alpha'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-alpha1+alpha1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-alpha.1+alpha.1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-1+1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-0.1+0.1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-1-2-3-4-5+1-2-3-4-5'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-1-0-1-0-1+1-0-1-0-1'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-1a2b3c+1a2b3c'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0--+-' # legal: '-' as both pre-release and build identifier
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.b+a.b'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-A.B+A.B'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a-.b-.c-.+a-.b-.c-.d-'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-0.33.44.55+0.33.44.55'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-999.fiddlesticks.whoomp-there-it-is.ohyeah+999.fiddlesticks.whoomp-there-it-is.ohyeah'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5+a.2.X.5'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5.-+a.2.X.5.-'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5.z+a.2.X.5.z'
v = Version.parse(s)
assert(str(v) == s)
s = '1.0.0-a.2.X.5.0+a.2.X.5.0'
v = Version.parse(s)
assert(str(v) == s)
def test_parse_wildcards():
s = 'OpenSSH *'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) in STR_TYPES)
assert(v.major == '*')
assert(v.minor is None)
assert(v.patch is None)
assert(v.name_sep == ' ')
assert(str(v) == s)
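# The wildcard tests treat '*', 'x', and 'X' as "any value" for a whole
# version component. A stand-alone sketch of that matching rule (an
# illustration, not the Version API):
def _component_matches(pattern, value):
    # A wildcard matches any component; otherwise compare as strings.
    return str(pattern) in ('*', 'x', 'X') or str(pattern) == str(value)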
s = 'OpenSSH x'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) in STR_TYPES)
assert(v.major == 'x')
assert(v.minor is None)
assert(v.patch is None)
assert(v.name_sep == ' ')
assert(str(v) == s)
s = 'OpenSSH X'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) in STR_TYPES)
assert(v.major == 'X')
assert(v.minor is None)
assert(v.patch is None)
assert(v.name_sep == ' ')
assert(str(v) == s)
s = 'OpenSSH 4.x'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) in STR_TYPES)
assert(v.minor == 'x')
assert(v.patch is None)
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH 4.X'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) in STR_TYPES)
assert(v.minor == 'X')
assert(v.patch is None)
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH 4.*'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) in STR_TYPES)
assert(v.minor == '*')
assert(v.patch is None)
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH 4.x.x'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) in STR_TYPES)
assert(v.minor == 'x')
assert(type(v.patch) in STR_TYPES)
assert(v.patch == 'x')
assert(type(v.patch1) in STR_TYPES)
assert(v.patch1 == 'x')
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH 4.X.X'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) in STR_TYPES)
assert(v.minor == 'X')
assert(type(v.patch) in STR_TYPES)
assert(v.patch == 'X')
assert(type(v.patch1) in STR_TYPES)
assert(v.patch1 == 'X')
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
assert(str(v) == s)
s = 'OpenSSH 4.*.*'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) in STR_TYPES)
assert(v.minor == '*')
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '*')
assert(type(v.patch1) in STR_TYPES)
assert(v.patch1 == '*')
assert(v.patch2 is None)
assert(v.patch_str is None)
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
s = 'OpenSSH 4.*.*-*'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) is int)
assert(v.major == 4)
assert(type(v.minor) in STR_TYPES)
assert(v.minor == '*')
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '*-*')
assert(type(v.patch1) in STR_TYPES)
assert(v.patch1 == '*')
assert(type(v.patch2) in STR_TYPES)
assert(v.patch2 == '*')
assert(type(v.patch_str) in STR_TYPES)
assert(v.patch_str == '-')
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
s = 'OpenSSH *.*.*-*'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) in STR_TYPES)
assert(v.major == '*')
assert(type(v.minor) in STR_TYPES)
assert(v.minor == '*')
assert(type(v.patch) in STR_TYPES)
assert(v.patch == '*-*')
assert(type(v.patch1) in STR_TYPES)
assert(v.patch1 == '*')
assert(type(v.patch2) in STR_TYPES)
assert(v.patch2 == '*')
assert(type(v.patch_str) in STR_TYPES)
assert(v.patch_str == '-')
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
s = 'OpenSSH x.x.x-x'
v = Version.parse(s)
assert(v.name_clean == 'openssh')
assert(type(v.major) in STR_TYPES)
assert(v.major == 'x')
assert(type(v.minor) in STR_TYPES)
assert(v.minor == 'x')
assert(type(v.patch) in STR_TYPES)
assert(v.patch == 'x-x')
assert(v.patch1 == 'x')
assert(v.patch2 == 'x')
assert(v.patch_str == '-')
assert(v.name_sep == ' ')
assert(v.version_sep == '.')
def test_parse_compare():
    v1 = Version.parse('OpenSSH 4')
    v2 = Version.parse('OpenSSH 5')
    assert(v1 != v2)
    assert(v2 != v1)
    assert(v1 < v2)
    assert(not(v1 > v2))
    assert(v2 > v1)
    assert(not(v2 < v1))
    v1 = Version.parse('OpenSSH_4')
    v2 = Version.parse('OpenSSH 5')
    assert(v1 != v2)
    assert(v2 != v1)
    assert(v1 < v2)
    assert(not(v1 > v2))
    assert(v2 > v1)
    assert(not(v2 < v1))
    v1 = Version.parse('OpenSSH_4')
    v2 = Version.parse('OpenSSH v5')
    assert(v1 != v2)
    assert(v2 != v1)
    assert(v1 < v2)
    assert(not(v1 > v2))
    assert(v2 > v1)
    assert(not(v2 < v1))
    v1 = Version.parse('OpenSSH *')
    v2 = Version.parse('OpenSSH 5')
    assert(v1 == v2)
    assert(v2 == v1)
    assert(not (v1 < v2))
    assert(not (v1 > v2))
    assert(not (v2 > v1))
    assert(not (v2 < v1))
    v1 = Version.parse('OpenSSH_4.3')
    v2 = Version.parse('OpenSSH 4.3')
    assert(v1 == v2)
    assert(v2 == v1)
    assert(not (v1 < v2))
    assert(not (v1 > v2))
    assert(not (v2 > v1))
    assert(not (v2 < v1))
    v1 = Version.parse('OpenSSH_4.3')
    v2 = Version.parse('OpenSSH 4.4')
    assert(v1 != v2)
    assert(v2 != v1)
    assert(v1 < v2)
    assert(not (v1 > v2))
    assert(v2 > v1)
    assert(not (v2 < v1))
    v1 = Version.parse('OpenSSH_5.5p1')
    v2 = Version.parse('OpenSSH_5.5p1 Debian-6+squeeze2')
    assert(v1 == v2)
    assert(v2 == v1)
    assert(not (v1 < v2))
    assert(not (v1 > v2))
    assert(not (v2 > v1))
    assert(not (v2 < v1))
    v1 = Version.parse('OpenSSH_5.5p1')
    v2 = Version.parse('OpenSSH_6.5p1 Debian-6+squeeze2')
    assert(v1 != v2)
    assert(v2 != v1)
    assert(v1 < v2)
    assert(not (v1 > v2))
    assert(v2 > v1)
    assert(not (v2 < v1))
    v1 = Version.parse('OpenSSH_5.5p1')
    v2 = Version.parse('OpenSSH_5.6p1 Debian-6+squeeze2')
    assert(v1 != v2)
    assert(v2 != v1)
    assert(v1 < v2)
    assert(not (v1 > v2))
    assert(v2 > v1)
    assert(not (v2 < v1))
    v1 = Version.parse('OpenSSH_5.5p1')
    v2 = Version.parse('OpenSSH_5.5p2 Debian-6+squeeze2')
    assert(v1 != v2)
    assert(v2 != v1)
    assert(v1 < v2)
    assert(not (v1 > v2))
    assert(v2 > v1)
    assert(not (v2 < v1))
    v1 = Version.parse('Quux 1.12_15')
    v2 = Version.parse('Quux 1.12_15')
    assert(v1 == v2)
    assert(v2 == v1)
    assert(not (v1 < v2))
    assert(not (v1 > v2))
    assert(not (v2 < v1))
    assert(not (v2 > v1))
    # TODO few more here
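The comparisons above imply that wildcard components (`*`, `x`, `X`) compare equal to any concrete value, while ordinary components order normally. A minimal sketch of that equality rule (a hypothetical helper, not the mod's actual implementation):

```python
WILDCARDS = {'*', 'x', 'X'}

def component_eq(a, b):
    # A wildcard on either side matches anything; otherwise compare directly.
    return a in WILDCARDS or b in WILDCARDS or a == b

print(component_eq('*', 5))  # True
print(component_eq(4, 4))    # True
print(component_eq(4, 5))    # False
```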
def test_encodings():
    s = 'foo 1.0'
    v = Version.parse(s)
    if sys.version_info.major == 2:
        assert(str(v) == s)
        assert(unicode(v) == coerce_to_unicode(s))
    if sys.version_info.major == 3:
        assert(str(v) == coerce_to_unicode(s))
    s = u'foo 1.0'
    v = Version.parse(s)
    if sys.version_info.major == 2:
        assert(str(v) == str(s))
        assert(unicode(v) == s)
    if sys.version_info.major == 3:
        assert(str(v) == s)
    s = b'foo 1.0'
    v = Version.parse(s)
    if sys.version_info.major == 2:
        assert(str(v) == s)
        assert(unicode(v) == coerce_to_unicode(s))
    if sys.version_info.major == 3:
        assert(str(v) == coerce_to_unicode(s))
def test_bad():
    # Bogus data which shouldn't parse
    s = 'pop3d'
    assert_raises(VersionParseError, Version.parse, s)
    s = 'Dovecot pop3d'
    assert_raises(VersionParseError, Version.parse, s)
    s = '(protocol version 30)'
    assert_raises(VersionParseError, Version.parse, s)
    # TODO find_version
    #s = 'netkit-rsh rexecd None'
    #assert_raises(VersionParseError, Version.parse, s)
    # TODO find_version
    #s = 'Postfix smtpd None'
    #assert_raises(VersionParseError, Version.parse, s)
    # TODO find_version
    #s = 'Linux telnetd None'
    #assert_raises(VersionParseError, Version.parse, s)
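`assert_raises` is used but not defined in this chunk. Assuming no external test framework is in play, a minimal compatible helper could look like this (illustrated with a built-in exception, since `VersionParseError` is not defined here):

```python
def assert_raises(exc_type, func, *args, **kwargs):
    """Fail unless func(*args, **kwargs) raises exc_type."""
    try:
        func(*args, **kwargs)
    except exc_type:
        return
    raise AssertionError('%s was not raised' % exc_type.__name__)

assert_raises(ValueError, int, 'pop3d')  # passes: int('pop3d') raises ValueError
```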
| 26.538232 | 110 | 0.532808 | 13,623 | 88,850 | 3.426264 | 0.052558 | 0.189562 | 0.071407 | 0.075885 | 0.847223 | 0.821686 | 0.79527 | 0.739652 | 0.700317 | 0.659547 | 0 | 0.072801 | 0.278492 | 88,850 | 3,347 | 111 | 26.546161 | 0.655305 | 0.196624 | 0 | 0.657064 | 0 | 0.005487 | 0.114279 | 0.014987 | 0 | 0 | 0 | 0.000299 | 0.693644 | 1 | 0.006859 | false | 0.001829 | 0 | 0.000457 | 0.007773 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ce4a08c39194bc4f4b65f5708f5542c7f128e0d7 | 6,686 | py | Python | script.py | brunoggregorio/DCNN-feature-extraction | 08b6bc751612f98a147afbe55051d3c3aec3693e | [
"Apache-2.0"
] | null | null | null | script.py | brunoggregorio/DCNN-feature-extraction | 08b6bc751612f98a147afbe55051d3c3aec3693e | [
"Apache-2.0"
] | 10 | 2020-09-25T22:16:54.000Z | 2022-02-10T02:53:43.000Z | script.py | brunoggregorio/DCNN-feature-extraction | 08b6bc751612f98a147afbe55051d3c3aec3693e | [
"Apache-2.0"
] | null | null | null | """!
"""
import numpy as np
from dcnn_mtm import dcnn_mtm
# Create global variables
# =======================
base_path = "/home/brunoggregorio/Workspace/data/dataset/"
dcnn_model = ['Xception',
              'VGG16', 'VGG19',
              'ResNet50', 'ResNet101', 'ResNet152',
              'ResNet50V2', 'ResNet101V2', 'ResNet152V2',
              'InceptionV3', 'InceptionResNetV2',
              'DenseNet121', 'DenseNet169', 'DenseNet201',
              'NASNetLarge']
min_side = 1000
max_side = 1400
thres_feature = 0.9
retained_value = 0.1
radius_feature = 5.0
pyramid_levels = 1
thres_ecc = 0.62
constant = -25.5
thres_binary = np.arange(0.7, 0.96, 0.05)
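`np.arange(0.7, 0.96, 0.05)` defines the binarization-threshold sweep. Written out explicitly (values rounded, since `arange` on floats accumulates small drift), the grid it produces is:

```python
# Pure-Python equivalent of the np.arange threshold grid, rounded to 2 decimals.
grid = [round(0.7 + 0.05 * k, 2) for k in range(6)]
print(grid)  # [0.7, 0.75, 0.8, 0.85, 0.9, 0.95]
```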
# =====================================================
# SPINALCORD 1
# =====================================================
video = "spinalcord_1"
aux_dcnn_model = ['ResNet152V2',
                  'InceptionV3', 'InceptionResNetV2',
                  'DenseNet121', 'DenseNet169', 'DenseNet201',
                  'NASNetLarge']
folder_path = base_path + video + "/frames/"
mask_img = base_path + video + "/" + video + "_mask.png"
ground_truth = base_path + video + "/" + video + "_gt.txt"
# Getting started
print("------------------------------")
print("Processing video:", video)
print("------------------------------")
# For each model in the list
for model in aux_dcnn_model:
    # For each number of templates
    for n_tmpl in ['1', '2', '3']:
        # Get proper file of template positions
        template = base_path + video + "/" + video + "_" + n_tmpl + "-templates.txt"
        print("Model: {}, # Templates: {}".format(model, n_tmpl))

        # For each threshold for TM image binarization
        for thresh in thres_binary:
            # Do the real work
            dcnn_mtm(folder_path=folder_path,
                     mask_img=mask_img,
                     ground_truth=ground_truth,
                     template=template,
                     dcnn_model=model,
                     thres_binary=thresh)
# =====================================================
# MESENTERY 1
# =====================================================
video = "mesentery_1"
folder_path = base_path + video + "/frames/"
mask_img = base_path + video + "/" + video + "_mask.png"
ground_truth = base_path + video + "/" + video + "_gt.txt"
# Getting started
print("------------------------------")
print("Processing video:", video)
print("------------------------------")
# For each model in the list
for model in dcnn_model:
    # For each number of templates
    for n_tmpl in ['1', '2', '3']:
        # Get proper file of template positions
        template = base_path + video + "/" + video + "_" + n_tmpl + "-templates.txt"
        print("Model: {}, # Templates: {}".format(model, n_tmpl))

        # For each threshold for TM image binarization
        for thresh in thres_binary:
            # Do the real work
            dcnn_mtm(folder_path=folder_path,
                     mask_img=mask_img,
                     ground_truth=ground_truth,
                     template=template,
                     dcnn_model=model,
                     thres_binary=thresh)
# =====================================================
# CREMASTER 2
# =====================================================
video = "cremaster_2"
folder_path = base_path + video + "/frames/"
mask_img = base_path + video + "/" + video + "_mask.png"
ground_truth = base_path + video + "/" + video + "_gt.txt"
# Getting started
print("------------------------------")
print("Processing video:", video)
print("------------------------------")
# For each model in the list
for model in dcnn_model:
    # For each number of templates
    for n_tmpl in ['1', '2', '3']:
        # Get proper file of template positions
        template = base_path + video + "/" + video + "_" + n_tmpl + "-templates.txt"
        print("Model: {}, # Templates: {}".format(model, n_tmpl))

        # For each threshold for TM image binarization
        for thresh in thres_binary:
            # Do the real work
            dcnn_mtm(folder_path=folder_path,
                     mask_img=mask_img,
                     ground_truth=ground_truth,
                     template=template,
                     dcnn_model=model,
                     thres_binary=thresh)
# =====================================================
# CREMASTER 1
# =====================================================
video = "cremaster_1"
folder_path = base_path + video + "/frames/"
mask_img = base_path + video + "/" + video + "_mask.png"
ground_truth = base_path + video + "/" + video + "_gt.txt"
# Getting started
print("------------------------------")
print("Processing video:", video)
print("------------------------------")
# For each model in the list
for model in dcnn_model:
    # For each number of templates
    for n_tmpl in ['1', '2', '3']:
        # Get proper file of template positions
        template = base_path + video + "/" + video + "_" + n_tmpl + "-templates.txt"
        print("Model: {}, # Templates: {}".format(model, n_tmpl))

        # For each threshold for TM image binarization
        for thresh in thres_binary:
            # Do the real work
            dcnn_mtm(folder_path=folder_path,
                     mask_img=mask_img,
                     ground_truth=ground_truth,
                     template=template,
                     dcnn_model=model,
                     thres_binary=thresh)
# =====================================================
# BRAIN 2
# =====================================================
video = "brain_2"
folder_path = base_path + video + "/frames/"
mask_img = base_path + video + "/" + video + "_mask.png"
ground_truth = base_path + video + "/" + video + "_gt.txt"
# Getting started
print("------------------------------")
print("Processing video:", video)
print("------------------------------")
# For each model in the list
for model in dcnn_model:
    # For each number of templates
    for n_tmpl in ['1', '2', '3']:
        # Get proper file of template positions
        template = base_path + video + "/" + video + "_" + n_tmpl + "-templates.txt"
        print("Model: {}, # Templates: {}".format(model, n_tmpl))

        # For each threshold for TM image binarization
        for thresh in thres_binary:
            # Do the real work
            dcnn_mtm(folder_path=folder_path,
                     mask_img=mask_img,
                     ground_truth=ground_truth,
                     template=template,
                     dcnn_model=model,
thres_binary=thresh) | 32.77451 | 84 | 0.49641 | 676 | 6,686 | 4.702663 | 0.152367 | 0.052847 | 0.081787 | 0.084932 | 0.867568 | 0.867568 | 0.867568 | 0.815351 | 0.815351 | 0.815351 | 0 | 0.021246 | 0.274903 | 6,686 | 204 | 85 | 32.77451 | 0.634488 | 0.239605 | 0 | 0.79646 | 0 | 0 | 0.212043 | 0.068362 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.017699 | 0 | 0.017699 | 0.176991 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cec355c7fedbd3705030b1c022349663b1928e88 | 215 | py | Python | device.py | nisarkhanatwork/mctsnet | 2ff9e8234bd4a944246aab803e3dd07082042f62 | [
"Apache-2.0"
] | 6 | 2020-07-06T02:28:25.000Z | 2021-11-05T08:08:24.000Z | device.py | dixantmittal/async-dqn | 7d50c3b6524fd5cf872a1222595664f527ce8760 | [
"MIT"
] | null | null | null | device.py | dixantmittal/async-dqn | 7d50c3b6524fd5cf872a1222595664f527ce8760 | [
"MIT"
] | 1 | 2021-02-19T20:22:46.000Z | 2021-02-19T20:22:46.000Z | import torch as t
class Device:
    device = t.device('cpu')

    @staticmethod
    def set_device(key):
        Device.device = t.device(key)

    @staticmethod
    def get_device():
        return Device.device
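The helper above is a process-wide default-device registry: set the device once at startup, read it anywhere. Since `torch` may not be available here, this sketch mirrors the same pattern with plain strings standing in for `t.device` objects:

```python
class DeviceRegistry:
    """String-based stand-in for the torch-backed Device helper above."""
    device = 'cpu'  # stand-in for t.device('cpu')

    @staticmethod
    def set_device(key):
        DeviceRegistry.device = key

    @staticmethod
    def get_device():
        return DeviceRegistry.device

DeviceRegistry.set_device('cuda:0')
print(DeviceRegistry.get_device())  # cuda:0
```

Keeping the state on the class (rather than an instance) is what makes it a global: every module that imports it sees the same value.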
| 15.357143 | 37 | 0.623256 | 27 | 215 | 4.888889 | 0.481481 | 0.272727 | 0.19697 | 0.287879 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.27907 | 215 | 13 | 38 | 16.538462 | 0.851613 | 0 | 0 | 0.222222 | 0 | 0 | 0.013953 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.111111 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
0c7eca3dddeaaa8087e2510298db7cb36bb9e7dc | 30,335 | py | Python | Utilities/MythBAdminCommands.py | xManu03/BombSquad-Mods | 465ee7767d6ef51d42ce6def8048be0820f03233 | [
"MIT"
] | null | null | null | Utilities/MythBAdminCommands.py | xManu03/BombSquad-Mods | 465ee7767d6ef51d42ce6def8048be0820f03233 | [
"MIT"
] | null | null | null | Utilities/MythBAdminCommands.py | xManu03/BombSquad-Mods | 465ee7767d6ef51d42ce6def8048be0820f03233 | [
"MIT"
] | 1 | 2021-01-06T13:57:35.000Z | 2021-01-06T13:57:35.000Z | import bs #Created By MythB # http://github.com/MythB
import bsInternal
import bsPowerup
import bsUtils
import random
import os
import MythBAdminList as mbal
class chatOptions(object):
def __init__(self):
self.MythBWasHere = True
def checkDevice(self,nick):# check if in adminlist
client_str = []
for i in bsInternal._getForegroundHostSession().players:  # FIXME: the equality check can break when a player's nick contains many emoticons
if (i.getName()).encode('utf-8') == nick:  # use i.getName(True) if you need the full name
client_str = i.get_account_id()
if client_str in mbal.AdminList:
return True
else:
bsInternal._chatMessage("Commands Only For Admins")
return False
#bs.gameTimer(100,call=self.checkDevice,repeat=True)
def opt(self,nick,msg):
if self.checkDevice(nick):
m = msg.split(' ')[0] # command
a = msg.split(' ', 1)[1:] # arguments
activity = bsInternal._getForegroundHostActivity()
with bs.Context(activity):
if m == '/kick': #just remove from the game
if a == []:
bsInternal._chatMessage("MUST USE KICK ID")
else:
try:
kickedPlayerID = int(a[0])
except Exception:
bsInternal._chatMessage("PLAYER NOT FOUND")
else:
if not kickedPlayerID == -1:
bsInternal._disconnectClient(kickedPlayerID)
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
bsInternal._chatMessage("CANT KICK HOST")
elif m == '/list': #list of current players id
bsInternal._chatMessage("==========PLAYER KICK IDS==========")
for i in bsInternal._getGameRoster():
try:
bsInternal._chatMessage(i['players'][0]['nameFull'] + " kick ID " + str(i['clientID']))
except Exception:
pass
bsInternal._chatMessage("==========PLAYER IDS=============")
for s in bsInternal._getForegroundHostSession().players:
bsInternal._chatMessage(s.getName() +" ID = "+ str(bsInternal._getForegroundHostSession().players.index(s)))
elif m == '/ban':# add id to banlist=autokick list
if a == []:
bsInternal._chatMessage("MUST USE PLAYER ID OR NICK")  # FIXME: also guard every repeated bsInternal._chatMessage call like this to stop chat loops (update: FIXED)
else:  # FIXME: treat the argument as a nick if it is longer than 2 characters, otherwise as a player ID
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
bannedClient = i.getInputDevice().getClientID()
bannedName = i.getName().encode('utf-8')
bannedPlayerID = i.get_account_id()
foolist = []
foolist = mbal.autoKickList
if bannedPlayerID not in foolist:
foolist.append(bannedPlayerID)
bsInternal._chatMessage(str(bannedName) + " Banned")
i.removeFromGame()
else:
bsInternal._chatMessage(str(bannedName) + " Already Banned")
with open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py") as file:
s = [row for row in file]
s[7] = 'autoKickList = '+ str(foolist) + '\n'
f = open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py",'w')
for i in s:
f.write(i)
f.close()
reload(mbal)
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bannedClient = bsInternal._getForegroundHostSession().players[int(a[0])]
except Exception:
bsInternal._chatMessage("PLAYER NOT FOUND")
else:
foolist = []
foolist = mbal.autoKickList
bannedPlayerID = bannedClient.get_account_id()
if bannedPlayerID not in foolist:
foolist.append(bannedPlayerID)
bsInternal._chatMessage(str(bannedClient) + " Banned")
bannedClient.removeFromGame()
else:
bsInternal._chatMessage(str(bannedClient) + " Already Banned")
with open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py") as file:
s = [row for row in file]
s[7] = 'autoKickList = '+ str(foolist) + '\n'
f = open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py",'w')
for i in s:
f.write(i)
f.close()
reload(mbal)
elif m == '/unban':# remove id from banlist=autokick list
if a == []:
bsInternal._chatMessage("MUST USE PLAYER ID OR NICK")
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
bannedClient = i.getInputDevice().getClientID()
bannedName = i.getName().encode('utf-8')
bannedPlayerID = i.get_account_id()
foolist = []
foolist = mbal.autoKickList
if bannedPlayerID in foolist:
foolist.remove(bannedPlayerID)
bsInternal._chatMessage(str(bannedName) + " be free now!")
else:
bsInternal._chatMessage(str(bannedName) + " Already Not Banned")
with open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py") as file:
s = [row for row in file]
s[7] = 'autoKickList = '+ str(foolist) + '\n'
f = open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py",'w')
for i in s:
f.write(i)
f.close()
reload(mbal)
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bannedClient = bsInternal._getForegroundHostSession().players[int(a[0])]
except Exception:
bsInternal._chatMessage("PLAYER NOT FOUND")
else:
foolist = []
foolist = mbal.autoKickList
bannedPlayerID = bannedClient.get_account_id()
if bannedPlayerID in foolist:
foolist.remove(bannedPlayerID)
bsInternal._chatMessage(str(bannedClient) + " be free now!")
else:
bsInternal._chatMessage(str(bannedClient) + " Already Not Banned")
with open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py") as file:
s = [row for row in file]
s[7] = 'autoKickList = '+ str(foolist) + '\n'
f = open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py",'w')
for i in s:
f.write(i)
f.close()
reload(mbal)
elif m == '/amnesty': # reset blacklist
foolist = []
bsInternal._chatMessage("==========FREEDOM TO ALL==========")
bsInternal._chatMessage("=========BLACKLIST WIPED=========")
with open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py") as file:
s = [row for row in file]
s[7] = 'autoKickList = '+ str(foolist) + '\n'
f = open(bs.getEnvironment()['systemScriptsDirectory'] + "/MythBAdminList.py",'w')
for i in s:
f.write(i)
f.close()
reload(mbal)
elif m == '/camera': #change camera mode
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
try:
if bs.getSharedObject('globals').cameraMode == 'follow':
bs.getSharedObject('globals').cameraMode = 'rotate'
else:
bs.getSharedObject('globals').cameraMode = 'follow'
except Exception:
bsInternal._chatMessage('AN ERROR OCCURRED')
elif m == '/maxplayers': #set maxplayers limit
if a == []:
bsInternal._chatMessage('MUST USE NUMBERS')
else:
try:
bsInternal._getForegroundHostSession()._maxPlayers = int(a[0])
bsInternal._setPublicPartyMaxSize(int(a[0]))
bsInternal._chatMessage('MaxPlayers = '+str(int(a[0])))
except Exception:
bsInternal._chatMessage('AN ERROR OCCURRED')
elif m == '/help': #show help
bsInternal._chatMessage("=====================COMMANDS=====================")
bsInternal._chatMessage("list-kick-remove-ban-unban-amnesty-kill-curse-end-heal")
bsInternal._chatMessage("freeze-thaw-headless-shield-punch-maxplayers-headlessall")
bsInternal._chatMessage("killall-freezeall-shieldall-punchall-camera-slow")
elif m == '/remove': #remove from game
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
i.removeFromGame()
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].removeFromGame()
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/curse': #curse
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.curse()
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.curse()
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/curseall': #curse all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.curse()
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/kill': #kill
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage(bs.DieMessage())
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage(bs.DieMessage())
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/killall': #kill all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage(bs.DieMessage())
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/freeze': #freeze
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage(bs.FreezeMessage())
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage(bs.FreezeMessage())
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/freezeall': #freeze all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage(bs.FreezeMessage())
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/thaw': #thaw
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage(bs.ThawMessage())
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage(bs.ThawMessage())
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/thawall': #thaw all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage(bs.ThawMessage())
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/headless': #headless
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.headModel = None
i.actor.node.style = "cyborg"
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.headModel = None
bs.getActivity().players[int(a[0])].actor.node.style = "cyborg"
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/headlessall': #headless all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.headModel = None
i.actor.node.style = "cyborg"
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/heal': #heal
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage(bs.PowerupMessage(powerupType = 'health'))
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage(bs.PowerupMessage(powerupType = 'health'))
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/healall': #heal all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage(bs.PowerupMessage(powerupType = 'health'))
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/shield': #shield
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage(bs.PowerupMessage(powerupType = 'shield'))
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage(bs.PowerupMessage(powerupType = 'shield'))
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/shieldall': #shield all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage(bs.PowerupMessage(powerupType = 'shield'))
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/punch': #punch
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage(bs.PowerupMessage(powerupType = 'punch'))
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage(bs.PowerupMessage(powerupType = 'punch'))
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/punchall': #punch all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage(bs.PowerupMessage(powerupType = 'punch'))
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/knock': #knock him
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage("knockout",5000)
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage("knockout",5000)
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/knockall': #knock all
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage("knockout",5000)
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/celebrate': #celebrate him
if a == []:
bsInternal._chatMessage('MUST USE PLAYER ID OR NICK')
else:
if len(a[0]) > 2:
for i in bs.getActivity().players:
try:
if (i.getName()).encode('utf-8') == (a[0]):
if i.actor.exists():
i.actor.node.handleMessage('celebrate', 30000)
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
else:
try:
bs.getActivity().players[int(a[0])].actor.node.handleMessage('celebrate', 30000)
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
except Exception:
bsInternal._chatMessage('PLAYER NOT FOUND')
elif m == '/celebrateall': #celebrate
for i in bs.getActivity().players:
try:
if i.actor.exists():
i.actor.node.handleMessage('celebrate', 30000)
except Exception:
pass
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
elif m == '/slow': # slow-mo
bsInternal._chatMessage(bs.getSpecialChar('logoFlat'))
try:
if bs.getSharedObject('globals').slowMotion == True:
bs.getSharedObject('globals').slowMotion = False
else:
bs.getSharedObject('globals').slowMotion = True
except Exception:
bsInternal._chatMessage('AN ERROR OCCURRED')
elif m == '/end': # just end game
try:
bsInternal._getForegroundHostActivity().endGame()
bsInternal._chatMessage('THE END')
except Exception:
bsInternal._chatMessage('AN ERROR OCCURRED')
c = chatOptions()
def cmd(msg):
    if bsInternal._getForegroundHostActivity() is not None:
        n = msg.split(': ')
        c.opt(n[0], n[1])

bs.realTimer(5000, bs.Call(bsInternal._setPartyIconAlwaysVisible, True))
import bsUI
bs.realTimer(10000, bs.Call(bsUI.onPartyIconActivate, (0, 0)))  # THAT'S THE TRICKY PART - check ==> bsUI line 23858 / _handleLocalChatMessage
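The `cmd` hook splits an incoming chat line of the form `'Nick: /command args'`, and `opt` then separates the command word from its arguments. A standalone sketch of that parsing (hypothetical helper name; unlike the original, it uses `maxsplit=1` on the nick separator, which is safer if the message body itself contains `': '`):

```python
def parse_chat(msg):
    # 'MythB: /kick 3' -> ('MythB', '/kick', ['3']); mirrors cmd() + opt().
    nick, text = msg.split(': ', 1)
    command = text.split(' ')[0]     # the '/command' word
    args = text.split(' ', 1)[1:]    # [] when no argument was given
    return nick, command, args

print(parse_chat('MythB: /kick 3'))  # ('MythB', '/kick', ['3'])
print(parse_chat('MythB: /list'))    # ('MythB', '/list', [])
```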
| 56.072089 | 163 | 0.391297 | 2,149 | 30,335 | 5.466729 | 0.114007 | 0.166241 | 0.072438 | 0.11653 | 0.780303 | 0.756469 | 0.739445 | 0.723698 | 0.720804 | 0.706759 | 0 | 0.009009 | 0.516994 | 30,335 | 540 | 164 | 56.175926 | 0.792656 | 0.02802 | 0 | 0.77381 | 0 | 0 | 0.084542 | 0.014538 | 0 | 0 | 0 | 0.001852 | 0 | 0 | null | null | 0.047619 | 0.015873 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0cd7937da582e2f12b263cadb692d74cedd76ec3 | 191 | py | Python | cli/medperf/tests/mocks/cube.py | johnugeorge/medperf | 5bc3f643064df14e9476bd4d4c1a4c0cce5337d5 | [
"Apache-2.0"
] | 1 | 2021-09-24T18:09:53.000Z | 2021-09-24T18:09:53.000Z | cli/medperf/tests/mocks/cube.py | johnugeorge/medperf | 5bc3f643064df14e9476bd4d4c1a4c0cce5337d5 | [
"Apache-2.0"
] | 2 | 2021-09-27T16:14:04.000Z | 2021-11-03T14:24:54.000Z | cli/medperf/tests/mocks/cube.py | johnugeorge/medperf | 5bc3f643064df14e9476bd4d4c1a4c0cce5337d5 | [
"Apache-2.0"
] | null | null | null | class MockCube:
    def __init__(self, is_valid):
        self.name = "Test"
        self.valid = is_valid

    def is_valid(self):
        return self.valid

    def run(self):
        pass
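A brief usage sketch (hypothetical, not part of the repo) of how such a mock doubles for a real cube in a test; the class is restated so the snippet stands alone:

```python
class MockCube:
    """Minimal stand-in for a real cube, mirroring the class above."""
    def __init__(self, is_valid):
        self.name = "Test"
        self.valid = is_valid

    def is_valid(self):
        return self.valid

    def run(self):
        pass


# A test can swap the real cube for the mock and force either validity outcome.
good_cube = MockCube(is_valid=True)
bad_cube = MockCube(is_valid=False)
assert good_cube.is_valid() and not bad_cube.is_valid()
good_cube.run()  # no-op, so tests never execute a real workload
```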

# pyscfad/scf/__init__.py (yangdatou/pyscfad, MIT)
from pyscfad.scf import hf


def RHF(mol, **kwargs):
    return hf.RHF(mol, **kwargs)

# utils/logger.py (AnonymityCode/FastLFnet, MIT)
import logging
import os


def _setup_logger(name, filepath=None):
    """Build a logger that writes DEBUG records to a file and INFO to the console."""
    logger = logging.getLogger(name)
    logger.setLevel(level=logging.DEBUG)
    # Set formatters: the file format also records the level name.
    formatter_file = logging.Formatter(
        '[%(asctime)s %(levelname)s %(lineno)4s] %(message)s',
        datefmt='%Y/%m/%d %H:%M:%S')
    formatter_stream = logging.Formatter(
        '[%(asctime)s %(lineno)4s] %(message)s', datefmt='%Y/%m/%d %H:%M')
    # FileHandler (only when a path was given)
    if filepath is not None:
        if not os.path.exists(os.path.dirname(filepath)):
            os.makedirs(os.path.dirname(filepath))
        file_handler = logging.FileHandler(filename=filepath)
        file_handler.setLevel(logging.DEBUG)
        file_handler.setFormatter(formatter_file)
        logger.addHandler(file_handler)
    # StreamHandler
    stream_handler = logging.StreamHandler()
    stream_handler.setLevel(logging.INFO)
    stream_handler.setFormatter(formatter_stream)
    logger.addHandler(stream_handler)
    return logger


def setup_logger1(filepath=None):
    return _setup_logger('logger1', filepath)


def setup_logger2(filepath=None):
    return _setup_logger('logger2', filepath)
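A stand-alone sketch of the same handler wiring, showing what a caller ends up with (the log path below is illustrative only; the stream handler is omitted so the snippet stays quiet):

```python
import logging
import os
import tempfile

# Illustrative log location; real callers pass their own filepath.
log_path = os.path.join(tempfile.mkdtemp(), "run.log")

logger = logging.getLogger("demo_logger")
logger.setLevel(logging.DEBUG)

# File handler: everything from DEBUG up, with timestamps and level names.
file_handler = logging.FileHandler(filename=log_path)
file_handler.setLevel(logging.DEBUG)
file_handler.setFormatter(logging.Formatter(
    '[%(asctime)s %(levelname)s %(lineno)4s] %(message)s',
    datefmt='%Y/%m/%d %H:%M:%S'))
logger.addHandler(file_handler)

logger.debug("written to the file at DEBUG level")
file_handler.close()  # flush so the record is on disk
```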

# env/Lib/site-packages/win32/file/__init__.py (Daniel-Key/HearStone-Python, MIT)
from win32._file import *
from win32._file import _get_osfhandle

# ppr-api/migrations/versions/927e555db242_increase_party_first_name_key_length.py (cameron-freshworks/ppr, Apache-2.0)
"""Increase party.first_name_key length.
Revision ID: 927e555db242
Revises: f27b7c10458f
Create Date: 2022-03-07 12:13:52.120132
"""
from alembic import op
import sqlalchemy as sa
from alembic_utils.pg_function import PGFunction
from alembic_utils.pg_view import PGView
from sqlalchemy import text as sql_text
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = '927e555db242'
down_revision = 'f27b7c10458f'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('parties', 'first_name_key',
                    existing_type=sa.VARCHAR(length=50),
                    type_=sa.String(length=100),
                    existing_nullable=True)
    public_searchkey_individual = PGFunction(
        schema="public",
        signature="searchkey_individual(last_name character varying, first_name character varying)",
        definition="RETURNS character varying\n LANGUAGE plpgsql\n AS $function$\nDECLARE\n v_ind_key VARCHAR(100);\n\t\tv_last_name VARCHAR(50);\n\t\tv_first_name VARCHAR(50);\n BEGIN\n\t -- Remove special characters last name\n v_last_name := regexp_replace(LAST_NAME,'[^\\w]+',' ','gi');\n -- Remove prefixes suffixes last name\n\t\tv_last_name := regexp_replace(v_last_name,'\\y(DR|MR|MRS|MS|CH|DE|DO|DA|LE|LA|MA|JR|SR|I|II|III)\\y','','gi');\n\t\t-- Remove extra spaces\n\t\tv_last_name := trim(regexp_replace(v_last_name, '\\s+', ' ', 'gi'));\n\t\t-- Remove repeating letters\n\t\tv_last_name := regexp_replace(v_last_name, '(.)\\1{1,}', '\\1', 'g');\n\t\t-- Remove special characters first name\n v_first_name := regexp_replace(first_name,'[^\\w]+',' ','gi');\n -- Remove prefixes first name\n\t\tv_first_name := regexp_replace(v_first_name,'\\y(DR|MR|MRS|MS|CH|DE|DO|DA|LE|LA|MA|JR|SR|I|II|III)\\y','','gi');\n\t\t-- Remove extra spaces\n\t\tv_first_name := trim(regexp_replace(v_first_name, '\\s+', ' ', 'gi'));\n\t\t-- Remove repeating letters\n\t\tv_first_name := regexp_replace(v_first_name, '(.)\\1{1,}', '\\1', 'g');\n\n\t\t-- join last first name\n\t\tv_ind_key := v_last_name||' '||v_first_name;\n\n RETURN UPPER(v_ind_key);\n END\n ; \n $function$"
    )
    op.replace_entity(public_searchkey_individual)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    public_searchkey_individual = PGFunction(
        schema="public",
        signature="searchkey_individual(last_name character varying, first_name character varying)",
        definition="returns character varying\n LANGUAGE plpgsql\nAS $function$\nDECLARE\n v_ind_key VARCHAR(120);\n\t\tv_last_name VARCHAR(60);\n\t\tv_first_name VARCHAR(60);\n BEGIN\n\t -- Remove special characters last name\n v_last_name := regexp_replace(LAST_NAME,'[^\\w]+',' ','gi');\n -- Remove prefixes suffixes last name\n\t\tv_last_name := regexp_replace(v_last_name,'\\y(DR|MR|MRS|MS|CH|DE|DO|DA|LE|LA|MA|JR|SR|I|II|III)\\y','','gi');\n\t\t-- Remove extra spaces\n\t\tv_last_name := trim(regexp_replace(v_last_name, '\\s+', ' ', 'gi'));\n\t\t-- Remove repeating letters\n\t\tv_last_name := regexp_replace(v_last_name, '(.)\\1{1,}', '\\1', 'g');\n\t\t-- Remove special characters first name\n v_first_name := regexp_replace(first_name,'[^\\w]+',' ','gi');\n -- Remove prefixes first name\n\t\tv_first_name := regexp_replace(v_first_name,'\\y(DR|MR|MRS|MS|CH|DE|DO|DA|LE|LA|MA|JR|SR|I|II|III)\\y','','gi');\n\t\t-- Remove extra spaces\n\t\tv_first_name := trim(regexp_replace(v_first_name, '\\s+', ' ', 'gi'));\n\t\t-- Remove repeating letters\n\t\tv_first_name := regexp_replace(v_first_name, '(.)\\1{1,}', '\\1', 'g');\n\n\t\t-- join last first name\n\t\tv_ind_key := v_last_name||' '||v_first_name;\n\n RETURN UPPER(v_ind_key);\n END\n ;\n$function$"
    )
    op.replace_entity(public_searchkey_individual)
    op.alter_column('parties', 'first_name_key',
                    existing_type=sa.String(length=100),
                    type_=sa.VARCHAR(length=50),
                    existing_nullable=True)
    # ### end Alembic commands ###
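The pl/pgsql function in this migration normalizes both names before joining them into a search key. A Python sketch of the same steps, for illustration only (the database function remains authoritative):

```python
import re

_PREFIXES = r'\b(DR|MR|MRS|MS|CH|DE|DO|DA|LE|LA|MA|JR|SR|I|II|III)\b'


def _normalize(name):
    # Mirror the SQL steps: strip non-word chars, drop title prefixes/suffixes,
    # collapse whitespace, then collapse runs of the same letter.
    name = re.sub(r'[^\w]+', ' ', name)
    name = re.sub(_PREFIXES, '', name, flags=re.IGNORECASE)
    name = re.sub(r'\s+', ' ', name).strip()
    name = re.sub(r'(.)\1{1,}', r'\1', name)
    return name


def searchkey_individual(last_name, first_name):
    """Python sketch of the pl/pgsql searchkey_individual above."""
    return (_normalize(last_name) + ' ' + _normalize(first_name)).upper()
```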

# src/plotResultNew.py (andertavares/syntheticmdps, MIT)
#!/usr/bin/python3
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
import pickle
import os
import subprocess
def listToString(p):
    """Join the numbers in p into a comma-separated string."""
    return ",".join(str(x) for x in p)


def calcConfInt(p):
    """Lower bound of the 99% confidence interval of p, computed via R's t.test."""
    if all(element == p[0] for element in p):
        # t.test is undefined for constant samples; treat the interval as degenerate.
        return 0
    with open("tmp.R", "w") as f:
        f.write("#!/usr/bin/Rscript\n")
        f.write("print(t.test(c(" + listToString(p) + "),conf.level=0.99))\n")
    os.system("chmod +x ./tmp.R")
    output = subprocess.check_output("./tmp.R", stderr=subprocess.STDOUT, shell=True)
    output = output.split()
    return float(output[31])
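The helper above shells out to R for the t-test interval. A dependency-free alternative using only the standard library is sketched below; it uses a normal quantile instead of Student's t, so it is only a rough stand-in at small sample sizes:

```python
import statistics


def conf_int_lower(samples, conf_level=0.99):
    """Approximate lower bound of the two-sided confidence interval of the mean.

    Normal approximation via statistics.NormalDist; close to the R-based
    helper only when len(samples) is reasonably large.
    """
    if len(set(samples)) == 1:
        return 0  # degenerate case, mirroring the R helper
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / len(samples) ** 0.5
    # Two-sided interval: put (1 - conf_level) / 2 probability in each tail.
    z = statistics.NormalDist().inv_cdf(1 - (1 - conf_level) / 2)
    return mean - z * sem
```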
team_sizes = [5, 10, 15, 20, 25]
n_arms = [100, 150, 200, 250, 300]
upper_bounds = [0.30, 0.40, 0.50, 0.60, 0.70, 0.80, 0.90]
resultsFolder = "../../results/results_uniform_decay_e"

# result[12..17] hold the six tau metrics, in this order:
METRIC_NAMES = ["Reward", "PBest", "TimesBest", "CumulativeReward",
                "CumulativeRegret", "CumulativeRegretExp"]


def load_metrics(n, team_sz, u):
    """Mean and CI lower bound of each metric over the 5 experiment repetitions."""
    runs = [[] for _ in METRIC_NAMES]
    for expNumber in range(5):
        path = (resultsFolder + "/" + str(expNumber) + "/" + str(n) + "/"
                + str(team_sz) + "/" + '%.2f' % u + "/results.pickle")
        with open(path, "rb") as pickleFile:
            result = pickle.load(pickleFile)
        for metric, values in enumerate(runs):
            values.append(result[12 + metric])
    means = [np.mean(values) for values in runs]
    lowCIs = [calcConfInt(values) for values in runs]
    return means, lowCIs


def plot_sweep(sweep_values, metrics_of, xlabel, filename_of, capsize=None):
    """Plot each metric (with CI error bars) against the swept parameter."""
    means = [[] for _ in METRIC_NAMES]
    lowCIs = [[] for _ in METRIC_NAMES]
    for value in sweep_values:
        value_means, value_lowCIs = metrics_of(value)
        for m in range(len(METRIC_NAMES)):
            means[m].append(value_means[m])
            lowCIs[m].append(value_lowCIs[m])
    for m, name in enumerate(METRIC_NAMES):
        plt.figure(figsize=(3.0, 2.0))
        plt.errorbar(sweep_values, means[m],
                     yerr=np.array(means[m]) - np.array(lowCIs[m]),
                     capsize=capsize)
        plt.xlabel(xlabel)
        plt.ylabel("Tau")
        plt.savefig(filename_of(name), bbox_inches='tight')
        plt.close()


# tau as team size grows
for n in n_arms:
    for u in upper_bounds:
        plot_sweep(team_sizes,
                   lambda team_sz: load_metrics(n, team_sz, u),
                   "Team Size",
                   lambda name: ("plots/tau" + name + "-ChangeTeamSize-"
                                 + str(n) + "-" + str(u) + "-uniform.pdf"))

# tau as problem size grows
for u in upper_bounds:
    for team_sz in team_sizes:
        plot_sweep(n_arms,
                   lambda n_: load_metrics(n_, team_sz, u),
                   "Problem Size",
                   lambda name: ("plots/tau" + name + "-ChangeProblemSize-"
                                 + str(team_sz) + "-" + str(u) + "-uniform.pdf"))

# tau as upper bound grows
for n in n_arms:
    for team_sz in team_sizes:
        plot_sweep(upper_bounds,
                   lambda u_: load_metrics(n, team_sz, u_),
                   "Upper Bound",
                   lambda name: ("plots/tau" + name + "-ChangeUpperBound-"
                                 + str(n) + "-" + str(team_sz) + "-uniform.pdf"),
                   capsize=3)

# froide/guide/templatetags/guidance_tags.py (sthapa/froide, MIT)
from collections import defaultdict
from django import template

from ..models import Guidance

register = template.Library()


def _guidance_context(context, message):
    if not hasattr(message, "guidances"):
        # Get all problem reports for all messages of the request in one query
        request = message.request
        guidances = Guidance.objects.filter(
            message__in=request.messages
        ).select_related("action", "rule")
        message_guidances = defaultdict(list)
        for guidance in guidances:
            message_guidances[guidance.message_id].append(guidance)
        for mes in request.messages:
            mes.guidances = message_guidances[mes.id]
            # Avoid a query per guidance by assigning the message directly
            for guidance in mes.guidances:
                guidance.message = mes
    return {
        "request": context["request"],
        "message": message,
        "foirequest": message.request,
    }


@register.inclusion_tag("guide/guidance.html", takes_context=True)
def render_guidance(context, message):
    return _guidance_context(context, message)


@register.inclusion_tag("foirequest/body/message/guidance.html", takes_context=True)
def render_guidance_alpha(context, message):
    return _guidance_context(context, message)

# polysquare/jobstamps (MIT)
# /jobstamps/__init__.py
#
# Module loader file for jobstamps.
#
# See /LICENCE.md for Copyright information
"""Module loader file for jobstamps."""

# tests/test_xxh64.py (charmoniumQ/python-xxhash, BSD-2-Clause)
import os
import unittest
import random

import xxhash


class TestXXH(unittest.TestCase):
    def test_xxh64(self):
        self.assertEqual(xxhash.xxh64('a').intdigest(), 15154266338359012955)
        self.assertEqual(xxhash.xxh64('a', 0).intdigest(), 15154266338359012955)
        self.assertEqual(xxhash.xxh64('a', 1).intdigest(), 16051599287423682246)
        self.assertEqual(xxhash.xxh64('a', 2**64 - 1).intdigest(), 6972758980737027682)

    def test_xxh64_intdigest(self):
        self.assertEqual(xxhash.xxh64_intdigest('a'), 15154266338359012955)
        self.assertEqual(xxhash.xxh64_intdigest('a', 0), 15154266338359012955)
        self.assertEqual(xxhash.xxh64_intdigest('a', 1), 16051599287423682246)
        self.assertEqual(xxhash.xxh64_intdigest('a', 2**64 - 1), 6972758980737027682)

    def test_xxh64_update(self):
        x = xxhash.xxh64()
        x.update('a')
        self.assertEqual(xxhash.xxh64('a').digest(), x.digest())
        self.assertEqual(xxhash.xxh64_digest('a'), x.digest())
        x.update('b')
        self.assertEqual(xxhash.xxh64('ab').digest(), x.digest())
        self.assertEqual(xxhash.xxh64_digest('ab'), x.digest())
        x.update('c')
        self.assertEqual(xxhash.xxh64('abc').digest(), x.digest())
        self.assertEqual(xxhash.xxh64_digest('abc'), x.digest())

        # Repeat with a random seed: incremental updates must match one-shot hashing.
        seed = random.randint(0, 2**64)
        x = xxhash.xxh64(seed=seed)
        x.update('a')
        self.assertEqual(xxhash.xxh64('a', seed).digest(), x.digest())
        self.assertEqual(xxhash.xxh64_digest('a', seed), x.digest())
        x.update('b')
        self.assertEqual(xxhash.xxh64('ab', seed).digest(), x.digest())
        self.assertEqual(xxhash.xxh64_digest('ab', seed), x.digest())
        x.update('c')
        self.assertEqual(xxhash.xxh64('abc', seed).digest(), x.digest())
        self.assertEqual(xxhash.xxh64_digest('abc', seed), x.digest())

    def test_xxh64_reset(self):
        x = xxhash.xxh64()
        h = x.intdigest()
        for i in range(10, 50):
            x.update(os.urandom(i))
        x.reset()
        self.assertEqual(h, x.intdigest())

    def test_xxh64_copy(self):
        a = xxhash.xxh64()
        a.update('xxhash')
        b = a.copy()
        self.assertEqual(a.digest(), b.digest())
        self.assertEqual(a.intdigest(), b.intdigest())
        self.assertEqual(a.hexdigest(), b.hexdigest())
        b.update('xxhash')
        self.assertNotEqual(a.digest(), b.digest())
        self.assertNotEqual(a.intdigest(), b.intdigest())
        self.assertNotEqual(a.hexdigest(), b.hexdigest())
        a.update('xxhash')
        self.assertEqual(a.digest(), b.digest())
        self.assertEqual(a.intdigest(), b.intdigest())
        self.assertEqual(a.hexdigest(), b.hexdigest())

    def test_xxh64_overflow(self):
        s = 'I want an unsigned 64-bit seed!'
        # Seeds are reduced modulo 2**64, so each pair below must hash identically.
        for seed_a, seed_b in [(0, 2**64), (1, 2**64 + 1), (2**65 - 1, 2**66 - 1)]:
            a = xxhash.xxh64(s, seed=seed_a)
            b = xxhash.xxh64(s, seed=seed_b)
            self.assertEqual(a.seed, b.seed)
            self.assertEqual(a.intdigest(), b.intdigest())
            self.assertEqual(a.hexdigest(), b.hexdigest())
            self.assertEqual(a.digest(), b.digest())
            for seed in (seed_a, seed_b):
                self.assertEqual(a.intdigest(), xxhash.xxh64_intdigest(s, seed=seed))
                self.assertEqual(a.digest(), xxhash.xxh64_digest(s, seed=seed))
                self.assertEqual(a.hexdigest(), xxhash.xxh64_hexdigest(s, seed=seed))


if __name__ == '__main__':
    unittest.main()

# fedora_college/core/forms.py (echevemaster/fedora-college, BSD-3-Clause)
from fedora_college.modules.profile.forms import *  # noqa
from fedora_college.modules.admin.forms import *  # noqa

# test/test-for.py (xupingmao/minipy, MIT)
import logging


# Note: 'iter' and 'list' intentionally shadow the builtins in this interpreter test.
def iter(i):
    list = []
    for y in i:
        list.append(y)
    logging.info(list)
    return list


iter(range(3))
iter(range(2, 3))
iter(range(10, 0, -1))
logging.info('range=', range)
logging.info('range(5)=', range(5))

logging.info('iterate list')
list = [1, 2, 'abc']
for key in list:
    logging.info(key)
for key in (1, 2, 3):
    logging.info('tuple', key)
for k, v in [[1, 2], [3, 4]]:
    print(k, v)
logging.info('range(5)=', range(5))
result = True | 15.543478 | 35 | 0.569231 | 118 | 715 | 3.449153 | 0.245763 | 0.243243 | 0.09828 | 0.09828 | 0.717445 | 0.717445 | 0.717445 | 0.717445 | 0.717445 | 0.717445 | 0 | 0.049724 | 0.240559 | 715 | 46 | 36 | 15.543478 | 0.699816 | 0 | 0 | 0.727273 | 0 | 0 | 0.069832 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0 | 0.060606 | 0 | 0.181818 | 0.030303 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6886b7e73966ee51d1fe9fedc2c2758a6b3b37bd | 2,699 | py | Python | unit-testing/test_algorithms_and_data_structures/test_two_sums.py | amgad01/algorithms | 53eecf06e907cde806d4b78dc78fcd70d0271e3e | [
"MIT"
] | 1 | 2021-03-05T18:13:02.000Z | 2021-03-05T18:13:02.000Z | unit-testing/test_algorithms_and_data_structures/test_two_sums.py | amgad01/algorithms | 53eecf06e907cde806d4b78dc78fcd70d0271e3e | [
"MIT"
] | null | null | null | unit-testing/test_algorithms_and_data_structures/test_two_sums.py | amgad01/algorithms | 53eecf06e907cde806d4b78dc78fcd70d0271e3e | [
"MIT"
] | 1 | 2021-07-25T01:55:12.000Z | 2021-07-25T01:55:12.000Z | from unittest import TestCase
from LeetCode.two_sum import two_sum, two_sums, two_sums_
from LeetCode.util import check_validity


class TestTwoSums(TestCase):
    # Testing two_sum implementation
    def test_two_sum_valid_solutions(self):
        nums_array = [1, 2, 3, 4, 5, 6]
        self.assertEqual(two_sum(nums_array, 10), [3, 5])
        self.assertEqual(two_sum(nums_array, 0), [])
        self.assertEqual(two_sum(nums_array, 3), [0, 1])
        self.assertIn(two_sum(nums_array, 5), [[0, 3], [1, 2]])
        self.assertIn(two_sum(nums_array, 6), [[1, 3], [0, 4]])

    def test_two_sum_invalid_arrays(self):
        nums_array = [1, 20, -10, 3, 3, 3, 33, 5, -14]
        self.assertRaises(ValueError, two_sum, nums_array, 3)
        nums_array = [1, 20, -10, 3, "number", 5]
        self.assertRaises(TypeError, two_sum, nums_array, 6)
        self.assertRaises(TypeError, two_sum, nums_array, "target")
        self.assertRaises(TypeError, two_sum, nums_array, None)

    # Testing two_sums implementation
    def test_two_sum2_valid_solutions(self):
        nums_array = [1, 2, 3, 4, 5, 6]
        self.assertIn(two_sums(nums_array, 10), [[3, 5], [5, 3]])
        self.assertEqual(two_sums(nums_array, 0), [])
        self.assertIn(two_sums(nums_array, 3), [[0, 1], [1, 0]])
        self.assertIn(two_sums(nums_array, 5), [[0, 3], [3, 0], [2, 1], [1, 2]])
        self.assertIn(two_sums(nums_array, 6), [[1, 3], [3, 1], [4, 0], [0, 4]])

    def test_two_sum2_invalid_arrays(self):
        nums_array = [1, 20, -10, 3, 3, 3, 3, 33, 5]
        self.assertRaises(ValueError, two_sums, nums_array, 6)
        nums_array = [1, 20, -10, 3, "number", 5]
        self.assertRaises(TypeError, two_sums, nums_array, 6)
        self.assertRaises(TypeError, two_sums, nums_array, "target")
        self.assertRaises(TypeError, two_sums, nums_array, None)

    # testing two_sums_ implementation
    def test_two_sum1_valid_solutions(self):
        nums_array = [1, 2, 3, 4, 5, 6]
        self.assertEqual(two_sums_(nums_array, 10), [3, 5])
        self.assertEqual(two_sums_(nums_array, 0), [])
        self.assertEqual(two_sums_(nums_array, 3), [0, 1])
        self.assertIn(two_sums_(nums_array, 5), [[0, 3], [1, 2]])
        self.assertIn(two_sums_(nums_array, 6), [[1, 3], [0, 4]])

    def test_two_sum1_invalid_arrays(self):
        nums_array = [1, 20, -10, 3, 3, 33, 5]
        self.assertRaises(ValueError, two_sums_, nums_array, 6)
        nums_array = [1, 20, -10, 3, "number", 5]
        self.assertRaises(TypeError, two_sums_, nums_array, 6)
        self.assertRaises(TypeError, two_sums_, nums_array, "target")
        self.assertRaises(TypeError, two_sums_, nums_array, None)
| 44.245902 | 80 | 0.632456 | 406 | 2,699 | 3.94335 | 0.110837 | 0.202374 | 0.123673 | 0.179888 | 0.858214 | 0.838851 | 0.800125 | 0.707058 | 0.610244 | 0.579013 | 0 | 0.070888 | 0.216006 | 2,699 | 60 | 81 | 44.983333 | 0.685728 | 0.035198 | 0 | 0.130435 | 0 | 0 | 0.013846 | 0 | 0 | 0 | 0 | 0 | 0.586957 | 1 | 0.130435 | false | 0 | 0.065217 | 0 | 0.217391 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
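The `two_sum` implementation under test is not included in this chunk. A minimal sketch consistent with the assertions above (the function name comes from the imports; the exact validation order and error messages are assumptions) is the classic one-pass hash-map approach:

```python
def two_sum(nums, target):
    # The tests expect TypeError for non-int elements or targets
    if not isinstance(target, int):
        raise TypeError("target must be an int")
    if any(not isinstance(n, int) for n in nums):
        raise TypeError("all elements must be ints")
    # The tests expect ValueError when the array contains duplicates
    if len(set(nums)) != len(nums):
        raise ValueError("array must not contain duplicates")
    seen = {}  # value -> index of that value
    for i, n in enumerate(nums):
        if target - n in seen:
            return [seen[target - n], i]
        seen[n] = i
    return []  # no pair sums to target
```

For example, `two_sum([1, 2, 3, 4, 5, 6], 10)` returns `[3, 5]` (indices of 4 and 6), matching the first assertion in the test class.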
68a56080aa70a06dff608b8f692bbf57e7db82e5 | 9,937 | py | Python | tests/unit/compile/test_registrable.py | ethan-asapp/flambe | 70257167058c7b82ee39f74167a6161bd264ad18 | [
"MIT"
] | 148 | 2019-08-29T21:19:03.000Z | 2022-03-18T06:13:53.000Z | tests/unit/compile/test_registrable.py | cle-ros/flambe | 0dc2f5b2b286694defe8abf450fe5be9ae12c097 | [
"MIT"
] | 108 | 2019-09-03T14:36:10.000Z | 2020-05-13T15:53:14.000Z | tests/unit/compile/test_registrable.py | cle-ros/flambe | 0dc2f5b2b286694defe8abf450fe5be9ae12c097 | [
"MIT"
] | 21 | 2019-09-08T14:09:45.000Z | 2020-12-27T04:12:33.000Z | import pytest
from flambe.compile import yaml, Registrable, alias, register, registrable_factory, \
    registration_context
from ruamel.yaml.compat import StringIO


@pytest.fixture
def make_classes():
    class A(Registrable):
        def __init__(self, akw1=0, akw2=None):
            self.akw1 = akw1
            self.akw2 = akw2

        @registrable_factory
        @classmethod
        def some_factory(cls, akw1=0, akw2=None):
            return cls(akw1, akw2)

        @classmethod
        def from_yaml(cls, constructor, node, factory_name):
            kwargs, = list(constructor.construct_yaml_map(node))
            if factory_name is not None:
                return getattr(cls, factory_name)(**kwargs)
            else:
                return cls(**kwargs)

        @classmethod
        def to_yaml(cls, representer, node, tag):
            return representer.represent_mapping(tag, {"akw1": node.akw1, "akw2": node.akw2})

    class B(Registrable):
        def __init__(self, bkw1=0, bkw2=''):
            self.bkw1 = bkw1
            self.bkw2 = bkw2

        @classmethod
        def from_yaml(cls, constructor, node, factory_name):
            kwargs, = list(constructor.construct_yaml_map(node))
            return cls(**kwargs)

        @classmethod
        def to_yaml(cls, representer, node, tag):
            return representer.represent_mapping(tag, {"bkw1": node.bkw1, "bkw2": node.bkw2})

    return A, B
@pytest.fixture
def make_namespace_classes():
    with registration_context("ns"):
        class A(Registrable):
            def __init__(self, akw1=0, akw2=None):
                self.akw1 = akw1
                self.akw2 = akw2

            @registrable_factory
            @classmethod
            def some_factory(cls, akw1=0, akw2=None):
                return cls(akw1, akw2)

            @classmethod
            def from_yaml(cls, constructor, node, factory_name):
                kwargs, = list(constructor.construct_yaml_map(node))
                if factory_name is not None:
                    return getattr(cls, factory_name)(**kwargs)
                else:
                    return cls(**kwargs)

            @classmethod
            def to_yaml(cls, representer, node, tag):
                return representer.represent_mapping(tag, {"akw1": node.akw1, "akw2": node.akw2})

        class B(Registrable):
            def __init__(self, bkw1=0, bkw2=''):
                self.bkw1 = bkw1
                self.bkw2 = bkw2

            @classmethod
            def from_yaml(cls, constructor, node, factory_name):
                kwargs, = list(constructor.construct_yaml_map(node))
                return cls(**kwargs)

            @classmethod
            def to_yaml(cls, representer, node, tag):
                return representer.represent_mapping(tag, {"bkw1": node.bkw1, "bkw2": node.bkw2})

    return A, B


@pytest.fixture
def make_aliased_classes():
    @alias('a_class')
    class A(Registrable):
        def __init__(self, akw1=0, akw2=None):
            self.akw1 = akw1
            self.akw2 = akw2

        @registrable_factory
        @classmethod
        def some_factory(cls, akw1=0, akw2=None):
            return cls(akw1, akw2)

        @classmethod
        def from_yaml(cls, constructor, node, factory_name):
            kwargs, = list(constructor.construct_yaml_map(node))
            if factory_name is not None:
                return getattr(cls, factory_name)(**kwargs)
            else:
                return cls(**kwargs)

        @classmethod
        def to_yaml(cls, representer, node, tag):
            return representer.represent_mapping(tag, {"akw1": node.akw1, "akw2": node.akw2})

    @alias('b_class')
    @alias('b_')
    class B(Registrable):
        def __init__(self, bkw1=0, bkw2=''):
            self.bkw1 = bkw1
            self.bkw2 = bkw2

        @classmethod
        def from_yaml(cls, constructor, node, factory_name):
            kwargs, = list(constructor.construct_yaml_map(node))
            return cls(**kwargs)

        @classmethod
        def to_yaml(cls, representer, node, tag):
            return representer.represent_mapping(tag, {"bkw1": node.bkw1, "bkw2": node.bkw2})

    return A, B


@pytest.fixture
def make_new_classes():
    class A:
        def __init__(self, akw1=0, akw2=None):
            self.akw1 = akw1
            self.akw2 = akw2

        @classmethod
        def from_yaml(cls, constructor, node, factory_name):
            kwargs, = list(constructor.construct_yaml_map(node))
            return cls(**kwargs)

        @classmethod
        def to_yaml(cls, representer, node, tag):
            return representer.represent_mapping(tag, {"akw1": node.akw1, "akw2": node.akw2})

    class B:
        def __init__(self, bkw1=0, bkw2=''):
            self.bkw1 = bkw1
            self.bkw2 = bkw2

        @classmethod
        def from_yaml(cls, constructor, node, factory_name):
            kwargs, = list(constructor.construct_yaml_map(node))
            return cls(**kwargs)

        @classmethod
        def to_yaml(cls, representer, node, tag):
            return representer.represent_mapping(tag, {"bkw1": node.bkw1, "bkw2": node.bkw2})

    register(A, 'a_class')
    register(B, 'b_class')
    return A, B
def test_registrable_load_basic(make_classes):
    A, B = make_classes
    txt = """a: !A
  akw1: 8
  akw2: !B
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    a = config['a']
    assert a.akw1 == 8
    assert a.akw2 is not None
    assert hasattr(a.akw2, "bkw1")
    assert a.akw2.bkw1 == 2


def test_registrable_load_context(make_namespace_classes):
    A, B = make_namespace_classes
    txt = """a: !ns.A
  akw1: 8
  akw2: !ns.B
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    a = config['a']
    assert a.akw1 == 8
    assert a.akw2 is not None
    assert hasattr(a.akw2, "bkw1")
    assert a.akw2.bkw1 == 2


def test_registrable_dump_basic(make_classes):
    A, B = make_classes
    txt = """!A
akw1: 8
akw2: !B
  bkw1: 2
  bkw2: hello world
"""
    b = B(2, "hello world")
    a = A(8, b)
    with StringIO() as s:
        yaml.dump(a, s)
        assert s.getvalue() == txt


def test_registrable_roundtrip(make_classes):
    A, B = make_classes
    txt = """a: !A
  akw1: 8
  akw2: !B
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    with StringIO() as s:
        yaml.dump(config, s)
        assert s.getvalue() == txt


def test_registrable_load_alias(make_aliased_classes):
    A, B = make_aliased_classes
    txt = """a: !a_class
  akw1: 8
  akw2: !b_class
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    a = config['a']
    assert a.akw1 == 8
    assert a.akw2 is not None
    assert hasattr(a.akw2, "bkw1")
    assert a.akw2.bkw1 == 2


def test_registrable_dump_alias(make_aliased_classes):
    A, B = make_aliased_classes
    txt = """!a_class
akw1: 8
akw2: !b_class
  bkw1: 2
  bkw2: hello world
"""
    b = B(2, "hello world")
    a = A(8, b)
    with StringIO() as s:
        yaml.dump(a, s)
        assert s.getvalue() == txt


def test_registrable_roundtrip_alias_default(make_aliased_classes):
    A, B = make_aliased_classes
    txt = """a: !a_class
  akw1: 8
  akw2: !b_
    bkw1: 2
    bkw2: hello world
"""
    txt_default_alias = """a: !a_class
  akw1: 8
  akw2: !b_
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    with StringIO() as s:
        yaml.dump(config, s)
        assert s.getvalue() == txt_default_alias


def test_registrable_load_new_class(make_new_classes):
    A, B = make_new_classes
    txt = """a: !a_class
  akw1: 8
  akw2: !b_class
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    a = config['a']
    assert a.akw1 == 8
    assert a.akw2 is not None
    assert hasattr(a.akw2, "bkw1")
    assert a.akw2.bkw1 == 2


def test_registrable_dump_new_class(make_new_classes):
    A, B = make_new_classes
    txt = """!a_class
akw1: 8
akw2: !b_class
  bkw1: 2
  bkw2: hello world
"""
    b = B(2, "hello world")
    a = A(8, b)
    with StringIO() as s:
        yaml.dump(a, s)
        assert s.getvalue() == txt


def test_registrable_roundtrip_new_default(make_new_classes):
    A, B = make_new_classes
    txt = """a: !a_class
  akw1: 8
  akw2: !b_
    bkw1: 2
    bkw2: hello world
"""
    txt_default_alias = """a: !a_class
  akw1: 8
  akw2: !b_
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    with StringIO() as s:
        yaml.dump(config, s)
        assert s.getvalue() == txt_default_alias


def test_registrable_factory(make_classes):
    A, B = make_classes
    txt = """a: !A.some_factory
  akw1: 8
  akw2: !B
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    a = config['a']
    assert a.akw1 == 8
    assert a.akw2 is not None
    assert hasattr(a.akw2, "bkw1")
    assert a.akw2.bkw1 == 2


def test_registrable_factory_roundtrip(make_classes):
    A, B = make_classes
    txt = """a: !A.some_factory
  akw1: 8
  akw2: !B
    bkw1: 2
    bkw2: hello world
"""
    txt_default_alias = """a: !A.some_factory
  akw1: 8
  akw2: !B
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    with StringIO() as s:
        yaml.dump(config, s)
        assert s.getvalue() == txt_default_alias


def test_registrable_factory_roundtrip_alias(make_aliased_classes):
    A, B = make_aliased_classes
    txt = """a: !a_class.some_factory
  akw1: 8
  akw2: !b_
    bkw1: 2
    bkw2: hello world
"""
    txt_default_alias = """a: !a_class.some_factory
  akw1: 8
  akw2: !b_
    bkw1: 2
    bkw2: hello world
"""
    config = yaml.load(txt)
    a = config['a']
    assert a.akw1 == 8
    assert a.akw2 is not None
    assert hasattr(a.akw2, "bkw1")
    assert a.akw2.bkw1 == 2
    assert isinstance(a, A)
    with StringIO() as s:
        yaml.dump(config, s)
        assert s.getvalue() == txt_default_alias
| 23.603325 | 97 | 0.587401 | 1,295 | 9,937 | 4.328185 | 0.057143 | 0.020517 | 0.027297 | 0.042462 | 0.921677 | 0.918822 | 0.918822 | 0.918822 | 0.915611 | 0.90901 | 0 | 0.037058 | 0.296669 | 9,937 | 420 | 98 | 23.659524 | 0.764916 | 0 | 0 | 0.887879 | 0 | 0 | 0.133239 | 0.004227 | 0 | 0 | 0 | 0 | 0.1 | 1 | 0.133333 | false | 0 | 0.009091 | 0.033333 | 0.245455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d7c5b9df4cd2543c928a78032278eb3ddf1322c4 | 2,158 | py | Python | defuzzify.py | siyanapavlova/FuzzyLogic_Assignment | 75bd457276c2b6b8120dbb96c002c6aa840b2f36 | [
"MIT"
] | null | null | null | defuzzify.py | siyanapavlova/FuzzyLogic_Assignment | 75bd457276c2b6b8120dbb96c002c6aa840b2f36 | [
"MIT"
] | null | null | null | defuzzify.py | siyanapavlova/FuzzyLogic_Assignment | 75bd457276c2b6b8120dbb96c002c6aa840b2f36 | [
"MIT"
] | null | null | null | from inference import infer


def defuzzifyCoG(variables, ans):
    '''Defuzzify using the Centre of Gravity method
    Parameters: variables, ans
    Output: the defuzzified value
    '''
    finalAns = 0
    for var in ans:
        denominator = 0
        nominator = 0
        for description in ans[var]:
            if ans[var][description] != 0:
                area = (-2*variables[var][description]['a'] + variables[var][description]['alfa'] + 2*variables[var][description]['b'] + variables[var][description]['beta'])/2.0
                missing = ((1-ans[var][description])*((1 - ans[var][description])*100 + variables[var][description]['b'] - variables[var][description]['a']))/2.0
                missing = (1-ans[var][description])*((1-ans[var][description])*(variables[var][description]['b'] + variables[var][description]['beta'] - variables[var][description]['a'] + variables[var][description]['alfa']))/2
                area -= missing
                denominator += area
                nominator += (area*(variables[var][description]['a'] - variables[var][description]['alfa'] + variables[var][description]['b'] + variables[var][description]['beta']))/2
        if denominator == 0:
            return 0
        finalAns = nominator/denominator
    return finalAns


def defuzzifyDoA(variables, ans):
    '''Defuzzify using the Dilation of the Aggregate method
    Parameters: variables, ans
    Output: the defuzzified value
    '''
    finalAns = 0
    for var in ans:
        denominator = 0
        nominator = 0
        for description in ans[var]:
            if ans[var][description] != 0:
                area = ((-variables[var][description]['a'] + variables[var][description]['alfa'] + variables[var][description]['b'] + variables[var][description]['beta'])*ans[var][description])/2
                denominator += area
                nominator += (area*(variables[var][description]['a'] - variables[var][description]['alfa'] + variables[var][description]['b'] + variables[var][description]['beta']))/2
        if denominator == 0:
            return 0
        finalAns = nominator/denominator
    return finalAns
| 50.186047 | 228 | 0.601483 | 233 | 2,158 | 5.570815 | 0.175966 | 0.312789 | 0.389831 | 0.11094 | 0.903698 | 0.859014 | 0.859014 | 0.822804 | 0.783513 | 0.664099 | 0 | 0.017661 | 0.23911 | 2,158 | 42 | 229 | 51.380952 | 0.772838 | 0.097776 | 0 | 0.75 | 0 | 0 | 0.027837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.03125 | 0 | 0.21875 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
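The defuzzify functions above expect `variables[var][description]` to hold trapezoidal term parameters (`a`/`b` for the flat top, `alfa`/`beta` for the slopes) and `ans[var][description]` to hold a membership degree. Since `inference.infer` is not available in this chunk, the sketch below inlines the Centre-of-Gravity logic (simplified to one term lookup, keeping only the second `missing` assignment since the first is overwritten) and feeds it a hypothetical single-term variable:

```python
def defuzzifyCoG(variables, ans):
    '''Centre of Gravity defuzzification (logic from defuzzify.py above).'''
    finalAns = 0
    for var in ans:
        denominator = 0
        nominator = 0
        for description in ans[var]:
            mu = ans[var][description]  # membership degree of this term
            if mu != 0:
                term = variables[var][description]  # trapezoid parameters
                area = (-2*term['a'] + term['alfa'] + 2*term['b'] + term['beta'])/2.0
                # area of the trapezoid top cut off at membership mu
                missing = (1-mu)*((1-mu)*(term['b'] + term['beta'] - term['a'] + term['alfa']))/2
                area -= missing
                denominator += area
                nominator += (area*(term['a'] - term['alfa'] + term['b'] + term['beta']))/2
        if denominator == 0:
            return 0
        finalAns = nominator/denominator
    return finalAns


# Hypothetical single-term fuzzy variable (names are illustrative only)
variables = {"power": {"low": {"a": 20, "b": 40, "alfa": 10, "beta": 10}}}
ans = {"power": {"low": 1.0}}  # full membership in "low"
print(defuzzifyCoG(variables, ans))  # -> 30.0, the trapezoid's centroid
```

With full membership the `missing` correction vanishes and the result is the centroid of the symmetric trapezoid, which falls at the midpoint of its support.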
cc0449febfc31b2850b77539e1a508b8c8f67fe5 | 58,888 | py | Python | junk/color_test.py | fyp-sb3-mxj3/d50826df94911c26dc5e2f3db6f2fea2 | 32ab69ab1ce99d626991559213ade95816a3af52 | [
"WTFPL"
] | 3 | 2019-03-13T07:48:00.000Z | 2019-06-02T14:35:27.000Z | junk/color_test.py | zhuixunforever/d50826df94911c26dc5e2f3db6f2fea2 | 32ab69ab1ce99d626991559213ade95816a3af52 | [
"WTFPL"
] | 2 | 2019-04-07T10:08:09.000Z | 2019-04-17T05:42:07.000Z | junk/color_test.py | zhuixunforever/d50826df94911c26dc5e2f3db6f2fea2 | 32ab69ab1ce99d626991559213ade95816a3af52 | [
"WTFPL"
] | 3 | 2019-03-21T06:32:08.000Z | 2019-06-02T14:35:40.000Z | import cv2
import numpy as np
from time import time
from math import sqrt
from scipy import interpolate
def color_gradient_v4(img, edges_x):
# new_img = np.zeros_like(img, dtype=np.float32)
new_img = img.copy().astype(np.float32)
color = np.mean(img[np.argwhere(edges_x[:, 0] > 0), img.shape[1] // 2, :], axis=0)
for _y, (_x0, _x1) in enumerate(edges_x):
if _x0 != 0 and _x1 != img.shape[1]:
# appends edges
new_img[_y, -1, :] = new_img[_y, 0, :] = color
# left edge
length = _x0
left_color = new_img[_y, 0, :]
right_color = new_img[_y, _x0, :]
new_img[_y, :_x0, 0] = np.fromfunction(
lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
new_img[_y, :_x0, 1] = np.fromfunction(
lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
new_img[_y, :_x0, 2] = np.fromfunction(
lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
# right edge
length = img.shape[1] - _x1
left_color = new_img[_y, _x1, :]
right_color = new_img[_y, -1, :]
new_img[_y, _x1:, 0] = np.fromfunction(
lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
new_img[_y, _x1:, 1] = np.fromfunction(
lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
new_img[_y, _x1:, 2] = np.fromfunction(
lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
# internal
length = _x1 - _x0
left_color = new_img[_y, _x0, :]
right_color = new_img[_y, _x1, :]
new_img[_y, _x0:_x1, 0] = np.fromfunction(
lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
new_img[_y, _x0:_x1, 1] = np.fromfunction(
lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
new_img[_y, _x0:_x1, 2] = np.fromfunction(
lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
new_img = new_img.round().clip(0, 255).astype(np.uint8)
return new_img
def color_gradient_v3(img, pts, x_range=None, y_range=None, x_loc=None, y_loc=None):
    packed = []
    for i, (_y, _x) in enumerate(pts):
        color = img[_y, _x]
        loc = (_y, _x)
        scale = np.max([dist((_y, _x), (0, 0)),
                        dist((_y, _x), (img.shape[0] - 1, 0)),
                        dist((_y, _x), (0, img.shape[1] - 1)),
                        dist((_y, _x), (img.shape[0] - 1, img.shape[1] - 1))])
        packed.append((color, loc, scale))
    new_img = np.zeros(img.shape)
    if y_range is not None:  # y_range not None
        for _y in y_range:
            if x_range is not None:
                for _x in x_range:
                    for color, loc, scale in packed:
                        new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
            elif x_loc is not None:
                _x = x_loc
                for color, loc, scale in packed:
                    new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
            else:  # y_range not None, x_range and x_loc are not provided
                raise ValueError("Values are None")
    elif x_range is not None:  # x_range not None
        for _x in x_range:
            if y_loc is not None:
                _y = y_loc
                for color, loc, scale in packed:
                    new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
            else:  # x_range not None, y_range and y_loc not provided
                raise ValueError("Values are None")
    else:  # x_range y_range are None
        if x_loc is not None and y_loc is not None:
            _y, _x = y_loc, x_loc
            for color, loc, scale in packed:
                new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
        else:
            raise ValueError("Values are None")
    np.clip(new_img, 0, 255, new_img)
    return new_img.round().astype(np.uint8)


def color_gradient_v2(img, pts, x_range=None, y_range=None, x_loc=None, y_loc=None, mode="a"):
    packed = []
    if mode == 'r':
        dists = np.zeros((len(pts),) * 2)
        for i in range(len(pts)):
            for j in range(len(pts)):
                dists[i, j] = dist(pts[i], pts[j])
        for i, (_y, _x) in enumerate(pts):
            color = img[_y, _x]
            loc = (_y, _x)
            scale = np.max(dists[i])
            packed.append((color, loc, scale))
    elif mode == 'a':
        for i, (_y, _x) in enumerate(pts):
            color = img[_y, _x]
            loc = (_y, _x)
            scale = np.max([dist((_y, _x), (0, 0)),
                            dist((_y, _x), (img.shape[0] - 1, 0)),
                            dist((_y, _x), (0, img.shape[1] - 1)),
                            dist((_y, _x), (img.shape[0] - 1, img.shape[1] - 1))])
            packed.append((color, loc, scale))
    else:
        raise ValueError("Wrong Mode, should be, 'r': relative; 'a': absolute")
    new_img = np.zeros(img.shape)
    if y_range is not None:  # y_range not None
        for _y in y_range:
            if x_range is not None:
                for _x in x_range:
                    for color, loc, scale in packed:
                        new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
            elif x_loc is not None:
                _x = x_loc
                for color, loc, scale in packed:
                    new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
            else:  # y_range not None, x_range and x_loc are not provided
                raise ValueError("Values are None")
    elif x_range is not None:  # x_range not None
        for _x in x_range:
            if y_loc is not None:
                _y = y_loc
                for color, loc, scale in packed:
                    new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
            else:  # x_range not None, y_range and y_loc not provided
                raise ValueError("Values are None")
    else:  # x_range y_range are None
        if x_loc is not None and y_loc is not None:
            _y, _x = y_loc, x_loc
            for color, loc, scale in packed:
                new_img[_y, _x, :] += color * (1 - dist(loc, (_y, _x)) / scale)
        else:
            raise ValueError("Values are None")
    np.clip(new_img, 0, 255, new_img)
    return new_img.round().astype(np.uint8)
def color_gradient(img, pts, mode="r"):
    packed = []
    if mode == 'r':
        dists = np.zeros((len(pts),) * 2)
        for i in range(len(pts)):
            for j in range(len(pts)):
                dists[i, j] = dist(pts[i], pts[j])
        for i, (y, x) in enumerate(pts):
            color = img[y, x]
            loc = (y, x)
            scale = np.max(dists[i])
            packed.append((color, loc, scale))
    elif mode == 'a':
        for i, (y, x) in enumerate(pts):
            color = img[y, x]
            loc = (y, x)
            scale = np.max([dist((y, x), (0, 0)),
                            dist((y, x), (img.shape[0] - 1, 0)),
                            dist((y, x), (0, img.shape[1] - 1)),
                            dist((y, x), (img.shape[0] - 1, img.shape[1] - 1))])
            packed.append((color, loc, scale))
    else:
        raise ValueError("Wrong Mode, should be, 'r': relative; 'a': absolute")
    new_img = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            for color, loc, scale in packed:
                new_img[i, j, :] += color * (1 - dist(loc, (i, j)) / scale)
    np.clip(new_img, 0, 255, new_img)
    return new_img.round().astype(np.uint8)


def dist(pt1, pt2):
    # return np.linalg.norm(pt2-pt1)
    # return ((pt1[0] - pt2[0]) ** 2 + (pt1[1] - pt2[1]) ** 2) ** 0.5
    return sqrt((pt1[0] - pt2[0]) ** 2 + (pt1[1] - pt2[1]) ** 2)


def display(img, name="Img", time=0, encode="BGR"):
    if not isinstance(img, np.ndarray):
        img = cv2.imread(img)
        encode = "BGR"
    if img.ndim == 3:
        img = img[..., [encode.find('B'), encode.find('G'), encode.find('R')]]  # to BGR
    cv2.imshow(name, img)
    cv2.waitKey(time)
    cv2.destroyAllWindows()


def color_gradient_demo1():
    MAP_SIZE = (512, 512, 3)  # y,x,c
    initial_color = [{'loc': (MAP_SIZE[0] // 2, 0), 'color': (200, 190, 120)},
                     {'loc': (MAP_SIZE[0] // 2, MAP_SIZE[1] - 1), 'color': (191, 90, 17)},
                     {'loc': (MAP_SIZE[0] // 2, MAP_SIZE[0] // 2), 'color': (255, 255, 255)}]
    img = np.zeros(MAP_SIZE, dtype=np.uint8)
    if len(initial_color) > 1:
        for initial in initial_color:
            l, c = initial.values()
            img[l[0], l[1], :] = np.array(c)
    list_ref_pts = [init_c['loc'] for init_c in initial_color]
    display(img, encode="RGB")
    # constructed an img with initialized color
    # construct new img
    new_img = color_gradient_v2(img, list_ref_pts, x_range=range(0, img.shape[1]), y_range=range(0, img.shape[0]))
    display(new_img, encode="RGB")
    return
def edge_detection(img, th=10):
    """
    detect the edge
    :param img: source image
    :param th: threshold to decide what is black
    :return: grad, edge_along_y, edge_along_x
    """
    src = img.copy()
    gray = cv2.cvtColor(src, cv2.COLOR_BGR2GRAY)
    grad_x = cv2.convertScaleAbs(
        cv2.Sobel(gray, cv2.CV_16S, 1, 0, ksize=1, scale=1, delta=0, borderType=cv2.BORDER_DEFAULT))
    grad_y = cv2.convertScaleAbs(
        cv2.Sobel(gray, cv2.CV_16S, 0, 1, ksize=1, scale=1, delta=0, borderType=cv2.BORDER_DEFAULT))
    grad = cv2.addWeighted(grad_x, 0.5, grad_y, 0.5, 0)
    display(grad, "output")
    edge_along_y = np.zeros((img.shape[0], 2), dtype=np.uint16)
    edge_along_x = np.zeros((img.shape[1], 2), dtype=np.uint16)
    for _y in range(edge_along_y.shape[0]):
        temp = np.argwhere(grad_x[_y, :] > th)
        if np.any(temp):
            edge_along_y[_y, 0] = np.min(temp) + 1
            edge_along_y[_y, 1] = np.max(temp) - 1
    for _x in range(edge_along_x.shape[0]):
        temp = np.argwhere(grad_x[:, _x] > th)
        if np.any(temp):
            edge_along_x[_x, 0] = np.min(temp) + 1
            edge_along_x[_x, 1] = np.max(temp) - 1
    return grad, edge_along_y, edge_along_x


def is_black(color: np.array, th=10):
    return np.sum(color) <= th


def find_ref_pts(img, pt, k=3, d=2, valid=is_black):
    """
    find the reference point of the image where it is not black
    :param img: the img
    :param pt: the original point
    :param k: number of points
    :param d: distance
    :param valid: check for ref pt validity
    :return: a list of pt
    """
    pts = []
    m = 1
    while len(pts) < k:
        x_pts = [pt[1] - m * d] * 3 + [pt[1]] * 3 + [pt[1] + m * d] * 3
        y_pts = [pt[0] - m * d, pt[0], pt[0] + m * d] * 3
        for _x, _y in zip(x_pts, y_pts):
            print(_x, _y)
            if 0 <= _x < img.shape[1] and 0 <= _y < img.shape[0] and (_y, _x) not in pts:
                if valid(img[_y, _x, :]):
                    pts.append((_y, _x))
        m += 1  # widen the search ring; without this the loop can never terminate
    return pts


def make_img(color, loc, scale, size=(256, 256)):
    new_img = np.zeros((size[0], size[1], 3), dtype=np.float32)
    new_img[:, :, 0] = np.fromfunction(
        lambda i, j: color[0] * (1 - (np.sqrt((i - loc[0]) ** 2 + (j - loc[1]) ** 2)) / scale), size)
    new_img[:, :, 1] = np.fromfunction(
        lambda i, j: color[1] * (1 - (np.sqrt((i - loc[0]) ** 2 + (j - loc[1]) ** 2)) / scale), size)
    new_img[:, :, 2] = np.fromfunction(
        lambda i, j: color[2] * (1 - (np.sqrt((i - loc[0]) ** 2 + (j - loc[1]) ** 2)) / scale), size)
    # new_img = np.transpose(np.stack((img0, img1, img2)), (1, 2, 0))
    return new_img
def image_expansion(img, internal=False):
    # new_img = np.zeros_like(img, dtype=np.float32)
    new_img = img.copy().astype(np.float32)
    _, edges_along_y, edges_along_x = edge_detection(img, th=5)
    color = np.mean(img[np.argwhere(edges_along_y[:, 0] > 0), img.shape[1] // 2, :], axis=0)
    for _y, (_x0, _x1) in enumerate(edges_along_y):
        if _x0 != 0 and _x1 != img.shape[1]:
            # appends edges
            new_img[_y, -1, :] = new_img[_y, 0, :] = color
            # left edge
            length = _x0
            left_color = new_img[_y, 0, :]
            right_color = new_img[_y, _x0, :]
            new_img[_y, :_x0, 0] = np.fromfunction(
                lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
            new_img[_y, :_x0, 1] = np.fromfunction(
                lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
            new_img[_y, :_x0, 2] = np.fromfunction(
                lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
            # right edge
            length = img.shape[1] - _x1
            left_color = new_img[_y, _x1, :]
            right_color = new_img[_y, -1, :]
            new_img[_y, _x1:, 0] = np.fromfunction(
                lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
            new_img[_y, _x1:, 1] = np.fromfunction(
                lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
            new_img[_y, _x1:, 2] = np.fromfunction(
                lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
            # internal
            if internal:
                length = _x1 - _x0
                left_color = new_img[_y, _x0, :]
                right_color = new_img[_y, _x1, :]
                new_img[_y, _x0:_x1, 0] = np.fromfunction(
                    lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
                new_img[_y, _x0:_x1, 1] = np.fromfunction(
                    lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
                new_img[_y, _x0:_x1, 2] = np.fromfunction(
                    lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
    # end of x padding
    # _, _, edges_along_x = edge_detection(new_img, th=5)
    color = np.mean(img[img.shape[0] // 2, np.argwhere(edges_along_x[:, 0] > 0), :], axis=0)
    # for _x, (_y0, _y1) in enumerate(edges_along_x):
    _y0 = np.argwhere(edges_along_y[:, 0] > 0).min()
    _y1 = np.argwhere(edges_along_y[:, 0] > 0).max()
    for _x in range(img.shape[1]):
        if _y0 != 0 and _y1 != img.shape[0]:
            # appends edges
            new_img[0, _x, :] = new_img[-1, _x, :] = color
            # left edge
            length = _y0
            left_color = new_img[0, _x, :]
            right_color = new_img[_y0, _x, :]
            new_img[:_y0, _x, 0] = np.fromfunction(
                lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
            new_img[:_y0, _x, 1] = np.fromfunction(
                lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
            new_img[:_y0, _x, 2] = np.fromfunction(
                lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
            # right edge
            length = img.shape[0] - _y1
            left_color = new_img[_y1, _x, :]
            right_color = new_img[-1, _x, :]
            new_img[_y1:, _x, 0] = np.fromfunction(
                lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
            new_img[_y1:, _x, 1] = np.fromfunction(
                lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
            new_img[_y1:, _x, 2] = np.fromfunction(
                lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
            # # internal
            # length = _y1 - _y0
            # left_color = new_img[_y0, _x, :]
            # right_color = new_img[_y1, _x, :]
            # new_img[_y0:_y1, _x, 0] = np.fromfunction(
            #     lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
            # new_img[_y0:_y1, _x, 1] = np.fromfunction(
            #     lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
            # new_img[_y0:_y1, _x, 2] = np.fromfunction(
            #     lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
    # end of y padding
    new_img = new_img.round().clip(0, 255).astype(np.uint8)
    return new_img


def color_grad_2_pts_x(img, x0, x1, y, left_color, right_color):
    length = x1 - x0
    img[y, x0:x1, 0] = np.fromfunction(
        lambda x: left_color[0] * (1 - x / length) + right_color[0] * (x / length), (length,))
    img[y, x0:x1, 1] = np.fromfunction(
        lambda x: left_color[1] * (1 - x / length) + right_color[1] * (x / length), (length,))
    img[y, x0:x1, 2] = np.fromfunction(
        lambda x: left_color[2] * (1 - x / length) + right_color[2] * (x / length), (length,))
    return img


def color_grad_2_pts_y(img, y0, y1, x, left_color, right_color):
    length = y1 - y0
    img[y0:y1, x, 0] = np.fromfunction(
        lambda _x: left_color[0] * (1 - _x / length) + right_color[0] * (_x / length), (length,))
    img[y0:y1, x, 1] = np.fromfunction(
        lambda _x: left_color[1] * (1 - _x / length) + right_color[1] * (_x / length), (length,))
    img[y0:y1, x, 2] = np.fromfunction(
        lambda _x: left_color[2] * (1 - _x / length) + right_color[2] * (_x / length), (length,))
    return img
def image_expansion_v2(img, internal=False):
img = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float64)
img[:, :, 2] *= 1.5
avg_color = img[img.sum(-1) > 0].mean(0)
maskHSV = cv2.inRange(img, avg_color - np.array([10, 40, 25]), avg_color + np.array([10, 100, 50]))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
resultHSV = cv2.bitwise_and(img, img, mask=maskHSV)
new_img_x = resultHSV.copy().astype(np.float32)
img = resultHSV
for r in range(img.shape[0]):
t = np.argwhere(img[r].sum(-1) > 0).flatten()
if t.size > 0:
left_edge = np.min(t) + 5
right_edge = np.max(t) - 5
while img[r, left_edge].sum(-1) <= 5:
left_edge -= 1
while img[r, right_edge].sum(-1) <= 5:
right_edge += 1
# left edge
new_img_x = color_grad_2_pts_x(new_img_x, x0=0, x1=left_edge, y=r,
left_color=img[r, left_edge] * 0.5 + avg_color * 0.5,
right_color=img[r, left_edge])
# right edge
new_img_x = color_grad_2_pts_x(new_img_x, x0=right_edge, x1=img.shape[1], y=r,
left_color=img[r, right_edge, :],
right_color=img[r, right_edge, :] * 0.5 + avg_color * 0.5)
# internal
if internal:
left_edge = np.min(t)
right_edge = np.max(t)
while img[r, left_edge].sum(-1) <= 5:
left_edge += 1
while img[r, right_edge].sum(-1) <= 5:
right_edge -= 1
new_img_x = color_grad_2_pts_x(new_img_x, x0=left_edge, x1=right_edge, y=r,
left_color=new_img_x[r, left_edge],
right_color=new_img_x[r, right_edge])
new_img_y = new_img_x.copy().astype(np.float32)
for c in range(new_img_y.shape[1]):
t = np.argwhere(new_img_y[:, c, :].sum(-1) > 0).flatten()
if t.size > 0:
left_edge = np.min(t) + 5
right_edge = np.max(t) - 5
while new_img_y[left_edge, c].sum(-1) <= 5:
left_edge -= 1
while new_img_y[right_edge, c].sum(-1) <= 5:
right_edge += 1
# left edge
new_img_y = color_grad_2_pts_y(new_img_y, y0=0, y1=left_edge, x=c,
left_color=new_img_y[left_edge, c] * 0.5 + avg_color * 0.5,
right_color=new_img_y[left_edge, c])
new_img_x = color_grad_2_pts_y(new_img_x, y0=0, y1=left_edge, x=c,
left_color=new_img_y[left_edge, c] * 0.5 + avg_color * 0.5,
right_color=new_img_y[left_edge, c])
# right edge
new_img_y = color_grad_2_pts_y(new_img_y, y0=right_edge, y1=img.shape[0], x=c,
left_color=new_img_y[right_edge, c, :],
right_color=new_img_y[right_edge, c, :] * 0.5 + avg_color * 0.5)
new_img_x = color_grad_2_pts_y(new_img_x, y0=right_edge, y1=img.shape[0], x=c,
left_color=new_img_y[right_edge, c, :],
right_color=new_img_y[right_edge, c, :] * 0.5 + avg_color * 0.5)
if internal:
left_edge = np.min(t) - 5
right_edge = np.max(t) + 5
while new_img_y[left_edge, c].sum(-1) <= 5:
left_edge += 1
while new_img_y[right_edge, c].sum(-1) <= 5:
right_edge -= 1
new_img_y = color_grad_2_pts_y(new_img_y, y0=left_edge, y1=right_edge, x=c,
left_color=new_img_y[left_edge, c, :],
right_color=new_img_y[right_edge, c, :] * 0.5 + avg_color * 0.5)
img_recover = cv2.addWeighted(new_img_x, 0.5, new_img_y, 0.5, 0)  # blended x/y result; computed but currently unused
return new_img_x
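# The per-row "mask closing" used near the top of image_expansion_v2 (setting
# everything between the first and last nonzero pixel of each row to 255) can be
# isolated and sanity-checked on a toy mask. `close_rows` is a hypothetical
# helper name for illustration:

```python
import numpy as np

def close_rows(mask):
    """Fill each row of a binary mask between its first and last nonzero
    column, mirroring the loop over maskHSV rows in image_expansion_v2."""
    out = mask.copy()
    for i in range(out.shape[0]):
        t = out[i].nonzero()[0]
        if t.size > 1:
            # The slice t[0]:t[-1] excludes t[-1], which is already nonzero.
            out[i, t[0]:t[-1]] = 255
    return out

m = np.zeros((2, 6), dtype=np.uint8)
m[0, [1, 4]] = 255          # a gap between columns 1 and 4
closed = close_rows(m)
```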
def image_expansion_v3(img, internal=False):
img = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float64)
img[:, :, 2] *= 1.5
avg_color = img[img.sum(-1) > 0].mean(0)
maskHSV = cv2.inRange(img, avg_color - np.array([10, 40, 25]), avg_color + np.array([10, 100, 50]))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
resultHSV = cv2.bitwise_and(img, img, mask=maskHSV)
new_img_x = resultHSV.copy().astype(np.float32)
img = resultHSV
for r in range(img.shape[0]):
t = np.argwhere(img[r].sum(-1) > 0).flatten()
if t.size > 0:
left_edge = np.min(t) + 5
right_edge = np.max(t) - 5
while img[r, left_edge].sum(-1) <= 5:
left_edge -= 1
while img[r, right_edge].sum(-1) <= 5:
right_edge += 1
# left edge
new_img_x = color_grad_2_pts_x(new_img_x, x0=0, x1=left_edge, y=r,
left_color=img[r, left_edge] * 0.5 + avg_color * 0.5,
right_color=img[r, left_edge])
# right edge
new_img_x = color_grad_2_pts_x(new_img_x, x0=right_edge, x1=img.shape[1], y=r,
left_color=img[r, right_edge, :],
right_color=img[r, right_edge, :] * 0.5 + avg_color * 0.5)
# internal
if internal:
left_edge = np.min(t)
right_edge = np.max(t)
while img[r, left_edge].sum(-1) <= 5:
left_edge += 1
while img[r, right_edge].sum(-1) <= 5:
right_edge -= 1
new_img_x = color_grad_2_pts_x(new_img_x, x0=left_edge, x1=right_edge, y=r,
left_color=new_img_x[r, left_edge],
right_color=new_img_x[r, right_edge])
# v3 differs from v2 only in that the vertical (y-axis) pass and the final
# addWeighted blend are disabled; only the horizontal pass above is applied.
return new_img_x
def main_2():
start_time = time()
img_BGR = cv2.imread(r'Data\mask\0test.png')
img_internal = image_expansion_v2(img_BGR, True).round().clip(0, 255).astype(np.uint8)
img_internal = cv2.cvtColor(img_internal, cv2.COLOR_HSV2BGR)
img_external = image_expansion_v2(img_BGR, False).round().clip(0, 255).astype(np.uint8)
img_external = cv2.cvtColor(img_external, cv2.COLOR_HSV2BGR)
display(np.concatenate((img_BGR, img_external, img_internal), axis=1))
print("Time Elapsed {:.2f}".format(time() - start_time))
def gen_checker(fname1, fname2, shape1=(256, 256, 3), shape2=(512, 512, 3)):
img_1 = np.zeros(shape1, np.uint8)
img_2 = np.zeros(shape2, np.uint8)
off_set = shape2[0] * 3 // 5 - shape1[0] // 2
for y in range(shape1[0]):
color = np.random.randint(0, 256, 3, np.uint8)  # upper bound is exclusive
img_1[y, :, :] = color
img_2[y + off_set, :, :] = color
cv2.imwrite(fname1, img_1)
cv2.imwrite(fname2, img_2)
return
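# gen_checker paints each row of the small image with one random colour and
# copies the same stripe, vertically offset, into the larger image. A tiny
# reproducible sketch of that stripe duplication (the offset value here is an
# arbitrary stand-in for shape2[0] * 3 // 5 - shape1[0] // 2):

```python
import numpy as np

rng = np.random.default_rng(0)
small = np.zeros((4, 4, 3), np.uint8)
big = np.zeros((8, 8, 3), np.uint8)
offset = 2  # hypothetical offset for this toy example
for y in range(small.shape[0]):
    color = rng.integers(0, 256, 3, dtype=np.uint8)
    small[y, :, :] = color        # every pixel in row y shares one colour
    big[y + offset, :, :] = color  # same stripe, shifted down in the big image
```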
def main_v3():
start_time = time()
img_BGR = cv2.imread(r'Data\mask\0test.png')
img = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img[np.logical_and(img.sum(-1) > 10, img.sum(-1) < 700)].mean(0)
print(avg_color)
maskHSV = cv2.inRange(img, np.array([0, 0, 0], dtype=np.float64),
avg_color + np.array([10, 50, 50], dtype=np.float64))
# for i in range(maskHSV.shape[0]):
# t = maskHSV[i].nonzero()[0].flatten()
# if t.size > 1:
# maskHSV[i, t[0]:t[-1]] = 255
resultHSV = cv2.bitwise_and(img, img, mask=maskHSV)
# print(resultHSV)
# img_append = image_expansion_v3(img_BGR, True).round().clip(0, 255).astype(np.uint8)
# img_append = cv2.cvtColor(img_append, cv2.COLOR_HSV2BGR)
# img_intern = image_expansion_v3(img_BGR, False).round().clip(0, 255).astype(np.uint8)
# img_intern = cv2.cvtColor(img_intern, cv2.COLOR_HSV2BGR)
resultHSV = cv2.cvtColor(resultHSV.astype(np.uint8), cv2.COLOR_HSV2BGR)
display(np.concatenate((img_BGR, resultHSV), axis=1))
print("Time Elapsed {:.2f}".format(time() - start_time))
def difference(n):
"""
:param n: np.array of shape 5,,x
:return: fd,sd
"""
assert n.ndim == 2
length = n.shape[0]
depth = n.shape[1]
assert length > 2
fd = np.zeros((length - 1, depth), dtype=np.float64)
sd = np.zeros((length - 2, depth), dtype=np.float64)
for i in range(length - 1):
fd[i, :] = n[i + 1, :] - n[i, :]
for i in range(length - 2):
sd[i, :] = fd[i + 1, :] - fd[i, :]
return fd, sd
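# The loops in difference() compute first and second forward differences
# column-wise; np.diff along axis 0 produces the same arrays without explicit
# loops, which is worth knowing as a vectorised alternative:

```python
import numpy as np

n = np.array([[1.0, 2.0],
              [4.0, 6.0],
              [9.0, 12.0],
              [16.0, 20.0]])
fd = np.diff(n, axis=0)      # first differences, shape (3, 2)
sd = np.diff(n, 2, axis=0)   # second differences, shape (2, 2)
```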
def image_expansion_v4(img, avg_color):
new_img_x = img.copy().astype(np.float32)
left_edge = np.zeros(img.shape[0], dtype=np.uint32)
right_edge = np.full(img.shape[0], img.shape[1], dtype=np.uint32)
for _y in range(img.shape[0]):
t = np.argwhere(img[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
k = 1
kind = "slinear"
x_fit = np.concatenate(([0], np.arange(left_edge[_y], left_edge[_y] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img_x[_y, left_edge[_y]:left_edge[_y] + k, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(([new_img_x.shape[1]], np.arange(right_edge[_y] - k, right_edge[_y])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img_x[_y, right_edge[_y] - k: right_edge[_y], :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img_x[_y, :left_edge[_y]] = fl(np.arange(left_edge[_y])).clip(0, 255)
new_img_x[_y, right_edge[_y]:] = fr(np.arange(right_edge[_y], new_img_x.shape[1])).clip(0, 255)
for _y in range(1, img.shape[0] - 1):  # the update reads rows _y - 1 and _y + 1
for _x in reversed(range(0, left_edge[_y])):
new_img_x[_y, _x] = 0.33 * new_img_x[_y - 1, _x] + 0.34 * new_img_x[_y, _x + 1] + 0.33 * new_img_x[
_y + 1, _x]
for _x in range(right_edge[_y], new_img_x.shape[1]):
new_img_x[_y, _x] = 0.33 * new_img_x[_y - 1, _x] + 0.34 * new_img_x[_y, _x - 1] + 0.33 * new_img_x[
_y + 1, _x]
return new_img_x
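# image_expansion_v4 relies on scipy's interp1d with kind="slinear" and
# fill_value="extrapolate"; outside its sample points that reduces to
# evaluating the straight line through the two nearest knots. A pure-NumPy
# sketch of that two-point extrapolation (the helper name is ours):

```python
import numpy as np

def linear_extrapolate(x0, y0, x1, y1, xq):
    """Evaluate the line through (x0, y0) and (x1, y1) at xq -- what a
    first-order spline with extrapolation does beyond its boundary knots."""
    slope = (y1 - y0) / (x1 - x0)
    return y0 + slope * (xq - x0)

# Anchoring one end at the average colour and the other at an edge pixel,
# as image_expansion_v4 does per row (toy scalar values):
avg, edge = 100.0, 140.0
val = linear_extrapolate(0, avg, 10, edge, 5)   # midpoint of the ramp
```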
def hsv2bgr(img):
return cv2.cvtColor(img.clip(0, 255).astype(np.uint8), cv2.COLOR_HSV2BGR)
def main_v4():
start_time = time()
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img[np.logical_and(img.sum(-1) > 10, img.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img, avg_color - np.array([10, 40, 40], dtype=np.float64),
avg_color + np.array([20, 30, 50], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
resultHSV = cv2.bitwise_and(img, img, mask=maskHSV)
img = image_expansion_v4(resultHSV, avg_color)
display(np.concatenate((img_BGR, hsv2bgr(resultHSV), hsv2bgr(img)), axis=1))
def m1():
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img_HSV = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img_HSV[np.logical_and(img_HSV.sum(-1) > 10, img_HSV.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img_HSV, avg_color - np.array([10, 40, 40], dtype=np.float64),
avg_color + np.array([20, 30, 50], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
masked_HSV = cv2.bitwise_and(img_HSV, img_HSV, mask=maskHSV)
# set img
new_img = img_HSV.copy().astype(np.float32)
left_edge = np.zeros(img_HSV.shape[0], dtype=np.uint32)
right_edge = np.full(img_HSV.shape[0], img_HSV.shape[1], dtype=np.uint32)
for _y in range(img_HSV.shape[0]):
t = np.argwhere(img_HSV[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
new_img[_y, :left_edge[_y]] = avg_color
new_img[_y, right_edge[_y]:] = avg_color
up_edge = np.zeros(img_HSV.shape[1], dtype=np.uint32)
down_edge = np.full(img_HSV.shape[1], img_HSV.shape[0], dtype=np.uint32)
for _x in range(img_HSV.shape[1]):
t = np.argwhere(new_img[:, _x, :].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
up_edge[_x] = np.min(t) + k
down_edge[_x] = np.max(t) - k
new_img[:up_edge[_x], _x, :] = avg_color
new_img[down_edge[_x]:, _x, :] = avg_color
out_img = new_img.round().clip(0, 255).astype(np.uint8)
out_img_BGR = hsv2bgr(out_img)
display(np.concatenate((img_BGR, out_img_BGR), axis=1))
return
def m2():
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img_HSV = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img_HSV[np.logical_and(img_HSV.sum(-1) > 10, img_HSV.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img_HSV, avg_color - np.array([10, 40, 40], dtype=np.float64),
avg_color + np.array([20, 30, 50], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
masked_HSV = cv2.bitwise_and(img_HSV, img_HSV, mask=maskHSV)
# set img
new_img = img_HSV.copy().astype(np.float32)
left_edge = np.zeros(img_HSV.shape[0], dtype=np.uint32)
right_edge = np.full(img_HSV.shape[0], img_HSV.shape[1], dtype=np.uint32)
for _y in range(img_HSV.shape[0]):
t = np.argwhere(img_HSV[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
new_img[_y, :left_edge[_y]] = new_img[_y, left_edge[_y]]
new_img[_y, right_edge[_y]:] = new_img[_y, right_edge[_y]]
up_edge = np.zeros(img_HSV.shape[1], dtype=np.uint32)
down_edge = np.full(img_HSV.shape[1], img_HSV.shape[0], dtype=np.uint32)
for _x in range(img_HSV.shape[1]):
t = np.argwhere(new_img[:, _x, :].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
up_edge[_x] = np.min(t) + k
down_edge[_x] = np.max(t) - k
new_img[:up_edge[_x], _x, :] = new_img[up_edge[_x], _x, :]
new_img[down_edge[_x]:, _x, :] = new_img[down_edge[_x], _x, :]
out_img = new_img.round().clip(0, 255).astype(np.uint8)
out_img_BGR = hsv2bgr(out_img)
display(np.concatenate((img_BGR, out_img_BGR), axis=1))
return
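# m2 pads each row (and column) by copying the edge pixel outward -- replicate
# padding -- instead of filling with the average colour as m1 does. The core
# assignment pattern on a 1-D toy row:

```python
import numpy as np

row = np.array([0.0, 0.0, 5.0, 7.0, 0.0])
left_edge, right_edge = 2, 3
row[:left_edge] = row[left_edge]    # replicate the left edge value outward
row[right_edge:] = row[right_edge]  # replicate the right edge value outward
```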
def m3():
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img_HSV = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img_HSV[np.logical_and(img_HSV.sum(-1) > 10, img_HSV.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img_HSV, avg_color - np.array([10, 40, 40], dtype=np.float64),
avg_color + np.array([20, 30, 50], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
masked_HSV = cv2.bitwise_and(img_HSV, img_HSV, mask=maskHSV)
# set img
new_img = masked_HSV.copy().astype(np.float32)
left_edge = np.zeros(masked_HSV.shape[0], dtype=np.uint32)
right_edge = np.full(masked_HSV.shape[0], img_HSV.shape[1], dtype=np.uint32)
for _y in range(masked_HSV.shape[0]):
t = np.argwhere(masked_HSV[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
new_img[_y, :left_edge[_y]] = new_img[_y, left_edge[_y]]
new_img[_y, right_edge[_y]:] = new_img[_y, right_edge[_y]]
up_edge = np.zeros(img_HSV.shape[1], dtype=np.uint32)
down_edge = np.full(img_HSV.shape[1], img_HSV.shape[0], dtype=np.uint32)
for _x in range(img_HSV.shape[1]):
t = np.argwhere(new_img[:, _x, :].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
up_edge[_x] = np.min(t) + k
down_edge[_x] = np.max(t) - k
new_img[:up_edge[_x], _x, :] = new_img[up_edge[_x], _x, :]
new_img[down_edge[_x]:, _x, :] = new_img[down_edge[_x], _x, :]
out_img = new_img.round().clip(0, 255).astype(np.uint8)
out_img_BGR = hsv2bgr(out_img)
display(np.concatenate((img_BGR, out_img_BGR), axis=1))
return
def m4():
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img_HSV = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img_HSV[np.logical_and(img_HSV.sum(-1) > 10, img_HSV.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img_HSV, avg_color - np.array([10, 40, 40], dtype=np.float64),
avg_color + np.array([20, 30, 50], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
masked_HSV = cv2.bitwise_and(img_HSV, img_HSV, mask=maskHSV)
# set img
new_img = masked_HSV.copy().astype(np.float32)
left_edge = np.zeros(masked_HSV.shape[0], dtype=np.uint32)
right_edge = np.full(masked_HSV.shape[0], img_HSV.shape[1], dtype=np.uint32)
for _y in range(masked_HSV.shape[0]):
t = np.argwhere(masked_HSV[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
kind = "slinear"
x_fit = np.concatenate(([0], np.arange(left_edge[_y], left_edge[_y] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, left_edge[_y]:left_edge[_y] + k, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(([new_img.shape[1]], np.arange(right_edge[_y] - k, right_edge[_y])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, right_edge[_y] - k: right_edge[_y], :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[_y, :left_edge[_y]] = fl(np.arange(left_edge[_y])).clip(0, 255)
new_img[_y, right_edge[_y]:] = fr(np.arange(right_edge[_y], new_img.shape[1])).clip(0, 255)
up_edge = np.zeros(img_HSV.shape[1], dtype=np.uint32)
down_edge = np.full(img_HSV.shape[1], img_HSV.shape[0], dtype=np.uint32)
for _x in range(img_HSV.shape[1]):
t = np.argwhere(new_img[:, _x, :].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
up_edge[_x] = np.min(t) + k
down_edge[_x] = np.max(t) - k
k = 1
kind = "slinear"
x_fit = np.concatenate(([0], np.arange(up_edge[_x], up_edge[_x] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[up_edge[_x]:up_edge[_x] + k, _x, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(([new_img.shape[0]], np.arange(down_edge[_x] - k, down_edge[_x])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[down_edge[_x] - k: down_edge[_x], _x, :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[:up_edge[_x], _x, :] = fl(np.arange(up_edge[_x])).clip(0, 255)
new_img[down_edge[_x]:, _x, :] = fr(np.arange(down_edge[_x], new_img.shape[0])).clip(0, 255)
out_img = new_img.round().clip(0, 255).astype(np.uint8)
out_img_BGR = hsv2bgr(out_img)
display(np.concatenate((img_BGR, out_img_BGR), axis=1))
return
def m5():
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img_HSV = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img_HSV[np.logical_and(img_HSV.sum(-1) > 10, img_HSV.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img_HSV, avg_color - np.array([10, 40, 40], dtype=np.float64),
avg_color + np.array([20, 30, 50], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
masked_HSV = cv2.bitwise_and(img_HSV, img_HSV, mask=maskHSV)
# set img
new_img = masked_HSV.copy().astype(np.float32)
left_edge = np.zeros(masked_HSV.shape[0], dtype=np.uint32)
right_edge = np.full(masked_HSV.shape[0], img_HSV.shape[1], dtype=np.uint32)
for _y in range(masked_HSV.shape[0]):
t = np.argwhere(masked_HSV[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
kind = "slinear"
x_fit = np.concatenate(([left_edge[_y] // 2], np.arange(left_edge[_y], left_edge[_y] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, left_edge[_y]:left_edge[_y] + k, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(
([(new_img.shape[1] + right_edge[_y]) // 2], np.arange(right_edge[_y] - k, right_edge[_y])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, right_edge[_y] - k: right_edge[_y], :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[_y, :left_edge[_y]] = fl(np.arange(left_edge[_y])).clip(0, 255)
new_img[_y, right_edge[_y]:] = fr(np.arange(right_edge[_y], new_img.shape[1])).clip(0, 255)
for _y in range(1, new_img.shape[0] - 1):  # skip row 0: the update reads _y - 1
for _x in reversed(range(0, left_edge[_y])):
new_img[_y, _x] = 0.33 * new_img[_y - 1, _x] + 0.34 * new_img[_y, _x + 1] + 0.33 * new_img[
_y + 1, _x]
for _x in range(right_edge[_y], new_img.shape[1]):
new_img[_y, _x] = 0.33 * new_img[_y - 1, _x] + 0.34 * new_img[_y, _x - 1] + 0.33 * new_img[
_y + 1, _x]
up_edge = np.zeros(img_HSV.shape[1], dtype=np.uint32)
down_edge = np.full(img_HSV.shape[1], img_HSV.shape[0], dtype=np.uint32)
for _x in range(img_HSV.shape[1]):
t = np.argwhere(new_img[:, _x, :].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
up_edge[_x] = np.min(t) + k
down_edge[_x] = np.max(t) - k
k = 1
kind = "slinear"
x_fit = np.concatenate(([up_edge[_x] // 2], np.arange(up_edge[_x], up_edge[_x] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[up_edge[_x]:up_edge[_x] + k, _x, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(
([(new_img.shape[0] + down_edge[_x]) // 2], np.arange(down_edge[_x] - k, down_edge[_x])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[down_edge[_x] - k: down_edge[_x], _x, :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[:up_edge[_x], _x, :] = fl(np.arange(up_edge[_x])).clip(0, 255)
new_img[down_edge[_x]:, _x, :] = fr(np.arange(down_edge[_x], new_img.shape[0])).clip(0, 255)
for _x in range(1, new_img.shape[1] - 1):  # skip column 0: the update reads _x - 1
for _y in reversed(range(0, up_edge[_x])):
new_img[_y, _x] = 0.33 * new_img[_y, _x - 1] + 0.34 * new_img[_y + 1, _x] + 0.33 * new_img[
_y, _x + 1]
for _y in range(down_edge[_x], new_img.shape[0]):
new_img[_y, _x] = 0.33 * new_img[_y, _x - 1] + 0.34 * new_img[_y - 1, _x] + 0.33 * new_img[
_y, _x + 1]
out_img = new_img.round().clip(0, 255).astype(np.uint8)
out_img_BGR = hsv2bgr(out_img)
display(np.concatenate((img_BGR, out_img_BGR), axis=1))
return
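# The 0.33/0.34 clean-up loops replace each padded pixel with a weighted
# average of three neighbours. The weights sum to 1, so a constant region is
# left (numerically) unchanged -- a quick check of that invariant:

```python
import numpy as np

a = np.full((3, 3), 10.0)
# Same weight pattern as the smoothing loops: two 0.33 terms and one 0.34 term.
centre = 0.33 * a[0, 1] + 0.34 * a[1, 0] + 0.33 * a[2, 1]
```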
def m6():
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img_HSV = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img_HSV[np.logical_and(img_HSV.sum(-1) > 10, img_HSV.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img_HSV, avg_color - np.array([10, 40, 40], dtype=np.float64),
avg_color + np.array([20, 30, 50], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
masked_HSV = cv2.bitwise_and(img_HSV, img_HSV, mask=maskHSV)
# set img
new_img = masked_HSV.copy().astype(np.float32)
left_edge = np.zeros(masked_HSV.shape[0], dtype=np.uint32)
right_edge = np.full(masked_HSV.shape[0], img_HSV.shape[1], dtype=np.uint32)
for _y in range(masked_HSV.shape[0]):
t = np.argwhere(masked_HSV[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
kind = "slinear"
x_fit = np.concatenate(([left_edge[_y] // 2], np.arange(left_edge[_y], left_edge[_y] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, left_edge[_y]:left_edge[_y] + k, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(
([(new_img.shape[1] + right_edge[_y]) // 2], np.arange(right_edge[_y] - k, right_edge[_y])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, right_edge[_y] - k: right_edge[_y], :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[_y, left_edge[_y] // 2:left_edge[_y], :] = fl(np.arange(left_edge[_y] // 2, left_edge[_y])).clip(0,
255)
new_img[_y, right_edge[_y]:(new_img.shape[1] + right_edge[_y]) // 2, :] = fr(
np.arange(right_edge[_y], (new_img.shape[1] + right_edge[_y]) // 2)).clip(0, 255)
new_img[_y, :left_edge[_y] // 2, :] = avg_color
new_img[_y, (new_img.shape[1] + right_edge[_y]) // 2:, :] = avg_color
for _y in range(1, new_img.shape[0] - 1):  # skip row 0: the update reads _y - 1
for _x in reversed(range(0, left_edge[_y])):
new_img[_y, _x] = 0.33 * new_img[_y - 1, _x] + 0.34 * new_img[_y, _x + 1] + 0.33 * new_img[
_y + 1, _x]
for _x in range(right_edge[_y], new_img.shape[1]):
new_img[_y, _x] = 0.33 * new_img[_y - 1, _x] + 0.34 * new_img[_y, _x - 1] + 0.33 * new_img[
_y + 1, _x]
up_edge = np.zeros(img_HSV.shape[1], dtype=np.uint32)
down_edge = np.full(img_HSV.shape[1], img_HSV.shape[0], dtype=np.uint32)
for _x in range(img_HSV.shape[1]):
t = np.argwhere(new_img[:, _x, :].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
up_edge[_x] = np.min(t) + k
down_edge[_x] = np.max(t) - k
k = 1
kind = "slinear"
x_fit = np.concatenate(([up_edge[_x] // 2], np.arange(up_edge[_x], up_edge[_x] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[up_edge[_x]:up_edge[_x] + k, _x, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(
([(new_img.shape[0] + down_edge[_x]) // 2], np.arange(down_edge[_x] - k, down_edge[_x])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[down_edge[_x] - k: down_edge[_x], _x, :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[up_edge[_x] // 2:up_edge[_x], _x, :] = fl(np.arange(up_edge[_x] // 2, up_edge[_x])).clip(0, 255)
new_img[down_edge[_x]:(new_img.shape[0] + down_edge[_x]) // 2, _x, :] = fr(
np.arange(down_edge[_x], (new_img.shape[0] + down_edge[_x]) // 2)).clip(0, 255)
new_img[:up_edge[_x] // 2, _x, :] = avg_color
new_img[(new_img.shape[0] + down_edge[_x]) // 2:, _x, :] = avg_color
for _x in range(1, new_img.shape[1] - 1):  # skip column 0: the update reads _x - 1
for _y in reversed(range(0, up_edge[_x])):
new_img[_y, _x] = 0.33 * new_img[_y, _x - 1] + 0.34 * new_img[_y + 1, _x] + 0.33 * new_img[
_y, _x + 1]
for _y in range(down_edge[_x], new_img.shape[0]):
new_img[_y, _x] = 0.33 * new_img[_y, _x - 1] + 0.34 * new_img[_y - 1, _x] + 0.33 * new_img[
_y, _x + 1]
out_img = new_img.round().clip(0, 255).astype(np.uint8)
out_img_BGR = hsv2bgr(out_img)
display(np.concatenate((img_BGR, out_img_BGR), axis=1))
return
def m7():
img_BGR = cv2.imread(r'Data/mask/0_texture_2.png')
img_HSV = cv2.cvtColor(img_BGR, cv2.COLOR_BGR2HSV).astype(np.float64)
avg_color = img_HSV[np.logical_and(img_HSV.sum(-1) > 10, img_HSV.sum(-1) < 700)].mean(0)
maskHSV = cv2.inRange(img_HSV, avg_color - np.array([5, 30, 30], dtype=np.float64),
avg_color + np.array([10, 25, 25], dtype=np.float64))
for i in range(maskHSV.shape[0]):
t = maskHSV[i].nonzero()[0].flatten()
if t.size > 1:
maskHSV[i, t[0]:t[-1]] = 255
masked_HSV = cv2.bitwise_and(img_HSV, img_HSV, mask=maskHSV)
# set img
new_img = masked_HSV.copy().astype(np.float32)
left_edge = np.zeros(masked_HSV.shape[0], dtype=np.uint32)
right_edge = np.full(masked_HSV.shape[0], img_HSV.shape[1], dtype=np.uint32)
for _y in range(masked_HSV.shape[0]):
t = np.argwhere(masked_HSV[_y].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
left_edge[_y] = np.min(t) + k
right_edge[_y] = np.max(t) - k
kind = "slinear"
x_fit = np.concatenate(([left_edge[_y] // 2], np.arange(left_edge[_y], left_edge[_y] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, left_edge[_y]:left_edge[_y] + k, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(
([(new_img.shape[1] + right_edge[_y]) // 2], np.arange(right_edge[_y] - k, right_edge[_y])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[_y, right_edge[_y] - k: right_edge[_y], :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[_y, left_edge[_y] // 2:left_edge[_y], :] = fl(np.arange(left_edge[_y] // 2, left_edge[_y])).clip(0,
255)
new_img[_y, right_edge[_y]:(new_img.shape[1] + right_edge[_y]) // 2, :] = fr(
np.arange(right_edge[_y], (new_img.shape[1] + right_edge[_y]) // 2)).clip(0, 255)
new_img[_y, :left_edge[_y] // 2, :] = avg_color
new_img[_y, (new_img.shape[1] + right_edge[_y]) // 2:, :] = avg_color
for _y in range(1, new_img.shape[0] - 1):  # skip row 0: the update reads _y - 1
for _x in reversed(range(0, left_edge[_y])):
new_img[_y, _x] = 0.33 * new_img[_y - 1, _x] + 0.34 * new_img[_y, _x + 1] + 0.33 * new_img[
_y + 1, _x]
for _x in range(right_edge[_y], new_img.shape[1]):
new_img[_y, _x] = 0.33 * new_img[_y - 1, _x] + 0.34 * new_img[_y, _x - 1] + 0.33 * new_img[
_y + 1, _x]
up_edge = np.zeros(img_HSV.shape[1], dtype=np.uint32)
down_edge = np.full(img_HSV.shape[1], img_HSV.shape[0], dtype=np.uint32)
for _x in range(img_HSV.shape[1]):
t = np.argwhere(new_img[:, _x, :].sum(-1) > 0).flatten()
if t.size > 0:
k = 4
up_edge[_x] = np.min(t) + k
down_edge[_x] = np.max(t) - k
k = 1
kind = "slinear"
x_fit = np.concatenate(([up_edge[_x] // 2], np.arange(up_edge[_x], up_edge[_x] + k)), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[up_edge[_x]:up_edge[_x] + k, _x, :]), 0)
fl = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
x_fit = np.concatenate(
([(new_img.shape[0] + down_edge[_x]) // 2], np.arange(down_edge[_x] - k, down_edge[_x])), 0)
y_fit = np.concatenate((avg_color.reshape(1, 3), new_img[down_edge[_x] - k: down_edge[_x], _x, :]), 0)
fr = interpolate.interp1d(x_fit, y_fit, kind=kind, axis=0, fill_value="extrapolate")
new_img[up_edge[_x] // 2:up_edge[_x], _x, :] = fl(np.arange(up_edge[_x] // 2, up_edge[_x])).clip(0, 255)
new_img[down_edge[_x]:(new_img.shape[0] + down_edge[_x]) // 2, _x, :] = fr(
np.arange(down_edge[_x], (new_img.shape[0] + down_edge[_x]) // 2)).clip(0, 255)
new_img[:up_edge[_x] // 2, _x, :] = avg_color
new_img[(new_img.shape[0] + down_edge[_x]) // 2:, _x, :] = avg_color
for _x in range(1, new_img.shape[1] - 1):  # skip column 0: the update reads _x - 1
for _y in reversed(range(0, up_edge[_x])):
new_img[_y, _x] = 0.33 * new_img[_y, _x - 1] + 0.34 * new_img[_y + 1, _x] + 0.33 * new_img[
_y, _x + 1]
for _y in range(down_edge[_x], new_img.shape[0]):
new_img[_y, _x] = 0.33 * new_img[_y, _x - 1] + 0.34 * new_img[_y - 1, _x] + 0.33 * new_img[
_y, _x + 1]
out_img = new_img.round().clip(0, 255).astype(np.uint8)
out_img_BGR = hsv2bgr(out_img)
display(np.concatenate((img_BGR, hsv2bgr(masked_HSV), out_img_BGR), axis=1))
return
if __name__ == "__main__":
# detect edge
# start_time = time()
# img = cv2.imread(r'Data\mask\0_texture.png')
# edge_img, edge_along_y, edge_along_x = edge_detection(img)
#
# img_new = img.copy()
# list_ref_pts = []
# for _y, edge in enumerate(edge_along_y):
# if edge[0] != 0 and edge[1] != 0:
# list_ref_pts.append((_y, edge[0]))
# if edge[0] < edge[1]:
# list_ref_pts.append((_y, edge[1]))
# for _x, edge in enumerate(edge_along_x):
# if edge[0] != 0 and edge[1] != 0:
# list_ref_pts.append((edge[0], _x))
# if edge[0] < edge[1]:
# list_ref_pts.append((edge[1],_x))
# list_ref_pts = list(set(list_ref_pts))
# new_img = color_gradient_v2(img, list_ref_pts, x_range=range(0, img.shape[1]), y_range=range(0, img.shape[0]))
# display(new_img, encode="RGB")
#
# # color_gradient_demo1()
# print("Time Elapsed {:.2f}".format(time()-start_time))
# for _y in range(img_new.shape[0]):
# # if there is a left edge
# if edge_x[_y, 0] != 0:
# list_of_pts = find_ref_pts(img, (_y, edge_x[_y, 0]),1)
# # fill image with x 0 to edge -1, with y = y_ref
# img_new = color_gradient_v2(img, pts=list_of_pts, x_range=range(0, edge_x[_y, 0]), y_loc=_y, mode='r')
# # if there is a right edge
# if edge_x[i, 1] != 0:
# pass
# # img_new[i, edge_x[i, 1]:, :] = img[i, edge_x[i, 1], :]
# # list_of_pts = find_ref_pts(img, (i, edge_x[i, 1]))
# # img_new = color_gradient_v2(img, list_of_pts, x=range(edge_x[i, 1], img_new.shape[0]), y_ref=i, mode='r')
# # img_new = cv2.blur(img_new, (1, 3))
#
# # img_new = img.copy()
# # edge_x = np.zeros((img.shape[0], 2), dtype=np.uint16)
# # th = 0
# # for i in range(edge_x.shape[0]):
# # temp = np.argwhere(grad_x[i] > th)
# # if np.any(temp):
# # edge_x[i, 0] = np.min(temp) + 1
# # edge_x[i, 1] = np.max(temp) - 1
# # for i in range(img_new.shape[0]):
# # if edge_x[i, 0]:
# # img_new[i, :edge_x[i, 0], :] = img[i, edge_x[i, 0], :]
# # if edge_x[i, 1]:
# # img_new[i, edge_x[i, 1]:, :] = img[i, edge_x[i, 1], :]
# # img_new = cv2.blur(img_new, (1, 3))
# # edge_y = np.zeros((img.shape[1], 2), dtype=np.uint16)
# # th = 0
# # for i in range(img_new.shape[1]):
# # temp = np.argwhere(np.any(img_new[:, i] - th, axis=-1))
# # if np.any(temp):
# # edge_y[i, 0] = np.min(temp) + 2
# # edge_y[i, 1] = np.max(temp) - 2
# # for i in range(img_new.shape[1]):
# # if edge_y[i, 0]:
# # img_new[:edge_y[i, 0], i, :] = img_new[edge_y[i, 0], i, :]
# # if edge_y[i, 1]:
# # img_new[edge_y[i, 1]:, i, :] = img_new[edge_y[i, 1], i, :]
# # img_new = cv2.blur(img_new, (1, 3))
# display(img_new, "new_img")
# color_gradient_demo1()
# MAP_SIZE=(256,256)
# initial_color = [{'loc': (MAP_SIZE[0] // 2, 0), 'color': (200, 190, 120), 'scale': 0, 'weight':1},
# {'loc': (MAP_SIZE[0] // 2, MAP_SIZE[1] - 1), 'color': (191, 90, 17), 'scale': 0,'weight':1},
# {'loc': (MAP_SIZE[0] // 2, MAP_SIZE[0] // 2), 'color': (255, 255, 255), 'scale': 0,'weight':0}]
# all_pts = [i['loc'] for i in initial_color]
# for d in initial_color:
# _y, _x = d['loc']
# d['scale'] = np.max([dist((_y, _x), (0, 0)),
# dist((_y, _x), (MAP_SIZE[0] - 1, 0)),
# dist((_y, _x), (0, MAP_SIZE[1] - 1)),
# dist((_y, _x), (MAP_SIZE[0] - 1, MAP_SIZE[1] - 1))])
# img = np.zeros((MAP_SIZE[0], MAP_SIZE[1], 3),dtype=np.float32)
# for i in initial_color:
# img += make_img(i['color'],i['loc'], i['scale'], MAP_SIZE) * i['weight']
# np.clip(img, 0, 255, img)
# img = np.round(img).astype(np.uint8)
# display(img,encode='RGB')
# start_time = time()
# img = cv2.imread(r'Data\mask\0_texture.png')
# edge_img, edge_along_y, edge_along_x = edge_detection(img, th=5)
# new_img = color_gradient_v4(img, edge_along_y)
# display(new_img, "v4")
# main_v3()
# main_v4()
m7()
| 50.160136 | 121 | 0.537206 | 9,293 | 58,888 | 3.139137 | 0.030023 | 0.074249 | 0.040313 | 0.010421 | 0.897813 | 0.875394 | 0.859968 | 0.844543 | 0.827369 | 0.818319 | 0 | 0.053004 | 0.296767 | 58,888 | 1,173 | 122 | 50.202899 | 0.65143 | 0.138976 | 0 | 0.767494 | 0 | 0 | 0.015759 | 0.003969 | 0 | 0 | 0 | 0 | 0.002257 | 1 | 0.03386 | false | 0 | 0.005643 | 0.003386 | 0.068849 | 0.004515 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |

# tests/data_structures/commons/queue_unit_test.py
# from vertexproject/pyalgs (BSD-3-Clause)
import unittest
from pyalgs.data_structures.commons.queue import LinkedListQueue, Queue, ArrayQueue
class QueueUnitTest(unittest.TestCase):
def test_Queue(self):
queue = Queue.create()
queue.enqueue(10)
self.assertEqual(1, queue.size())
self.assertFalse(queue.is_empty())
queue.enqueue(20)
self.assertEqual(2, queue.size())
queue.enqueue(30)
print([i for i in queue.iterate()])
self.assertEqual(3, queue.size())
self.assertEqual(10, queue.dequeue())
self.assertEqual(2, queue.size())
self.assertEqual(20, queue.dequeue())
self.assertEqual(1, queue.size())
self.assertEqual(30, queue.dequeue())
self.assertTrue(queue.is_empty())
def test_LinkedListQueue(self):
queue = LinkedListQueue()
queue.enqueue(10)
self.assertEqual(1, queue.size())
self.assertFalse(queue.is_empty())
queue.enqueue(20)
self.assertEqual(2, queue.size())
queue.enqueue(30)
print([i for i in queue.iterate()])
self.assertEqual(3, queue.size())
self.assertEqual(10, queue.dequeue())
self.assertEqual(2, queue.size())
self.assertEqual(20, queue.dequeue())
self.assertEqual(1, queue.size())
self.assertEqual(30, queue.dequeue())
self.assertTrue(queue.is_empty())
def test_ArrayQueue(self):
queue = ArrayQueue()
queue.enqueue(10)
self.assertEqual(1, queue.size())
self.assertFalse(queue.is_empty())
queue.enqueue(20)
self.assertEqual(2, queue.size())
queue.enqueue(30)
print([i for i in queue.iterate()])
self.assertEqual(3, queue.size())
self.assertEqual(10, queue.dequeue())
self.assertEqual(2, queue.size())
self.assertEqual(20, queue.dequeue())
self.assertEqual(1, queue.size())
self.assertEqual(30, queue.dequeue())
self.assertTrue(queue.is_empty())
for i in range(100):
queue.enqueue(i)
for i in range(100):
queue.dequeue()
if __name__ == '__main__':
    unittest.main()
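The three implementations exercised above (`Queue.create()`, `LinkedListQueue`, `ArrayQueue`) share one interface: `enqueue`, `dequeue`, `size`, `is_empty`, `iterate`. A minimal stdlib sketch of that interface (hypothetical class, not part of pyalgs):

```python
from collections import deque


class SimpleQueue:
    """Minimal FIFO queue exposing the interface the tests above exercise."""

    def __init__(self):
        self._items = deque()

    def enqueue(self, item):
        self._items.append(item)      # add at the tail

    def dequeue(self):
        return self._items.popleft()  # remove from the head (FIFO order)

    def size(self):
        return len(self._items)

    def is_empty(self):
        return self.size() == 0

    def iterate(self):
        return iter(self._items)


q = SimpleQueue()
q.enqueue(10)
q.enqueue(20)
print(q.dequeue())  # 10 comes out first
```

`collections.deque` gives O(1) appends and pops at both ends, which is why it is the usual stdlib backing store for a queue.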

# HeavyFlavorAnalysis/Skimming/python/onia_SkimPaths_cff.py
# from SWuchterl/cmssw (Apache-2.0)
import FWCore.ParameterSet.Config as cms
from HeavyFlavorAnalysis.Skimming.jpsiToMuMu_SkimPath_cff import *
from HeavyFlavorAnalysis.Skimming.upsilonToMuMu_SkimPath_cff import *
from HeavyFlavorAnalysis.Skimming.bToMuMu_SkimPath_cff import *

# src/artifice/scraper/tasks/__init__.py
# from artifice-project/artifice-scraper (MIT)
from .scheduled_tasks import *
from .callable_tasks import *
from .base import run_celery

# api.py
# from FoxeiZ/arknights-gacha (MIT)
from core import gacha
# TODO: Write this later

# conans/test/remote_checks_test.py
# from pawelkami/conan (MIT)
import unittest
from conans.test.utils.tools import TestClient, TestServer
from conans.model.manifest import FileTreeManifest
from conans.model.ref import ConanFileReference
class RemoteChecksTest(unittest.TestCase):
def test_recipe_updates(self):
servers = {"server1": TestServer(), "server2": TestServer(), "server3": TestServer()}
client = TestClient(servers=servers, users={"server1": [("lasote", "mypass")],
"server2": [("lasote", "mypass")],
"server3": [("lasote", "mypass")]})
conanfile = """from conans import ConanFile
class Pkg(ConanFile):
def package_info(self):
self.output.info("%s")
"""
client.save({"conanfile.py": conanfile % "Server1!"})
client.run("create . Pkg/0.1@lasote/testing")
client.run("upload Pkg* -r=server1 --confirm --all")
def bump_time(inc_time):
path = client.client_cache.export(ConanFileReference.loads("Pkg/0.1@lasote/testing"))
manifest = FileTreeManifest.load(path)
manifest.time += inc_time
manifest.save(path)
client.save({"conanfile.py": conanfile % "Server2!"})
client.run("create . Pkg/0.1@lasote/testing")
bump_time(20)
client.run("upload Pkg* -r=server2 --confirm --all")
client.save({"conanfile.py": conanfile % "Server3!"})
client.run("create . Pkg/0.1@lasote/testing")
bump_time(40)
client.run("upload Pkg* -r=server3 --confirm --all")
# The remote defined is the first one that was used for upload
client.run("remote list_ref")
self.assertIn("Pkg/0.1@lasote/testing: server1", client.out)
client.run("remove * -f")
client.run("install Pkg/0.1@lasote/testing -r=server1")
self.assertIn("Pkg/0.1@lasote/testing: Server1!", client.out)
client.run("install Pkg/0.1@lasote/testing -r=server2")
self.assertIn("Pkg/0.1@lasote/testing: Server1!", client.out)
# Update
client.run("install Pkg/0.1@lasote/testing -r=server2 --update")
self.assertIn("Pkg/0.1@lasote/testing: Server2!", client.out)
client.run("remote list_ref")
self.assertIn("Pkg/0.1@lasote/testing: server2", client.out)
# Update
client.run("install Pkg/0.1@lasote/testing -r=server3 --update")
self.assertIn("Pkg/0.1@lasote/testing: Server3!", client.out)
client.run("remote list_ref")
self.assertIn("Pkg/0.1@lasote/testing: server3", client.out)
def test_binary_defines_remote(self):
servers = {"server1": TestServer(), "server2": TestServer(), "server3": TestServer()}
client = TestClient(servers=servers, users={"server1": [("lasote", "mypass")],
"server2": [("lasote", "mypass")],
"server3": [("lasote", "mypass")]})
conanfile = """from conans import ConanFile
class Pkg(ConanFile):
pass"""
client.save({"conanfile.py": conanfile})
client.run("create . Pkg/0.1@lasote/testing")
client.run("upload Pkg* --all -r=server1 --confirm")
client.run("upload Pkg* --all -r=server2 --confirm")
# It takes the default remote
client.run("remove * -f")
client.run("remote list_ref")
self.assertNotIn("Pkg", client.out)
client.run("export . Pkg/0.1@lasote/testing")
client.run("install Pkg/0.1@lasote/testing")
self.assertIn("Downloading conan_package.tgz", client.out)
client.run("remote list_ref")
self.assertIn("Pkg/0.1@lasote/testing: server1", client.out)
# Explicit remote also defines the remote
client.run("remove * -f")
client.run("remote list_ref")
self.assertNotIn("Pkg", client.out)
client.run("export . Pkg/0.1@lasote/testing")
client.run("install Pkg/0.1@lasote/testing -r=server2")
self.assertIn("Downloading conan_package.tgz", client.out)
client.run("remote list_ref")
self.assertIn("Pkg/0.1@lasote/testing: server2", client.out)
# But order fails!!!
client.run("remove * -f")
client.run("remove * -f -r=server1")
client.run("export . Pkg/0.1@lasote/testing")
error = client.run("install Pkg/0.1@lasote/testing", ignore_error=True)
self.assertTrue(error)
self.assertIn("Missing prebuilt package for 'Pkg/0.1@lasote/testing'", client.out)
client.run("remote list_ref")
self.assertNotIn("Pkg", client.out)
def test_binaries_from_different_remotes(self):
servers = {"server1": TestServer(), "server2": TestServer(), "server3": TestServer()}
client = TestClient(servers=servers, users={"server1": [("lasote", "mypass")],
"server2": [("lasote", "mypass")]})
conanfile = """from conans import ConanFile
class Pkg(ConanFile):
options = {"opt": [1, 2, 3]}
"""
client.save({"conanfile.py": conanfile})
client.run("create . Pkg/0.1@lasote/testing -o Pkg:opt=1")
client.run("upload Pkg* --all -r=server1 --confirm")
client.run("remove * -p -f")
client.run("create . Pkg/0.1@lasote/testing -o Pkg:opt=2")
client.run("upload Pkg* --all -r=server2 --confirm")
client.run("remove * -p -f")
client.run("remote list_ref")
self.assertIn("Pkg/0.1@lasote/testing: server1", client.out)
# Trying to install from another remote fails
error = client.run("install Pkg/0.1@lasote/testing -o Pkg:opt=2 -r=server2", ignore_error=True)
self.assertTrue(error)
self.assertIn("ERROR: Missing prebuilt package for 'Pkg/0.1@lasote/testing'", client.out)
# Also update fails
error = client.run("install Pkg/0.1@lasote/testing -o Pkg:opt=2 -r=server2 -u", ignore_error=True)
self.assertTrue(error)
self.assertIn("ERROR: Missing prebuilt package for 'Pkg/0.1@lasote/testing'", client.out)
# Build outdated
error = client.run("install Pkg/0.1@lasote/testing -o Pkg:opt=2 -r=server2 --build=outdated",
ignore_error=True)
self.assertTrue(error)
self.assertIn("ERROR: Missing prebuilt package for 'Pkg/0.1@lasote/testing'", client.out)
# If the remote reference is dissasociated, it works
client.run("remote remove_ref Pkg/0.1@lasote/testing")
client.run("install Pkg/0.1@lasote/testing -o Pkg:opt=2 -r=server2")
client.run("remote list_ref")
self.assertIn("Pkg/0.1@lasote/testing: server2", client.out)
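The assertions above all exercise one piece of bookkeeping: a reference becomes associated with the first remote it is uploaded to or installed from, an `--update` that fetches a newer recipe re-associates it, and `remote remove_ref` dissociates it. A toy sketch of that rule (illustrative names only, not Conan's internal API):

```python
class RemoteRegistry:
    """Toy model of the reference-to-remote bookkeeping behind "remote list_ref"."""

    def __init__(self):
        self._refs = {}

    def record(self, ref, remote):
        # First retrieval wins; later plain installs do not re-associate.
        self._refs.setdefault(ref, remote)

    def update(self, ref, remote):
        # Fetching a newer recipe with --update re-associates the reference.
        self._refs[ref] = remote

    def remove_ref(self, ref):
        self._refs.pop(ref, None)

    def lookup(self, ref):
        return self._refs.get(ref)


reg = RemoteRegistry()
reg.record("Pkg/0.1@lasote/testing", "server1")
reg.record("Pkg/0.1@lasote/testing", "server2")  # ignored: already associated
print(reg.lookup("Pkg/0.1@lasote/testing"))      # server1
reg.update("Pkg/0.1@lasote/testing", "server2")  # like install ... --update
print(reg.lookup("Pkg/0.1@lasote/testing"))      # server2
```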

# test_xdg.py
# from flokli/xdg (ISC)
"""Test suite for xdg."""
import os
import sys
from typing import Callable, TYPE_CHECKING
import pytest # pylint: disable=import-error
# pylint: disable=unused-import
if TYPE_CHECKING:
from _pytest.monkeypatch import MonkeyPatch # noqa
# pylint: enable=unused-import
HOME_DIR = '/homedir'
@pytest.fixture # type: ignore
def unimport() -> None:
"""Ensure xdg is absent from sys.modules."""
try:
del sys.modules['xdg']
except KeyError:
pass
# pylint: disable=no-self-use,redefined-outer-name,unused-argument
class TestXdgCacheHome:
"""Tests for XDG_CACHE_HOME."""
def test_unset(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_CACHE_HOME is unset."""
monkeypatch.delenv('XDG_CACHE_HOME', raising=False)
monkeypatch.setenv('HOME', HOME_DIR)
from xdg import XDG_CACHE_HOME
assert XDG_CACHE_HOME == os.path.join(HOME_DIR, '.cache')
def test_empty(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_CACHE_HOME is empty."""
monkeypatch.setenv('HOME', HOME_DIR)
monkeypatch.setenv('XDG_CACHE_HOME', '')
from xdg import XDG_CACHE_HOME
assert XDG_CACHE_HOME == os.path.join(HOME_DIR, '.cache')
def test_set(self, monkeypatch: 'MonkeyPatch', unimport: Callable) -> None:
"""Test when XDG_CACHE_HOME is set."""
monkeypatch.setenv('XDG_CACHE_HOME', '/xdg_cache_home')
from xdg import XDG_CACHE_HOME
assert XDG_CACHE_HOME == '/xdg_cache_home'
class TestXdgConfigDirs:
"""Tests for XDG_CONFIG_DIRS."""
def test_unset(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_CONFIG_DIRS is unset."""
monkeypatch.delenv('XDG_CONFIG_DIRS', raising=False)
from xdg import XDG_CONFIG_DIRS
assert XDG_CONFIG_DIRS == ['/etc/xdg']
def test_empty(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_CONFIG_DIRS is empty."""
monkeypatch.setenv('XDG_CONFIG_DIRS', '')
from xdg import XDG_CONFIG_DIRS
assert XDG_CONFIG_DIRS == ['/etc/xdg']
def test_set(self, monkeypatch: 'MonkeyPatch', unimport: Callable) -> None:
"""Test when XDG_CONFIG_DIRS is set."""
monkeypatch.setenv('XDG_CONFIG_DIRS', '/first:/sec/ond')
from xdg import XDG_CONFIG_DIRS
assert XDG_CONFIG_DIRS == ['/first', '/sec/ond']
class TestXdgConfigHome:
"""Tests for XDG_CONFIG_HOME."""
def test_unset(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_CONFIG_HOME is unset."""
monkeypatch.delenv('XDG_CONFIG_HOME', raising=False)
monkeypatch.setenv('HOME', HOME_DIR)
from xdg import XDG_CONFIG_HOME
assert XDG_CONFIG_HOME == os.path.join(HOME_DIR, '.config')
def test_empty(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_CONFIG_HOME is empty."""
monkeypatch.setenv('HOME', HOME_DIR)
monkeypatch.setenv('XDG_CONFIG_HOME', '')
from xdg import XDG_CONFIG_HOME
assert XDG_CONFIG_HOME == os.path.join(HOME_DIR, '.config')
def test_set(self, monkeypatch: 'MonkeyPatch', unimport: Callable) -> None:
"""Test when XDG_CONFIG_HOME is set."""
monkeypatch.setenv('XDG_CONFIG_HOME', '/xdg_config_home')
from xdg import XDG_CONFIG_HOME
assert XDG_CONFIG_HOME == '/xdg_config_home'
class TestXdgDataDirs:
"""Tests for XDG_DATA_DIRS."""
def test_unset(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_DATA_DIRS is unset."""
monkeypatch.delenv('XDG_DATA_DIRS', raising=False)
from xdg import XDG_DATA_DIRS
assert XDG_DATA_DIRS == ['/usr/local/share/', '/usr/share/']
def test_empty(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_DATA_DIRS is empty."""
monkeypatch.setenv('XDG_DATA_DIRS', '')
from xdg import XDG_DATA_DIRS
assert XDG_DATA_DIRS == ['/usr/local/share/', '/usr/share/']
def test_set(self, monkeypatch: 'MonkeyPatch', unimport: Callable) -> None:
"""Test when XDG_DATA_DIRS is set."""
monkeypatch.setenv('XDG_DATA_DIRS', '/first/:/sec/ond/')
from xdg import XDG_DATA_DIRS
assert XDG_DATA_DIRS == ['/first/', '/sec/ond/']
class TestXdgDataHome:
"""Tests for XDG_DATA_HOME."""
def test_unset(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_DATA_HOME is unset."""
monkeypatch.delenv('XDG_DATA_HOME', raising=False)
monkeypatch.setenv('HOME', HOME_DIR)
from xdg import XDG_DATA_HOME
assert XDG_DATA_HOME == os.path.join(HOME_DIR, '.local', 'share')
def test_empty(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_DATA_HOME is empty."""
monkeypatch.setenv('HOME', HOME_DIR)
monkeypatch.setenv('XDG_DATA_HOME', '')
from xdg import XDG_DATA_HOME
assert XDG_DATA_HOME == os.path.join(HOME_DIR, '.local', 'share')
def test_set(self, monkeypatch: 'MonkeyPatch', unimport: Callable) -> None:
"""Test when XDG_DATA_HOME is set."""
monkeypatch.setenv('XDG_DATA_HOME', '/xdg_data_home')
from xdg import XDG_DATA_HOME
assert XDG_DATA_HOME == '/xdg_data_home'
class TestXdgRuntimeDir:
"""Tests for XDG_RUNTIME_DIR."""
def test_unset(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_RUNTIME_DIR is unset."""
monkeypatch.delenv('XDG_RUNTIME_DIR', raising=False)
from xdg import XDG_RUNTIME_DIR
assert XDG_RUNTIME_DIR is None
def test_empty(self, monkeypatch: 'MonkeyPatch',
unimport: Callable) -> None:
"""Test when XDG_RUNTIME_DIR is empty."""
monkeypatch.setenv('XDG_RUNTIME_DIR', '')
from xdg import XDG_RUNTIME_DIR
assert XDG_RUNTIME_DIR == ''
def test_set(self, monkeypatch: 'MonkeyPatch', unimport: Callable) -> None:
"""Test when XDG_RUNTIME_DIR is set."""
monkeypatch.setenv('XDG_RUNTIME_DIR', '/xdg_runtime_dir')
from xdg import XDG_RUNTIME_DIR
assert XDG_RUNTIME_DIR == '/xdg_runtime_dir'
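Every unset/empty/set triple above checks the same fallback rule from the XDG Base Directory specification: use the environment variable when it is set and non-empty, otherwise fall back to a default under `$HOME` (`XDG_RUNTIME_DIR` is the exception, having no default). A minimal standalone helper illustrating that rule (not the `xdg` package's actual code):

```python
import os


def xdg_path(var, default):
    """Fallback rule: the env var wins when set and non-empty, else the default."""
    value = os.environ.get(var)
    return value if value else default


# Deterministic demo: force a known environment state before each call.
os.environ["XDG_CACHE_HOME"] = "/xdg_cache_home"
print(xdg_path("XDG_CACHE_HOME", "/homedir/.cache"))  # /xdg_cache_home
os.environ["XDG_CACHE_HOME"] = ""
print(xdg_path("XDG_CACHE_HOME", "/homedir/.cache"))  # /homedir/.cache
```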

# test/test_transform.py
# from mengqiuli/FlowCal (MIT)
#!/usr/bin/python
#
# test_transform.py - Unit tests for transform module
#
# Author: Sebastian M. Castillo-Hair (smc9@rice.edu)
# Date: 7/1/2015
#
# Requires:
# * FlowCal.io
# * FlowCal.transform
# * numpy
#
import FlowCal.io
import FlowCal.transform
import numpy as np
import unittest
import os
class TestRFIArray(unittest.TestCase):
def setUp(self):
self.d = np.array([
[1, 7, 2],
[2, 8, 3],
[3, 9, 4],
[4, 10, 5],
[5, 1, 6],
[6, 2, 7],
[7, 3, 8],
[8, 4, 9],
[9, 5, 10],
[10, 6, 1],
])
def test_rfi_original_integrity(self):
db = self.d.copy()
dt = FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(0,0), (0,0)],
amplifier_gain=[1.0, 1.0],
resolution=[1024, 1024],)
np.testing.assert_array_equal(self.d, db)
def test_rfi_arg_error_amplification_type_absent(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1])
def test_rfi_arg_error_amplification_type_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(4,1), (4,1), (4,1)])
def test_rfi_arg_error_resolution_absent(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(4,1), (4,1)])
def test_rfi_arg_error_resolution_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(4,1), (4,1)],
resolution=[1024])
def test_rfi_arg_error_amplifier_gain_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(0,0), (0,0)],
amplifier_gain=[3,4,4])
def test_rfi_1d_log_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=1,
amplification_type=(4, 1),
resolution=1024)
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], 10**(self.d[:,1]/256.0))
np.testing.assert_array_equal(dt[:,2], self.d[:,2])
def test_rfi_1d_log_2(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=2,
amplification_type=(2, 0.01),
amplifier_gain=5.0,
resolution=256)
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1])
np.testing.assert_array_equal(dt[:,2], 0.01*10**(self.d[:,2]/128.0))
def test_rfi_1d_linear_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=2,
amplification_type=(0, 0),
amplifier_gain=None,
resolution=256)
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1])
np.testing.assert_array_equal(dt[:,2], self.d[:,2])
def test_rfi_1d_linear_2(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=1,
amplification_type=(0, 0),
amplifier_gain=5.0,
resolution=256)
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1]/5.0)
np.testing.assert_array_equal(dt[:,2], self.d[:,2])
def test_rfi_2d_log_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=[1,2],
amplification_type=[(4, 1), (2, 0.01)],
resolution=[1024, 256])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], 10**(self.d[:,1]/256.0))
np.testing.assert_array_equal(dt[:,2], 0.01*10**(self.d[:,2]/128.0))
def test_rfi_2d_mixed_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=[1,2],
amplification_type=[(4, 1), (0, 0)],
amplifier_gain=[4., None],
resolution=[1024, 1024])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], 10**(self.d[:,1]/256.0))
np.testing.assert_array_equal(dt[:,2], self.d[:,2])
def test_rfi_2d_mixed_2(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=[1,2],
amplification_type=[(4, 1), (0, 0)],
amplifier_gain=[4., 10.],
resolution=[1024, 1024])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], 10**(self.d[:,1]/256.0))
np.testing.assert_array_equal(dt[:,2], self.d[:,2]/10.)
def test_rfi_default_channel_1(self):
dt = FlowCal.transform.to_rfi(self.d,
amplification_type=[(4,1)]*3,
amplifier_gain=[4., 5., 10.],
resolution=[1024]*3)
np.testing.assert_array_equal(dt, 10**(self.d/256.0))
def test_rfi_default_channel_2(self):
dt = FlowCal.transform.to_rfi(self.d,
amplification_type=[(0,0)]*3,
amplifier_gain=[10., 100., 0.01],
resolution=[1024]*3)
np.testing.assert_array_equal(dt, self.d/np.array([10., 100., 0.01]))
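The expected values in `TestRFIArray` all follow one per-channel formula: with `amplification_type` `(f1, f2)` and `f1 != 0` (log amplifier), `rfi = f2 * 10**(f1 * x / resolution)`; with a linear amplifier `(0, 0)`, `rfi = x / gain`, where a `None` gain is treated as 1. The sketch below is consistent with those assertions but is not FlowCal's actual implementation:

```python
import numpy as np


def to_rfi_channel(x, amplification_type=(0, 0), amplifier_gain=None,
                   resolution=1024):
    """Per-channel raw-signal-to-RFI transform implied by the tests above."""
    f1, f2 = amplification_type
    if f1 != 0:
        # Log amplifier: f1 decades spread across `resolution` channels,
        # with f2 the RFI value at channel zero.
        return f2 * 10 ** (f1 * np.asarray(x, dtype=float) / resolution)
    # Linear amplifier: divide by the gain (missing gain treated as 1.0).
    gain = 1.0 if amplifier_gain is None else amplifier_gain
    return np.asarray(x, dtype=float) / gain


# (4, 1) over 1024 channels reproduces the 10**(x/256) pattern in the tests.
print(to_rfi_channel(np.array([0, 256, 512]),
                     amplification_type=(4, 1), resolution=1024))
```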
class TestRFIFCSLog(unittest.TestCase):
def setUp(self):
self.channel_names = ['FSC-H', 'SSC-H', 'FL1-H',
'FL2-H', 'FL3-H', 'Time']
current_dir = os.path.abspath(__file__).replace(__file__, '') + os.path.sep
self.d = FlowCal.io.FCSData(current_dir + 'Data001.fcs')
self.n_samples = self.d.shape[0]
def test_rfi_original_integrity(self):
db = self.d.copy()
dt = FlowCal.transform.to_rfi(self.d,
channels=['FSC-H', 'SSC-H'],
amplification_type=[(4,1), (4,1)],
resolution=[1024, 1024])
np.testing.assert_array_equal(self.d, db)
def test_rfi_arg_error_amplification_type_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(4,1), (4,1), (4,1)])
def test_rfi_arg_error_resolution_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(4,1), (4,1)],
resolution=[1024])
def test_rfi_arg_error_amplifier_gain_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(0,0), (0,0)],
amplifier_gain=[3,4,4])
def test_rfi_1d_log_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels='FL1-H',
amplification_type=(4, 0.01),
resolution=512)
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
0.01*10**(self.d[:,'FL1-H']/128.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'], self.d[:,'FL3-H'])
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_1d_log_2(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=2,
amplification_type=(2, 0.01),
amplifier_gain=50.,
resolution=512)
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
0.01*10**(self.d[:,'FL1-H']/256.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'], self.d[:,'FL3-H'])
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_1d_linear_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=2,
amplification_type=(0, 0),
amplifier_gain=50.,
resolution=512)
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'], self.d[:,'FL1-H']/50.)
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'], self.d[:,'FL3-H'])
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_1d_defaults(self):
dt = FlowCal.transform.to_rfi(self.d,
channels='FL1-H')
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
10**(self.d[:,'FL1-H']/256.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'], self.d[:,'FL3-H'])
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_2d_log_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=['FL1-H', 'FL3-H'],
amplification_type=[(4, 0.01), (2, 1)],
resolution=[512, 2048])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
0.01*10**(self.d[:,'FL1-H']/128.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'],
10**(self.d[:,'FL3-H']/1024.))
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_2d_mixed_1(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=[2, 4],
amplification_type=[(4, 0.01), (0, 0)],
resolution=[512, 1024])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
0.01*10**(self.d[:,'FL1-H']/128.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'], self.d[:,'FL3-H'])
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_2d_mixed_2(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=[2, 4],
amplification_type=[(4, 0.01), (0, 0)],
amplifier_gain=[5., None],
resolution=[512, 1024])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
0.01*10**(self.d[:,'FL1-H']/128.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'], self.d[:,'FL3-H'])
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_2d_mixed_3(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=[2, 4],
amplification_type=[(4, 0.01), (0, 0)],
amplifier_gain=[5., 10.],
resolution=[512, 1024])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
0.01*10**(self.d[:,'FL1-H']/128.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'], self.d[:,'FL3-H']/10.)
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_2d_defaults(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=['FL1-H', 'FL3-H'])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
10**(self.d[:,'FL1-H']/256.0))
np.testing.assert_array_equal(dt[:,'FL2-H'], self.d[:,'FL2-H'])
np.testing.assert_array_equal(dt[:,'FL3-H'],
10**(self.d[:,'FL3-H']/256.))
np.testing.assert_array_equal(dt[:,'Time'], self.d[:,'Time'])
def test_rfi_2d_range(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=['FL1-H', 'FL3-H'],
resolution=[512, 2048],
amplification_type=[(4, 0.01), (2, 1)])
np.testing.assert_array_equal(
dt.range('FSC-H'),
self.d.range('FSC-H'))
np.testing.assert_array_equal(
dt.range('SSC-H'),
self.d.range('SSC-H'))
np.testing.assert_array_equal(
dt.range('FL1-H'),
[0.01*10**(r/128.0) for r in self.d.range('FL1-H')])
np.testing.assert_array_equal(
dt.range('FL2-H'),
self.d.range('FL2-H'))
np.testing.assert_array_equal(
dt.range('FL3-H'),
[10**(r/1024.0) for r in self.d.range('FL3-H')])
def test_rfi_default_channel(self):
# Leave time channel out
channels = ['FSC-H', 'SSC-H', 'FL1-H', 'FL2-H', 'FL3-H']
dt = FlowCal.transform.to_rfi(self.d[:, channels])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'FL1-H'],
10**(self.d[:,'FL1-H']/256.0))
np.testing.assert_array_equal(dt[:,'FL2-H'],
10**(self.d[:,'FL2-H']/256.0))
np.testing.assert_array_equal(dt[:,'FL3-H'],
10**(self.d[:,'FL3-H']/256.0))
class TestRFIFCSLinear(unittest.TestCase):
def setUp(self):
self.channel_names = ['FSC-A',
'FSC-H',
'FSC-W',
'SSC-A',
'SSC-H',
'SSC-W',
'FSC PMT-A',
'FSC PMT-H',
'FSC PMT-W',
'GFP-A',
'GFP-H',
'mCherry-A',
'mCherry-H',
'Time']
current_dir = os.path.abspath(__file__).replace(__file__, '') + os.path.sep
self.d = FlowCal.io.FCSData(current_dir + 'Data004.fcs')
self.n_samples = self.d.shape[0]
def test_rfi_original_integrity(self):
db = self.d.copy()
dt = FlowCal.transform.to_rfi(self.d,
channels=['FSC-H', 'SSC-H'])
np.testing.assert_array_equal(self.d, db)
def test_rfi_arg_error_amplification_type_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(4,1), (4,1), (4,1)])
def test_rfi_arg_error_resolution_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(4,1), (4,1)],
resolution=[1024])
def test_rfi_arg_error_amplifier_gain_length(self):
with self.assertRaises(ValueError):
FlowCal.transform.to_rfi(self.d,
channels=[0,1],
amplification_type=[(0,0), (0,0)],
amplifier_gain=[3,4,4])
def test_rfi_1d_defaults(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=['GFP-A', 'mCherry-A', 'Time'])
np.testing.assert_array_equal(dt[:,'FSC-A'], self.d[:,'FSC-A'])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'FSC-W'], self.d[:,'FSC-W'])
np.testing.assert_array_equal(dt[:,'SSC-A'], self.d[:,'SSC-A'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-W'], self.d[:,'SSC-W'])
np.testing.assert_array_equal(dt[:,'FSC PMT-A'], self.d[:,'FSC PMT-A'])
np.testing.assert_array_equal(dt[:,'FSC PMT-H'], self.d[:,'FSC PMT-H'])
np.testing.assert_array_equal(dt[:,'FSC PMT-W'], self.d[:,'FSC PMT-W'])
np.testing.assert_array_equal(dt[:,'GFP-A'], self.d[:,'GFP-A'])
np.testing.assert_array_equal(dt[:,'GFP-H'], self.d[:,'GFP-H'])
np.testing.assert_array_equal(dt[:,'mCherry-A'], self.d[:,'mCherry-A'])
np.testing.assert_array_equal(dt[:,'mCherry-H'], self.d[:,'mCherry-H'])
np.testing.assert_array_almost_equal(dt[:,'Time'],
self.d[:,'Time']/0.01,
decimal=3)
def test_rfi_1d_defaults_2(self):
dt = FlowCal.transform.to_rfi(self.d,
channels=['GFP-A', 'mCherry-A', 'Time'],
amplifier_gain=[2., 1., 0.01])
np.testing.assert_array_equal(dt[:,'FSC-A'], self.d[:,'FSC-A'])
np.testing.assert_array_equal(dt[:,'FSC-H'], self.d[:,'FSC-H'])
np.testing.assert_array_equal(dt[:,'FSC-W'], self.d[:,'FSC-W'])
np.testing.assert_array_equal(dt[:,'SSC-A'], self.d[:,'SSC-A'])
np.testing.assert_array_equal(dt[:,'SSC-H'], self.d[:,'SSC-H'])
np.testing.assert_array_equal(dt[:,'SSC-W'], self.d[:,'SSC-W'])
np.testing.assert_array_equal(dt[:,'FSC PMT-A'], self.d[:,'FSC PMT-A'])
np.testing.assert_array_equal(dt[:,'FSC PMT-H'], self.d[:,'FSC PMT-H'])
np.testing.assert_array_equal(dt[:,'FSC PMT-W'], self.d[:,'FSC PMT-W'])
np.testing.assert_array_equal(dt[:,'GFP-A'], self.d[:,'GFP-A']/2.)
np.testing.assert_array_equal(dt[:,'GFP-H'], self.d[:,'GFP-H'])
np.testing.assert_array_equal(dt[:,'mCherry-A'], self.d[:,'mCherry-A'])
np.testing.assert_array_equal(dt[:,'mCherry-H'], self.d[:,'mCherry-H'])
np.testing.assert_array_almost_equal(dt[:,'Time'],
self.d[:,'Time']/0.01,
decimal=3)
class TestMefArray(unittest.TestCase):
def setUp(self):
self.d = np.array([
[1, 7, 2],
[2, 8, 3],
[3, 9, 4],
[4, 10, 5],
[5, 1, 6],
[6, 2, 7],
[7, 3, 8],
[8, 4, 9],
[9, 5, 10],
[10, 6, 1],
])
self.sc0 = lambda x: x + 10
self.sc1 = lambda x: x**2
self.sc2 = lambda x: np.log(x)
def test_mef_length_error(self):
self.assertRaises(ValueError, FlowCal.transform.to_mef,
self.d, 1, [self.sc1], [1,2])
def test_mef_channel_error(self):
self.assertRaises(ValueError, FlowCal.transform.to_mef,
self.d, 0, [self.sc1, self.sc2], [1,2])
def test_mef_1d_1(self):
dt = FlowCal.transform.to_mef(
self.d, 1, [self.sc1, self.sc2], [1,2])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1]**2.)
np.testing.assert_array_equal(dt[:,2], self.d[:,2])
def test_mef_1d_2(self):
dt = FlowCal.transform.to_mef(
self.d, 2, [self.sc1, self.sc2], [1,2])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1])
np.testing.assert_array_equal(dt[:,2], np.log(self.d[:,2]))
def test_mef_2d(self):
dt = FlowCal.transform.to_mef(
self.d, [1,2], [self.sc1, self.sc2], [1,2])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1]**2.)
np.testing.assert_array_equal(dt[:,2], np.log(self.d[:,2]))
def test_mef_default_channel(self):
dt = FlowCal.transform.to_mef(
self.d, None, [self.sc1, self.sc2], [1,2])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1]**2.)
np.testing.assert_array_equal(dt[:,2], np.log(self.d[:,2]))
def test_mef_default_sc_channel(self):
dt = FlowCal.transform.to_mef(
self.d, None, [self.sc0, self.sc1, self.sc2], None)
np.testing.assert_array_equal(dt[:,0], self.d[:,0] + 10)
np.testing.assert_array_equal(dt[:,1], self.d[:,1]**2.)
np.testing.assert_array_equal(dt[:,2], np.log(self.d[:,2]))
def test_mef_default_sc_channel_error(self):
self.assertRaises(ValueError, FlowCal.transform.to_mef,
self.d, None, [self.sc1, self.sc2], None)
class TestMefFCS(unittest.TestCase):
def setUp(self):
self.channel_names = ['FSC-H', 'SSC-H', 'FL1-H',
'FL2-H', 'FL3-H', 'Time']
current_dir = os.path.dirname(os.path.abspath(__file__)) + os.path.sep
self.d = FlowCal.io.FCSData(current_dir + 'Data001.fcs')
self.n_samples = self.d.shape[0]
self.sc0 = lambda x: x + 10
self.sc1 = lambda x: x**2
self.sc2 = lambda x: np.log(x + 1)
def test_mef_length_error(self):
self.assertRaises(ValueError, FlowCal.transform.to_mef,
self.d, 'FL1-H', [self.sc1], ['FL1-H','FL3-H'])
def test_mef_channel_error(self):
self.assertRaises(ValueError, FlowCal.transform.to_mef,
self.d, 'FSC-H', [self.sc1, self.sc2], ['FL1-H','FL3-H'])
def test_mef_1d_1(self):
dt = FlowCal.transform.to_mef(
self.d, 'FL1-H', [self.sc1, self.sc2], ['FL1-H','FL3-H'])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1])
np.testing.assert_array_equal(dt[:,2], self.d[:,2]**2.)
np.testing.assert_array_equal(dt[:,3], self.d[:,3])
np.testing.assert_array_equal(dt[:,4], self.d[:,4])
def test_mef_1d_2(self):
dt = FlowCal.transform.to_mef(
self.d, 'FL3-H', [self.sc1, self.sc2], ['FL1-H','FL3-H'])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1])
np.testing.assert_array_equal(dt[:,2], self.d[:,2])
np.testing.assert_array_equal(dt[:,3], self.d[:,3])
np.testing.assert_array_equal(dt[:,4],
np.log(self.d[:,4].astype(np.float64) + 1))
def test_mef_2d(self):
dt = FlowCal.transform.to_mef(
self.d, ['FL1-H','FL3-H'], [self.sc1, self.sc2], ['FL1-H','FL3-H'])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1])
np.testing.assert_array_equal(dt[:,2], self.d[:,2]**2.)
np.testing.assert_array_equal(dt[:,3], self.d[:,3])
np.testing.assert_array_equal(dt[:,4],
np.log(self.d[:,4].astype(np.float64) + 1))
def test_mef_default_channel(self):
dt = FlowCal.transform.to_mef(
self.d, None, [self.sc1, self.sc2], ['FL1-H','FL3-H'])
np.testing.assert_array_equal(dt[:,0], self.d[:,0])
np.testing.assert_array_equal(dt[:,1], self.d[:,1])
np.testing.assert_array_equal(dt[:,2], self.d[:,2]**2.)
np.testing.assert_array_equal(dt[:,3], self.d[:,3])
np.testing.assert_array_equal(dt[:,4],
np.log(self.d[:,4].astype(np.float64) + 1))
def test_mef_bins_channels(self):
dt = FlowCal.transform.to_mef(
self.d, ['FL1-H','FL3-H'], [self.sc1, self.sc2], ['FL1-H','FL3-H'])
vit = [dt.range(i) for i in range(5)]
vo = [np.array([0, 1023]),
np.array([0, 1023]),
np.array([0, 1023])**2,
np.array([0, 1023]),
np.log(np.array([0, 1023]) + 1),
]
np.testing.assert_array_equal(vit, vo)
if __name__ == '__main__':
unittest.main()
from helper import run, config
def test_no_config():
cp = run('')
assert cp.returncode != 0
assert not cp.stdout
assert (
cp.stderr == 'Auth configuration not supplied!\n' or
cp.stderr == 'Labels configuration not supplied!\n' or
'Missing option' in cp.stderr
)
def test_no_auth_config():
cp = run(f'--config-labels "{config("labels.empty.cfg")}"')
assert cp.returncode != 0
assert not cp.stdout
assert (
cp.stderr == 'Auth configuration not supplied!\n' or
'Missing option' in cp.stderr
)
def test_unusable_auth_config():
cp = run(f'--config-auth "{config("empty_file.cfg")}" '
f'--config-labels "{config("labels.empty.cfg")}"')
assert cp.returncode != 0
assert not cp.stdout
assert cp.stderr == 'Auth configuration not usable!\n'
def test_no_labels_config():
cp = run(f'--config-auth "{config("auth.fff.cfg")}"')
assert cp.returncode != 0
assert not cp.stdout
assert (
cp.stderr == 'Labels configuration not supplied!\n' or
'Missing option' in cp.stderr
)
def test_unusable_labels_config():
cp = run(f'--config-labels "{config("empty_file.cfg")}" '
f'--config-auth "{config("auth.fff.cfg")}"')
assert cp.returncode != 0
assert not cp.stdout
assert cp.stderr == 'Labels configuration not usable!\n'
def test_invalid_reposlug():
cp = run(f'--config-labels "{config("labels.empty.cfg")}" '
f'--config-auth "{config("auth.fff.cfg")}" '
'foobar')
assert cp.returncode != 0
assert not cp.stdout
assert cp.stderr == 'Reposlug foobar not valid!\n'
def test_invalid_second_reposlug():
cp = run(f'--config-labels "{config("labels.empty.cfg")}" '
f'--config-auth "{config("auth.fff.cfg")}" '
'xyz/abc foobar')
assert cp.returncode != 0
assert not cp.stdout
assert cp.stderr == 'Reposlug foobar not valid!\n'
] | null | null | null | """
===============
Deep CCA (DCCA)
===============
"""
import numpy as np
from mvlearn.embed import DCCA
from mvlearn.datasets import GaussianMixture
from mvlearn.plotting import crossviews_plot
from mvlearn.model_selection import train_test_split
###############################################################################
# Polynomial-Transformed Latent Correlation
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# Latent variables are sampled from two multivariate Gaussians with equal
# prior probability. Then a polynomial transformation is applied and noise is
# added independently to both the transformed and untransformed latents.
n_samples = 2000
means = [[0, 1], [0, -1]]
covariances = [np.eye(2), np.eye(2)]
gm = GaussianMixture(n_samples, means, covariances, random_state=42,
shuffle=True, shuffle_random_state=42)
latent, y = gm.get_Xy(latents=True)
# Plot latent data against itself to reveal the underlying distribution.
crossviews_plot([latent, latent], labels=y,
title='Latent Variable', equal_axes=True)
# Split data into train and test sets
Xs, y = gm.sample_views(transform='poly', n_noise=2).get_Xy()
Xs_train, Xs_test, y_train, y_test = train_test_split(Xs, y, test_size=0.3,
random_state=42)
# Plot the testing data after polynomial transformation
crossviews_plot(Xs_test, labels=y_test,
title='Testing Data View 1 vs. View 2 '
'(Polynomial Transform + noise)',
equal_axes=True)
###############################################################################
# Fit DCCA model to uncover latent distribution
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# The output dimensionality is still 4.
# Define parameters and layers for deep model
features1 = Xs_train[0].shape[1] # Feature sizes
features2 = Xs_train[1].shape[1]
layers1 = [256, 256, 4] # nodes in each hidden layer and the output size
layers2 = [256, 256, 4]
dcca = DCCA(input_size1=features1, input_size2=features2, n_components=4,
layer_sizes1=layers1, layer_sizes2=layers2, epoch_num=500)
dcca.fit(Xs_train)
Xs_transformed = dcca.transform(Xs_test)
###############################################################################
# Visualize the transformed data
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# We can see that it has uncovered the latent correlation between views.
crossviews_plot(Xs_transformed, labels=y_test,
title='Transformed Testing Data View 1 vs. View 2 '
'(Polynomial Transform + noise)',
equal_axes=True)
###############################################################################
# Sinusoidal-Transformed Latent Correlation
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# Following the same procedure as above, latent variables are sampled from two
# multivariate Gaussians with equal prior probability. This time, a sinusoidal
# transformation is applied and noise is added independently to both the
# transformed and untransformed latents.
n_samples = 2000
means = [[0, 1], [0, -1]]
covariances = [np.eye(2), np.eye(2)]
gm = GaussianMixture(n_samples, means, covariances, random_state=42,
shuffle=True, shuffle_random_state=42)
# Split data into train and test segments
Xs, y = gm.sample_views(transform='sin', n_noise=2).get_Xy()
Xs_train, Xs_test, y_train, y_test = train_test_split(Xs, y, test_size=0.3,
random_state=42)
# Plot the testing data after sinusoidal transformation
crossviews_plot(Xs_test, labels=y_test,
title='Testing Data View 1 vs. View 2 '
'(Sinusoidal Transform + noise)',
equal_axes=True)
###############################################################################
# Fit DCCA model to uncover latent distribution
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# The output dimensionality is still 4.
# Define parameters and layers for deep model
features1 = Xs_train[0].shape[1] # Feature sizes
features2 = Xs_train[1].shape[1]
layers1 = [256, 256, 4] # nodes in each hidden layer and the output size
layers2 = [256, 256, 4]
dcca = DCCA(input_size1=features1, input_size2=features2, n_components=4,
layer_sizes1=layers1, layer_sizes2=layers2, epoch_num=500)
dcca.fit(Xs_train)
Xs_transformed = dcca.transform(Xs_test)
###############################################################################
# Visualize the transformed data
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# We can see that it has uncovered the latent correlation between views.
crossviews_plot(Xs_transformed, labels=y_test,
title='Transformed Testing Data View 1 vs. View 2 '
'(Sinusoidal Transform + noise)',
equal_axes=True)
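###############################################################################
# As a self-contained complement to the plots above (not part of the original
# tutorial), the "two views share a latent variable" claim can be checked
# numerically with a plain correlation coefficient. The toy variables below
# are illustrative, not the tutorial's `Xs_transformed`:

```python
import numpy as np

rng = np.random.RandomState(42)
latent = rng.normal(size=2000)
view1 = latent + 0.1 * rng.normal(size=2000)  # noisy observation of the latent
view2 = latent + 0.1 * rng.normal(size=2000)  # independent noisy observation
r = np.corrcoef(view1, view2)[0, 1]
print(r > 0.9)  # the shared latent induces strong cross-view correlation
```

The same `np.corrcoef` check applied per output dimension of the transformed views gives a numeric counterpart to the `crossviews_plot` figures.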
import sys
import unittest
from sortedcontainers import SortedListWithKey
from greedypacker import guillotine
from greedypacker import item
from .base import BaseTestCase
from .util import stdout_redirect
class BestShortSide(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(8, 4, rotation=False, heuristic='best_shortside')
self.freeRectangle = guillotine.FreeRectangle
def tearDown(self):
del self.BIN
del self.freeRectangle
def testItemInsertionFailure(self):
"""
Single Item Fits no FreeRectangles
Split Horizontal
Rotation == False
RectMerge == False
"""
ITEM = item.Item(5, 9)
self.assertFalse(self.BIN.insert(ITEM))
def testItemInsertionSuccess(self):
"""
Single item
Split Horizontal
Rotation == False
RectMerge == False
"""
F0 = self.freeRectangle(1, 2, 0, 0)
F1 = self.freeRectangle(2, 2, 1, 0)
ITEM = item.Item(1, 1)
self.BIN.freerects = SortedListWithKey([F0, F1], key=lambda x: x.area)
self.BIN.insert(ITEM)
with self.subTest():
correct = [self.freeRectangle(1, 1, 0, 1),
self.freeRectangle(2, 2, 1, 0)]
self.assertCountEqual(self.BIN.freerects, correct)
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
def testItemInsertionSuccessRotation(self):
"""
Single item
Split Horizontal
Rotation == True
RectMerge == False
"""
F0 = self.freeRectangle(2, 1, 0, 0)
ITEM = item.Item(1, 2)
self.BIN.rotation = True
self.BIN.freerects = SortedListWithKey([F0], key=lambda x: x.area)
self.BIN.insert(ITEM)
with self.subTest():
self.assertCountEqual(self.BIN.freerects, [])
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 30)
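The expected free rectangles in these assertions follow a horizontal guillotine split: the item is placed at the chosen free rectangle's bottom-left corner, and the remaining space is cut along the item's top edge. A minimal standalone sketch of that rule — an illustration of what the tests encode, not greedypacker's own code:

```python
def split_horizontal(free_w, free_h, x, y, item_w, item_h):
    """Horizontal guillotine split: item at the free rectangle's bottom-left,
    leftover space cut along the item's top edge."""
    rects = []
    if free_w > item_w:                  # strip right of the item, item-high
        rects.append((free_w - item_w, item_h, x + item_w, y))
    if free_h > item_h:                  # full-width strip above the item
        rects.append((free_w, free_h - item_h, x, y + item_h))
    return rects

print(split_horizontal(2, 2, 0, 0, 1, 1))  # → [(1, 1, 1, 0), (2, 1, 0, 1)]
```

For example, placing a 1x1 item in the 2x2 free rectangle at (0, 0) leaves exactly the `(1, 1, 1, 0)` and `(2, 1, 0, 1)` rectangles asserted in the `BestAreaFit` insertion test.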
class BestLongSide(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(10, 5, rotation=False, heuristic='best_longside')
self.freeRectangle = guillotine.FreeRectangle
def tearDown(self):
del self.BIN
del self.freeRectangle
def testItemInsertionFailure(self):
"""
Single Item Fits no FreeRectangles
Split Horizontal
Rotation == False
RectMerge == False
"""
ITEM = item.Item(5, 11)
self.assertFalse(self.BIN.insert(ITEM))
def testItemInsertionSuccess(self):
"""
Single item
Split Horizontal
Rotation == False
RectMerge == False
"""
F0 = self.freeRectangle(1, 3, 0, 0)
F1 = self.freeRectangle(2, 1, 1, 0)
ITEM = item.Item(1, 1)
self.BIN.freerects = SortedListWithKey([F0, F1], key=lambda x: x.area)
self.BIN.insert(ITEM)
with self.subTest():
correct = [self.freeRectangle(1, 3, 0, 0),
self.freeRectangle(1, 1, 2, 0)]
self.assertCountEqual(self.BIN.freerects, correct)
with self.subTest():
self.assertEqual(ITEM.x, 1)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 49)
def testItemInsertionSuccessRotation(self):
"""
Single item
Split Horizontal
Rotation == True
RectMerge == False
"""
F0 = self.freeRectangle(2, 1, 0, 0)
ITEM = item.Item(1, 2)
self.BIN.rotation = True
self.BIN.freerects = [F0]
self.BIN.insert(ITEM)
with self.subTest():
self.assertCountEqual(self.BIN.freerects, [])
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 48)
class BestAreaFit(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(10, 5, rotation=False, heuristic='best_area')
self.freeRectangle = guillotine.FreeRectangle
def tearDown(self):
del self.BIN
del self.freeRectangle
def testItemInsertionFailure(self):
"""
Single Item Fits no FreeRectangles
Split Horizontal
Rotation == False
RectMerge == False
"""
ITEM = item.Item(5, 11)
self.assertFalse(self.BIN.insert(ITEM, 'best_area_fit'))
def testItemInsertionSuccess(self):
"""
Single item
Split Horizontal
Rotation == False
RectMerge == False
"""
F0 = self.freeRectangle(2, 2, 0, 0)
F1 = self.freeRectangle(3, 3, 2, 0)
ITEM = item.Item(1, 1)
self.BIN.freerects = SortedListWithKey([F0, F1], key=lambda x: x.area)
self.BIN.insert(ITEM)
with self.subTest():
correct = [self.freeRectangle(2, 1, 0, 1),
self.freeRectangle(1, 1, 1, 0),
self.freeRectangle(3, 3, 2, 0)]
self.assertCountEqual(self.BIN.freerects, correct)
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 49)
def testItemInsertionSuccessRotation(self):
"""
Single item
Split Horizontal
Rotation == True
RectMerge == False
"""
F0 = self.freeRectangle(2, 1, 0, 0)
ITEM = item.Item(1, 2)
self.BIN.rotation = True
self.BIN.freerects = [F0]
self.BIN.insert(ITEM)
with self.subTest():
self.assertCountEqual(self.BIN.freerects, [])
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 48)
class WorstLongSide(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(8, 4, rotation=False, heuristic='worst_longside')
self.freeRectangle = guillotine.FreeRectangle
def tearDown(self):
del self.BIN
del self.freeRectangle
def testItemInsertionFailure(self):
"""
Single Item Fits no FreeRectangles
Split Horizontal
Rotation == False
RectMerge == False
"""
ITEM = item.Item(5, 9)
self.assertFalse(self.BIN.insert(ITEM))
def testItemInsertionSuccess(self):
"""
Single item
Split Horizontal
Rotation == False
RectMerge == False
"""
F0 = self.freeRectangle(1, 3, 0, 0)
F1 = self.freeRectangle(2, 1, 1, 0)
ITEM = item.Item(1, 1)
self.BIN.freerects = SortedListWithKey([F0, F1], key=lambda x: x.area)
self.BIN.insert(ITEM)
with self.subTest():
correct = [self.freeRectangle(1, 2, 0, 1),
self.freeRectangle(2, 1, 1, 0)]
self.assertCountEqual(self.BIN.freerects, correct)
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 31)
def testItemInsertionSuccessRotation(self):
"""
Single item
Split Horizontal
Rotation == True
RectMerge == False
"""
F0 = self.freeRectangle(2, 1, 0, 0)
ITEM = item.Item(1, 2)
self.BIN.rotation = True
self.BIN.freerects = [F0]
self.BIN.insert(ITEM)
with self.subTest():
self.assertCountEqual(self.BIN.freerects, [])
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 30)
class WorstShortSide(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(8, 4, rotation=False, heuristic='worst_shortside')
self.freeRectangle = guillotine.FreeRectangle
def tearDown(self):
del self.BIN
del self.freeRectangle
def testItemInsertionFailure(self):
"""
Single Item Fits no FreeRectangles
Split Horizontal
Rotation == False
RectMerge == False
"""
ITEM = item.Item(5, 9)
self.assertFalse(self.BIN.insert(ITEM))
def testItemInsertionSuccess(self):
"""
Single item
Split Horizontal
Rotation == False
RectMerge == False
"""
F0 = self.freeRectangle(1, 2, 0, 0)
F1 = self.freeRectangle(2, 2, 1, 0)
ITEM = item.Item(1, 1)
self.BIN.freerects = SortedListWithKey([F0, F1], key=lambda x: x.area)
self.BIN.insert(ITEM)
with self.subTest():
correct = [self.freeRectangle(1, 2, 0, 0),
self.freeRectangle(2, 1, 1, 1),
self.freeRectangle(1, 1, 2, 0)]
self.assertCountEqual(self.BIN.freerects, correct)
with self.subTest():
self.assertEqual(ITEM.x, 1)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 31)
def testItemInsertionSuccessRotation(self):
"""
Single item
Split Horizontal
Rotation == True
RectMerge == False
"""
F0 = self.freeRectangle(2, 1, 0, 0)
ITEM = item.Item(1, 2)
self.BIN.rotation = True
self.BIN.freerects = [F0]
self.BIN.insert(ITEM)
with self.subTest():
self.assertCountEqual(self.BIN.freerects, [])
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 30)
class WorstAreaFit(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(10, 5, rotation=False, heuristic='worst_area')
self.freeRectangle = guillotine.FreeRectangle
def tearDown(self):
del self.BIN
del self.freeRectangle
def testItemInsertionFailure(self):
"""
Single Item Fits no FreeRectangles
Split Horizontal
Rotation == False
RectMerge == False
"""
ITEM = item.Item(5, 11)
self.assertFalse(self.BIN.insert(ITEM, 'worst_area_fit'))
def testItemInsertionSuccess(self):
"""
Single item
Split Horizontal
Rotation == False
RectMerge == False
"""
F0 = self.freeRectangle(2, 2, 0, 0)
F1 = self.freeRectangle(3, 3, 2, 0)
ITEM = item.Item(1, 1)
self.BIN.freerects = SortedListWithKey([F0, F1], key=lambda x: x.area)
self.BIN.insert(ITEM)
with self.subTest():
correct = [self.freeRectangle(2, 2, 0, 0),
self.freeRectangle(2, 1, 3, 0),
self.freeRectangle(3, 2, 2, 1)]
self.assertCountEqual(self.BIN.freerects, correct)
with self.subTest():
self.assertEqual(ITEM.x, 2)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 49)
def testItemInsertionSuccessRotation(self):
"""
Single item
Split Horizontal
Rotation == True
RectMerge == False
"""
F0 = self.freeRectangle(2, 1, 0, 0)
ITEM = item.Item(1, 2)
self.BIN.rotation = True
self.BIN.freerects = [F0]
self.BIN.insert(ITEM)
with self.subTest():
self.assertCountEqual(self.BIN.freerects, [])
with self.subTest():
self.assertEqual(ITEM.x, 0)
self.assertEqual(ITEM.y, 0)
with self.subTest():
self.assertEqual(self.BIN.items, [ITEM])
with self.subTest():
self.assertEqual(self.BIN.free_area, 48)
class RectMerge(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(10, 5, rotation=False, heuristic='best_area')
self.freeRectangle = guillotine.FreeRectangle
self.BIN.rMerge = True
def tearDown(self):
del self.BIN
del self.freeRectangle
def testMatchingWidths(self):
"""
Two item
Split Horizontal
Rotation == False
RectMerge == True
"""
ITEM = item.Item(4, 2)
ITEM2 = item.Item(4, 3)
self.BIN.insert(ITEM, 'best_area')
self.BIN.insert(ITEM2, 'best_area')
self.assertEqual(self.BIN.freerects, [self.freeRectangle(6, 5, 4, 0)])
class BinStats(BaseTestCase):
def setUp(self):
self.BIN = guillotine.Guillotine(10, 5, rotation=False, heuristic='best_area')
def tearDown(self):
del self.BIN
def testReturn(self):
ITEM = item.Item(4, 2)
ITEM2 = item.Item(2, 2)
self.BIN.insert(ITEM, 'best_area')
self.BIN.insert(ITEM2, 'best_area')
correct = {
'width': 10,
'height': 5,
'area': 50,
'efficiency': 0.24,
'items': [ITEM, ITEM2],
}
self.assertEqual(self.BIN.bin_stats(), correct)
def load_tests(loader, tests, pattern):
suite = unittest.TestSuite()
if pattern is None:
suite.addTests(loader.loadTestsFromTestCase(BestShortSide))
suite.addTests(loader.loadTestsFromTestCase(BestLongSide))
suite.addTests(loader.loadTestsFromTestCase(BestAreaFit))
suite.addTests(loader.loadTestsFromTestCase(WorstShortSide))
suite.addTests(loader.loadTestsFromTestCase(WorstLongSide))
suite.addTests(loader.loadTestsFromTestCase(WorstAreaFit))
suite.addTests(loader.loadTestsFromTestCase(RectMerge))
suite.addTests(loader.loadTestsFromTestCase(BinStats))
else:
tests = loader.loadTestsFromName(pattern,
module=sys.modules[__name__])
failedTests = [t for t in tests._tests
if type(t) == unittest.loader._FailedTest]
if len(failedTests) == 0:
suite.addTests(tests)
return suite
import numpy as np
import matplotlib.pyplot as plt
import copy
from matplotlib import rc
import pdb
rc('font',**{'family':'sans-serif','sans-serif':['Helvetica']})
rc('text', usetex=True)
it = 6
iterationTime = []
P = 10
print "Number of points used: ", P
# =========================================================
# Plot closed-loop
# =========================================================
# 0 = blue, 1 = orange, 2 = green, 3 = red, 4 = purple
colorMap = ["#1f77b4", "#ff7f0e", "#2ca02c", "#d62728", "#9467bd", "#8c564b", "#e377c2", "#7f7f7f", "#bcbd22", "#17becf"]
# colorMap = ["#3182bd", "#fd8d3c", "#31a354", "#e6550d", "#756bb1"]
xFeasible = np.loadtxt('storedData/closedLoopFeasible.txt')
plt.figure()
plt.plot(xFeasible[0,:], xFeasible[1,:], '-d', color=colorMap[2], label='Feasible trajectory')
iterationTime.append(xFeasible.shape[1]-1) # Store time to reach xf
print(xFeasible)
xit = []
for i in range(1,it):
xcl = np.loadtxt('storedData/closedLoopIteration'+str(i)+'_P_'+str(P)+'.txt')
xcl = xcl.T
xit.append(copy.copy(xcl))
plt.plot(xcl[0,:], xcl[1,:], 's', color=colorMap[3])
iterationTime.append(xcl.shape[1]-1) # Store time to reach xf
plt.plot(0, 0, 's', color=colorMap[3], label='Stored data')
xcl = np.loadtxt('storedData/closedLoopIteration'+str(it)+'_P_'+str(P)+'.txt')
xcl = xcl.T
iterationTime.append(xcl.shape[1]-1) # Store time to reach xf
xit.append(copy.copy(xcl))
plt.plot(xcl[0,:], xcl[1,:], 's', color=colorMap[3])
plt.plot(xcl[0,:], xcl[1,:], '-o', color=colorMap[0], label='LMPC closed-loop at '+str(it)+'th iteration')
print(iterationTime)
x_obs = []
y_obs = []
for i in np.linspace(0,2*np.pi,1000):
x_obs.append(27 + 8*np.cos(i))
y_obs.append(-1 + 6*np.sin(i))
plt.plot(x_obs, y_obs, '-k', label='Obstacle')
plt.xlabel('$x$', fontsize=20)
plt.ylabel('$y$', fontsize=20)
plt.legend(loc = 'best')
# plt.ylim(-10,50)
# plt.xlim(-1,60)
# =========================================================
# Plot velocity and acceleration
# =========================================================
i = it
ucl = np.loadtxt('storedData/inputIteration'+str(i)+'_P_'+str(P)+'.txt')
xcl = np.loadtxt('storedData/closedLoopIteration'+str(i)+'_P_'+str(P)+'.txt')
plt.figure()
plt.subplot(2, 1, 1)
plt.plot(xcl[:,2], '-o', color=colorMap[0])
plt.ylabel('$\mathrm{Velocity}$', fontsize=20)
plt.subplot(2, 1, 2)
plt.plot(ucl[:,1], '-o', color=colorMap[0])
plt.plot([0,ucl.shape[0]-1],[1,1], '--k', label='Saturation limit')
plt.plot([0,ucl.shape[0]-1],[-1,-1], '--k')
plt.xlabel('$\mathrm{Time~Step}$', fontsize=20)
plt.ylabel('$\mathrm{Acceleration}$', fontsize=20)
plt.legend()
# =========================================================
# Plot velocity
# =========================================================
xFeasible = np.loadtxt('storedData/closedLoopFeasible.txt')
plt.figure()
plt.plot(xFeasible[0,:], xFeasible[2,:], '-d', color=colorMap[2], label='Feasible trajectory')
xit = []
for i in range(1,it+1):
xcl = np.loadtxt('storedData/closedLoopIteration'+str(i)+'_P_'+str(P)+'.txt')
xcl = xcl.T
xit.append(copy.copy(xcl))
plt.plot(xcl[0,:], xcl[2,:], 's', color=colorMap[3])
plt.plot(0, 0, 's', color=colorMap[3], label='Stored data')
plt.plot(xcl[0,:], xcl[2,:], '-o', color=colorMap[0], label='LMPC closed-loop')
plt.xlabel('$z$', fontsize=20)
plt.ylabel('$\mathrm{velocity}$', fontsize=20)
plt.legend()
# =========================================================
# Plot inputs
# =========================================================
i = it
ucl = np.loadtxt('storedData/inputIteration'+str(i)+'_P_'+str(P)+'.txt')
plt.figure()
plt.subplot(2, 1, 1)
plt.plot(ucl[:,0], '-o', color=colorMap[0])
plt.ylabel('$\mathrm{Steering}$', fontsize=20)
plt.subplot(2, 1, 2)
plt.plot(ucl[:,1], '-o', color=colorMap[0])
plt.plot([0,ucl.shape[0]-1],[1,1], '--k', label='Saturation limit')
plt.plot([0,ucl.shape[0]-1],[-1,-1], '--k')
plt.xlabel('$\mathrm{Time~Step}$', fontsize=20)
plt.ylabel('$\mathrm{Acceleration}$', fontsize=20)
plt.legend()
plt.figure()
mat = np.loadtxt('storedData/meanTimeLMPC'+'_P_'+str(P)+'.txt')
compTime = mat[:,0].tolist()
if P == "all":
plt.plot(range(1,len(compTime)+1), compTime, '--d', label=labelLMPC)
else:
plt.plot(range(1,len(compTime)+1), compTime, '-ob', label='${P =}$'+str(P))
plt.xlabel('$\mathrm{Iteration~}j$', fontsize=20)
plt.ylabel('$\mathrm{Mean~Computational~Time~[s]}$', fontsize=20)
plt.legend()
# =========================================================
# Run Comparison
# =========================================================
input = raw_input("Do you want to run comparison for different values of P and l? [y/n] ")
# =========================================================
# Plot inputs
# =========================================================
l = [1, 2, 3, 10]
P = [8, 10, 40, 'all']
labelLMPC = 'LMPC from [21]'
pltMarker = ['-o', '--s','-d','--v']
if input == 'y':
i = it
plt.figure()
plt.subplot(2, 1, 1)
for i in range(0,len(P)):
ucl = np.loadtxt('storedData/inputIteration'+str(it)+'_P_'+str(P[i])+'.txt')
if P[i] == 'all':
plt.plot(ucl[:,0], '-o', label=labelLMPC)
else:
plt.plot(ucl[:,0], '-o', label="LMPC closed-loop for P = "+str(P[i])+", i="+str(l[i]))
plt.ylabel('$\mathrm{Steering}$', fontsize=20)
plt.legend()
plt.subplot(2, 1, 2)
for i in range(0,len(P)):
ucl = np.loadtxt('storedData/inputIteration'+str(it)+'_P_'+str(P[i])+'.txt')
        plt.plot(ucl[:,1], '-o')
plt.plot([0,ucl.shape[0]-1],[1,1], '--k', label='Saturation limit')
plt.plot([0,ucl.shape[0]-1],[-1,-1], '--k')
plt.xlabel('$\mathrm{Time~Step}$', fontsize=20)
plt.ylabel('$\mathrm{Acceleration}$', fontsize=20)
plt.legend()
# =========================================================
# Plot acceleration and velocities
# =========================================================
labelLMPC = 'LMPC from [26]'
pltMarker = ['-o', '--s','-d','--v']
if input == 'y':
i = it
plt.figure()
plt.subplot(2, 1, 1)
for i in range(0,len(P)):
xcl = np.loadtxt('storedData/closedLoopIteration'+str(it)+'_P_'+str(P[i])+'.txt')
if P[i] == 'all':
plt.plot(xcl[:,2], '-o', label=labelLMPC)
else:
plt.plot(xcl[:,2], '-o', label="LMPC closed-loop for P = "+str(P[i])+", i="+str(l[i]))
    plt.ylabel('$\mathrm{Velocity}$', fontsize=20)
plt.subplot(2, 1, 2)
for i in range(0,len(P)):
ucl = np.loadtxt('storedData/inputIteration'+str(it)+'_P_'+str(P[i])+'.txt')
        plt.plot(ucl[:,1], '-o', label="LMPC closed-loop for P = "+str(P[i])+", i="+str(l[i]))
plt.plot([0,ucl.shape[0]-1],[1,1], '--k', label='Saturation limit')
plt.plot([0,ucl.shape[0]-1],[-1,-1], '--k')
plt.xlabel('$\mathrm{Time~Step}$', fontsize=20)
plt.ylabel('$\mathrm{Acceleration}$', fontsize=20)
plt.legend()
# =========================================================
# Closed-loop comparison (X-Y)
# =========================================================
xFeasible = np.loadtxt('storedData/closedLoopFeasible.txt')
plt.figure()
plt.plot(xFeasible[0,:], xFeasible[1,:], '-dk', label='Feasible trajectory')
iterationTime.append(xFeasible.shape[1]-1) # Store time to reach xf
for i in range(0, len(P)):
xcl = np.loadtxt('storedData/closedLoopIteration'+str(it)+'_P_'+str(P[i])+'.txt')
xcl = xcl.T
if P[i] == "all":
plt.plot(xcl[0,:], xcl[1,:], '--d', label=labelLMPC)
else:
plt.plot(xcl[0,:], xcl[1,:], '-o', label="LMPC closed-loop for P = "+str(P[i])+", i="+str(l[i]))
x_obs = []
y_obs = []
for i in np.linspace(0,2*np.pi,1000):
x_obs.append(27 + 8*np.cos(i))
y_obs.append(-1 + 6*np.sin(i))
plt.plot(x_obs, y_obs, '-k', label='Obstacle')
plt.xlabel('$x$', fontsize=20)
plt.ylabel('$y$', fontsize=20)
plt.legend()
# =========================================================
# Time and computational cost
# =========================================================
plt.figure()
for i in range(0,len(P)):
mat = np.loadtxt('storedData/meanTimeLMPC'+'_P_'+str(P[i])+'.txt')
cost = mat[:,1].tolist()
cost.insert(0, 39)
if P[i] == "all":
plt.plot(range(0,len(cost)), cost, '--d', label=labelLMPC)
else:
plt.plot(range(0,len(cost)), cost, '-o', label='${P =}$'+str(P[i])+', ${i =}$'+str(l[i]))
plt.xlabel('$\mathrm{Iteration~}j$', fontsize=20)
plt.ylabel('$\mathrm{Time~Steps~}T^j$', fontsize=20)
plt.legend()
plt.figure()
for i in range(0, len(P)):
mat = np.loadtxt('storedData/meanTimeLMPC'+'_P_'+str(P[i])+'.txt')
compTime = mat[:,0].tolist()
if P[i] == "all":
plt.plot(range(1,len(compTime)+1), compTime, '--d', label=labelLMPC)
else:
plt.plot(range(1,len(compTime)+1), compTime, '-o', label='${P =}$'+str(P[i])+', ${i =}$'+str(l[i]))
plt.xlabel('$\mathrm{Iteration~}j$', fontsize=20)
plt.ylabel('$\mathrm{Mean~Computational~Time~[s]}$', fontsize=20)
plt.legend()
plt.show()
| 33.549242 | 121 | 0.549283 | 1,282 | 8,857 | 3.762871 | 0.129485 | 0.055141 | 0.064677 | 0.014925 | 0.843076 | 0.836235 | 0.807629 | 0.730307 | 0.708126 | 0.707919 | 0 | 0.037289 | 0.10376 | 8,857 | 263 | 122 | 33.676806 | 0.570421 | 0.176471 | 0 | 0.717391 | 0 | 0 | 0.240999 | 0.100979 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.027174 | null | null | 0.016304 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
accb0bac3980bc1f47d20f31081611b5363ac76c | 17,990 | py | Python | collection_modules/btleCollectionPoint/rfid/rfid_device.py | maxakuru/SimpleSensor | 655d10ebed5eddb892d036012cb12ccd6b460d2d | [
"Apache-2.0"
] | null | null | null | collection_modules/btleCollectionPoint/rfid/rfid_device.py | maxakuru/SimpleSensor | 655d10ebed5eddb892d036012cb12ccd6b460d2d | [
"Apache-2.0"
] | null | null | null | collection_modules/btleCollectionPoint/rfid/rfid_device.py | maxakuru/SimpleSensor | 655d10ebed5eddb892d036012cb12ccd6b460d2d | [
"Apache-2.0"
] | null | null | null | import serial
import time
import logging
from rssiInventoryResponse import RssiInventoryResponse
class RfidDevice(object):
    """Serial-port interface to a UHF RFID reader module."""
    defaultReadWait = .35  # seconds to wait before reading back a response
def __init__(self,port='/dev/ttyAMA0',speed=9600):
logging.config.fileConfig("logging.conf")
self.logger = logging.getLogger("rfid_device.RfidDevice")
self.port = port
self.portSpeed=speed
self.connection = None
self.frequencies = {}
usFrequencies = {}
usFrequencies['906.8'] = r'\x41\x40\x04\xf8\xef\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['926.6'] = r'\x41\x40\x04\x88\x23\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['917.6'] = r'\x41\x40\x04\x60\x00\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['905'] = r'\x41\x40\x04\x28\xcf\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['919.4'] = r'\x41\x40\x04\x68\x07\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['920.6'] = r'\x41\x40\x04\x18\x0c\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['903.8'] = r'\x41\x40\x04\x78\xca\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['923.6'] = r'\x41\x40\x04\xd0\x17\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['907.4'] = r'\x41\x40\x04\x88\xd8\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['915.2'] = r'\x41\x40\x04\x00\xf7\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['917'] = r'\x41\x40\x04\x08\xfe\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['927.2'] = r'\x41\x40\x04\xe0\x25\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['920'] = r'\x41\x40\x04\xc0\x09\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['921.8'] = r'\x41\x40\x04\xc8\x10\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['902'] = r'\x41\x40\x04\x70\xc3\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['918.8'] = r'\x41\x40\x04\x10\x05\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['905.6'] = r'\x41\x40\x04\x80\xd1\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['912.8'] = r'\x41\x40\x04\xa0\xed\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['909.2'] = r'\x41\x40\x04\x90\xdf\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['915.8'] = r'\x41\x40\x04\x58\xf9\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['911.6'] = r'\x41\x40\x04\xf0\xe8\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['925.4'] = r'\x41\x40\x04\xd8\x1e\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['923'] = r'\x41\x40\x04\x78\x15\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['911'] = r'\x41\x40\x04\x98\xe6\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['918.2'] = r'\x41\x40\x04\xb8\x02\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['926'] = r'\x41\x40\x04\x30\x21\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['906.2'] = r'\x41\x40\x04\xd8\xd3\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['902.6'] = r'\x41\x40\x04\xc8\xc5\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['914'] = r'\x41\x40\x04\x50\xf2\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['904.4'] = r'\x41\x40\x04\xd0\xcc\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['909.8'] = r'\x41\x40\x04\xe8\xe1\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['914.6'] = r'\x41\x40\x04\xa8\xf4\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['908'] = r'\x41\x40\x04\xe0\xda\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['924.2'] = r'\x41\x40\x04\x28\x1a\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['927.8'] = r'\x41\x40\x04\x38\x28\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['912.2'] = r'\x41\x40\x04\x48\xeb\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['910.4'] = r'\x41\x40\x04\x40\xe4\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['921.2'] = r'\x41\x40\x04\x70\x0e\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['908.6'] = r'\x41\x40\x04\x38\xdd\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['903.2'] = r'\x41\x40\x04\x20\xc8\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['916.4'] = r'\x41\x40\x04\xb0\xfb\x0d\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['922.4'] = r'\x41\x40\x04\x20\x13\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
usFrequencies['924.8'] = r'\x41\x40\x04\x80\x1c\x0e\xd8\x00\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd\xcd'
self.frequencies['us'] = usFrequencies
powerLevels = dict()
powerLevels['-91'] = r'\x59\x40\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\xa6\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
powerLevels['-71'] = r'\x59\x40\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\xb9\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
#Commands
self.commands = dict()
self.commands['HARDWARE_VERSION'] = r'\x10\x03\x01'
self.commands['SOFTWARE_VERSION'] = r'\x10\x03\x00'
self.commands['ATTENNA_POWER_OFF'] = r'\x18\x03\x00'
self.commands['ATTENNA_POWER_ON'] = r'\x18\x03\xFF'
self.commands['USA_FREQUENCY_BASE'] = r'\x41\x40\x04'
self.commands['EUROPE_FREQUENCY'] = r''
self.commands['TAG_INVENTORY_START'] = r'\x31\x03\x01'
self.commands['TAG_INVENTORY_GET_NEXT'] = r'\x31\x03\x02'
self.commands['RSSI_INVENTORY_START'] = r'\x43\x04\x01\xcd'
self.commands['RSSI_INVENTORY_GET_NEXT'] = r'\x43\x03\x02'
self.commands['SENSITIVITY'] = r'\x41\x40\x11'
        self.commands['RSSI_GEN2_COMMAND1'] = r'\x59\x40\x00\x00\x00\x01\x00\x00\x00\x01\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
self.commands['RSSI_GEN2_COMMAND2'] = r'\x59\x40\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xcc\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
self.commands['RSSI_GEN2_COMMAND3'] = r'\x41\x40\x11\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
self.commands['POWERLEVEL'] = powerLevels
#Response Codes
self.responseCodes = {}
self.responseCodes['42'] = r'Change Frequency'
self.responseCodes['11'] = r'Firmware/Hardware'
self.responseCodes['19'] = r'Antenna Power'
def getHardwareVersion(self):
commandToIssue = self.commands.get('HARDWARE_VERSION')
return self.issueCommandAndWaitForResponse(commandToIssue)
def setFrequencySet(self,setKeyName):
frequencySet = self.frequencies.get(setKeyName)
#self.logger.debug(frequencySet);
for key in frequencySet:
self.logger.debug("Setting frequency %s" %key)
self.issueCommandAndWaitForResponse(frequencySet[key])
def setGen2Rssi(self):
self.issueCommandAndWaitForResponse(self.commands.get('RSSI_GEN2_COMMAND1'))
self.issueCommandAndWaitForResponse(self.commands.get('RSSI_GEN2_COMMAND2'))
self.issueCommandAndWaitForResponse(self.commands.get('RSSI_GEN2_COMMAND3'))
def tagStartInventory(self):
commandToIssue = self.commands.get('RSSI_INVENTORY_START')
return self.issueCommandAndWaitForResponse(commandToIssue)
def tagInventoryGetNext(self):
commandToIssue = self.commands.get('RSSI_INVENTORY_GET_NEXT')
return self.issueCommandAndWaitForResponse(commandToIssue)
def tagRssiStartInventory(self):
commandToIssue = self.commands.get('RSSI_INVENTORY_START')
rawResponse = self.issueCommandAndWaitForResponse(commandToIssue)
#rssiInventoryResponse = RssiInventoryResponse(rawResponse)
return RssiInventoryResponse(rawResponse)
def tagRssiInventoryGetNext(self):
commandToIssue = self.commands.get('RSSI_INVENTORY_GET_NEXT')
return self.issueCommandAndWaitForResponse(commandToIssue)
def openConnection(self):
self.connection = serial.Serial(self.port,self.portSpeed,timeout=0,stopbits=serial.STOPBITS_ONE)
self.logger.debug('connection is open %s' %self.connection.isOpen())
def issueCommandAndWaitForResponse(self,command):
if (self.connection is None) or (not self.connection.isOpen()):
self.openConnection()
callResult = ''
self.logger.debug('issuing command %s' %command)
self.connection.write(command.decode('string_escape'))
callResult = self.waitAndReadResponse()
self.logger.debug('callResult = %s' % callResult)
return callResult
def waitAndReadResponse(self):
time.sleep(self.defaultReadWait)
callResult = self.connection.readline().encode("hex")
return callResult | 123.219178 | 305 | 0.71184 | 3,652 | 17,990 | 3.493154 | 0.061062 | 1.11233 | 1.638159 | 2.143764 | 0.817904 | 0.767343 | 0.761386 | 0.747982 | 0.739986 | 0.739986 | 0 | 0.08585 | 0.072151 | 17,990 | 146 | 306 | 123.219178 | 0.678409 | 0.006226 | 0 | 0.081967 | 0 | 0.393443 | 0.739524 | 0.693762 | 0 | 1 | 0 | 0 | 0 | 1 | 0.090164 | false | 0 | 0.032787 | 0 | 0.196721 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 14 |
4a0d445c74930d93279befdd1847843e22bf1dcc | 40 | py | Python | test_fixtures/general_repo_origin/file_with_main_function.py | SerejkaSJ/fiasko_bro | dfb8c30109f317c1e5b6d211e002fd148695809e | [
"MIT"
] | 25 | 2018-01-24T10:45:35.000Z | 2020-12-05T21:47:20.000Z | test_fixtures/general_repo_origin/file_with_main_function.py | SerejkaSJ/fiasko_bro | dfb8c30109f317c1e5b6d211e002fd148695809e | [
"MIT"
] | 110 | 2018-01-21T12:25:13.000Z | 2021-06-10T19:27:22.000Z | test_fixtures/general_repo_origin/file_with_main_function.py | SerejkaSJ/fiasko_bro | dfb8c30109f317c1e5b6d211e002fd148695809e | [
"MIT"
] | 13 | 2017-12-12T22:19:01.000Z | 2019-01-29T18:08:05.000Z | import sys
def main():
sys.exit()
| 6.666667 | 14 | 0.575 | 6 | 40 | 3.833333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.275 | 40 | 5 | 15 | 8 | 0.793103 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c581a9508cc05a7d9a63d8a6df6eff2a9dac8166 | 5,427 | py | Python | source/training/dataset.py | ishidalab-titech/3DCNN_MQA | 8f68a3719065338f03eca44da9a6eb0262da0ce9 | [
"MIT"
] | 2 | 2020-02-09T03:17:22.000Z | 2020-03-31T08:42:30.000Z | source/training/dataset.py | ishidalab-titech/3DCNN_MQA | 8f68a3719065338f03eca44da9a6eb0262da0ce9 | [
"MIT"
] | null | null | null | source/training/dataset.py | ishidalab-titech/3DCNN_MQA | 8f68a3719065338f03eca44da9a6eb0262da0ce9 | [
"MIT"
] | 2 | 2019-07-26T07:47:35.000Z | 2020-06-25T21:03:27.000Z | from chainer.dataset import DatasetMixin
import six
import numpy as np
import os
from scipy.sparse import load_npz
from numba import jit
@jit
def sp_noise(data, occur_rate=0.9, sp_rate=0.5):
    # Salt-and-pepper noise: keep each element with probability occur_rate,
    # otherwise replace it by 1 (salt) or 0 (pepper) with ratio sp_rate.
    noise = np.random.uniform(0, 1, data.shape)
for i, p in enumerate(noise):
noise[i] = data[i] if p < occur_rate else 1 if p < occur_rate + (1 - occur_rate) * sp_rate else 0
return noise
# noise = [data[i] if p < occur_rate else 1 if p < occur_rate + (1 - occur_rate) * sp_rate else 0 for i, p in
# enumerate(noise)]
# noise = np.array(noise)
class RegressionDataset(DatasetMixin):
def __init__(self, path, index, config):
super().__init__()
self.path = path
self.index = index
self.config = config
def __getitem__(self, index):
if isinstance(index, slice):
current, stop, step = index.indices(len(self))
return [self.get_example(i) for i in
six.moves.range(current, stop, step)]
elif isinstance(index, list) or isinstance(index, np.ndarray):
return [self.get_example(i) for i in index]
else:
return self.get_example(index)
def __len__(self):
return len(self.path)
@jit
def get_data(self, path, index):
if len(str(path).split('/')[0]) == 2:
voxel_path = os.path.join(self.config['scop_path'], path)
voxel = load_npz(voxel_path)[index]
voxel = np.reshape(voxel.toarray(), [18, 30, 30, 30])[:self.config['channel']].astype(np.float32)
data_width = voxel.shape[1]
b, e = (data_width - self.config['box_width']) // 2, (data_width + self.config['box_width']) // 2
voxel = voxel[:, b:e, b:e, b:e]
local_label = []
for label_name in self.config['label']:
local_label.append(1)
else:
label_path = os.path.join(self.config['label_path'], path)
voxel_path = os.path.join(self.config['voxel_path'], path)
voxel = load_npz(voxel_path)[index]
voxel = np.reshape(voxel.toarray(), [18, 30, 30, 30])[:self.config['channel']].astype(np.float32)
data_width = voxel.shape[1]
b, e = (data_width - self.config['box_width']) // 2, (data_width + self.config['box_width']) // 2
voxel = voxel[:, b:e, b:e, b:e]
#label = np.load(label_path)[self.config['protein']].tolist()
label = np.load(label_path)
local_label = []
for label_name in self.config['label']:
local_label.append(label[label_name][index])
return voxel, local_label
def get_example(self, i):
path = self.path[i]
index = self.index[i]
voxel, label = self.get_data(path=path, index=index)
label = np.array(label, dtype=np.float32)
return voxel, label
class MultiDataset(DatasetMixin):
def __init__(self, path, index, config):
super().__init__()
self.path = path
self.index = index
self.config = config
def __getitem__(self, index):
if isinstance(index, slice):
current, stop, step = index.indices(len(self))
return [self.get_example(i) for i in
six.moves.range(current, stop, step)]
elif isinstance(index, list) or isinstance(index, np.ndarray):
return [self.get_example(i) for i in index]
else:
return self.get_example(index)
def __len__(self):
return len(self.path)
@jit
def get_data(self, path, index):
if len(str(path).split('/')[0]) == 2:
voxel_path = os.path.join(self.config['scop_path'], path)
voxel = load_npz(voxel_path)[index]
voxel = np.reshape(voxel.toarray(), [14, 30, 30, 30])[:self.config['channel']].astype(np.float32)
data_width = voxel.shape[1]
b, e = (data_width - self.config['box_width']) // 2, (data_width + self.config['box_width']) // 2
voxel = voxel[:, b:e, b:e, b:e]
local_label = []
for label_name in self.config['label']:
local_label.append(1)
else:
label_path = os.path.join(self.config['label_path'], path)
voxel_path = os.path.join(self.config['voxel_path'], path)
voxel = load_npz(voxel_path)[index]
voxel = np.reshape(voxel.toarray(), [14, 30, 30, 30])[:self.config['channel']].astype(np.float32)
data_width = voxel.shape[1]
b, e = (data_width - self.config['box_width']) // 2, (data_width + self.config['box_width']) // 2
voxel = voxel[:, b:e, b:e, b:e]
#label = np.load(label_path)[self.config['protein']].tolist()
label = np.load(label_path)
local_label = []
for label_name in self.config['label']:
local_label.append(label[label_name][index])
return voxel, local_label
def get_example(self, i):
path = self.path[i]
index = self.index[i]
voxel, label = self.get_data(path=path, index=index)
threshold = self.config['local_threshold']
label = [1 if i > threshold else 0 for i in np.array(label, dtype=np.float32)]
label = np.array(label)
return voxel, label
| 39.904412 | 113 | 0.580431 | 738 | 5,427 | 4.108401 | 0.120596 | 0.092348 | 0.034301 | 0.050132 | 0.887863 | 0.887863 | 0.870712 | 0.853562 | 0.853562 | 0.853562 | 0 | 0.019502 | 0.281924 | 5,427 | 135 | 114 | 40.2 | 0.758532 | 0.051225 | 0 | 0.866071 | 0 | 0 | 0.04084 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098214 | false | 0 | 0.0625 | 0.017857 | 0.294643 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c5c123921918a3b8da8fdd8cffb49a799ad4884c | 108 | py | Python | torchcv/visualizations/__init__.py | chenyuntc/dsod.pytorch | f954085154d4fcad96f623f8572c21d8607ea698 | [
"MIT"
] | 74 | 2018-03-31T04:06:55.000Z | 2021-09-10T08:23:07.000Z | torchcv/visualizations/__init__.py | chenyuntc/dsod.pytorch | f954085154d4fcad96f623f8572c21d8607ea698 | [
"MIT"
] | 5 | 2018-04-01T06:13:17.000Z | 2019-04-15T12:34:11.000Z | torchcv/visualizations/__init__.py | chenyuntc/dsod.pytorch | f954085154d4fcad96f623f8572c21d8607ea698 | [
"MIT"
] | 15 | 2018-04-01T05:12:15.000Z | 2022-03-07T07:59:18.000Z | from torchcv.visualizations.vis_image import vis_image
from torchcv.visualizations.visdom import Visualizer

# tests/utils/test_git.py (albertoa/hydra, Apache-2.0)
import pytest
import pytest_mock
from hydra.utils.git import *
VALID_GITHUB_TOKEN = "Georgian"
VALID_REPO_URL = "https://georgian.io/"
VALID_COMMIT_SHA = "m1rr0r1ng"
def test_check_repo_success(mocker):
def pass_test(self):
return False
def fail_test(self):
return True
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_empty",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_untracked",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_modified",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_uncommitted",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_unsynced",
pass_test
)
mocker.patch(
"hydra.utils.git.get_repo_url",
return_value=VALID_REPO_URL
)
mocker.patch(
"hydra.utils.git.get_commit_sha",
return_value=VALID_COMMIT_SHA
)
repo_url, commit_sha = check_repo(VALID_GITHUB_TOKEN)
assert repo_url == VALID_REPO_URL and commit_sha == VALID_COMMIT_SHA
def test_check_repo_empty_token():
with pytest.raises(ValueError, match="GITHUB_TOKEN") as err:
check_repo(None)
def test_check_repo_empty(mocker):
def pass_test(self):
return False
def fail_test(self):
return True
with pytest.raises(ValueError, match="Hydra is not being called in the root of a git repo.") as err:
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_empty",
fail_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_untracked",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_modified",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_uncommitted",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_unsynced",
pass_test
)
check_repo(VALID_GITHUB_TOKEN)
def test_check_repo_modified(mocker):
def pass_test(self):
return False
def fail_test(self):
return True
with pytest.raises(RuntimeError, match="Some modified files are not staged for commit.") as err:
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_empty",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_untracked",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_modified",
fail_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_uncommitted",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_unsynced",
pass_test
)
check_repo(VALID_GITHUB_TOKEN)
def test_check_repo_uncommitted(mocker):
def pass_test(self):
return False
def fail_test(self):
return True
with pytest.raises(RuntimeError, match="Some staged files are not commited.") as err:
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_empty",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_untracked",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_modified",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_uncommitted",
fail_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_unsynced",
pass_test
)
check_repo(VALID_GITHUB_TOKEN)
def test_check_repo_unsynced(mocker):
def pass_test(self):
return False
def fail_test(self):
return True
with pytest.raises(RuntimeError, match="Some commits are not pushed to the remote repo.") as err:
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_empty",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_untracked",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_modified",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_uncommitted",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_unsynced",
fail_test
)
check_repo(VALID_GITHUB_TOKEN)
def test_check_repo_untracked(mocker):
def pass_test(self):
return False
def fail_test(self):
return True
with pytest.warns(UserWarning, match="Some files are not tracked by git.") as record:
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_empty",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_untracked",
fail_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_modified",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_uncommitted",
pass_test
)
mocker.patch(
"hydra.utils.git_repo.GitRepo.is_unsynced",
pass_test
)
check_repo(VALID_GITHUB_TOKEN)
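The five failure tests above repeat the same block of mocker.patch calls and differ only in which GitRepo predicate is forced to return True. The pattern can be exercised in isolation with the stdlib mock module; `GitRepo` and `check_repo_demo` below are hypothetical stand-ins for illustration, not the hydra API:

```python
from unittest import mock

class GitRepo:
    """Hypothetical stand-in for hydra.utils.git_repo.GitRepo."""
    def is_empty(self):
        return False
    def is_modified(self):
        return False

def check_repo_demo():
    # mirrors the predicate checks the tests above exercise
    repo = GitRepo()
    if repo.is_empty():
        raise ValueError("Hydra is not being called in the root of a git repo.")
    if repo.is_modified():
        raise RuntimeError("Some modified files are not staged for commit.")
    return "ok"

results = []
for name, exc in [("is_empty", ValueError), ("is_modified", RuntimeError)]:
    # force one predicate at a time, like the fail_test patches above
    with mock.patch.object(GitRepo, name, lambda self: True):
        try:
            check_repo_demo()
            results.append(None)
        except exc:
            results.append(exc.__name__)
print(results)
```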

# CMakeVersion.py (Nosbielc/DoNotDoze, MIT)
import cmake
print("CMake version")
print(cmake.__version__)

# biliob_tracer/__init__.py (ProgramRipper/biliob-spider, MIT)
from datetime import datetime
import time

# src/stokesletsInPipe.py (pcmagic/stokes_flow, MIT)
import numpy as np
import os
from scipy.io import loadmat
from scipy.special import kv, iv
from numpy import pi, real, imag, exp, sqrt, sum, sin, cos
# see Liron, N., and R. Shahar. "Stokes flow due to a Stokeslet in a pipe."
# Journal of Fluid Mechanics 86.04 (1978): 727-744.
# Class containing the detailed series expressions of that solution.
# noinspection PyTypeChecker
class detail:
def __init__(self, threshold, b):
self._threshold = threshold
self._b = b
self._k = np.zeros([0])
self._n = np.zeros([0])
self._xn = np.zeros([0])
self._yn = np.zeros([0])
self._DmyD_xn = np.zeros([0])
self._DmyD_yn = np.zeros([0])
self._xn_k0 = np.zeros([0])
self._yn_k0 = np.zeros([0])
self._DmyD_xn_k0 = np.zeros([0])
self._DmyD_yn_k0 = np.zeros([0])
self._psi_xn1 = np.zeros([0])
self._psi_xn2 = np.zeros([0])
self._psi_xn3 = np.zeros([0])
self._pi_xn1 = np.zeros([0])
self._pi_xn2 = np.zeros([0])
self._pi_xn3 = np.zeros([0])
self._omega_xn1 = np.zeros([0])
self._omega_xn2 = np.zeros([0])
self._omega_xn3 = np.zeros([0])
self._psi_yn1 = np.zeros([0])
self._psi_yn2 = np.zeros([0])
self._psi_yn3 = np.zeros([0])
self._pi_yn1 = np.zeros([0])
self._pi_yn2 = np.zeros([0])
self._pi_yn3 = np.zeros([0])
self._omega_yn1 = np.zeros([0])
self._omega_yn2 = np.zeros([0])
self._omega_yn3 = np.zeros([0])
self._psi_xn1_k0 = np.zeros([0])
self._psi_xn3_k0 = np.zeros([0])
self._pi_xn1_k0 = np.zeros([0])
self._pi_xn3_k0 = np.zeros([0])
self._omega_xn1_k0 = np.zeros([0])
self._omega_xn3_k0 = np.zeros([0])
self._psi_yn2_k0 = np.zeros([0])
self._pi_yn2_k0 = np.zeros([0])
self._omega_yn2_k0 = np.zeros([0])
self._finish_xyk = False # run _set_xyk first
self._finish_xn = False # run _solve_prepare_xn first
self._finish_yn = False # run _solve_prepare_yn first
self._finish1 = False # run _solve_prepare1 first
self._finish2 = False # run _solve_prepare2 first
self._finish3 = False # run _solve_prepare3 first
def _set_xyk(self):
threshold = self._threshold
kmax = int(threshold - 2)
nmax = int(threshold / 2)
n_use, k_use = np.meshgrid(np.arange(1, nmax + 1), np.arange(-kmax, kmax + 1))
INDEX = (np.abs(k_use) + 2 * n_use) <= threshold
INDEX[kmax, :] = 0
k_use = k_use[INDEX]
n_use = n_use[INDEX]
t_path = os.path.dirname(os.path.abspath(__file__))
full_path = os.path.normpath(t_path + '/' + 'xn.mat')
mat_contents = loadmat(full_path)
xn = mat_contents['xn']
full_path = os.path.normpath(t_path + '/' + 'yn.mat')
mat_contents = loadmat(full_path)
yn = mat_contents['yn']
xn_use = np.vstack((xn[kmax:0:-1, 0: nmax], xn[0: kmax + 1, 0: nmax]))
yn_use = np.vstack((yn[kmax:0:-1, 0: nmax], yn[0: kmax + 1, 0: nmax]))
xn_use = xn_use[INDEX]
yn_use = yn_use[INDEX]
xn_k0 = xn[0, 0:nmax]
yn_k0 = yn[0, 0:nmax]
self._k = k_use
self._n = n_use
self._xn = xn_use
self._yn = yn_use
self._xn_k0 = xn_k0
self._yn_k0 = yn_k0
self._finish_xyk = True
return True
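The truncation in _set_xyk keeps only the (k, n) mode pairs with |k| + 2n <= threshold and excludes the k == 0 row, which is handled separately via xn_k0 / yn_k0. The masking step can be checked standalone on a small threshold (the value 6 here is arbitrary):

```python
import numpy as np

threshold = 6
kmax, nmax = int(threshold - 2), int(threshold / 2)
# n varies along columns, k along rows, exactly as in _set_xyk
n_use, k_use = np.meshgrid(np.arange(1, nmax + 1), np.arange(-kmax, kmax + 1))
mask = (np.abs(k_use) + 2 * n_use) <= threshold
mask[kmax, :] = False  # drop the k == 0 row
pairs = sorted(zip(k_use[mask].tolist(), n_use[mask].tolist()))
print(len(pairs), pairs[:3])
```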
def get_b(self):
return self._b
def _solve_prepare_xn(self):
err_msg = 'run _set_xyk first. '
assert self._finish_xyk, err_msg
DmyD = lambda k, s: 2 * s ** (-2) * iv(k, s) * (
(-1) * s * ((-4) + k ** 2 + s ** 2) * iv((-1) + k, s) ** 2 + 2 * ((-2) + k) * (
k * (2 + k) + s ** 2) * iv(
(-1) + k, s) * iv(k, s) + s * (k * (4 + k) + s ** 2) * iv(k, s) ** 2)
DmyDk0 = lambda s: 2 * iv(0, s) * (
s * iv(0, s) ** 2 + (-4) * iv(0, s) * iv(1, s) + (-1) * s ** (-1) * (
(-4) + s ** 2) * iv(1, s) ** 2)
self._DmyD_xn = DmyD(self._k, self._xn)
self._DmyD_xn_k0 = DmyDk0(self._xn_k0)
self._finish_xn = True
return True
def _solve_prepare_yn(self):
err_msg = 'run _set_xyk first. '
assert self._finish_xyk, err_msg
DmyD = lambda k, s: 2 * s ** (-2) * iv(k, s) * (
(-1) * s * ((-4) + k ** 2 + s ** 2) * iv((-1) + k, s) ** 2 + 2 * ((-2) + k) * (
k * (2 + k) + s ** 2) * iv(
(-1) + k, s) * iv(k, s) + s * (k * (4 + k) + s ** 2) * iv(k, s) ** 2)
DmyDk0 = lambda s: 2 * iv(0, s) * (
s * iv(0, s) ** 2 + (-4) * iv(0, s) * iv(1, s) + (-1) * s ** (-1) * (
(-4) + s ** 2) * iv(1, s) ** 2)
self._DmyD_yn = DmyD(self._k, self._yn)
self._DmyD_yn_k0 = DmyDk0(self._yn_k0)
self._finish_yn = True
return True
def _solve_prepare1(self):
err_msg = 'run _solve_prepare_xn first. '
assert self._finish_xn, err_msg
psi1 = lambda k, s, b: (1 / 16) * pi ** (-2) * (
s ** 2 * ((iv((-2) + k, s) + iv(k, s)) * iv(1 + k, s) + iv((-1) + k, s) * (
iv(k, s) + iv(2 + k, s))) * (
iv((-1) + k, b * s) * kv((-1) + k, s) + (-2) * b * iv(k, b * s) * kv(k,
s) + iv(
1 + k, b * s) * kv(
1 + k,
s)) + (
-1) * (s * iv((-1) + k, s) + (-1) * ((-1) + k) * iv(k, s)) * (
iv(1 + k, s) * (
b * s * (iv((-2) + k, b * s) + 3 * iv(k, b * s)) * kv((-1) + k, s) + iv(
(-1) + k, b * s) * (
(-2) * s * kv((-2) + k, s) + (-2) * (1 + k) * kv((-1) + k, s)) + (
-2) * s * iv(1 + k,
b * s) * kv(k,
s)) + 2 * iv(
(-1) + k, s) * (
(-1) * s * (iv((-1) + k, b * s) + iv(1 + k, b * s)) * kv(k,
s) + 2 * (
b * s * iv(k, b * s) + (-1) * (2 + k) * iv(1 + k,
b * s)) * kv(
1 + k,
s))))
pi1 = lambda k, s, b: (1 / 16) * pi ** (-2) * (iv(k, s) * iv(1 + k, s) * (
b * s * (iv((-2) + k, b * s) + 3 * iv(k, b * s)) * kv((-1) + k, s) + iv((-1) + k,
b * s) * (
(-2) * s * kv((-2) + k, s) + (-2) * (1 + k) * kv((-1) + k, s)) + (
-2) * s * iv(1 + k,
b * s) * kv(
k,
s)) + (
-2) * iv((-1) + k, s) * (
s * iv((-1) + k, b * s) * (
2 * iv(1 + k, s) * kv((-1) + k,
s) + iv(k,
s) * kv(
k,
s)) + (
-2) * b * s * iv(k, b * s) * (
2 * iv(1 + k, s) * kv(k,
s) + iv(
k, s) * kv(1 + k,
s)) + iv(
1 + k, b * s) * (
2 * s * iv(1 + k, s) * kv(
1 + k, s) + iv(k, s) * (
s * kv(k, s) + 2 * (
2 + k) * kv(
1 + k, s)))))
omega1 = lambda k, s, b: (1 / 16) * pi ** (-2) * s ** (-1) * (
s ** 2 * iv((-1) + k, s) ** 2 * (
(-1) * b * s * iv((-2) + k, b * s) * kv((-1) + k, s) + (-3) * b * s * iv(k,
b * s) * kv(
(-1) + k, s) + (
-8) * b * k * iv(k, b * s) * kv(k, s) + 2 * iv((-1) + k, b * s) * (
s * kv((-2) + k, s) + (1 + 3 * k) * kv((-1) + k, s) + (-1) * s * kv(k,
s)) + 4 * b * s * iv(
k,
b * s) * kv(
1 + k, s) + (-8) * iv(1 + k, b * s) * kv(1 + k, s)) + (-2) * s * iv(
(-1) + k,
s) * iv(
k, s) * (
(-1) * b * ((-1) + k) * s * iv((-2) + k,
b * s) * kv(
(-1) + k, s) + 3 * b * s * iv(k,
b * s) * kv(
(-1) + k, s) + (
-3) * b * k * s * iv(k, b * s) * kv(
(-1) + k, s) + (-8) * b * k ** 2 * iv(
k,
b * s) * kv(
k, s) + 2 * iv((-1) + k,
b * s) * (
((-1) + k) * s * kv((-2) + k, s) + (
(-1) + 3 * k ** 2) * kv((-1) + k,
s) + (
-1) * ((-1) + k) * s * kv(k, s)) + (
-4) * b * s * iv(
k, b * s) * kv(1 + k,
s) + 4 * b * k * s * iv(
k, b * s) * kv(1 + k, s) + 8 * iv(
1 + k,
b * s) * kv(
1 + k, s) + (
-4) * k * iv(
1 + k, b * s) * kv(1 + k, s)) + iv(k,
s) ** 2 * (
(-2) * iv((-1) + k, b * s) * (
(4 * k * s + s ** 3) * kv((-2) + k, s) + (
4 * k + 4 * k ** 2 + s ** 2 + 3 * k * s ** 2) * kv(
(-1) + k, s) + (-1) * s ** 3 * kv(
k,
s)) + s * (
b * (4 * k + s ** 2) * iv((-2) + k,
b * s) * kv(
(-1) + k,
s) + 8 * iv(
1 + k, b * s) * (
(-1) * k * kv(k, s) + s * kv(1 + k,
s)) + b * iv(
k,
b * s) * (
3 * (4 * k + s ** 2) * kv((-1) + k,
s) + (
-4) * s * (
(-2) * k * kv(k, s) + s * kv(
1 + k,
s))))))
psi1_k0 = lambda s, b: (1 / 16) * pi ** (-2) * iv(1, s) * (
(-4) * s ** 2 * (iv(0, s) + iv(2, s)) * (
b * iv(0, b * s) * kv(0, s) + (-1) * iv(1, b * s) * kv(1, s)) + (
-8) * s * (iv(0, s) + s * iv(1, s)) * (
b * iv(0, b * s) * kv(1, s) + (-1) * iv(1, b * s) * kv(2, s)))
pi1_k0 = lambda s, b: (1 / 2) * pi ** (-2) * iv(1, s) * (
b * iv(0, b * s) + (-1) * s * iv(1, b * s) * (
iv(1, s) * kv(1, s) + iv(0, s) * kv(2, s)))
self._psi_xn1 = psi1(self._k, self._xn, self._b)
self._psi_yn1 = psi1(self._k, self._yn, self._b)
self._pi_xn1 = pi1(self._k, self._xn, self._b)
self._pi_yn1 = pi1(self._k, self._yn, self._b)
self._omega_xn1 = omega1(self._k, self._xn, self._b)
self._omega_yn1 = omega1(self._k, self._yn, self._b)
self._psi_xn1_k0 = psi1_k0(self._xn_k0, self._b)
self._omega_xn1_k0 = 0
self._pi_xn1_k0 = pi1_k0(self._xn_k0, self._b)
self._finish1 = True
return True
def _solve_prepare2(self):
err_msg = 'run _solve_prepare_yn first. '
assert self._finish_yn, err_msg
psi2 = lambda k, s, b: (1 / 16) * pi ** (-2) * (
s ** 2 * ((iv((-2) + k, s) + iv(k, s)) * iv(1 + k, s) + iv((-1) + k, s) * (
iv(k, s) + iv(2 + k, s))) * (
iv((-1) + k, b * s) * kv((-1) + k, s) + (-1) * iv(1 + k, b * s) * kv(1 + k,
s)) + (
-4) * b ** (-1) * (s * iv((-1) + k, s) + (-1) * ((-1) + k) * iv(k, s)) * (
b * ((-2) + k) * iv((-1) + k, b * s) * iv(1 + k, s) * kv((-1) + k, s) + (
-1) * k * iv(k, b * s) * iv(
1 + k, s) * kv(k, s) + iv((-1) + k, s) * (
(-1) * k * iv(k, b * s) * kv(k, s) + b * (2 + k) * iv(1 + k,
b * s) * kv(
1 + k, s))))
pi2 = lambda k, s, b: (1 / 4) * b ** (-1) * pi ** (-2) * (
iv(k, s) * iv(1 + k, s) * (
b * ((-2) + k) * iv((-1) + k, b * s) * kv((-1) + k, s) + (-1) * k * iv(k,
b * s) * kv(
k, s)) + iv(
(-1) + k,
s) * (
(-1) * b * s * iv((-1) + k, b * s) * iv(1 + k, s) * kv((-1) + k,
s) + b * s * iv(
1 + k, s) * iv(1 + k,
b * s) * kv(
1 + k, s) + iv(k, s) * (
(-1) * k * iv(k, b * s) * kv(k, s) + b * (2 + k) * iv(1 + k,
b * s) * kv(
1 + k, s))))
omega2 = lambda k, s, b: (1 / 2) * b ** (-1) * pi ** (-2) * s ** (-1) * (
(-1) * b * s ** 2 * iv((-1) + k, s) ** 2 * (
iv((-1) + k, b * s) * kv((-1) + k, s) + iv(1 + k, b * s) * kv(1 + k,
s)) + b * s * iv(
(-1) + k, s) * iv(
k,
s) * (
((-2) + 3 * k) * iv((-1) + k, b * s) * kv((-1) + k, s) + ((-2) + k) * iv(
1 + k, b * s) * kv(1 + k,
s)) + iv(k,
s) ** 2 * (
b * (4 * k + (-2) * k ** 2 + s ** 2) * iv((-1) + k, b * s) * kv((-1) + k,
s) + 2 * k ** 2 * iv(
k,
b * s) * kv(
k, s) + b * s ** 2 * iv(1 + k, b * s) * kv(1 + k, s)))
omega2_k0 = lambda s, b: pi ** (-2) * (
s * iv(0, s) ** 2 + (-2) * iv(0, s) * iv(1, s) + (-1) * s * iv(1, s) ** 2) * iv(1,
b * s) * kv(
1, s)
self._psi_xn2 = psi2(self._k, self._xn, self._b)
self._psi_yn2 = psi2(self._k, self._yn, self._b)
self._pi_xn2 = pi2(self._k, self._xn, self._b)
self._pi_yn2 = pi2(self._k, self._yn, self._b)
self._omega_xn2 = omega2(self._k, self._xn, self._b)
self._omega_yn2 = omega2(self._k, self._yn, self._b)
self._psi_yn2_k0 = 0
self._omega_yn2_k0 = omega2_k0(self._yn_k0, self._b)
self._pi_yn2_k0 = 0
self._finish2 = True
return True
def _solve_prepare3(self):
err_msg = 'run _solve_prepare_xn first. '
assert self._finish_xn, err_msg
psi3 = lambda k, s, b: (1 / 8) * pi ** (-2) * s * (
((iv((-2) + k, s) + iv(k, s)) * iv(1 + k, s) + iv((-1) + k, s) * (
iv(k, s) + iv(2 + k, s))) * (
(-1) * b * s * iv((-1) + k, b * s) * kv(k, s) + iv(k, b * s) * (
s * kv((-1) + k, s) + 2 * ((-1) + k) * kv(k, s))) + (-2) * (
s * iv((-1) + k, s) + (-1) * ((-1) + k) * iv(k, s)) * (
b * iv((-1) + k, b * s) * iv(1 + k, s) * kv((-1) + k, s) + (-1) * iv(k,
b * s) * iv(
1 + k, s) * kv(k,
s) + iv(
(-1) + k, s) * (
(-1) * iv(k, b * s) * kv(k, s) + b * iv(1 + k, b * s) * kv(1 + k,
s))))
pi3 = lambda k, s, b: (1 / 4) * pi ** (-2) * (
(-1) * s * iv(k, s) * iv(k, b * s) * iv(1 + k, s) * kv(k, s) + b * s * iv((-1) + k,
b * s) * iv(
1 + k,
s) * (
iv(k, s) * kv((-1) + k, s) + 2 * iv((-1) + k, s) * kv(k, s)) + iv((-1) + k,
s) * (
(-1) * iv(k, b * s) * (s * iv(k, s) * kv(k, s) + 2 * iv(1 + k, s) * (
s * kv((-1) + k, s) + 2 * ((-1) + k) * kv(k, s))) + b * s * iv(k, s) * iv(
1 + k, b * s) * kv(1 + k,
s)))
omega3 = lambda k, s, b: (1 / 4) * pi ** (-2) * s ** (-1) * (s * iv(k, s) ** 2 * (
(-2) * k * iv(k, b * s) * (s * kv((-1) + k, s) + 2 * k * kv(k, s)) + b * iv(
(-1) + k, b * s) * (
(4 * k + s ** 2) * kv((-1) + k, s) + 2 * k * s * kv(k, s)) + (
-1) * b * s ** 2 * iv(1 + k,
b * s) * kv(
1 + k, s)) + s * iv((-1) + k, s) ** 2 * (2 * k * iv(k, b * s) * (
s * kv((-1) + k, s) + 2 * ((-1) + k) * kv(k, s)) + (-1) * b * s * iv((-1) + k,
b * s) * (
s * kv((-1) + k, s) + 2 * k * kv(k,
s)) + b * s ** 2 * iv(
1 + k, b * s) * kv(1 + k, s)) + 2 * iv((-1) + k, s) * iv(k, s) * (
(-2) * k ** 2 * iv(k,
b * s) * (
s * kv(
(-1) + k,
s) + 2 * ((
-1) + k) * kv(
k,
s)) + b * s * iv(
(-1) + k, b * s) * (
((
-1) + k) * s * kv(
(-1) + k,
s) + 2 * k ** 2 * kv(
k,
s)) + (
-1) * b * ((
-1) + k) * s ** 2 * iv(
1 + k,
b * s) * kv(
1 + k,
s)))
psi3_k0 = lambda s, b: (1 / 4) * pi ** (-2) * s * iv(1, s) * (b * iv(1, b * s) * (
(-1) * s * (iv(0, s) + iv(2, s)) * kv(0, s) + (-2) * (iv(0, s) + s * iv(1, s)) * kv(
1, s)) + iv(0,
b * s) * (
2 * (s * iv(1, s) + (
-1) * iv(2, s)) * kv(0,
s) + s * (
iv(0, s) + iv(
2, s)) * kv(1,
s)))
pi3_k0 = lambda s, b: (1 / 2) * pi ** (-2) * iv(1, s) * (
b * iv(1, b * s) + (-1) * s * iv(0, b * s) * (
iv(2, s) * kv(0, s) + iv(1, s) * kv(1, s)))
self._psi_xn3 = psi3(self._k, self._xn, self._b)
self._psi_yn3 = psi3(self._k, self._yn, self._b)
self._pi_xn3 = pi3(self._k, self._xn, self._b)
self._pi_yn3 = pi3(self._k, self._yn, self._b)
self._omega_xn3 = omega3(self._k, self._xn, self._b)
self._omega_yn3 = omega3(self._k, self._yn, self._b)
self._psi_xn3_k0 = psi3_k0(self._xn_k0, self._b)
self._omega_xn3_k0 = 0
self._pi_xn3_k0 = pi3_k0(self._xn_k0, self._b)
self._finish3 = True
return True
def solve_u1(self, R, Phi, z):
err_msg = 'run _solve_prepare1 first. '
assert self._finish1, err_msg
AFPhi1nL = lambda xn, k, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
    (-1) * z * imag(xn)) * pi * imag(
    DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
    (-1) * (omega1 + k * pi1) * iv((-1) + k, R * xn) + k * (
    omega1 + pi1 + k * pi1 + (-1) * psi1) * R ** (
    -1) * xn ** (-1) * iv(k, R * xn)))
AFPhi1nR = lambda yn, k, psi1, omega1, pi1, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * (
(-1) * (omega1 + k * pi1) * iv((-1) + k, R * yn) + k * (
omega1 + pi1 + k * pi1 + (-1) * psi1) * R ** (
-1) * yn ** (-1) * iv(k, R * yn)))
AFR1nL = lambda xn, k, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * R ** (-1) * xn ** (-1) * (
((-1) * pi1 + psi1) * R * xn * iv((-1) + k, R * xn) + (
k * (omega1 + pi1 + k * pi1 + (-1) * psi1) + pi1 * R ** 2 * xn ** 2) * iv(k,
R * xn)))
AFR1nR = lambda yn, k, psi1, omega1, pi1, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * R ** (-1) * yn ** (-1) * (
((-1) * pi1 + psi1) * R * yn * iv((-1) + k, R * yn) + (
k * (omega1 + pi1 + k * pi1 + (-1) * psi1) + pi1 * R ** 2 * yn ** 2) * iv(k,
R * yn)))
BFz1nL = lambda xn, k, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi1 * R * xn * iv((-1) + k, R * xn) + (pi1 + (-1) * k * pi1 + psi1) * iv(k,
R * xn)))
BFz1nR = lambda yn, k, psi1, omega1, pi1, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * (pi1 * R * yn * iv((-1) + k, R * yn) + (
pi1 + (-1) * k * pi1 + psi1) * iv(k, R * yn)))
uR1_k0 = lambda xn, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi1 * R * xn * iv(0, R * xn) + ((-1) * pi1 + psi1) * iv(1, R * xn)))
uz1_k0 = lambda xn, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(pi1 + psi1) * iv(0, R * xn) + pi1 * R * xn * iv(1, R * xn)))
R = np.array(R, dtype=float).flatten()
z = np.array(z, dtype=float).flatten()
Phi = np.array(Phi, dtype=float)
Phi_shape = Phi.shape
Phi_flags = Phi.flags
# flatten in the same order used for the reshape below, so the
# element-wise fill round-trips correctly for F-contiguous input
Phi = Phi.flatten(order='C' if Phi_flags['C_CONTIGUOUS'] else 'F')
err_msg = 'both R and z should be scalars. '
assert R.size == 1 and z.size == 1, err_msg
uR1 = Phi.copy()
uPhi1 = Phi.copy()
uz1 = Phi.copy()
uR1k0 = sum(uR1_k0(self._xn_k0, self._psi_xn1_k0, self._omega_xn1_k0, self._pi_xn1_k0, R, z,
self._DmyD_xn_k0))
uPhi1k0 = 0
uz1k0 = sum(uz1_k0(self._xn_k0, self._psi_xn1_k0, self._omega_xn1_k0, self._pi_xn1_k0, R, z,
self._DmyD_xn_k0))
t_AFR1nL = AFR1nL(self._xn, self._k, self._psi_xn1, self._omega_xn1, self._pi_xn1, R, z,
self._DmyD_xn)
t_AFR1nR = AFR1nR(self._yn, self._k, self._psi_yn1, self._omega_yn1, self._pi_yn1, R, z,
self._DmyD_yn)
t_AFPhi1nL = AFPhi1nL(self._xn, self._k, self._psi_xn1, self._omega_xn1, self._pi_xn1, R, z,
self._DmyD_xn)
t_AFPhi1nR = AFPhi1nR(self._yn, self._k, self._psi_yn1, self._omega_yn1, self._pi_yn1, R, z,
self._DmyD_yn)
t_BFz1nL = BFz1nL(self._xn, self._k, self._psi_xn1, self._omega_xn1, self._pi_xn1, R, z,
self._DmyD_xn)
t_BFz1nR = BFz1nR(self._yn, self._k, self._psi_yn1, self._omega_yn1, self._pi_yn1, R, z,
self._DmyD_yn)
for i0, phi in enumerate(Phi):
uR1[i0] = uR1k0 + sum((t_AFR1nL + t_AFR1nR) * cos(self._k * phi))
uPhi1[i0] = uPhi1k0 + sum((t_AFPhi1nL + t_AFPhi1nR) * sin(self._k * phi))
uz1[i0] = uz1k0 + sum((t_BFz1nL + t_BFz1nR) * cos(self._k * phi))
if Phi_flags['C_CONTIGUOUS']:
uR1 = uR1.reshape(Phi_shape, order='C')
uPhi1 = uPhi1.reshape(Phi_shape, order='C')
uz1 = uz1.reshape(Phi_shape, order='C')
elif Phi_flags['F_CONTIGUOUS']:
uR1 = uR1.reshape(Phi_shape, order='F')
uPhi1 = uPhi1.reshape(Phi_shape, order='F')
uz1 = uz1.reshape(Phi_shape, order='F')
else:
raise ValueError('C_CONTIGUOUS and F_CONTIGUOUS are both False. ')
return uR1, uPhi1, uz1
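solve_u1 restores Phi's shape by dispatching on its memory layout; flatten() and reshape() only restore element positions when they agree on order, which a standalone example makes visible:

```python
import numpy as np

# flatten() defaults to C order, so a reshape only round-trips the original
# element positions when it uses the same order the array was flattened in.
a = np.asfortranarray(np.arange(6).reshape(2, 3))
flat_c = a.flatten()                         # C-order copy of an F-ordered array
back_c = flat_c.reshape(a.shape, order='C')  # matches the flatten order
back_f = flat_c.reshape(a.shape, order='F')  # does not
print(np.array_equal(back_c, a), np.array_equal(back_f, a))
```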
def solve_u2(self, R, Phi, z):
err_msg = 'run _solve_prepare2 first. '
assert self._finish2, err_msg
AFPhi2nL = lambda xn, k, psi2, omega2, pi2, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
((-1) * omega2 + k * pi2) * iv((-1) + k, R * xn) + k * (
omega2 + (-1) * (1 + k) * pi2 + psi2) * R ** (
-1) * xn ** (-1) * iv(k, R * xn)))
AFPhi2nR = lambda yn, k, psi2, omega2, pi2, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * (
((-1) * omega2 + k * pi2) * iv((-1) + k, R * yn) + k * (
omega2 + (-1) * (1 + k) * pi2 + psi2) * R ** (
-1) * yn ** (-1) * iv(k, R * yn)))
AFR2nL = lambda xn, k, psi2, omega2, pi2, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * R ** (-1) * xn ** (-1) * (
((-1) * pi2 + psi2) * R * xn * iv((-1) + k, R * xn) + (
k * ((-1) * omega2 + pi2 + k * pi2 + (
-1) * psi2) + pi2 * R ** 2 * xn ** 2) * iv(k, R * xn)))
AFR2nR = lambda yn, k, psi2, omega2, pi2, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * R ** (-1) * yn ** (-1) * (
((-1) * pi2 + psi2) * R * yn * iv((-1) + k, R * yn) + (
k * ((-1) * omega2 + pi2 + k * pi2 + (
-1) * psi2) + pi2 * R ** 2 * yn ** 2) * iv(k, R * yn)))
BFz2nL = lambda xn, k, psi2, omega2, pi2, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi2 * R * xn * iv((-1) + k, R * xn) + (pi2 + (-1) * k * pi2 + psi2) * iv(k,
R * xn)))
BFz2nR = lambda yn, k, psi2, omega2, pi2, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * (pi2 * R * yn * iv((-1) + k, R * yn) + (
pi2 + (-1) * k * pi2 + psi2) * iv(k, R * yn)))
uPhi2_k0 = lambda yn, psi2, omega2, pi2, R, z, DmyD: np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * omega2 * iv(1, R * yn))
R = np.array(R, dtype=float).flatten()
z = np.array(z, dtype=float).flatten()
Phi = np.array(Phi, dtype=float)
Phi_shape = Phi.shape
Phi_flags = Phi.flags
# flatten in the same order used for the reshape below, so the
# element-wise fill round-trips correctly for F-contiguous input
Phi = Phi.flatten(order='C' if Phi_flags['C_CONTIGUOUS'] else 'F')
err_msg = 'both R and z should be scalars. '
assert R.size == 1 and z.size == 1, err_msg
uR2 = Phi.copy()
uPhi2 = Phi.copy()
uz2 = Phi.copy()
uR2k0 = 0
uPhi2k0 = sum(
uPhi2_k0(self._yn_k0, self._psi_yn2_k0, self._omega_yn2_k0, self._pi_yn2_k0, R, z,
self._DmyD_yn_k0))
uz2k0 = 0
t_AFR2nL = AFR2nL(self._xn, self._k, self._psi_xn2, self._omega_xn2, self._pi_xn2, R, z,
self._DmyD_xn)
t_AFR2nR = AFR2nR(self._yn, self._k, self._psi_yn2, self._omega_yn2, self._pi_yn2, R, z,
self._DmyD_yn)
t_AFPhi2nL = AFPhi2nL(self._xn, self._k, self._psi_xn2, self._omega_xn2, self._pi_xn2, R, z,
self._DmyD_xn)
t_AFPhi2nR = AFPhi2nR(self._yn, self._k, self._psi_yn2, self._omega_yn2, self._pi_yn2, R, z,
self._DmyD_yn)
t_BFz2nL = BFz2nL(self._xn, self._k, self._psi_xn2, self._omega_xn2, self._pi_xn2, R, z,
self._DmyD_xn)
t_BFz2nR = BFz2nR(self._yn, self._k, self._psi_yn2, self._omega_yn2, self._pi_yn2, R, z,
self._DmyD_yn)
for i0, phi in enumerate(Phi):
uR2[i0] = uR2k0 + sum((t_AFR2nL + t_AFR2nR) * sin(self._k * phi))
uPhi2[i0] = uPhi2k0 + sum((t_AFPhi2nL + t_AFPhi2nR) * cos(self._k * phi))
uz2[i0] = uz2k0 + sum((t_BFz2nL + t_BFz2nR) * sin(self._k * phi))
if Phi_flags['C_CONTIGUOUS']:
uR2 = uR2.reshape(Phi_shape, order='C')
uPhi2 = uPhi2.reshape(Phi_shape, order='C')
uz2 = uz2.reshape(Phi_shape, order='C')
elif Phi_flags['F_CONTIGUOUS']:
uR2 = uR2.reshape(Phi_shape, order='F')
uPhi2 = uPhi2.reshape(Phi_shape, order='F')
uz2 = uz2.reshape(Phi_shape, order='F')
else:
raise ValueError('C_CONTIGUOUS and F_CONTIGUOUS are both False. ')
return uR2, uPhi2, uz2
def solve_u3(self, R, Phi, z):
err_msg = 'run _solve_prepare3 first. '
assert self._finish3, err_msg
BFPhi3nL = lambda xn, k, psi3, omega3, pi3, R, z, DmyD: 2 * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(-1) * (omega3 + k * pi3) * iv((-1) + k, R * xn) + k * (
omega3 + pi3 + k * pi3 + (-1) * psi3) * R ** (
-1) * xn ** (-1) * iv(k, R * xn)))
BFPhi3nR = lambda yn, k, psi3, omega3, pi3, R, z, DmyD: np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * (
(-1) * (omega3 + k * pi3) * iv((-1) + k, R * yn) + k * (
omega3 + pi3 + k * pi3 + (-1) * psi3) * R ** (
-1) * yn ** (-1) * iv(k, R * yn)))
BFR3nL = lambda xn, k, psi3, omega3, pi3, R, z, DmyD: 2 * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * R ** (-1) * xn ** (-1) * (
((-1) * pi3 + psi3) * R * xn * iv((-1) + k, R * xn) + (
k * (omega3 + pi3 + k * pi3 + (-1) * psi3) + pi3 * R ** 2 * xn ** 2) * iv(k,
R * xn)))
BFR3nR = lambda yn, k, psi3, omega3, pi3, R, z, DmyD: np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * R ** (-1) * yn ** (-1) * (
((-1) * pi3 + psi3) * R * yn * iv((-1) + k, R * yn) + (
k * (omega3 + pi3 + k * pi3 + (-1) * psi3) + pi3 * R ** 2 * yn ** 2) * iv(k,
R * yn)))
AFz3nL = lambda xn, k, psi3, omega3, pi3, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi3 * R * xn * iv((-1) + k, R * xn) + (pi3 + (-1) * k * pi3 + psi3) * iv(k,
R * xn)))
AFz3nR = lambda yn, k, psi3, omega3, pi3, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * (pi3 * R * yn * iv((-1) + k, R * yn) + (
pi3 + (-1) * k * pi3 + psi3) * iv(k, R * yn)))
uR3_k0 = lambda xn, psi3, omega3, pi3, R, z, DmyD: 2 * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi3 * R * xn * iv(0, R * xn) + ((-1) * pi3 + psi3) * iv(1, R * xn)))
uz3_k0 = lambda xn, psi3, omega3, pi3, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(pi3 + psi3) * iv(0, R * xn) + pi3 * R * xn * iv(1, R * xn)))
R = np.array(R, dtype=float).flatten()
z = np.array(z, dtype=float).flatten()
Phi = np.array(Phi, dtype=float)
Phi_shape = Phi.shape
Phi_flags = Phi.flags
# flatten in the same order used for the reshape below, so the
# element-wise fill round-trips correctly for F-contiguous input
Phi = Phi.flatten(order='C' if Phi_flags['C_CONTIGUOUS'] else 'F')
err_msg = 'both R and z should be scalars. '
assert R.size == 1 and z.size == 1, err_msg
uR3 = Phi.copy()
uPhi3 = Phi.copy()
uz3 = Phi.copy()
uR3k0 = sum(uR3_k0(self._xn_k0, self._psi_xn3_k0, self._omega_xn3_k0, self._pi_xn3_k0, R, z,
self._DmyD_xn_k0))
uPhi3k0 = 0
uz3k0 = sum(uz3_k0(self._xn_k0, self._psi_xn3_k0, self._omega_xn3_k0, self._pi_xn3_k0, R, z,
self._DmyD_xn_k0))
t_BFR3nL = BFR3nL(self._xn, self._k, self._psi_xn3, self._omega_xn3, self._pi_xn3, R, z,
self._DmyD_xn)
t_BFR3nR = BFR3nR(self._yn, self._k, self._psi_yn3, self._omega_yn3, self._pi_yn3, R, z,
self._DmyD_yn)
t_BFPhi3nL = BFPhi3nL(self._xn, self._k, self._psi_xn3, self._omega_xn3, self._pi_xn3, R, z,
self._DmyD_xn)
t_BFPhi3nR = BFPhi3nR(self._yn, self._k, self._psi_yn3, self._omega_yn3, self._pi_yn3, R, z,
self._DmyD_yn)
t_AFz3nL = AFz3nL(self._xn, self._k, self._psi_xn3, self._omega_xn3, self._pi_xn3, R, z,
self._DmyD_xn)
t_AFz3nR = AFz3nR(self._yn, self._k, self._psi_yn3, self._omega_yn3, self._pi_yn3, R, z,
self._DmyD_yn)
for i0, phi in enumerate(Phi):
uR3[i0] = uR3k0 + sum((t_BFR3nL + t_BFR3nR) * cos(self._k * phi))
uPhi3[i0] = uPhi3k0 + sum((t_BFPhi3nL + t_BFPhi3nR) * sin(self._k * phi))
uz3[i0] = uz3k0 + sum((t_AFz3nL + t_AFz3nR) * cos(self._k * phi))
if Phi_flags['C_CONTIGUOUS']:
uR3 = uR3.reshape(Phi_shape, order='C')
uPhi3 = uPhi3.reshape(Phi_shape, order='C')
uz3 = uz3.reshape(Phi_shape, order='C')
elif Phi_flags['F_CONTIGUOUS']:
uR3 = uR3.reshape(Phi_shape, order='F')
uPhi3 = uPhi3.reshape(Phi_shape, order='F')
uz3 = uz3.reshape(Phi_shape, order='F')
else:
raise ValueError('C_CONTIGUOUS and F_CONTIGUOUS are both False. ')
return uR3, uPhi3, uz3
def solve_prepare(self):
self._set_xyk()
self._solve_prepare_xn()
self._solve_prepare_yn()
self._solve_prepare1()
self._solve_prepare2()
self._solve_prepare3()
return True
def solve_u(self, R, Phi, z):
uR1, uPhi1, uz1 = self.solve_u1(R, Phi, z)
uR2, uPhi2, uz2 = self.solve_u2(R, Phi, z)
uR3, uPhi3, uz3 = self.solve_u3(R, Phi, z)
return uR1, uPhi1, uz1, uR2, uPhi2, uz2, uR3, uPhi3, uz3
def solve_uxyz(self, nodes):
from petsc4py import PETSc
phi = np.arctan2(nodes[:, 1], nodes[:, 0])
rho = np.sqrt(nodes[:, 0] ** 2 + nodes[:, 1] ** 2)
z = nodes[:, 2]
u1 = []
u2 = []
u3 = []
# a 1D DMDA is used only to distribute the node list across MPI ranks
dmda = PETSc.DMDA().create(sizes=(nodes.shape[0],), dof=1,
stencil_width=0, comm=PETSc.COMM_WORLD)
dmda.setFromOptions()
dmda.setUp()
for i0 in range(dmda.getRanges()[0][0], dmda.getRanges()[0][1]):
t_rho = rho[i0]
t_phi = phi[i0]
t_z = z[i0]
# the solution is mirror-symmetric in z: evaluate at |z|, restore signs below
abs_z = np.abs(t_z)
sign_z = np.sign(t_z)
if np.isclose(t_rho, 1):
# at the pipe wall (rho == 1) the velocity is set to zero
ux1 = 0
uy1 = 0
uz1 = 0
ux2 = 0
uy2 = 0
uz2 = 0
ux3 = 0
uy3 = 0
uz3 = 0
else:
uR1, uPhi1, uz1, uR2, uPhi2, uz2, uR3, uPhi3, uz3 = self.solve_u(t_rho, t_phi,
abs_z)
ux1 = np.cos(t_phi) * uR1 - np.sin(t_phi) * uPhi1
ux2 = np.cos(t_phi) * uR2 - np.sin(t_phi) * uPhi2
ux3 = np.cos(t_phi) * uR3 - np.sin(t_phi) * uPhi3
uy1 = np.sin(t_phi) * uR1 + np.cos(t_phi) * uPhi1
uy2 = np.sin(t_phi) * uR2 + np.cos(t_phi) * uPhi2
uy3 = np.sin(t_phi) * uR3 + np.cos(t_phi) * uPhi3
u1.append((ux1, uy1, sign_z * uz1))
u2.append((ux2, uy2, sign_z * uz2))
u3.append((sign_z * ux3, sign_z * uy3, uz3))
comm = PETSc.COMM_WORLD.tompi4py()
u1_all = np.vstack(comm.allgather(u1))
u2_all = np.vstack(comm.allgather(u2))
u3_all = np.vstack(comm.allgather(u3))
return u1_all, u2_all, u3_all
class detail_light(detail):
def __init__(self, threshold):
super().__init__(threshold=threshold, b=0)
def set_b(self, b):
self._b = b
return True
def solve_prepare_light(self):
self._set_xyk()
self._solve_prepare_xn()
self._solve_prepare_yn()
return True
def solve_prepare_b(self):
self._solve_prepare1()
self._solve_prepare2()
self._solve_prepare3()
return True
def solve_u1_light(self, R, Phi, z):
err_msg = 'run _solve_prepare1 first. '
assert self._finish1, err_msg
AFPhi1nL = lambda xn, k, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(-1) * (omega1 + k * pi1) * iv((-1) + k, R * xn) + k * (
omega1 + pi1 + k * pi1 + (-1) * psi1) * R ** (
-1) * xn ** (-1) * iv(k, R * xn)))
AFPhi1nR = lambda yn, k, psi1, omega1, pi1, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * (
(-1) * (omega1 + k * pi1) * iv((-1) + k, R * yn) + k * (
omega1 + pi1 + k * pi1 + (-1) * psi1) * R ** (
-1) * yn ** (-1) * iv(k, R * yn)))
AFR1nL = lambda xn, k, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * R ** (-1) * xn ** (-1) * (
((-1) * pi1 + psi1) * R * xn * iv((-1) + k, R * xn) + (
k * (omega1 + pi1 + k * pi1 + (-1) * psi1) + pi1 * R ** 2 * xn ** 2) * iv(k,
R * xn)))
AFR1nR = lambda yn, k, psi1, omega1, pi1, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * R ** (-1) * yn ** (-1) * (
((-1) * pi1 + psi1) * R * yn * iv((-1) + k, R * yn) + (
k * (omega1 + pi1 + k * pi1 + (-1) * psi1) + pi1 * R ** 2 * yn ** 2) * iv(k,
R * yn)))
BFz1nL = lambda xn, k, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi1 * R * xn * iv((-1) + k, R * xn) + (pi1 + (-1) * k * pi1 + psi1) * iv(k,
R * xn)))
BFz1nR = lambda yn, k, psi1, omega1, pi1, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * (pi1 * R * yn * iv((-1) + k, R * yn) + (
pi1 + (-1) * k * pi1 + psi1) * iv(k, R * yn)))
uR1_k0 = lambda xn, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi1 * R * xn * iv(0, R * xn) + ((-1) * pi1 + psi1) * iv(1, R * xn)))
uz1_k0 = lambda xn, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(pi1 + psi1) * iv(0, R * xn) + pi1 * R * xn * iv(1, R * xn)))
uR1k0 = sum(uR1_k0(self._xn_k0, self._psi_xn1_k0, self._omega_xn1_k0, self._pi_xn1_k0, R, z,
self._DmyD_xn_k0))
uPhi1k0 = 0
uz1k0 = sum(uz1_k0(self._xn_k0, self._psi_xn1_k0, self._omega_xn1_k0, self._pi_xn1_k0, R, z,
self._DmyD_xn_k0))
t_AFR1nL = AFR1nL(self._xn, self._k, self._psi_xn1, self._omega_xn1, self._pi_xn1, R, z,
self._DmyD_xn)
t_AFR1nR = AFR1nR(self._yn, self._k, self._psi_yn1, self._omega_yn1, self._pi_yn1, R, z,
self._DmyD_yn)
t_AFPhi1nL = AFPhi1nL(self._xn, self._k, self._psi_xn1, self._omega_xn1, self._pi_xn1, R, z,
self._DmyD_xn)
t_AFPhi1nR = AFPhi1nR(self._yn, self._k, self._psi_yn1, self._omega_yn1, self._pi_yn1, R, z,
self._DmyD_yn)
t_BFz1nL = BFz1nL(self._xn, self._k, self._psi_xn1, self._omega_xn1, self._pi_xn1, R, z,
self._DmyD_xn)
t_BFz1nR = BFz1nR(self._yn, self._k, self._psi_yn1, self._omega_yn1, self._pi_yn1, R, z,
self._DmyD_yn)
uR1 = uR1k0 + sum((t_AFR1nL + t_AFR1nR) * cos(self._k * Phi))
uPhi1 = uPhi1k0 + sum((t_AFPhi1nL + t_AFPhi1nR) * sin(self._k * Phi))
uz1 = uz1k0 + sum((t_BFz1nL + t_BFz1nR) * cos(self._k * Phi))
return uR1, uPhi1, uz1
def solve_u2_light(self, R, Phi, z):
err_msg = 'run _solve_prepare2 first. '
assert self._finish2, err_msg
AFPhi2nL = lambda xn, k, psi2, omega2, pi2, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
((-1) * omega2 + k * pi2) * iv((-1) + k, R * xn) + k * (
omega2 + (-1) * (1 + k) * pi2 + psi2) * R ** (
-1) * xn ** (-1) * iv(k, R * xn)))
AFPhi2nR = lambda yn, k, psi2, omega2, pi2, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * (
((-1) * omega2 + k * pi2) * iv((-1) + k, R * yn) + k * (
omega2 + (-1) * (1 + k) * pi2 + psi2) * R ** (
-1) * yn ** (-1) * iv(k, R * yn)))
AFR2nL = lambda xn, k, psi2, omega2, pi2, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * R ** (-1) * xn ** (-1) * (
((-1) * pi2 + psi2) * R * xn * iv((-1) + k, R * xn) + (
k * ((-1) * omega2 + pi2 + k * pi2 + (
-1) * psi2) + pi2 * R ** 2 * xn ** 2) * iv(k, R * xn)))
AFR2nR = lambda yn, k, psi2, omega2, pi2, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * R ** (-1) * yn ** (-1) * (
((-1) * pi2 + psi2) * R * yn * iv((-1) + k, R * yn) + (
k * ((-1) * omega2 + pi2 + k * pi2 + (
-1) * psi2) + pi2 * R ** 2 * yn ** 2) * iv(k, R * yn)))
BFz2nL = lambda xn, k, psi2, omega2, pi2, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi2 * R * xn * iv((-1) + k, R * xn) + (pi2 + (-1) * k * pi2 + psi2) * iv(k,
R * xn)))
BFz2nR = lambda yn, k, psi2, omega2, pi2, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * (pi2 * R * yn * iv((-1) + k, R * yn) + (
pi2 + (-1) * k * pi2 + psi2) * iv(k, R * yn)))
uPhi2_k0 = lambda yn, psi2, omega2, pi2, R, z, DmyD: np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * omega2 * iv(1, R * yn))
uR2k0 = 0
uPhi2k0 = sum(
uPhi2_k0(self._yn_k0, self._psi_yn2_k0, self._omega_yn2_k0, self._pi_yn2_k0, R, z,
self._DmyD_yn_k0))
uz2k0 = 0
t_AFR2nL = AFR2nL(self._xn, self._k, self._psi_xn2, self._omega_xn2, self._pi_xn2, R, z,
self._DmyD_xn)
t_AFR2nR = AFR2nR(self._yn, self._k, self._psi_yn2, self._omega_yn2, self._pi_yn2, R, z,
self._DmyD_yn)
t_AFPhi2nL = AFPhi2nL(self._xn, self._k, self._psi_xn2, self._omega_xn2, self._pi_xn2, R, z,
self._DmyD_xn)
t_AFPhi2nR = AFPhi2nR(self._yn, self._k, self._psi_yn2, self._omega_yn2, self._pi_yn2, R, z,
self._DmyD_yn)
t_BFz2nL = BFz2nL(self._xn, self._k, self._psi_xn2, self._omega_xn2, self._pi_xn2, R, z,
self._DmyD_xn)
t_BFz2nR = BFz2nR(self._yn, self._k, self._psi_yn2, self._omega_yn2, self._pi_yn2, R, z,
self._DmyD_yn)
uR2 = uR2k0 + sum((t_AFR2nL + t_AFR2nR) * sin(self._k * Phi))
uPhi2 = uPhi2k0 + sum((t_AFPhi2nL + t_AFPhi2nR) * cos(self._k * Phi))
uz2 = uz2k0 + sum((t_BFz2nL + t_BFz2nR) * sin(self._k * Phi))
return uR2, uPhi2, uz2
def solve_u3_light(self, R, Phi, z):
err_msg = 'run _solve_prepare3 first. '
assert self._finish3, err_msg
BFPhi3nL = lambda xn, k, psi3, omega3, pi3, R, z, DmyD: 2 * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(-1) * (omega3 + k * pi3) * iv((-1) + k, R * xn) + k * (
omega3 + pi3 + k * pi3 + (-1) * psi3) * R ** (
-1) * xn ** (-1) * iv(k, R * xn)))
BFPhi3nR = lambda yn, k, psi3, omega3, pi3, R, z, DmyD: np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * (
(-1) * (omega3 + k * pi3) * iv((-1) + k, R * yn) + k * (
omega3 + pi3 + k * pi3 + (-1) * psi3) * R ** (
-1) * yn ** (-1) * iv(k, R * yn)))
BFR3nL = lambda xn, k, psi3, omega3, pi3, R, z, DmyD: 2 * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * R ** (-1) * xn ** (-1) * (
((-1) * pi3 + psi3) * R * xn * iv((-1) + k, R * xn) + (
k * (omega3 + pi3 + k * pi3 + (-1) * psi3) + pi3 * R ** 2 * xn ** 2) * iv(k,
R * xn)))
BFR3nR = lambda yn, k, psi3, omega3, pi3, R, z, DmyD: np.exp(
(-1) * z * imag(yn)) * pi * real(
DmyD ** (-1) * R ** (-1) * yn ** (-1) * (
((-1) * pi3 + psi3) * R * yn * iv((-1) + k, R * yn) + (
k * (omega3 + pi3 + k * pi3 + (-1) * psi3) + pi3 * R ** 2 * yn ** 2) * iv(k,
R * yn)))
AFz3nL = lambda xn, k, psi3, omega3, pi3, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi3 * R * xn * iv((-1) + k, R * xn) + (pi3 + (-1) * k * pi3 + psi3) * iv(k,
R * xn)))
AFz3nR = lambda yn, k, psi3, omega3, pi3, R, z, DmyD: (-1) * np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * (pi3 * R * yn * iv((-1) + k, R * yn) + (
pi3 + (-1) * k * pi3 + psi3) * iv(k, R * yn)))
uR3_k0 = lambda xn, psi3, omega3, pi3, R, z, DmyD: 2 * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi3 * R * xn * iv(0, R * xn) + ((-1) * pi3 + psi3) * iv(1, R * xn)))
uz3_k0 = lambda xn, psi3, omega3, pi3, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(pi3 + psi3) * iv(0, R * xn) + pi3 * R * xn * iv(1, R * xn)))
uR3k0 = sum(uR3_k0(self._xn_k0, self._psi_xn3_k0, self._omega_xn3_k0, self._pi_xn3_k0, R, z,
self._DmyD_xn_k0))
uPhi3k0 = 0
uz3k0 = sum(uz3_k0(self._xn_k0, self._psi_xn3_k0, self._omega_xn3_k0, self._pi_xn3_k0, R, z,
self._DmyD_xn_k0))
t_BFR3nL = BFR3nL(self._xn, self._k, self._psi_xn3, self._omega_xn3, self._pi_xn3, R, z,
self._DmyD_xn)
t_BFR3nR = BFR3nR(self._yn, self._k, self._psi_yn3, self._omega_yn3, self._pi_yn3, R, z,
self._DmyD_yn)
t_BFPhi3nL = BFPhi3nL(self._xn, self._k, self._psi_xn3, self._omega_xn3, self._pi_xn3, R, z,
self._DmyD_xn)
t_BFPhi3nR = BFPhi3nR(self._yn, self._k, self._psi_yn3, self._omega_yn3, self._pi_yn3, R, z,
self._DmyD_yn)
t_AFz3nL = AFz3nL(self._xn, self._k, self._psi_xn3, self._omega_xn3, self._pi_xn3, R, z,
self._DmyD_xn)
t_AFz3nR = AFz3nR(self._yn, self._k, self._psi_yn3, self._omega_yn3, self._pi_yn3, R, z,
self._DmyD_yn)
uR3 = uR3k0 + sum((t_BFR3nL + t_BFR3nR) * cos(self._k * Phi))
uPhi3 = uPhi3k0 + sum((t_BFPhi3nL + t_BFPhi3nR) * sin(self._k * Phi))
uz3 = uz3k0 + sum((t_AFz3nL + t_AFz3nR) * cos(self._k * Phi))
return uR3, uPhi3, uz3
def solve_u_light(self, R, Phi, z):
uR1, uPhi1, uz1 = self.solve_u1_light(R, Phi, z)
uR2, uPhi2, uz2 = self.solve_u2_light(R, Phi, z)
uR3, uPhi3, uz3 = self.solve_u3_light(R, Phi, z)
return uR1, uPhi1, uz1, uR2, uPhi2, uz2, uR3, uPhi3, uz3
class StokesletsRinginPipe_light:
def __init__(self, threshold):
self._threshold = threshold
self._b = 0
self._n = np.zeros([0])
self._xn_k0 = np.zeros([0])
self._yn_k0 = np.zeros([0])
self._DmyD_xn_k0 = np.zeros([0])
self._DmyD_yn_k0 = np.zeros([0])
self._psi_xn1_k0 = np.zeros([0])
self._psi_xn3_k0 = np.zeros([0])
self._pi_xn1_k0 = np.zeros([0])
self._pi_xn3_k0 = np.zeros([0])
self._omega_xn1_k0 = np.zeros([0])
self._omega_xn3_k0 = np.zeros([0])
self._psi_yn2_k0 = np.zeros([0])
self._pi_yn2_k0 = np.zeros([0])
self._omega_yn2_k0 = np.zeros([0])
self._finish_xyk = False # run _set_xyk first
self._finish_xn = False # run _solve_prepare_xn first
self._finish_yn = False # run _solve_prepare_yn first
self._finish1 = False # run _solve_prepare1 first
self._finish2 = False # run _solve_prepare2 first
self._finish3 = False # run _solve_prepare3 first
def _set_xyk(self):
threshold = self._threshold
nmax = int(threshold / 2)
n_use = np.arange(1, nmax + 1)
t_path = os.path.dirname(os.path.abspath(__file__))
full_path = os.path.normpath(t_path + '/' + 'xn.mat')
mat_contents = loadmat(full_path)
xn = mat_contents['xn']
xn_k0 = xn[0, 0:nmax]
full_path = os.path.normpath(t_path + '/' + 'yn.mat')
mat_contents = loadmat(full_path)
yn = mat_contents['yn']
yn_k0 = yn[0, 0:nmax]
self._n = n_use
self._xn_k0 = xn_k0
self._yn_k0 = yn_k0
self._finish_xyk = True
return True
def get_b(self):
return self._b
def set_b(self, b):
self._b = b
return True
def solve_prepare_light(self):
self._set_xyk()
self._solve_prepare_xn()
self._solve_prepare_yn()
return True
def solve_prepare_b(self):
self._solve_prepare1()
self._solve_prepare2()
self._solve_prepare3()
return True
def _solve_prepare_xn(self):
err_msg = 'run _set_xyk first. '
assert self._finish_xyk, err_msg
DmyDk0 = lambda s: 2 * iv(0, s) * (
s * iv(0, s) ** 2 + (-4) * iv(0, s) * iv(1, s) + (-1) * s ** (-1) * (
(-4) + s ** 2) * iv(1, s) ** 2)
self._DmyD_xn_k0 = DmyDk0(self._xn_k0)
self._finish_xn = True
return True
def _solve_prepare_yn(self):
err_msg = 'run _set_xyk first. '
assert self._finish_xyk, err_msg
DmyDk0 = lambda s: 2 * iv(0, s) * (
s * iv(0, s) ** 2 + (-4) * iv(0, s) * iv(1, s) + (-1) * s ** (-1) * (
(-4) + s ** 2) * iv(1, s) ** 2)
self._DmyD_yn_k0 = DmyDk0(self._yn_k0)
self._finish_yn = True
return True
def _solve_prepare1(self):
err_msg = 'run _solve_prepare_xn first. '
assert self._finish_xn, err_msg
psi1_k0 = lambda s, b: (1 / 16) * pi ** (-2) * iv(1, s) * (
(-4) * s ** 2 * (iv(0, s) + iv(2, s)) * (
b * iv(0, b * s) * kv(0, s) + (-1) * iv(1, b * s) * kv(1, s)) + (
-8) * s * (iv(0, s) + s * iv(1, s)) * (
b * iv(0, b * s) * kv(1, s) + (-1) * iv(1, b * s) * kv(2, s)))
pi1_k0 = lambda s, b: (1 / 2) * pi ** (-2) * iv(1, s) * (
b * iv(0, b * s) + (-1) * s * iv(1, b * s) * (
iv(1, s) * kv(1, s) + iv(0, s) * kv(2, s)))
self._psi_xn1_k0 = psi1_k0(self._xn_k0, self._b)
self._omega_xn1_k0 = 0
self._pi_xn1_k0 = pi1_k0(self._xn_k0, self._b)
self._finish1 = True
return True
def _solve_prepare2(self):
err_msg = 'run _solve_prepare_yn first. '
assert self._finish_yn, err_msg
omega2_k0 = lambda s, b: pi ** (-2) * (s * iv(0, s) ** 2 +
(-2) * iv(0, s) * iv(1, s) +
(-1) * s * iv(1, s) ** 2) * iv(1, b * s) * kv(1, s)
self._psi_yn2_k0 = 0
self._omega_yn2_k0 = omega2_k0(self._yn_k0, self._b)
self._pi_yn2_k0 = 0
self._finish2 = True
return True
def _solve_prepare3(self):
err_msg = 'run _solve_prepare_xn first. '
assert self._finish_xn, err_msg
psi3_k0 = lambda s, b: (1 / 4) * pi ** (-2) * s * iv(1, s) * (
b * iv(1, b * s) * (
(-1) * s * (iv(0, s) + iv(2, s)) * kv(0, s)
+ (-2) * (iv(0, s) + s * iv(1, s)) * kv(1, s))
+ iv(0, b * s) * (
2 * (s * iv(1, s) + (-1) * iv(2, s)) * kv(0, s)
+ s * (iv(0, s) + iv(2, s)) * kv(1, s)))
pi3_k0 = lambda s, b: (1 / 2) * pi ** (-2) * iv(1, s) * (
b * iv(1, b * s) + (-1) * s * iv(0, b * s) * (
iv(2, s) * kv(0, s) + iv(1, s) * kv(1, s)))
self._psi_xn3_k0 = psi3_k0(self._xn_k0, self._b)
self._omega_xn3_k0 = 0
self._pi_xn3_k0 = pi3_k0(self._xn_k0, self._b)
self._finish3 = True
return True
def solve_u1_light(self, R, z):
err_msg = 'run _solve_prepare1 first. '
assert self._finish1, err_msg
uR1_k0 = lambda xn, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi1 * R * xn * iv(0, R * xn) + ((-1) * pi1 + psi1) * iv(1, R * xn)))
uz1_k0 = lambda xn, psi1, omega1, pi1, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(pi1 + psi1) * iv(0, R * xn) + pi1 * R * xn * iv(1, R * xn)))
uR1k0 = sum(uR1_k0(self._xn_k0, self._psi_xn1_k0, self._omega_xn1_k0, self._pi_xn1_k0, R, z,
self._DmyD_xn_k0))
uz1k0 = sum(uz1_k0(self._xn_k0, self._psi_xn1_k0, self._omega_xn1_k0, self._pi_xn1_k0, R, z,
self._DmyD_xn_k0))
uR1 = uR1k0 * 2 * np.pi
uPhi1 = 0
uz1 = uz1k0 * 2 * np.pi
return uR1, uPhi1, uz1
def solve_u2_light(self, R, z):
err_msg = 'run _solve_prepare2 first. '
assert self._finish2, err_msg
uPhi2_k0 = lambda yn, psi2, omega2, pi2, R, z, DmyD: np.exp(
(-1) * z * imag(yn)) * pi * imag(
DmyD ** (-1) * omega2 * iv(1, R * yn))
uPhi2k0 = sum(
uPhi2_k0(self._yn_k0, self._psi_yn2_k0, self._omega_yn2_k0, self._pi_yn2_k0, R, z,
self._DmyD_yn_k0))
uR2 = 0
uPhi2 = uPhi2k0 * 2 * np.pi
uz2 = 0
return uR2, uPhi2, uz2
def solve_u3_light(self, R, z):
err_msg = 'run _solve_prepare3 first. '
assert self._finish3, err_msg
uR3_k0 = lambda xn, psi3, omega3, pi3, R, z, DmyD: 2 * np.exp(
(-1) * z * imag(xn)) * pi * real(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
pi3 * R * xn * iv(0, R * xn) + ((-1) * pi3 + psi3) * iv(1, R * xn)))
uz3_k0 = lambda xn, psi3, omega3, pi3, R, z, DmyD: (-2) * np.exp(
(-1) * z * imag(xn)) * pi * imag(
DmyD ** (-1) * np.exp(sqrt(-1 + 0j) * z * real(xn)) * (
(pi3 + psi3) * iv(0, R * xn) + pi3 * R * xn * iv(1, R * xn)))
uR3k0 = sum(uR3_k0(self._xn_k0, self._psi_xn3_k0, self._omega_xn3_k0, self._pi_xn3_k0, R, z,
self._DmyD_xn_k0))
uz3k0 = sum(uz3_k0(self._xn_k0, self._psi_xn3_k0, self._omega_xn3_k0, self._pi_xn3_k0, R, z,
self._DmyD_xn_k0))
uR3 = uR3k0 * 2 * np.pi
uPhi3 = 0
uz3 = uz3k0 * 2 * np.pi
return uR3, uPhi3, uz3
def solve_u_light(self, R, z):
uR1, uPhi1, uz1 = self.solve_u1_light(R, z)
uR2, uPhi2, uz2 = self.solve_u2_light(R, z)
uR3, uPhi3, uz3 = self.solve_u3_light(R, z)
return uR1, uPhi1, uz1, uR2, uPhi2, uz2, uR3, uPhi3, uz3
# =========================================================================
# File: discovery-provider/tests/test_get_aggregate_route_metrics.py
# Repo: audius-protocol (license: Apache-2.0)
# =========================================================================
from datetime import date, timedelta
from src.models import (
AggregateDailyUniqueUsersMetrics,
AggregateDailyTotalUsersMetrics,
AggregateMonthlyUniqueUsersMetrics,
AggregateMonthlyTotalUsersMetrics,
)
from src.queries.get_route_metrics import _get_aggregate_route_metrics
from src.utils.db_session import get_db
limit = 2
today = date.today()
yesterday = today - timedelta(days=1)
def test_get_aggregate_route_metrics_week(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateDailyUniqueUsersMetrics(
count=1, summed_count=2, timestamp=today - timedelta(days=8)
),
AggregateDailyUniqueUsersMetrics(
count=2, summed_count=3, timestamp=yesterday - timedelta(days=1)
),
AggregateDailyUniqueUsersMetrics(
count=3, summed_count=4, timestamp=yesterday
),
AggregateDailyUniqueUsersMetrics(
count=4, summed_count=5, timestamp=today
),
AggregateDailyTotalUsersMetrics(
count=2, timestamp=today - timedelta(days=8)
),
AggregateDailyTotalUsersMetrics(
count=4, timestamp=yesterday - timedelta(days=1)
),
AggregateDailyTotalUsersMetrics(count=6, timestamp=yesterday),
AggregateDailyTotalUsersMetrics(count=8, timestamp=today),
]
)
aggregate_metrics = _get_aggregate_route_metrics(session, "week", "day")
assert len(aggregate_metrics) == 2
assert aggregate_metrics[0]["unique_count"] == 2
assert aggregate_metrics[0]["summed_unique_count"] == 3
assert aggregate_metrics[0]["total_count"] == 4
assert aggregate_metrics[1]["unique_count"] == 3
assert aggregate_metrics[1]["summed_unique_count"] == 4
assert aggregate_metrics[1]["total_count"] == 6
def test_get_aggregate_route_metrics_month_daily_bucket(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateDailyUniqueUsersMetrics(
count=1, summed_count=2, timestamp=today - timedelta(days=31)
),
AggregateDailyUniqueUsersMetrics(
count=2, summed_count=3, timestamp=today - timedelta(days=8)
),
AggregateDailyUniqueUsersMetrics(
count=3, summed_count=4, timestamp=yesterday
),
AggregateDailyUniqueUsersMetrics(
count=4, summed_count=5, timestamp=today
),
AggregateDailyTotalUsersMetrics(
count=2, timestamp=today - timedelta(days=31)
),
AggregateDailyTotalUsersMetrics(
count=4, timestamp=today - timedelta(days=8)
),
AggregateDailyTotalUsersMetrics(count=6, timestamp=yesterday),
AggregateDailyTotalUsersMetrics(count=8, timestamp=today),
]
)
aggregate_metrics = _get_aggregate_route_metrics(session, "month", "day")
assert len(aggregate_metrics) == 2
assert aggregate_metrics[0]["unique_count"] == 2
assert aggregate_metrics[0]["summed_unique_count"] == 3
assert aggregate_metrics[0]["total_count"] == 4
assert aggregate_metrics[1]["unique_count"] == 3
assert aggregate_metrics[1]["summed_unique_count"] == 4
assert aggregate_metrics[1]["total_count"] == 6
def test_get_aggregate_route_metrics_month_weekly_bucket(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateDailyUniqueUsersMetrics(
count=1, summed_count=2, timestamp=today - timedelta(days=31)
),
AggregateDailyUniqueUsersMetrics(
count=2, summed_count=3, timestamp=today - timedelta(days=8)
),
AggregateDailyUniqueUsersMetrics(
count=3, summed_count=4, timestamp=yesterday
),
AggregateDailyUniqueUsersMetrics(
count=4, summed_count=5, timestamp=today
),
AggregateDailyTotalUsersMetrics(
count=2, timestamp=today - timedelta(days=31)
),
AggregateDailyTotalUsersMetrics(
count=4, timestamp=today - timedelta(days=8)
),
AggregateDailyTotalUsersMetrics(count=6, timestamp=yesterday),
AggregateDailyTotalUsersMetrics(count=8, timestamp=today),
]
)
aggregate_metrics = _get_aggregate_route_metrics(session, "month", "week")
assert len(aggregate_metrics) == 2
assert aggregate_metrics[0]["unique_count"] == 2
assert aggregate_metrics[0]["summed_unique_count"] == 3
assert aggregate_metrics[0]["total_count"] == 4
assert aggregate_metrics[1]["unique_count"] == 3
assert aggregate_metrics[1]["summed_unique_count"] == 4
assert aggregate_metrics[1]["total_count"] == 6
def test_get_aggregate_route_metrics_all_time_monthly_bucket(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateMonthlyUniqueUsersMetrics(
count=1, summed_count=2, timestamp=today - timedelta(days=367)
),
AggregateMonthlyUniqueUsersMetrics(
count=2, summed_count=3, timestamp=today - timedelta(days=100)
),
AggregateMonthlyUniqueUsersMetrics(
count=3, summed_count=4, timestamp=today
),
AggregateMonthlyTotalUsersMetrics(
count=2, timestamp=today - timedelta(days=367)
),
AggregateMonthlyTotalUsersMetrics(
count=4, timestamp=today - timedelta(days=100)
),
AggregateMonthlyTotalUsersMetrics(count=6, timestamp=today),
]
)
aggregate_metrics = _get_aggregate_route_metrics(session, "all_time", "month")
assert len(aggregate_metrics) == 2
assert aggregate_metrics[0]["unique_count"] == 1
assert aggregate_metrics[0]["summed_unique_count"] == 2
assert aggregate_metrics[0]["total_count"] == 2
assert aggregate_metrics[1]["unique_count"] == 2
assert aggregate_metrics[1]["summed_unique_count"] == 3
assert aggregate_metrics[1]["total_count"] == 4
def test_get_aggregate_route_metrics_all_time_weekly_bucket(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateDailyUniqueUsersMetrics(
count=1, summed_count=2, timestamp=today - timedelta(days=367)
),
AggregateDailyUniqueUsersMetrics(
count=2, summed_count=3, timestamp=yesterday
),
AggregateDailyUniqueUsersMetrics(
count=3, summed_count=4, timestamp=today
),
AggregateDailyTotalUsersMetrics(
count=2, timestamp=today - timedelta(days=367)
),
AggregateDailyTotalUsersMetrics(count=4, timestamp=yesterday),
AggregateDailyTotalUsersMetrics(count=6, timestamp=today),
]
)
aggregate_metrics = _get_aggregate_route_metrics(session, "all_time", "week")
assert len(aggregate_metrics) == 2
assert aggregate_metrics[0]["unique_count"] == 1
assert aggregate_metrics[0]["summed_unique_count"] == 2
assert aggregate_metrics[0]["total_count"] == 2
assert aggregate_metrics[1]["unique_count"] == 2
assert aggregate_metrics[1]["summed_unique_count"] == 3
assert aggregate_metrics[1]["total_count"] == 4
# =========================================================================
# File: src/genie/libs/parser/iosxr/tests/ShowMplsTrafficEngTunnelsTabular/cli/equal/golden_output_1_expected.py
# Repo: balmasea/genieparser (license: Apache-2.0)
# =========================================================================
expected_output = {
"vrf": {
"default": {
"tunnel": {
"tunnel-te50000": {
"lsp_id": 4,
"destination_address": "109.109.109.109",
"source_address": "17.17.17.17",
"tunnel_state": "up",
"frr_state": "Ready",
"lsp_role": "Head",
"path_prot": "Inact",
},
"tunnel-te50010": {
"lsp_id": 3,
"destination_address": "108.108.108.108",
"source_address": "17.17.17.17",
"tunnel_state": "up",
"frr_state": "Ready",
"lsp_role": "Head",
"path_prot": "Inact",
},
"tunnel-te53000": {
"lsp_id": 2,
"destination_address": "141.141.141.141",
"source_address": "17.17.17.17",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Head",
"path_prot": "Inact",
},
"tunnel-te53010": {
"lsp_id": 2,
"destination_address": "106.106.106.106",
"source_address": "17.17.17.17",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Head",
"path_prot": "Inact",
},
"tunnel-te53100": {
"lsp_id": 2,
"destination_address": "107.107.107.107",
"source_address": "17.17.17.17",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Head",
"path_prot": "Inact",
},
"tunnel-te53300": {
"lsp_id": 2,
"destination_address": "109.109.109.109",
"source_address": "17.17.17.17",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Head",
"path_prot": "Inact",
},
"tunnel-te53400": {
"lsp_id": 2,
"destination_address": "107.107.107.107",
"source_address": "17.17.17.17",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Head",
"path_prot": "Inact",
},
"NHOP_15060_F141-A": {
"lsp_id": 3,
"destination_address": "17.17.17.17",
"source_address": "141.141.141.141",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Tail",
},
"NNHOP_15370_F141-": {
"lsp_id": 3,
"destination_address": "106.106.106.106",
"source_address": "141.141.141.141",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Mid",
},
"NHOP_16060_F107-A": {
"lsp_id": 61,
"destination_address": "106.106.106.106",
"source_address": "107.107.107.107",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Mid",
},
"NNHOP_16370_F107-": {
"lsp_id": 48,
"destination_address": "17.17.17.17",
"source_address": "107.107.107.107",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Tail",
},
"NNHOP_16710_F107-": {
"lsp_id": 74,
"destination_address": "17.17.17.17",
"source_address": "107.107.107.107",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Tail",
},
"51000_F106-ASR900": {
"lsp_id": 167,
"destination_address": "109.109.109.109",
"source_address": "106.106.106.106",
"tunnel_state": "up",
"frr_state": "Ready",
"lsp_role": "Mid",
},
"55000_F109-ASR900": {
"lsp_id": 3,
"destination_address": "17.17.17.17",
"source_address": "109.109.109.109",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Tail",
},
"56000_F108-ASR900": {
"lsp_id": 26,
"destination_address": "17.17.17.17",
"source_address": "108.108.108.108",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Tail",
},
"NNHOP_58300_F109-": {
"lsp_id": 4,
"destination_address": "17.17.17.17",
"source_address": "109.109.109.109",
"tunnel_state": "up",
"frr_state": "Inact",
"lsp_role": "Tail",
},
}
}
}
}
# =========================================================================
# File: azext_iot/tests/dps/test_iot_dps_int.py
# Repo: jongio/azure-iot-cli-extension (license: MIT)
# =========================================================================
# coding=utf-8
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------
from azext_iot.common.shared import EntityStatusType, AttestationType, AllocationType, ReprovisionType
from azext_iot.common.utility import generate_key
from azext_iot.tests.dps import (
API_VERSION,
CERT_PATH,
DATAPLANE_AUTH_TYPES,
WEBHOOK_URL,
IoTDPSLiveScenarioTest
)
test_endorsement_key = (
"AToAAQALAAMAsgAgg3GXZ0SEs/gakMyNRqXXJP1S124GUgtk8qHaGzMUaaoABgCAAEMAEAgAAAAAAAEAibym9HQP9vxCGF5dVc1Q"
"QsAGe021aUGJzNol1/gycBx3jFsTpwmWbISRwnFvflWd0w2Mc44FAAZNaJOAAxwZvG8GvyLlHh6fGKdh+mSBL4iLH2bZ4Ry22cB3"
"CJVjXmdGoz9Y/j3/NwLndBxQC+baNvzvyVQZ4/A2YL7vzIIj2ik4y+ve9ir7U0GbNdnxskqK1KFIITVVtkTIYyyFTIR0BySjPrRI"
"Dj7r7Mh5uF9HBppGKQCBoVSVV8dI91lNazmSdpGWyqCkO7iM4VvUMv2HT/ym53aYlUrau+Qq87Tu+uQipWYgRdF11KDfcpMHqqzB"
"QQ1NpOJVhrsTrhyJzO7KNw=="
)
class TestDPSEnrollments(IoTDPSLiveScenarioTest):
def __init__(self, test_method):
super(TestDPSEnrollments, self).__init__(test_method)
def test_dps_compute_device_key(self):
offline_device_key = self.cmd(
'az iot dps compute-device-key --key "{}" '
"--registration-id myarbitrarydeviceId".format(test_endorsement_key)
).output
offline_device_key = offline_device_key.strip("\"'\n")
assert offline_device_key == "cT/EXZvsplPEpT//p98Pc6sKh8mY3kYgSxavHwMkl7w="
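The expected value above is deterministic because a DPS derived device key is an HMAC of the registration id. A minimal self-contained sketch of that derivation (the helper name `derive_device_key` is ours, not part of the extension):

```python
import base64
import hashlib
import hmac


def derive_device_key(enrollment_key_b64: str, registration_id: str) -> str:
    """Derive a per-device key: HMAC-SHA256 of the registration id,
    keyed with the base64-decoded enrollment (group) key."""
    key_bytes = base64.b64decode(enrollment_key_b64)
    digest = hmac.new(key_bytes, registration_id.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")
```

The same registration id and key always yield the same device key, which is why the test can assert a fixed string for `myarbitrarydeviceId`.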
def test_dps_enrollment_tpm_lifecycle(self):
attestation_type = AttestationType.tpm.value
for auth_phase in DATAPLANE_AUTH_TYPES:
enrollment_id = self.generate_enrollment_names()[0]
device_id = self.generate_device_names()[0]
enrollment = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment create --enrollment-id {} --attestation-type {}"
" -g {} --dps-name {} --endorsement-key {}"
" --provisioning-status {} --device-id {} --initial-twin-tags {}"
" --initial-twin-properties {} --device-information {} "
"--allocation-policy {} --iot-hubs {}".format(
enrollment_id,
attestation_type,
self.entity_rg,
self.entity_dps_name,
test_endorsement_key,
EntityStatusType.enabled.value,
device_id,
'"{generic_dict}"',
'"{generic_dict}"',
'"{generic_dict}"',
AllocationType.static.value,
self.hub_host_name,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.enabled.value),
self.check("deviceId", device_id),
self.check("allocationPolicy", AllocationType.static.value),
self.check("iotHubs", self.hub_host_name.split()),
self.check("initialTwin.tags", self.kwargs["generic_dict"]),
self.check("optionalDeviceInformation", self.kwargs["generic_dict"]),
self.check(
"initialTwin.properties.desired", self.kwargs["generic_dict"]
),
self.exists("reprovisionPolicy"),
self.check("reprovisionPolicy.migrateDeviceData", True),
self.check("reprovisionPolicy.updateHubAssignment", True),
],
).get_output_in_json()
etag = enrollment["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment list -g {} --dps-name {}".format(
self.entity_rg, self.entity_dps_name
),
auth_type=auth_phase
),
checks=[
self.check("length(@)", 1),
self.check("[0].registrationId", enrollment_id),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment show -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[self.check("registrationId", enrollment_id)],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment show -g {} --dps-name {} --enrollment-id {} --show-keys".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[
self.check("registrationId", enrollment_id),
self.check("attestation.type", attestation_type),
self.exists("attestation.{}".format(attestation_type)),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment update -g {} --dps-name {} --enrollment-id {}"
" --provisioning-status {} --etag {} --info {}".format(
self.entity_rg,
self.entity_dps_name,
enrollment_id,
EntityStatusType.disabled.value,
etag,
'""'
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.disabled.value),
self.check("deviceId", device_id),
self.check("allocationPolicy", AllocationType.static.value),
self.check("iotHubs", self.hub_host_name.split()),
self.exists("initialTwin.tags"),
self.exists("initialTwin.properties.desired"),
self.exists("optionalDeviceInformation"),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
)
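Each update in these lifecycle tests passes `--etag` captured from the previous response, and the service rejects writes whose etag no longer matches. A generic sketch of that compare-and-swap contract (the `EtagStore` class is illustrative, not the service implementation):

```python
class EtagStore:
    """Toy document store enforcing etag-based optimistic concurrency."""

    def __init__(self):
        self._doc = {}
        self._etag = 0

    def read(self):
        return dict(self._doc), self._etag

    def update(self, changes: dict, etag: int) -> int:
        # Reject the write if the caller's etag is stale.
        if etag != self._etag:
            raise ValueError("precondition failed: stale etag")
        self._doc.update(changes)
        self._etag += 1
        return self._etag
```

This is why each `self.cmd(...)` that mutates an enrollment re-reads `["etag"]` from its JSON output before issuing the next update.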
def test_dps_enrollment_x509_lifecycle(self):
attestation_type = AttestationType.x509.value
for auth_phase in DATAPLANE_AUTH_TYPES:
enrollment_id = self.generate_enrollment_names()[0]
device_id = self.generate_device_names()[0]
etag = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment create --enrollment-id {} --attestation-type {}"
" -g {} --dps-name {} --cp {} --scp {}"
" --provisioning-status {} --device-id {}"
" --initial-twin-tags {} --initial-twin-properties {}"
" --allocation-policy {} --iot-hubs {}".format(
enrollment_id,
attestation_type,
self.entity_rg,
self.entity_dps_name,
CERT_PATH,
CERT_PATH,
EntityStatusType.enabled.value,
device_id,
'"{generic_dict}"',
'"{generic_dict}"',
AllocationType.hashed.value,
self.hub_host_name,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.enabled.value),
self.check("deviceId", device_id),
self.check("allocationPolicy", AllocationType.hashed.value),
self.check("iotHubs", self.hub_host_name.split()),
self.check("initialTwin.tags", self.kwargs["generic_dict"]),
self.check(
"initialTwin.properties.desired", self.kwargs["generic_dict"]
),
self.exists("reprovisionPolicy"),
self.check("reprovisionPolicy.migrateDeviceData", True),
self.check("reprovisionPolicy.updateHubAssignment", True),
],
).get_output_in_json()["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment list -g {} --dps-name {}".format(self.entity_rg, self.entity_dps_name),
auth_type=auth_phase
),
checks=[
self.check("length(@)", 1),
self.check("[0].registrationId", enrollment_id),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment show -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[self.check("registrationId", enrollment_id)],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment update -g {} --dps-name {} --enrollment-id {}"
" --provisioning-status {} --etag {} --info {} --rc".format(
self.entity_rg,
self.entity_dps_name,
enrollment_id,
EntityStatusType.disabled.value,
etag,
'"{generic_dict}"',
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.disabled.value),
self.check("deviceId", device_id),
self.check("allocationPolicy", AllocationType.hashed.value),
self.check("iotHubs", self.hub_host_name.split()),
self.exists("initialTwin.tags"),
self.exists("initialTwin.properties.desired"),
self.check("optionalDeviceInformation", self.kwargs["generic_dict"]),
                    self.check("attestation.x509.clientCertificates.primary", None),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
)
def test_dps_enrollment_symmetrickey_lifecycle(self):
attestation_type = AttestationType.symmetricKey.value
for auth_phase in DATAPLANE_AUTH_TYPES:
enrollment_id, enrollment_id2 = self.generate_enrollment_names(count=2)
primary_key = generate_key()
secondary_key = generate_key()
device_id = self.generate_enrollment_names()[0]
# Use provided keys
etag = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment create --enrollment-id {} --attestation-type {}"
" -g {} --dps-name {} --pk {} --sk {}"
" --provisioning-status {} --device-id {}"
" --initial-twin-tags {} --initial-twin-properties {} --device-information {}"
" --allocation-policy {} --rp {} --iot-hubs {} --edge-enabled".format(
enrollment_id,
attestation_type,
self.entity_rg,
self.entity_dps_name,
primary_key,
secondary_key,
EntityStatusType.enabled.value,
device_id,
'"{generic_dict}"',
'"{generic_dict}"',
'"{generic_dict}"',
AllocationType.geolatency.value.lower(),
ReprovisionType.reprovisionandresetdata.value,
self.hub_host_name,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.enabled.value),
self.check("deviceId", device_id),
self.check("allocationPolicy", AllocationType.geolatency.value),
self.check("iotHubs", self.hub_host_name.split()),
self.check("initialTwin.tags", self.kwargs["generic_dict"]),
self.check("optionalDeviceInformation", self.kwargs["generic_dict"]),
self.check(
"initialTwin.properties.desired", self.kwargs["generic_dict"]
),
self.exists("reprovisionPolicy"),
self.check("reprovisionPolicy.migrateDeviceData", False),
self.check("reprovisionPolicy.updateHubAssignment", True),
self.check("capabilities.iotEdge", True),
],
).get_output_in_json()["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment list -g {} --dps-name {}".format(self.entity_rg, self.entity_dps_name),
auth_type=auth_phase
),
checks=[
self.check("length(@)", 1),
self.check("[0].registrationId", enrollment_id),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment show -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[self.check("registrationId", enrollment_id)],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment update -g {} --dps-name {} --enrollment-id {}"
" --provisioning-status {} --etag {} --edge-enabled False"
" --allocation-policy {} --webhook-url {} --api-version {}".format(
self.entity_rg,
self.entity_dps_name,
enrollment_id,
EntityStatusType.disabled.value,
etag,
AllocationType.custom.value,
WEBHOOK_URL,
API_VERSION,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.disabled.value),
self.check("deviceId", device_id),
self.check("allocationPolicy", "custom"),
self.check("customAllocationDefinition.webhookUrl", WEBHOOK_URL),
self.check("customAllocationDefinition.apiVersion", API_VERSION),
self.check("iotHubs", None),
self.exists("initialTwin.tags"),
self.exists("initialTwin.properties.desired"),
self.check("attestation.symmetricKey.primaryKey", primary_key),
self.check("capabilities.iotEdge", False),
],
)
# Use service generated keys
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment create --enrollment-id {} --attestation-type {}"
" -g {} --dps-name {} --allocation-policy {} --webhook-url {} --api-version {}".format(
enrollment_id2,
attestation_type,
self.entity_rg,
self.entity_dps_name,
AllocationType.custom.value,
WEBHOOK_URL,
API_VERSION,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id2),
self.check("allocationPolicy", "custom"),
self.check("customAllocationDefinition.webhookUrl", WEBHOOK_URL),
self.check("customAllocationDefinition.apiVersion", API_VERSION),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id2
),
auth_type=auth_phase
),
)
def test_dps_enrollment_group_x509_lifecycle(self):
for auth_phase in DATAPLANE_AUTH_TYPES:
enrollment_id = self.generate_enrollment_names(group=True)[0]
etag = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group create --enrollment-id {} -g {} --dps-name {}"
" --cp {} --scp {} --provisioning-status {} --allocation-policy {}"
" --iot-hubs {} --edge-enabled".format(
enrollment_id,
self.entity_rg,
self.entity_dps_name,
CERT_PATH,
CERT_PATH,
EntityStatusType.enabled.value,
AllocationType.geolatency.value,
self.hub_host_name,
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.enabled.value),
self.exists("reprovisionPolicy"),
self.check("allocationPolicy", AllocationType.geolatency.value),
self.check("iotHubs", self.hub_host_name.split()),
self.check("reprovisionPolicy.migrateDeviceData", True),
self.check("reprovisionPolicy.updateHubAssignment", True),
self.check("capabilities.iotEdge", True),
],
).get_output_in_json()["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group list -g {} --dps-name {}".format(self.entity_rg, self.entity_dps_name),
auth_type=auth_phase
),
checks=[
self.check("length(@)", 1),
self.check("[0].enrollmentGroupId", enrollment_id),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group show -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[self.check("enrollmentGroupId", enrollment_id)],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group show -g {} --dps-name {} --enrollment-id {} --show-keys".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_id),
self.exists("attestation.x509"),
],
)
# Compute Device Key only works for symmetric key enrollment groups
self.cmd(
self.set_cmd_auth_type(
'az iot dps compute-device-key -g {} --dps-name {} --enrollment-id {} '
"--registration-id myarbitrarydeviceId".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
expect_failure=True
)
etag = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group update -g {} --dps-name {} --enrollment-id {}"
" --provisioning-status {} --rsc --etag {} --rp {} --allocation-policy {}"
" --edge-enabled False --scp {}".format(
self.entity_rg,
self.entity_dps_name,
enrollment_id,
EntityStatusType.disabled.value,
etag,
ReprovisionType.never.value,
AllocationType.hashed.value,
CERT_PATH,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", AttestationType.x509.value),
self.check("enrollmentGroupId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.disabled.value),
                    self.check("attestation.x509.clientCertificates.secondary", None),
self.exists("reprovisionPolicy"),
self.check("allocationPolicy", AllocationType.hashed.value),
self.check("reprovisionPolicy.migrateDeviceData", False),
self.check("reprovisionPolicy.updateHubAssignment", False),
self.check("capabilities.iotEdge", False),
],
).get_output_in_json()["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps registration list -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[self.check("length(@)", 0)],
)
cert_name = self.create_random_name("certificate-for-test", length=48)
cert_etag = self.cmd(
"iot dps certificate create -g {} --dps-name {} --name {} --p {}".format(
self.entity_rg, self.entity_dps_name, cert_name, CERT_PATH
),
checks=[self.check("name", cert_name)],
).get_output_in_json()["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group update -g {} --dps-name {} --enrollment-id {}"
" --cn {} --etag {} --allocation-policy {} --webhook-url {} --api-version {}".format(
self.entity_rg,
self.entity_dps_name,
enrollment_id,
cert_name,
etag,
AllocationType.custom.value,
WEBHOOK_URL,
API_VERSION,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", AttestationType.x509.value),
self.check("enrollmentGroupId", enrollment_id),
self.check("allocationPolicy", "custom"),
self.check("customAllocationDefinition.webhookUrl", WEBHOOK_URL),
self.check("customAllocationDefinition.apiVersion", API_VERSION),
self.check("attestation.x509.caReferences.primary", cert_name),
self.check("attestation.x509.caReferences.secondary", None),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
)
self.cmd(
"iot dps certificate delete -g {} --dps-name {} --name {} --etag {}".format(
self.entity_rg, self.entity_dps_name, cert_name, cert_etag
),
)
def test_dps_enrollment_group_symmetrickey_lifecycle(self):
attestation_type = AttestationType.symmetricKey.value
for auth_phase in DATAPLANE_AUTH_TYPES:
enrollment_id, enrollment_id2 = self.generate_enrollment_names(count=2, group=True)
primary_key = generate_key()
secondary_key = generate_key()
etag = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group create --enrollment-id {}"
" -g {} --dps-name {} --pk {} --sk {} --provisioning-status {}"
" --initial-twin-tags {} --initial-twin-properties {}"
" --allocation-policy {} --rp {} --iot-hubs {} --edge-enabled".format(
enrollment_id,
self.entity_rg,
self.entity_dps_name,
primary_key,
secondary_key,
EntityStatusType.enabled.value,
'"{generic_dict}"',
'"{generic_dict}"',
AllocationType.geolatency.value,
ReprovisionType.reprovisionandresetdata.value,
self.hub_host_name,
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.enabled.value),
self.check("allocationPolicy", AllocationType.geolatency.value),
self.check("iotHubs", self.hub_host_name.split()),
self.check("initialTwin.tags", self.kwargs["generic_dict"]),
self.check(
"initialTwin.properties.desired", self.kwargs["generic_dict"]
),
self.exists("reprovisionPolicy"),
self.check("reprovisionPolicy.migrateDeviceData", False),
self.check("reprovisionPolicy.updateHubAssignment", True),
self.check("capabilities.iotEdge", True),
],
).get_output_in_json()["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group list -g {} --dps-name {}".format(self.entity_rg, self.entity_dps_name),
auth_type=auth_phase
),
checks=[
self.check("length(@)", 1),
self.check("[0].enrollmentGroupId", enrollment_id),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group show -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
checks=[self.check("enrollmentGroupId", enrollment_id)],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group update -g {} --dps-name {} --enrollment-id {}"
" --provisioning-status {} --etag {} --edge-enabled False"
" --allocation-policy {} --webhook-url {} --api-version {}".format(
self.entity_rg,
self.entity_dps_name,
enrollment_id,
EntityStatusType.disabled.value,
etag,
AllocationType.custom.value,
WEBHOOK_URL,
API_VERSION,
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.disabled.value),
self.check("allocationPolicy", "custom"),
self.check("customAllocationDefinition.webhookUrl", WEBHOOK_URL),
self.check("customAllocationDefinition.apiVersion", API_VERSION),
self.check("iotHubs", None),
self.exists("initialTwin.tags"),
self.exists("initialTwin.properties.desired"),
self.check("attestation.symmetricKey.primaryKey", primary_key),
self.check("capabilities.iotEdge", False),
],
)
# Use service generated keys
etag = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group create -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id2
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_id2),
self.check("attestation.type", attestation_type),
],
).get_output_in_json()["etag"]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group list -g {} --dps-name {}".format(self.entity_rg, self.entity_dps_name),
auth_type=auth_phase
),
checks=[
self.check("length(@)", 2)
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group show -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id2
),
auth_type=auth_phase
),
checks=[self.check("enrollmentGroupId", enrollment_id2)],
)
keys = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group show -g {} --dps-name {} --enrollment-id {} --show-keys".format(
self.entity_rg, self.entity_dps_name, enrollment_id2
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_id2),
self.exists("attestation.symmetricKey"),
],
).get_output_in_json()["attestation"]["symmetricKey"]
# Compute Device Key tests
online_device_key = self.cmd(
self.set_cmd_auth_type(
'az iot dps compute-device-key -g {} --dps-name {} --enrollment-id {} '
"--registration-id myarbitrarydeviceId".format(
self.entity_rg, self.entity_dps_name, enrollment_id2
),
auth_type=auth_phase
),
).output
offline_device_key = self.cmd(
'az iot dps compute-device-key --key "{}" '
"--registration-id myarbitrarydeviceId".format(keys["primaryKey"])
).output
assert offline_device_key == online_device_key
# Compute Device Key uses primary key
offline_device_key = self.cmd(
'az iot dps compute-device-key --key "{}" '
"--registration-id myarbitrarydeviceId".format(keys["secondaryKey"])
).output
assert offline_device_key != online_device_key
etag = self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group update -g {} --dps-name {} --enrollment-id {}"
" --pk {} --sk {} --etag {}".format(
self.entity_rg,
self.entity_dps_name,
enrollment_id2,
keys["secondaryKey"],
keys["primaryKey"],
etag
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_id2),
self.check("attestation.type", attestation_type),
],
).get_output_in_json()["etag"]
online_device_key = self.cmd(
self.set_cmd_auth_type(
'az iot dps compute-device-key -g {} --dps-name {} --enrollment-id {} '
"--registration-id myarbitrarydeviceId".format(
self.entity_rg, self.entity_dps_name, enrollment_id2
),
auth_type=auth_phase
),
).output
assert offline_device_key == online_device_key
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id2
),
auth_type=auth_phase
),
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
)
def test_dps_enrollment_twin_array(self):
attestation_type = AttestationType.x509.value
for auth_phase in DATAPLANE_AUTH_TYPES:
# test twin array in enrollment
device_id = self.generate_device_names()[0]
enrollment_id = self.generate_enrollment_names()[0]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment create --enrollment-id {} --attestation-type {}"
" -g {} --dps-name {} --cp {} --scp {}"
" --provisioning-status {} --device-id {}"
" --initial-twin-tags {} --initial-twin-properties {} --device-information {}"
" --allocation-policy {} --iot-hubs {}".format(
enrollment_id,
attestation_type,
self.entity_rg,
self.entity_dps_name,
CERT_PATH,
CERT_PATH,
EntityStatusType.enabled.value,
device_id,
'"{generic_dict}"',
'"{twin_array_dict}"',
'"{generic_dict}"',
AllocationType.hashed.value,
self.hub_host_name,
),
auth_type=auth_phase
),
checks=[
self.check("attestation.type", attestation_type),
self.check("registrationId", enrollment_id),
self.check("provisioningStatus", EntityStatusType.enabled.value),
self.check("deviceId", device_id),
self.check("allocationPolicy", AllocationType.hashed.value),
self.check("iotHubs", self.hub_host_name.split()),
self.check("initialTwin.tags", self.kwargs["generic_dict"]),
self.check("optionalDeviceInformation", self.kwargs["generic_dict"]),
self.check(
"initialTwin.properties.desired", self.kwargs["twin_array_dict"]
),
self.exists("reprovisionPolicy"),
self.check("reprovisionPolicy.migrateDeviceData", True),
self.check("reprovisionPolicy.updateHubAssignment", True),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_id
),
auth_type=auth_phase
),
)
# test twin array in enrollment group
enrollment_group_id = self.generate_enrollment_names(group=True)[0]
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group create --enrollment-id {} -g {} --dps-name {}"
" --cp {} --scp {} --provisioning-status {} --allocation-policy {}"
" --iot-hubs {} --edge-enabled --props {}".format(
enrollment_group_id,
self.entity_rg,
self.entity_dps_name,
CERT_PATH,
CERT_PATH,
EntityStatusType.enabled.value,
AllocationType.geolatency.value,
self.hub_host_name,
'"{twin_array_dict}"',
),
auth_type=auth_phase
),
checks=[
self.check("enrollmentGroupId", enrollment_group_id),
self.check("provisioningStatus", EntityStatusType.enabled.value),
self.exists("reprovisionPolicy"),
self.check("allocationPolicy", AllocationType.geolatency.value),
self.check("iotHubs", self.hub_host_name.split()),
self.check(
"initialTwin.properties.desired", self.kwargs["twin_array_dict"]
),
self.check("reprovisionPolicy.migrateDeviceData", True),
self.check("reprovisionPolicy.updateHubAssignment", True),
self.check("capabilities.iotEdge", True),
],
)
self.cmd(
self.set_cmd_auth_type(
"iot dps enrollment-group delete -g {} --dps-name {} --enrollment-id {}".format(
self.entity_rg, self.entity_dps_name, enrollment_group_id
),
auth_type=auth_phase
),
)
# tests/integration/mongodb/factory/prof/profmgrad.py (RaenonX/Jelly-Bot, MIT)
from abc import abstractmethod, ABC
from bson import ObjectId
from flags import ProfilePermission
from models import ChannelProfileModel, ChannelProfileConnectionModel
from mongodb.factory import ProfileManager
from mongodb.factory.prof_base import ProfileDataManager, UserProfileManager
from mongodb.factory.results import OperationOutcome
from tests.base import TestDatabaseMixin, TestModelMixin
__all__ = ["TestProfileManagerAttachName", "TestProfileManagerAttachOid",
"TestProfileManagerDetachName", "TestProfileManagerDetachOid"]
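The nested `TestClass` below uses the standard trick for sharing test bodies: wrapping the abstract `TestCase` inside a plain outer class hides it from unittest's collector, so the shared cases only run through concrete subclasses. A self-contained sketch of the pattern (all names here are illustrative, not part of this project):

```python
import unittest
from abc import ABC, abstractmethod


class SharedCases:
    # unittest only collects top-level TestCase classes, so nesting the
    # abstract base keeps its tests from being run (and failing) on their own.
    class Base(ABC, unittest.TestCase):
        @abstractmethod
        def transform(self, value: int) -> int:
            raise NotImplementedError()

        def test_positive_stays_positive(self):
            self.assertGreater(self.transform(1), 0)


class TestDouble(SharedCases.Base):
    def transform(self, value: int) -> int:
        return value * 2
```

Concrete variants such as `TestProfileManagerAttachName` presumably implement the abstract hooks (`perm_ctrl_self`, `perm_ctrl_other`, `not_found_result`) in the same way `TestDouble` implements `transform` here.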
class TestProfileManagerAttach(ABC):
class TestClass(TestModelMixin, TestDatabaseMixin):
CHANNEL_OID = ObjectId()
CHANNEL_OID_2 = ObjectId()
USER_OID = ObjectId()
USER_OID_2 = ObjectId()
@staticmethod
def obj_to_clear():
return [ProfileManager]
@abstractmethod
def perm_ctrl_self(self, user_oid: ObjectId, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str) \
-> OperationOutcome:
raise NotImplementedError()
@abstractmethod
def perm_ctrl_other(self, user_oid: ObjectId, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str,
target_oid: ObjectId) \
-> OperationOutcome:
raise NotImplementedError()
@abstractmethod
def not_found_result(self) -> OperationOutcome:
raise NotImplementedError()
def test_suf_perm_ctrl_self(self):
mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
ProfilePermission.PRF_CONTROL_SELF.code_str: True,
ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
ProfileDataManager.insert_one_model(mdl)
ProfileDataManager.insert_one_model(mdl2)
UserProfileManager.insert_one_model(
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
ProfileOids=[mdl.id]))
result = self.perm_ctrl_self(self.USER_OID, self.CHANNEL_OID, mdl2.id, "DEF")
self.assertEqual(result, OperationOutcome.O_COMPLETED)
self.assertModelEqual(
UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
ProfileOids=[mdl.id, mdl2.id])
)
def test_suf_perm_ctrl_other(self):
mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
ProfilePermission.PRF_CONTROL_SELF.code_str: True,
ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
ProfileDataManager.insert_one_model(mdl)
ProfileDataManager.insert_one_model(mdl2)
UserProfileManager.insert_one_model(
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
ProfileOids=[mdl.id, mdl2.id]))
UserProfileManager.insert_one_model(
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
ProfileOids=[mdl2.id]))
result = self.perm_ctrl_other(self.USER_OID, self.CHANNEL_OID, mdl.id, "ABC", self.USER_OID_2)
self.assertEqual(result, OperationOutcome.O_COMPLETED)
self.assertModelEqual(
UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
ProfileOids=[mdl.id, mdl2.id])
)
self.assertModelEqual(
UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
ProfileOids=[mdl2.id, mdl.id])
)
def test_insuf_perm_ctrl_self(self):
mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True})
mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
ProfileDataManager.insert_one_model(mdl)
ProfileDataManager.insert_one_model(mdl2)
UserProfileManager.insert_one_model(
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
ProfileOids=[mdl2.id]))
result = self.perm_ctrl_self(self.USER_OID, self.CHANNEL_OID, mdl.id, "ABC")
self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
self.assertModelEqual(
UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
ProfileOids=[mdl2.id])
)
def test_insuf_perm_ctrl_other(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_other(self.USER_OID, self.CHANNEL_OID, mdl.id, "ABC", self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id])
            )

        def test_not_found(self):
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[ObjectId()]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[ObjectId()]))

            result = self.perm_ctrl_self(self.USER_OID, self.CHANNEL_OID, ObjectId(), "ABC")
            self.assertEqual(result, self.not_found_result())

            result = self.perm_ctrl_other(self.USER_OID, self.CHANNEL_OID, ObjectId(), "ABC", self.USER_OID_2)
            self.assertEqual(result, self.not_found_result())

        def test_user_not_in_channel(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            ProfileDataManager.insert_one_model(mdl)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl.id]))

            result = self.perm_ctrl_other(self.USER_OID, self.CHANNEL_OID, mdl.id, "ABC", self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_EXECUTOR_NOT_IN_CHANNEL)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl.id])
            )

        def test_target_not_in_channel(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            ProfileDataManager.insert_one_model(mdl)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[]))

            result = self.perm_ctrl_other(self.USER_OID, self.CHANNEL_OID, mdl.id, "ABC", self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_TARGET_NOT_IN_CHANNEL)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[])
            )

        def test_insuf_perm_self(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_self(self.USER_OID, self.CHANNEL_OID, mdl.id, "ABC")
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )

        def test_insuf_perm_self_other(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_other(self.USER_OID, self.CHANNEL_OID, mdl.id, "ABC", self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id])
            )
class TestProfileManagerAttachName(TestProfileManagerAttach.TestClass):
    def not_found_result(self) -> OperationOutcome:
        return OperationOutcome.X_PROFILE_NOT_FOUND_NAME

    def perm_ctrl_self(self, user_oid: ObjectId, channel_oid: ObjectId, prof_oid: ObjectId,
                       prof_name: str) -> OperationOutcome:
        return ProfileManager.attach_profile_name(channel_oid, user_oid, prof_name)

    def perm_ctrl_other(self, user_oid: ObjectId, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str,
                        target_oid: ObjectId) -> OperationOutcome:
        return ProfileManager.attach_profile_name(channel_oid, user_oid, prof_name, target_oid)


class TestProfileManagerAttachOid(TestProfileManagerAttach.TestClass):
    def not_found_result(self) -> OperationOutcome:
        return OperationOutcome.X_PROFILE_NOT_FOUND_OID

    def perm_ctrl_self(self, user_oid: ObjectId, channel_oid: ObjectId, prof_oid: ObjectId,
                       prof_name: str) -> OperationOutcome:
        return ProfileManager.attach_profile(channel_oid, user_oid, prof_oid)

    def perm_ctrl_other(self, user_oid: ObjectId, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str,
                        target_oid: ObjectId) -> OperationOutcome:
        return ProfileManager.attach_profile(channel_oid, user_oid, prof_oid, target_oid)
class TestProfileManagerDetach(ABC):
    class TestClass(TestModelMixin, TestDatabaseMixin):
        CHANNEL_OID = ObjectId()
        CHANNEL_OID_2 = ObjectId()
        USER_OID = ObjectId()
        USER_OID_2 = ObjectId()

        @staticmethod
        def obj_to_clear():
            return [ProfileManager]

        @abstractmethod
        def perm_ctrl_self(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str, user_oid: ObjectId) \
                -> OperationOutcome:
            raise NotImplementedError()

        @abstractmethod
        def perm_ctrl_other(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str, user_oid: ObjectId,
                            target_oid: ObjectId) \
                -> OperationOutcome:
            raise NotImplementedError()

        @abstractmethod
        def perm_ctrl_all(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str, user_oid: ObjectId) \
                -> OperationOutcome:
            raise NotImplementedError()

        @abstractmethod
        def not_found_result(self) -> OperationOutcome:
            raise NotImplementedError()

        def test_suf_perm_ctrl_self(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id, mdl2.id]))

            result = self.perm_ctrl_self(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID)
            self.assertEqual(result, OperationOutcome.O_COMPLETED)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id])
            )

        def test_detach_name_suf_perm_ctrl_other(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id, mdl2.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl.id, mdl2.id]))

            result = self.perm_ctrl_other(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID, self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.O_COMPLETED)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id, mdl2.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl.id])
            )

        def test_detach_name_insuf_perm_ctrl_self(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_self(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID)
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )

        def test_detach_name_insuf_perm_ctrl_other(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_other(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID, self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id])
            )

        def test_not_found(self):
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[ObjectId()]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[ObjectId()]))

            result = self.perm_ctrl_self(self.CHANNEL_OID, ObjectId(), "ABC", self.USER_OID)
            self.assertEqual(result, self.not_found_result())

            result = self.perm_ctrl_other(self.CHANNEL_OID, ObjectId(), "ABC", self.USER_OID, self.USER_OID_2)
            self.assertEqual(result, self.not_found_result())

        def test_detach_name_user_not_in_channel(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            ProfileDataManager.insert_one_model(mdl)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl.id]))

            result = self.perm_ctrl_other(self.CHANNEL_OID, mdl.id, "ABC", self.USER_OID, self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_EXECUTOR_NOT_IN_CHANNEL)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl.id])
            )

        def test_detach_name_target_not_in_channel(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            ProfileDataManager.insert_one_model(mdl)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2, ProfileOids=[]))

            result = self.perm_ctrl_other(self.CHANNEL_OID, mdl.id, "ABC", self.USER_OID, self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_TARGET_NOT_IN_CHANNEL)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[])
            )

        def test_detach_name_insuf_perm_self(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_self(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID)
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )

        def test_detach_name_insuf_perm_self_other(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_other(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID, self.USER_OID_2)
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id])
            )

        def test_detach_all(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id, mdl2.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_all(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID)
            self.assertEqual(result, OperationOutcome.O_COMPLETED)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[])
            )

        def test_detach_all_insuf_perm(self):
            mdl = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="ABC",
                                      Permission={ProfilePermission.AR_ACCESS_PINNED_MODULE.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_SELF.code_str: True,
                                                  ProfilePermission.PRF_CONTROL_MEMBER.code_str: True})
            mdl2 = ChannelProfileModel(ChannelOid=self.CHANNEL_OID, Name="DEF")
            ProfileDataManager.insert_one_model(mdl)
            ProfileDataManager.insert_one_model(mdl2)
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id]))
            UserProfileManager.insert_one_model(
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id]))

            result = self.perm_ctrl_all(self.CHANNEL_OID, mdl2.id, "DEF", self.USER_OID)
            self.assertEqual(result, OperationOutcome.X_INSUFFICIENT_PERMISSION)
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID,
                                              ProfileOids=[mdl2.id])
            )
            self.assertModelEqual(
                UserProfileManager.find_one_casted({ChannelProfileConnectionModel.UserOid.key: self.USER_OID_2}),
                ChannelProfileConnectionModel(ChannelOid=self.CHANNEL_OID, UserOid=self.USER_OID_2,
                                              ProfileOids=[mdl2.id])
            )
class TestProfileManagerDetachName(TestProfileManagerDetach.TestClass):
    def not_found_result(self) -> OperationOutcome:
        return OperationOutcome.X_PROFILE_NOT_FOUND_NAME

    def perm_ctrl_self(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str,
                       user_oid: ObjectId) -> OperationOutcome:
        return ProfileManager.detach_profile_name(channel_oid, prof_name, user_oid)

    def perm_ctrl_other(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str, user_oid: ObjectId,
                        target_oid: ObjectId) -> OperationOutcome:
        return ProfileManager.detach_profile_name(channel_oid, prof_name, user_oid, target_oid)

    def perm_ctrl_all(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str, user_oid: ObjectId) \
            -> OperationOutcome:
        return ProfileManager.detach_profile_name(channel_oid, prof_name, user_oid)


class TestProfileManagerDetachOid(TestProfileManagerDetach.TestClass):
    def not_found_result(self) -> OperationOutcome:
        return OperationOutcome.X_PROFILE_NOT_FOUND_OID

    def perm_ctrl_self(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str,
                       user_oid: ObjectId) -> OperationOutcome:
        return ProfileManager.detach_profile(channel_oid, prof_oid, user_oid, user_oid)

    def perm_ctrl_other(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str, user_oid: ObjectId,
                        target_oid: ObjectId) -> OperationOutcome:
        return ProfileManager.detach_profile(channel_oid, prof_oid, user_oid, target_oid)

    def perm_ctrl_all(self, channel_oid: ObjectId, prof_oid: ObjectId, prof_name: str, user_oid: ObjectId) \
            -> OperationOutcome:
        return ProfileManager.detach_profile(channel_oid, prof_oid, user_oid)
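The test layout above relies on a common unittest idiom: the shared abstract cases live in a `TestClass` nested inside a plain holder class, so test discovery never collects the abstract base itself, while each thin concrete subclass supplies the operation under test. A minimal, self-contained sketch of the pattern (all names here are illustrative, not from this codebase):

```python
import unittest
from abc import ABC, abstractmethod


class SharedCases(ABC):
    # Nesting the TestCase inside a plain class hides it from unittest
    # discovery; only concrete subclasses like TestAdd below get collected.
    class TestClass(unittest.TestCase, ABC):
        @abstractmethod
        def operation(self, a, b):
            raise NotImplementedError()

        def test_commutative(self):
            # Shared test body, exercised once per concrete subclass.
            self.assertEqual(self.operation(5, 0), self.operation(0, 5))


class TestAdd(SharedCases.TestClass):
    def operation(self, a, b):
        return a + b
```

Running discovery against this module would execute `test_commutative` only through `TestAdd`, never through the abstract `TestClass`.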
# --- pose/canvas_data.py (repo: Jacob-Lapkin/UCL_Thesis, MIT license) ---
from numpy.lib.shape_base import split
from pose.pro_angle_data import smoothed_df, phase_divider, grab_label, split_data, split_label, display_df
from pose.user_angle_data import smoothed_user_df, grab_user_label, display_user_df, phase_user_divider
from pose.multi_class_azure import StrokeList
from pose.recommendations_data import *
import matplotlib.pyplot as plt
import random
#############################################################################
########### CREATING CLASSES TO USE FOR PRO AND USER ON DASHBOARD ###########
#############################################################################
######### class to get data from professional ############
class Player_data:
    def __init__(self, path, angle, name):
        self.path = path
        self.angle = angle
        self.name = name

    # getting all the data at once
    def get_data(self):
        data = smoothed_df(self.path, self.angle)
        return data

    # getting only a phase of the data
    def get_split_data(self, phase):
        data = split_data(self.path, self.angle, phase)
        return data

    # getting a breakdown of the phases
    def doughnut(self):
        doughnut_data = phase_divider(self.path)
        return doughnut_data

    # getting all the labels or frames
    def labels(self):
        labels = grab_label(self.path, self.angle)
        return labels

    # getting all the labels or frames by phase
    def splitting_label(self, phase):
        getting_split = split_label(self.path, self.angle, phase)
        labels = []
        for i in getting_split:
            labels.append(str(i))
        return labels

    def get_min_data(self):
        min_data = grab_min(self.path, self.angle)
        return min_data

    def get_max_data(self):
        max_data = grab_max(self.path, self.angle)
        return max_data

    def get_range_data(self):
        range_data = grab_range(self.path, self.angle)
        return range_data

    def get_average_data(self):
        average_data = grab_average(self.path, self.angle)
        return average_data


########### class to get data from user ###########
class User_data:
    def __init__(self, path, name, base, folder):
        self.path = path
        self.name = name
        self.df = display_user_df(path)
        self.phase_df = StrokeList(name, base, folder)

    def get_data(self, angle):
        # data = smoothed_user_df(self.path)
        data = smoothed_user_df(self.df)
        return data[angle]

    def labels(self):
        data = self.df
        labels = grab_user_label(data)
        return labels

    def doughnut(self):
        data = self.phase_df
        doughnut_data = phase_user_divider(data)
        return doughnut_data

    def get_full_data(self, angle):
        df = phase_w_data(self.df, angle, self.phase_df)
        return df

    def get_min_data(self, angle):
        min_data = grab_user_min(self.df, angle, self.phase_df)
        return min_data

    def get_max_data(self, angle):
        max_data = grab_user_max(self.df, angle, self.phase_df)
        return max_data

    def get_range_data(self, angle):
        range_data = grab_user_range(self.df, angle, self.phase_df)
        return range_data

    def get_average_data(self, angle):
        average_data = grab_user_average(self.df, angle, self.phase_df)
        return average_data
# playerright_leg = Player_data(f'pose/data/serve_data/djokservelegs.csv', 'hip2ankle_right', 'djok')
# playerleft_leg = Player_data(f'pose/data/serve_data/djokservelegs.csv', 'hip2ankle_left', 'djok')
# playerright_arm = Player_data(f'pose/data/serve_data/djokservearm.csv', 'shoulder2wrist_right', 'djok')
# playerleft_arm = Player_data(f'pose/data/serve_data/djokservearm.csv', 'shoulder2wrist_left', 'djok')
# user = User_data('pose/videos/serve/jake.mov', 'Jacob', str(1), str(1))
# print(user.get_full_data('hip2ankle_right'))
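Each recommendation function defined below builds two equivalent phrasings of a tip and returns one of them at random via `random.choice`, so repeated feedback does not read identically. That selection step can be sketched on its own (the helper name `pick_phrasing` is illustrative, not part of this codebase):

```python
import random


def pick_phrasing(first, second, seed=None):
    """Choose one of two equivalent tip phrasings at random, mirroring the
    random.choice step at the end of each legs_tips_* function below.
    An optional seed keeps the choice reproducible for testing."""
    rng = random.Random(seed)
    return rng.choice([first, second])
```

Injecting the seed (rather than calling `random.choice` directly) is purely a testability convenience; the functions in this module use the module-level `random.choice`.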
####### IF STATEMENTS FOR RECOMMENDATIONS #######
####################### for right handed players ##############################
# Legs
def legs_tips_start(user, playerright_leg, playerleft_leg):
    leg_tip_start_first = 'None'
    leg_tip_start_second = 'None'
    score_lost = 0
    # less than the pro angles
    if user.get_max_data('hip2ankle_right')[0] < (playerright_leg.get_max_data()[0] * 0.80) and user.get_max_data('hip2ankle_left')[0] < (playerleft_leg.get_max_data()[0] * 0.80):
        leg_tip_start_first = 'Try to stand up significantly more during your ready position.'
        leg_tip_start_second = 'Stand remarkably higher in your ready position.'
        score_lost += 7
    elif user.get_max_data('hip2ankle_right')[0] < (playerright_leg.get_max_data()[0] * 0.90) and user.get_max_data('hip2ankle_left')[0] < (playerleft_leg.get_max_data()[0] * 0.90):
        leg_tip_start_first = 'Try to stand up more during your ready position.'
        leg_tip_start_second = 'Stand higher in your ready position.'
        score_lost += 4
    elif user.get_max_data('hip2ankle_right')[0] < (playerright_leg.get_max_data()[0] * 0.95) and user.get_max_data('hip2ankle_left')[0] < (playerleft_leg.get_max_data()[0] * 0.95):
        leg_tip_start_first = 'Try to stand up slightly more during your ready position.'
        leg_tip_start_second = 'Stand slightly higher in your ready position.'
        score_lost += 2
    # greater than the pro angles
    elif user.get_max_data('hip2ankle_right')[0] > (playerright_leg.get_max_data()[0] * 1.20) and user.get_max_data('hip2ankle_left')[0] > (playerleft_leg.get_max_data()[0] * 1.20):
        leg_tip_start_first = 'Try to bend your legs significantly more during your ready position.'
        leg_tip_start_second = 'Your legs are much too straight in the ready position. Bend them remarkably more.'
        score_lost += 7
    elif user.get_max_data('hip2ankle_right')[0] > (playerright_leg.get_max_data()[0] * 1.10) and user.get_max_data('hip2ankle_left')[0] > (playerleft_leg.get_max_data()[0] * 1.10):
        leg_tip_start_first = 'Try to bend your legs more during your ready position.'
        leg_tip_start_second = 'Your legs are too straight in the ready position. Bend them more.'
        score_lost += 4
    elif user.get_max_data('hip2ankle_right')[0] > (playerright_leg.get_max_data()[0] * 1.05) and user.get_max_data('hip2ankle_left')[0] > (playerleft_leg.get_max_data()[0] * 1.05):
        leg_tip_start_first = 'Try to bend your legs slightly more during your ready position.'
        leg_tip_start_second = 'Your legs are slightly too straight in the ready position. Bend them a bit more.'
        score_lost += 2
    # greater than one angle and less than the other
    elif user.get_max_data('hip2ankle_right')[0] > (playerright_leg.get_max_data()[0] * 1.20) and user.get_max_data('hip2ankle_left')[0] < (playerleft_leg.get_max_data()[0] * 1.20):
        leg_tip_start_first = 'Your legs seem to be starting at significantly contrasting angles. Try matching your leg positions more closely during your ready position to increase balance.'
        leg_tip_start_second = 'Your legs are strongly out of sync. Try mirroring your leg positions significantly more during the ready position.'
        score_lost += 7
    elif user.get_max_data('hip2ankle_right')[0] < (playerright_leg.get_max_data()[0] * 1.10) and user.get_max_data('hip2ankle_left')[0] > (playerleft_leg.get_max_data()[0] * 1.10):
        leg_tip_start_first = 'Your legs seem to be starting at contrasting angles. Try matching your leg positions more closely during your ready position to increase balance.'
        leg_tip_start_second = 'Your legs are out of sync. Try mirroring your leg positions more during the ready position.'
        score_lost += 4
    elif user.get_max_data('hip2ankle_right')[0] < (playerright_leg.get_max_data()[0] * 1.05) and user.get_max_data('hip2ankle_left')[0] > (playerleft_leg.get_max_data()[0] * 1.05):
        leg_tip_start_first = 'Your legs seem to be starting at slightly contrasting angles. Try matching your leg positions more closely during your ready position to increase balance.'
        leg_tip_start_second = 'Your legs are slightly out of sync. Try mirroring your leg positions slightly more during the ready position.'
        score_lost += 2
    else:
        leg_tip_start_first = 'Your legs are placed well, and your lower body posture looks good!'
        leg_tip_start_second = 'You do not need to adjust your legs in your ready position. Your lower body posture looks good!'
    pick_from_this = [leg_tip_start_first, leg_tip_start_second]
    leg_tip_start = random.choice(pick_from_this)
    return [leg_tip_start, score_lost]
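Every branch above follows the same ladder: compare the user's angle to 80%, 90%, and 95% (or 120%, 110%, and 105%) of the professional's, and deduct 7, 4, or 2 points accordingly. The deduction side of that pattern can be isolated into a small self-contained helper (the name `deduction_for` is illustrative, not part of this codebase):

```python
def deduction_for(user_angle, pro_angle):
    """Return the score deduction implied by how far the user's angle
    falls below the professional's, mirroring the 80/90/95% bands used
    by legs_tips_start above."""
    if user_angle < pro_angle * 0.80:
        return 7  # major deviation
    if user_angle < pro_angle * 0.90:
        return 4  # moderate deviation
    if user_angle < pro_angle * 0.95:
        return 2  # minor deviation
    return 0      # within tolerance


# Example: a user max of 150 degrees against a pro max of 170 falls
# below 170 * 0.90 (= 153) but not below 170 * 0.80 (= 136), so the
# deduction is 4.
```

Centralising the ladder like this would also make the thresholds easy to tune in one place, instead of repeating them across every `elif`.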
def legs_tips_load(user, playerright_leg, playerleft_leg):
score_lost = 0
leg_tip_load_first = 'None'
leg_tip_load_second = 'None'
# bending legs too much during load
if user.get_min_data('hip2ankle_right')[1] < (playerright_leg.get_min_data()[1] * 0.80) and user.get_min_data('hip2ankle_left')[1] < (playerleft_leg.get_min_data()[1] * 0.80):
leg_tip_load_first = 'You are loading your back leg too extremely. try standing much taller during your take back.'
leg_tip_load_second = 'You do not need to bend your back leg so extremely on the load. Try standing up much more on your takeback to keep yourself from being too low.'
score_lost += 7
elif user.get_min_data('hip2ankle_right')[1] < (playerright_leg.get_min_data()[1] * 0.90) and user.get_min_data('hip2ankle_left')[1] < (playerleft_leg.get_min_data()[1] * 0.90):
leg_tip_load_first = 'You are loading your back legs too much, try standing taller during your take back.'
leg_tip_load_second = 'You do not need to bend your back leg that much on the load. Try standing up more on your takeback to keep yourself from being too low.'
score_lost += 4
elif user.get_min_data('hip2ankle_right')[1] < (playerright_leg.get_min_data()[1] * 0.95) and user.get_min_data('hip2ankle_left')[1] < (playerleft_leg.get_min_data()[1] * 0.95):
leg_tip_load_first = 'You are loading your back legs a bit too much, try standing slightly taller during your take back.'
leg_tip_load_second = 'Your are bending your back leg slightly too much on the load. Try standing up a bit more on your takeback to keep yourself from being too low.'
score_lost += 2
# not bedning legs enough during laod
elif user.get_min_data('hip2ankle_right')[1] > (playerright_leg.get_min_data()[1] * 1.20) and user.get_min_data('hip2ankle_left')[1] > (playerleft_leg.get_min_data()[1] * 1.20):
leg_tip_load_first = 'You can significatly increase power in your serve by bending your legs a convincing amount.'
leg_tip_load_second = 'Bend your legs much more to significantly decrease the angle of them in your load and subsequently significantly increase power'
score_lost += 7
elif user.get_min_data('hip2ankle_right')[1] > (playerright_leg.get_min_data()[1] * 1.10) and user.get_min_data('hip2ankle_left')[1] > (playerleft_leg.get_min_data()[1] * 1.10):
leg_tip_load_first = 'You can increase power in your serve by bending your legs more.'
leg_tip_load_second = 'Bend your legs more to decrease the angle of them in your load and subsequently increase power'
score_lost += 4
elif user.get_min_data('hip2ankle_right')[1] > (playerright_leg.get_min_data()[1] * 1.05) and user.get_min_data('hip2ankle_left')[1] > (playerleft_leg.get_min_data()[1] * 1.05):
leg_tip_load_first = 'You can slightly increase power in your serve by bending your legs a small amount.'
leg_tip_load_second = 'Bend your legs slightly more to decrease the angle of them in your load and subsequently slightly increase power'
score_lost += 2
# leaning forward during takeback
elif user.get_min_data('hip2ankle_right')[1] > (playerright_leg.get_min_data()[1] * 1.20) and user.get_min_data('hip2ankle_left')[1] < (playerleft_leg.get_min_data()[1] * 0.80):
leg_tip_load_first = 'Your lower body is leaning significantly too far forward during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning remarkably too far forward during your takeback. Try to mirror your legs much more.'
score_lost += 7
elif user.get_min_data('hip2ankle_right')[1] > (playerright_leg.get_min_data()[1] * 1.10) and user.get_min_data('hip2ankle_left')[1] < (playerleft_leg.get_min_data()[1] * 0.90):
leg_tip_load_first = 'Your lower body may be leaning too far forward during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning too far forward during your takeback. Try to mirror your legs more.'
score_lost += 4
elif user.get_min_data('hip2ankle_right')[1] > (playerright_leg.get_min_data()[1] * 1.05) and user.get_min_data('hip2ankle_left')[1] < (playerleft_leg.get_min_data()[1] * 0.95):
leg_tip_load_first = 'Your lower body may be leaning slightly forward during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning a bit too far forward during your takeback. Try to mirror your legs a small fraction more.'
score_lost += 2
# leaning backward during takeback
elif user.get_min_data('hip2ankle_right')[1] < (playerright_leg.get_min_data()[1] * 0.80) and user.get_min_data('hip2ankle_left')[1] > (playerleft_leg.get_min_data()[1] * 1.20):
leg_tip_load_first = 'Your lower body is leaning significantly too far back during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning remarkably too far back during your takeback. Try to mirror your legs much more.'
score_lost += 7
elif user.get_min_data('hip2ankle_right')[1] < (playerright_leg.get_min_data()[1] * 0.90) and user.get_min_data('hip2ankle_left')[1] > (playerleft_leg.get_min_data()[1] * 1.10):
leg_tip_load_first = 'Your lower body is leaning too far back during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning too far back during your takeback. Try to mirror your legs more.'
score_lost += 4
elif user.get_min_data('hip2ankle_right')[1] < (playerright_leg.get_min_data()[1] * 0.95) and user.get_min_data('hip2ankle_left')[1] > (playerleft_leg.get_min_data()[1] * 1.05):
leg_tip_load_first = 'Your lower body is leaning slightly too far back during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning a bit too far back during your takeback. Try to mirror your legs a small fraction more.'
score_lost += 2
else:
leg_tip_load_first = 'Your legs are well balanced during your takeback, and they are loading very well!'
leg_tip_load_second = 'Your balance does not need any improvement at this point because your legs are loading very well!'
pick_from_this = [leg_tip_load_first, leg_tip_load_second]
leg_tip_load = random.choice(pick_from_this)
return [leg_tip_load, score_lost]
def legs_tips_extend(user, playerright_leg, playerleft_leg):
score_lost = 0
leg_tip_extend_first = 'None'
leg_tip_extend_second = 'None'
# not extending enough during contact
if user.get_max_data('hip2ankle_right')[2] < (playerright_leg.get_max_data()[2] * 0.80) and user.get_max_data('hip2ankle_left')[2] < (playerleft_leg.get_max_data()[2] * 0.80):
leg_tip_extend_first = 'Your legs are significantly under extending during contact. Make sure to immensely increase drive with your legs on your extension.'
leg_tip_extend_second = 'You are remarkably too low on your extension during contact. Drive much more with your legs to be straighter on contact.'
score_lost += 7
elif user.get_max_data('hip2ankle_right')[2] < (playerright_leg.get_max_data()[2] * 0.90) and user.get_max_data('hip2ankle_left')[2] < (playerleft_leg.get_max_data()[2] * 0.90):
leg_tip_extend_first = 'Your legs are under extending during contact. Make sure to drive more with your legs on your extension.'
leg_tip_extend_second = 'You are too low on your extension during contact. Drive more with your legs to be straighter on contact.'
score_lost += 4
elif user.get_max_data('hip2ankle_right')[2] < (playerright_leg.get_max_data()[2] * 0.95) and user.get_max_data('hip2ankle_left')[2] < (playerleft_leg.get_max_data()[2] * 0.95):
leg_tip_extend_first = 'Your legs are slightly under extending during contact. Make sure to drive slightly more with your legs on your extension.'
leg_tip_extend_second = 'You are slightly too low on your extension during contact. Drive a bit more with your legs to be straighter on contact.'
score_lost += 2
# mirroring extension with the legs
elif user.get_max_data('hip2ankle_right')[2] > (playerright_leg.get_max_data()[2] * 1.10) and user.get_max_data('hip2ankle_left')[2] < (playerleft_leg.get_max_data()[2] * 0.90):
leg_tip_extend_first = 'Your back leg should extend more to be closer in line with your front leg during extension and contact.'
leg_tip_extend_second = 'Your legs should be closely in line during extension and contact. Try to extend your back leg more to accomplish this.'
score_lost += 4
elif user.get_max_data('hip2ankle_right')[2] < (playerright_leg.get_max_data()[2] * 0.90) and user.get_max_data('hip2ankle_left')[2] > (playerleft_leg.get_max_data()[2] * 1.10):
leg_tip_extend_first = 'Your front leg should extend more to be closer in line with your back leg during extension and contact.'
leg_tip_extend_second = 'Your legs should be closely in line during extension and contact. Try to extend your front leg more to accomplish this.'
score_lost += 4
else:
leg_tip_extend_first = 'Good job! Your legs are optimally extending on contact and consistently mirroring each other.'
leg_tip_extend_second = 'Your legs are optimally extending on contact and consistently mirroring each other! There is no need to adjust your legs during your extension.'
pick_from_this = [leg_tip_extend_first, leg_tip_extend_second]
leg_tip_extend = random.choice(pick_from_this)
return [leg_tip_extend, score_lost]
def legs_tips_finish(user, playerleft_leg):
score_lost = 0
leg_tip_finish_first = 'None'
leg_tip_finish_second = 'None'
if user.get_min_data('hip2ankle_left')[3] < (playerleft_leg.get_min_data()[3] * 0.80):
leg_tip_finish_first = 'You are dipping your front leg significantly too much during the finish. Try landing a lot taller in the legs to optimize your recovery.'
leg_tip_finish_second = 'Your front leg is absorbing remarkably too much on the finish. You need to finish much taller to have a better chance at recovery.'
score_lost += 7
elif user.get_min_data('hip2ankle_left')[3] < (playerleft_leg.get_min_data()[3] * 0.90):
leg_tip_finish_first = 'You are dipping your front leg too much during the finish. Try landing taller in the legs to optimize your recovery.'
leg_tip_finish_second = 'Your front leg is absorbing too strongly on the finish. You need to finish taller to have a better chance at recovery.'
score_lost += 4
elif user.get_min_data('hip2ankle_left')[3] < (playerleft_leg.get_min_data()[3] * 0.95):
leg_tip_finish_first = 'You are dipping your front leg slightly too much during the finish. Try landing a bit taller in the legs to optimize your recovery.'
leg_tip_finish_second = 'Your front leg is absorbing slightly too much on the finish. You need to finish a bit taller to have a better chance at recovery.'
score_lost += 2
elif user.get_min_data('hip2ankle_left')[3] > (playerleft_leg.get_min_data()[3] * 1.20):
leg_tip_finish_first = 'You are standing significantly too tall during your finish. Try getting a lot lower in the legs to absorb your impact with the ground.'
leg_tip_finish_second = 'You are remarkably too high on the finish. You need to get much lower and absorb the impact with your legs more. '
score_lost += 7
elif user.get_min_data('hip2ankle_left')[3] > (playerleft_leg.get_min_data()[3] * 1.10):
leg_tip_finish_first = 'You are standing too tall during your finish. Try getting lower in the legs to absorb your impact with the ground.'
leg_tip_finish_second = 'You are too high on the finish. You need to get lower and absorb the impact with your legs more. '
score_lost += 4
elif user.get_min_data('hip2ankle_left')[3] > (playerleft_leg.get_min_data()[3] * 1.05):
leg_tip_finish_first = 'You are standing slightly too tall during your finish. Try getting a bit lower in the legs to absorb your impact with the ground.'
leg_tip_finish_second = 'You are slightly too high on the finish. You need to get a bit lower and absorb the impact with your legs more. '
score_lost += 2
else:
leg_tip_finish_first = 'You are landing in a great position that is optimal for balance and recovery!'
leg_tip_finish_second = 'Your landing is neither too low nor too high on the finish! You are optimized for a strong recovery after the serve.'
pick_from_this = [leg_tip_finish_first, leg_tip_finish_second]
leg_tip_finish = random.choice(pick_from_this)
return [leg_tip_finish, score_lost]
def leg_score(user, playerright_leg, playerleft_leg):
leg_start = legs_tips_start(user, playerright_leg, playerleft_leg)
leg_load = legs_tips_load(user, playerright_leg, playerleft_leg)
leg_extend = legs_tips_extend(user, playerright_leg, playerleft_leg)
leg_finish = legs_tips_finish(user, playerleft_leg)
leg_tip_list = [leg_start, leg_load, leg_extend, leg_finish]
tips = []
for i in leg_tip_list:
tips.append(i[0])
return tips
def leg_score_quant(user, playerright_leg, playerleft_leg):
leg_start = legs_tips_start(user, playerright_leg, playerleft_leg)
leg_load = legs_tips_load(user, playerright_leg, playerleft_leg)
leg_extend = legs_tips_extend(user, playerright_leg, playerleft_leg)
leg_finish = legs_tips_finish(user, playerleft_leg)
leg_tip_list = [leg_start, leg_load, leg_extend, leg_finish]
score = []
for i in leg_tip_list:
score.append(i[-1])
return score
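# The tip functions above all repeat the same ratio ladder (0.80/0.90/0.95 on the
# low side, 1.05/1.10/1.20 on the high side, penalties 7/4/2). As a hypothetical
# refactor sketch only (a `severity` helper does not exist in this module), the
# penalty lookup could be centralized:

```python
def severity(user_val, pro_val):
    """Map the user-to-pro ratio onto the 7/4/2/0 penalty ladder used above."""
    ratio = user_val / pro_val
    if ratio < 0.80 or ratio > 1.20:
        return 7  # significant deviation
    if ratio < 0.90 or ratio > 1.10:
        return 4  # moderate deviation
    if ratio < 0.95 or ratio > 1.05:
        return 2  # slight deviation
    return 0      # within tolerance

# A user measurement 15% above the pro reference is a moderate deviation.
print(severity(115.0, 100.0))  # -> 4
```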
# Arms
def arms_tips_start(user, playerright_arm, playerleft_arm):
arm_tip_start_right_first = 'None'
arm_tip_start_right_second = 'None'
arm_tip_start_both_first = 'None'
arm_tip_start_both_second = 'None'
score = 0
# right arm extended out too much
if user.get_min_data('shoulder2wrist_right')[0] > (playerright_arm.get_min_data()[0] * 1.20):
arm_tip_start_right_first = 'Dominant arm is extended out significantly too much during the starting position. Tuck your arm in a large amount to minimize the time it takes to start the takeback.'
arm_tip_start_right_second = 'Dominant arm is extended out notably too much during the starting position. Tuck your arm in a huge amount to minimize the time it takes to start the takeback.'
score += 7
elif user.get_min_data('shoulder2wrist_right')[0] > (playerright_arm.get_min_data()[0] * 1.10):
arm_tip_start_right_first = 'Dominant arm is extended out too much during the starting position. Tuck your arm in more to minimize the time it takes to start the takeback.'
arm_tip_start_right_second = 'The dominant arm does not need to be extended out that much during the starting position. Tuck your arm in to minimize the time it takes to start the takeback.'
score += 4
elif user.get_min_data('shoulder2wrist_right')[0] > (playerright_arm.get_min_data()[0] * 1.05):
arm_tip_start_right_first = 'Dominant arm is extended out slightly too much during the starting position. Tuck your arm in a bit more to slightly minimize the time it takes to start the takeback.'
arm_tip_start_right_second = 'The dominant arm is slightly over-extended during the starting position. Tuck your arm in a bit to minimize the time it takes to start the takeback.'
score += 2
# right arm tucked in too much
elif user.get_min_data('shoulder2wrist_right')[0] < (playerright_arm.get_min_data()[0] * 0.80):
arm_tip_start_right_first = 'Dominant arm is tucked in significantly too much during the starting position. Extend your arm out a large amount to increase momentum going into the takeback.'
arm_tip_start_right_second = 'Your momentum going into the takeback could increase significantly if your dominant arm were extended out remarkably more during the starting position.'
score += 7
elif user.get_min_data('shoulder2wrist_right')[0] < (playerright_arm.get_min_data()[0] * 0.90):
arm_tip_start_right_first = 'Dominant arm is tucked in too much during the starting position. Extend your arm out more to increase momentum going into the takeback.'
arm_tip_start_right_second = 'Your momentum going into the takeback could increase if your dominant arm were extended out more during the starting position.'
score += 4
elif user.get_min_data('shoulder2wrist_right')[0] < (playerright_arm.get_min_data()[0] * 0.95):
arm_tip_start_right_first = 'Dominant arm is tucked in slightly too much during the starting position. Extend your arm out a bit more to increase momentum going into the takeback.'
arm_tip_start_right_second = 'Your momentum going into the takeback could slightly increase if your dominant arm were extended out a bit more during the starting position.'
score += 2
else:
arm_tip_start_right_first = 'Great work, your racquet seems to be starting in the right spot based on your right arm position!'
arm_tip_start_right_second = 'Nice, your right arm position shows your racquet starting in just the right spot!'
# arms not aligned
if user.get_min_data('shoulder2wrist_left')[0] > (playerleft_arm.get_min_data()[0] * 1.20) and user.get_min_data('shoulder2wrist_right')[0] < (playerright_arm.get_min_data()[0] * 0.80):
arm_tip_start_both_first = 'Arms are significantly misaligned during the starting position. Your arms should mirror each other more closely by bending your left and/or retracting your right.'
arm_tip_start_both_second = 'Arms are far out of alignment during the starting position. Your arms need to copy each other more closely by bending your left and/or retracting your right.'
score += 7
elif user.get_min_data('shoulder2wrist_left')[0] > (playerleft_arm.get_min_data()[0] * 1.10) and user.get_min_data('shoulder2wrist_right')[0] < (playerright_arm.get_min_data()[0] * 0.90):
arm_tip_start_both_first = 'Arms are not aligned closely enough during the starting position. Your arms should mirror each other more closely by bending your left and/or retracting your right.'
arm_tip_start_both_second = 'Arms are out of alignment during the starting position. Your arms need to copy each other more closely by bending your left and/or retracting your right.'
score += 4
elif user.get_min_data('shoulder2wrist_left')[0] > (playerleft_arm.get_min_data()[0] * 1.05) and user.get_min_data('shoulder2wrist_right')[0] < (playerright_arm.get_min_data()[0] * 0.95):
arm_tip_start_both_first = 'Arms are slightly misaligned during the starting position. Your arms should mirror each other more closely by bending your left and/or retracting your right.'
arm_tip_start_both_second = 'Arms are slightly out of alignment during the starting position. Your arms need to copy each other more closely by bending your left and/or retracting your right.'
score += 2
elif user.get_min_data('shoulder2wrist_left')[0] < (playerleft_arm.get_min_data()[0] * 0.80) and user.get_min_data('shoulder2wrist_right')[0] > (playerright_arm.get_min_data()[0] * 1.20):
arm_tip_start_both_first = 'Arms are significantly misaligned during the starting position. Your arms should mirror each other more closely by bending your right arm and/or retracting your left arm.'
arm_tip_start_both_second = 'Arms are far out of alignment during the starting position. Your arms need to copy each other more closely by bending your right and/or retracting your left.'
score += 7
elif user.get_min_data('shoulder2wrist_left')[0] < (playerleft_arm.get_min_data()[0] * 0.90) and user.get_min_data('shoulder2wrist_right')[0] > (playerright_arm.get_min_data()[0] * 1.10):
arm_tip_start_both_first = 'Arms are not aligned closely enough during the starting position. Your arms should mirror each other more closely by bending your right arm and/or retracting your left arm.'
arm_tip_start_both_second = 'Arms are out of alignment during the starting position. Your arms need to copy each other more closely by bending your right and/or retracting your left.'
score += 4
elif user.get_min_data('shoulder2wrist_left')[0] < (playerleft_arm.get_min_data()[0] * 0.95) and user.get_min_data('shoulder2wrist_right')[0] > (playerright_arm.get_min_data()[0] * 1.05):
arm_tip_start_both_first = 'Arms are slightly misaligned during the starting position. Your arms should mirror each other more closely by bending your right arm and/or retracting your left arm.'
arm_tip_start_both_second = 'Arms are slightly out of alignment during the starting position. Your arms need to copy each other more closely by bending your right and/or retracting your left.'
score += 2
else:
arm_tip_start_both_first = 'Overall, arms are aligned well during the starting position!'
arm_tip_start_both_second = 'Good job, arms are aligned well during the starting position!'
pick_from_this_right = [arm_tip_start_right_first, arm_tip_start_right_second]
pick_from_this_both = [arm_tip_start_both_first, arm_tip_start_both_second]
arm_tip_start_right = random.choice(pick_from_this_right)
arm_tip_start_both = random.choice(pick_from_this_both)
arm_tips_start = [arm_tip_start_right, arm_tip_start_both, score]
return arm_tips_start
def arms_tips_load(user, playerright_arm, playerleft_arm):
arm_tip_load_right_first = 'None'
arm_tip_load_right_second = 'None'
arm_tip_load_left_first = 'None'
arm_tip_load_left_second = 'None'
score = 0
# bending arm too much
if user.get_min_data('shoulder2wrist_right')[1] < (playerright_arm.get_min_data()[1] * 0.80):
arm_tip_load_right_first = 'You are bending your dominant arm significantly too much on the loadup. Extend your arm a lot more for more power.'
arm_tip_load_right_second= 'To increase power by a large amount, you should extend your dominant arm much more because you are bending it significantly too much on the loadup.'
score += 7
elif user.get_min_data('shoulder2wrist_right')[1] < (playerright_arm.get_min_data()[1] * 0.90):
arm_tip_load_right_first = 'You are bending your dominant arm too much on the loadup. Extend your arm to a comfortable position for more power.'
arm_tip_load_right_second= 'To increase power, you should extend your dominant arm more because you are bending it too much on the loadup.'
score += 4
elif user.get_min_data('shoulder2wrist_right')[1] < (playerright_arm.get_min_data()[1] * 0.95):
arm_tip_load_right_first = 'You are bending your dominant arm slightly too much on the loadup. Extend your arm a bit to a comfortable position for more power.'
arm_tip_load_right_second = 'To slightly increase power, you should extend your dominant arm a bit more because you are bending it fractionally too much on the loadup.'
score += 2
# not enough arm bend
elif user.get_min_data('shoulder2wrist_right')[1] > (playerright_arm.get_min_data()[1] * 1.20):
arm_tip_load_right_first = 'You are significantly under retracting your dominant arm in the loadup. Bend your arm a lot more to a comfortable position for more power.'
arm_tip_load_right_second= 'You are extended significantly too much on your dominant arm in the loadup. Bend your arm a lot more to a comfortable position for more power.'
score += 7
elif user.get_min_data('shoulder2wrist_right')[1] > (playerright_arm.get_min_data()[1] * 1.10):
arm_tip_load_right_first = 'You are not bending your dominant arm enough on the loadup. Bend your arm to a comfortable position for more power.'
arm_tip_load_right_second = 'You are extended too much on your dominant arm in the loadup. Bend your arm more to a comfortable position for more power.'
score += 4
elif user.get_min_data('shoulder2wrist_right')[1] > (playerright_arm.get_min_data()[1] * 1.05):
arm_tip_load_right_first = 'You are not bending your dominant arm quite enough on the loadup. Bend your arm a bit to a comfortable position for more power.'
arm_tip_load_right_second = 'You are extended slightly too much on your dominant arm in the loadup. Bend your arm a bit more to a comfortable position for more power.'
score += 2
else:
arm_tip_load_right_first = 'Great work, your dominant arm is optimized for power in the loadup!'
arm_tip_load_right_second= 'Good job, your dominant arm does not need any changes in the loadup at the moment because it is optimized for power!'
# arm extension during toss
if user.get_average_data('shoulder2wrist_left')[1] < (playerleft_arm.get_average_data()[1] * 0.80):
arm_tip_load_left_first = 'Non-dominant arm extension throughout the toss is significantly inconsistent. Try to keep your left arm a lot straighter on the takeback and load.'
arm_tip_load_left_second = 'The extension of your non-dominant arm throughout the toss is significantly inconsistent. Try to keep your left arm a lot straighter on the takeback and load.'
score += 7
elif user.get_average_data('shoulder2wrist_left')[1] < (playerleft_arm.get_average_data()[1] * 0.90):
arm_tip_load_left_first = 'Non-dominant arm extension throughout the toss is inconsistent. Try to keep your left arm straighter on the takeback and load.'
arm_tip_load_left_second = 'The extension of your non-dominant arm throughout the toss is inconsistent. Try to keep your left arm straighter on the takeback and load.'
score += 4
elif user.get_average_data('shoulder2wrist_left')[1] < (playerleft_arm.get_average_data()[1] * 0.95):
arm_tip_load_left_first = 'Non-dominant arm extension throughout the toss is slightly inconsistent. Try to keep your left arm a bit straighter on the takeback and load.'
arm_tip_load_left_second = 'The extension of your non-dominant arm throughout the toss is slightly inconsistent. Try to keep your left arm a bit straighter on the takeback and load.'
score += 2
else:
arm_tip_load_left_first = 'Your tossing arm looks very fluid and consistent!'
arm_tip_load_left_second = 'Great work on the tossing arm in the load! It looks very fluid and consistent.'
pick_from_this_right = [arm_tip_load_right_first, arm_tip_load_right_second]
pick_from_this_left = [arm_tip_load_left_first, arm_tip_load_left_second]
arm_tip_load_right = random.choice(pick_from_this_right)
arm_tip_load_left = random.choice(pick_from_this_left)
arm_tips_load = [arm_tip_load_right, arm_tip_load_left, score]
return arm_tips_load
def arms_tips_extend(user, playerright_arm):
arm_tip_extend_right_first = 'None'
arm_tip_extend_right_second = 'None'
score = 0
if user.get_max_data('shoulder2wrist_right')[2] < (playerright_arm.get_max_data()[2] * 0.80):
arm_tip_extend_right_first = 'Dominant arm is not extending nearly enough during contact. Make sure to either toss the ball a lot higher and/or make contact at its apex.'
arm_tip_extend_right_second = 'When you make contact at the apex of the ball flight, make sure that your arm is extended out significantly more.'
score += 7
elif user.get_max_data('shoulder2wrist_right')[2] < (playerright_arm.get_max_data()[2] * 0.90):
arm_tip_extend_right_first = 'Dominant arm is not extending enough during contact. Make sure to either toss the ball higher and/or make contact at its apex.'
arm_tip_extend_right_second = 'When you make contact at the apex of the ball flight, make sure that your arm is extended out more.'
score += 4
elif user.get_max_data('shoulder2wrist_right')[2] < (playerright_arm.get_max_data()[2] * 0.95):
arm_tip_extend_right_first = 'Dominant arm is not extending quite enough during contact. Make sure to either toss the ball a bit higher and/or make contact at its apex.'
arm_tip_extend_right_second = 'When you make contact at the apex of the ball flight, make sure that your arm is extended out slightly more.'
score += 2
else:
arm_tip_extend_right_first = 'Good job, your dominant arm is optimally extended on contact!'
arm_tip_extend_right_second = 'Nice, your dominant arm is extended the perfect amount!'
pick_from_this_right = [arm_tip_extend_right_first, arm_tip_extend_right_second]
arm_tip_extend_right = random.choice(pick_from_this_right)
arm_tips_extend = [arm_tip_extend_right, score]
return arm_tips_extend
def arms_tips_finish(user, playerleft_arm):
arm_tip_finish_left_first = 'None'
arm_tip_finish_left_second = 'None'
score = 0
if user.get_min_data('shoulder2wrist_left')[3] > (playerleft_arm.get_min_data()[3] * 1.20):
arm_tip_finish_left_first = 'Your non-dominant arm should be significantly closer to your body in preparation for an easier recovery of the racquet.'
arm_tip_finish_left_second = 'Recovery is much easier if your non-dominant arm is closer to your body. Try bringing your non-dominant arm significantly closer to your body on the finish.'
score += 7
elif user.get_min_data('shoulder2wrist_left')[3] > (playerleft_arm.get_min_data()[3] * 1.10):
arm_tip_finish_left_first = 'Your non-dominant arm should be closer to your body in preparation for an easier recovery of the racquet.'
arm_tip_finish_left_second = 'Recovery is much easier if your non-dominant arm is closer to your body. Try bringing your non-dominant arm closer to your body on the finish.'
score += 4
elif user.get_min_data('shoulder2wrist_left')[3] > (playerleft_arm.get_min_data()[3] * 1.05):
arm_tip_finish_left_first = 'Your non-dominant arm should be slightly closer to your body in preparation for an easier recovery of the racquet.'
arm_tip_finish_left_second = 'Recovery is much easier if your non-dominant arm is closer to your body. Try bringing your non-dominant arm slightly closer to your body on the finish.'
score += 2
else:
arm_tip_finish_left_first = 'Nice, your arms seem to be positioned correctly on the finish! Racquet recovery is much easier with optimal arm placement.'
arm_tip_finish_left_second = 'Good job, your arms seem to be positioned correctly on the finish! Racquet recovery is much easier with optimal arm placement.'
pick_from_this_left = [arm_tip_finish_left_first, arm_tip_finish_left_second]
arm_tip_finish_left = random.choice(pick_from_this_left)
arm_tips_finish = [arm_tip_finish_left, score]
return arm_tips_finish
def arm_tip_summary(user, playerright_arm, playerleft_arm):
full_arm_list = []
arm_start = arms_tips_start(user, playerright_arm, playerleft_arm)
arm_load = arms_tips_load(user, playerright_arm, playerleft_arm)
arm_extend = arms_tips_extend(user, playerright_arm)
arm_finish = arms_tips_finish(user, playerleft_arm)
arm_tip_list = [arm_start, arm_load, arm_extend, arm_finish]
for i in arm_tip_list:
for j in i:
if not isinstance(j, int):
full_arm_list.append(j)
return full_arm_list
def arm_score_quant(user, playerright_arm, playerleft_arm):
arm_start = arms_tips_start(user, playerright_arm, playerleft_arm)
arm_load = arms_tips_load(user, playerright_arm, playerleft_arm)
arm_extend = arms_tips_extend(user, playerright_arm)
arm_finish = arms_tips_finish(user, playerleft_arm)
arm_tip_list = [arm_start, arm_load, arm_extend, arm_finish]
score = []
for i in arm_tip_list:
score.append(i[-1])
return score
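# For illustration only (`combine_scores` is a hypothetical helper, not part of
# this module), the per-phase penalty lists returned by leg_score_quant and
# arm_score_quant could be folded into a single 0-100 serve score:

```python
def combine_scores(leg_penalties, arm_penalties, base=100):
    """Subtract every per-phase penalty from a base score, floored at 0."""
    return max(0, base - sum(leg_penalties) - sum(arm_penalties))

# Hypothetical penalty lists in the [start, load, extend, finish] order
# produced by leg_score_quant / arm_score_quant.
print(combine_scores([2, 4, 0, 7], [0, 2, 4, 0]))  # -> 81
```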
# body
def body_tips_start(user, playerright_body):
body_tip_start_first = 'None'
body_tip_start_second = 'None'
score_lost = 0
# less than the pro angles
if user.get_min_data('elbow2hip_right')[0] < (playerright_body.get_min_data()[0] * 0.80):
body_tip_start_first = 'Your dominant arm is hanging significantly too low on the starting position. Try raising your arm by a large amount.'
body_tip_start_second = 'Raise your dominant arm by a significant amount because it is hanging too low on the starting position.'
score_lost += 7
elif user.get_min_data('elbow2hip_right')[0] < (playerright_body.get_min_data()[0] * 0.90):
body_tip_start_first = 'Your dominant arm is hanging too low on the starting position. Try raising your arm.'
body_tip_start_second = 'Raise your dominant arm because it is hanging too low on the starting position.'
score_lost += 4
elif user.get_min_data('elbow2hip_right')[0] < (playerright_body.get_min_data()[0] * 0.95):
body_tip_start_first = 'Your dominant arm is hanging slightly too low on the starting position. Try raising your arm by a small amount.'
body_tip_start_second = 'Raise your dominant arm by a slight amount because it is hanging slightly too low on the starting position.'
score_lost += 2
# greater than the pro angles
elif user.get_min_data('elbow2hip_right')[0] > (playerright_body.get_min_data()[0] * 1.20):
body_tip_start_first = 'Your dominant arm is raised significantly too high on the starting position. Try lowering your arm by a large amount.'
body_tip_start_second = 'Lower your dominant arm by a large amount because it is raised significantly too high on the starting position.'
score_lost += 7
elif user.get_min_data('elbow2hip_right')[0] > (playerright_body.get_min_data()[0] * 1.10):
body_tip_start_first = 'Your dominant arm is raised too high on the starting position. Try lowering your arm.'
body_tip_start_second = 'Lower your dominant arm because it is raised too high on the starting position.'
score_lost += 4
elif user.get_min_data('elbow2hip_right')[0] > (playerright_body.get_min_data()[0] * 1.05):
body_tip_start_first = 'Your dominant arm is raised slightly too high on the starting position. Try lowering your arm by a small amount.'
body_tip_start_second = 'Lower your dominant arm by a small amount because it is raised slightly too high on the starting position.'
score_lost += 2
else:
body_tip_start_first = 'Your upper arms are the perfect distance from your body!'
body_tip_start_second = 'Nice job, your upper arms are the perfect distance from your body!'
pick_from_this_body = [body_tip_start_first, body_tip_start_second]
body_tip_start = random.choice(pick_from_this_body)
return [body_tip_start, score_lost]
def body_tips_load(user, playerleft_body):
score_lost = 0
body_tip_load_left_first = 'None'
body_tip_load_left_second = 'None'
if user.get_max_data('elbow2hip_left')[1] < (playerleft_body.get_max_data()[1] * 0.80):
body_tip_load_left_first = 'The tossing side of your body is significantly under stretching during the load. Try to reach up with your tossing arm a lot more.'
body_tip_load_left_second = 'Make sure to reach up with your tossing arm a lot more during the load because your tossing side of the body is significantly under stretching.'
score_lost += 7
elif user.get_max_data('elbow2hip_left')[1] < (playerleft_body.get_max_data()[1] * 0.90):
body_tip_load_left_first = 'The tossing side of your body is under stretching during the load. Try to reach up with your tossing arm more.'
body_tip_load_left_second = 'Make sure to reach up with your tossing arm more during the load because your tossing side of the body is under stretching.'
score_lost += 4
elif user.get_max_data('elbow2hip_left')[1] < (playerleft_body.get_max_data()[1] * 0.95):
body_tip_load_left_first = 'The tossing side of your body is slightly under stretching during the load. Try to reach up with your tossing arm a bit more.'
body_tip_load_left_second = 'Make sure to reach up with your tossing arm a bit more during the load because your tossing side of the body is slightly under stretching.'
score_lost += 2
elif user.get_max_data('elbow2hip_left')[1] > (playerleft_body.get_max_data()[1] * 1.20):
body_tip_load_left_first = 'The tossing side of your body is stretching significantly too much during the load. Try to align your tossing arm with the side of your body a lot more.'
body_tip_load_left_second = 'Make sure to align your tossing arm with the side of your body a lot more during the load because your tossing side of the body is stretching significantly too much.'
score_lost += 7
elif user.get_min_data('elbow2hip_left')[1] > (playerleft_body.get_min_data()[1] * 1.10):
body_tip_load_left_first = 'The tossing side of your body is stretching too much during the load. Try to align your tossing arm with the side of your body more.'
body_tip_load_left_second = 'Make sure align your tossing arm with the side of your body more during the load because your tossing side of the body is stretching too much.'
score_lost += 4
elif user.get_min_data('elbow2hip_left')[1] > (playerleft_body.get_min_data()[1] * 1.05):
body_tip_load_left_first = 'The tossing side of your body is stretching slightly too much during the load. Try to align your tossing arm with the side of your body slightly more.'
body_tip_load_left_second = 'Make sure to align your tossing arm with the side of your body a bit more during the load because the tossing side of your body is stretching slightly too much.'
score_lost += 2
else:
body_tip_load_left_first = 'Your tossing arm and subsequent extension of your side look perfect on the toss!'
body_tip_load_left_second = "Nice work, your tossing arm and side extension don't need any work at the moment!"
pick_from_this_body = [body_tip_load_left_first, body_tip_load_left_second]
body_tip_load = random.choice(pick_from_this_body)
return [body_tip_load, score_lost]
def body_tips_extend(user, playerright_body):
score_lost = 0
body_tip_extend = 'None'
# not extending enough during contact
if user.get_max_data('elbow2hip_right')[2] < (playerright_body.get_max_data()[2] * 0.60):
body_tip_extend = 'Your elbow on the dominant arm is significantly too low relative to your shoulder during the extension before contact. Try strongly adjusting your toss or your body position during extension.'
score_lost += 7
elif user.get_max_data('elbow2hip_right')[2] < (playerright_body.get_max_data()[2] * 0.80):
body_tip_extend = 'Your elbow on the dominant arm is too low relative to your shoulder during the extension before contact. Try adjusting your toss or your body position during extension.'
score_lost += 4
elif user.get_max_data('elbow2hip_right')[2] < (playerright_body.get_max_data()[2] * 0.90):
body_tip_extend = 'Your elbow on the dominant arm is slightly too low relative to your shoulder during the extension before contact. Try partially adjusting your toss or your body position during extension.'
score_lost += 2
elif user.get_max_data('elbow2hip_right')[2] > (playerright_body.get_max_data()[2] * 1.20):
body_tip_extend = 'Your elbow on the dominant arm is significantly too high relative to your shoulder during the extension before contact. Try strongly adjusting your toss or your body position during extension.'
score_lost += 7
elif user.get_max_data('elbow2hip_right')[2] > (playerright_body.get_max_data()[2] * 1.10):
body_tip_extend = 'Your elbow on the dominant arm is too high relative to your shoulder during the extension before contact. Try adjusting your toss or your body position during extension.'
score_lost += 4
elif user.get_max_data('elbow2hip_right')[2] > (playerright_body.get_max_data()[2] * 1.05):
body_tip_extend = 'Your elbow on the dominant arm is slightly too high relative to your shoulder during the extension before contact. Try partially adjusting your toss or your body position during extension.'
score_lost += 2
else:
body_tip_extend = 'Good job! The side of your body looks great during the extension into contact. The toss is maximized for body involvement in the serve.'
return [body_tip_extend, score_lost]
def body_score(user, playerright_body, playerleft_body):
body_start = body_tips_start(user, playerright_body)
body_load = body_tips_load(user, playerleft_body)
body_extend = body_tips_extend(user, playerright_body)
body_tip_list = [body_start, body_load, body_extend]
score = []
for i in body_tip_list:
score.append(i[0])
return score
def body_score_quant(user, playerright_body, playerleft_body):
body_start = body_tips_start(user, playerright_body)
body_load = body_tips_load(user, playerleft_body)
body_extend = body_tips_extend(user, playerright_body)
body_tip_list = [body_start, body_load, body_extend]
score = []
for i in body_tip_list:
score.append(i[-1])
return score
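Each `*_score` / `*_score_quant` pair above re-runs the same tip functions, once for the messages and once for the penalties. A minimal sketch of how both could be pulled from one pass (hypothetical helper, not called anywhere in this file):

```python
# Hypothetical refactor sketch: each tip function returns [tip, score_lost],
# so a single pass over those pairs can yield both lists at once.
def split_tips_and_penalties(tip_pairs):
    tips = [pair[0] for pair in tip_pairs]        # the message strings
    penalties = [pair[-1] for pair in tip_pairs]  # the score deductions
    return tips, penalties
```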
def total_score(user, playerright_leg, playerleft_leg, playerright_arm, playerleft_arm, playerright_body, playerleft_body):
legs = sum(leg_score_quant(user, playerright_leg, playerleft_leg))
arms = sum(arm_score_quant(user, playerright_arm, playerleft_arm))
body = sum(body_score_quant(user, playerright_body, playerleft_body))
score = 100 - (arms + legs + body)
if score < 0:
score = 0
return score
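The tip functions above all apply the same banded thresholds: 7, 4, or 2 points deducted as the user's measurement drifts more than roughly 20%, 10%, or 5% from the pro baseline. A minimal sketch of that pattern as a standalone helper (hypothetical, not used by the functions above):

```python
# Hypothetical sketch of the 7/4/2 banded-deviation scoring used throughout:
# widest band checked first, zero penalty when within 5% of the pro value.
def deviation_penalty(user_value, pro_value):
    ratio = user_value / pro_value
    for band, penalty in ((0.20, 7), (0.10, 4), (0.05, 2)):
        if ratio < 1.0 - band or ratio > 1.0 + band:
            return penalty
    return 0
```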
####################### for left handed players ##############################
# Legs
def legs_tips_start_left_handers(user, playerright_leg, playerleft_leg):
leg_tip_start_first = 'None'
leg_tip_start_second = 'None'
score_lost = 0
# less than the pro angles
if user.get_max_data('hip2ankle_left')[0] < (playerright_leg.get_max_data()[0] * 0.80) and user.get_max_data('hip2ankle_right')[0] < (playerleft_leg.get_max_data()[0] * 0.80):
leg_tip_start_first = 'Try to stand up significantly more during your ready position.'
leg_tip_start_second = 'Stand remarkably higher on your ready position.'
score_lost += 7
elif user.get_max_data('hip2ankle_left')[0] < (playerright_leg.get_max_data()[0] * 0.90) and user.get_max_data('hip2ankle_right')[0] < (playerleft_leg.get_max_data()[0] * 0.90):
leg_tip_start_first = 'Try to stand up more during your ready position.'
leg_tip_start_second = 'Stand higher on your ready position.'
score_lost += 4
elif user.get_max_data('hip2ankle_left')[0] < (playerright_leg.get_max_data()[0] * 0.95) and user.get_max_data('hip2ankle_right')[0] < (playerleft_leg.get_max_data()[0] * 0.95):
leg_tip_start_first = 'Try to stand up slightly more during your ready position.'
leg_tip_start_second = 'Stand slightly higher on your ready position.'
score_lost += 2
# greater than the pro angles
elif user.get_max_data('hip2ankle_left')[0] > (playerright_leg.get_max_data()[0] * 1.20) and user.get_max_data('hip2ankle_right')[0] > (playerleft_leg.get_max_data()[0] * 1.20):
leg_tip_start_first = 'Try to bend your legs significantly more during your ready position.'
leg_tip_start_second = 'Your legs are much too straight on the ready position. Bend them remarkably more.'
score_lost += 7
elif user.get_max_data('hip2ankle_left')[0] > (playerright_leg.get_max_data()[0] * 1.10) and user.get_max_data('hip2ankle_right')[0] > (playerleft_leg.get_max_data()[0] * 1.10):
leg_tip_start_first = 'Try to bend your legs more during your ready position.'
leg_tip_start_second = 'Your legs are too straight on the ready position. Bend them more.'
score_lost += 4
elif user.get_max_data('hip2ankle_left')[0] > (playerright_leg.get_max_data()[0] * 1.05) and user.get_max_data('hip2ankle_right')[0] > (playerleft_leg.get_max_data()[0] * 1.05):
leg_tip_start_first = 'Try to bend your legs slightly more during your ready position.'
leg_tip_start_second = 'Your legs are slightly too straight on the ready position. Bend them a bit more.'
score_lost += 2
# greater than one angle and less than the other
elif user.get_max_data('hip2ankle_left')[0] > (playerright_leg.get_max_data()[0] * 1.20) and user.get_max_data('hip2ankle_right')[0] < (playerleft_leg.get_max_data()[0] * 0.80):
leg_tip_start_first = 'Your legs seem to be starting at significantly contrasting angles. Try copying your leg positions more during your ready position to increase balance.'
leg_tip_start_second = 'Your legs are strongly out of sync. Try mirroring your leg positions significantly more during the ready position.'
score_lost += 7
elif user.get_max_data('hip2ankle_left')[0] < (playerright_leg.get_max_data()[0] * 0.90) and user.get_max_data('hip2ankle_right')[0] > (playerleft_leg.get_max_data()[0] * 1.10):
leg_tip_start_first = 'Your legs seem to be starting at contrasting angles. Try copying your leg positions more during your ready position to increase balance.'
leg_tip_start_second = 'Your legs are out of sync. Try mirroring your leg positions more during the ready position.'
score_lost += 4
elif user.get_max_data('hip2ankle_left')[0] < (playerright_leg.get_max_data()[0] * 0.95) and user.get_max_data('hip2ankle_right')[0] > (playerleft_leg.get_max_data()[0] * 1.05):
leg_tip_start_first = 'Your legs seem to be starting at slightly contrasting angles. Try copying your leg positions more during your ready position to increase balance.'
leg_tip_start_second = 'Your legs are slightly out of sync. Try mirroring your leg positions slightly more during the ready position.'
score_lost += 2
else:
leg_tip_start_first = 'Your legs are placed well, and your lower body posture looks good!'
leg_tip_start_second = 'You do not need to adjust your legs on your ready position. Your lower body posture looks good!'
pick_from_this = [leg_tip_start_first, leg_tip_start_second]
leg_tip_start = random.choice(pick_from_this)
return [leg_tip_start, score_lost]
def legs_tips_load_left_handers(user, playerright_leg, playerleft_leg):
score_lost = 0
leg_tip_load_first = 'None'
leg_tip_load_second = 'None'
# bending legs too much during load
if user.get_min_data('hip2ankle_left')[1] < (playerright_leg.get_min_data()[1] * 0.80) and user.get_min_data('hip2ankle_right')[1] < (playerleft_leg.get_min_data()[1] * 0.80):
leg_tip_load_first = 'You are loading your back leg too extremely. Try standing much taller during your takeback.'
leg_tip_load_second = 'You do not need to bend your back leg so extremely on the load. Try standing up much more on your takeback to keep yourself from being too low.'
score_lost += 7
elif user.get_min_data('hip2ankle_left')[1] < (playerright_leg.get_min_data()[1] * 0.90) and user.get_min_data('hip2ankle_right')[1] < (playerleft_leg.get_min_data()[1] * 0.90):
leg_tip_load_first = 'You are loading your back leg too much. Try standing taller during your takeback.'
leg_tip_load_second = 'You do not need to bend your back leg that much on the load. Try standing up more on your takeback to keep yourself from being too low.'
score_lost += 4
elif user.get_min_data('hip2ankle_left')[1] < (playerright_leg.get_min_data()[1] * 0.95) and user.get_min_data('hip2ankle_right')[1] < (playerleft_leg.get_min_data()[1] * 0.95):
leg_tip_load_first = 'You are loading your back leg a bit too much. Try standing slightly taller during your takeback.'
leg_tip_load_second = 'You are bending your back leg slightly too much on the load. Try standing up a bit more on your takeback to keep yourself from being too low.'
score_lost += 2
# not bending legs enough during load
elif user.get_min_data('hip2ankle_left')[1] > (playerright_leg.get_min_data()[1] * 1.20) and user.get_min_data('hip2ankle_right')[1] > (playerleft_leg.get_min_data()[1] * 1.20):
leg_tip_load_first = 'You can significantly increase power in your serve by bending your legs much more.'
leg_tip_load_second = 'Bend your legs much more to significantly decrease their angle in your load and subsequently significantly increase power.'
score_lost += 7
elif user.get_min_data('hip2ankle_left')[1] > (playerright_leg.get_min_data()[1] * 1.10) and user.get_min_data('hip2ankle_right')[1] > (playerleft_leg.get_min_data()[1] * 1.10):
leg_tip_load_first = 'You can increase power in your serve by bending your legs more.'
leg_tip_load_second = 'Bend your legs more to decrease their angle in your load and subsequently increase power.'
score_lost += 4
elif user.get_min_data('hip2ankle_left')[1] > (playerright_leg.get_min_data()[1] * 1.05) and user.get_min_data('hip2ankle_right')[1] > (playerleft_leg.get_min_data()[1] * 1.05):
leg_tip_load_first = 'You can slightly increase power in your serve by bending your legs a small amount.'
leg_tip_load_second = 'Bend your legs slightly more to decrease their angle in your load and subsequently slightly increase power.'
score_lost += 2
# leaning forward during takeback
elif user.get_min_data('hip2ankle_left')[1] > (playerright_leg.get_min_data()[1] * 1.20) and user.get_min_data('hip2ankle_right')[1] < (playerleft_leg.get_min_data()[1] * 0.80):
leg_tip_load_first = 'Your lower body is leaning significantly too far forward during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning remarkably too far forward during your takeback. Try to mirror your legs much more.'
score_lost += 7
elif user.get_min_data('hip2ankle_left')[1] > (playerright_leg.get_min_data()[1] * 1.10) and user.get_min_data('hip2ankle_right')[1] < (playerleft_leg.get_min_data()[1] * 0.90):
leg_tip_load_first = 'Your lower body may be leaning too far forward during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning too far forward during your takeback. Try to mirror your legs more.'
score_lost += 4
elif user.get_min_data('hip2ankle_left')[1] > (playerright_leg.get_min_data()[1] * 1.05) and user.get_min_data('hip2ankle_right')[1] < (playerleft_leg.get_min_data()[1] * 0.95):
leg_tip_load_first = 'Your lower body may be leaning slightly forward during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning a bit too far forward during your takeback. Try to mirror your legs a small fraction more.'
score_lost += 2
# leaning backward during takeback
elif user.get_min_data('hip2ankle_left')[1] < (playerright_leg.get_min_data()[1] * 0.80) and user.get_min_data('hip2ankle_right')[1] > (playerleft_leg.get_min_data()[1] * 1.20):
leg_tip_load_first = 'Your lower body is leaning significantly too far back during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning remarkably too far back during your takeback. Try to mirror your legs much more.'
score_lost += 7
elif user.get_min_data('hip2ankle_left')[1] < (playerright_leg.get_min_data()[1] * 0.90) and user.get_min_data('hip2ankle_right')[1] > (playerleft_leg.get_min_data()[1] * 1.10):
leg_tip_load_first = 'Your lower body is leaning too far back during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning too far back during your takeback. Try to mirror your legs more.'
score_lost += 4
elif user.get_min_data('hip2ankle_left')[1] < (playerright_leg.get_min_data()[1] * 0.95) and user.get_min_data('hip2ankle_right')[1] > (playerleft_leg.get_min_data()[1] * 1.05):
leg_tip_load_first = 'Your lower body is leaning slightly too far back during your takeback. Try to more equally match the bend in your right and left legs.'
leg_tip_load_second = 'You seem to be leaning a bit too far back during your takeback. Try to mirror your legs a small fraction more.'
score_lost += 2
else:
leg_tip_load_first = 'Your legs are well balanced during your takeback, and your legs are loading very well!'
leg_tip_load_second = 'Your balance does not need any improvement at this point because your legs are loading very well!'
pick_from_this = [leg_tip_load_first, leg_tip_load_second]
leg_tip_load = random.choice(pick_from_this)
return [leg_tip_load, score_lost]
def legs_tips_extend_left_handers(user, playerright_leg, playerleft_leg):
score_lost = 0
leg_tip_extend_first = 'None'
leg_tip_extend_second = 'None'
# not extending enough during contact
if user.get_max_data('hip2ankle_left')[2] < (playerright_leg.get_max_data()[2] * 0.80) and user.get_max_data('hip2ankle_right')[2] < (playerleft_leg.get_max_data()[2] * 0.80):
leg_tip_extend_first = 'Your legs are significantly under-extending during contact. Make sure to immensely increase drive with your legs on your extension.'
leg_tip_extend_second = 'You are remarkably too low on your extension during contact. Drive much more with your legs to be straighter on contact.'
score_lost += 7
elif user.get_max_data('hip2ankle_left')[2] < (playerright_leg.get_max_data()[2] * 0.90) and user.get_max_data('hip2ankle_right')[2] < (playerleft_leg.get_max_data()[2] * 0.90):
leg_tip_extend_first = 'Your legs are under-extending during contact. Make sure to drive more with your legs on your extension.'
leg_tip_extend_second = 'You are too low on your extension during contact. Drive more with your legs to be straighter on contact.'
score_lost += 4
elif user.get_max_data('hip2ankle_left')[2] < (playerright_leg.get_max_data()[2] * 0.95) and user.get_max_data('hip2ankle_right')[2] < (playerleft_leg.get_max_data()[2] * 0.95):
leg_tip_extend_first = 'Your legs are slightly under-extending during contact. Make sure to drive slightly more with your legs on your extension.'
leg_tip_extend_second = 'You are slightly too low on your extension during contact. Drive a bit more with your legs to be straighter on contact.'
score_lost += 2
# mirroring extension with the legs
elif user.get_max_data('hip2ankle_left')[2] > (playerright_leg.get_max_data()[2] * 1.10) and user.get_max_data('hip2ankle_right')[2] < (playerleft_leg.get_max_data()[2] * 0.90):
leg_tip_extend_first = 'Your back leg should extend more to be closer in line with your front leg during extension and contact.'
leg_tip_extend_second = 'Your legs should be closely in line during extension and contact. Try to extend your back leg more to accomplish this.'
score_lost += 4
elif user.get_max_data('hip2ankle_left')[2] < (playerright_leg.get_max_data()[2] * 0.90) and user.get_max_data('hip2ankle_right')[2] > (playerleft_leg.get_max_data()[2] * 1.10):
leg_tip_extend_first = 'Your front leg should extend more to be closer in line with your back leg during extension and contact.'
leg_tip_extend_second = 'Your legs should be closely in line during extension and contact. Try to extend your front leg more to accomplish this.'
score_lost += 4
else:
leg_tip_extend_first = 'Good job! Your legs are optimally extending on contact and consistently mirroring each other.'
leg_tip_extend_second = 'Your legs are optimally extending on contact and consistently mirroring each other! There is no need to adjust your legs during your extension.'
pick_from_this = [leg_tip_extend_first, leg_tip_extend_second]
leg_tip_extend = random.choice(pick_from_this)
return [leg_tip_extend, score_lost]
def legs_tips_finish_left_handers(user, playerleft_leg):
score_lost = 0
leg_tip_finish_first = 'None'
leg_tip_finish_second = 'None'
if user.get_min_data('hip2ankle_right')[3] < (playerleft_leg.get_min_data()[3] * 0.80):
leg_tip_finish_first = 'You are dipping your front leg significantly too much during the finish. Try landing a lot taller in the legs to optimize your recovery.'
leg_tip_finish_second = 'Your front leg is absorbing remarkably too much on the finish. You need to finish much taller to have a better chance at recovery.'
score_lost += 7
elif user.get_min_data('hip2ankle_right')[3] < (playerleft_leg.get_min_data()[3] * 0.90):
leg_tip_finish_first = 'You are dipping your front leg too much during the finish. Try landing taller in the legs to optimize your recovery.'
leg_tip_finish_second = 'Your front leg is absorbing too strongly on the finish. You need to finish taller to have a better chance at recovery.'
score_lost += 4
elif user.get_min_data('hip2ankle_right')[3] < (playerleft_leg.get_min_data()[3] * 0.95):
leg_tip_finish_first = 'You are dipping your front leg slightly too much during the finish. Try landing a bit taller in the legs to optimize your recovery.'
leg_tip_finish_second = 'Your front leg is absorbing slightly too much on the finish. You need to finish a bit taller to have a better chance at recovery.'
score_lost += 2
elif user.get_min_data('hip2ankle_right')[3] > (playerleft_leg.get_min_data()[3] * 1.20):
leg_tip_finish_first = 'You are standing significantly too tall during your finish. Try getting a lot lower in the legs to absorb your impact with the ground.'
leg_tip_finish_second = 'You are remarkably too high on the finish. You need to get much lower and absorb the impact with your legs more. '
score_lost += 7
elif user.get_min_data('hip2ankle_right')[3] > (playerleft_leg.get_min_data()[3] * 1.10):
leg_tip_finish_first = 'You are standing too tall during your finish. Try getting lower in the legs to absorb your impact with the ground.'
leg_tip_finish_second = 'You are too high on the finish. You need to get lower and absorb the impact with your legs more. '
score_lost += 4
elif user.get_min_data('hip2ankle_right')[3] > (playerleft_leg.get_min_data()[3] * 1.05):
leg_tip_finish_first = 'You are standing slightly too tall during your finish. Try getting a bit lower in the legs to absorb your impact with the ground.'
leg_tip_finish_second = 'You are slightly too high on the finish. You need to get a bit lower and absorb the impact with your legs more. '
score_lost += 2
else:
leg_tip_finish_first = 'You are landing in a great position that is optimal for balance and recovery!'
leg_tip_finish_second = 'Your landing is neither too low nor too high on the finish! You are optimized for a strong recovery after the serve.'
pick_from_this = [leg_tip_finish_first, leg_tip_finish_second]
leg_tip_finish = random.choice(pick_from_this)
return [leg_tip_finish, score_lost]
def leg_score_left_handers(user, playerright_leg, playerleft_leg):
leg_start_left_handers = legs_tips_start_left_handers(user, playerright_leg, playerleft_leg)
leg_load_left_handers = legs_tips_load_left_handers(user, playerright_leg, playerleft_leg)
leg_extend_left_handers = legs_tips_extend_left_handers(user, playerright_leg, playerleft_leg)
leg_finish_left_handers = legs_tips_finish_left_handers(user, playerright_leg)
leg_tip_list = [leg_start_left_handers, leg_load_left_handers, leg_extend_left_handers, leg_finish_left_handers]
score = []
for i in leg_tip_list:
score.append(i[0])
return score
def leg_score_quant_left_handers(user, playerright_leg, playerleft_leg):
leg_start_left_handers = legs_tips_start_left_handers(user, playerright_leg, playerleft_leg)
leg_load_left_handers = legs_tips_load_left_handers(user, playerright_leg, playerleft_leg)
leg_extend_left_handers = legs_tips_extend_left_handers(user, playerright_leg, playerleft_leg)
leg_finish_left_handers = legs_tips_finish_left_handers(user, playerright_leg)
leg_tip_list = [leg_start_left_handers, leg_load_left_handers, leg_extend_left_handers, leg_finish_left_handers]
score = []
for i in leg_tip_list:
score.append(i[-1])
return score
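The left-hander functions above appear to mirror sides: the user's left-side keypoints are compared against the pro's right-side baselines and vice versa (an assumption based on the argument order). A small hypothetical utility that makes that naming convention explicit:

```python
# Hypothetical helper (not used above): swap the '_left'/'_right' suffix on a
# keypoint name, mirroring sides for left-handed comparisons.
def mirror_side(keypoint_name):
    if keypoint_name.endswith('_left'):
        return keypoint_name[: -len('_left')] + '_right'
    if keypoint_name.endswith('_right'):
        return keypoint_name[: -len('_right')] + '_left'
    return keypoint_name  # names without a side suffix pass through unchanged
```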
# Arms
def arms_tips_start_left_handers(user, playerright_arm, playerleft_arm):
arm_tip_start_right_first = 'None'
arm_tip_start_right_second = 'None'
arm_tip_start_both_first = 'None'
arm_tip_start_both_second = 'None'
score = 0
# dominant arm extended out too much
if user.get_min_data('shoulder2wrist_left')[0] > (playerright_arm.get_min_data()[0] * 1.20):
arm_tip_start_right_first = 'Dominant arm is extended out significantly too much during the starting position. Tuck your arm in a large amount to minimize the time it takes to start the takeback.'
arm_tip_start_right_second = 'Dominant arm is extended out notably too much during the starting position. Tuck your arm in a huge amount to minimize the time it takes to start the takeback.'
score += 7
elif user.get_min_data('shoulder2wrist_left')[0] > (playerright_arm.get_min_data()[0] * 1.10):
arm_tip_start_right_first = 'Dominant arm is extended out too much during the starting position. Tuck your arm in more to minimize the time it takes to start the takeback.'
arm_tip_start_right_second = 'The dominant arm does not need to be extended out that much during the starting position. Tuck your arm in to minimize the time it takes to start the takeback.'
score += 4
elif user.get_min_data('shoulder2wrist_left')[0] > (playerright_arm.get_min_data()[0] * 1.05):
arm_tip_start_right_first = 'Dominant arm is extended out slightly too much during the starting position. Tuck your arm in a bit more to slightly minimize the time it takes to start the takeback.'
arm_tip_start_right_second = 'The dominant arm is slightly over-extended during the starting position. Tuck your arm in a bit to minimize the time it takes to start the takeback.'
score += 2
# dominant arm tucked in too much
elif user.get_min_data('shoulder2wrist_left')[0] < (playerright_arm.get_min_data()[0] * 0.80):
arm_tip_start_right_first = 'Dominant arm is tucked in significantly too much during the starting position. Extend your arm out a large amount to increase momentum going into the takeback.'
arm_tip_start_right_second = 'Your momentum going into the takeback could increase significantly if your dominant arm is extended out remarkably more during the starting position.'
score += 7
elif user.get_min_data('shoulder2wrist_left')[0] < (playerright_arm.get_min_data()[0] * 0.90):
arm_tip_start_right_first = 'Dominant arm is tucked in too much during the starting position. Extend your arm out more to increase momentum going into the takeback.'
arm_tip_start_right_second = 'Your momentum going into the takeback could increase if your dominant arm is extended out more during the starting position.'
score += 4
elif user.get_min_data('shoulder2wrist_left')[0] < (playerright_arm.get_min_data()[0] * 0.95):
arm_tip_start_right_first = 'Dominant arm is tucked in slightly too much during the starting position. Extend your arm out a bit more to increase momentum going into the takeback.'
arm_tip_start_right_second = 'Your momentum going into the takeback could slightly increase if your dominant arm is extended out a bit more during the starting position.'
score += 2
else:
arm_tip_start_right_first = 'Great work, your racquet seems to be starting in the right spot based on your dominant arm position!'
arm_tip_start_right_second = 'Your dominant arm is positioned well, so your racquet is starting in the right spot!'
# arms not aligned
if user.get_min_data('shoulder2wrist_right')[0] > (playerleft_arm.get_min_data()[0] * 1.20) and user.get_min_data('shoulder2wrist_left')[0] < (playerright_arm.get_min_data()[0] * 0.80):
arm_tip_start_both_first = 'Arms are significantly not aligned enough during starting position. Your arms should mirror each other more closely by bending your left and/or retracting your right.'
arm_tip_start_both_second = 'Arms are significantly unaligned during starting position. Your arms need to copy each other more closely by bending your left and/or retracting your right.'
score += 7
elif user.get_min_data('shoulder2wrist_right')[0] > (playerleft_arm.get_min_data()[0] * 1.10) and user.get_min_data('shoulder2wrist_left')[0] < (playerright_arm.get_min_data()[0] * 0.90):
arm_tip_start_both_first = 'Arms are not aligned enough during starting position. Your arms should mirror each other more closely by bending your left and/or retracting your right.'
arm_tip_start_both_second = 'Arms are unaligned during starting position. Your arms need to copy each other more closely by bending your left and/or retracting your right.'
score += 4
elif user.get_min_data('shoulder2wrist_right')[0] > (playerleft_arm.get_min_data()[0] * 1.05) and user.get_min_data('shoulder2wrist_left')[0] < (playerright_arm.get_min_data()[0] * 0.95):
arm_tip_start_both_first = 'Arms are slightly not aligned enough during starting position. Your arms should mirror each other more closely by bending your left and/or retracting your right.'
arm_tip_start_both_second = 'Arms are slightly unaligned during starting position. Your arms need to copy each other more closely by bending your left and/or retracting your right.'
score += 2
elif user.get_min_data('shoulder2wrist_right')[0] < (playerleft_arm.get_min_data()[0] * 0.80) and user.get_min_data('shoulder2wrist_left')[0] > (playerright_arm.get_min_data()[0] * 1.20):
arm_tip_start_both_first = 'Arms are significantly not aligned enough during starting position. Your arms should mirror each other more closely by bending your right arm and/or retracting your left arm.'
arm_tip_start_both_second = 'Arms are significantly unaligned during starting position. Your arms need to copy each other more closely by bending your right and/or retracting your left.'
score += 7
elif user.get_min_data('shoulder2wrist_right')[0] < (playerleft_arm.get_min_data()[0] * 0.90) and user.get_min_data('shoulder2wrist_left')[0] > (playerright_arm.get_min_data()[0] * 1.10):
arm_tip_start_both_first = 'Arms are not aligned enough during starting position. Your arms should mirror each other more closely by bending your right arm and/or retracting your left arm.'
arm_tip_start_both_second = 'Arms are unaligned during starting position. Your arms need to copy each other more closely by bending your right and/or retracting your left.'
score += 4
elif user.get_min_data('shoulder2wrist_right')[0] < (playerleft_arm.get_min_data()[0] * 0.95) and user.get_min_data('shoulder2wrist_left')[0] > (playerright_arm.get_min_data()[0] * 1.05):
arm_tip_start_both_first = 'Arms are slightly not aligned enough during starting position. Your arms should mirror each other more closely by bending your right arm and/or retracting your left arm.'
arm_tip_start_both_second = 'Arms are slightly unaligned during starting position. Your arms need to copy each other more closely by bending your right and/or retracting your left.'
score += 2
else:
arm_tip_start_both_first = 'Overall, arms are aligned well during the starting position!'
arm_tip_start_both_second = 'Good job, arms are aligned well during the starting position!'
pick_from_this_right = [arm_tip_start_right_first, arm_tip_start_right_second]
pick_from_this_both = [arm_tip_start_both_first, arm_tip_start_both_second]
arm_tip_start_right = random.choice(pick_from_this_right)
arm_tip_start_both = random.choice(pick_from_this_both)
arm_tips_start = [arm_tip_start_right, arm_tip_start_both, score]
return arm_tips_start
def arms_tips_load_left_handers(user, playerright_arm, playerleft_arm):
arm_tip_load_right_first = 'None'
arm_tip_load_right_second = 'None'
arm_tip_load_left_first = 'None'
arm_tip_load_left_second = 'None'
score = 0
# bending arm too much
if user.get_min_data('shoulder2wrist_left')[1] < (playerright_arm.get_min_data()[1] * 0.80):
arm_tip_load_right_first = 'You are bending your dominant arm significantly too much on the loadup. Extend your arm a lot more for more power.'
arm_tip_load_right_second= 'To increase power by a large amount, you should extend your dominant arm much more because you are bending it significantly too much on the loadup.'
score += 7
elif user.get_min_data('shoulder2wrist_left')[1] < (playerright_arm.get_min_data()[1] * 0.90):
arm_tip_load_right_first = 'You are bending your dominant arm too much on the loadup. Extend your arm to a comfortable position for more power.'
arm_tip_load_right_second= 'To increase power, you should extend your dominant arm more because you are bending it too much on the loadup.'
score += 4
elif user.get_min_data('shoulder2wrist_left')[1] < (playerright_arm.get_min_data()[1] * 0.95):
arm_tip_load_right_first = 'You are bending your dominant arm slightly too much on the loadup. Extend your arm a bit to a comfortable position for more power.'
arm_tip_load_right_second = 'To slightly increase power, you should extend your dominant arm a bit more because you are bending it fractionally too much on the loadup.'
score += 2
# not enough arm bend
elif user.get_min_data('shoulder2wrist_left')[1] > (playerright_arm.get_min_data()[1] * 1.20):
arm_tip_load_right_first = 'You are significantly under retracting your dominant arm in the loadup. Bend your arm a lot more to a comfortable position for more power.'
arm_tip_load_right_second= 'You are extended significantly too much on your dominant arm in the loadup. Bend your arm a lot more to a comfortable position for more power.'
score += 7
elif user.get_min_data('shoulder2wrist_left')[1] > (playerright_arm.get_min_data()[1] * 1.10):
arm_tip_load_right_first = 'You are not bending your dominant arm enough on the loadup. Bend your arm to a comfortable position for more power.'
arm_tip_load_right_second = 'You are extended too much on your dominant arm in the loadup. Bend your arm more to a comfortable position for more power.'
score += 4
elif user.get_min_data('shoulder2wrist_left')[1] > (playerright_arm.get_min_data()[1] * 1.05):
arm_tip_load_right_first = 'You are not bending your dominant arm quite enough on the loadup. Bend your arm a bit to a comfortable position for more power.'
arm_tip_load_right_second = 'You are extended slightly too much on your dominant arm in the loadup. Bend your arm a bit more to a comfortable position for more power.'
score += 2
else:
arm_tip_load_right_first = 'Great work, your dominant arm is optimized for power in the loadup!'
arm_tip_load_right_second = 'Good job, your dominant arm does not need any changes in the loadup at the moment because it is optimized for power!'
# arm extension during toss
if user.get_average_data('shoulder2wrist_right')[1] < (playerleft_arm.get_average_data()[1] * 0.80):
arm_tip_load_left_first = 'Non-dominant arm extension throughout the toss is significantly inconsistent. Try to keep your left arm a lot straighter on the takeback and load.'
arm_tip_load_left_second = 'The extension of your non-dominant arm throughout the toss is significantly inconsistent. Try to keep your left arm a lot straighter on the takeback and load.'
score += 7
elif user.get_average_data('shoulder2wrist_right')[1] < (playerleft_arm.get_average_data()[1] * 0.90):
arm_tip_load_left_first = 'Non-dominant arm extension throughout the toss is inconsistent. Try to keep your left arm straighter on the takeback and load.'
arm_tip_load_left_second = 'The extension of your non-dominant arm throughout the toss is inconsistent. Try to keep your left arm straighter on the takeback and load.'
score += 4
elif user.get_average_data('shoulder2wrist_right')[1] < (playerleft_arm.get_average_data()[1] * 0.95):
arm_tip_load_left_first = 'Non-dominant arm extension throughout the toss is slightly inconsistent. Try to keep your left arm a bit straighter on the takeback and load.'
arm_tip_load_left_second = 'The extension of your non-dominant arm throughout the toss is slightly inconsistent. Try to keep your left arm a bit straighter on the takeback and load.'
score += 2
else:
arm_tip_load_left_first = 'Your tossing arm looks very fluid and consistent!'
arm_tip_load_left_second = 'Great work on the tossing arm in the load! It looks very fluid and consistent.'
pick_from_this_right = [arm_tip_load_right_first, arm_tip_load_right_second]
pick_from_this_left = [arm_tip_load_left_first, arm_tip_load_left_second]
arm_tip_load_right = random.choice(pick_from_this_right)
arm_tip_load_left = random.choice(pick_from_this_left)
arm_tips_load = [arm_tip_load_right, arm_tip_load_left, score]
return arm_tips_load
def arms_tips_extend_left_handers(user, playerright_arm):
arm_tip_extend_right_first = 'None'
arm_tip_extend_right_second = 'None'
score = 0
if user.get_max_data('shoulder2wrist_left')[2] < (playerright_arm.get_max_data()[2] * 0.80):
arm_tip_extend_right_first = 'Dominant arm is significantly under-extending during contact. Make sure to either toss the ball a lot higher and/or make contact at its apex.'
arm_tip_extend_right_second = "When you make contact at the ball flight's apex, make sure that your arm is extended out significantly more."
score += 7
elif user.get_max_data('shoulder2wrist_left')[2] < (playerright_arm.get_max_data()[2] * 0.90):
arm_tip_extend_right_first = 'Dominant arm is not extending enough during contact. Make sure to either toss the ball higher and/or make contact at its apex.'
arm_tip_extend_right_second = "When you make contact at the ball flight's apex, make sure that your arm is extended out more."
score += 4
elif user.get_max_data('shoulder2wrist_left')[2] < (playerright_arm.get_max_data()[2] * 0.95):
arm_tip_extend_right_first = 'Dominant arm is slightly under-extending during contact. Make sure to either toss the ball a bit higher and/or make contact at its apex.'
arm_tip_extend_right_second = "When you make contact at the ball flight's apex, make sure that your arm is extended out slightly more."
score += 2
else:
arm_tip_extend_right_first = 'Good job, your dominant arm is optimally extended on contact!'
arm_tip_extend_right_second = 'Nice, your dominant arm is extended the perfect amount!'
pick_from_this_right = [arm_tip_extend_right_first, arm_tip_extend_right_second]
arm_tip_extend_right = random.choice(pick_from_this_right)
arm_tips_extend = [arm_tip_extend_right, score]
return arm_tips_extend
def arms_tips_finish_left_handers(user, playerleft_arm):
arm_tip_finish_left_first = 'None'
arm_tip_finish_left_second = 'None'
score = 0
if user.get_min_data('shoulder2wrist_right')[3] > (playerleft_arm.get_min_data()[3] * 1.20):
arm_tip_finish_left_first = 'Your non-dominant arm should be significantly closer to your body in preparation for an easier recovery of the racquet.'
arm_tip_finish_left_second = 'Recovery is much easier if your non-dominant arm is closer to your body. Try bringing your non-dominant arm significantly closer to your body on the finish.'
score += 7
elif user.get_min_data('shoulder2wrist_right')[3] > (playerleft_arm.get_min_data()[3] * 1.10):
arm_tip_finish_left_first = 'Your non-dominant arm should be closer to your body in preparation for an easier recovery of the racquet.'
arm_tip_finish_left_second = 'Recovery is much easier if your non-dominant arm is closer to your body. Try bringing your non-dominant arm closer to your body on the finish.'
score += 4
elif user.get_min_data('shoulder2wrist_right')[3] > (playerleft_arm.get_min_data()[3] * 1.05):
arm_tip_finish_left_first = 'Your non-dominant arm should be slightly closer to your body in preparation for an easier recovery of the racquet.'
arm_tip_finish_left_second = 'Recovery is much easier if your non-dominant arm is closer to your body. Try bringing your non-dominant arm slightly closer to your body on the finish.'
score += 2
else:
arm_tip_finish_left_first = 'Nice, your arms seem to be positioned correctly on the finish! Racquet recovery is much easier with optimal arm placement.'
arm_tip_finish_left_second = 'Good job, your arms seem to be positioned correctly on the finish! Racquet recovery is much easier with optimal arm placement.'
pick_from_this_left = [arm_tip_finish_left_first, arm_tip_finish_left_second]
arm_tip_finish_left = random.choice(pick_from_this_left)
arm_tips_finish = [arm_tip_finish_left, score]
return arm_tips_finish
def arm_tip_summary_left_handers(user, playerright_arm, playerleft_arm):
full_arm_list = []
arm_start_left_handers = arms_tips_start_left_handers(user, playerright_arm, playerleft_arm)
arm_load_left_handers = arms_tips_load_left_handers(user, playerright_arm, playerleft_arm)
arm_extend_left_handers = arms_tips_extend_left_handers(user, playerright_arm)
arm_finish_left_handers = arms_tips_finish_left_handers(user, playerleft_arm)
arm_tip_list = [arm_start_left_handers, arm_load_left_handers, arm_extend_left_handers, arm_finish_left_handers]
for i in arm_tip_list:
for j in i:
if type(j) != int:
full_arm_list.append(j)
return full_arm_list
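# The summary helper above collects lists shaped like [tip, tip, ..., score] and
# keeps only the string tips via `type(j) != int`. A minimal standalone sketch of
# that filtering pattern using the more idiomatic isinstance() check — the
# function name here is illustrative, not part of this module:

```python
def flatten_tips(tip_lists):
    """Flatten lists shaped like [tip, ..., score], dropping the trailing int scores.

    Illustrative stand-in for the filtering loop in arm_tip_summary_left_handers.
    """
    return [item for tips in tip_lists for item in tips
            if not isinstance(item, int)]
```

# For example, flatten_tips([['tip a', 'tip b', 7], ['tip c', 2]]) keeps the
# three tip strings and discards the two scores.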
def arm_score_quant_left_handers(user, playerright_arm, playerleft_arm):
arm_start_left_handers = arms_tips_start_left_handers(user, playerright_arm, playerleft_arm)
arm_load_left_handers = arms_tips_load_left_handers(user, playerright_arm, playerleft_arm)
arm_extend_left_handers = arms_tips_extend_left_handers(user, playerright_arm)
arm_finish_left_handers = arms_tips_finish_left_handers(user, playerleft_arm)
arm_tip_list = [arm_start_left_handers, arm_load_left_handers, arm_extend_left_handers, arm_finish_left_handers]
score = []
for i in arm_tip_list:
score.append(i[-1])
return score
# body
def body_tips_start_left_handers(user, playerright_body):
body_tip_start_first = 'None'
body_tip_start_second = 'None'
score_lost = 0
# less than the pro angles
if user.get_min_data('elbow2hip_left')[0] < (playerright_body.get_min_data()[0] * 0.80):
body_tip_start_first = 'Your dominant arm is hanging significantly too low on the starting position. Try raising your arm by a large amount.'
body_tip_start_second = 'Raise your dominant arm by a significant amount because it is hanging too low on the starting position.'
score_lost += 7
elif user.get_min_data('elbow2hip_left')[0] < (playerright_body.get_min_data()[0] * 0.90):
body_tip_start_first = 'Your dominant arm is hanging too low on the starting position. Try raising your arm.'
body_tip_start_second = 'Raise your dominant arm because it is hanging too low on the starting position.'
score_lost += 4
elif user.get_min_data('elbow2hip_left')[0] < (playerright_body.get_min_data()[0] * 0.95):
body_tip_start_first = 'Your dominant arm is hanging slightly too low on the starting position. Try raising your arm by a small amount.'
body_tip_start_second = 'Raise your dominant arm by a slight amount because it is hanging slightly too low on the starting position.'
score_lost += 2
# greater than the pro angles
elif user.get_min_data('elbow2hip_left')[0] > (playerright_body.get_max_data()[0] * 1.20):
body_tip_start_first = 'Your dominant arm is raised significantly too high on the starting position. Try lowering your arm by a large amount.'
body_tip_start_second = 'Lower your dominant arm by a large amount because it is raised significantly too high on the starting position.'
score_lost += 7
elif user.get_min_data('elbow2hip_left')[0] > (playerright_body.get_max_data()[0] * 1.10):
body_tip_start_first = 'Your dominant arm is raised too high on the starting position. Try lowering your arm.'
body_tip_start_second = 'Lower your dominant arm because it is raised too high on the starting position.'
score_lost += 4
elif user.get_min_data('elbow2hip_left')[0] > (playerright_body.get_max_data()[0] * 1.05):
body_tip_start_first = 'Your dominant arm is raised slightly too high on the starting position. Try lowering your arm by a small amount.'
body_tip_start_second = 'Lower your dominant arm by a small amount because it is raised slightly too high on the starting position.'
score_lost += 2
else:
body_tip_start_first = 'Your upper arms are the perfect distance from your body!'
body_tip_start_second = 'Nice job, your upper arms are the perfect distance from your body!'
pick_from_this_body = [body_tip_start_first, body_tip_start_second]
body_tip_start = random.choice(pick_from_this_body)
return [body_tip_start, score_lost]
def body_tips_load_left_handers(user, playerleft_body):
score_lost = 0
body_tip_load_left_first = 'None'
body_tip_load_left_second = 'None'
if user.get_max_data('elbow2hip_right')[1] < (playerleft_body.get_max_data()[1] * 0.80):
body_tip_load_left_first = 'The tossing side of your body is significantly under-stretching during the load. Try to reach up with your tossing arm a lot more.'
body_tip_load_left_second = 'Make sure to reach up with your tossing arm a lot more during the load because the tossing side of your body is significantly under-stretching.'
score_lost += 7
elif user.get_max_data('elbow2hip_right')[1] < (playerleft_body.get_max_data()[1] * 0.90):
body_tip_load_left_first = 'The tossing side of your body is under-stretching during the load. Try to reach up with your tossing arm more.'
body_tip_load_left_second = 'Make sure to reach up with your tossing arm more during the load because the tossing side of your body is under-stretching.'
score_lost += 4
elif user.get_max_data('elbow2hip_right')[1] < (playerleft_body.get_max_data()[1] * 0.95):
body_tip_load_left_first = 'The tossing side of your body is slightly under-stretching during the load. Try to reach up with your tossing arm a bit more.'
body_tip_load_left_second = 'Make sure to reach up with your tossing arm a bit more during the load because the tossing side of your body is slightly under-stretching.'
score_lost += 2
elif user.get_max_data('elbow2hip_right')[1] > (playerleft_body.get_max_data()[1] * 1.20):
body_tip_load_left_first = 'The tossing side of your body is stretching significantly too much during the load. Try to align your tossing arm with the side of your body a lot more.'
body_tip_load_left_second = 'Make sure to align your tossing arm with the side of your body a lot more during the load because the tossing side of your body is stretching significantly too much.'
score_lost += 7
elif user.get_max_data('elbow2hip_right')[1] > (playerleft_body.get_max_data()[1] * 1.10):
body_tip_load_left_first = 'The tossing side of your body is stretching too much during the load. Try to align your tossing arm with the side of your body more.'
body_tip_load_left_second = 'Make sure to align your tossing arm with the side of your body more during the load because the tossing side of your body is stretching too much.'
score_lost += 4
elif user.get_max_data('elbow2hip_right')[1] > (playerleft_body.get_max_data()[1] * 1.05):
body_tip_load_left_first = 'The tossing side of your body is stretching slightly too much during the load. Try to align your tossing arm with the side of your body slightly more.'
body_tip_load_left_second = 'Make sure to align your tossing arm with the side of your body a bit more during the load because the tossing side of your body is stretching slightly too much.'
score_lost += 2
else:
body_tip_load_left_first = 'Your tossing arm and subsequent extension of your side look perfect on the toss!'
body_tip_load_left_second = "Nice work, your tossing arm and side extension don't need any work at the moment!"
pick_from_this_body = [body_tip_load_left_first, body_tip_load_left_second]
body_tip_load = random.choice(pick_from_this_body)
return [body_tip_load, score_lost]
def body_tips_extend_left_handers(user, playerright_body):
score_lost = 0
body_tip_extend = 'None'
# not extending enough during contact
if user.get_max_data('elbow2hip_left')[2] < (playerright_body.get_max_data()[2] * 0.60):
body_tip_extend = 'Your elbow on the dominant arm is significantly too low relative to your shoulder during the extension before contact. Try strongly adjusting your toss or your body position during extension.'
score_lost += 7
elif user.get_max_data('elbow2hip_left')[2] < (playerright_body.get_max_data()[2] * 0.80):
body_tip_extend = 'Your elbow on the dominant arm is too low relative to your shoulder during the extension before contact. Try adjusting your toss or your body position during extension.'
score_lost += 4
elif user.get_max_data('elbow2hip_left')[2] < (playerright_body.get_max_data()[2] * 0.90):
body_tip_extend = 'Your elbow on the dominant arm is slightly too low relative to your shoulder during the extension before contact. Try partially adjusting your toss or your body position during extension.'
score_lost += 2
elif user.get_max_data('elbow2hip_left')[2] > (playerright_body.get_max_data()[2] * 1.20):
body_tip_extend = 'Your elbow on the dominant arm is significantly too high relative to your shoulder during the extension before contact. Try strongly adjusting your toss or your body position during extension.'
score_lost += 7
elif user.get_max_data('elbow2hip_left')[2] > (playerright_body.get_max_data()[2] * 1.10):
body_tip_extend = 'Your elbow on the dominant arm is too high relative to your shoulder during the extension before contact. Try adjusting your toss or your body position during extension.'
score_lost += 4
elif user.get_max_data('elbow2hip_left')[2] > (playerright_body.get_max_data()[2] * 1.05):
body_tip_extend = 'Your elbow on the dominant arm is slightly too high relative to your shoulder during the extension before contact. Try partially adjusting your toss or your body position during extension.'
score_lost += 2
else:
body_tip_extend = 'Good job! The side of your body looks great during the extension into contact. The toss is maximized for body involvement in the serve.'
return [body_tip_extend, score_lost]
def body_score_left_handers(user, playerright_body, playerleft_body):
body_start_left_handers = body_tips_start_left_handers(user, playerright_body)
body_load_left_handers = body_tips_load_left_handers(user, playerleft_body)
body_extend_left_handers = body_tips_extend_left_handers(user, playerright_body)
body_tip_list = [body_start_left_handers, body_load_left_handers, body_extend_left_handers]
score = []
for i in body_tip_list:
score.append(i[0])
return score
def body_score_quant_left_handers(user, playerright_body, playerleft_body):
body_start_left_handers = body_tips_start_left_handers(user, playerright_body)
body_load_left_handers = body_tips_load_left_handers(user, playerleft_body)
body_extend_left_handers = body_tips_extend_left_handers(user, playerright_body)
body_tip_list = [body_start_left_handers, body_load_left_handers, body_extend_left_handers]
score = []
for i in body_tip_list:
score.append(i[-1])
return score
def total_score_left_handers(user, playerright_leg, playerleft_leg, playerright_arm, playerleft_arm, playerright_body, playerleft_body):
legs = sum(leg_score_quant_left_handers(user, playerright_leg, playerleft_leg))
arms = sum(arm_score_quant_left_handers(user, playerright_arm, playerleft_arm))
body = sum(body_score_quant_left_handers(user, playerright_body, playerleft_body))
score = 100 - (arms + legs + body)
if score < 0:
score = 0
return score
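# Each tip function above adds a penalty of 7, 4, or 2 depending on how far the
# user's metric falls outside the pro's value, and total_score_left_handers
# subtracts the summed penalties from 100, clamped at zero. A self-contained
# sketch of that scoring scheme — the function names and the simplified
# one-sided bands here are illustrative, not this module's API:

```python
def band_penalty(user_value, pro_value):
    """Penalty for one metric, using the same 0.80 / 0.90 / 0.95 bands as above."""
    if user_value < pro_value * 0.80:
        return 7   # significantly off the pro reference
    elif user_value < pro_value * 0.90:
        return 4   # moderately off
    elif user_value < pro_value * 0.95:
        return 2   # slightly off
    return 0       # within tolerance


def total_score(penalties):
    """100 minus all penalties, floored at zero, as in total_score_left_handers."""
    return max(0, 100 - sum(penalties))
```

# For example, a user metric of 70 against a pro value of 100 falls below the
# 0.80 band and costs 7 points, so total_score([7, 4, 2]) gives 87.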
] | null | null | null | #
# Autogenerated by Thrift Compiler (0.13.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import MOSIM.mmi.services.MMIServiceBase
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(MOSIM.mmi.services.MMIServiceBase.Iface):
def TransformToMMI_L(self, transform, coordinateSystem):
"""
Parameters:
- transform
- coordinateSystem
"""
pass
def TransformToMMI(self, transform, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- transform
- firstAxis
- secondAxis
- thirdAxis
"""
pass
def TransformFromMMI_L(self, transform, coordinateSystem):
"""
Parameters:
- transform
- coordinateSystem
"""
pass
def TransformFromMMI(self, transform, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- transform
- firstAxis
- secondAxis
- thirdAxis
"""
pass
def QuaternionToMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
pass
def QuaternionToMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
pass
def QuaternionFromMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
pass
def QuaternionFromMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
pass
def VectorToMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
pass
def VectorToMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
pass
def VectorFromMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
pass
def VectorFromMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
pass
class Client(MOSIM.mmi.services.MMIServiceBase.Client, Iface):
def __init__(self, iprot, oprot=None):
MOSIM.mmi.services.MMIServiceBase.Client.__init__(self, iprot, oprot)
def TransformToMMI_L(self, transform, coordinateSystem):
"""
Parameters:
- transform
- coordinateSystem
"""
self.send_TransformToMMI_L(transform, coordinateSystem)
return self.recv_TransformToMMI_L()
def send_TransformToMMI_L(self, transform, coordinateSystem):
self._oprot.writeMessageBegin('TransformToMMI_L', TMessageType.CALL, self._seqid)
args = TransformToMMI_L_args()
args.transform = transform
args.coordinateSystem = coordinateSystem
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_TransformToMMI_L(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = TransformToMMI_L_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "TransformToMMI_L failed: unknown result")
def TransformToMMI(self, transform, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- transform
- firstAxis
- secondAxis
- thirdAxis
"""
self.send_TransformToMMI(transform, firstAxis, secondAxis, thirdAxis)
return self.recv_TransformToMMI()
def send_TransformToMMI(self, transform, firstAxis, secondAxis, thirdAxis):
self._oprot.writeMessageBegin('TransformToMMI', TMessageType.CALL, self._seqid)
args = TransformToMMI_args()
args.transform = transform
args.firstAxis = firstAxis
args.secondAxis = secondAxis
args.thirdAxis = thirdAxis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_TransformToMMI(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = TransformToMMI_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "TransformToMMI failed: unknown result")
def TransformFromMMI_L(self, transform, coordinateSystem):
"""
Parameters:
- transform
- coordinateSystem
"""
self.send_TransformFromMMI_L(transform, coordinateSystem)
return self.recv_TransformFromMMI_L()
def send_TransformFromMMI_L(self, transform, coordinateSystem):
self._oprot.writeMessageBegin('TransformFromMMI_L', TMessageType.CALL, self._seqid)
args = TransformFromMMI_L_args()
args.transform = transform
args.coordinateSystem = coordinateSystem
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_TransformFromMMI_L(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = TransformFromMMI_L_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "TransformFromMMI_L failed: unknown result")
def TransformFromMMI(self, transform, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- transform
- firstAxis
- secondAxis
- thirdAxis
"""
self.send_TransformFromMMI(transform, firstAxis, secondAxis, thirdAxis)
return self.recv_TransformFromMMI()
def send_TransformFromMMI(self, transform, firstAxis, secondAxis, thirdAxis):
self._oprot.writeMessageBegin('TransformFromMMI', TMessageType.CALL, self._seqid)
args = TransformFromMMI_args()
args.transform = transform
args.firstAxis = firstAxis
args.secondAxis = secondAxis
args.thirdAxis = thirdAxis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_TransformFromMMI(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = TransformFromMMI_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "TransformFromMMI failed: unknown result")
def QuaternionToMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
self.send_QuaternionToMMI_L(quat, coordinateSystem)
return self.recv_QuaternionToMMI_L()
def send_QuaternionToMMI_L(self, quat, coordinateSystem):
self._oprot.writeMessageBegin('QuaternionToMMI_L', TMessageType.CALL, self._seqid)
args = QuaternionToMMI_L_args()
args.quat = quat
args.coordinateSystem = coordinateSystem
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_QuaternionToMMI_L(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = QuaternionToMMI_L_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "QuaternionToMMI_L failed: unknown result")
def QuaternionToMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
self.send_QuaternionToMMI(quat, firstAxis, secondAxis, thirdAxis)
return self.recv_QuaternionToMMI()
def send_QuaternionToMMI(self, quat, firstAxis, secondAxis, thirdAxis):
self._oprot.writeMessageBegin('QuaternionToMMI', TMessageType.CALL, self._seqid)
args = QuaternionToMMI_args()
args.quat = quat
args.firstAxis = firstAxis
args.secondAxis = secondAxis
args.thirdAxis = thirdAxis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_QuaternionToMMI(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = QuaternionToMMI_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "QuaternionToMMI failed: unknown result")
def QuaternionFromMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
self.send_QuaternionFromMMI_L(quat, coordinateSystem)
return self.recv_QuaternionFromMMI_L()
def send_QuaternionFromMMI_L(self, quat, coordinateSystem):
self._oprot.writeMessageBegin('QuaternionFromMMI_L', TMessageType.CALL, self._seqid)
args = QuaternionFromMMI_L_args()
args.quat = quat
args.coordinateSystem = coordinateSystem
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_QuaternionFromMMI_L(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = QuaternionFromMMI_L_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "QuaternionFromMMI_L failed: unknown result")
def QuaternionFromMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
self.send_QuaternionFromMMI(quat, firstAxis, secondAxis, thirdAxis)
return self.recv_QuaternionFromMMI()
def send_QuaternionFromMMI(self, quat, firstAxis, secondAxis, thirdAxis):
self._oprot.writeMessageBegin('QuaternionFromMMI', TMessageType.CALL, self._seqid)
args = QuaternionFromMMI_args()
args.quat = quat
args.firstAxis = firstAxis
args.secondAxis = secondAxis
args.thirdAxis = thirdAxis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_QuaternionFromMMI(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = QuaternionFromMMI_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "QuaternionFromMMI failed: unknown result")
def VectorToMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
self.send_VectorToMMI_L(quat, coordinateSystem)
return self.recv_VectorToMMI_L()
def send_VectorToMMI_L(self, quat, coordinateSystem):
self._oprot.writeMessageBegin('VectorToMMI_L', TMessageType.CALL, self._seqid)
args = VectorToMMI_L_args()
args.quat = quat
args.coordinateSystem = coordinateSystem
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_VectorToMMI_L(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = VectorToMMI_L_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "VectorToMMI_L failed: unknown result")
def VectorToMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
self.send_VectorToMMI(quat, firstAxis, secondAxis, thirdAxis)
return self.recv_VectorToMMI()
def send_VectorToMMI(self, quat, firstAxis, secondAxis, thirdAxis):
self._oprot.writeMessageBegin('VectorToMMI', TMessageType.CALL, self._seqid)
args = VectorToMMI_args()
args.quat = quat
args.firstAxis = firstAxis
args.secondAxis = secondAxis
args.thirdAxis = thirdAxis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_VectorToMMI(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = VectorToMMI_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "VectorToMMI failed: unknown result")
def VectorFromMMI_L(self, quat, coordinateSystem):
"""
Parameters:
- quat
- coordinateSystem
"""
self.send_VectorFromMMI_L(quat, coordinateSystem)
return self.recv_VectorFromMMI_L()
def send_VectorFromMMI_L(self, quat, coordinateSystem):
self._oprot.writeMessageBegin('VectorFromMMI_L', TMessageType.CALL, self._seqid)
args = VectorFromMMI_L_args()
args.quat = quat
args.coordinateSystem = coordinateSystem
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_VectorFromMMI_L(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = VectorFromMMI_L_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "VectorFromMMI_L failed: unknown result")
def VectorFromMMI(self, quat, firstAxis, secondAxis, thirdAxis):
"""
Parameters:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
self.send_VectorFromMMI(quat, firstAxis, secondAxis, thirdAxis)
return self.recv_VectorFromMMI()
def send_VectorFromMMI(self, quat, firstAxis, secondAxis, thirdAxis):
self._oprot.writeMessageBegin('VectorFromMMI', TMessageType.CALL, self._seqid)
args = VectorFromMMI_args()
args.quat = quat
args.firstAxis = firstAxis
args.secondAxis = secondAxis
args.thirdAxis = thirdAxis
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_VectorFromMMI(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = VectorFromMMI_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "VectorFromMMI failed: unknown result")
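# Each generated Client method above follows the same send/recv pair, and the
# Processor class that follows routes incoming calls by method name through a
# _processMap dictionary. A toy sketch of that name-based dispatch pattern —
# ToyProcessor and its methods are hypothetical, not part of this generated file:

```python
class ToyProcessor:
    """Minimal stand-in for the Thrift Processor dispatch table."""

    def __init__(self):
        # Map method names to unbound handler functions, as Processor.__init__ does.
        self._process_map = {"Echo": ToyProcessor.process_echo}

    def process(self, name, payload):
        # Unknown names raise, mirroring the UNKNOWN_METHOD exception path.
        if name not in self._process_map:
            raise KeyError("Unknown function %s" % name)
        # Dispatch: the stored function is called with self passed explicitly.
        return self._process_map[name](self, payload)

    def process_echo(self, payload):
        return payload
```

# ToyProcessor().process("Echo", x) returns x; any other name raises KeyError,
# just as the real Processor replies with a TApplicationException.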


class Processor(MOSIM.mmi.services.MMIServiceBase.Processor, Iface, TProcessor):
    def __init__(self, handler):
        MOSIM.mmi.services.MMIServiceBase.Processor.__init__(self, handler)
        self._processMap["TransformToMMI_L"] = Processor.process_TransformToMMI_L
        self._processMap["TransformToMMI"] = Processor.process_TransformToMMI
        self._processMap["TransformFromMMI_L"] = Processor.process_TransformFromMMI_L
        self._processMap["TransformFromMMI"] = Processor.process_TransformFromMMI
        self._processMap["QuaternionToMMI_L"] = Processor.process_QuaternionToMMI_L
        self._processMap["QuaternionToMMI"] = Processor.process_QuaternionToMMI
        self._processMap["QuaternionFromMMI_L"] = Processor.process_QuaternionFromMMI_L
        self._processMap["QuaternionFromMMI"] = Processor.process_QuaternionFromMMI
        self._processMap["VectorToMMI_L"] = Processor.process_VectorToMMI_L
        self._processMap["VectorToMMI"] = Processor.process_VectorToMMI
        self._processMap["VectorFromMMI_L"] = Processor.process_VectorFromMMI_L
        self._processMap["VectorFromMMI"] = Processor.process_VectorFromMMI
        self._on_message_begin = None

    def on_message_begin(self, func):
        self._on_message_begin = func

    def process(self, iprot, oprot):
        (name, type, seqid) = iprot.readMessageBegin()
        if self._on_message_begin:
            self._on_message_begin(name, type, seqid)
        if name not in self._processMap:
            iprot.skip(TType.STRUCT)
            iprot.readMessageEnd()
            x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
            oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
            x.write(oprot)
            oprot.writeMessageEnd()
            oprot.trans.flush()
            return
        else:
            self._processMap[name](self, seqid, iprot, oprot)
        return True

    def process_TransformToMMI_L(self, seqid, iprot, oprot):
        args = TransformToMMI_L_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = TransformToMMI_L_result()
        try:
            result.success = self._handler.TransformToMMI_L(args.transform, args.coordinateSystem)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("TransformToMMI_L", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_TransformToMMI(self, seqid, iprot, oprot):
        args = TransformToMMI_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = TransformToMMI_result()
        try:
            result.success = self._handler.TransformToMMI(args.transform, args.firstAxis, args.secondAxis, args.thirdAxis)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("TransformToMMI", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_TransformFromMMI_L(self, seqid, iprot, oprot):
        args = TransformFromMMI_L_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = TransformFromMMI_L_result()
        try:
            result.success = self._handler.TransformFromMMI_L(args.transform, args.coordinateSystem)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("TransformFromMMI_L", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_TransformFromMMI(self, seqid, iprot, oprot):
        args = TransformFromMMI_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = TransformFromMMI_result()
        try:
            result.success = self._handler.TransformFromMMI(args.transform, args.firstAxis, args.secondAxis, args.thirdAxis)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("TransformFromMMI", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_QuaternionToMMI_L(self, seqid, iprot, oprot):
        args = QuaternionToMMI_L_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = QuaternionToMMI_L_result()
        try:
            result.success = self._handler.QuaternionToMMI_L(args.quat, args.coordinateSystem)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("QuaternionToMMI_L", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_QuaternionToMMI(self, seqid, iprot, oprot):
        args = QuaternionToMMI_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = QuaternionToMMI_result()
        try:
            result.success = self._handler.QuaternionToMMI(args.quat, args.firstAxis, args.secondAxis, args.thirdAxis)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("QuaternionToMMI", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_QuaternionFromMMI_L(self, seqid, iprot, oprot):
        args = QuaternionFromMMI_L_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = QuaternionFromMMI_L_result()
        try:
            result.success = self._handler.QuaternionFromMMI_L(args.quat, args.coordinateSystem)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("QuaternionFromMMI_L", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_QuaternionFromMMI(self, seqid, iprot, oprot):
        args = QuaternionFromMMI_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = QuaternionFromMMI_result()
        try:
            result.success = self._handler.QuaternionFromMMI(args.quat, args.firstAxis, args.secondAxis, args.thirdAxis)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("QuaternionFromMMI", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_VectorToMMI_L(self, seqid, iprot, oprot):
        args = VectorToMMI_L_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = VectorToMMI_L_result()
        try:
            result.success = self._handler.VectorToMMI_L(args.quat, args.coordinateSystem)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("VectorToMMI_L", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_VectorToMMI(self, seqid, iprot, oprot):
        args = VectorToMMI_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = VectorToMMI_result()
        try:
            result.success = self._handler.VectorToMMI(args.quat, args.firstAxis, args.secondAxis, args.thirdAxis)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("VectorToMMI", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_VectorFromMMI_L(self, seqid, iprot, oprot):
        args = VectorFromMMI_L_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = VectorFromMMI_L_result()
        try:
            result.success = self._handler.VectorFromMMI_L(args.quat, args.coordinateSystem)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("VectorFromMMI_L", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_VectorFromMMI(self, seqid, iprot, oprot):
        args = VectorFromMMI_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = VectorFromMMI_result()
        try:
            result.success = self._handler.VectorFromMMI(args.quat, args.firstAxis, args.secondAxis, args.thirdAxis)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("VectorFromMMI", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
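# The process_* methods above all share one shape: look the method name up in a
# dispatch map, run the handler, and classify exceptions (transport errors are
# re-raised, application errors are echoed back to the caller, anything else is
# wrapped as an opaque internal error). A minimal standalone sketch of that
# pattern follows; the names (AppError, dispatch, echo, boom) are hypothetical
# illustrations, not part of the generated Thrift API.

```python
class AppError(Exception):
    """Stand-in for TApplicationException (illustrative, not the Thrift type)."""


def dispatch(process_map, name, payload):
    # Unknown method name -> error reply, mirroring Processor.process.
    if name not in process_map:
        return ('EXCEPTION', 'Unknown function %s' % name)
    handler = process_map[name]
    try:
        # Success: the handler result becomes the REPLY payload.
        return ('REPLY', handler(payload))
    except AppError as ex:
        # Application-level errors are reported back to the caller as-is.
        return ('EXCEPTION', str(ex))
    except Exception:
        # Anything unexpected is hidden behind a generic internal error.
        return ('EXCEPTION', 'Internal error')


def echo(payload):
    return payload


def boom(payload):
    raise AppError('bad input')


process_map = {'Echo': echo, 'Boom': boom}
```

# Example behavior of the sketch: dispatch(process_map, 'Echo', 3) yields a
# REPLY, 'Boom' yields the application error, and an unregistered name yields
# the unknown-function error.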
# HELPER FUNCTIONS AND STRUCTURES


class TransformToMMI_L_args(object):
    """
    Attributes:
     - transform
     - coordinateSystem

    """

    def __init__(self, transform=None, coordinateSystem=None,):
        self.transform = transform
        self.coordinateSystem = coordinateSystem

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRUCT:
                    self.transform = MOSIM.mmi.math.ttypes.MTransform()
                    self.transform.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.LIST:
                    self.coordinateSystem = []
                    (_etype311, _size308) = iprot.readListBegin()
                    for _i312 in range(_size308):
                        _elem313 = iprot.readI32()
                        self.coordinateSystem.append(_elem313)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformToMMI_L_args')
        if self.transform is not None:
            oprot.writeFieldBegin('transform', TType.STRUCT, 1)
            self.transform.write(oprot)
            oprot.writeFieldEnd()
        if self.coordinateSystem is not None:
            oprot.writeFieldBegin('coordinateSystem', TType.LIST, 2)
            oprot.writeListBegin(TType.I32, len(self.coordinateSystem))
            for iter314 in self.coordinateSystem:
                oprot.writeI32(iter314)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformToMMI_L_args)
TransformToMMI_L_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'transform', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 1
    (2, TType.LIST, 'coordinateSystem', (TType.I32, None, False), None, ),  # 2
)

class TransformToMMI_L_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = MOSIM.mmi.math.ttypes.MTransform()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformToMMI_L_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformToMMI_L_result)
TransformToMMI_L_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 0
)
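# Each generated read() above is the same loop: read (fname, ftype, fid)
# headers until STOP, assign field ids the struct knows about, and skip any
# unknown field so old readers stay compatible with newer writers. A minimal
# pure-Python sketch of that loop follows; read_fields and the string type
# tags are hypothetical stand-ins, not the Thrift wire format.

```python
def read_fields(fields):
    """Decode a sequence of (fid, ftype, value) tuples the way the generated
    read() loops do: known field ids are assigned, unknown ones are skipped."""
    out = {}
    for fid, ftype, value in fields:
        if fid == 1 and ftype == 'STRUCT':
            # Field 1 is the payload struct (e.g. the transform).
            out['transform'] = value
        elif fid == 2 and ftype == 'LIST':
            # Field 2 is the i32 list (e.g. the coordinateSystem).
            out['coordinateSystem'] = list(value)
        else:
            # Corresponds to iprot.skip(ftype): ignore fields this
            # version of the struct does not know about.
            pass
    return out
```

# Feeding it a stream with an extra, unknown field id 9 shows the
# forward-compatibility behavior: the unknown field is silently dropped.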

class TransformToMMI_args(object):
    """
    Attributes:
     - transform
     - firstAxis
     - secondAxis
     - thirdAxis

    """

    def __init__(self, transform=None, firstAxis=None, secondAxis=None, thirdAxis=None,):
        self.transform = transform
        self.firstAxis = firstAxis
        self.secondAxis = secondAxis
        self.thirdAxis = thirdAxis

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRUCT:
                    self.transform = MOSIM.mmi.math.ttypes.MTransform()
                    self.transform.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I32:
                    self.firstAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.I32:
                    self.secondAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            elif fid == 4:
                if ftype == TType.I32:
                    self.thirdAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformToMMI_args')
        if self.transform is not None:
            oprot.writeFieldBegin('transform', TType.STRUCT, 1)
            self.transform.write(oprot)
            oprot.writeFieldEnd()
        if self.firstAxis is not None:
            oprot.writeFieldBegin('firstAxis', TType.I32, 2)
            oprot.writeI32(self.firstAxis)
            oprot.writeFieldEnd()
        if self.secondAxis is not None:
            oprot.writeFieldBegin('secondAxis', TType.I32, 3)
            oprot.writeI32(self.secondAxis)
            oprot.writeFieldEnd()
        if self.thirdAxis is not None:
            oprot.writeFieldBegin('thirdAxis', TType.I32, 4)
            oprot.writeI32(self.thirdAxis)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformToMMI_args)
TransformToMMI_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'transform', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 1
    (2, TType.I32, 'firstAxis', None, None, ),  # 2
    (3, TType.I32, 'secondAxis', None, None, ),  # 3
    (4, TType.I32, 'thirdAxis', None, None, ),  # 4
)

class TransformToMMI_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = MOSIM.mmi.math.ttypes.MTransform()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformToMMI_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformToMMI_result)
TransformToMMI_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 0
)

class TransformFromMMI_L_args(object):
    """
    Attributes:
     - transform
     - coordinateSystem

    """

    def __init__(self, transform=None, coordinateSystem=None,):
        self.transform = transform
        self.coordinateSystem = coordinateSystem

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRUCT:
                    self.transform = MOSIM.mmi.math.ttypes.MTransform()
                    self.transform.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.LIST:
                    self.coordinateSystem = []
                    (_etype318, _size315) = iprot.readListBegin()
                    for _i319 in range(_size315):
                        _elem320 = iprot.readI32()
                        self.coordinateSystem.append(_elem320)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformFromMMI_L_args')
        if self.transform is not None:
            oprot.writeFieldBegin('transform', TType.STRUCT, 1)
            self.transform.write(oprot)
            oprot.writeFieldEnd()
        if self.coordinateSystem is not None:
            oprot.writeFieldBegin('coordinateSystem', TType.LIST, 2)
            oprot.writeListBegin(TType.I32, len(self.coordinateSystem))
            for iter321 in self.coordinateSystem:
                oprot.writeI32(iter321)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformFromMMI_L_args)
TransformFromMMI_L_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'transform', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 1
    (2, TType.LIST, 'coordinateSystem', (TType.I32, None, False), None, ),  # 2
)

class TransformFromMMI_L_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = MOSIM.mmi.math.ttypes.MTransform()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformFromMMI_L_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformFromMMI_L_result)
TransformFromMMI_L_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 0
)

class TransformFromMMI_args(object):
    """
    Attributes:
     - transform
     - firstAxis
     - secondAxis
     - thirdAxis

    """

    def __init__(self, transform=None, firstAxis=None, secondAxis=None, thirdAxis=None,):
        self.transform = transform
        self.firstAxis = firstAxis
        self.secondAxis = secondAxis
        self.thirdAxis = thirdAxis

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRUCT:
                    self.transform = MOSIM.mmi.math.ttypes.MTransform()
                    self.transform.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I32:
                    self.firstAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.I32:
                    self.secondAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            elif fid == 4:
                if ftype == TType.I32:
                    self.thirdAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformFromMMI_args')
        if self.transform is not None:
            oprot.writeFieldBegin('transform', TType.STRUCT, 1)
            self.transform.write(oprot)
            oprot.writeFieldEnd()
        if self.firstAxis is not None:
            oprot.writeFieldBegin('firstAxis', TType.I32, 2)
            oprot.writeI32(self.firstAxis)
            oprot.writeFieldEnd()
        if self.secondAxis is not None:
            oprot.writeFieldBegin('secondAxis', TType.I32, 3)
            oprot.writeI32(self.secondAxis)
            oprot.writeFieldEnd()
        if self.thirdAxis is not None:
            oprot.writeFieldBegin('thirdAxis', TType.I32, 4)
            oprot.writeI32(self.thirdAxis)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformFromMMI_args)
TransformFromMMI_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'transform', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 1
    (2, TType.I32, 'firstAxis', None, None, ),  # 2
    (3, TType.I32, 'secondAxis', None, None, ),  # 3
    (4, TType.I32, 'thirdAxis', None, None, ),  # 4
)

class TransformFromMMI_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = MOSIM.mmi.math.ttypes.MTransform()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('TransformFromMMI_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(TransformFromMMI_result)
TransformFromMMI_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MTransform, None], None, ),  # 0
)

class QuaternionToMMI_L_args(object):
    """
    Attributes:
     - quat
     - coordinateSystem

    """

    def __init__(self, quat=None, coordinateSystem=None,):
        self.quat = quat
        self.coordinateSystem = coordinateSystem

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRUCT:
                    self.quat = MOSIM.mmi.math.ttypes.MQuaternion()
                    self.quat.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.LIST:
                    self.coordinateSystem = []
                    (_etype325, _size322) = iprot.readListBegin()
                    for _i326 in range(_size322):
                        _elem327 = iprot.readI32()
                        self.coordinateSystem.append(_elem327)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('QuaternionToMMI_L_args')
        if self.quat is not None:
            oprot.writeFieldBegin('quat', TType.STRUCT, 1)
            self.quat.write(oprot)
            oprot.writeFieldEnd()
        if self.coordinateSystem is not None:
            oprot.writeFieldBegin('coordinateSystem', TType.LIST, 2)
            oprot.writeListBegin(TType.I32, len(self.coordinateSystem))
            for iter328 in self.coordinateSystem:
                oprot.writeI32(iter328)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(QuaternionToMMI_L_args)
QuaternionToMMI_L_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ),  # 1
    (2, TType.LIST, 'coordinateSystem', (TType.I32, None, False), None, ),  # 2
)

class QuaternionToMMI_L_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = MOSIM.mmi.math.ttypes.MQuaternion()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('QuaternionToMMI_L_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(QuaternionToMMI_L_result)
QuaternionToMMI_L_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ),  # 0
)

class QuaternionToMMI_args(object):
    """
    Attributes:
     - quat
     - firstAxis
     - secondAxis
     - thirdAxis

    """

    def __init__(self, quat=None, firstAxis=None, secondAxis=None, thirdAxis=None,):
        self.quat = quat
        self.firstAxis = firstAxis
        self.secondAxis = secondAxis
        self.thirdAxis = thirdAxis

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRUCT:
                    self.quat = MOSIM.mmi.math.ttypes.MQuaternion()
                    self.quat.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I32:
                    self.firstAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.I32:
                    self.secondAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            elif fid == 4:
                if ftype == TType.I32:
                    self.thirdAxis = iprot.readI32()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('QuaternionToMMI_args')
        if self.quat is not None:
            oprot.writeFieldBegin('quat', TType.STRUCT, 1)
            self.quat.write(oprot)
            oprot.writeFieldEnd()
        if self.firstAxis is not None:
            oprot.writeFieldBegin('firstAxis', TType.I32, 2)
            oprot.writeI32(self.firstAxis)
            oprot.writeFieldEnd()
        if self.secondAxis is not None:
            oprot.writeFieldBegin('secondAxis', TType.I32, 3)
            oprot.writeI32(self.secondAxis)
            oprot.writeFieldEnd()
        if self.thirdAxis is not None:
            oprot.writeFieldBegin('thirdAxis', TType.I32, 4)
            oprot.writeI32(self.thirdAxis)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(QuaternionToMMI_args)
QuaternionToMMI_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ),  # 1
    (2, TType.I32, 'firstAxis', None, None, ),  # 2
    (3, TType.I32, 'secondAxis', None, None, ),  # 3
    (4, TType.I32, 'thirdAxis', None, None, ),  # 4
)
class QuaternionToMMI_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = MOSIM.mmi.math.ttypes.MQuaternion()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('QuaternionToMMI_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(QuaternionToMMI_result)
QuaternionToMMI_result.thrift_spec = (
(0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ), # 0
)
class QuaternionFromMMI_L_args(object):
"""
Attributes:
- quat
- coordinateSystem
"""
def __init__(self, quat=None, coordinateSystem=None,):
self.quat = quat
self.coordinateSystem = coordinateSystem
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.quat = MOSIM.mmi.math.ttypes.MQuaternion()
self.quat.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.LIST:
self.coordinateSystem = []
(_etype332, _size329) = iprot.readListBegin()
for _i333 in range(_size329):
_elem334 = iprot.readI32()
self.coordinateSystem.append(_elem334)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('QuaternionFromMMI_L_args')
if self.quat is not None:
oprot.writeFieldBegin('quat', TType.STRUCT, 1)
self.quat.write(oprot)
oprot.writeFieldEnd()
if self.coordinateSystem is not None:
oprot.writeFieldBegin('coordinateSystem', TType.LIST, 2)
oprot.writeListBegin(TType.I32, len(self.coordinateSystem))
for iter335 in self.coordinateSystem:
oprot.writeI32(iter335)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(QuaternionFromMMI_L_args)
QuaternionFromMMI_L_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ), # 1
(2, TType.LIST, 'coordinateSystem', (TType.I32, None, False), None, ), # 2
)
class QuaternionFromMMI_L_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = MOSIM.mmi.math.ttypes.MQuaternion()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('QuaternionFromMMI_L_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(QuaternionFromMMI_L_result)
QuaternionFromMMI_L_result.thrift_spec = (
(0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ), # 0
)
class QuaternionFromMMI_args(object):
"""
Attributes:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
def __init__(self, quat=None, firstAxis=None, secondAxis=None, thirdAxis=None,):
self.quat = quat
self.firstAxis = firstAxis
self.secondAxis = secondAxis
self.thirdAxis = thirdAxis
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.quat = MOSIM.mmi.math.ttypes.MQuaternion()
self.quat.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.firstAxis = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.secondAxis = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.I32:
self.thirdAxis = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('QuaternionFromMMI_args')
if self.quat is not None:
oprot.writeFieldBegin('quat', TType.STRUCT, 1)
self.quat.write(oprot)
oprot.writeFieldEnd()
if self.firstAxis is not None:
oprot.writeFieldBegin('firstAxis', TType.I32, 2)
oprot.writeI32(self.firstAxis)
oprot.writeFieldEnd()
if self.secondAxis is not None:
oprot.writeFieldBegin('secondAxis', TType.I32, 3)
oprot.writeI32(self.secondAxis)
oprot.writeFieldEnd()
if self.thirdAxis is not None:
oprot.writeFieldBegin('thirdAxis', TType.I32, 4)
oprot.writeI32(self.thirdAxis)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(QuaternionFromMMI_args)
QuaternionFromMMI_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ), # 1
(2, TType.I32, 'firstAxis', None, None, ), # 2
(3, TType.I32, 'secondAxis', None, None, ), # 3
(4, TType.I32, 'thirdAxis', None, None, ), # 4
)
class QuaternionFromMMI_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = MOSIM.mmi.math.ttypes.MQuaternion()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('QuaternionFromMMI_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(QuaternionFromMMI_result)
QuaternionFromMMI_result.thrift_spec = (
(0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MQuaternion, None], None, ), # 0
)
class VectorToMMI_L_args(object):
"""
Attributes:
- quat
- coordinateSystem
"""
def __init__(self, quat=None, coordinateSystem=None,):
self.quat = quat
self.coordinateSystem = coordinateSystem
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.quat = MOSIM.mmi.math.ttypes.MVector3()
self.quat.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.LIST:
self.coordinateSystem = []
(_etype339, _size336) = iprot.readListBegin()
for _i340 in range(_size336):
_elem341 = iprot.readI32()
self.coordinateSystem.append(_elem341)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorToMMI_L_args')
if self.quat is not None:
oprot.writeFieldBegin('quat', TType.STRUCT, 1)
self.quat.write(oprot)
oprot.writeFieldEnd()
if self.coordinateSystem is not None:
oprot.writeFieldBegin('coordinateSystem', TType.LIST, 2)
oprot.writeListBegin(TType.I32, len(self.coordinateSystem))
for iter342 in self.coordinateSystem:
oprot.writeI32(iter342)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorToMMI_L_args)
VectorToMMI_L_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 1
(2, TType.LIST, 'coordinateSystem', (TType.I32, None, False), None, ), # 2
)
class VectorToMMI_L_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = MOSIM.mmi.math.ttypes.MVector3()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorToMMI_L_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorToMMI_L_result)
VectorToMMI_L_result.thrift_spec = (
(0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 0
)
class VectorToMMI_args(object):
"""
Attributes:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
def __init__(self, quat=None, firstAxis=None, secondAxis=None, thirdAxis=None,):
self.quat = quat
self.firstAxis = firstAxis
self.secondAxis = secondAxis
self.thirdAxis = thirdAxis
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.quat = MOSIM.mmi.math.ttypes.MVector3()
self.quat.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.firstAxis = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.secondAxis = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.I32:
self.thirdAxis = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorToMMI_args')
if self.quat is not None:
oprot.writeFieldBegin('quat', TType.STRUCT, 1)
self.quat.write(oprot)
oprot.writeFieldEnd()
if self.firstAxis is not None:
oprot.writeFieldBegin('firstAxis', TType.I32, 2)
oprot.writeI32(self.firstAxis)
oprot.writeFieldEnd()
if self.secondAxis is not None:
oprot.writeFieldBegin('secondAxis', TType.I32, 3)
oprot.writeI32(self.secondAxis)
oprot.writeFieldEnd()
if self.thirdAxis is not None:
oprot.writeFieldBegin('thirdAxis', TType.I32, 4)
oprot.writeI32(self.thirdAxis)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorToMMI_args)
VectorToMMI_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 1
(2, TType.I32, 'firstAxis', None, None, ), # 2
(3, TType.I32, 'secondAxis', None, None, ), # 3
(4, TType.I32, 'thirdAxis', None, None, ), # 4
)
class VectorToMMI_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = MOSIM.mmi.math.ttypes.MVector3()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorToMMI_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorToMMI_result)
VectorToMMI_result.thrift_spec = (
(0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 0
)
class VectorFromMMI_L_args(object):
"""
Attributes:
- quat
- coordinateSystem
"""
def __init__(self, quat=None, coordinateSystem=None,):
self.quat = quat
self.coordinateSystem = coordinateSystem
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.quat = MOSIM.mmi.math.ttypes.MVector3()
self.quat.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.LIST:
self.coordinateSystem = []
(_etype346, _size343) = iprot.readListBegin()
for _i347 in range(_size343):
_elem348 = iprot.readI32()
self.coordinateSystem.append(_elem348)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorFromMMI_L_args')
if self.quat is not None:
oprot.writeFieldBegin('quat', TType.STRUCT, 1)
self.quat.write(oprot)
oprot.writeFieldEnd()
if self.coordinateSystem is not None:
oprot.writeFieldBegin('coordinateSystem', TType.LIST, 2)
oprot.writeListBegin(TType.I32, len(self.coordinateSystem))
for iter349 in self.coordinateSystem:
oprot.writeI32(iter349)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorFromMMI_L_args)
VectorFromMMI_L_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 1
(2, TType.LIST, 'coordinateSystem', (TType.I32, None, False), None, ), # 2
)
class VectorFromMMI_L_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = MOSIM.mmi.math.ttypes.MVector3()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorFromMMI_L_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorFromMMI_L_result)
VectorFromMMI_L_result.thrift_spec = (
(0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 0
)
class VectorFromMMI_args(object):
"""
Attributes:
- quat
- firstAxis
- secondAxis
- thirdAxis
"""
def __init__(self, quat=None, firstAxis=None, secondAxis=None, thirdAxis=None,):
self.quat = quat
self.firstAxis = firstAxis
self.secondAxis = secondAxis
self.thirdAxis = thirdAxis
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.quat = MOSIM.mmi.math.ttypes.MVector3()
self.quat.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.firstAxis = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.secondAxis = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.I32:
self.thirdAxis = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorFromMMI_args')
if self.quat is not None:
oprot.writeFieldBegin('quat', TType.STRUCT, 1)
self.quat.write(oprot)
oprot.writeFieldEnd()
if self.firstAxis is not None:
oprot.writeFieldBegin('firstAxis', TType.I32, 2)
oprot.writeI32(self.firstAxis)
oprot.writeFieldEnd()
if self.secondAxis is not None:
oprot.writeFieldBegin('secondAxis', TType.I32, 3)
oprot.writeI32(self.secondAxis)
oprot.writeFieldEnd()
if self.thirdAxis is not None:
oprot.writeFieldBegin('thirdAxis', TType.I32, 4)
oprot.writeI32(self.thirdAxis)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorFromMMI_args)
VectorFromMMI_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'quat', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 1
(2, TType.I32, 'firstAxis', None, None, ), # 2
(3, TType.I32, 'secondAxis', None, None, ), # 3
(4, TType.I32, 'thirdAxis', None, None, ), # 4
)
class VectorFromMMI_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = MOSIM.mmi.math.ttypes.MVector3()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('VectorFromMMI_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(VectorFromMMI_result)
VectorFromMMI_result.thrift_spec = (
(0, TType.STRUCT, 'success', [MOSIM.mmi.math.ttypes.MVector3, None], None, ), # 0
)
fix_spec(all_structs)
del all_structs
# --- babyai_sr/rl/algos/__init__.py (from thomasaunger/babyai_sr, BSD-3-Clause) ---
from babyai_sr.rl.algos.base import BaseAlgo
from babyai_sr.rl.algos.ppo import PPOAlgo
from babyai_sr.rl.algos.test import TestAlgo
# --- tests/response/test_caching.py (from Colin-b/pyxelrest, MIT) ---
import time
import cachetools
import pytest
from responses import RequestsMock
from tests import loader
def _sent_request(responses: RequestsMock, url: str) -> bool:
for call in responses.calls:
if call.request.url == url:
# Pop out verified request (to be able to check multiple requests)
responses.calls._calls.remove(call)
return call.request
@pytest.fixture
def caching_service(responses: RequestsMock):
responses.add(
responses.GET,
url="http://localhost:8949/",
json={
"swagger": "2.0",
"produces": ["application/json"],
"paths": {
"/cached": {
"parameters": [
{
"description": "",
"in": "query",
"name": "test1",
"required": True,
"type": "string",
},
{
"description": "",
"in": "query",
"name": "test2",
"required": True,
"type": "string",
},
],
"get": {
"operationId": "get_cached",
"responses": {"200": {"description": "return value"}},
},
"post": {
"operationId": "post_cached",
"responses": {"200": {"description": "return value"}},
},
"put": {
"operationId": "put_cached",
"responses": {"200": {"description": "return value"}},
},
"delete": {
"operationId": "delete_cached",
"responses": {"200": {"description": "return value"}},
},
}
},
},
match_querystring=True,
)
def test_get_cached(caching_service, responses: RequestsMock, tmpdir):
generated_functions = loader.load(
tmpdir,
{
"caching": {
"open_api": {"definition": "http://localhost:8949/"},
"formulas": {
"dynamic_array": {
"lock_excel": True,
"cache": {"duration": 5},
}
},
}
},
)
responses.add(
responses.GET,
url="http://localhost:8949/cached?test1=1&test2=2",
json={},
match_querystring=True,
)
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache for this request
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
responses.add(
responses.GET,
url="http://localhost:8949/cached?test1=1&test2=3",
json={},
match_querystring=True,
)
assert generated_functions.caching_get_cached(test1="1", test2="3") == [[""]]
# Assert a request is issued as there is no cache for this request
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=3")
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert no request is issued as there is a cache for this request
assert not _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
# Wait for cache to be out of date
time.sleep(6)
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache for this request
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert no request is issued as there is a cache for this request
assert not _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
def test_post_cached(caching_service, responses: RequestsMock, tmpdir):
generated_functions = loader.load(
tmpdir,
{
"caching": {
"open_api": {"definition": "http://localhost:8949/"},
"formulas": {
"dynamic_array": {
"lock_excel": True,
"cache": {"duration": 5},
}
},
}
},
)
responses.add(
responses.POST,
url="http://localhost:8949/cached?test1=1&test2=2",
json={},
match_querystring=True,
)
assert generated_functions.caching_post_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
responses.add(
responses.POST,
url="http://localhost:8949/cached?test1=1&test2=3",
json={},
match_querystring=True,
)
assert generated_functions.caching_post_cached(test1="1", test2="3") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=3")
assert generated_functions.caching_post_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
time.sleep(5)
assert generated_functions.caching_post_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
assert generated_functions.caching_post_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
def test_put_cached(caching_service, responses: RequestsMock, tmpdir):
generated_functions = loader.load(
tmpdir,
{
"caching": {
"open_api": {"definition": "http://localhost:8949/"},
"formulas": {
"dynamic_array": {
"lock_excel": True,
"cache": {"duration": 5},
}
},
}
},
)
responses.add(
responses.PUT,
url="http://localhost:8949/cached?test1=1&test2=2",
json={},
match_querystring=True,
)
assert generated_functions.caching_put_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
responses.add(
responses.PUT,
url="http://localhost:8949/cached?test1=1&test2=3",
json={},
match_querystring=True,
)
assert generated_functions.caching_put_cached(test1="1", test2="3") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=3")
assert generated_functions.caching_put_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
time.sleep(5)
assert generated_functions.caching_put_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
assert generated_functions.caching_put_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
def test_delete_cached(caching_service, responses: RequestsMock, tmpdir):
generated_functions = loader.load(
tmpdir,
{
"caching": {
"open_api": {"definition": "http://localhost:8949/"},
"formulas": {
"dynamic_array": {
"lock_excel": True,
"cache": {"duration": 5},
}
},
}
},
)
responses.add(
responses.DELETE,
url="http://localhost:8949/cached?test1=1&test2=2",
json={},
match_querystring=True,
)
assert generated_functions.caching_delete_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
responses.add(
responses.DELETE,
url="http://localhost:8949/cached?test1=1&test2=3",
json={},
match_querystring=True,
)
assert generated_functions.caching_delete_cached(test1="1", test2="3") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=3")
assert generated_functions.caching_delete_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
time.sleep(5)
assert generated_functions.caching_delete_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
assert generated_functions.caching_delete_cached(test1="1", test2="2") == [[""]]
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
def test_get_cached_negative_duration(caching_service, responses: RequestsMock, tmpdir):
generated_functions = loader.load(
tmpdir,
{
"caching": {
"open_api": {"definition": "http://localhost:8949/"},
"formulas": {
"dynamic_array": {
"lock_excel": True,
"cache": {"duration": -5},
}
},
}
},
)
responses.add(
responses.GET,
url="http://localhost:8949/cached?test1=1&test2=2",
json={},
match_querystring=True,
)
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
def test_get_cached_invalid_duration(caching_service, responses: RequestsMock, tmpdir):
generated_functions = loader.load(
tmpdir,
{
"caching": {
"open_api": {"definition": "http://localhost:8949/"},
"formulas": {
"dynamic_array": {
"lock_excel": True,
"cache": {"duration": "not a number"},
}
},
}
},
)
responses.add(
responses.GET,
url="http://localhost:8949/cached?test1=1&test2=2",
json={},
match_querystring=True,
)
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
def test_get_cached_without_cachetools(
caching_service, responses: RequestsMock, tmpdir, monkeypatch
):
def fail_import(*args, **kwargs):
raise ImportError
monkeypatch.setattr(cachetools, "TTLCache", fail_import)
generated_functions = loader.load(
tmpdir,
{
"caching": {
"open_api": {"definition": "http://localhost:8949/"},
"formulas": {
"dynamic_array": {
"lock_excel": True,
"cache": {"duration": 5},
}
},
}
},
)
responses.add(
responses.GET,
url="http://localhost:8949/cached?test1=1&test2=2",
json={},
match_querystring=True,
)
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
assert generated_functions.caching_get_cached(test1="1", test2="2") == [[""]]
# Assert a request is issued as there is no cache
assert _sent_request(responses, "http://localhost:8949/cached?test1=1&test2=2")
# test.py (eric-wieser/generatorify, MIT license)
from unittest import mock
from collections import namedtuple
import weakref
import pytest
import generatorify
class Error(namedtuple('Error', 'e')):
def _key(self):
return (type(self.e), self.e.args)
def __eq__(self, other):
if not isinstance(other, Error):
return NotImplemented
return self._key() == other._key()
def __ne__(self, other):
if not isinstance(other, Error):
return NotImplemented
return self._key() != other._key()
class Value(namedtuple('Value', 'v')):
def _key(self):
return self.v
def __eq__(self, other):
if not isinstance(other, Value):
return NotImplemented
return self._key() == other._key()
def __ne__(self, other):
if not isinstance(other, Value):
return NotImplemented
return self._key() != other._key()
# verify the helpers work
assert Value(1) == Value(1)
assert Value(1) != Value(2)
assert Error(ValueError()) == Error(ValueError())
assert Error(ValueError()) != Error(TypeError())
assert Value(1) != Error(1)
assert not (Value(1) == Error(1))
def result_of(f, *args, **kwargs):
try:
return Value(f(*args, **kwargs))
except BaseException as e:
return Error(e)
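A quick standalone illustration of the `result_of` pattern (simplified: plain namedtuples here, whereas the real helpers above add custom `__eq__`/`__ne__`):

```python
from collections import namedtuple

# Wrap either the return value or the raised exception, so both
# outcomes of a call can be captured and compared with ==.
Value = namedtuple("Value", "v")
Error = namedtuple("Error", "e")

def result_of(f, *args, **kwargs):
    try:
        return Value(f(*args, **kwargs))
    except BaseException as e:
        return Error(e)

assert result_of(int, "5") == Value(5)
outcome = result_of(int, "nope")      # int("nope") raises ValueError
assert isinstance(outcome, Error)
assert isinstance(outcome.e, ValueError)
```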
class ComparisonTest:
@pytest.fixture
def g_m(self):
return mock.MagicMock()
@pytest.fixture
def g(self, g_m):
g = self.generator(g_m)
assert g_m.method_calls == []
return g
@pytest.fixture
def c_m(self):
return mock.MagicMock()
@pytest.fixture
def c_direct(self, c_m):
c = generatorify.generator_from_callback(lambda yield_: self.callback(c_m, yield_))
assert c_m.method_calls == []
return c
@pytest.fixture
def c_roundtrip(self, c_m):
callback = generatorify.callback_from_generator(lambda: self.generator(c_m))
c = generatorify.generator_from_callback(callback)
assert c_m.method_calls == []
return c
@pytest.fixture(params=['direct', 'roundtrip'])
def c(self, request):
return request.getfixturevalue("c_{}".format(request.param))
class TestNext(ComparisonTest):
""" Test that next behaves just like a generator """
@staticmethod
def generator(m):
m.one()
yield 1
m.two()
yield 2
m.three()
@staticmethod
def callback(m, yield_):
m.one()
yield_(1)
m.two()
yield_(2)
m.three()
def test(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Value(2)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
class TestReturn(ComparisonTest):
""" Test that return behaves just like a generator """
@staticmethod
def generator(m):
m.one()
yield 1
m.two()
return 2
@staticmethod
def callback(m, yield_):
m.one()
yield_(1)
m.two()
return 2
def test(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration(2,))
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
class TestReturnImmediately(ComparisonTest):
""" Test that return behaves just like a generator """
@staticmethod
def generator(m):
m.one()
return 2
yield # pragma: no cover
@staticmethod
def callback(m, yield_):
m.one()
return 2
def test(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Error(StopIteration(2,))
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
class TestRaise(ComparisonTest):
""" Test that raise behaves just like a generator """
@staticmethod
def generator(m):
m.one()
yield 1
m.two()
raise ValueError
@staticmethod
def callback(m, yield_):
m.one()
yield_(1)
m.two()
raise ValueError
def test(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(ValueError())
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
class TestReceive(ComparisonTest):
""" Test that send behaves just like a generator """
@staticmethod
def generator(m):
m.one()
ret1 = yield 1
m.two(ret1)
ret2 = yield 2
m.three(ret2)
@staticmethod
def callback(m, yield_):
m.one()
ret1 = yield_(1)
m.two(ret1)
ret2 = yield_(2)
m.three(ret2)
def test_send(self, c, g, c_m, g_m):
# first send call must be non-none
assert result_of(g.send, "not none") == result_of(c.send, "not none") == Error(TypeError(mock.ANY))
assert g_m.method_calls == c_m.method_calls == []
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.send, 'a') == result_of(c.send, 'a') == Value(2)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.send, 'b') == result_of(c.send, 'b') == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
def test_throw_before_first(self, c, g, c_m, g_m):
assert result_of(g.throw, ValueError) == result_of(c.throw, ValueError) == Error(ValueError())
assert g_m.method_calls == c_m.method_calls == []
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls == []
def test_close_before_first(self, c, g, c_m, g_m):
assert result_of(g.close) == result_of(c.close) == Value(None)
assert g_m.method_calls == c_m.method_calls == []
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls == []
def test_throw(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.throw, ValueError) == result_of(c.throw, ValueError) == Error(ValueError())
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
def test_close(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.close) == result_of(c.close) == Value(None)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
class TestCatchAndContinue(ComparisonTest):
""" Test that catch behaves just like a generator """
@staticmethod
def generator(m):
m.one()
try:
yield 1
except BaseException as e:
m.two(Error(e)) # Error(e) makes equality work in the test
# not caught
yield 2
m.three()
@staticmethod
def callback(m, yield_):
m.one()
# caught and continued
try:
yield_(1)
except BaseException as e:
m.two(Error(e)) # Error(e) makes equality work in the test
yield_(2)
m.three()
def test_throw(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.throw, ValueError) == result_of(c.throw, ValueError) == Value(2)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
def test_close(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
# different messages, wildcard goes in the middle
assert result_of(g.close) == Error(RuntimeError(mock.ANY)) == result_of(c.close)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
class TestCatchAndRaise(ComparisonTest):
""" Test that catch behaves just like a generator """
@staticmethod
def generator(m):
m.one()
try:
yield 1
except BaseException as e:
m.two(Error(e)) # Error(e) makes equality work in the test
raise OSError()
@staticmethod
def callback(m, yield_):
m.one()
# caught and continued
try:
yield_(1)
except BaseException as e:
m.two(Error(e)) # Error(e) makes equality work in the test
raise OSError()
def test_throw(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
        assert g_m.method_calls == c_m.method_calls
assert result_of(g.throw, ValueError) == result_of(c.throw, ValueError) == Error(OSError())
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
def test_close(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.close) == result_of(c.close) == Error(OSError())
assert g_m.method_calls == c_m.method_calls
assert result_of(g.close) == result_of(c.close) == Value(None)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
class TestCatchAndReturn(ComparisonTest):
@staticmethod
def generator(m):
m.one()
try:
yield 1
except BaseException as e:
m.two(Error(e)) # Error(e) makes equality work in the test
return 3
@staticmethod
def callback(m, yield_):
m.one()
# caught and continued
try:
yield_(1)
except BaseException as e:
m.two(Error(e)) # Error(e) makes equality work in the test
return 3
def test_throw(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.throw, ValueError) == result_of(c.throw, ValueError) == Error(StopIteration(3,))
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
def test_close(self, c, g, c_m, g_m):
assert result_of(next, g) == result_of(next, c) == Value(1)
assert g_m.method_calls == c_m.method_calls
assert result_of(g.close) == result_of(c.close) == Value(None)
assert g_m.method_calls == c_m.method_calls
assert result_of(next, g) == result_of(next, c) == Error(StopIteration())
assert g_m.method_calls == c_m.method_calls
# no fixture here, they make checking reference too hard
def test_no_circular_references():
def invoke_with_values(f):
f(1)
f(2) # pragma: no cover
f(3) # pragma: no cover
values_iter = generatorify.generator_from_callback(
lambda yield_: invoke_with_values(yield_)
)
next(values_iter)
wr = weakref.ref(values_iter)
del values_iter
assert wr() is None
if __name__ == '__main__':
pytest.main([__file__, '-vv', '-s']) # pragma: no cover
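For reference, the plain-generator semantics these comparison tests pin down can be seen in miniature (nothing here is specific to generatorify):

```python
def gen():
    got = yield 1      # value passed in via send() lands here
    return got * 2     # return value rides on StopIteration.value

g = gen()
assert next(g) == 1    # advance to the first yield
try:
    g.send(5)          # resume; the generator returns, raising StopIteration
except StopIteration as stop:
    assert stop.value == 10
else:
    raise AssertionError("expected StopIteration")
```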
# queueman/__init__.py (ludeeus/taskfactory, MIT license)
"""Initialize QueueManager."""
from queueman.manager import QueueManager
from queueman.exceptions import (
QueueManagerBaseException,
QueueManagerExecutionStillInProgress,
)
from queueman.helper import iscoroutine
from queueman.decorator import concurrent
# src/models.py (puzzlepcs/pygcn, MIT license)
import math
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn.parameter import Parameter
from data.data_helper import BasicDataset
from src.layers import GraphConvolution
class BasicModel(nn.Module):
def __init__(self):
super(BasicModel, self).__init__()
def topology_loss(self, neg):
raise NotImplementedError
class LightGCN(BasicModel):
def __init__(self,
config: dict,
dataset: BasicDataset):
super(LightGCN, self).__init__()
self.config = config
self.dataset: BasicDataset = dataset
self.__init_weight()
def __init_weight(self):
self.num_nodes = self.dataset.n_nodes # number of nodes
self.num_hyperedges = self.dataset.m_hyperedges # number of hyperedges
self.output_dim = self.dataset.num_class # number of classes
self.num_layers = self.config['num_layers'] # number of layers
self.latent_dim = self.config['latent_dim'] # dimensionality of embeddings
# Node & Hyperedge embeddings
self.embedding_node = torch.nn.Embedding(
num_embeddings=self.num_nodes, embedding_dim=self.latent_dim,
device=self.config["device"]
)
self.embedding_hyperedge = torch.nn.Embedding(
num_embeddings=self.num_hyperedges, embedding_dim=self.latent_dim,
device=self.config["device"]
)
# initialize node embedding & hyperedge embedding
nn.init.normal_(self.embedding_node.weight, std=0.1)
nn.init.normal_(self.embedding_hyperedge.weight, std=0.1)
# Final layer for classification
self.linear_layer = torch.nn.Linear(
in_features=self.latent_dim, out_features=self.dataset.num_class,
bias=True, device=self.config["device"]
)
# Graph structure
self.Graph = self.dataset.getSparseGraph().to(self.config["device"])
self.Inc = torch.transpose(
self.dataset.getIncidenceMatrix(), 0, 1
).to(self.config["device"])
def computer(self):
"""
propagation method (Similar to LightGCN)
"""
node_emb = self.embedding_node.weight
hyperedge_emb = self.embedding_hyperedge.weight
all_emb = torch.cat([node_emb, hyperedge_emb])
embs = [all_emb]
for layer in range(self.num_layers):
all_emb = torch.spmm(self.Graph, all_emb)
embs.append(all_emb)
embs = torch.stack(embs, dim=1)
out = torch.mean(embs, dim=1)
nodes, hyperedges = torch.split(out, [self.num_nodes, self.num_hyperedges])
return nodes, hyperedges
def forward(self):
do = self.config["dropout"]
nodes, hyperedges = self.computer()
Z = self.linear_layer(nodes)
Z = F.dropout(Z, do, training=self.training)
return F.log_softmax(Z, dim=1)
def topology_loss(self, neg):
n, m = self.dataset.n_nodes, self.dataset.m_hyperedges
inc = self.Inc
num_negatives = self.config["num_negatives"]
node_emb, hyperedge_emb = self.computer()
total_loss = 0.0
for idx in range(m):
vector_src = hyperedge_emb[idx].expand(1, -1)
pos_idx = inc[idx]._indices()[0, :]
neg_idx = neg[idx]._indices()[0, :]
if pos_idx.shape[0] == 0:
continue
vectors_pos = node_emb[pos_idx]
vectors_neg = node_emb[neg_idx]
pos_scores = torch.sigmoid(torch.mm(vector_src, vectors_pos.T))
pos_scores = torch.min(pos_scores, dim=1).values
neg_scores = torch.sigmoid(torch.mm(vector_src, vectors_neg.T))
neg_scores = torch.reshape(neg_scores, (num_negatives, -1))
neg_scores = torch.min(neg_scores, dim=1).values
assert pos_scores.shape[0] * num_negatives == neg_scores.shape[0]
scores = torch.cat([pos_scores, neg_scores])
labels = torch.cat([torch.ones_like(pos_scores), torch.zeros_like(neg_scores)])
            # scores were already passed through sigmoid above, so use plain
            # BCE (the with_logits variant would apply sigmoid a second time)
            loss = F.binary_cross_entropy(scores, labels)
            total_loss += loss
return total_loss / n
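The layer-averaged propagation in `computer()` above can be sketched without torch; plain nested lists stand in for tensors, and an identity matrix for the normalized adjacency (all names and values here are illustrative only):

```python
def matmul(A, X):
    # dense stand-in for torch.spmm
    n, d = len(X), len(X[0])
    return [[sum(A[i][k] * X[k][j] for k in range(n)) for j in range(d)]
            for i in range(n)]

A = [[1.0, 0.0], [0.0, 1.0]]        # stand-in normalized adjacency (2 nodes)
emb = [[0.5, 0.5], [1.0, 1.0]]      # stand-in 2-dim initial embeddings
embs = [emb]
for _ in range(3):                  # num_layers propagation steps
    emb = matmul(A, emb)
    embs.append(emb)
# mean over the layer axis, as in torch.mean(torch.stack(embs, 1), 1)
out = [[sum(layer[i][j] for layer in embs) / len(embs) for j in range(2)]
       for i in range(2)]
assert out == [[0.5, 0.5], [1.0, 1.0]]  # identity adjacency leaves embeddings unchanged
```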
class GCN(BasicModel):
def __init__(self,
config : dict,
dataset: BasicDataset):
super(GCN, self).__init__()
self.config = config
self.dataset: BasicDataset = dataset
self.__init_weight()
def __init_weight(self):
l = self.config["num_layers"]
d, c = self.dataset.input_dim, self.dataset.num_class
# Dimensionality of hidden dimensions
h = [d]
for i in range(l-1):
power = l - i + 2
if self.config["data"] == 'citeseer': power = l - i + 4
h.append(2**power)
h.append(c)
self.layers = nn.ModuleList(
[GraphConvolution(h[i], h[i+1]).to(self.config["device"]) for i in range(l)]
)
self.Graph = self.dataset.getSparseGraph().to(self.config["device"])
self.Inc = torch.transpose(
self.dataset.getIncidenceMatrix(), 0, 1
).to(self.config["device"])
self.X = self.dataset.getFeatures().to(self.config["device"])
def forward(self):
do = self.config["dropout"]
l = self.config["num_layers"]
X = self.X
for i, layer in enumerate(self.layers):
X = F.relu(layer(X, self.Graph))
if i < l-1:
X = F.dropout(X, do, training=self.training)
return F.log_softmax(X, dim=1)
def computer(self):
l = self.config["num_layers"]
n, m = self.dataset.n_nodes, self.dataset.m_hyperedges
X = self.X
for i, layer in enumerate(self.layers):
if i == l:
break
X = F.relu(layer(X, self.Graph))
nodes, hyperedges = torch.split(X, [n, m])
return nodes, hyperedges
def topology_loss(self, neg):
n, m = self.dataset.n_nodes, self.dataset.m_hyperedges
inc = self.Inc
num_negatives = self.config["num_negatives"]
node_emb, hyperedge_emb = self.computer()
total_loss = 0.0
for idx in range(m):
vector_src = hyperedge_emb[idx].expand(1, -1)
pos_idx = inc[idx]._indices()[0, :]
neg_idx = neg[idx]._indices()[0, :]
if pos_idx.shape[0] == 0:
continue
vectors_pos = node_emb[pos_idx]
vectors_neg = node_emb[neg_idx]
pos_scores = torch.sigmoid(torch.mm(vector_src, vectors_pos.T))
pos_scores = torch.min(pos_scores, dim=1).values
neg_scores = torch.sigmoid(torch.mm(vector_src, vectors_neg.T))
neg_scores = torch.reshape(neg_scores, (num_negatives, -1))
neg_scores = torch.min(neg_scores, dim=1).values
assert pos_scores.shape[0] * num_negatives == neg_scores.shape[0]
            scores = torch.cat([pos_scores, neg_scores])
            labels = torch.cat([torch.ones_like(pos_scores), torch.zeros_like(neg_scores)])
            # scores were already passed through sigmoid above, so use plain
            # BCE (the with_logits variant would apply sigmoid a second time)
            loss = F.binary_cross_entropy(scores, labels)
            total_loss += loss
return total_loss / n
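The per-hyperedge scoring that these `topology_loss` implementations share can be sketched with plain floats (torch ops replaced by `math`; the dot-product values are illustrative):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bce(p, label):
    # binary cross-entropy on a probability that is already sigmoided
    eps = 1e-12
    return -(label * math.log(p + eps) + (1 - label) * math.log(1 - p + eps))

# hyperedge-vs-node scores, min-pooled so the weakest member dominates
pos = min(sigmoid(s) for s in (2.0, 3.0))    # true member nodes
neg = min(sigmoid(s) for s in (-1.0, 0.5))   # sampled negative nodes
loss = bce(pos, 1.0) + bce(neg, 0.0)
assert 0.0 < loss < 10.0
```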
class MyGCN(BasicModel):
def __init__(self, config: dict, dataset:BasicDataset):
super(MyGCN, self).__init__()
self.config = config
self.dataset = dataset
self.__init_weight()
def __init_weight(self):
h = self.config["latent_dim"]
device = self.config["device"]
l = self.config["num_layers"]
        d, c = self.dataset.input_dim, self.dataset.num_class
# Layer for dimensionality reduction (d -> h)
self.reduction_layer = torch.nn.Linear(
in_features=d, out_features=h, bias=True, device=device
)
# GCN layers (h -> h)
self.layers = nn.ModuleList(
[GraphConvolution(h, h).to(device) for _ in range(l)]
)
        # Layer for classification (h -> c)
        self.classification_layer = torch.nn.Linear(
            in_features=h, out_features=c, bias=True, device=device
        )
self.Graph = self.dataset.getSparseGraph().to(device)
self.Inc = torch.transpose(
self.dataset.getIncidenceMatrix(), 0, 1
).to(self.config["device"])
self.X = self.dataset.getFeatures().to(self.config["device"])
def computer(self):
do = self.config["dropout"]
n, m = self.dataset.n_nodes, self.dataset.m_hyperedges
all_emb = self.X
all_emb = self.reduction_layer(all_emb)
all_emb = F.dropout(all_emb, do, training=self.training)
embs = [all_emb]
for i, layer in enumerate(self.layers):
all_emb = F.relu(layer(all_emb, self.Graph))
all_emb = F.dropout(all_emb, do, training=self.training)
embs.append(all_emb)
embs = torch.stack(embs, dim=1)
out = torch.mean(embs, dim=1)
nodes, hyperedges = torch.split(out, [n, m])
return nodes, hyperedges
def forward(self):
do = self.config["dropout"]
nodes, _ = self.computer()
Z = self.classification_layer(nodes)
Z = F.dropout(Z, do, training=self.training)
return F.log_softmax(Z, dim=1)
def topology_loss(self, neg):
n, m = self.dataset.n_nodes, self.dataset.m_hyperedges
inc = self.Inc
num_negatives = self.config["num_negatives"]
node_emb, hyperedge_emb = self.computer()
total_loss = 0.0
for idx in range(m):
vector_src = hyperedge_emb[idx].expand(1, -1)
pos_idx = inc[idx]._indices()[0, :]
neg_idx = neg[idx]._indices()[0, :]
if pos_idx.shape[0] == 0:
continue
vectors_pos = node_emb[pos_idx]
vectors_neg = node_emb[neg_idx]
pos_scores = torch.sigmoid(torch.mm(vector_src, vectors_pos.T))
pos_scores = torch.min(pos_scores, dim=1).values
neg_scores = torch.sigmoid(torch.mm(vector_src, vectors_neg.T))
neg_scores = torch.reshape(neg_scores, (num_negatives, -1))
neg_scores = torch.min(neg_scores, dim=1).values
assert pos_scores.shape[0] * num_negatives == neg_scores.shape[0]
scores = torch.cat([pos_scores, neg_scores])
labels = torch.cat([torch.ones_like(pos_scores), torch.zeros_like(neg_scores)])
            # scores were already passed through sigmoid above, so use plain
            # BCE (the with_logits variant would apply sigmoid a second time)
            loss = F.binary_cross_entropy(scores, labels)
total_loss += loss
return total_loss / n
class MyLightGCN(BasicModel):
def __init__(self,
config: dict,
dataset: BasicDataset):
super(MyLightGCN, self).__init__()
self.config = config
self.dataset: BasicDataset = dataset
self.__init_weight()
def __init_weight(self):
h = self.config["latent_dim"]
device = self.config["device"]
d, c = self.dataset.input_dim, self.dataset.num_class
# Layer for dimensionality reduction
self.reduction_layer = torch.nn.Linear(
in_features=d, out_features=h, bias=True, device=device
)
# Final layer for classification
self.classification_layer = torch.nn.Linear(
in_features=h, out_features=c, bias=True, device=device
)
# Graph structure
self.X = self.dataset.getFeatures().to(device)
self.Graph = self.dataset.getSparseGraph().to(device)
self.Inc = torch.transpose(self.dataset.getIncidenceMatrix(), 0, 1).to(device)
def computer(self):
"""
propagation method (Similar to LightGCN)
"""
l = self.config["num_layers"]
do = self.config["dropout"]
n, m = self.dataset.n_nodes, self.dataset.m_hyperedges
all_emb = self.reduction_layer(self.X)
all_emb = F.dropout(all_emb, do, training=self.training)
embs = [all_emb]
for layer in range(l):
all_emb = torch.spmm(self.Graph, all_emb)
embs.append(all_emb)
embs = torch.stack(embs, dim=1)
out = torch.mean(embs, dim=1)
nodes, hyperedges = torch.split(out, [n, m])
return nodes, hyperedges
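The layer loop in `computer` is just repeated sparse multiplication with the graph matrix, followed by a mean over the outputs of every layer (layer 0 included). On a toy dense 2-node graph (hypothetical adjacency and normalization, no PyTorch assumed):

```python
def propagate(adj, emb, num_layers):
    """LightGCN-style propagation: repeated (here dense) matrix product,
    then average the embeddings from every layer, layer 0 included."""
    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
                 for j in range(len(b[0]))] for i in range(len(a))]
    embs = [emb]
    for _ in range(num_layers):
        emb = matmul(adj, emb)
        embs.append(emb)
    n, d = len(emb), len(emb[0])
    return [[sum(e[i][j] for e in embs) / len(embs) for j in range(d)]
            for i in range(n)]

# Symmetric 2-node "graph" (toy normalized adjacency) and identity features
adj = [[0.5, 0.5], [0.5, 0.5]]
emb = [[1.0, 0.0], [0.0, 1.0]]
out = propagate(adj, emb, num_layers=2)
```

With two layers the smoothed output mixes each node's own feature with its neighbor's, while the layer-0 term keeps the original signal from washing out.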
def forward(self):
do = self.config["dropout"]
nodes, _ = self.computer()
Z = self.classification_layer(nodes)
Z = F.dropout(Z, do, training=self.training)
return F.log_softmax(Z, dim=1)
def topology_loss(self, neg):
n, m = self.dataset.n_nodes, self.dataset.m_hyperedges
inc = self.Inc
num_negatives = self.config["num_negatives"]
node_emb, hyperedge_emb = self.computer()
total_loss = 0.0
for idx in range(m):
vector_src = hyperedge_emb[idx].expand(1, -1)
pos_idx = inc[idx]._indices()[0, :]
neg_idx = neg[idx]._indices()[0, :]
if pos_idx.shape[0] == 0:
continue
vectors_pos = node_emb[pos_idx]
vectors_neg = node_emb[neg_idx]
# Keep the scores as raw logits: min-pooling is unaffected (sigmoid is
# monotonic) and binary_cross_entropy_with_logits applies the sigmoid
# itself, so wrapping these in torch.sigmoid would apply it twice.
pos_scores = torch.mm(vector_src, vectors_pos.T)
pos_scores = torch.min(pos_scores, dim=1).values
neg_scores = torch.mm(vector_src, vectors_neg.T)
neg_scores = torch.reshape(neg_scores, (num_negatives, -1))
neg_scores = torch.min(neg_scores, dim=1).values
assert pos_scores.shape[0] * num_negatives == neg_scores.shape[0]
scores = torch.cat([pos_scores, neg_scores])
labels = torch.cat([torch.ones_like(pos_scores), torch.zeros_like(neg_scores)])
loss = F.binary_cross_entropy_with_logits(scores, labels)
total_loss += loss
# Average over the m hyperedges iterated above, not the n nodes.
return total_loss / m
# controllers/example_controller/example_controller.py
# (repo: eljueves/competition-simulator, license: MIT)
from sr.robot import *
R = Robot()
print("I see {} things".format(len(R.see())))
# motor board 0, channel 0 to half power forward
R.motors[0].m0.power = 50
# motor board 0, channel 1 to half power forward
R.motors[0].m1.power = 50
# sleep for 1 second
R.sleep(1)
# motor board 0, channel 0 to stopped
R.motors[0].m0.power = 0
# motor board 0, channel 1 to stopped
R.motors[0].m1.power = 0
# sleep for 2 seconds
R.sleep(2)
# motor board 0, channel 0 to half power backward
R.motors[0].m0.power = -50
# motor board 0, channel 1 to half power forward
R.motors[0].m1.power = 50
# sleep for 0.75 seconds
R.sleep(0.75)
# motor board 0, channel 0 to half power forward
R.motors[0].m0.power = 50
# motor board 0, channel 1 to half power forward
R.motors[0].m1.power = 50
# sleep for 1 second
R.sleep(1)
# motor board 0, channel 0 to stopped
R.motors[0].m0.power = 0
# motor board 0, channel 1 to stopped
R.motors[0].m1.power = 0
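The repeated set-power/sleep/stop pattern above can be factored into a small helper. The sketch below uses a hypothetical stand-in for `sr.robot.Robot` so it runs without the simulator; with the real API you would pass the actual `Robot()` instance instead:

```python
class FakeMotor:
    """Hypothetical stand-in for one motor channel."""
    def __init__(self):
        self.power = 0

class FakeBoard:
    def __init__(self):
        self.m0 = FakeMotor()
        self.m1 = FakeMotor()

class FakeRobot:
    """Minimal stand-in for sr.robot.Robot (illustration only)."""
    def __init__(self):
        self.motors = [FakeBoard()]
        self.log = []  # record sleep durations instead of waiting
    def sleep(self, secs):
        self.log.append(secs)

def drive(robot, left, right, duration):
    """Set both channel powers, wait, then stop both motors."""
    robot.motors[0].m0.power = left
    robot.motors[0].m1.power = right
    robot.sleep(duration)
    robot.motors[0].m0.power = 0
    robot.motors[0].m1.power = 0

R = FakeRobot()
drive(R, 50, 50, 1)      # forward for 1 second
drive(R, -50, 50, 0.75)  # spin for 0.75 seconds
```

Collapsing each movement into one call keeps the controller script readable as a sequence of maneuvers rather than raw register writes.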
# tests/parser/bug.70.bk.test.py
# (repo: veltri/DLV2, license: Apache-2.0)
input = """
% Locations:
loc(dropOff).
loc(distCenter).
loc(truck).
%loc(dest).
% Packages:
pkg(pidVar).
"""
output = """
% Locations:
loc(dropOff).
loc(distCenter).
loc(truck).
%loc(dest).
% Packages:
pkg(pidVar).
"""
# sdk/python/pulumi_azure/consumption/_inputs.py
# (repo: aangelisc/pulumi-azure, license: Apache-2.0)
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = [
'BudgetResourceGroupFilterArgs',
'BudgetResourceGroupFilterDimensionArgs',
'BudgetResourceGroupFilterNotArgs',
'BudgetResourceGroupFilterNotDimensionArgs',
'BudgetResourceGroupFilterNotTagArgs',
'BudgetResourceGroupFilterTagArgs',
'BudgetResourceGroupNotificationArgs',
'BudgetResourceGroupTimePeriodArgs',
'BudgetSubscriptionFilterArgs',
'BudgetSubscriptionFilterDimensionArgs',
'BudgetSubscriptionFilterNotArgs',
'BudgetSubscriptionFilterNotDimensionArgs',
'BudgetSubscriptionFilterNotTagArgs',
'BudgetSubscriptionFilterTagArgs',
'BudgetSubscriptionNotificationArgs',
'BudgetSubscriptionTimePeriodArgs',
]
@pulumi.input_type
class BudgetResourceGroupFilterArgs:
def __init__(__self__, *,
dimensions: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterDimensionArgs']]]] = None,
not_: Optional[pulumi.Input['BudgetResourceGroupFilterNotArgs']] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterTagArgs']]]] = None):
"""
:param pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterDimensionArgs']]] dimensions: One or more `dimension` blocks as defined below to filter the budget on.
:param pulumi.Input['BudgetResourceGroupFilterNotArgs'] not_: A `not` block as defined below to filter the budget on.
:param pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterTagArgs']]] tags: One or more `tag` blocks as defined below to filter the budget on.
"""
if dimensions is not None:
pulumi.set(__self__, "dimensions", dimensions)
if not_ is not None:
pulumi.set(__self__, "not_", not_)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter
def dimensions(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterDimensionArgs']]]]:
"""
One or more `dimension` blocks as defined below to filter the budget on.
"""
return pulumi.get(self, "dimensions")
@dimensions.setter
def dimensions(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterDimensionArgs']]]]):
pulumi.set(self, "dimensions", value)
@property
@pulumi.getter(name="not")
def not_(self) -> Optional[pulumi.Input['BudgetResourceGroupFilterNotArgs']]:
"""
A `not` block as defined below to filter the budget on.
"""
return pulumi.get(self, "not_")
@not_.setter
def not_(self, value: Optional[pulumi.Input['BudgetResourceGroupFilterNotArgs']]):
pulumi.set(self, "not_", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterTagArgs']]]]:
"""
One or more `tag` blocks as defined below to filter the budget on.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetResourceGroupFilterTagArgs']]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class BudgetResourceGroupFilterDimensionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the column to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
:param pulumi.Input[str] operator: The operator to use for comparison. The allowed values are `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The allowed values are `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetResourceGroupFilterNotArgs:
def __init__(__self__, *,
dimension: Optional[pulumi.Input['BudgetResourceGroupFilterNotDimensionArgs']] = None,
tag: Optional[pulumi.Input['BudgetResourceGroupFilterNotTagArgs']] = None):
"""
:param pulumi.Input['BudgetResourceGroupFilterNotDimensionArgs'] dimension: One `dimension` block as defined below to filter the budget on. Conflicts with `tag`.
:param pulumi.Input['BudgetResourceGroupFilterNotTagArgs'] tag: One `tag` block as defined below to filter the budget on. Conflicts with `dimension`.
"""
if dimension is not None:
pulumi.set(__self__, "dimension", dimension)
if tag is not None:
pulumi.set(__self__, "tag", tag)
@property
@pulumi.getter
def dimension(self) -> Optional[pulumi.Input['BudgetResourceGroupFilterNotDimensionArgs']]:
"""
One `dimension` block as defined below to filter the budget on. Conflicts with `tag`.
"""
return pulumi.get(self, "dimension")
@dimension.setter
def dimension(self, value: Optional[pulumi.Input['BudgetResourceGroupFilterNotDimensionArgs']]):
pulumi.set(self, "dimension", value)
@property
@pulumi.getter
def tag(self) -> Optional[pulumi.Input['BudgetResourceGroupFilterNotTagArgs']]:
"""
One `tag` block as defined below to filter the budget on. Conflicts with `dimension`.
"""
return pulumi.get(self, "tag")
@tag.setter
def tag(self, value: Optional[pulumi.Input['BudgetResourceGroupFilterNotTagArgs']]):
pulumi.set(self, "tag", value)
@pulumi.input_type
class BudgetResourceGroupFilterNotDimensionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the column to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
:param pulumi.Input[str] operator: The operator to use for comparison. The allowed values are `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The allowed values are `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetResourceGroupFilterNotTagArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the tag to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the tag.
:param pulumi.Input[str] operator: The operator to use for comparison. The allowed values are `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the tag to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the tag.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The allowed values are `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetResourceGroupFilterTagArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the tag to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the tag.
:param pulumi.Input[str] operator: The operator to use for comparison. The allowed values are `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the tag to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the tag.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The allowed values are `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetResourceGroupNotificationArgs:
def __init__(__self__, *,
operator: pulumi.Input[str],
threshold: pulumi.Input[int],
contact_emails: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
contact_groups: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
contact_roles: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
enabled: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] operator: The comparison operator for the notification. Must be one of `EqualTo`, `GreaterThan`, or `GreaterThanOrEqualTo`.
:param pulumi.Input[int] threshold: Threshold value associated with a notification. Notification is sent when the cost exceeded the threshold. It is always percent and has to be between 0 and 1000.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_emails: Specifies a list of email addresses to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_groups: Specifies a list of Action Group IDs to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_roles: Specifies a list of contact roles to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[bool] enabled: Should the notification be enabled?
"""
pulumi.set(__self__, "operator", operator)
pulumi.set(__self__, "threshold", threshold)
if contact_emails is not None:
pulumi.set(__self__, "contact_emails", contact_emails)
if contact_groups is not None:
pulumi.set(__self__, "contact_groups", contact_groups)
if contact_roles is not None:
pulumi.set(__self__, "contact_roles", contact_roles)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
@property
@pulumi.getter
def operator(self) -> pulumi.Input[str]:
"""
The comparison operator for the notification. Must be one of `EqualTo`, `GreaterThan`, or `GreaterThanOrEqualTo`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: pulumi.Input[str]):
pulumi.set(self, "operator", value)
@property
@pulumi.getter
def threshold(self) -> pulumi.Input[int]:
"""
Threshold value associated with a notification. Notification is sent when the cost exceeded the threshold. It is always percent and has to be between 0 and 1000.
"""
return pulumi.get(self, "threshold")
@threshold.setter
def threshold(self, value: pulumi.Input[int]):
pulumi.set(self, "threshold", value)
@property
@pulumi.getter(name="contactEmails")
def contact_emails(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Specifies a list of email addresses to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_emails")
@contact_emails.setter
def contact_emails(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_emails", value)
@property
@pulumi.getter(name="contactGroups")
def contact_groups(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Specifies a list of Action Group IDs to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_groups")
@contact_groups.setter
def contact_groups(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_groups", value)
@property
@pulumi.getter(name="contactRoles")
def contact_roles(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Specifies a list of contact roles to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_roles")
@contact_roles.setter
def contact_roles(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_roles", value)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Should the notification be enabled?
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@pulumi.input_type
class BudgetResourceGroupTimePeriodArgs:
def __init__(__self__, *,
start_date: pulumi.Input[str],
end_date: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] start_date: The start date for the budget. The start date must be first of the month and should be less than the end date. Budget start date must be on or after June 1, 2017. Future start date should not be more than twelve months. Past start date should be selected within the timegrain period. Changing this forces a new Resource Group Consumption Budget to be created.
:param pulumi.Input[str] end_date: The end date for the budget. If not set this will be 10 years after the start date.
"""
pulumi.set(__self__, "start_date", start_date)
if end_date is not None:
pulumi.set(__self__, "end_date", end_date)
@property
@pulumi.getter(name="startDate")
def start_date(self) -> pulumi.Input[str]:
"""
The start date for the budget. The start date must be first of the month and should be less than the end date. Budget start date must be on or after June 1, 2017. Future start date should not be more than twelve months. Past start date should be selected within the timegrain period. Changing this forces a new Resource Group Consumption Budget to be created.
"""
return pulumi.get(self, "start_date")
@start_date.setter
def start_date(self, value: pulumi.Input[str]):
pulumi.set(self, "start_date", value)
@property
@pulumi.getter(name="endDate")
def end_date(self) -> Optional[pulumi.Input[str]]:
"""
The end date for the budget. If not set this will be 10 years after the start date.
"""
return pulumi.get(self, "end_date")
@end_date.setter
def end_date(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "end_date", value)
@pulumi.input_type
class BudgetSubscriptionFilterArgs:
def __init__(__self__, *,
dimensions: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterDimensionArgs']]]] = None,
not_: Optional[pulumi.Input['BudgetSubscriptionFilterNotArgs']] = None,
tags: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterTagArgs']]]] = None):
"""
:param pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterDimensionArgs']]] dimensions: One or more `dimension` blocks as defined below to filter the budget on.
:param pulumi.Input['BudgetSubscriptionFilterNotArgs'] not_: A `not` block as defined below to filter the budget on.
:param pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterTagArgs']]] tags: One or more `tag` blocks as defined below to filter the budget on.
"""
if dimensions is not None:
pulumi.set(__self__, "dimensions", dimensions)
if not_ is not None:
pulumi.set(__self__, "not_", not_)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter
def dimensions(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterDimensionArgs']]]]:
"""
One or more `dimension` blocks as defined below to filter the budget on.
"""
return pulumi.get(self, "dimensions")
@dimensions.setter
def dimensions(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterDimensionArgs']]]]):
pulumi.set(self, "dimensions", value)
@property
@pulumi.getter(name="not")
def not_(self) -> Optional[pulumi.Input['BudgetSubscriptionFilterNotArgs']]:
"""
A `not` block as defined below to filter the budget on.
"""
return pulumi.get(self, "not_")
@not_.setter
def not_(self, value: Optional[pulumi.Input['BudgetSubscriptionFilterNotArgs']]):
pulumi.set(self, "not_", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterTagArgs']]]]:
"""
One or more `tag` blocks as defined below to filter the budget on.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetSubscriptionFilterTagArgs']]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class BudgetSubscriptionFilterDimensionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the column to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
:param pulumi.Input[str] operator: The operator to use for comparison. The allowed values are `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The allowed values are `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetSubscriptionFilterNotArgs:
def __init__(__self__, *,
dimension: Optional[pulumi.Input['BudgetSubscriptionFilterNotDimensionArgs']] = None,
tag: Optional[pulumi.Input['BudgetSubscriptionFilterNotTagArgs']] = None):
"""
:param pulumi.Input['BudgetSubscriptionFilterNotDimensionArgs'] dimension: One `dimension` block as defined below to filter the budget on. Conflicts with `tag`.
:param pulumi.Input['BudgetSubscriptionFilterNotTagArgs'] tag: One `tag` block as defined below to filter the budget on. Conflicts with `dimension`.
"""
if dimension is not None:
pulumi.set(__self__, "dimension", dimension)
if tag is not None:
pulumi.set(__self__, "tag", tag)
@property
@pulumi.getter
def dimension(self) -> Optional[pulumi.Input['BudgetSubscriptionFilterNotDimensionArgs']]:
"""
One `dimension` block as defined below to filter the budget on. Conflicts with `tag`.
"""
return pulumi.get(self, "dimension")
@dimension.setter
def dimension(self, value: Optional[pulumi.Input['BudgetSubscriptionFilterNotDimensionArgs']]):
pulumi.set(self, "dimension", value)
@property
@pulumi.getter
def tag(self) -> Optional[pulumi.Input['BudgetSubscriptionFilterNotTagArgs']]:
"""
One `tag` block as defined below to filter the budget on. Conflicts with `dimension`.
"""
return pulumi.get(self, "tag")
@tag.setter
def tag(self, value: Optional[pulumi.Input['BudgetSubscriptionFilterNotTagArgs']]):
pulumi.set(self, "tag", value)
@pulumi.input_type
class BudgetSubscriptionFilterNotDimensionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the column to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
:param pulumi.Input[str] operator: The operator to use for comparison. The allowed values are `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the column to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the column. The allowed values are `ChargeType`, `Frequency`, `InvoiceId`, `Meter`, `MeterCategory`, `MeterSubCategory`, `PartNumber`, `PricingModel`, `Product`, `ProductOrderId`, `ProductOrderName`, `PublisherType`, `ReservationId`, `ReservationName`, `ResourceGroupName`, `ResourceGuid`, `ResourceId`, `ResourceLocation`, `ResourceType`, `ServiceFamily`, `ServiceName`, `UnitOfMeasure`.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The only allowed value is `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetSubscriptionFilterNotTagArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the tag to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the tag.
:param pulumi.Input[str] operator: The operator to use for comparison. The only allowed value is `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the tag to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the tag.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The only allowed value is `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetSubscriptionFilterTagArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
values: pulumi.Input[Sequence[pulumi.Input[str]]],
operator: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The name of the tag to use for the filter.
:param pulumi.Input[Sequence[pulumi.Input[str]]] values: Specifies a list of values for the tag.
:param pulumi.Input[str] operator: The operator to use for comparison. The only allowed value is `In`.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "values", values)
if operator is not None:
pulumi.set(__self__, "operator", operator)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the tag to use for the filter.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def values(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Specifies a list of values for the tag.
"""
return pulumi.get(self, "values")
@values.setter
def values(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "values", value)
@property
@pulumi.getter
def operator(self) -> Optional[pulumi.Input[str]]:
"""
The operator to use for comparison. The only allowed value is `In`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "operator", value)
@pulumi.input_type
class BudgetSubscriptionNotificationArgs:
def __init__(__self__, *,
operator: pulumi.Input[str],
threshold: pulumi.Input[int],
contact_emails: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
contact_groups: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
contact_roles: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
enabled: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] operator: The comparison operator for the notification. Must be one of `EqualTo`, `GreaterThan`, or `GreaterThanOrEqualTo`.
:param pulumi.Input[int] threshold: Threshold value associated with a notification. The notification is sent when the cost exceeds the threshold. The value is always a percentage and must be between 0 and 1000.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_emails: Specifies a list of email addresses to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_groups: Specifies a list of Action Group IDs to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[Sequence[pulumi.Input[str]]] contact_roles: Specifies a list of contact roles to send the budget notification to when the threshold is exceeded.
:param pulumi.Input[bool] enabled: Should the notification be enabled?
"""
pulumi.set(__self__, "operator", operator)
pulumi.set(__self__, "threshold", threshold)
if contact_emails is not None:
pulumi.set(__self__, "contact_emails", contact_emails)
if contact_groups is not None:
pulumi.set(__self__, "contact_groups", contact_groups)
if contact_roles is not None:
pulumi.set(__self__, "contact_roles", contact_roles)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
@property
@pulumi.getter
def operator(self) -> pulumi.Input[str]:
"""
The comparison operator for the notification. Must be one of `EqualTo`, `GreaterThan`, or `GreaterThanOrEqualTo`.
"""
return pulumi.get(self, "operator")
@operator.setter
def operator(self, value: pulumi.Input[str]):
pulumi.set(self, "operator", value)
@property
@pulumi.getter
def threshold(self) -> pulumi.Input[int]:
"""
Threshold value associated with a notification. The notification is sent when the cost exceeds the threshold. The value is always a percentage and must be between 0 and 1000.
"""
return pulumi.get(self, "threshold")
@threshold.setter
def threshold(self, value: pulumi.Input[int]):
pulumi.set(self, "threshold", value)
@property
@pulumi.getter(name="contactEmails")
def contact_emails(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Specifies a list of email addresses to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_emails")
@contact_emails.setter
def contact_emails(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_emails", value)
@property
@pulumi.getter(name="contactGroups")
def contact_groups(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Specifies a list of Action Group IDs to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_groups")
@contact_groups.setter
def contact_groups(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_groups", value)
@property
@pulumi.getter(name="contactRoles")
def contact_roles(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Specifies a list of contact roles to send the budget notification to when the threshold is exceeded.
"""
return pulumi.get(self, "contact_roles")
@contact_roles.setter
def contact_roles(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "contact_roles", value)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Should the notification be enabled?
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
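The `operator` and `threshold` docstrings above define when a budget notification fires: the threshold is a percentage in the 0-1000 range compared against actual cost with one of three operators. A minimal standalone sketch of that comparison logic (the helper name `notification_fires` is hypothetical and not part of the Pulumi SDK; Azure performs this evaluation server-side):

```python
def notification_fires(operator: str, threshold: float, percent_of_budget: float) -> bool:
    """Evaluate a budget notification rule.

    `operator` is one of the allowed values documented above; `threshold`
    and `percent_of_budget` are percentages in the 0-1000 range.
    """
    if not 0 <= threshold <= 1000:
        raise ValueError("threshold must be between 0 and 1000")
    if operator == "EqualTo":
        return percent_of_budget == threshold
    if operator == "GreaterThan":
        return percent_of_budget > threshold
    if operator == "GreaterThanOrEqualTo":
        return percent_of_budget >= threshold
    raise ValueError("unknown operator: %s" % operator)


print(notification_fires("GreaterThan", 90, 95))  # cost at 95% of a 90% threshold
print(notification_fires("GreaterThan", 90, 85))  # cost still below the threshold
```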
@pulumi.input_type
class BudgetSubscriptionTimePeriodArgs:
def __init__(__self__, *,
start_date: pulumi.Input[str],
end_date: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] start_date: The start date for the budget. The start date must be the first of the month and must be earlier than the end date. The budget start date must be on or after June 1, 2017; a future start date must not be more than twelve months ahead, and a past start date must fall within the current time grain period. Changing this forces a new Subscription Consumption Budget to be created.
:param pulumi.Input[str] end_date: The end date for the budget. If not set, this will default to 10 years after the start date.
"""
pulumi.set(__self__, "start_date", start_date)
if end_date is not None:
pulumi.set(__self__, "end_date", end_date)
@property
@pulumi.getter(name="startDate")
def start_date(self) -> pulumi.Input[str]:
"""
The start date for the budget. The start date must be the first of the month and must be earlier than the end date. The budget start date must be on or after June 1, 2017; a future start date must not be more than twelve months ahead, and a past start date must fall within the current time grain period. Changing this forces a new Subscription Consumption Budget to be created.
"""
return pulumi.get(self, "start_date")
@start_date.setter
def start_date(self, value: pulumi.Input[str]):
pulumi.set(self, "start_date", value)
@property
@pulumi.getter(name="endDate")
def end_date(self) -> Optional[pulumi.Input[str]]:
"""
The end date for the budget. If not set, this will default to 10 years after the start date.
"""
return pulumi.get(self, "end_date")
@end_date.setter
def end_date(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "end_date", value)
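The time-period docstrings above impose several constraints on `start_date`. A stdlib sketch of the date checks (the helper name and the `MIN_START` constant are assumptions for illustration; the Azure API performs its own validation server-side):

```python
from datetime import date


def validate_budget_start_date(start: date) -> None:
    # Sketch of the documented constraints: first of the month, and on or
    # after June 1, 2017 (MIN_START mirrors that rule from the docstring).
    MIN_START = date(2017, 6, 1)
    if start.day != 1:
        raise ValueError("start date must be the first of the month")
    if start < MIN_START:
        raise ValueError("start date must be on or after June 1, 2017")


validate_budget_start_date(date(2022, 3, 1))  # passes silently
```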
# ---------------------------------------------------------------------------
# File: src/python/simple_schedule_domain.py
# Repo: ml4ai/tomcat-planrec (MIT license)
# ---------------------------------------------------------------------------
# actions
def go_to_school(state):
state["went-to-school"] = 1
return state
def go_to_work(state):
state["went-to-work"] = 1
return state
def do_chores(state):
state["did-chores"] = 1
return state
def do_homework(state):
state["did-homework"] = 1
state["have-homework"] = 0
return state
def stay_for_tutoring(state):
state["stayed-for-tutoring"] = 1
return state
def go_running(state):
state["ran"] = 1
return state
def play_videogames(state):
state["played-videogames"] = 1
return state
def go_to_store(state):
state["went-to-store"] = 1
state["need-groceries"] = 0
return state
def watch_movie(state):
state["watched-movie"] = 1
state["found-movie"] = 0
return state
actions = [
go_to_school,
go_to_work,
do_chores,
do_homework,
stay_for_tutoring,
go_running,
play_videogames,
go_to_store,
watch_movie,
]
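Each action above is a transition function that mutates and returns a plain state dict, so executing a day's plan is just applying the functions in order. The sketch below duplicates two of the action functions so it runs standalone; `apply_plan` is a hypothetical helper, not part of the domain file:

```python
def go_to_school(state):
    state["went-to-school"] = 1
    return state


def do_homework(state):
    state["did-homework"] = 1
    state["have-homework"] = 0
    return state


def apply_plan(state, plan):
    # Apply each action in order; every action mutates and returns the dict.
    for action in plan:
        state = action(state)
    return state


state = {"have-homework": 1}
state = apply_plan(state, [go_to_school, do_homework])
print(state["have-homework"])  # -> 0
```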
# preconditions
def default(state):
return True
def work_raining_homework(state):
if state["raining"]:
if state["have-homework"]:
if state["work-today"]:
return True
return False
def raining_homework(state):
if state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
return True
return False
def work_homework(state):
if not state["raining"]:
if state["have-homework"]:
if state["work-today"]:
return True
return False
def homework(state):
if not state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
return True
return False
def work_raining(state):
if state["raining"]:
if not state["have-homework"]:
if state["work-today"]:
return True
return False
def raining(state):
if state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
return True
return False
def work(state):
if not state["raining"]:
if not state["have-homework"]:
if state["work-today"]:
return True
return False
def no_work(state):
if not state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
return True
return False
def work_raining_homework_store(state):
if state["raining"]:
if state["have-homework"]:
if state["work-today"]:
if state["need-groceries"]:
return True
return False
def work_raining_homework_no_store(state):
if state["raining"]:
if state["have-homework"]:
if state["work-today"]:
if not state["need-groceries"]:
return True
return False
def raining_homework_store(state):
if state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if state["need-groceries"]:
return True
return False
def raining_homework_no_store(state):
if state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if not state["need-groceries"]:
return True
return False
def work_homework_store(state):
if not state["raining"]:
if state["have-homework"]:
if state["work-today"]:
if state["need-groceries"]:
return True
return False
def work_homework_no_store(state):
if not state["raining"]:
if state["have-homework"]:
if state["work-today"]:
if not state["need-groceries"]:
return True
return False
def homework_store(state):
if not state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if state["need-groceries"]:
return True
return False
def homework_no_store(state):
if not state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if not state["need-groceries"]:
return True
return False
def work_raining_store(state):
if state["raining"]:
if not state["have-homework"]:
if state["work-today"]:
if state["need-groceries"]:
return True
return False
def work_raining_no_store(state):
if state["raining"]:
if not state["have-homework"]:
if state["work-today"]:
if not state["need-groceries"]:
return True
return False
def raining_store(state):
if state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if state["need-groceries"]:
return True
return False
def raining_no_store(state):
if state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if not state["need-groceries"]:
return True
return False
def work_store(state):
if not state["raining"]:
if not state["have-homework"]:
if state["work-today"]:
if state["need-groceries"]:
return True
return False
def work_no_store(state):
if not state["raining"]:
if not state["have-homework"]:
if state["work-today"]:
if not state["need-groceries"]:
return True
return False
def no_work_store(state):
if not state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if state["need-groceries"]:
return True
return False
def no_work_no_store(state):
if not state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if not state["need-groceries"]:
return True
return False
def raining_homework_movie(state):
if state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if state["found-movie"]:
return True
return False
def raining_homework_no_movie(state):
if state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if not state["found-movie"]:
return True
return False
def homework_movie(state):
if not state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if state["found-movie"]:
return True
return False
def homework_no_movie(state):
if not state["raining"]:
if state["have-homework"]:
if not state["work-today"]:
if not state["found-movie"]:
return True
return False
def raining_movie(state):
if state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if state["found-movie"]:
return True
return False
def raining_no_movie(state):
if state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if not state["found-movie"]:
return True
return False
def no_work_movie(state):
if not state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if state["found-movie"]:
return True
return False
def no_work_no_movie(state):
if not state["raining"]:
if not state["have-homework"]:
if not state["work-today"]:
if not state["found-movie"]:
return True
return False
def work_raining_friday(state):
if state["raining"]:
if state["work-today"]:
return True
return False
def raining_friday(state):
if state["raining"]:
if not state["work-today"]:
return True
return False
def work_friday(state):
if not state["raining"]:
if state["work-today"]:
return True
return False
def no_work_friday(state):
if not state["raining"]:
if not state["work-today"]:
return True
return False
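Every precondition above is a conjunction of boolean flag tests written as nested `if` statements. An equivalent, more idiomatic form returns the boolean expression directly; the sketch below shows both forms for one precondition and checks that they agree on all flag combinations (the `_flat` variant is an illustrative rewrite, not part of the domain file):

```python
from itertools import product


def work_raining_homework_nested(state):
    # Nested-if form, as written in the domain above.
    if state["raining"]:
        if state["have-homework"]:
            if state["work-today"]:
                return True
    return False


def work_raining_homework_flat(state):
    # Equivalent single-expression form.
    return bool(state["raining"] and state["have-homework"] and state["work-today"])


# The two forms agree on every combination of the three flags.
for r, h, w in product([0, 1], repeat=3):
    s = {"raining": r, "have-homework": h, "work-today": w}
    assert work_raining_homework_nested(s) == work_raining_homework_flat(s)
```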
# methods
methods = [
{
"task": "P",
"preconditions": default,
"subtasks": ["MONDAY"],
"t_prob": 0.2,
},
{
"task": "P",
"preconditions": default,
"subtasks": ["TUESDAY"],
"t_prob": 0.2,
},
{
"task": "P",
"preconditions": default,
"subtasks": ["WEDNESDAY"],
"t_prob": 0.2,
},
{
"task": "P",
"preconditions": default,
"subtasks": ["THURSDAY"],
"t_prob": 0.2,
},
{
"task": "P",
"preconditions": default,
"subtasks": ["FRIDAY"],
"t_prob": 0.2,
},
{
"task": "MONDAY",
"preconditions": work_raining_homework,
"subtasks": [
"!go_to_school",
"!go_to_work",
"!do_chores",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "MONDAY",
"preconditions": raining_homework,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!do_chores",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "MONDAY",
"preconditions": work_homework,
"subtasks": [
"!go_to_school",
"!go_to_work",
"!go_running",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "MONDAY",
"preconditions": homework,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_running",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "MONDAY",
"preconditions": work_raining,
"subtasks": [
"!go_to_school",
"!go_to_work",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "MONDAY",
"preconditions": raining,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "MONDAY",
"preconditions": work,
"subtasks": [
"!go_to_school",
"!go_to_work",
"!go_running",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "MONDAY",
"preconditions": no_work,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_running",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_raining_homework_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!go_to_store",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_raining_homework_no_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!do_homework",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": raining_homework_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_to_store",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": raining_homework_no_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!do_homework",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_homework_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!go_to_store",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_homework_no_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!do_homework",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": homework_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_to_store",
"!do_homework",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": homework_no_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!do_homework",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_raining_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!go_to_store",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_raining_no_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": raining_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_to_store",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": raining_no_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!go_to_store",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": work_no_store,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!go_running",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": no_work_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_to_store",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "TUESDAY",
"preconditions": no_work_no_store,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_running",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": work_raining_homework,
"subtasks": [
"!go_to_school",
"!do_homework",
"!go_to_work",
"!do_chores",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": raining_homework,
"subtasks": [
"!go_to_school",
"!do_homework",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": work_homework,
"subtasks": [
"!go_to_school",
"!do_homework",
"!go_to_work",
"!do_chores",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": homework,
"subtasks": [
"!go_to_school",
"!do_homework",
"!go_running",
"!do_chores",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": work_raining,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_to_work",
"!do_chores",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": raining,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!play_videogames",
"!do_chores",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": work,
"subtasks": [
"!go_to_school",
"!go_to_work",
"!go_running",
"!do_chores",
],
"t_prob": 1,
},
{
"task": "WEDNESDAY",
"preconditions": no_work,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!go_running",
"!do_chores",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": work_raining_homework,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!do_homework",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": raining_homework_movie,
"subtasks": [
"!go_to_school",
"!do_homework",
"!do_chores",
"!watch_movie",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": raining_homework_no_movie,
"subtasks": [
"!go_to_school",
"!do_homework",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": work_homework,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!do_homework",
"!go_running",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": homework_movie,
"subtasks": [
"!go_to_school",
"!do_homework",
"!go_running",
"!watch_movie",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": homework_no_movie,
"subtasks": [
"!go_to_school",
"!do_homework",
"!go_running",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": work_raining,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!stay_for_tutoring",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": raining_movie,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!play_videogames",
"!watch_movie",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": raining_no_movie,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": work,
"subtasks": [
"!go_to_work",
"!go_to_school",
"!go_running",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": no_work_movie,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!play_videogames",
"!watch_movie",
],
"t_prob": 1,
},
{
"task": "THURSDAY",
"preconditions": no_work_no_movie,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "FRIDAY",
"preconditions": work_raining_friday,
"subtasks": ["!do_chores", "!play_videogames", "!go_to_work"],
"t_prob": 1,
},
{
"task": "FRIDAY",
"preconditions": raining_friday,
"subtasks": [
"!go_to_school",
"!stay_for_tutoring",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
{
"task": "FRIDAY",
"preconditions": work_friday,
"subtasks": ["!play_videogames", "!go_running", "!go_to_work"],
"t_prob": 1,
},
{
"task": "FRIDAY",
"preconditions": no_work_friday,
"subtasks": [
"!go_to_school",
"!go_running",
"!do_chores",
"!play_videogames",
],
"t_prob": 1,
},
]
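A method in the table above is applicable when its `task` matches and its `preconditions` hold in the current state; an HTN planner then replaces the task with the method's `subtasks`. A minimal standalone decomposition sketch (`decompose` is a hypothetical helper; the mini table copies two of the MONDAY entries from above):

```python
def raining(state):
    return state["raining"] and not state["have-homework"] and not state["work-today"]


def work(state):
    return not state["raining"] and not state["have-homework"] and state["work-today"]


mini_methods = [
    {"task": "MONDAY", "preconditions": raining,
     "subtasks": ["!go_to_school", "!stay_for_tutoring", "!do_chores", "!play_videogames"]},
    {"task": "MONDAY", "preconditions": work,
     "subtasks": ["!go_to_school", "!go_to_work", "!go_running", "!play_videogames"]},
]


def decompose(task, state, methods):
    # Return the subtasks of the first applicable method, as an HTN planner would.
    for m in methods:
        if m["task"] == task and m["preconditions"](state):
            return m["subtasks"]
    return None


state = {"raining": True, "have-homework": False, "work-today": False}
print(decompose("MONDAY", state, mini_methods))
```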
| 22.679266 | 71 | 0.464835 | 1,982 | 21,001 | 4.672048 | 0.027245 | 0.035421 | 0.073434 | 0.050756 | 0.934665 | 0.919546 | 0.915443 | 0.896544 | 0.861879 | 0.769222 | 0 | 0.005508 | 0.394838 | 21,001 | 925 | 72 | 22.703784 | 0.72311 | 0.001381 | 0 | 0.743652 | 0 | 0 | 0.286389 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055623 | false | 0 | 0 | 0.001209 | 0.154776 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
50ec2d013896eeae3ddd11f861971979a65066e6 | 437 | py | Python | tests/api/test_dataset.py | deeplearninc/auger-ai | b50af35e8ea28b528ec233a2f4a8d4e412059be9 | [
"MIT"
] | null | null | null | tests/api/test_dataset.py | deeplearninc/auger-ai | b50af35e8ea28b528ec233a2f4a8d4e412059be9 | [
"MIT"
] | 25 | 2019-07-09T04:26:19.000Z | 2020-07-21T06:43:25.000Z | tests/api/test_dataset.py | deeplearninc/auger-ai | b50af35e8ea28b528ec233a2f4a8d4e412059be9 | [
"MIT"
] | 1 | 2019-07-09T15:19:13.000Z | 2019-07-09T15:19:13.000Z | import pytest
class TestDataSetAPI:
@pytest.mark.skip(reason='make it work first')
def test_list(self):
assert False
@pytest.mark.skip(reason='make it work first')
def test_create(self):
assert False
@pytest.mark.skip(reason='make it work first')
def test_delete(self):
assert False
@pytest.mark.skip(reason='make it work first')
def test_select(self):
assert False
# ---------------------------------------------------------------------------
# File: fatiando/gravmag/prism.py
# Repo: silky/fatiando (BSD-3-Clause license)
# ---------------------------------------------------------------------------
r"""
Calculate the potential fields of the 3D right rectangular prism.
.. note:: All input units are SI. Output is in conventional units: SI for the
gravitatonal potential, mGal for gravity, Eotvos for gravity gradients, nT
for magnetic total field anomalies.
.. note:: The coordinate system of the input parameters is x -> North,
y -> East and z -> Down.
**Gravity**
The gravitational fields are calculated using the formula of Nagy et al.
(2000). Available functions are:
* :func:`~fatiando.gravmag.prism.potential`
* :func:`~fatiando.gravmag.prism.gx`
* :func:`~fatiando.gravmag.prism.gy`
* :func:`~fatiando.gravmag.prism.gz`
* :func:`~fatiando.gravmag.prism.gxx`
* :func:`~fatiando.gravmag.prism.gxy`
* :func:`~fatiando.gravmag.prism.gxz`
* :func:`~fatiando.gravmag.prism.gyy`
* :func:`~fatiando.gravmag.prism.gyz`
* :func:`~fatiando.gravmag.prism.gzz`
.. warning::
The gxy, gxz, and gyz components have singularities when the computation
point is aligned with the corners of the prism on the bottom, east, and
north sides, respectively. In these cases, the above functions will move
the computation point slightly to avoid these singularities. Unfortunately,
this means that the result will not be as accurate **on those points**.
**Magnetic**
Available fields are the total-field anomaly (using the formula of
Bhattacharyya, 1964) and x, y, z components of the magnetic induction:
* :func:`~fatiando.gravmag.prism.tf`
* :func:`~fatiando.gravmag.prism.bx`
* :func:`~fatiando.gravmag.prism.by`
* :func:`~fatiando.gravmag.prism.bz`
**Auxiliary Functions**
Calculates the second derivatives of the function
.. math::
\phi(x,y,z) = \int\int\int \frac{1}{r}
\mathrm{d}\nu \mathrm{d}\eta \mathrm{d}\zeta
with respect to the variables :math:`x`, :math:`y`, and :math:`z`.
In this equation,
.. math::
r = \sqrt{(x - \nu)^2 + (y - \eta)^2 + (z - \zeta)^2}
and :math:`\nu`, :math:`\eta`, :math:`\zeta` are the Cartesian
coordinates of an element inside the volume of a 3D prism.
These second derivatives are used to calculate
the total field anomaly and the gravity gradient tensor
components.
* :func:`~fatiando.gravmag.prism.kernelxx`
* :func:`~fatiando.gravmag.prism.kernelxy`
* :func:`~fatiando.gravmag.prism.kernelxz`
* :func:`~fatiando.gravmag.prism.kernelyy`
* :func:`~fatiando.gravmag.prism.kernelyz`
* :func:`~fatiando.gravmag.prism.kernelzz`
**References**
Bhattacharyya, B. K. (1964), Magnetic anomalies due to prism-shaped bodies with
arbitrary polarization, Geophysics, 29(4), 517, doi: 10.1190/1.1439386.
Nagy, D., G. Papp, and J. Benedek (2000), The gravitational potential and its
derivatives for the prism: Journal of Geodesy, 74, 552--560,
doi: 10.1007/s001900000116.
----
"""
from __future__ import division
import numpy
from .. import utils
from ..constants import G, SI2EOTVOS, CM, T2NT, SI2MGAL
try:
from . import _prism
except ImportError:
_prism = None
def potential(xp, yp, zp, prisms, dens=None):
"""
Calculates the gravitational potential.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input and output values in **SI** units(!)!
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same length!")
size = len(xp)
res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.potential(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G
return res
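`potential` evaluates :math:`G \int\int\int \rho/r \, d\nu\, d\eta\, d\zeta` in closed form (Nagy et al., 2000). The brute-force check below discretizes a prism into point masses and sums :math:`G\, dm/r` directly; it is a hypothetical illustration, not part of fatiando's API, and the value of `G` is the approximate SI gravitational constant rather than `fatiando.constants.G`:

```python
import math


def point_mass_potential(xp, yp, zp, bounds, density, n=10):
    # Approximate the prism's gravitational potential by summing point
    # masses on an n x n x n grid of cells; crude, but it mirrors the
    # integral that `potential` evaluates analytically.
    G = 6.674e-11  # approximate SI gravitational constant (assumption)
    x1, x2, y1, y2, z1, z2 = bounds
    dx, dy, dz = (x2 - x1) / n, (y2 - y1) / n, (z2 - z1) / n
    dm = density * dx * dy * dz  # mass of one cell
    total = 0.0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                cx = x1 + (i + 0.5) * dx
                cy = y1 + (j + 0.5) * dy
                cz = z1 + (k + 0.5) * dz
                r = math.sqrt((xp - cx) ** 2 + (yp - cy) ** 2 + (zp - cz) ** 2)
                total += dm / r
    return G * total


# Observation point 1000 m above (z is down) a 100 m cube of crustal density.
print(point_mass_potential(0, 0, -1000, (-50, 50, -50, 50, 0, 100), 2670))
```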
def gx(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_x` gravity acceleration component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **mGal**!
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gx(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2MGAL
return res
def gy(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_y` gravity acceleration component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **mGal**!
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gy(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2MGAL
return res
def gz(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_z` gravity acceleration component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **mGal**!
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gz(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2MGAL
return res
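As a sanity check on the SI-to-mGal scaling applied in `gz`, the vertical attraction of a point mass can be computed directly. The constants below are assumed to match `fatiando.constants` (`G` in SI units, `SI2MGAL = 1e5`); this is an independent back-of-the-envelope check, not a call into `_prism`:

```python
G = 6.673e-11   # gravitational constant in SI units (assumed value)
SI2MGAL = 1e5   # conversion factor from m/s**2 to mGal


def point_mass_gz_mgal(mass, depth):
    """Vertical attraction, in mGal, of a point mass directly below
    the observation point at the given depth (SI inputs)."""
    return G * mass / depth ** 2 * SI2MGAL


# A 1e9 kg mass 100 m below the station: ~0.667 mGal
gz_val = point_mass_gz_mgal(1e9, 100.0)
```

The same two-step pattern (accumulate the geometric integral in SI, then scale once by `G * SI2MGAL`) is what the prism functions above do, which is why `res *= G * SI2MGAL` appears only after the loop.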
def gxx(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_{xx}` gravity gradient tensor component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **Eotvos**!
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gxx(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2EOTVOS
return res
def gxy(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_{xy}` gravity gradient tensor component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **Eotvos**!
.. warning::
This component has singularities when the computation
point is aligned with the corners of the prism on the bottom side.
        In these cases, the computation point is shifted slightly in order to
        avoid these singularities. Unfortunately, this means that the result
        will not be as accurate **on those points**.
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gxy(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2EOTVOS
return res
def gxz(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_{xz}` gravity gradient tensor component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **Eotvos**!
.. warning::
This component has singularities when the computation
point is aligned with the corners of the prism on the east side.
        In these cases, the computation point is shifted slightly in order to
        avoid these singularities. Unfortunately, this means that the result
        will not be as accurate **on those points**.
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gxz(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2EOTVOS
return res
def gyy(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_{yy}` gravity gradient tensor component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **Eotvos**!
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gyy(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2EOTVOS
return res
def gyz(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_{yz}` gravity gradient tensor component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **Eotvos**!
.. warning::
This component has singularities when the computation
point is aligned with the corners of the prism on the north side.
        In these cases, the computation point is shifted slightly in order to
        avoid these singularities. Unfortunately, this means that the result
        will not be as accurate **on those points**.
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gyz(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2EOTVOS
return res
def gzz(xp, yp, zp, prisms, dens=None):
"""
Calculates the :math:`g_{zz}` gravity gradient tensor component.
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> **DOWN**.
.. note:: All input values in **SI** units(!) and output in **Eotvos**!
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The density model used to calculate the gravitational effect.
Prisms must have the property ``'density'``. Prisms that don't have
this property will be ignored in the computations. Elements of *prisms*
that are None will also be ignored. *prisms* can also be a
:class:`~fatiando.mesher.PrismMesh`.
* dens : float or None
If not None, will use this value instead of the ``'density'`` property
of the prisms. Use this, e.g., for sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if prism is None or ('density' not in prism.props and dens is None):
continue
if dens is None:
density = prism.props['density']
else:
density = dens
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gzz(xp, yp, zp, x1, x2, y1, y2, z1, z2, density, res)
res *= G * SI2EOTVOS
return res
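Outside the sources, the gravity gradient tensor computed by `gxx` through `gzz` is traceless: gxx + gyy + gzz = 0, because the potential satisfies Laplace's equation. This property can be checked numerically for a point mass with central finite differences of phi = 1/r; the step size and tolerance below are illustrative choices, not fatiando code:

```python
def second_partial(f, p, axis, h=1e-3):
    """Central-difference second derivative of f along one axis at point p."""
    up, dn = list(p), list(p)
    up[axis] += h
    dn[axis] -= h
    return (f(up) - 2.0 * f(p) + f(dn)) / h ** 2


def inv_r(p):
    """Potential of a unit point mass at the origin (up to the constant G)."""
    x, y, z = p
    return 1.0 / (x ** 2 + y ** 2 + z ** 2) ** 0.5


point = (3.0, 4.0, 12.0)  # any point away from the mass at the origin
trace = sum(second_partial(inv_r, point, axis) for axis in range(3))
# trace is ~0, up to finite-difference error
```

This is a useful regression check for the tensor components: if the sum of the three diagonal outputs at a point away from the prisms is not small, one of the components is being computed or scaled inconsistently.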
def tf(xp, yp, zp, prisms, inc, dec, pmag=None):
"""
Calculate the total-field magnetic anomaly of prisms.
.. note:: Input units are SI. Output is in nT
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> Down.
Parameters:
* xp, yp, zp : arrays
Arrays with the x, y, and z coordinates of the computation points.
* prisms : list of :class:`~fatiando.mesher.Prism`
The model used to calculate the total field anomaly.
Prisms without the physical property ``'magnetization'`` will
be ignored. *prisms* can also be a :class:`~fatiando.mesher.PrismMesh`.
* inc : float
The inclination of the regional field (in degrees)
* dec : float
The declination of the regional field (in degrees)
* pmag : [mx, my, mz] or None
A magnetization vector. If not None, will use this value instead of the
``'magnetization'`` property of the prisms. Use this, e.g., for
sensitivity matrix building.
Returns:
* res : array
The field calculated on xp, yp, zp
"""
    if xp.shape != yp.shape or xp.shape != zp.shape:
        raise ValueError("Input arrays xp, yp, and zp must have the same shape!")
    size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
# Calculate the 3 components of the unit vector in the direction of the
# regional field
fx, fy, fz = utils.dircos(inc, dec)
    if pmag is not None:
        if isinstance(pmag, (float, int)):
            mx, my, mz = pmag * fx, pmag * fy, pmag * fz
        else:
            mx, my, mz = pmag
for prism in prisms:
if (prism is None or
('magnetization' not in prism.props and pmag is None)):
continue
if pmag is None:
mag = prism.props['magnetization']
            if isinstance(mag, (float, int)):
                mx, my, mz = mag * fx, mag * fy, mag * fz
            else:
                mx, my, mz = mag
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.tf(xp, yp, zp, x1, x2, y1, y2, z1, z2, mx, my, mz, fx, fy, fz,
res)
res *= CM * T2NT
return res
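`tf` converts the regional field's inclination and declination into direction cosines through `utils.dircos`. A minimal reimplementation of that conversion, consistent with the x -> North, y -> East, z -> down convention used throughout this module, is sketched below; the exact fatiando signature is assumed:

```python
import math


def dircos(inc, dec):
    """Direction cosines (north, east, down components) of a unit vector
    with the given inclination and declination, both in degrees."""
    d2r = math.pi / 180.0
    return (math.cos(d2r * inc) * math.cos(d2r * dec),
            math.cos(d2r * inc) * math.sin(d2r * dec),
            math.sin(d2r * inc))


# A vertical field (inc=90) points straight down in this convention:
fx, fy, fz = dircos(90, 0)
```

This is also why a scalar `'magnetization'` is accepted by `tf`: it is interpreted as induced magnetization along the regional field, so the scalar is simply multiplied into these direction cosines.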
def bx(xp, yp, zp, prisms, pmag=None):
"""
Calculates the x component of the magnetic induction produced by
rectangular prisms.
.. note:: Input units are SI. Output is in nT
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates where the anomaly will be calculated
* prisms : list of :class:`fatiando.mesher.Prism`
The model used to calculate the total field anomaly.
Prisms without the physical property ``'magnetization'`` will
be ignored. The ``'magnetization'`` must be a vector.
* pmag : [mx, my, mz] or None
A magnetization vector. If not None, will use this value instead of the
``'magnetization'`` property of the prisms. Use this, e.g., for
sensitivity matrix building.
Returns:
* bx: array
The x component of the magnetic induction
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
if pmag is not None:
mx, my, mz = pmag
size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if (prism is None or
('magnetization' not in prism.props and pmag is None)):
continue
if pmag is None:
mx, my, mz = prism.props['magnetization']
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.bx(xp, yp, zp, x1, x2, y1, y2, z1, z2, mx, my, mz, res)
res *= CM * T2NT
return res
def by(xp, yp, zp, prisms, pmag=None):
"""
Calculates the y component of the magnetic induction produced by
rectangular prisms.
.. note:: Input units are SI. Output is in nT
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates where the anomaly will be calculated
* prisms : list of :class:`fatiando.mesher.Prism`
The model used to calculate the total field anomaly.
Prisms without the physical property ``'magnetization'`` will
be ignored. The ``'magnetization'`` must be a vector.
* pmag : [mx, my, mz] or None
A magnetization vector. If not None, will use this value instead of the
``'magnetization'`` property of the prisms. Use this, e.g., for
sensitivity matrix building.
Returns:
* by: array
The y component of the magnetic induction
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
if pmag is not None:
mx, my, mz = pmag
size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if (prism is None or
('magnetization' not in prism.props and pmag is None)):
continue
if pmag is None:
mx, my, mz = prism.props['magnetization']
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.by(xp, yp, zp, x1, x2, y1, y2, z1, z2, mx, my, mz, res)
res *= CM * T2NT
return res
def bz(xp, yp, zp, prisms, pmag=None):
"""
Calculates the z component of the magnetic induction produced by
rectangular prisms.
.. note:: Input units are SI. Output is in nT
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates where the anomaly will be calculated
* prisms : list of :class:`fatiando.mesher.Prism`
The model used to calculate the total field anomaly.
Prisms without the physical property ``'magnetization'`` will
be ignored. The ``'magnetization'`` must be a vector.
* pmag : [mx, my, mz] or None
A magnetization vector. If not None, will use this value instead of the
``'magnetization'`` property of the prisms. Use this, e.g., for
sensitivity matrix building.
Returns:
* bz: array
The z component of the magnetic induction
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
if pmag is not None:
mx, my, mz = pmag
size = len(xp)
    res = numpy.zeros(size, dtype=numpy.float64)
for prism in prisms:
if (prism is None or
('magnetization' not in prism.props and pmag is None)):
continue
if pmag is None:
mx, my, mz = prism.props['magnetization']
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.bz(xp, yp, zp, x1, x2, y1, y2, z1, z2, mx, my, mz, res)
res *= CM * T2NT
return res
def kernelxx(xp, yp, zp, prism):
r"""
Calculates the xx derivative of the function
.. math::
\phi(x,y,z) = \int\int\int \frac{1}{r}
\mathrm{d}\nu \mathrm{d}\eta \mathrm{d}\zeta
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> Down.
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates of the computation points.
    * prism : object of :class:`fatiando.mesher.Prism`
The model used to calculate the field.
Returns:
* res : array
The effect calculated on the computation points.
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
    res = numpy.zeros(len(xp), dtype=numpy.float64)
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gxx(xp, yp, zp, x1, x2, y1, y2, z1, z2, 1, res)
return res
def kernelyy(xp, yp, zp, prism):
r"""
Calculates the yy derivative of the function
.. math::
\phi(x,y,z) = \int\int\int \frac{1}{r}
\mathrm{d}\nu \mathrm{d}\eta \mathrm{d}\zeta
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> Down.
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates of the computation points.
    * prism : object of :class:`fatiando.mesher.Prism`
The model used to calculate the field.
Returns:
* res : array
The effect calculated on the computation points.
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
    res = numpy.zeros(len(xp), dtype=numpy.float64)
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gyy(xp, yp, zp, x1, x2, y1, y2, z1, z2, 1, res)
return res
def kernelzz(xp, yp, zp, prism):
r"""
Calculates the zz derivative of the function
.. math::
\phi(x,y,z) = \int\int\int \frac{1}{r}
\mathrm{d}\nu \mathrm{d}\eta \mathrm{d}\zeta
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> Down.
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates of the computation points.
    * prism : object of :class:`fatiando.mesher.Prism`
The model used to calculate the field.
Returns:
* res : array
The effect calculated on the computation points.
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
    res = numpy.zeros(len(xp), dtype=numpy.float64)
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gzz(xp, yp, zp, x1, x2, y1, y2, z1, z2, 1, res)
return res
def kernelxy(xp, yp, zp, prism):
r"""
Calculates the xy derivative of the function
.. math::
\phi(x,y,z) = \int\int\int \frac{1}{r}
\mathrm{d}\nu \mathrm{d}\eta \mathrm{d}\zeta
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> Down.
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates of the computation points.
    * prism : object of :class:`fatiando.mesher.Prism`
The model used to calculate the field.
Returns:
* res : array
The effect calculated on the computation points.
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
    res = numpy.zeros(len(xp), dtype=numpy.float64)
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gxy(xp, yp, zp, x1, x2, y1, y2, z1, z2, 1, res)
return res
def kernelxz(xp, yp, zp, prism):
r"""
Calculates the xz derivative of the function
.. math::
\phi(x,y,z) = \int\int\int \frac{1}{r}
\mathrm{d}\nu \mathrm{d}\eta \mathrm{d}\zeta
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> Down.
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates of the computation points.
    * prism : object of :class:`fatiando.mesher.Prism`
The model used to calculate the field.
Returns:
* res : array
The effect calculated on the computation points.
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
    res = numpy.zeros(len(xp), dtype=numpy.float64)
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gxz(xp, yp, zp, x1, x2, y1, y2, z1, z2, 1, res)
return res
def kernelyz(xp, yp, zp, prism):
r"""
Calculates the yz derivative of the function
.. math::
\phi(x,y,z) = \int\int\int \frac{1}{r}
\mathrm{d}\nu \mathrm{d}\eta \mathrm{d}\zeta
.. note:: The coordinate system of the input parameters is to be
x -> North, y -> East and z -> Down.
Parameters:
* xp, yp, zp : arrays
The x, y, and z coordinates of the computation points.
    * prism : object of :class:`fatiando.mesher.Prism`
The model used to calculate the field.
Returns:
* res : array
The effect calculated on the computation points.
"""
if xp.shape != yp.shape or xp.shape != zp.shape:
raise ValueError("Input arrays xp, yp, and zp must have same shape!")
    res = numpy.zeros(len(xp), dtype=numpy.float64)
x1, x2 = prism.x1, prism.x2
y1, y2 = prism.y1, prism.y2
z1, z2 = prism.z1, prism.z2
_prism.gyz(xp, yp, zp, x1, x2, y1, y2, z1, z2, 1, res)
return res
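The `kernel*` functions above compute the per-prism effect with unit density, which is exactly what is needed to assemble a sensitivity (Jacobian) matrix: column j holds the kernel of prism j, and the forward model becomes a matrix-vector product with the density vector. The sketch below demonstrates that assembly pattern with a hypothetical stand-in kernel (the real columns would come from `kernelzz` and friends):

```python
import numpy


def build_jacobian(kernel, points, cells):
    """Stack the unit-density effect of each cell as one matrix column."""
    return numpy.transpose([kernel(points, cell) for cell in cells])


def toy_kernel(x, c):
    """Stand-in 1D kernel: unit-density effect of a 'cell' at position c."""
    return 1.0 / (numpy.asarray(x) - c) ** 2


x = numpy.array([0.0, 1.0, 2.0])
cells = [10.0, 20.0]
A = build_jacobian(toy_kernel, x, cells)      # shape (n_points, n_cells)
rho = numpy.array([1000.0, -500.0])
predicted = A.dot(rho)  # equals the sum of density-scaled kernels
```

Because the forward problem is linear in density, `A` can be built once and reused for every density vector during an inversion, instead of re-running the prism loop each iteration.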
] | 5 | 2018-08-07T13:00:16.000Z | 2021-11-01T00:55:10.000Z | from .one_one_reaction import * # noqa
from .one_two_reaction import * # noqa
from .two_two_reaction import * # noqa
from .dative_reaction import * # noqa
# AUTO-GENERATED file from IFMapApiGenerator. Do Not Edit!
from contrail_heat.resources import contrail
try:
    from heat.common.i18n import _
except ImportError:
    # Fall back to an identity function when heat's i18n helper is unavailable,
    # so the schema descriptions below do not raise a NameError.
    def _(msg):
        return msg
from heat.engine import attributes
from heat.engine import constraints
from heat.engine import properties
try:
from heat.openstack.common import log as logging
except ImportError:
from oslo_log import log as logging
import uuid
from vnc_api import vnc_api
LOG = logging.getLogger(__name__)
class ContrailServiceInstance(contrail.ContrailResource):
PROPERTIES = (
NAME, FQ_NAME, DISPLAY_NAME, SERVICE_INSTANCE_BINDINGS, SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_KEY, SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_VALUE, SERVICE_INSTANCE_PROPERTIES, SERVICE_INSTANCE_PROPERTIES_AUTO_POLICY, SERVICE_INSTANCE_PROPERTIES_AVAILABILITY_ZONE, SERVICE_INSTANCE_PROPERTIES_MANAGEMENT_VIRTUAL_NETWORK, SERVICE_INSTANCE_PROPERTIES_LEFT_VIRTUAL_NETWORK, SERVICE_INSTANCE_PROPERTIES_LEFT_IP_ADDRESS, SERVICE_INSTANCE_PROPERTIES_RIGHT_VIRTUAL_NETWORK, SERVICE_INSTANCE_PROPERTIES_RIGHT_IP_ADDRESS, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_VIRTUAL_NETWORK, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_IP_ADDRESS, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_PREFIX, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP_TYPE, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX_LEN, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_MAC, SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_ADDRESS_MODE, SERVICE_INSTANCE_PROPERTIES_SCALE_OUT, SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_MAX_INSTANCES, 
SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_AUTO_SCALE, SERVICE_INSTANCE_PROPERTIES_HA_MODE, SERVICE_INSTANCE_PROPERTIES_VIRTUAL_ROUTER_ID, SERVICE_TEMPLATE_REFS, INSTANCE_IP_REFS, INSTANCE_IP_REFS_DATA, INSTANCE_IP_REFS_DATA_INTERFACE_TYPE, PROJECT
) = (
'name', 'fq_name', 'display_name', 'service_instance_bindings', 'service_instance_bindings_key_value_pair', 'service_instance_bindings_key_value_pair_key', 'service_instance_bindings_key_value_pair_value', 'service_instance_properties', 'service_instance_properties_auto_policy', 'service_instance_properties_availability_zone', 'service_instance_properties_management_virtual_network', 'service_instance_properties_left_virtual_network', 'service_instance_properties_left_ip_address', 'service_instance_properties_right_virtual_network', 'service_instance_properties_right_ip_address', 'service_instance_properties_interface_list', 'service_instance_properties_interface_list_virtual_network', 'service_instance_properties_interface_list_ip_address', 'service_instance_properties_interface_list_static_routes', 'service_instance_properties_interface_list_static_routes_route', 'service_instance_properties_interface_list_static_routes_route_prefix', 'service_instance_properties_interface_list_static_routes_route_next_hop', 'service_instance_properties_interface_list_static_routes_route_next_hop_type', 'service_instance_properties_interface_list_static_routes_route_community_attributes', 'service_instance_properties_interface_list_static_routes_route_community_attributes_community_attribute', 'service_instance_properties_interface_list_allowed_address_pairs', 'service_instance_properties_interface_list_allowed_address_pairs_allowed_address_pair', 'service_instance_properties_interface_list_allowed_address_pairs_allowed_address_pair_ip', 'service_instance_properties_interface_list_allowed_address_pairs_allowed_address_pair_ip_ip_prefix', 'service_instance_properties_interface_list_allowed_address_pairs_allowed_address_pair_ip_ip_prefix_len', 'service_instance_properties_interface_list_allowed_address_pairs_allowed_address_pair_mac', 'service_instance_properties_interface_list_allowed_address_pairs_allowed_address_pair_address_mode', 'service_instance_properties_scale_out', 
'service_instance_properties_scale_out_max_instances', 'service_instance_properties_scale_out_auto_scale', 'service_instance_properties_ha_mode', 'service_instance_properties_virtual_router_id', 'service_template_refs', 'instance_ip_refs', 'instance_ip_refs_data', 'instance_ip_refs_data_interface_type', 'project'
)
properties_schema = {
NAME: properties.Schema(
properties.Schema.STRING,
_('NAME.'),
update_allowed=True,
required=False,
),
FQ_NAME: properties.Schema(
properties.Schema.STRING,
_('FQ_NAME.'),
update_allowed=True,
required=False,
),
DISPLAY_NAME: properties.Schema(
properties.Schema.STRING,
_('DISPLAY_NAME.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_BINDINGS: properties.Schema(
properties.Schema.MAP,
_('SERVICE_INSTANCE_BINDINGS.'),
update_allowed=True,
required=False,
schema={
SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR: properties.Schema(
properties.Schema.LIST,
_('SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR.'),
update_allowed=True,
required=False,
schema=properties.Schema(
properties.Schema.MAP,
schema={
SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_KEY: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_KEY.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_VALUE: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_VALUE.'),
update_allowed=True,
required=False,
),
}
)
),
}
),
SERVICE_INSTANCE_PROPERTIES: properties.Schema(
properties.Schema.MAP,
_('SERVICE_INSTANCE_PROPERTIES.'),
update_allowed=True,
required=False,
schema={
SERVICE_INSTANCE_PROPERTIES_AUTO_POLICY: properties.Schema(
properties.Schema.BOOLEAN,
_('SERVICE_INSTANCE_PROPERTIES_AUTO_POLICY.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_AVAILABILITY_ZONE: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_AVAILABILITY_ZONE.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_MANAGEMENT_VIRTUAL_NETWORK: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_MANAGEMENT_VIRTUAL_NETWORK.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_LEFT_VIRTUAL_NETWORK: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_LEFT_VIRTUAL_NETWORK.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_LEFT_IP_ADDRESS: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_LEFT_IP_ADDRESS.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_RIGHT_VIRTUAL_NETWORK: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_RIGHT_VIRTUAL_NETWORK.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_RIGHT_IP_ADDRESS: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_RIGHT_IP_ADDRESS.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST: properties.Schema(
properties.Schema.LIST,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST.'),
update_allowed=True,
required=False,
schema=properties.Schema(
properties.Schema.MAP,
schema={
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_VIRTUAL_NETWORK: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_VIRTUAL_NETWORK.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_IP_ADDRESS: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_IP_ADDRESS.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES: properties.Schema(
properties.Schema.MAP,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES.'),
update_allowed=True,
required=False,
schema={
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE: properties.Schema(
properties.Schema.LIST,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE.'),
update_allowed=True,
required=False,
schema=properties.Schema(
properties.Schema.MAP,
schema={
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_PREFIX: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_PREFIX.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP_TYPE: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP_TYPE.'),
update_allowed=True,
required=False,
constraints=[
constraints.AllowedValues([u'service-instance', u'ip-address']),
],
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES: properties.Schema(
properties.Schema.MAP,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES.'),
update_allowed=True,
required=False,
schema={
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE: properties.Schema(
properties.Schema.LIST,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE.'),
update_allowed=True,
required=False,
),
}
),
}
)
),
}
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS: properties.Schema(
properties.Schema.MAP,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS.'),
update_allowed=True,
required=False,
schema={
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR: properties.Schema(
properties.Schema.LIST,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR.'),
update_allowed=True,
required=False,
schema=properties.Schema(
properties.Schema.MAP,
schema={
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP: properties.Schema(
properties.Schema.MAP,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP.'),
update_allowed=True,
required=False,
schema={
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX_LEN: properties.Schema(
properties.Schema.INTEGER,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX_LEN.'),
update_allowed=True,
required=False,
),
}
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_MAC: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_MAC.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_ADDRESS_MODE: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_ADDRESS_MODE.'),
update_allowed=True,
required=False,
constraints=[
constraints.AllowedValues([u'active-active', u'active-standby']),
],
),
}
)
),
}
),
}
)
),
SERVICE_INSTANCE_PROPERTIES_SCALE_OUT: properties.Schema(
properties.Schema.MAP,
_('SERVICE_INSTANCE_PROPERTIES_SCALE_OUT.'),
update_allowed=True,
required=False,
schema={
SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_MAX_INSTANCES: properties.Schema(
properties.Schema.INTEGER,
_('SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_MAX_INSTANCES.'),
update_allowed=True,
required=False,
),
SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_AUTO_SCALE: properties.Schema(
properties.Schema.BOOLEAN,
_('SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_AUTO_SCALE.'),
update_allowed=True,
required=False,
),
}
),
SERVICE_INSTANCE_PROPERTIES_HA_MODE: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_HA_MODE.'),
update_allowed=True,
required=False,
constraints=[
constraints.AllowedValues([u'active-active', u'active-standby']),
],
),
SERVICE_INSTANCE_PROPERTIES_VIRTUAL_ROUTER_ID: properties.Schema(
properties.Schema.STRING,
_('SERVICE_INSTANCE_PROPERTIES_VIRTUAL_ROUTER_ID.'),
update_allowed=True,
required=False,
),
}
),
SERVICE_TEMPLATE_REFS: properties.Schema(
properties.Schema.LIST,
_('SERVICE_TEMPLATE_REFS.'),
update_allowed=True,
required=False,
),
INSTANCE_IP_REFS: properties.Schema(
properties.Schema.LIST,
_('INSTANCE_IP_REFS.'),
update_allowed=True,
required=False,
),
INSTANCE_IP_REFS_DATA: properties.Schema(
properties.Schema.LIST,
_('INSTANCE_IP_REFS_DATA.'),
update_allowed=True,
required=False,
schema=properties.Schema(
properties.Schema.MAP,
schema={
INSTANCE_IP_REFS_DATA_INTERFACE_TYPE: properties.Schema(
properties.Schema.STRING,
_('INSTANCE_IP_REFS_DATA_INTERFACE_TYPE.'),
update_allowed=True,
required=False,
),
}
)
),
PROJECT: properties.Schema(
properties.Schema.STRING,
_('PROJECT.'),
update_allowed=True,
required=False,
),
}
attributes_schema = {
NAME: attributes.Schema(
_('NAME.'),
),
FQ_NAME: attributes.Schema(
_('FQ_NAME.'),
),
DISPLAY_NAME: attributes.Schema(
_('DISPLAY_NAME.'),
),
SERVICE_INSTANCE_BINDINGS: attributes.Schema(
_('SERVICE_INSTANCE_BINDINGS.'),
),
SERVICE_INSTANCE_PROPERTIES: attributes.Schema(
_('SERVICE_INSTANCE_PROPERTIES.'),
),
SERVICE_TEMPLATE_REFS: attributes.Schema(
_('SERVICE_TEMPLATE_REFS.'),
),
INSTANCE_IP_REFS: attributes.Schema(
_('INSTANCE_IP_REFS.'),
),
INSTANCE_IP_REFS_DATA: attributes.Schema(
_('INSTANCE_IP_REFS_DATA.'),
),
PROJECT: attributes.Schema(
_('PROJECT.'),
),
}
update_allowed_keys = ('Properties',)
def handle_create(self):
        parent_obj = None
        if self.properties.get(self.PROJECT):
try:
parent_obj = self.vnc_lib().project_read(id=self.properties.get(self.PROJECT))
except vnc_api.NoIdError:
parent_obj = self.vnc_lib().project_read(fq_name_str=self.properties.get(self.PROJECT))
            except Exception:
                parent_obj = None
if parent_obj is None:
tenant_id = self.stack.context.tenant_id
parent_obj = self.vnc_lib().project_read(id=str(uuid.UUID(tenant_id)))
if parent_obj is None:
raise Exception('Error: parent is not specified in template!')
obj_0 = vnc_api.ServiceInstance(name=self.properties[self.NAME],
parent_obj=parent_obj)
if self.properties.get(self.DISPLAY_NAME) is not None:
obj_0.set_display_name(self.properties.get(self.DISPLAY_NAME))
if self.properties.get(self.SERVICE_INSTANCE_BINDINGS) is not None:
obj_1 = vnc_api.KeyValuePairs()
if self.properties.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR) is not None:
for index_1 in range(len(self.properties.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR))):
obj_2 = vnc_api.KeyValuePair()
if self.properties.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_KEY) is not None:
obj_2.set_key(self.properties.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_KEY))
if self.properties.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_VALUE) is not None:
obj_2.set_value(self.properties.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_VALUE))
obj_1.add_key_value_pair(obj_2)
obj_0.set_service_instance_bindings(obj_1)
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES) is not None:
obj_1 = vnc_api.ServiceInstanceType()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AUTO_POLICY) is not None:
obj_1.set_auto_policy(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AUTO_POLICY))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AVAILABILITY_ZONE) is not None:
obj_1.set_availability_zone(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AVAILABILITY_ZONE))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_MANAGEMENT_VIRTUAL_NETWORK) is not None:
obj_1.set_management_virtual_network(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_MANAGEMENT_VIRTUAL_NETWORK))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_VIRTUAL_NETWORK) is not None:
obj_1.set_left_virtual_network(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_VIRTUAL_NETWORK))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_IP_ADDRESS) is not None:
obj_1.set_left_ip_address(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_IP_ADDRESS))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_VIRTUAL_NETWORK) is not None:
obj_1.set_right_virtual_network(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_VIRTUAL_NETWORK))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_IP_ADDRESS) is not None:
obj_1.set_right_ip_address(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_IP_ADDRESS))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST) is not None:
for index_1 in range(len(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST))):
obj_2 = vnc_api.ServiceInstanceInterfaceType()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_VIRTUAL_NETWORK) is not None:
obj_2.set_virtual_network(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_VIRTUAL_NETWORK))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_IP_ADDRESS) is not None:
obj_2.set_ip_address(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_IP_ADDRESS))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES) is not None:
obj_3 = vnc_api.RouteTableType()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE) is not None:
for index_3 in range(len(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE))):
obj_4 = vnc_api.RouteType()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_PREFIX) is not None:
obj_4.set_prefix(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_PREFIX))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP) is not None:
obj_4.set_next_hop(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP_TYPE) is not None:
obj_4.set_next_hop_type(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP_TYPE))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES) is not None:
obj_5 = vnc_api.CommunityAttributes()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE) is not None:
for index_5 in range(len(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE))):
obj_5.add_community_attribute(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE)[index_5])
obj_4.set_community_attributes(obj_5)
obj_3.add_route(obj_4)
obj_2.set_static_routes(obj_3)
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS) is not None:
obj_3 = vnc_api.AllowedAddressPairs()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR) is not None:
for index_3 in range(len(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR))):
obj_4 = vnc_api.AllowedAddressPair()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP) is not None:
obj_5 = vnc_api.SubnetType()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX) is not None:
obj_5.set_ip_prefix(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX_LEN) is not None:
obj_5.set_ip_prefix_len(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX_LEN))
obj_4.set_ip(obj_5)
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_MAC) is not None:
obj_4.set_mac(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_MAC))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_ADDRESS_MODE) is not None:
obj_4.set_address_mode(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST, {})[index_1].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {}).get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR, {})[index_3].get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_ADDRESS_MODE))
obj_3.add_allowed_address_pair(obj_4)
obj_2.set_allowed_address_pairs(obj_3)
obj_1.add_interface_list(obj_2)
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT) is not None:
obj_2 = vnc_api.ServiceScaleOutType()
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_MAX_INSTANCES) is not None:
obj_2.set_max_instances(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_MAX_INSTANCES))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_AUTO_SCALE) is not None:
obj_2.set_auto_scale(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT, {}).get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_AUTO_SCALE))
obj_1.set_scale_out(obj_2)
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_HA_MODE) is not None:
obj_1.set_ha_mode(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_HA_MODE))
if self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_VIRTUAL_ROUTER_ID) is not None:
obj_1.set_virtual_router_id(self.properties.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_VIRTUAL_ROUTER_ID))
obj_0.set_service_instance_properties(obj_1)
# reference to service_template_refs
if self.properties.get(self.SERVICE_TEMPLATE_REFS):
for index_0 in range(len(self.properties.get(self.SERVICE_TEMPLATE_REFS))):
try:
ref_obj = self.vnc_lib().service_template_read(
id=self.properties.get(self.SERVICE_TEMPLATE_REFS)[index_0]
)
except vnc_api.NoIdError:
ref_obj = self.vnc_lib().service_template_read(
fq_name_str=self.properties.get(self.SERVICE_TEMPLATE_REFS)[index_0]
)
obj_0.add_service_template(ref_obj)
# reference to instance_ip_refs
obj_1 = None
if self.properties.get(self.INSTANCE_IP_REFS_DATA) is not None:
for index_0 in range(len(self.properties.get(self.INSTANCE_IP_REFS_DATA))):
obj_1 = vnc_api.ServiceInterfaceTag()
if self.properties.get(self.INSTANCE_IP_REFS_DATA, {})[index_0].get(self.INSTANCE_IP_REFS_DATA_INTERFACE_TYPE) is not None:
obj_1.set_interface_type(self.properties.get(self.INSTANCE_IP_REFS_DATA, {})[index_0].get(self.INSTANCE_IP_REFS_DATA_INTERFACE_TYPE))
if self.properties.get(self.INSTANCE_IP_REFS):
try:
ref_obj = self.vnc_lib().instance_ip_read(
id=self.properties.get(self.INSTANCE_IP_REFS)[index_0]
)
except vnc_api.NoIdError:
ref_obj = self.vnc_lib().instance_ip_read(
fq_name_str=self.properties.get(self.INSTANCE_IP_REFS)[index_0]
)
obj_0.add_instance_ip(ref_obj, obj_1)
try:
obj_uuid = super(ContrailServiceInstance, self).resource_create(obj_0)
        except Exception:
            raise Exception(_('service-instance %s could not be created.') % self.name)
self.resource_id_set(obj_uuid)
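    # Example HOT snippet exercising this resource (illustrative only: the
    # registered resource type name 'OS::ContrailV2::ServiceInstance' and the
    # service_template resource it references are assumptions, not taken from
    # this file; property keys match the PROPERTIES tuple above):
    #
    #   resources:
    #     service_instance:
    #       type: OS::ContrailV2::ServiceInstance
    #       properties:
    #         name: nat-si
    #         service_template_refs: [{ get_resource: service_template }]
    #         service_instance_properties:
    #           service_instance_properties_left_virtual_network: left-vn
    #           service_instance_properties_right_virtual_network: right-vn
    #           service_instance_properties_ha_mode: active-standby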
def handle_update(self, json_snippet, tmpl_diff, prop_diff):
try:
obj_0 = self.vnc_lib().service_instance_read(
id=self.resource_id
)
        except Exception:
            raise Exception(_('service-instance %s not found.') % self.name)
if prop_diff.get(self.DISPLAY_NAME) is not None:
obj_0.set_display_name(prop_diff.get(self.DISPLAY_NAME))
if prop_diff.get(self.SERVICE_INSTANCE_BINDINGS) is not None:
obj_1 = vnc_api.KeyValuePairs()
if prop_diff.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR) is not None:
for index_1 in range(len(prop_diff.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR))):
obj_2 = vnc_api.KeyValuePair()
if prop_diff.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_KEY) is not None:
obj_2.set_key(prop_diff.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_KEY))
if prop_diff.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_VALUE) is not None:
obj_2.set_value(prop_diff.get(self.SERVICE_INSTANCE_BINDINGS, {}).get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR, {})[index_1].get(self.SERVICE_INSTANCE_BINDINGS_KEY_VALUE_PAIR_VALUE))
obj_1.add_key_value_pair(obj_2)
obj_0.set_service_instance_bindings(obj_1)
if prop_diff.get(self.SERVICE_INSTANCE_PROPERTIES) is not None:
obj_1 = vnc_api.ServiceInstanceType()
if prop_diff.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AUTO_POLICY) is not None:
obj_1.set_auto_policy(prop_diff.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AUTO_POLICY))
if prop_diff.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AVAILABILITY_ZONE) is not None:
obj_1.set_availability_zone(prop_diff.get(self.SERVICE_INSTANCE_PROPERTIES, {}).get(self.SERVICE_INSTANCE_PROPERTIES_AVAILABILITY_ZONE))
        # Service-instance properties: cache the nested dicts once instead of
        # re-walking prop_diff for every field, and iterate lists directly
        # instead of indexing by range(len(...)). Behavior is unchanged.
        si_props = prop_diff.get(self.SERVICE_INSTANCE_PROPERTIES, {})
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_MANAGEMENT_VIRTUAL_NETWORK) is not None:
            obj_1.set_management_virtual_network(si_props.get(self.SERVICE_INSTANCE_PROPERTIES_MANAGEMENT_VIRTUAL_NETWORK))
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_VIRTUAL_NETWORK) is not None:
            obj_1.set_left_virtual_network(si_props.get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_VIRTUAL_NETWORK))
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_IP_ADDRESS) is not None:
            obj_1.set_left_ip_address(si_props.get(self.SERVICE_INSTANCE_PROPERTIES_LEFT_IP_ADDRESS))
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_VIRTUAL_NETWORK) is not None:
            obj_1.set_right_virtual_network(si_props.get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_VIRTUAL_NETWORK))
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_IP_ADDRESS) is not None:
            obj_1.set_right_ip_address(si_props.get(self.SERVICE_INSTANCE_PROPERTIES_RIGHT_IP_ADDRESS))
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST) is not None:
            for itf in si_props.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST):
                obj_2 = vnc_api.ServiceInstanceInterfaceType()
                if itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_VIRTUAL_NETWORK) is not None:
                    obj_2.set_virtual_network(itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_VIRTUAL_NETWORK))
                if itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_IP_ADDRESS) is not None:
                    obj_2.set_ip_address(itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_IP_ADDRESS))
                if itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES) is not None:
                    static_routes = itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES, {})
                    obj_3 = vnc_api.RouteTableType()
                    if static_routes.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE) is not None:
                        for route in static_routes.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE):
                            obj_4 = vnc_api.RouteType()
                            if route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_PREFIX) is not None:
                                obj_4.set_prefix(route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_PREFIX))
                            if route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP) is not None:
                                obj_4.set_next_hop(route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP))
                            if route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP_TYPE) is not None:
                                obj_4.set_next_hop_type(route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_NEXT_HOP_TYPE))
                            if route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES) is not None:
                                comm_attrs = route.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES, {})
                                obj_5 = vnc_api.CommunityAttributes()
                                if comm_attrs.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE) is not None:
                                    for attr in comm_attrs.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_STATIC_ROUTES_ROUTE_COMMUNITY_ATTRIBUTES_COMMUNITY_ATTRIBUTE):
                                        obj_5.add_community_attribute(attr)
                                obj_4.set_community_attributes(obj_5)
                            obj_3.add_route(obj_4)
                    obj_2.set_static_routes(obj_3)
                if itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS) is not None:
                    aaps = itf.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS, {})
                    obj_3 = vnc_api.AllowedAddressPairs()
                    if aaps.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR) is not None:
                        for pair in aaps.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR):
                            obj_4 = vnc_api.AllowedAddressPair()
                            if pair.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP) is not None:
                                pair_ip = pair.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP, {})
                                obj_5 = vnc_api.SubnetType()
                                if pair_ip.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX) is not None:
                                    obj_5.set_ip_prefix(pair_ip.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX))
                                if pair_ip.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX_LEN) is not None:
                                    obj_5.set_ip_prefix_len(pair_ip.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_IP_IP_PREFIX_LEN))
                                obj_4.set_ip(obj_5)
                            if pair.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_MAC) is not None:
                                obj_4.set_mac(pair.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_MAC))
                            if pair.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_ADDRESS_MODE) is not None:
                                obj_4.set_address_mode(pair.get(self.SERVICE_INSTANCE_PROPERTIES_INTERFACE_LIST_ALLOWED_ADDRESS_PAIRS_ALLOWED_ADDRESS_PAIR_ADDRESS_MODE))
                            obj_3.add_allowed_address_pair(obj_4)
                    obj_2.set_allowed_address_pairs(obj_3)
                obj_1.add_interface_list(obj_2)
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT) is not None:
            scale_out = si_props.get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT, {})
            obj_2 = vnc_api.ServiceScaleOutType()
            if scale_out.get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_MAX_INSTANCES) is not None:
                obj_2.set_max_instances(scale_out.get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_MAX_INSTANCES))
            if scale_out.get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_AUTO_SCALE) is not None:
                obj_2.set_auto_scale(scale_out.get(self.SERVICE_INSTANCE_PROPERTIES_SCALE_OUT_AUTO_SCALE))
            obj_1.set_scale_out(obj_2)
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_HA_MODE) is not None:
            obj_1.set_ha_mode(si_props.get(self.SERVICE_INSTANCE_PROPERTIES_HA_MODE))
        if si_props.get(self.SERVICE_INSTANCE_PROPERTIES_VIRTUAL_ROUTER_ID) is not None:
            obj_1.set_virtual_router_id(si_props.get(self.SERVICE_INSTANCE_PROPERTIES_VIRTUAL_ROUTER_ID))
        obj_0.set_service_instance_properties(obj_1)
        # reference to service_template_refs
        ref_obj_list = []
        ref_data_list = []
        if self.SERVICE_TEMPLATE_REFS in prop_diff:
            for ref in prop_diff.get(self.SERVICE_TEMPLATE_REFS) or []:
                try:
                    # the reference may be given as a UUID ...
                    ref_obj = self.vnc_lib().service_template_read(id=ref)
                except Exception:
                    # ... or as a fully-qualified name string
                    ref_obj = self.vnc_lib().service_template_read(fq_name_str=ref)
                ref_obj_list.append(ref_obj.fq_name)

            obj_0.set_service_template_list(ref_obj_list)
        # End: reference to service_template_refs
        # reference to instance_ip_refs
        ref_obj_list = []
        ref_data_list = []
        if prop_diff.get(self.INSTANCE_IP_REFS_DATA) is not None:
            for ref_data in prop_diff.get(self.INSTANCE_IP_REFS_DATA):
                obj_1 = vnc_api.ServiceInterfaceTag()
                if ref_data.get(self.INSTANCE_IP_REFS_DATA_INTERFACE_TYPE) is not None:
                    obj_1.set_interface_type(ref_data.get(self.INSTANCE_IP_REFS_DATA_INTERFACE_TYPE))
                ref_data_list.append(obj_1)
        if self.INSTANCE_IP_REFS in prop_diff:
            # iterate over the refs themselves, not the refs-data list
            for ref in prop_diff.get(self.INSTANCE_IP_REFS) or []:
                try:
                    ref_obj = self.vnc_lib().instance_ip_read(id=ref)
                except Exception:
                    ref_obj = self.vnc_lib().instance_ip_read(fq_name_str=ref)
                ref_obj_list.append(ref_obj.fq_name)

            obj_0.set_instance_ip_list(ref_obj_list, ref_data_list)
        # End: reference to instance_ip_refs
        try:
            self.vnc_lib().service_instance_update(obj_0)
        except Exception:
            raise Exception(_('service-instance %s could not be updated.') % self.name)
    def handle_delete(self):
        if self.resource_id is None:
            return

        try:
            self.vnc_lib().service_instance_delete(id=self.resource_id)
        except Exception as ex:
            self._ignore_not_found(ex)
            LOG.warn(_('service_instance %s already deleted.') % self.name)
    def _show_resource(self):
        obj = self.vnc_lib().service_instance_read(id=self.resource_id)
        obj_dict = obj.serialize_to_json()
        return obj_dict

def resource_mapping():
    return {
        'OS::ContrailV2::ServiceInstance': ContrailServiceInstance,
    }

# ===== angr/procedures/definitions/win32_dsparse.py =====
# (repo: r4b3rt/angr, license: BSD-2-Clause)
# pylint:disable=line-too-long
import logging
from ...sim_type import SimTypeFunction, SimTypeShort, SimTypeInt, SimTypeLong, SimTypeLongLong, SimTypeDouble, SimTypeFloat, SimTypePointer, SimTypeChar, SimStruct, SimTypeFixedSizeArray, SimTypeBottom, SimUnion, SimTypeBool
from ...calling_conventions import SimCCStdcall, SimCCMicrosoftAMD64
from .. import SIM_PROCEDURES as P
from . import SimLibrary
_l = logging.getLogger(name=__name__)
lib = SimLibrary()
lib.set_default_cc('X86', SimCCStdcall)
lib.set_default_cc('AMD64', SimCCMicrosoftAMD64)
lib.set_library_names("dsparse.dll")
prototypes = \
{
#
'DsMakeSpnW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeShort(signed=False, label="UInt16"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["ServiceClass", "ServiceName", "InstanceName", "InstancePort", "Referrer", "pcSpnLength", "pszSpn"]),
#
'DsMakeSpnA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeShort(signed=False, label="UInt16"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["ServiceClass", "ServiceName", "InstanceName", "InstancePort", "Referrer", "pcSpnLength", "pszSpn"]),
#
'DsCrackSpnA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pszSpn", "pcServiceClass", "ServiceClass", "pcServiceName", "ServiceName", "pcInstanceName", "InstanceName", "pInstancePort"]),
#
'DsCrackSpnW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pszSpn", "pcServiceClass", "ServiceClass", "pcServiceName", "ServiceName", "pcInstanceName", "InstanceName", "pInstancePort"]),
#
'DsQuoteRdnValueW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["cUnquotedRdnValueLength", "psUnquotedRdnValue", "pcQuotedRdnValueLength", "psQuotedRdnValue"]),
#
'DsQuoteRdnValueA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["cUnquotedRdnValueLength", "psUnquotedRdnValue", "pcQuotedRdnValueLength", "psQuotedRdnValue"]),
#
'DsUnquoteRdnValueW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["cQuotedRdnValueLength", "psQuotedRdnValue", "pcUnquotedRdnValueLength", "psUnquotedRdnValue"]),
#
'DsUnquoteRdnValueA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["cQuotedRdnValueLength", "psQuotedRdnValue", "pcUnquotedRdnValueLength", "psUnquotedRdnValue"]),
#
'DsGetRdnW': SimTypeFunction([SimTypePointer(SimTypePointer(SimTypeChar(label="Char"), offset=0), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypePointer(SimTypeChar(label="Char"), offset=0), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypePointer(SimTypeChar(label="Char"), offset=0), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["ppDN", "pcDN", "ppKey", "pcKey", "ppVal", "pcVal"]),
#
'DsCrackUnquotedMangledRdnW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Guid"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="DS_MANGLE_FOR"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["pszRDN", "cchRDN", "pGuid", "peDsMangleFor"]),
#
'DsCrackUnquotedMangledRdnA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Guid"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="DS_MANGLE_FOR"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["pszRDN", "cchRDN", "pGuid", "peDsMangleFor"]),
#
'DsIsMangledRdnValueW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="DS_MANGLE_FOR")], SimTypeInt(signed=True, label="Int32"), arg_names=["pszRdn", "cRdn", "eDsMangleForDesired"]),
#
'DsIsMangledRdnValueA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="DS_MANGLE_FOR")], SimTypeInt(signed=True, label="Int32"), arg_names=["pszRdn", "cRdn", "eDsMangleForDesired"]),
#
'DsIsMangledDnA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="DS_MANGLE_FOR")], SimTypeInt(signed=True, label="Int32"), arg_names=["pszDn", "eDsMangleFor"]),
#
'DsIsMangledDnW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="DS_MANGLE_FOR")], SimTypeInt(signed=True, label="Int32"), arg_names=["pszDn", "eDsMangleFor"]),
#
'DsCrackSpn2A': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pszSpn", "cSpn", "pcServiceClass", "ServiceClass", "pcServiceName", "ServiceName", "pcInstanceName", "InstanceName", "pInstancePort"]),
#
'DsCrackSpn2W': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pszSpn", "cSpn", "pcServiceClass", "ServiceClass", "pcServiceName", "ServiceName", "pcInstanceName", "InstanceName", "pInstancePort"]),
#
'DsCrackSpn3W': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pszSpn", "cSpn", "pcHostName", "HostName", "pcInstanceName", "InstanceName", "pPortNumber", "pcDomainName", "DomainName", "pcRealmName", "RealmName"]),
#
'DsCrackSpn4W': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pszSpn", "cSpn", "pcHostName", "HostName", "pcInstanceName", "InstanceName", "pcPortName", "PortName", "pcDomainName", "DomainName", "pcRealmName", "RealmName"]),
}
lib.set_prototypes(prototypes)

# ===== imdb.py =====
# (repo: vaishaligarg2015/Imdb_data_converter, license: MIT)
import re
import csv
import sys
import getopt
import os

def convert_actors():
    """Convert actors.list to actors.csv."""
    # newline='' keeps the csv module from inserting blank lines on Windows
    with open('actors.list', 'r', encoding='ISO-8859-1') as f_in, \
            open('actors.csv', 'w', newline='') as f_outfile:
        f_out = csv.writer(f_outfile)
        f_out.writerow(['Name', 'Title', 'Year', 'TV Series', 'Season No', 'Episode No', 'Character As'])
        found_position = False
        for line in f_in:
            line = line.rstrip()
            if not found_position:
                # skip the banner that precedes the data
                if line.lower() == 'the actors list':
                    for _ in range(4):
                        next(f_in)
                    found_position = True
            elif line:
                if line[0] not in (' ', '\t'):
                    # an unindented line starts a new actor entry
                    last_valid_name = None
                row = []
                # name, title, year
                m = re.search(r"(.*?)\t+(.*)\s\((\d+).*?\)", line)
                if m is None:
                    # a run of dashes terminates the list
                    if re.search(r"^-+$", line):
                        break
                else:
                    if m.group(1):
                        last_valid_name = m.group(1)
                    elif last_valid_name is None:
                        continue
                    row.extend([last_valid_name, m.group(2), m.group(3)])
                    # TV series name, season no and episode no
                    m1 = re.search(r"\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
                    if m1:
                        row.extend([m1.group(1), m1.group(2), m1.group(3)])
                    else:
                        row.extend(["", "", ""])
                    # character name, in square brackets
                    m2 = re.search(r"\[(.*)\]", line)
                    if m2:
                        row.append(m2.group(1))
                    else:
                        row.append("")
                    f_out.writerow(row)

def convert_costume_designers():
    """Convert costume-designers.list to costume-designers.csv."""
    with open('costume-designers.list', 'r', encoding='ISO-8859-1') as f_in, \
            open('costume-designers.csv', 'w', newline='') as f_outfile:
        f_out = csv.writer(f_outfile)
        f_out.writerow(['Name', 'Title', 'Year', 'TV Series', 'Season No', 'Episode No', 'Character As'])
        found_position = False
        for line in f_in:
            line = line.rstrip()
            if not found_position:
                if line.lower() == 'the costume designers list':
                    for _ in range(4):
                        next(f_in)
                    found_position = True
            elif line:
                if line[0] not in (' ', '\t'):
                    last_valid_name = None
                row = []
                # name, title, year
                m = re.search(r"(.*?)\t+(.*)\s\((\d+).*?\)", line)
                if m is None:
                    # a run of dashes terminates the list
                    if re.search(r"^-+$", line):
                        break
                else:
                    if m.group(1):
                        last_valid_name = m.group(1)
                    elif last_valid_name is None:
                        continue
                    row.extend([last_valid_name, m.group(2), m.group(3)])
                    # TV series name, season no and episode no
                    m1 = re.search(r"\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
                    if m1:
                        row.extend([m1.group(1), m1.group(2), m1.group(3)])
                    else:
                        row.extend(["", "", ""])
                    # alternate credit, e.g. "(as ...)"
                    m2 = re.search(r"\(as\s(.*)\)", line)
                    if m2:
                        row.append(m2.group(1))
                    else:
                        row.append("")
                    f_out.writerow(row)

def convert_actresses():
    """Convert actresses.list to actresses.csv."""
    with open('actresses.list', 'r', encoding='ISO-8859-1') as f_in, \
            open('actresses.csv', 'w', newline='') as f_outfile:
        f_out = csv.writer(f_outfile)
        f_out.writerow(['Name', 'Title', 'Year', 'TV Series', 'Season No', 'Episode No', 'Character As', 'Billed No'])
        found_position = False
        for line in f_in:
            line = line.rstrip()
            if not found_position:
                if line.lower() == 'the actresses list':
                    for _ in range(4):
                        next(f_in)
                    found_position = True
            elif line:
                if line[0] not in (' ', '\t'):
                    last_valid_name = None
                row = []
                # name, title, year
                m = re.search(r"(.*?)\t+(.*)\s\((\d+).*?\)", line)
                if m is None:
                    # a run of dashes terminates the list
                    if re.search(r"^-+$", line):
                        break
                else:
                    if m.group(1):
                        last_valid_name = m.group(1)
                    elif last_valid_name is None:
                        continue
                    row.extend([last_valid_name, m.group(2), m.group(3)])
                    # TV series name, season no and episode no
                    m1 = re.search(r"\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
                    if m1:
                        row.extend([m1.group(1), m1.group(2), m1.group(3)])
                    else:
                        row.extend(["", "", ""])
                    # character name, in square brackets
                    m2 = re.search(r"\[(.*)\]", line)
                    if m2:
                        row.append(m2.group(1))
                    else:
                        row.append("")
                    # billed number, e.g. <3>
                    m3 = re.search(r"\<(\d+)\>", line)
                    if m3:
                        row.append(m3.group(1))
                    else:
                        row.append("")
                    f_out.writerow(row)

def convert_certificates():
    """Convert certificates.list to certificates.csv."""
    with open('certificates.list', 'r', encoding='ISO-8859-1') as f_in, \
            open('certificates.csv', 'w', newline='') as f_outfile:
        f_out = csv.writer(f_outfile)
        f_out.writerow(['Title', 'Year', 'Country', 'Rating', 'TV Series', 'Season No', 'Episode No'])
        found_position = False
        for line in f_in:
            line = line.rstrip()
            if not found_position:
                if line.lower() == 'certificates list':
                    for _ in range(2):
                        next(f_in)
                    found_position = True
            elif line:
                row = []
                # title, year, country, rating
                m = re.search(r'(.*)\s\((\d+)\).*\t+(\w+)\:(\w+)', line)
                if m is None:
                    # a run of dashes terminates the list
                    if re.search(r"^-+$", line):
                        break
                else:
                    if m.group(1):
                        row.extend([m.group(1), m.group(2), m.group(3), m.group(4)])
                    # TV series name, season no and episode no
                    m1 = re.search(r"\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
                    if m1:
                        row.extend([m1.group(1), m1.group(2), m1.group(3)])
                    else:
                        row.extend(["", "", ""])
                    f_out.writerow(row)
def convert_cinematographers():
with open('cinematographers.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('cinematographers.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Name','Title','Year','TV Series','Season No','Episode No','Character As'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'the cinematographers list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
if line[0] not in [' ', '\t']:
last_valid_name = None
row = []
#name, title, year
m = re.search("(.*?)\t+(.*)\s\((\d+).*?\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
last_valid_name = m.group(1)
elif last_valid_name == None:
continue
row.extend([last_valid_name, m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
#character as in round bracket
m2 = re.search("\d+\).*\((.+)\)$", line)
if m2:
row.append(m2.group(1))
else:
row.append("")
f_out.writerow( row)
def convert_composers():
with open('composers.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('composers.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Name','Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'the composers list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
if line[0] not in [' ', '\t']:
last_valid_name = None
row = []
#name, title, year
m = re.search("(.*?)\t+(.*)\s\((\d+).*?\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
last_valid_name = m.group(1)
elif last_valid_name == None:
continue
row.extend([last_valid_name, m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_countries():
with open('countries.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('countries.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Country','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'countries list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country
m = re.search("(.*)\s\((\d+).*?\).*\t+(.+)$", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_directors():
with open('directors.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('directors.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Name','Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'the directors list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
if line[0] not in [' ', '\t']:
last_valid_name = None
row = []
#name, title, year
m = re.search("(.*?)\t+(.*)\s\((\d+).*?\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
last_valid_name = m.group(1)
elif last_valid_name == None:
continue
row.extend([last_valid_name, m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_distributors():
with open('distributors.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('distributors.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Distributor','Country','Media','TV Series','Season No','Episode No','Country','Media'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'distributors list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, distributor
m = re.search("(.*)\s\((\d+)\).*\t+(.+)\t+\(\d+\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
#country, media
#"#LawstinWoods" (2013) {The Arrival (#1.1)} Dailymotion [us] (2014) (worldwide) (video)
m2 = re.search(".*\s\(\d+\)\s.*\t+.*\(\d+\)\s+\((.*)\)\s\((.*)\)", line)
if m2:
row.extend([m2.group(1), m2.group(2)])
else:
row.extend(["", ""])
f_out.writerow( row)
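The distributors converter can be checked against the sample quoted in the comment above. Field separators are assumed to be tabs here (the raw .list files use tab runs between columns), so the sample is reconstructed with explicit `\t`:

```python
import re

line = '"#LawstinWoods" (2013) {The Arrival (#1.1)}\tDailymotion [us]\t(2014)  (worldwide) (video)'

# title, year, distributor
m = re.search(r"(.*)\s\((\d+)\).*\t+(.+)\t+\(\d+\)", line)
print(m.group(1), m.group(2), m.group(3))  # "#LawstinWoods" 2013 Dailymotion [us]

# country, media
m2 = re.search(r".*\s\(\d+\)\s.*\t+.*\(\d+\)\s+\((.*)\)\s\((.*)\)", line)
print(m2.group(1), m2.group(2))  # worldwide video
```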
def convert_editors():
with open('editors.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('editors.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Name','Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'the editors list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
if line[0] not in [' ', '\t']:
last_valid_name = None
row = []
#name, title, year
m = re.search("(.*?)\t+(.*)\s\((\d+).*?\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
last_valid_name = m.group(1)
elif last_valid_name == None:
continue
row.extend([last_valid_name, m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_language():
with open('language.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('language.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Language','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'language list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country
m = re.search("(.*)\s\((\d+).*?\).*\t+(.+)$", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_locations():
with open('locations.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('locations.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Location','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'locations list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country
m = re.search("(.*)\s\((\d+)\)\s.*\s.*\t+(.+)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_movies():
with open('movies.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('movies.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'movies list':
for i in range(2):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country
m = re.search("(.*)\s\((\d+)\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_producers():
with open('producers.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('producers.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Name','Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'the producers list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
if line[0] not in [' ', '\t']:
last_valid_name = None
row = []
#name, title, year
m = re.search("(.*?)\t+(.*)\s\((\d+).*?\)\s+", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
last_valid_name = m.group(1)
elif last_valid_name == None:
continue
row.extend([last_valid_name, m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_production_companies():
with open('production-companies.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('production-companies.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Production Company','Country','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'production companies list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country
m = re.search("(.*)\s\((\d+)\).*\t+(.*)\s\[(.*)\]$", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3), m.group(4)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_production_designers():
with open('production-designers.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('production-designers.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Name','Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'the production designers list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
if line[0] not in [' ', '\t']:
last_valid_name = None
row = []
#name, title, year
m = re.search("(.*?)\t+(.*)\s\((\d+).*?\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
last_valid_name = m.group(1)
elif last_valid_name == None:
continue
row.extend([last_valid_name, m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_ratings():
with open('ratings.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('ratings.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['New Distribution','Votes','Rank','Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'top 250 movies (25000+ votes)':
for i in range(15):
next(f_in)
foundposition = True
elif line:
row = []
#New distribution, votes, rank, title, year
m = re.search("\s+(.*)\s\s(.*)\s\s(.*)\s\s(.*)\s\((\d+)\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3), m.group(4), m.group(5)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_release_dates():
with open('release-dates.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('release-dates.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Country','Release Date','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'release dates list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country, date
m = re.search("(.*)\s\((\d+)\).*\t+(.*?)\:(\d+\s\w+\s\d+)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3), m.group(4)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_sound_mix():
with open('sound-mix.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('sound-mix.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Sound Mix','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'sound-mix list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country
m = re.search("(.*)\s\((\d+)\).*\t+(\w+)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_special_effects_companies():
with open('special-effects-companies.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('special-effects-companies.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Company','Country','Type of effect','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'special effects companies list':
for i in range(1):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, company, country, type of effect
m = re.search("(.*)\s\((\d+)\).*\t+(.+)\s\[(\w+)\]\s\((.+)\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3), m.group(4), m.group(5)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_technical():
with open('technical.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('technical.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Technical','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'technical list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, technical
m = re.search("(.*)\s\((\d+)\).*\t+(.*)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_writers():
with open('writers.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('writers.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Name','Title','Year','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == 'the writers list':
for i in range(4):
next(f_in)
foundposition = True
elif line:
if line[0] not in [' ', '\t']:
last_valid_name = None
row = []
#name, title, year, as
m = re.search("(.*?)\t+(.*)\s\((\d+).*?\)", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
last_valid_name = m.group(1)
elif last_valid_name == None:
continue
row.extend([last_valid_name, m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
def convert_genres():
with open('genres.list', 'r', encoding= 'ISO-8859-1') as f_in:
with open('genres.csv', 'w') as f_outfile:
f_out = csv.writer(f_outfile)
f_out.writerow(['Title','Year','Genre','TV Series','Season No','Episode No'])
foundposition = False
for line in f_in:
line = line.rstrip()
if foundposition == False:
if line.lower() == '8: the genres list':
for i in range(2):
next(f_in)
foundposition = True
elif line:
row = []
#title, year, country
m = re.search("(.*)\s\((\d+).*?\).*\t+(.+)$", line)
if m is None:
if re.search("^-+$", line):
break
else:
if m.group(1):
row.extend([m.group(1), m.group(2), m.group(3)])
#TV series name, season no and episode no
m1 = re.search("\{(.*?)\s?\(\#(\d+)\.(\d+)\)\}", line)
if m1:
row.extend([m1.group(1), m1.group(2), m1.group(3)])
else:
row.extend(["", "", ""])
f_out.writerow( row)
imdb_dict = {
'actors.list': convert_actors,
'costume-designers.list': convert_costume_designers,
'actresses.list' : convert_actresses,
'certificates.list' : convert_certificates,
'cinematographers.list' : convert_cinematographers,
'composers.list' : convert_composers,
'countries.list' : convert_countries,
'directors.list' : convert_directors,
'distributors.list' : convert_distributors,
'editors.list' : convert_editors,
'language.list' : convert_language,
'locations.list' : convert_locations,
'movies.list' : convert_movies,
'producers.list' : convert_producers,
'production-companies.list' : convert_production_companies,
'production-designers.list' : convert_production_designers,
'ratings.list' : convert_ratings,
'release-dates.list' : convert_release_dates,
'sound-mix.list' : convert_sound_mix,
'special-effects-companies.list' : convert_special_effects_companies,
'technical.list' : convert_technical,
'writers.list' : convert_writers,
'genres.list' : convert_genres
}
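Dispatch to the right converter is a plain dict lookup: raw .list filename in, converter function out. A minimal stand-alone sketch of the same pattern, with stub converters (hypothetical names, no file I/O):

```python
def convert_a():
    return 'wrote a.csv'

def convert_b():
    return 'wrote b.csv'

# filename -> converter, same shape as imdb_dict above
dispatch = {'a.list': convert_a, 'b.list': convert_b}

for name, fn in dispatch.items():
    print(name, '->', fn())
```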
def usage():
print('Usage:')
print('To process specific files:')
print(sys.argv[0] + ' [-d <basedir>] listfile1 listfile2 ...')
print('To process all files:')
print(sys.argv[0] + ' [-d <basedir>] -a')
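In `main()` below, `getopt.getopt(sys.argv[1:], 'had:')` splits options from positional file arguments: the option string `'had:'` declares `-h` and `-a` as flags while the trailing colon makes `-d` take a value. A stand-alone example with a fabricated argv:

```python
import getopt

optlist, filelist = getopt.getopt(['-d', '/data/imdb', '-a', 'movies.list'], 'had:')
print(optlist)   # [('-d', '/data/imdb'), ('-a', '')]
print(filelist)  # ['movies.list']
```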
def main():
imdb_directory = "."
convert_all_files = False
try:
optlist, filelist = getopt.getopt(sys.argv[1:], 'had:')
except getopt.GetoptError:
usage()
sys.exit(2)
for filename in filelist:
if filename not in imdb_dict:
print("Invalid file name")
usage()
sys.exit(2)
for opt, arg in optlist:
if opt == '-h':
usage()
sys.exit(2)
elif opt == '-a' and filelist == []:
convert_all_files = True
elif opt == "-d":
imdb_directory = arg
else:
usage()
sys.exit(2)
os.chdir(imdb_directory)
if convert_all_files:
for filename, fn in imdb_dict.items():
print( "Processing " + filename + "...", end="", flush=True)
fn()
print( " done")
else:
for filename in filelist:
fn = imdb_dict[filename]
print( "Processing " + filename + "...", end="", flush=True)
fn()
print( " done")
if __name__ == "__main__":
main()
ad0391ae1d053da9dc0a979bf42a74822dc82e33 | 14,644 | py | Python | src/test/scenarios/managed-network/output/src/managed-network/azext_managed_network/generated/custom.py | 00Kai0/autorest.az | 3aa71d108f013ee5257a5ca1b210e35b7146b45a | [
"MIT"
] | null | null | null | src/test/scenarios/managed-network/output/src/managed-network/azext_managed_network/generated/custom.py | 00Kai0/autorest.az | 3aa71d108f013ee5257a5ca1b210e35b7146b45a | [
"MIT"
] | null | null | null | src/test/scenarios/managed-network/output/src/managed-network/azext_managed_network/generated/custom.py | 00Kai0/autorest.az | 3aa71d108f013ee5257a5ca1b210e35b7146b45a | [
"MIT"
] | null | null | null |
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=too-many-lines
# pylint: disable=unused-argument
from azure.cli.core.util import sdk_no_wait
def managed_network_mn_list(client,
resource_group_name=None,
top=None,
skiptoken=None):
if resource_group_name:
return client.list_by_resource_group(resource_group_name=resource_group_name,
top=top,
skiptoken=skiptoken)
return client.list_by_subscription(top=top,
skiptoken=skiptoken)
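The list command above picks its scope from whether a resource group was supplied: with `--resource-group` it lists within that group, otherwise it falls back to the whole subscription. A sketch of the same branch with a stub client (hypothetical class, no Azure SDK involved):

```python
class StubClient:
    def list_by_resource_group(self, resource_group_name, top=None, skiptoken=None):
        return 'rg:' + resource_group_name

    def list_by_subscription(self, top=None, skiptoken=None):
        return 'subscription'

def mn_list(client, resource_group_name=None):
    if resource_group_name:
        return client.list_by_resource_group(resource_group_name=resource_group_name)
    return client.list_by_subscription()

print(mn_list(StubClient(), 'rg1'))  # rg:rg1
print(mn_list(StubClient()))         # subscription
```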
def managed_network_mn_create(client,
resource_group_name,
managed_network_name,
location,
tags=None,
properties=None):
return client.create_or_update(resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
location=location,
tags=tags,
properties=properties)
def managed_network_mn_update(client,
resource_group_name,
managed_network_name,
tags=None):
return client.begin_update(resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
tags=tags)
def managed_network_mn_delete(client,
resource_group_name,
managed_network_name):
return client.begin_delete(resource_group_name=resource_group_name,
managed_network_name=managed_network_name)
def managed_network_mn_get_modify(client,
resource_group_name,
managed_network_name):
return client.get_modify(resource_group_name=resource_group_name,
managed_network_name=managed_network_name)
def managed_network_mn_scope_assignment_list(client,
scope):
return client.list(scope=scope)
def managed_network_mn_scope_assignment_show(client,
scope,
scope_assignment_name):
return client.get(scope=scope,
scope_assignment_name=scope_assignment_name)
def managed_network_mn_scope_assignment_create(client,
scope,
scope_assignment_name,
location,
assigned_managed_network=None):
return client.create_or_update(scope=scope,
scope_assignment_name=scope_assignment_name,
location=location,
assigned_managed_network=assigned_managed_network)
def managed_network_mn_scope_assignment_update(client,
scope,
scope_assignment_name,
location,
assigned_managed_network=None):
return client.create_or_update(scope=scope,
scope_assignment_name=scope_assignment_name,
location=location,
assigned_managed_network=assigned_managed_network)
def managed_network_mn_scope_assignment_delete(client,
scope,
scope_assignment_name):
return client.delete(scope=scope,
scope_assignment_name=scope_assignment_name)
def managed_network_mn_group_list(client,
resource_group_name,
managed_network_name,
top=None,
skiptoken=None):
return client.list_by_managed_network(resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
top=top,
skiptoken=skiptoken)
def managed_network_mn_group_show(client,
resource_group_name,
managed_network_name,
group_name):
return client.get(resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_group_name=group_name)
def managed_network_mn_group_create(client,
resource_group_name,
managed_network_name,
group_name,
location,
management_groups=None,
subscriptions=None,
virtual_networks=None,
subnets=None,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_create_or_update,
resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_group_name=group_name,
location=location,
management_groups=management_groups,
subscriptions=subscriptions,
virtual_networks=virtual_networks,
subnets=subnets)
def managed_network_mn_group_update(client,
resource_group_name,
managed_network_name,
group_name,
location,
management_groups=None,
subscriptions=None,
virtual_networks=None,
subnets=None,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_create_or_update,
resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_group_name=group_name,
location=location,
management_groups=management_groups,
subscriptions=subscriptions,
virtual_networks=virtual_networks,
subnets=subnets)
def managed_network_mn_group_delete(client,
resource_group_name,
managed_network_name,
group_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_group_name=group_name)
def managed_network_managed_network_peering_policy_list(client,
resource_group_name,
managed_network_name,
top=None,
skiptoken=None):
return client.list_by_managed_network(resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
top=top,
skiptoken=skiptoken)
def managed_network_managed_network_peering_policy_show(client,
resource_group_name,
managed_network_name,
policy_name):
return client.get(resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_peering_policy_name=policy_name)
def managed_network_managed_network_peering_policy_hub_and_spoke_topology_create(client,
resource_group_name,
managed_network_name,
policy_name,
location,
hub=None,
spokes=None,
mesh=None,
no_wait=False):
properties = {}
properties['type'] = 'HubAndSpokeTopology'
properties['hub'] = hub
properties['spokes'] = spokes
properties['mesh'] = mesh
return sdk_no_wait(no_wait,
client.begin_create_or_update,
resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_peering_policy_name=policy_name,
location=location,
properties=properties)
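The two `*_topology_create` helpers build an identical payload and differ only in the `'type'` discriminator stamped into `properties`. That shared shape can be sketched as a plain dict builder (illustrative only, no Azure SDK involved):

```python
def build_properties(topology_type, hub=None, spokes=None, mesh=None):
    # 'type' is the polymorphic discriminator the service uses to pick the topology
    return {'type': topology_type, 'hub': hub, 'spokes': spokes, 'mesh': mesh}

print(build_properties('HubAndSpokeTopology', hub='hub-vnet')['type'])  # HubAndSpokeTopology
```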
def managed_network_managed_network_peering_policy_mesh_topology_create(client,
resource_group_name,
managed_network_name,
policy_name,
location,
hub=None,
spokes=None,
mesh=None,
no_wait=False):
properties = {}
properties['type'] = 'MeshTopology'
properties['hub'] = hub
properties['spokes'] = spokes
properties['mesh'] = mesh
return sdk_no_wait(no_wait,
client.begin_create_or_update,
resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_peering_policy_name=policy_name,
location=location,
properties=properties)
def managed_network_managed_network_peering_policy_hub_and_spoke_topology_update(instance,
resource_group_name,
managed_network_name,
policy_name,
location,
hub=None,
spokes=None,
mesh=None,
no_wait=False):
instance.type = 'HubAndSpokeTopology'
if hub is not None:
instance.hub = hub
if spokes is not None:
instance.spokes = spokes
if mesh is not None:
instance.mesh = mesh
return instance
def managed_network_managed_network_peering_policy_mesh_topology_update(instance,
resource_group_name,
managed_network_name,
policy_name,
location,
hub=None,
spokes=None,
mesh=None,
no_wait=False):
instance.type = 'MeshTopology'
if hub is not None:
instance.hub = hub
if spokes is not None:
instance.spokes = spokes
if mesh is not None:
instance.mesh = mesh
return instance
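The two instance-based `*_update` helpers above follow the generic-update pattern: the caller fetches the existing resource, hands it in as `instance`, only the explicitly provided fields are patched, and the mutated object is returned for the subsequent PUT. A minimal stand-alone sketch of that patch-only-if-given logic (hypothetical `Policy` class):

```python
class Policy:
    def __init__(self):
        self.type = None
        self.hub = 'old-hub'
        self.spokes = ['old-spoke']

def patch(instance, hub=None, spokes=None):
    instance.type = 'MeshTopology'
    if hub is not None:          # leave untouched fields as they were fetched
        instance.hub = hub
    if spokes is not None:
        instance.spokes = spokes
    return instance

p = patch(Policy(), spokes=['vnet-1'])
print(p.type, p.hub, p.spokes)  # MeshTopology old-hub ['vnet-1']
```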
def managed_network_managed_network_peering_policy_delete(client,
resource_group_name,
managed_network_name,
policy_name,
no_wait=False):
return sdk_no_wait(no_wait,
client.begin_delete,
resource_group_name=resource_group_name,
managed_network_name=managed_network_name,
managed_network_peering_policy_name=policy_name)
| 49.640678 | 102 | 0.428981 | 1,034 | 14,644 | 5.635397 | 0.096712 | 0.213832 | 0.160632 | 0.166123 | 0.883645 | 0.863566 | 0.847091 | 0.822893 | 0.787884 | 0.737429 | 0 | 0 | 0.525335 | 14,644 | 294 | 103 | 49.809524 | 0.838297 | 0.03428 | 0 | 0.806723 | 0 | 0 | 0.006794 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.092437 | false | 0 | 0.004202 | 0.071429 | 0.193277 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
ad03d4b4858c86d7ebc6e8f8a2585e162d8e2716 | 6,160 | py | Python | tests/integration/api/v2010/account/call/test_payment.py | BrimmingDev/twilio-python | 3226b5fed92b3c2ce64f03e6b19fc4792ef7647f | [
"MIT"
] | 1,362 | 2015-01-04T10:25:18.000Z | 2022-03-24T10:07:08.000Z | tests/integration/api/v2010/account/call/test_payment.py | BrimmingDev/twilio-python | 3226b5fed92b3c2ce64f03e6b19fc4792ef7647f | [
"MIT"
] | 299 | 2015-01-30T09:52:39.000Z | 2022-03-31T23:03:02.000Z | tests/integration/api/v2010/account/call/test_payment.py | BrimmingDev/twilio-python | 3226b5fed92b3c2ce64f03e6b19fc4792ef7647f | [
"MIT"
] | 622 | 2015-01-03T04:43:09.000Z | 2022-03-29T14:11:00.000Z |
# coding=utf-8
r"""
This code was generated by
\ / _ _ _| _ _
| (_)\/(_)(_|\/| |(/_ v1.0.0
/ /
"""
from tests import IntegrationTestCase
from tests.holodeck import Request
from twilio.base.exceptions import TwilioException
from twilio.http.response import Response
class PaymentTestCase(IntegrationTestCase):
def test_create_request(self):
self.holodeck.mock(Response(500, ''))
with self.assertRaises(TwilioException):
self.client.api.v2010.accounts("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.calls("CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.payments.create(idempotency_key="idempotency_key", status_callback="https://example.com")
values = {'IdempotencyKey': "idempotency_key", 'StatusCallback': "https://example.com", }
self.holodeck.assert_has_request(Request(
'post',
'https://api.twilio.com/2010-04-01/Accounts/ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/Calls/CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/Payments.json',
data=values,
))
def test_start_payment_session_success_response(self):
self.holodeck.mock(Response(
201,
'''
{
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"date_created": "Wed, 18 Dec 2019 20:02:01 +0000",
"date_updated": "Wed, 18 Dec 2019 20:02:01 +0000",
"sid": "PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"uri": "/2010-04-01/Accounts/ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Calls/CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Payments/PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa.json"
}
'''
))
actual = self.client.api.v2010.accounts("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.calls("CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.payments.create(idempotency_key="idempotency_key", status_callback="https://example.com")
self.assertIsNotNone(actual)
def test_update_request(self):
self.holodeck.mock(Response(500, ''))
with self.assertRaises(TwilioException):
self.client.api.v2010.accounts("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.calls("CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.payments("PKXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX").update(idempotency_key="idempotency_key", status_callback="https://example.com")
values = {'IdempotencyKey': "idempotency_key", 'StatusCallback': "https://example.com", }
self.holodeck.assert_has_request(Request(
'post',
'https://api.twilio.com/2010-04-01/Accounts/ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/Calls/CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/Payments/PKXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX.json',
data=values,
))
def test_collect_credit_card_number_response(self):
self.holodeck.mock(Response(
200,
'''
{
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"date_created": "Wed, 18 Dec 2019 20:02:01 +0000",
"date_updated": "Wed, 18 Dec 2019 20:02:01 +0000",
"sid": "PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"uri": "/2010-04-01/Accounts/ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Calls/CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Payments/PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa.json"
}
'''
))
actual = self.client.api.v2010.accounts("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.calls("CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.payments("PKXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX").update(idempotency_key="idempotency_key", status_callback="https://example.com")
self.assertIsNotNone(actual)
def test_collect_credit_card_expiry_date_response(self):
self.holodeck.mock(Response(
200,
'''
{
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"date_created": "Wed, 18 Dec 2019 20:02:01 +0000",
"date_updated": "Wed, 18 Dec 2019 20:02:01 +0000",
"sid": "PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"uri": "/2010-04-01/Accounts/ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Calls/CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Payments/PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa.json"
}
'''
))
actual = self.client.api.v2010.accounts("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.calls("CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.payments("PKXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX").update(idempotency_key="idempotency_key", status_callback="https://example.com")
self.assertIsNotNone(actual)
def test_complete_payment_response(self):
self.holodeck.mock(Response(
200,
'''
{
"account_sid": "ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"call_sid": "CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"date_created": "Wed, 18 Dec 2019 20:02:01 +0000",
"date_updated": "Wed, 18 Dec 2019 20:02:01 +0000",
"sid": "PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
"uri": "/2010-04-01/Accounts/ACaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Calls/CAaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa/Payments/PKaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa.json"
}
'''
))
actual = self.client.api.v2010.accounts("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.calls("CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") \
.payments("PKXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX").update(idempotency_key="idempotency_key", status_callback="https://example.com")
self.assertIsNotNone(actual)
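The tests above all follow the same holodeck pattern: queue a canned `Response`, invoke the client, then assert on the captured `Request`. A minimal standalone sketch of such a request recorder (hypothetical names, not the real holodeck class):

```python
class RecordingTransport:
    # Minimal stand-in for the holodeck used above: it remembers every request
    # it is asked to perform and replays a queued canned response.
    def __init__(self):
        self.requests = []
        self.responses = []

    def mock(self, response):
        # Queue a canned response for the next request.
        self.responses.append(response)

    def request(self, method, url, data=None):
        # Record the request and pop the next queued response.
        self.requests.append((method.lower(), url, data))
        return self.responses.pop(0)

    def assert_has_request(self, method, url):
        # Mirror of assert_has_request: fail unless a matching request was seen.
        assert any(r[0] == method and r[1] == url for r in self.requests), \
            "no %s request to %s was recorded" % (method, url)
```

This is only a sketch of the test double's contract; the real holodeck also matches on form data and headers.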
# coding: utf-8
"""
Unity Cloud Build
This API is intended to be used in conjunction with the Unity Cloud Build service. A tool for building your Unity projects in the Cloud. See https://developer.cloud.unity3d.com for more information. ## Making requests This website is built to allow requests to be made against the API. If you are currently logged into Cloud Build you should be able to make requests without entering an API key. You can find your API key in the Unity Cloud Services portal by clicking on 'Cloud Build Preferences' in the sidebar. Copy the API Key and paste it into the upper left corner of this website. It will be used in all subsequent requests. ## Clients The Unity Cloud Build API is based upon Swagger. Client libraries to integrate with your projects can easily be generated with the [Swagger Code Generator](https://github.com/swagger-api/swagger-codegen). The JSON schema required to generate a client for this API version is located here: ``` [API_URL][BASE_PATH]/api.json ``` ## Authorization The Unity Cloud Build API requires an access token from your Unity Cloud Build account, which can be found at https://build.cloud.unity3d.com/login/me To authenticate requests, include a Basic Authentication header with your API key as the value. e.g. ``` Authorization: Basic [YOUR API KEY] ``` ## Pagination Paged results will take two parameters. A page number that is calculated based upon the per_page amount. For instance if there are 40 results and you specify page 2 with per_page set to 10 you will receive records 11-20. Paged results will also return a Content-Range header. For the example above the content range header would look like this: ``` Content-Range: items 11-20/40 ``` ## Versioning The API version is indicated in the request URL. Upgrading to a newer API version can be done by changing the path. 
The API will receive a new version in the following cases: * removal of a path or request type * addition of a required field * removal of a required field The following changes are considered backwards compatible and will not trigger a new API version: * addition of an endpoint or request type * addition of an optional field * removal of an optional field * changes to the format of ids ## Identifiers It should not be assumed that any of the identifiers used in paths will be a perfect match for your user-entered information. If you see unexpected 403s or 404s from API calls then check your identifiers match the ones used by the API. In particular, `projectId` does NOT typically change when the project is renamed and in fact may not be a direct match for the project name even at initial creation time. To avoid confusion we recommend that instead of using the human-readable autogenerated orgId and projectId available from the API you should instead use: * org foreign key for `orgId` (available from project APIs as `orgFk` and org APIs as `coreForeignKey`) * `guid` for `projectId` All links generated by the API and the Dashboard should follow this format already, making it easy to figure out the correct parameters by making a comparison. ## Rate Limiting Requests against the Cloud Build API are limited to a rate of 100 per minute. To preserve the quality of service throughout Cloud Build, additional rate limits may apply to some actions. For example, polling aggressively instead of using webhooks or making API calls with a high concurrency may result in rate limiting. It is not intended for these rate limits to interfere with any legitimate use of the API. Please contact support at <cloudbuild@unity3d.com> if your use is affected by this rate limit. You can check the returned HTTP headers for any API request to see your current rate limit status. 
* __X-RateLimit-Limit:__ maximum number of requests per minute * __X-RateLimit-Remaining:__ remaining number of requests in the current window * __X-RateLimit-Reset:__ time at which the current window will reset (UTC epoch seconds) Once you go over the rate limit you will receive an error response: ``` HTTP Status: 429 { \"error\": \"Rate limit exceeded, retry in XX seconds\" } ``` # noqa: E501
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from ..api_client import ApiClient
class BuildtargetsApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def add_build_target(self, orgid, projectid, options, **kwargs): # noqa: E501
"""Create build target for a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_build_target(orgid, projectid, options, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param Options6 options: Options for build target create/update (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_build_target_with_http_info(orgid, projectid, options, **kwargs) # noqa: E501
else:
(data) = self.add_build_target_with_http_info(orgid, projectid, options, **kwargs) # noqa: E501
return data
def add_build_target_with_http_info(self, orgid, projectid, options, **kwargs): # noqa: E501
"""Create build target for a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_build_target_with_http_info(orgid, projectid, options, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param Options6 options: Options for build target create/update (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'options'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_build_target" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `add_build_target`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `add_build_target`") # noqa: E501
# verify the required parameter 'options' is set
if ('options' not in params or
params['options'] is None):
raise ValueError("Missing the required parameter `options` when calling `add_build_target`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'options' in params:
body_params = params['options']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
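Every `*_with_http_info` method in this class repeats the same loop over `params['kwargs']` to reject unknown keyword arguments. A standalone sketch of that validation pattern (with a hypothetical `validate_kwargs` name) is:

```python
def validate_kwargs(kwargs, allowed):
    # Mirror of the generated pattern above: copy recognized keyword arguments
    # into a params dict, raising TypeError on anything unexpected.
    params = {}
    for key, val in kwargs.items():
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key
            )
        params[key] = val
    return params
```

The generated code inlines this per method (via `locals()` and an `all_params` list) rather than sharing a helper, which keeps each method self-contained at the cost of repetition.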
def delete_build_target(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Delete build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_build_target(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
else:
(data) = self.delete_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
return data
def delete_build_target_with_http_info(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Delete build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_build_target_with_http_info(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'buildtargetid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_build_target" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `delete_build_target`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `delete_build_target`") # noqa: E501
# verify the required parameter 'buildtargetid' is set
if ('buildtargetid' not in params or
params['buildtargetid'] is None):
raise ValueError("Missing the required parameter `buildtargetid` when calling `delete_build_target`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
if 'buildtargetid' in params:
path_params['buildtargetid'] = params['buildtargetid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets/{buildtargetid}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_build_target(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Get a build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_build_target(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
else:
(data) = self.get_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
return data
def get_build_target_with_http_info(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Get a build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_build_target_with_http_info(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'buildtargetid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_build_target" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `get_build_target`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `get_build_target`") # noqa: E501
# verify the required parameter 'buildtargetid' is set
if ('buildtargetid' not in params or
params['buildtargetid'] is None):
raise ValueError("Missing the required parameter `buildtargetid` when calling `get_build_target`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
if 'buildtargetid' in params:
path_params['buildtargetid'] = params['buildtargetid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets/{buildtargetid}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_build_targets(self, orgid, projectid, **kwargs): # noqa: E501
"""List all build targets for a project # noqa: E501
Gets all configured build targets for a project, regardless of whether they are enabled. Add \"?include=settings,credentials\" as a query parameter to include the build target settings and credentials with the response. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_build_targets(orgid, projectid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str include: Extra fields to include in the response
:param bool include_last_success: Include last successful build
:return: list[InlineResponse2007]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_build_targets_with_http_info(orgid, projectid, **kwargs) # noqa: E501
else:
(data) = self.get_build_targets_with_http_info(orgid, projectid, **kwargs) # noqa: E501
return data
def get_build_targets_with_http_info(self, orgid, projectid, **kwargs): # noqa: E501
"""List all build targets for a project # noqa: E501
Gets all configured build targets for a project, regardless of whether they are enabled. Add \"?include=settings,credentials\" as a query parameter to include the build target settings and credentials with the response. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_build_targets_with_http_info(orgid, projectid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str include: Extra fields to include in the response
:param bool include_last_success: Include last successful build
:return: list[InlineResponse2007]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'include', 'include_last_success'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_build_targets" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `get_build_targets`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `get_build_targets`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
query_params = []
if 'include' in params:
query_params.append(('include', params['include'])) # noqa: E501
if 'include_last_success' in params:
query_params.append(('include_last_success', params['include_last_success'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[InlineResponse2007]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
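The optional `include` and `include_last_success` parameters above are collected as `(name, value)` tuples and later serialized into the request's query string by the shared `ApiClient`. A hedged sketch of that serialization step, using only the standard library (the real client's encoding may differ in detail):

```python
try:
    from urllib.parse import urlencode  # Python 3
except ImportError:
    from urllib import urlencode  # Python 2, matching the six-based client

# Assemble query params the same way get_build_targets_with_http_info does.
query_params = []
include = "settings,credentials"
include_last_success = True
if include is not None:
    query_params.append(("include", include))
if include_last_success is not None:
    query_params.append(("include_last_success", include_last_success))

query_string = urlencode(query_params)
```

With both parameters set, this yields a query string equivalent to the `?include=settings,credentials` form mentioned in the docstring (with the comma percent-encoded as `%2C`).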
def get_build_targets_for_org(self, orgid, **kwargs): # noqa: E501
"""List all build targets for an org # noqa: E501
Gets all configured build targets for an org, regardless of whether they are enabled. Add \"?include=settings,credentials\" as a query parameter to include the build target settings and credentials with the response. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_build_targets_for_org(orgid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str include: Extra fields to include in the response
:param bool include_last_success: Include last successful build
:return: list[InlineResponse2007]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_build_targets_for_org_with_http_info(orgid, **kwargs) # noqa: E501
else:
(data) = self.get_build_targets_for_org_with_http_info(orgid, **kwargs) # noqa: E501
return data
def get_build_targets_for_org_with_http_info(self, orgid, **kwargs): # noqa: E501
"""List all build targets for an org # noqa: E501
Gets all configured build targets for an org, regardless of whether they are enabled. Add \"?include=settings,credentials\" as a query parameter to include the build target settings and credentials with the response. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_build_targets_for_org_with_http_info(orgid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str include: Extra fields to include in the response
:param bool include_last_success: Include last successful build
:return: list[InlineResponse2007]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'include', 'include_last_success'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_build_targets_for_org" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `get_build_targets_for_org`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
query_params = []
if 'include' in params:
query_params.append(('include', params['include'])) # noqa: E501
if 'include_last_success' in params:
query_params.append(('include_last_success', params['include_last_success'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/buildtargets', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[InlineResponse2007]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_env_variables_for_build_target(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Get environment variables # noqa: E501
Get all configured environment variables for a given build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_env_variables_for_build_target(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:return: dict(str, str)
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_env_variables_for_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
else:
(data) = self.get_env_variables_for_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
return data
def get_env_variables_for_build_target_with_http_info(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Get environment variables # noqa: E501
Get all configured environment variables for a given build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_env_variables_for_build_target_with_http_info(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:return: dict(str, str)
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'buildtargetid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_env_variables_for_build_target" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `get_env_variables_for_build_target`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `get_env_variables_for_build_target`") # noqa: E501
# verify the required parameter 'buildtargetid' is set
if ('buildtargetid' not in params or
params['buildtargetid'] is None):
raise ValueError("Missing the required parameter `buildtargetid` when calling `get_env_variables_for_build_target`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
if 'buildtargetid' in params:
path_params['buildtargetid'] = params['buildtargetid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets/{buildtargetid}/envvars', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='dict(str, str)', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_stats_for_build_target(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Get build target statistics # noqa: E501
Get statistics for the specified build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_stats_for_build_target(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param bool async_req: Whether to execute the request asynchronously
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:param str build_status: Query only for builds with a specific status
:param bool clean_build: Query for builds that were built clean or that used caches
:param float limit: Only get stats for the last 'limit' builds
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_stats_for_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
else:
data = self.get_stats_for_build_target_with_http_info(orgid, projectid, buildtargetid, **kwargs) # noqa: E501
return data
def get_stats_for_build_target_with_http_info(self, orgid, projectid, buildtargetid, **kwargs): # noqa: E501
"""Get build target statistics # noqa: E501
Get statistics for the specified build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_stats_for_build_target_with_http_info(orgid, projectid, buildtargetid, async_req=True)
>>> result = thread.get()
:param bool async_req: Whether to execute the request asynchronously
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:param str build_status: Query only for builds with a specific status
:param bool clean_build: Query for builds that were built clean or that used caches
:param float limit: Only get stats for the last 'limit' builds
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'buildtargetid', 'build_status', 'clean_build', 'limit'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_stats_for_build_target" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `get_stats_for_build_target`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `get_stats_for_build_target`") # noqa: E501
# verify the required parameter 'buildtargetid' is set
if ('buildtargetid' not in params or
params['buildtargetid'] is None):
raise ValueError("Missing the required parameter `buildtargetid` when calling `get_stats_for_build_target`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
if 'buildtargetid' in params:
path_params['buildtargetid'] = params['buildtargetid'] # noqa: E501
query_params = []
if 'build_status' in params:
query_params.append(('buildStatus', params['build_status'])) # noqa: E501
if 'clean_build' in params:
query_params.append(('cleanBuild', params['clean_build'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets/{buildtargetid}/stats', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
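The optional filters above are translated into camelCase query parameters before the request is made. A minimal sketch of that mapping (the helper name `build_stats_query` is hypothetical; the generated method inlines this logic):

```python
# Hypothetical helper mirroring how the generated method assembles
# query parameters from the optional keyword arguments.
def build_stats_query(build_status=None, clean_build=None, limit=None):
    query_params = []
    if build_status is not None:
        query_params.append(('buildStatus', build_status))
    if clean_build is not None:
        query_params.append(('cleanBuild', clean_build))
    if limit is not None:
        query_params.append(('limit', limit))
    return query_params

print(build_stats_query(build_status='success', limit=25))
# -> [('buildStatus', 'success'), ('limit', 25)]
```

Parameters that are omitted simply do not appear in the query string, matching the `if '...' in params` guards above.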
def set_env_variables_for_build_target(self, orgid, projectid, buildtargetid, envvars, **kwargs): # noqa: E501
"""Set environment variables # noqa: E501
Set all configured environment variables for a given build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.set_env_variables_for_build_target(orgid, projectid, buildtargetid, envvars, async_req=True)
>>> result = thread.get()
:param bool async_req: Whether to execute the request asynchronously
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:param object envvars: Environment variables (required)
:return: dict(str, str)
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.set_env_variables_for_build_target_with_http_info(orgid, projectid, buildtargetid, envvars, **kwargs) # noqa: E501
else:
data = self.set_env_variables_for_build_target_with_http_info(orgid, projectid, buildtargetid, envvars, **kwargs) # noqa: E501
return data
def set_env_variables_for_build_target_with_http_info(self, orgid, projectid, buildtargetid, envvars, **kwargs): # noqa: E501
"""Set environment variables # noqa: E501
Set all configured environment variables for a given build target # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.set_env_variables_for_build_target_with_http_info(orgid, projectid, buildtargetid, envvars, async_req=True)
>>> result = thread.get()
:param bool async_req: Whether to execute the request asynchronously
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:param object envvars: Environment variables (required)
:return: dict(str, str)
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'buildtargetid', 'envvars'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method set_env_variables_for_build_target" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `set_env_variables_for_build_target`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `set_env_variables_for_build_target`") # noqa: E501
# verify the required parameter 'buildtargetid' is set
if ('buildtargetid' not in params or
params['buildtargetid'] is None):
raise ValueError("Missing the required parameter `buildtargetid` when calling `set_env_variables_for_build_target`") # noqa: E501
# verify the required parameter 'envvars' is set
if ('envvars' not in params or
params['envvars'] is None):
raise ValueError("Missing the required parameter `envvars` when calling `set_env_variables_for_build_target`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
if 'buildtargetid' in params:
path_params['buildtargetid'] = params['buildtargetid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'envvars' in params:
body_params = params['envvars']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets/{buildtargetid}/envvars', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='dict(str, str)', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
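Every public wrapper in this client follows the same sync/async dispatch: with `async_req=True` the `*_with_http_info` call returns a thread-like handle whose `.get()` yields the data; otherwise the data is returned directly. A rough, self-contained sketch of that pattern (the names `_Handle` and `call` are invented for illustration and are not part of the generated client):

```python
from concurrent.futures import ThreadPoolExecutor

class _Handle:
    """Minimal stand-in for the thread object returned when async_req=True."""
    def __init__(self, future):
        self._future = future

    def get(self):
        # Block until the background call finishes, then return its data
        return self._future.result()

_pool = ThreadPoolExecutor(max_workers=2)

def call(fn, *args, async_req=False, **kwargs):
    # async: hand back a handle; sync: run inline and return the data
    if async_req:
        return _Handle(_pool.submit(fn, *args, **kwargs))
    return fn(*args, **kwargs)

fetch_envvars = lambda: {'KEY': 'value'}          # placeholder for the HTTP call
print(call(fetch_envvars))                        # -> {'KEY': 'value'}
print(call(fetch_envvars, async_req=True).get())  # -> {'KEY': 'value'}
```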
def update_build_target(self, orgid, projectid, buildtargetid, options, **kwargs): # noqa: E501
"""Update build target details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_build_target(orgid, projectid, buildtargetid, options, async_req=True)
>>> result = thread.get()
:param bool async_req: Whether to execute the request asynchronously
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:param Options7 options: Options for build target create/update (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_build_target_with_http_info(orgid, projectid, buildtargetid, options, **kwargs) # noqa: E501
else:
data = self.update_build_target_with_http_info(orgid, projectid, buildtargetid, options, **kwargs) # noqa: E501
return data
def update_build_target_with_http_info(self, orgid, projectid, buildtargetid, options, **kwargs): # noqa: E501
"""Update build target details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_build_target_with_http_info(orgid, projectid, buildtargetid, options, async_req=True)
>>> result = thread.get()
:param bool async_req: Whether to execute the request asynchronously
:param str orgid: Organization identifier (required)
:param str projectid: Project identifier (required)
:param str buildtargetid: unique id auto-generated from the build target name (required)
:param Options7 options: Options for build target create/update (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['orgid', 'projectid', 'buildtargetid', 'options'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_build_target" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'orgid' is set
if ('orgid' not in params or
params['orgid'] is None):
raise ValueError("Missing the required parameter `orgid` when calling `update_build_target`") # noqa: E501
# verify the required parameter 'projectid' is set
if ('projectid' not in params or
params['projectid'] is None):
raise ValueError("Missing the required parameter `projectid` when calling `update_build_target`") # noqa: E501
# verify the required parameter 'buildtargetid' is set
if ('buildtargetid' not in params or
params['buildtargetid'] is None):
raise ValueError("Missing the required parameter `buildtargetid` when calling `update_build_target`") # noqa: E501
# verify the required parameter 'options' is set
if ('options' not in params or
params['options'] is None):
raise ValueError("Missing the required parameter `options` when calling `update_build_target`") # noqa: E501
collection_formats = {}
path_params = {}
if 'orgid' in params:
path_params['orgid'] = params['orgid'] # noqa: E501
if 'projectid' in params:
path_params['projectid'] = params['projectid'] # noqa: E501
if 'buildtargetid' in params:
path_params['buildtargetid'] = params['buildtargetid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'options' in params:
body_params = params['options']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/plain', 'text/html', 'text/csv']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['apikey', 'permissions'] # noqa: E501
return self.api_client.call_api(
'/orgs/{orgid}/projects/{projectid}/buildtargets/{buildtargetid}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
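The repeated "verify the required parameter" blocks above all follow one shape: fail with a `ValueError` naming the first required parameter that is absent or `None`. A hedged refactoring sketch (`check_required` is not part of the generated client):

```python
def check_required(params, required, method):
    # Raise the same ValueError the generated blocks raise, for the
    # first required parameter that is absent or None.
    for name in required:
        if params.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s` when calling `%s`"
                % (name, method))

params = {'orgid': 'org', 'projectid': None, 'buildtargetid': 'bt'}
try:
    check_required(params, ['orgid', 'projectid', 'buildtargetid'],
                   'update_build_target')
except ValueError as err:
    print(err)
    # -> Missing the required parameter `projectid` when calling `update_build_target`
```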
# File: tools/database_backends/__init__.py (repo: sjdv1982/seamless, license: MIT)
] | 6 | 2016-06-21T11:19:22.000Z | 2019-01-21T13:45:39.000Z | from . import redis
from . import flatfile
# File: solutions/0812-rotate-string/rotate-string.py (repo: iFun/Project-G, license: MIT)
] | null | null | null | # We are given two strings, A and B.
#
# A shift on A consists of taking string A and moving the leftmost character to the rightmost position. For example, if A = 'abcde', then it will be 'bcdea' after one shift on A. Return True if and only if A can become B after some number of shifts on A.
#
#
# Example 1:
# Input: A = 'abcde', B = 'cdeab'
# Output: true
#
# Example 2:
# Input: A = 'abcde', B = 'abced'
# Output: false
#
#
# Note:
#
#
# A and B will have length at most 100.
#
#
#
# @lc app=leetcode id=796 lang=python3
#
# [796] Rotate String
#
# https://leetcode.com/problems/rotate-string/description/
#
# algorithms
# Easy (49.23%)
# Likes: 445
# Dislikes: 39
# Total Accepted: 46.5K
# Total Submissions: 94.4K
# Testcase Example: '"abcde"\n"cdeab"'
#
class Solution:
def rotateString(self, A: str, B: str) -> bool:
return len(A) == len(B) and A in B + B
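The one-liner relies on the doubling trick: B is a rotation of A exactly when the two strings have equal length and one occurs inside the other doubled. A standalone restatement for quick checking:

```python
def is_rotation(a: str, b: str) -> bool:
    # b is a rotation of a iff the lengths match and b appears in a doubled
    return len(a) == len(b) and b in a + a

print(is_rotation('abcde', 'cdeab'))  # -> True
print(is_rotation('abcde', 'abced'))  # -> False
```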
# File: tests/test_NoConcatenation.py (repo: SLINGhub/MSOrganiser, license: MIT)
] | null | null | null | import unittest
from unittest.mock import patch
import os
import pandas as pd
import openpyxl
from MSOrganiser import no_concatenate_workflow
from MSDataOutput import MSDataOutput
WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate_multipleISTD", "WideTableFormRow1.csv")
WIDETABLEFORMROW2_MULTIPLEISTD_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate_multipleISTD", "WideTableFormRow2.csv")
WIDETABLEFORMROW_MULTIPLEISTD_ANNOTATION = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate_multipleISTD", "WideTableFormRow_Annotation.xlsx")
WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate_multipleISTD", "WideTableFormRow1_Results.xlsx")
WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_TRANSPOSE_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate_multipleISTD", "WideTableFormRow1_TransposeResults.xlsx")
WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_LONGTABLE_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate_multipleISTD", "WideTableFormRow1_LongTable.xlsx")
WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_LONGTABLE_WITH_ANNOT_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate_multipleISTD",
"WideTableFormRow1_LongTable_with_Annot.xlsx")
WIDETABLEFORMROW1_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate", "WideTableFormRow1.csv")
WIDETABLEFORMROW2_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate", "WideTableFormRow2.csv")
WIDETABLEFORMROW_ANNOTATION = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate", "WideTableFormRow_Annotation.xlsx")
WIDETABLEFORMROW1_RESULTS_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate", "WideTableFormRow1_Results.xlsx")
WIDETABLEFORMROW1_RESULTS_TRANSPOSE_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate", "WideTableFormRow1_TransposeResults.xlsx")
WIDETABLEFORMROW1_RESULTS_LONGTABLE_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate", "WideTableFormRow1_LongTable.xlsx")
WIDETABLEFORMROW1_RESULTS_LONGTABLE_WITH_ANNOT_FILENAME = os.path.join(os.path.dirname(__file__),"testdata",
"test_no_concatenate", "WideTableFormRow1_LongTable_with_Annot.xlsx")
class NoConcatenation_Test(unittest.TestCase):
# See https://realpython.com/lessons/mocking-print-unit-tests/
# for more details on mock
def setUp(self):
# Replace the print function in MSCalculate.py file to a mock
self.patcher = patch('MSCalculate.print')
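The patcher above swaps out `print` inside `MSCalculate` so each test can assert on what the workflow printed. The mechanics, shown against `builtins.print` in a minimal, self-contained illustration:

```python
from unittest.mock import patch

with patch('builtins.print') as mock_print:
    print('hello', flush=True)          # routed to the mock, not stdout
    mock_print.assert_called_with('hello', flush=True)
    call_count = mock_print.call_count  # -> 1
```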
def test_no_concatenate(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_FILENAME, WIDETABLEFORMROW2_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': False,
'Allow_Multiple_ISTD': False,
'Long_Table': False,
'Long_Table_Annot': False,
'Testing': False
}
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
for sheet_name in sheet_names:
if sheet_name != "Long_Table":
data_index = sheet_names.index(sheet_name)
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_RESULTS_FILENAME,
sheet_name = sheet_name,
allow_multiple_istd = False)
self.__compare_df(file_data[data_index],ExcelData_df)
def test_no_concatenate_LongTable(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation
* Create a Long Table without Annotation
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_FILENAME, WIDETABLEFORMROW2_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': False,
'Allow_Multiple_ISTD': False,
'Long_Table': True,
'Long_Table_Annot': False,
'Testing': False
}
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
data_index = sheet_names.index("Long_Table")
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_RESULTS_LONGTABLE_FILENAME,
sheet_name = "Long_Table",
allow_multiple_istd = True)
ExcelData_df = ExcelData_df.fillna('')
self.__compare_df(file_data[data_index],ExcelData_df)
def test_no_concatenate_LongTable_with_Annot(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation
* Create a Long Table with Annotation
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_FILENAME, WIDETABLEFORMROW2_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': False,
'Allow_Multiple_ISTD': False,
'Long_Table': True,
'Long_Table_Annot': True,
'Testing': False
}
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
data_index = sheet_names.index("Long_Table")
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_RESULTS_LONGTABLE_WITH_ANNOT_FILENAME,
sheet_name = "Long_Table",
allow_multiple_istd = True)
ExcelData_df = ExcelData_df.fillna('')
self.__compare_df(file_data[data_index],ExcelData_df)
def test_no_concatenate_transpose(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation
* Transpose the results correctly
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_FILENAME, WIDETABLEFORMROW2_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': True,
'Allow_Multiple_ISTD': False,
'Long_Table': False,
'Long_Table_Annot': False,
'Testing': False
}
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
for sheet_name in sheet_names:
if sheet_name != "Long_Table":
data_index = sheet_names.index(sheet_name)
# Not every sheet_name needs to be transposed.
if sheet_name in ["Area", "normArea_by_ISTD", "normConc_by_ISTD"]:
if sheet_name in ["normArea_by_ISTD", "normConc_by_ISTD"]:
file_data[data_index] = MSDataOutput.transpose_MSdata(file_data[data_index],
allow_multiple_istd = False)
elif sheet_name in ["Area"]:
file_data[data_index] = MSDataOutput.transpose_MSdata(file_data[data_index],
allow_multiple_istd = False)
else:
print("We have a non-existing sheet_name")
exit(-1)
file_data[data_index] = file_data[data_index].fillna('')
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_RESULTS_TRANSPOSE_FILENAME,
sheet_name = sheet_name,
allow_multiple_istd = False,
transpose = True)
ExcelData_df = ExcelData_df.fillna('')
self.__compare_df(file_data[data_index],ExcelData_df)
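`MSDataOutput.transpose_MSdata` swaps samples and transitions in the wide tables. The core row/column flip can be sketched with the standard library alone (toy data, not the project's real tables):

```python
rows = [
    ['Sample', 'LPC 14:0', 'LPC 24:0'],
    ['S1', 1.0, 2.0],
    ['S2', 3.0, 4.0],
]
# zip(*rows) pairs up the i-th element of every row, i.e. the columns
transposed = [list(col) for col in zip(*rows)]
print(transposed[0])  # -> ['Sample', 'S1', 'S2']
```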
def test_no_concatenate_multiple_ISTD(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation using multiple ISTD
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME, WIDETABLEFORMROW2_MULTIPLEISTD_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_MULTIPLEISTD_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': False,
'Allow_Multiple_ISTD': True,
'Long_Table': False,
'Long_Table_Annot': False,
'Testing': False
}
mock_print = self.patcher.start()
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
mock_print.assert_called_with('There are Transition_Names mentioned in the ' +
'Transition_Name_Annot sheet but have a blank Transition_Name_ISTD.\n' +
'\"LPC 14:0\"\n' +
'\"LPC 24:0\"',
flush = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
for sheet_name in sheet_names:
if sheet_name != "Long_Table":
data_index = sheet_names.index(sheet_name)
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_FILENAME,
sheet_name = sheet_name,
allow_multiple_istd = True)
self.__compare_df(file_data[data_index],ExcelData_df)
def test_no_concatenate_multiple_ISTD_LongTable(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation using multiple ISTD
* Create a Long Table without Annotation
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME, WIDETABLEFORMROW2_MULTIPLEISTD_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_MULTIPLEISTD_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': False,
'Allow_Multiple_ISTD': True,
'Long_Table': True,
'Long_Table_Annot': False,
'Testing': False
}
mock_print = self.patcher.start()
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
mock_print.assert_called_with('There are Transition_Names mentioned in the ' +
'Transition_Name_Annot sheet but have a blank Transition_Name_ISTD.\n' +
'\"LPC 14:0\"\n' +
'\"LPC 24:0\"',
flush = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
data_index = sheet_names.index("Long_Table")
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_LONGTABLE_FILENAME,
sheet_name = "Long_Table",
allow_multiple_istd = True)
ExcelData_df = ExcelData_df.fillna('')
self.__compare_df(file_data[data_index],ExcelData_df)
def test_no_concatenate_multiple_ISTD_LongTable_with_Annot(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation using multiple ISTD
* Create a Long Table with Annotation
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME, WIDETABLEFORMROW2_MULTIPLEISTD_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_MULTIPLEISTD_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': False,
'Allow_Multiple_ISTD': True,
'Long_Table': True,
'Long_Table_Annot': True,
'Testing': False
}
mock_print = self.patcher.start()
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
mock_print.assert_called_with('There are Transition_Names mentioned in the ' +
'Transition_Name_Annot sheet but have a blank Transition_Name_ISTD.\n' +
'\"LPC 14:0\"\n' +
'\"LPC 24:0\"',
flush = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
data_index = sheet_names.index("Long_Table")
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_LONGTABLE_WITH_ANNOT_FILENAME,
sheet_name = "Long_Table",
allow_multiple_istd = True)
ExcelData_df = ExcelData_df.fillna('')
self.__compare_df(file_data[data_index],ExcelData_df)
def test_no_concatenate_multiple_ISTD_transpose(self):
"""Check if the software is able to from the two input raw data
* Extract the Area
* Calculate the normalised Area and concentation using multiple ISTD
* Transpose the results correctly
"""
stored_args = {
'MS_Files': [WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME, WIDETABLEFORMROW2_MULTIPLEISTD_FILENAME],
'MS_FileType': 'Agilent Wide Table in csv',
'Output_Directory': 'D:\\MSOrganiser',
'Output_Options': ['Area', 'normArea by ISTD', 'normConc by ISTD'],
'Annot_File': WIDETABLEFORMROW_MULTIPLEISTD_ANNOTATION,
'Output_Format': 'Excel',
'Concatenate': 'No Concatenate',
'Transpose_Results': True,
'Allow_Multiple_ISTD': True,
'Long_Table': False,
'Long_Table_Annot': False,
'Testing': False
}
mock_print = self.patcher.start()
# We only check the results of the first input file
[file_data_list, file_name] = no_concatenate_workflow(stored_args,testing = True)
mock_print.assert_called_with('There are Transition_Names mentioned in the ' +
'Transition_Name_Annot sheet but have a blank Transition_Name_ISTD.\n' +
'\"LPC 14:0\"\n' +
'\"LPC 24:0\"',
flush = True)
file_name_index = file_name.index(WIDETABLEFORMROW1_MULTIPLEISTD_FILENAME)
[file_data,sheet_names] = file_data_list[file_name_index]
for sheet_name in sheet_names:
if sheet_name != "Long_Table":
data_index = sheet_names.index(sheet_name)
# Not every sheet_name needs to be transposed.
if sheet_name in ["Area", "normArea_by_ISTD", "normConc_by_ISTD"]:
if sheet_name in ["normArea_by_ISTD", "normConc_by_ISTD"]:
file_data[data_index] = MSDataOutput.transpose_MSdata(file_data[data_index],
allow_multiple_istd = True)
elif sheet_name in ["Area"]:
file_data[data_index] = MSDataOutput.transpose_MSdata(file_data[data_index],
allow_multiple_istd = False)
else:
print("We have a non-existing sheet_name")
exit(-1)
file_data[data_index] = file_data[data_index].fillna('')
ExcelData_df = self.__sheet_to_df(workbook = WIDETABLEFORMROW1_MULTIPLEISTD_RESULTS_TRANSPOSE_FILENAME,
sheet_name = sheet_name,
allow_multiple_istd = True,
transpose = True)
ExcelData_df = ExcelData_df.fillna('')
self.__compare_df(file_data[data_index],ExcelData_df)
def __compare_df(self,MSData_df,ExcelData_df):
MSData_df = MSData_df.apply(pd.to_numeric, errors='ignore', downcast = 'float')
ExcelData_df = ExcelData_df.apply(pd.to_numeric, errors='ignore', downcast = 'float')
pd.testing.assert_frame_equal(MSData_df,ExcelData_df)
def __sheet_to_df(self,workbook,sheet_name,
allow_multiple_istd = False, transpose = False):
# Assume that the header is at the first row of the sheet.
header_col = [0]
# Assume that there is no index column
index_col = None
if allow_multiple_istd and not transpose and sheet_name in ["normArea_by_ISTD", "normConc_by_ISTD"]:
# Assume that the header is in the first two rows of the sheet.
header_col = [0,1]
# Assume that the first column is the index column
index_col = [0]
ExcelData_df = pd.read_excel(
io = workbook,
header = header_col,
index_col = index_col,
engine = "openpyxl",
sheet_name= sheet_name
)
# We need to update the column names because the second-level column name
# can be blank, but pandas fills it in as "Unnamed: {Level position}"
if allow_multiple_istd and not transpose and sheet_name in ["normArea_by_ISTD", "normConc_by_ISTD"]:
tuples=[]
for index, column_tuple in enumerate(ExcelData_df.columns.values):
if("Unnamed:" in column_tuple[1]):
tuples.append((column_tuple[0],""))
else:
tuples.append(column_tuple)
column_index = pd.MultiIndex.from_tuples(tuples, names=["Transition_Name", "Transition_Name_ISTD"])
ExcelData_df.columns = column_index
#print(ExcelData_df.columns)
#print(ExcelData_df.columns.levels)
#for level in range(ExcelData_df.columns.nlevels):
#unique = ExcelData_df.columns.levels[level]
#print(unique)
return ExcelData_df
def tearDown(self):
self.patcher.stop()
if __name__ == '__main__':
unittest.main()
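A minimal sketch (not part of the test suite; `clean_unnamed` is a hypothetical helper name) of the column cleanup performed in `__sheet_to_df`: when a second-level header cell is blank, pandas labels it `Unnamed: <position>`, and the loop over `ExcelData_df.columns.values` rewrites those tuples with an empty string.

```python
def clean_unnamed(columns):
    # pandas fills blank second-level headers in as "Unnamed: <position>";
    # replace those with an empty string, keeping real ISTD names intact.
    return [(top, "" if "Unnamed:" in sub else sub) for top, sub in columns]

print(clean_unnamed([("LPC 14:0", "Unnamed: 1_level_1"),
                     ("LPC 16:0", "LPC 17:0 (ISTD)")]))
# → [('LPC 14:0', ''), ('LPC 16:0', 'LPC 17:0 (ISTD)')]
```

The cleaned tuples can then be fed to `pd.MultiIndex.from_tuples` exactly as the helper does.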
# coding: utf-8
# File: cloudmersive_image_api_client/api/face_api.py (Cloudmersive.APIClient.Python.ImageRecognition, Apache-2.0)
"""
imageapi
Image Recognition and Processing APIs let you use Machine Learning to recognize and process images, and also perform useful image modification operations. # noqa: E501
OpenAPI spec version: v1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from cloudmersive_image_api_client.api_client import ApiClient
class FaceApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def face_compare(self, input_image, match_face, **kwargs): # noqa: E501
"""Compare and match faces # noqa: E501
Find the faces in an input image, and compare against a reference image to determine if there is a match against the face in the reference image. The reference image (second parameter) should contain exactly one face. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_compare(input_image, match_face, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file input_image: Image file to perform the operation on; this image can contain one or more faces which will be matched against face provided in the second image. Common file formats such as PNG, JPEG are supported. (required)
:param file match_face: Image of a single face to compare and match against. (required)
:return: FaceCompareResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.face_compare_with_http_info(input_image, match_face, **kwargs) # noqa: E501
else:
(data) = self.face_compare_with_http_info(input_image, match_face, **kwargs) # noqa: E501
return data
def face_compare_with_http_info(self, input_image, match_face, **kwargs): # noqa: E501
"""Compare and match faces # noqa: E501
Find the faces in an input image, and compare against a reference image to determine if there is a match against the face in the reference image. The reference image (second parameter) should contain exactly one face. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_compare_with_http_info(input_image, match_face, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file input_image: Image file to perform the operation on; this image can contain one or more faces which will be matched against face provided in the second image. Common file formats such as PNG, JPEG are supported. (required)
:param file match_face: Image of a single face to compare and match against. (required)
:return: FaceCompareResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['input_image', 'match_face'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method face_compare" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'input_image' is set
if ('input_image' not in params or
params['input_image'] is None):
raise ValueError("Missing the required parameter `input_image` when calling `face_compare`") # noqa: E501
# verify the required parameter 'match_face' is set
if ('match_face' not in params or
params['match_face'] is None):
raise ValueError("Missing the required parameter `match_face` when calling `face_compare`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'input_image' in params:
local_var_files['inputImage'] = params['input_image'] # noqa: E501
if 'match_face' in params:
local_var_files['matchFace'] = params['match_face'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/json', 'application/xml', 'text/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['Apikey'] # noqa: E501
return self.api_client.call_api(
'/image/face/compare-and-match', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FaceCompareResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def face_crop_first(self, image_file, **kwargs): # noqa: E501
"""Crop image to face with square crop # noqa: E501
Crop an image to the face (rectangular crop). If there is more than one face present, choose the first one. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_crop_first(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.face_crop_first_with_http_info(image_file, **kwargs) # noqa: E501
else:
(data) = self.face_crop_first_with_http_info(image_file, **kwargs) # noqa: E501
return data
def face_crop_first_with_http_info(self, image_file, **kwargs): # noqa: E501
"""Crop image to face with square crop # noqa: E501
Crop an image to the face (rectangular crop). If there is more than one face present, choose the first one. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_crop_first_with_http_info(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['image_file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method face_crop_first" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'image_file' is set
if ('image_file' not in params or
params['image_file'] is None):
raise ValueError("Missing the required parameter `image_file` when calling `face_crop_first`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'image_file' in params:
local_var_files['imageFile'] = params['image_file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/octet-stream']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['Apikey'] # noqa: E501
return self.api_client.call_api(
'/image/face/crop/first', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def face_crop_first_round(self, image_file, **kwargs): # noqa: E501
"""Crop image to face with round crop # noqa: E501
Crop an image to the face (circular/round crop). If there is more than one face present, choose the first one. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_crop_first_round(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.face_crop_first_round_with_http_info(image_file, **kwargs) # noqa: E501
else:
(data) = self.face_crop_first_round_with_http_info(image_file, **kwargs) # noqa: E501
return data
def face_crop_first_round_with_http_info(self, image_file, **kwargs): # noqa: E501
"""Crop image to face with round crop # noqa: E501
Crop an image to the face (circular/round crop). If there is more than one face present, choose the first one. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_crop_first_round_with_http_info(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['image_file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method face_crop_first_round" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'image_file' is set
if ('image_file' not in params or
params['image_file'] is None):
raise ValueError("Missing the required parameter `image_file` when calling `face_crop_first_round`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'image_file' in params:
local_var_files['imageFile'] = params['image_file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/octet-stream']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['Apikey'] # noqa: E501
return self.api_client.call_api(
'/image/face/crop/first/round', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def face_detect_age(self, image_file, **kwargs): # noqa: E501
"""Detect the age of people in an image # noqa: E501
Identify the age, position, and size of human faces in an image, along with a recognition confidence level. People in the image do NOT need to be facing the camera; they can be facing away, edge-on, etc. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_detect_age(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: AgeDetectionResult
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.face_detect_age_with_http_info(image_file, **kwargs) # noqa: E501
else:
(data) = self.face_detect_age_with_http_info(image_file, **kwargs) # noqa: E501
return data
def face_detect_age_with_http_info(self, image_file, **kwargs): # noqa: E501
"""Detect the age of people in an image # noqa: E501
Identify the age, position, and size of human faces in an image, along with a recognition confidence level. People in the image do NOT need to be facing the camera; they can be facing away, edge-on, etc. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_detect_age_with_http_info(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: AgeDetectionResult
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['image_file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method face_detect_age" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'image_file' is set
if ('image_file' not in params or
params['image_file'] is None):
raise ValueError("Missing the required parameter `image_file` when calling `face_detect_age`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'image_file' in params:
local_var_files['imageFile'] = params['image_file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/json', 'application/xml', 'text/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['Apikey'] # noqa: E501
return self.api_client.call_api(
'/image/face/detect-age', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AgeDetectionResult', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def face_detect_gender(self, image_file, **kwargs): # noqa: E501
"""Detect the gender of people in an image # noqa: E501
Identify the gender, position, and size of human faces in an image, along with a recognition confidence level. People in the image should be facing the camera. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_detect_gender(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: GenderDetectionResult
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.face_detect_gender_with_http_info(image_file, **kwargs) # noqa: E501
else:
(data) = self.face_detect_gender_with_http_info(image_file, **kwargs) # noqa: E501
return data
def face_detect_gender_with_http_info(self, image_file, **kwargs): # noqa: E501
"""Detect the gender of people in an image # noqa: E501
Identify the gender, position, and size of human faces in an image, along with a recognition confidence level. People in the image should be facing the camera. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_detect_gender_with_http_info(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: GenderDetectionResult
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['image_file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method face_detect_gender" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'image_file' is set
if ('image_file' not in params or
params['image_file'] is None):
raise ValueError("Missing the required parameter `image_file` when calling `face_detect_gender`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'image_file' in params:
local_var_files['imageFile'] = params['image_file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/json', 'application/xml', 'text/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['Apikey'] # noqa: E501
return self.api_client.call_api(
'/image/face/detect-gender', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GenderDetectionResult', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def face_locate(self, image_file, **kwargs): # noqa: E501
"""Detect and find faces in an image # noqa: E501
Locate the positions of all faces in an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_locate(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: FaceLocateResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.face_locate_with_http_info(image_file, **kwargs) # noqa: E501
else:
(data) = self.face_locate_with_http_info(image_file, **kwargs) # noqa: E501
return data
def face_locate_with_http_info(self, image_file, **kwargs): # noqa: E501
"""Detect and find faces in an image # noqa: E501
Locate the positions of all faces in an image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_locate_with_http_info(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: FaceLocateResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['image_file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method face_locate" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'image_file' is set
if ('image_file' not in params or
params['image_file'] is None):
raise ValueError("Missing the required parameter `image_file` when calling `face_locate`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'image_file' in params:
local_var_files['imageFile'] = params['image_file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/json', 'application/xml', 'text/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['Apikey'] # noqa: E501
return self.api_client.call_api(
'/image/face/locate', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FaceLocateResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def face_locate_with_landmarks(self, image_file, **kwargs): # noqa: E501
"""Detect and find faces and landmarks eyes and nose and mouth in image # noqa: E501
Locate the positions of all faces in an image, along with the eyes, eye brows, nose and mouth components of each # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_locate_with_landmarks(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: FaceLocateWithLandmarksResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.face_locate_with_landmarks_with_http_info(image_file, **kwargs) # noqa: E501
else:
(data) = self.face_locate_with_landmarks_with_http_info(image_file, **kwargs) # noqa: E501
return data
def face_locate_with_landmarks_with_http_info(self, image_file, **kwargs): # noqa: E501
"""Detect and find faces and landmarks eyes and nose and mouth in image # noqa: E501
Locate the positions of all faces in an image, along with the eyes, eye brows, nose and mouth components of each # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.face_locate_with_landmarks_with_http_info(image_file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file image_file: Image file to perform the operation on. Common file formats such as PNG, JPEG are supported. (required)
:return: FaceLocateWithLandmarksResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['image_file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method face_locate_with_landmarks" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'image_file' is set
if ('image_file' not in params or
params['image_file'] is None):
raise ValueError("Missing the required parameter `image_file` when calling `face_locate_with_landmarks`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'image_file' in params:
local_var_files['imageFile'] = params['image_file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'text/json', 'application/xml', 'text/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['Apikey'] # noqa: E501
return self.api_client.call_api(
'/image/face/locate-with-landmarks', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FaceLocateWithLandmarksResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
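Every generated method above follows the same template: validate keyword arguments, gather the multipart file params, then delegate to `ApiClient.call_api`. As an illustrative sketch of that delegation (assumption: `FakeApiClient` is a hypothetical stand-in and makes no network calls), the hand-off looks like:

```python
class FakeApiClient:
    """Hypothetical stand-in for cloudmersive_image_api_client's ApiClient."""
    def call_api(self, resource_path, method, *args, **kwargs):
        # Echo back the routing information instead of issuing an HTTP request.
        return {"path": resource_path, "method": method,
                "response_type": kwargs.get("response_type")}

client = FakeApiClient()
result = client.call_api('/image/face/locate', 'POST',
                         response_type='FaceLocateResponse')
print(result["path"], result["response_type"])
# → /image/face/locate FaceLocateResponse
```

Passing such a stand-in as `FaceApi(api_client=...)` is one way to exercise the parameter validation in these methods without hitting the live endpoint.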
# File: market_maker/auth/__init__.py (tuckermint/sample-market-maker, Apache-2.0)
from market_maker.auth.AccessTokenAuth import *
from market_maker.auth.APIKeyAuth import *
from market_maker.auth.APIKeyAuthWithExpires import *
| 37 | 54 | 0.837838 | 18 | 148 | 6.722222 | 0.444444 | 0.247934 | 0.371901 | 0.471074 | 0.413223 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101351 | 148 | 3 | 55 | 49.333333 | 0.909774 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
0f6d4cdb4b1516e420b979ea0236e738fa933c56 | 94 | py | Python | tfrecord/tools/__init__.py | gsgoncalves/tfrecord | b5d0bddf0cbe14e6aea9a1585d186e36a847248a | [
"MIT"
] | 662 | 2019-11-26T04:57:30.000Z | 2022-03-31T21:07:43.000Z | tfrecord/tools/__init__.py | gsgoncalves/tfrecord | b5d0bddf0cbe14e6aea9a1585d186e36a847248a | [
"MIT"
] | 67 | 2019-11-26T21:16:05.000Z | 2022-01-29T02:52:30.000Z | tfrecord/tools/__init__.py | gsgoncalves/tfrecord | b5d0bddf0cbe14e6aea9a1585d186e36a847248a | [
"MIT"
] | 85 | 2019-11-26T06:20:12.000Z | 2022-03-15T03:15:44.000Z | from tfrecord.tools import tfrecord2idx
from tfrecord.tools.tfrecord2idx import create_index
| 23.5 | 52 | 0.87234 | 12 | 94 | 6.75 | 0.583333 | 0.296296 | 0.419753 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023529 | 0.095745 | 94 | 3 | 53 | 31.333333 | 0.929412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
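The `tfrecord2idx.create_index` helper imported above builds a byte-offset index over a TFRecord file so records can be read randomly. A toy, self-contained sketch of the idea, assuming the standard TFRecord framing (8-byte little-endian payload length, 4-byte length CRC, payload, 4-byte data CRC); the function names here are hypothetical and the CRCs are zeroed, unlike real writers which use masked CRC32C:

```python
# Toy sketch of a TFRecord offset index. Assumes standard TFRecord framing;
# create_toy_index and write_record are illustrative, not the tfrecord API.
import io
import struct

def create_toy_index(tfrecord_bytes):
    """Return (start_offset, total_record_size) pairs, one per record."""
    stream = io.BytesIO(tfrecord_bytes)
    index = []
    while True:
        start = stream.tell()
        header = stream.read(8)
        if len(header) < 8:
            break
        (length,) = struct.unpack('<Q', header)
        # skip the 4-byte length CRC, the payload, and the 4-byte data CRC
        stream.seek(4 + length + 4, io.SEEK_CUR)
        index.append((start, stream.tell() - start))
    return index

def write_record(payload):
    # CRC fields are zeroed in this sketch; real writers use masked CRC32C.
    return struct.pack('<Q', len(payload)) + b'\x00' * 4 + payload + b'\x00' * 4

data = write_record(b'abc') + write_record(b'defgh')
print(create_toy_index(data))  # [(0, 19), (19, 21)]
```

With such an index, a loader can seek straight to any record, which is what makes shuffled access over large TFRecord files cheap.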
7e3989dcb23c7fcf8370b9c35ac3a3cfdcf4490d | 131 | py | Python | s_dbw/__init__.py | alashkov83/S_Dbw | 4e2b7619e9cbe3f04aca1f5ca440198fb06d3390 | [
"MIT"
] | 7 | 2018-12-24T13:16:25.000Z | 2022-01-24T10:33:46.000Z | s_dbw/__init__.py | alashkov83/S_Dbw | 4e2b7619e9cbe3f04aca1f5ca440198fb06d3390 | [
"MIT"
] | 6 | 2019-03-20T06:37:02.000Z | 2021-04-12T17:45:25.000Z | s_dbw/__init__.py | alashkov83/S_Dbw | 4e2b7619e9cbe3f04aca1f5ca440198fb06d3390 | [
"MIT"
] | 4 | 2019-03-20T07:32:55.000Z | 2021-09-07T13:59:16.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""Created by lashkov on 01.11.18"""
from .s_dbw import S_Dbw
from .s_dbw import SD
| 21.833333 | 36 | 0.664122 | 25 | 131 | 3.36 | 0.76 | 0.142857 | 0.190476 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072072 | 0.152672 | 131 | 5 | 37 | 26.2 | 0.684685 | 0.564886 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |