# Source: sdk/python/pulumi_onelogin/privilege.py (repo: pulumi/pulumi-onelogin, licenses: ECL-2.0, Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['PrivilegeArgs', 'Privilege']
@pulumi.input_type
class PrivilegeArgs:
def __init__(__self__, *,
privileges: pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]],
description: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
role_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
user_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None):
"""
The set of arguments for constructing a Privilege resource.
:param pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]] privileges: A list of statements that describe what the privilege grants access to.
:param pulumi.Input[str] description: Description for the Privilege.
:param pulumi.Input[str] name: The name of the privilege.
        :param pulumi.Input[Sequence[pulumi.Input[int]]] role_ids: A list of role IDs for whom the privilege applies.
:param pulumi.Input[Sequence[pulumi.Input[int]]] user_ids: A list of user IDs for whom the privilege applies.
"""
pulumi.set(__self__, "privileges", privileges)
if description is not None:
pulumi.set(__self__, "description", description)
if name is not None:
pulumi.set(__self__, "name", name)
if role_ids is not None:
pulumi.set(__self__, "role_ids", role_ids)
if user_ids is not None:
pulumi.set(__self__, "user_ids", user_ids)
@property
@pulumi.getter
def privileges(self) -> pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]]:
"""
A list of statements that describe what the privilege grants access to.
"""
return pulumi.get(self, "privileges")
@privileges.setter
def privileges(self, value: pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]]):
pulumi.set(self, "privileges", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description for the Privilege.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the privilege.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="roleIds")
def role_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]:
"""
        A list of role IDs for whom the privilege applies.
"""
return pulumi.get(self, "role_ids")
@role_ids.setter
def role_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]):
pulumi.set(self, "role_ids", value)
@property
@pulumi.getter(name="userIds")
def user_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]:
"""
A list of user IDs for whom the privilege applies.
"""
return pulumi.get(self, "user_ids")
@user_ids.setter
def user_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]):
pulumi.set(self, "user_ids", value)
@pulumi.input_type
class _PrivilegeState:
def __init__(__self__, *,
description: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
privileges: Optional[pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]]] = None,
role_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
user_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None):
"""
Input properties used for looking up and filtering Privilege resources.
:param pulumi.Input[str] description: Description for the Privilege.
:param pulumi.Input[str] name: The name of the privilege.
:param pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]] privileges: A list of statements that describe what the privilege grants access to.
        :param pulumi.Input[Sequence[pulumi.Input[int]]] role_ids: A list of role IDs for whom the privilege applies.
:param pulumi.Input[Sequence[pulumi.Input[int]]] user_ids: A list of user IDs for whom the privilege applies.
"""
if description is not None:
pulumi.set(__self__, "description", description)
if name is not None:
pulumi.set(__self__, "name", name)
if privileges is not None:
pulumi.set(__self__, "privileges", privileges)
if role_ids is not None:
pulumi.set(__self__, "role_ids", role_ids)
if user_ids is not None:
pulumi.set(__self__, "user_ids", user_ids)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description for the Privilege.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the privilege.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def privileges(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]]]:
"""
A list of statements that describe what the privilege grants access to.
"""
return pulumi.get(self, "privileges")
@privileges.setter
def privileges(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['PrivilegePrivilegeArgs']]]]):
pulumi.set(self, "privileges", value)
@property
@pulumi.getter(name="roleIds")
def role_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]:
"""
        A list of role IDs for whom the privilege applies.
"""
return pulumi.get(self, "role_ids")
@role_ids.setter
def role_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]):
pulumi.set(self, "role_ids", value)
@property
@pulumi.getter(name="userIds")
def user_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]:
"""
A list of user IDs for whom the privilege applies.
"""
return pulumi.get(self, "user_ids")
@user_ids.setter
def user_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]):
pulumi.set(self, "user_ids", value)
class Privilege(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
privileges: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['PrivilegePrivilegeArgs']]]]] = None,
role_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
user_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
__props__=None):
"""
Manage Privilege resources.
This resource allows you to create and configure Privilege.
## Example Usage
### Strict Ordering
```python
import pulumi
import pulumi_onelogin as onelogin
super_admin = onelogin.Privilege("superAdmin",
description="description",
privileges=[onelogin.PrivilegePrivilegeArgs(
statements=[
onelogin.PrivilegePrivilegeStatementArgs(
action=["apps:List"],
effect="Allow",
scope=["*"],
),
onelogin.PrivilegePrivilegeStatementArgs(
action=[
"users:List",
"users:Update",
],
effect="Allow",
scope=[
"users/123",
"users/345",
],
),
],
)],
role_ids=[
987,
654,
],
user_ids=[
123,
345,
])
```
## Import
A privilege can be imported using the OneLogin Privilege ID.
```sh
$ pulumi import onelogin:index/privilege:Privilege super_admin <privilege id>
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] description: Description for the Privilege.
:param pulumi.Input[str] name: The name of the privilege.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['PrivilegePrivilegeArgs']]]] privileges: A list of statements that describe what the privilege grants access to.
        :param pulumi.Input[Sequence[pulumi.Input[int]]] role_ids: A list of role IDs for whom the privilege applies.
:param pulumi.Input[Sequence[pulumi.Input[int]]] user_ids: A list of user IDs for whom the privilege applies.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: PrivilegeArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manage Privilege resources.
This resource allows you to create and configure Privilege.
## Example Usage
### Strict Ordering
```python
import pulumi
import pulumi_onelogin as onelogin
super_admin = onelogin.Privilege("superAdmin",
description="description",
privileges=[onelogin.PrivilegePrivilegeArgs(
statements=[
onelogin.PrivilegePrivilegeStatementArgs(
action=["apps:List"],
effect="Allow",
scope=["*"],
),
onelogin.PrivilegePrivilegeStatementArgs(
action=[
"users:List",
"users:Update",
],
effect="Allow",
scope=[
"users/123",
"users/345",
],
),
],
)],
role_ids=[
987,
654,
],
user_ids=[
123,
345,
])
```
## Import
A privilege can be imported using the OneLogin Privilege ID.
```sh
$ pulumi import onelogin:index/privilege:Privilege super_admin <privilege id>
```
:param str resource_name: The name of the resource.
:param PrivilegeArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(PrivilegeArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
privileges: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['PrivilegePrivilegeArgs']]]]] = None,
role_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
user_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = PrivilegeArgs.__new__(PrivilegeArgs)
__props__.__dict__["description"] = description
__props__.__dict__["name"] = name
if privileges is None and not opts.urn:
raise TypeError("Missing required property 'privileges'")
__props__.__dict__["privileges"] = privileges
__props__.__dict__["role_ids"] = role_ids
__props__.__dict__["user_ids"] = user_ids
super(Privilege, __self__).__init__(
'onelogin:index/privilege:Privilege',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
description: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
privileges: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['PrivilegePrivilegeArgs']]]]] = None,
role_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
user_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None) -> 'Privilege':
"""
Get an existing Privilege resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] description: Description for the Privilege.
:param pulumi.Input[str] name: The name of the privilege.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['PrivilegePrivilegeArgs']]]] privileges: A list of statements that describe what the privilege grants access to.
        :param pulumi.Input[Sequence[pulumi.Input[int]]] role_ids: A list of role IDs for whom the privilege applies.
:param pulumi.Input[Sequence[pulumi.Input[int]]] user_ids: A list of user IDs for whom the privilege applies.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _PrivilegeState.__new__(_PrivilegeState)
__props__.__dict__["description"] = description
__props__.__dict__["name"] = name
__props__.__dict__["privileges"] = privileges
__props__.__dict__["role_ids"] = role_ids
__props__.__dict__["user_ids"] = user_ids
return Privilege(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
Description for the Privilege.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name of the privilege.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def privileges(self) -> pulumi.Output[Sequence['outputs.PrivilegePrivilege']]:
"""
A list of statements that describe what the privilege grants access to.
"""
return pulumi.get(self, "privileges")
@property
@pulumi.getter(name="roleIds")
def role_ids(self) -> pulumi.Output[Optional[Sequence[int]]]:
"""
        A list of role IDs for whom the privilege applies.
"""
return pulumi.get(self, "role_ids")
@property
@pulumi.getter(name="userIds")
def user_ids(self) -> pulumi.Output[Optional[Sequence[int]]]:
"""
A list of user IDs for whom the privilege applies.
"""
return pulumi.get(self, "user_ids")
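The `PrivilegeArgs` constructor above follows the pattern generated Pulumi SDKs use for optional inputs: only non-None values are stored, so unset properties stay absent instead of being serialized as nulls. A minimal, self-contained sketch of that pattern (class and field names here are illustrative, not part of the SDK):

```python
class ArgsSketch:
    """Stores one required field plus optionals, skipping any that are None."""

    def __init__(self, privileges, description=None, name=None):
        self._values = {"privileges": privileges}
        if description is not None:
            self._values["description"] = description
        if name is not None:
            self._values["name"] = name

    def get(self, key):
        # Absent optionals come back as None, mirroring pulumi.get's behavior.
        return self._values.get(key)
```

Because the optional checks happen once in the constructor, downstream code can treat "not set" and "explicitly None" identically.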
# Source: TSFEL/TSFdlib-master/tsfel/__init__.py (repo: TSFDlib/TSFDlib, license: MIT)
from tsfel.features_extraction import *
from tsfel.features_report import *
from tsfel.utils import *
# Source: Packs/ApiModules/Scripts/CSVFeedApiModule/CSVFeedApiModule_test.py (repo: joshhe/content, license: MIT)
from CSVFeedApiModule import *
import requests_mock
def test_get_indicators_1():
"""Test with 1 fieldname"""
feed_url_to_config = {
'https://ipstack.com': {
'fieldnames': ['value'],
'indicator_type': 'IP'
}
}
with open('test_data/ip_ranges.txt') as ip_ranges_txt:
ip_ranges = ip_ranges_txt.read().encode('utf8')
with requests_mock.Mocker() as m:
itype = 'IP'
args = {
'indicator_type': itype,
'limit': 35
}
m.get('https://ipstack.com', content=ip_ranges)
client = Client(
url="https://ipstack.com",
feed_url_to_config=feed_url_to_config,
)
hr, indicators_ec, raw_json = get_indicators_command(client, args)
assert not indicators_ec
for ind_json in raw_json:
ind_val = ind_json.get('value')
ind_type = ind_json.get('type')
ind_rawjson = ind_json.get('rawJSON')
assert ind_val
assert ind_type == itype
assert ind_rawjson['value'] == ind_val
assert ind_rawjson['type'] == ind_type
def test_get_indicators_with_mapping():
"""Test with 1 fieldname"""
feed_url_to_config = {
'https://ipstack.com': {
'fieldnames': ['value', 'a'],
'indicator_type': 'IP',
'mapping': {
'a': 'AAA'
}
}
}
with open('test_data/ip_ranges.txt') as ip_ranges_txt:
ip_ranges = ip_ranges_txt.read()
with requests_mock.Mocker() as m:
itype = 'IP'
args = {
'indicator_type': itype,
'limit': 35
}
m.get('https://ipstack.com', content=ip_ranges.encode('utf-8'))
client = Client(
url="https://ipstack.com",
feed_url_to_config=feed_url_to_config
)
hr, indicators_ec, raw_json = get_indicators_command(client, args)
assert not indicators_ec
for ind_json in raw_json:
ind_val = ind_json.get('value')
ind_map = ind_json['CustomFields'].get('AAA')
ind_type = ind_json.get('type')
ind_rawjson = ind_json.get('rawJSON')
assert ind_val
assert ind_type == itype
assert ind_map == 'a'
assert ind_rawjson['value'] == ind_val
assert ind_rawjson['type'] == ind_type
def test_get_indicators_2():
"""Test with 1 fieldname that's not called indicator"""
feed_url_to_config = {
'https://ipstack.com': {
'fieldnames': ['special_ind'],
'indicator_type': 'IP'
}
}
with open('test_data/ip_ranges.txt') as ip_ranges_txt:
ip_ranges = ip_ranges_txt.read().encode('utf8')
with requests_mock.Mocker() as m:
itype = 'IP'
args = {
'indicator_type': itype,
'limit': 35
}
m.get('https://ipstack.com', content=ip_ranges)
client = Client(
url="https://ipstack.com",
feed_url_to_config=feed_url_to_config,
)
hr, indicators_ec, raw_json = get_indicators_command(client, args)
assert not indicators_ec
for ind_json in raw_json:
ind_val = ind_json.get('value')
ind_type = ind_json.get('type')
ind_rawjson = ind_json.get('rawJSON')
assert ind_val
assert ind_type == itype
assert ind_rawjson['value'] == ind_val
assert ind_rawjson['type'] == ind_type
# Source: IOKR/tests/data/test_load_data.py (repo: hi-paris/IOKR, license: MIT)
"""Test module for the module: load_data.py"""
import numpy as np
from IOKR.data.load_data import load_bibtex
from IOKR.data.load_data import load_corel5k
class TestLoadBibtex():
"""Test class for the function: load_bibtex"""
def test_returned_variables_not_empty(self):
"""Test checking if returned variables from load_bibtex are not empty
Returns
-------
None
"""
load = load_bibtex("IOKR/data/bibtex")
print(load)
assert load[0] is not None, "Expected variable: 'X'"
assert load[1] is not None, "Expected variable: 'Y'"
assert load[2] != "", "Expected variable: 'X_txt'"
assert load[3] != "", "Expected variable: 'Y_txt'"
def test_returned_variables_good_type(self):
"""Test checking if returned variables from load_bibtex are the expected type
Returns
-------
None
"""
load = load_bibtex("IOKR/data/bibtex")
actual_x = type(load[0])
actual_y = type(load[1])
actual_x_txt = type(load[2])
actual_y_txt = type(load[3])
expected1 = "np.array"
expected2 = 'list'
print(actual_x, actual_y, actual_x_txt, actual_y_txt)
assert isinstance(load[0], np.ndarray), f"'X' should be {expected1}, but is {actual_x} "
assert isinstance(load[1], np.ndarray), f"'Y' should be {expected1}, but is {actual_y} "
assert isinstance(load[2], list), f"'X_txt' should be {expected2}, but is {actual_x_txt} "
assert isinstance(load[3], list), f"'Y_txt' should be {expected2}, but is {actual_y_txt} "
def test_returned_variables_good_shape(self):
"""Test checking if returned variables from load_bibtex are the expected shape
Returns
-------
None
"""
load = load_bibtex("IOKR/data/bibtex")
actual_x_shape = load[0].shape
actual_y_shape = load[1].shape
actual_x_txt_len = len(load[2])
actual_y_txt_len = len(load[3])
expected_x_shape = (7395, 1836)
expected_y_shape = (7395, 149)
expected_x_txt_len = 1836
expected_y_txt_len = 149
print(actual_x_shape, actual_y_shape, actual_x_txt_len, actual_y_txt_len)
        assert actual_x_shape == expected_x_shape, f"'X' shape should be {expected_x_shape}, but is {actual_x_shape}"
        assert actual_y_shape == expected_y_shape, f"'Y' shape should be {expected_y_shape}, but is {actual_y_shape}"
        assert actual_x_txt_len == expected_x_txt_len, f"'X_txt' length should be {expected_x_txt_len}, but is {actual_x_txt_len}"
        assert actual_y_txt_len == expected_y_txt_len, f"'Y_txt' length should be {expected_y_txt_len}, but is {actual_y_txt_len}"
# def test_check_X_y(self):
# """Input validation for standard estimators.
#
# Checks X and y for consistent length, enforces X to be 2D and y 1D.
# By default, X is checked to be non-empty and containing only finite values.
# Standard input checks are also applied to y,
# such as checking that y does not have np.nan or np.inf targets.
#
# Returns
# -------
# None
# """
# load = load_bibtex("IOKR/data/bibtex")
# check = check_X_y(load[0], load[1])
# assert check
class TestLoadCorel5k():
"""Test class for the function: load_corel5k"""
def test_returned_variables_not_empty(self):
"""Test checking if returned variables from load_corel5k are not empty
Returns
-------
None
"""
load = load_corel5k("IOKR/data/corel5k")
print(load)
assert load[0] is not None, "Expected variable: 'X'"
assert load[1] is not None, "Expected variable: 'Y'"
assert load[2] != "", "Expected variable: 'X_txt'"
assert load[3] != "", "Expected variable: 'Y_txt'"
def test_returned_variables_good_type(self):
"""Test checking if returned variables from load_corel5k are the expected type
Returns
-------
None
"""
load = load_corel5k("IOKR/data/corel5k")
actual_x = type(load[0])
actual_y = type(load[1])
actual_x_txt = type(load[2])
actual_y_txt = type(load[3])
expected1 = "np.array"
expected2 = 'list'
print(actual_x, actual_y, actual_x_txt, actual_y_txt)
assert isinstance(load[0], np.ndarray), f"'X' should be {expected1}, but is {actual_x} "
assert isinstance(load[1], np.ndarray), f"'Y' should be {expected1}, but is {actual_y} "
assert isinstance(load[2], list), f"'X_txt' should be {expected2}, but is {actual_x_txt} "
assert isinstance(load[3], list), f"'Y_txt' should be {expected2}, but is {actual_y_txt} "
def test_returned_variables_good_shape(self):
"""Test checking if returned variables from load_corel5k are of expected shape
Returns
-------
None
"""
load = load_corel5k("IOKR/data/corel5k")
actual_x_shape = load[0].shape
actual_y_shape = load[1].shape
actual_x_txt_len = len(load[2])
actual_y_txt_len = len(load[3])
expected_x_shape = (5000, 499)
expected_y_shape = (5000, 374)
expected_x_txt_len = 499
expected_y_txt_len = 374
print(actual_x_shape, actual_y_shape, actual_x_txt_len, actual_y_txt_len)
        assert actual_x_shape == expected_x_shape, f"'X' shape should be {expected_x_shape}, but is {actual_x_shape}"
        assert actual_y_shape == expected_y_shape, f"'Y' shape should be {expected_y_shape}, but is {actual_y_shape}"
        assert actual_x_txt_len == expected_x_txt_len, f"'X_txt' length should be {expected_x_txt_len}, but is {actual_x_txt_len}"
        assert actual_y_txt_len == expected_y_txt_len, f"'Y_txt' length should be {expected_y_txt_len}, but is {actual_y_txt_len}"
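The shape tests for both datasets repeat the same four comparisons; a small helper of this kind (hypothetical, not part of the test suite) could factor them out:

```python
def assert_dataset_shapes(X, Y, expected_x_shape, expected_y_shape):
    """Compare the .shape of two array-like objects against expected tuples."""
    assert X.shape == expected_x_shape, f"X shape {X.shape} != {expected_x_shape}"
    assert Y.shape == expected_y_shape, f"Y shape {Y.shape} != {expected_y_shape}"
```

Each dataset-specific test would then reduce to one call with its expected `(rows, cols)` tuples.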
# Source: dock/api.py (repo: goldmann/dock, license: BSD-3-Clause)
"""
Copyright (c) 2015 Red Hat, Inc
All rights reserved.
This software may be modified and distributed under the terms
of the BSD license. See the LICENSE file for details.
Python API for dock. This is the official way of interacting with dock.
"""
from dock.inner import DockerBuildWorkflow
from dock.outer import PrivilegedBuildManager, DockerhostBuildManager
__all__ = (
'build_image_in_privileged_container',
'build_image_using_hosts_docker',
'build_image_here',
)

def build_image_in_privileged_container(build_image, git_url, image,
        git_dockerfile_path=None, git_commit=None, parent_registry=None,
        target_registries=None, push_buildroot_to=None,
        parent_registry_insecure=False, target_registries_insecure=False,
        **kwargs):
    """
    build image from provided dockerfile (specified as git url) in a privileged container

    :param build_image: str, image in which the target image should be built
    :param git_url: str, URL of the git repo
    :param image: str, tag for the built image ([registry/]image_name[:tag])
    :param git_dockerfile_path: str, path to the Dockerfile within the git repo (if not in root)
    :param git_commit: str, git commit to check out
    :param parent_registry: str, registry to pull the base image from
    :param target_registries: list of str, registries to push the image to (might change in future)
    :param push_buildroot_to: str, repository the buildroot should be pushed to
    :param parent_registry_insecure: bool, allow connecting to the parent registry over plain http
    :param target_registries_insecure: bool, allow connecting to target registries over plain http
    :return: BuildResults
    """
    build_json = {
        "git_url": git_url,
        "image": image,
        "git_dockerfile_path": git_dockerfile_path,
        "git_commit": git_commit,
        "parent_registry": parent_registry,
        "target_registries": target_registries,
        "parent_registry_insecure": parent_registry_insecure,
        "target_registries_insecure": target_registries_insecure,
    }
    build_json.update(kwargs)
    m = PrivilegedBuildManager(build_image, build_json)
    build_response = m.build()
    if push_buildroot_to:
        m.commit_buildroot()
        m.push_buildroot(push_buildroot_to)
    return build_response
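All three build entry points assemble the same `build_json` payload: explicit parameters form the base dict, then arbitrary `**kwargs` are layered on top, which is how extra plugin parameters pass through unchanged. A minimal stdlib sketch of that merge (the `make_build_json` helper is illustrative, not part of dock's API):

```python
def make_build_json(git_url, image, **kwargs):
    # Explicit parameters form the base payload; arbitrary keyword
    # arguments are merged on top, overriding defaults if they collide.
    build_json = {
        "git_url": git_url,
        "image": image,
        "git_commit": None,
    }
    build_json.update(kwargs)
    return build_json

payload = make_build_json(
    "https://example.com/repo.git",
    "registry.example.com/app:latest",
    git_commit="devel",
    custom_plugin_arg=42,  # unknown keys pass through untouched
)
```

Because `update` runs last, a caller-supplied `git_commit` replaces the `None` default rather than being ignored.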

def build_image_using_hosts_docker(build_image, git_url, image,
        git_dockerfile_path=None, git_commit=None, parent_registry=None,
        target_registries=None, push_buildroot_to=None,
        parent_registry_insecure=False, target_registries_insecure=False,
        **kwargs):
    """
    build image from provided dockerfile (specified as git url) in a container,
    using docker from the host

    :param build_image: str, image in which the target image should be built
    :param git_url: str, URL of the git repo
    :param image: str, tag for the built image ([registry/]image_name[:tag])
    :param git_dockerfile_path: str, path to the Dockerfile within the git repo (if not in root)
    :param git_commit: str, git commit to check out
    :param parent_registry: str, registry to pull the base image from
    :param target_registries: list of str, registries to push the image to (might change in future)
    :param push_buildroot_to: str, repository the buildroot should be pushed to
    :param parent_registry_insecure: bool, allow connecting to the parent registry over plain http
    :param target_registries_insecure: bool, allow connecting to target registries over plain http
    :return: BuildResults
    """
    build_json = {
        "git_url": git_url,
        "image": image,
        "git_dockerfile_path": git_dockerfile_path,
        "git_commit": git_commit,
        "parent_registry": parent_registry,
        "target_registries": target_registries,
        "parent_registry_insecure": parent_registry_insecure,
        "target_registries_insecure": target_registries_insecure,
    }
    build_json.update(kwargs)
    m = DockerhostBuildManager(build_image, build_json)
    build_response = m.build()
    if push_buildroot_to:
        m.commit_buildroot()
        m.push_buildroot(push_buildroot_to)
    return build_response

def build_image_here(git_url, image,
        git_dockerfile_path=None, git_commit=None, parent_registry=None,
        target_registries=None, parent_registry_insecure=False,
        target_registries_insecure=False, **kwargs):
    """
    build image from provided dockerfile (specified as git url) in the current environment

    :param git_url: str, URL of the git repo
    :param image: str, tag for the built image ([registry/]image_name[:tag])
    :param git_dockerfile_path: str, path to the Dockerfile within the git repo (if not in root)
    :param git_commit: str, git commit to check out
    :param parent_registry: str, registry to pull the base image from
    :param target_registries: list of str, registries to push the image to (might change in future)
    :param parent_registry_insecure: bool, allow connecting to the parent registry over plain http
    :param target_registries_insecure: bool, allow connecting to target registries over plain http
    :return: BuildResults
    """
    build_json = {
        "git_url": git_url,
        "image": image,
        "git_dockerfile_path": git_dockerfile_path,
        "git_commit": git_commit,
        "parent_registry": parent_registry,
        "target_registries": target_registries,
        "parent_registry_insecure": parent_registry_insecure,
        "target_registries_insecure": target_registries_insecure,
    }
    build_json.update(kwargs)
    m = DockerBuildWorkflow(**build_json)
    return m.build_docker_image()
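Where the container-based variants hand `build_json` to a build manager as a dict, `build_image_here` unpacks it straight into `DockerBuildWorkflow(**build_json)`, so every dict key must match a constructor keyword (or be swallowed by a `**kwargs` catch-all). A small sketch of that dict-to-keyword handoff (`FakeWorkflow` is a stand-in, not dock's real workflow class):

```python
class FakeWorkflow:
    # Stand-in for DockerBuildWorkflow: the payload keys arrive as
    # keyword arguments; unknown keys land in the **extra catch-all.
    def __init__(self, git_url, image, git_commit=None, **extra):
        self.git_url = git_url
        self.image = image
        self.git_commit = git_commit
        self.extra = extra

build_json = {
    "git_url": "https://example.com/repo.git",
    "image": "app:latest",
    "git_commit": "main",
    "plugin_opt": True,
}
workflow = FakeWorkflow(**build_json)
```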

def list_dockerfiles_in_git():
    """
    clone the provided repo and return all Dockerfiles found in it

    :return: not implemented yet
    """
    raise NotImplementedError("list_dockerfiles_in_git is not implemented yet")
# multiple-languages/python/ros-cdk-foas-1.0.3/src/ros_cdk_foas/__init__.py
# (from aliyun/Resource-Orchestration-Service-Cloud-Development-K, Apache-2.0)
'''
## Aliyun ROS FOAS Construct Library
This module is part of the AliCloud ROS Cloud Development Kit (ROS CDK) project.
```python
import ros_cdk_foas as foas
```
'''
import abc
import builtins
import datetime
import enum
import typing
import jsii
import publication
import typing_extensions
from ._jsii import *
import ros_cdk_core
class Cluster(
ros_cdk_core.Resource,
metaclass=jsii.JSIIMeta,
jsii_type="@alicloud/ros-cdk-foas.Cluster",
):
'''A ROS resource type: ``ALIYUN::FOAS::Cluster``.'''
def __init__(
self,
scope: ros_cdk_core.Construct,
id: builtins.str,
props: "ClusterProps",
enable_resource_property_constraint: typing.Optional[builtins.bool] = None,
) -> None:
'''Create a new ``ALIYUN::FOAS::Cluster``.

:param scope: scope in which this resource is defined
:param id: scoped id of the resource
:param props: resource properties
:param enable_resource_property_constraint: -
'''
jsii.create(self.__class__, self, [scope, id, props, enable_resource_property_constraint])
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrClusterId")
def attr_cluster_id(self) -> ros_cdk_core.IResolvable:
'''Attribute ClusterId: Cluster ID.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrClusterId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrEngineVersions")
def attr_engine_versions(self) -> ros_cdk_core.IResolvable:
'''Attribute EngineVersions: Engine Versions.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrEngineVersions"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrMasterInstanceInfos")
def attr_master_instance_infos(self) -> ros_cdk_core.IResolvable:
'''Attribute MasterInstanceInfos: Master instance infos.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrMasterInstanceInfos"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrOrderId")
def attr_order_id(self) -> ros_cdk_core.IResolvable:
'''Attribute OrderId: Order ID.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrOrderId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrSecurityGroupId")
def attr_security_group_id(self) -> ros_cdk_core.IResolvable:
'''Attribute SecurityGroupId: Security group Id.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrSecurityGroupId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrSlaveInstanceInfos")
def attr_slave_instance_infos(self) -> ros_cdk_core.IResolvable:
'''Attribute SlaveInstanceInfos: Slave instance infos.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrSlaveInstanceInfos"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrState")
def attr_state(self) -> ros_cdk_core.IResolvable:
'''Attribute State: Cluster status.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrState"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrVSwitchIds")
def attr_v_switch_ids(self) -> ros_cdk_core.IResolvable:
'''Attribute VSwitchIds: VSwitch Ids.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrVSwitchIds"))
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-foas.ClusterProps",
jsii_struct_bases=[],
name_mapping={
"cluster_name": "clusterName",
"description": "description",
"oss_bucket": "ossBucket",
"v_switch_id": "vSwitchId",
"order": "order",
"order_id": "orderId",
},
)
class ClusterProps:
def __init__(
self,
*,
cluster_name: typing.Union[builtins.str, ros_cdk_core.IResolvable],
description: typing.Union[builtins.str, ros_cdk_core.IResolvable],
oss_bucket: typing.Union[builtins.str, ros_cdk_core.IResolvable],
v_switch_id: typing.Union[builtins.str, ros_cdk_core.IResolvable],
order: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosCluster.OrderProperty"]] = None,
order_id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''Properties for defining a ``ALIYUN::FOAS::Cluster``.
:param cluster_name: Property clusterName: Cluster name. It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
:param description: Property description: Cluster description.
:param oss_bucket: Property ossBucket: Bucket name in your OSS.
:param v_switch_id: Property vSwitchId: VSwitch ID.
:param order: Property order: Order detail. Only one of property Order or OrderId can be specified. Order is not suggested. Policy AliyunBSSFullAccess must be granted to StreamDefaultRole in RAM console. The order can not be cancelled.
:param order_id: Property orderId: Order Id. Only one of property Order or OrderId can be specified. OrderId is suggested.
'''
self._values: typing.Dict[str, typing.Any] = {
"cluster_name": cluster_name,
"description": description,
"oss_bucket": oss_bucket,
"v_switch_id": v_switch_id,
}
if order is not None:
self._values["order"] = order
if order_id is not None:
self._values["order_id"] = order_id
@builtins.property
def cluster_name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property clusterName: Cluster name.
It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
'''
result = self._values.get("cluster_name")
assert result is not None, "Required property 'cluster_name' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def description(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property description: Cluster description.'''
result = self._values.get("description")
assert result is not None, "Required property 'description' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def oss_bucket(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property ossBucket: Bucket name in your OSS.'''
result = self._values.get("oss_bucket")
assert result is not None, "Required property 'oss_bucket' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def v_switch_id(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property vSwitchId: VSwitch ID.'''
result = self._values.get("v_switch_id")
assert result is not None, "Required property 'v_switch_id' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def order(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosCluster.OrderProperty"]]:
'''Property order: Order detail.
Only one of property Order or OrderId can be specified.
Order is not suggested.
Policy AliyunBSSFullAccess must be granted to StreamDefaultRole in RAM console.
The order can not be cancelled.
'''
result = self._values.get("order")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosCluster.OrderProperty"]], result)
@builtins.property
def order_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''Property orderId: Order Id.
Only one of property Order or OrderId can be specified.
OrderId is suggested.
'''
result = self._values.get("order_id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "ClusterProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
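`ClusterProps` follows the generated-struct pattern used throughout this module: required values are stored unconditionally in a private `_values` dict, optional ones only when supplied, and the required-property getters assert at read time rather than at construction. A stripped-down, jsii-free sketch of that pattern (class and property names here are illustrative):

```python
class PropsSketch:
    # Mimics the generated props classes: required keys always stored,
    # optional keys stored only when provided.
    def __init__(self, cluster_name, description, order_id=None):
        self._values = {
            "cluster_name": cluster_name,
            "description": description,
        }
        if order_id is not None:
            self._values["order_id"] = order_id

    @property
    def cluster_name(self):
        # Required property: missing values surface as an AssertionError
        # at read time, not at construction time.
        result = self._values.get("cluster_name")
        assert result is not None, "Required property 'cluster_name' is missing"
        return result

    @property
    def order_id(self):
        # Optional property: simply absent (None) when not supplied.
        return self._values.get("order_id")

props = PropsSketch("my_cluster", "demo cluster")
```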
class Project(
ros_cdk_core.Resource,
metaclass=jsii.JSIIMeta,
jsii_type="@alicloud/ros-cdk-foas.Project",
):
'''A ROS resource type: ``ALIYUN::FOAS::Project``.'''
def __init__(
self,
scope: ros_cdk_core.Construct,
id: builtins.str,
props: "ProjectProps",
enable_resource_property_constraint: typing.Optional[builtins.bool] = None,
) -> None:
'''Create a new ``ALIYUN::FOAS::Project``.

:param scope: scope in which this resource is defined
:param id: scoped id of the resource
:param props: resource properties
:param enable_resource_property_constraint: -
'''
jsii.create(self.__class__, self, [scope, id, props, enable_resource_property_constraint])
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrState")
def attr_state(self) -> ros_cdk_core.IResolvable:
'''Attribute State: Project state.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrState"))
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-foas.ProjectProps",
jsii_struct_bases=[],
name_mapping={
"deploy_type": "deployType",
"manager_ids": "managerIds",
"name": "name",
"cluster_id": "clusterId",
"description": "description",
"order_id": "orderId",
},
)
class ProjectProps:
def __init__(
self,
*,
deploy_type: typing.Union[builtins.str, ros_cdk_core.IResolvable],
manager_ids: typing.Union[builtins.str, ros_cdk_core.IResolvable],
name: typing.Union[builtins.str, ros_cdk_core.IResolvable],
cluster_id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
description: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
order_id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''Properties for defining a ``ALIYUN::FOAS::Project``.
:param deploy_type: Property deployType: Cluster type. Valid values: cell (exclusive cluster) or public (shared cluster).
:param manager_ids: Property managerIds: Comma delimited account Id list of managers.
:param name: Property name: Project name. It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
:param cluster_id: Property clusterId: Cluster ID.
:param description: Property description: Project description.
:param order_id: Property orderId: Order Id of Shared cluster.
'''
self._values: typing.Dict[str, typing.Any] = {
"deploy_type": deploy_type,
"manager_ids": manager_ids,
"name": name,
}
if cluster_id is not None:
self._values["cluster_id"] = cluster_id
if description is not None:
self._values["description"] = description
if order_id is not None:
self._values["order_id"] = order_id
@builtins.property
def deploy_type(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property deployType: Cluster type. Valid values: cell (exclusive cluster) or public (shared cluster).'''
result = self._values.get("deploy_type")
assert result is not None, "Required property 'deploy_type' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def manager_ids(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property managerIds: Comma delimited account Id list of managers.'''
result = self._values.get("manager_ids")
assert result is not None, "Required property 'manager_ids' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property name: Project name.
It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
'''
result = self._values.get("name")
assert result is not None, "Required property 'name' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def cluster_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''Property clusterId: Cluster ID.'''
result = self._values.get("cluster_id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def description(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''Property description: Project description.'''
result = self._values.get("description")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def order_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''Property orderId: Order Id of Shared cluster.'''
result = self._values.get("order_id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "ProjectProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class RosCluster(
ros_cdk_core.RosResource,
metaclass=jsii.JSIIMeta,
jsii_type="@alicloud/ros-cdk-foas.RosCluster",
):
'''A ROS template type: ``ALIYUN::FOAS::Cluster``.'''
def __init__(
self,
scope: ros_cdk_core.Construct,
id: builtins.str,
props: "RosClusterProps",
enable_resource_property_constraint: builtins.bool,
) -> None:
'''Create a new ``ALIYUN::FOAS::Cluster``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param enable_resource_property_constraint: -
'''
jsii.create(self.__class__, self, [scope, id, props, enable_resource_property_constraint])
@jsii.member(jsii_name="renderProperties")
def _render_properties(
self,
props: typing.Mapping[builtins.str, typing.Any],
) -> typing.Mapping[builtins.str, typing.Any]:
'''
:param props: -
'''
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))
@jsii.python.classproperty # type: ignore[misc]
@jsii.member(jsii_name="ROS_RESOURCE_TYPE_NAME")
def ROS_RESOURCE_TYPE_NAME(cls) -> builtins.str:
'''The resource type name for this resource class.'''
return typing.cast(builtins.str, jsii.sget(cls, "ROS_RESOURCE_TYPE_NAME"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrClusterId")
def attr_cluster_id(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: ClusterId: Cluster ID.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrClusterId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrEngineVersions")
def attr_engine_versions(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: EngineVersions: Engine Versions.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrEngineVersions"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrMasterInstanceInfos")
def attr_master_instance_infos(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: MasterInstanceInfos: Master instance infos.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrMasterInstanceInfos"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrOrderId")
def attr_order_id(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: OrderId: Order ID.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrOrderId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrSecurityGroupId")
def attr_security_group_id(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: SecurityGroupId: Security group Id.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrSecurityGroupId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrSlaveInstanceInfos")
def attr_slave_instance_infos(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: SlaveInstanceInfos: Slave instance infos.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrSlaveInstanceInfos"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrState")
def attr_state(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: State: Cluster status.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrState"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrVSwitchIds")
def attr_v_switch_ids(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: VSwitchIds: VSwitch Ids.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrVSwitchIds"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="rosProperties")
def _ros_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "rosProperties"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="clusterName")
def cluster_name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: clusterName: Cluster name. It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "clusterName"))
@cluster_name.setter
def cluster_name(
self,
value: typing.Union[builtins.str, ros_cdk_core.IResolvable],
) -> None:
jsii.set(self, "clusterName", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="description")
def description(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: description: Cluster description.
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "description"))
@description.setter
def description(
self,
value: typing.Union[builtins.str, ros_cdk_core.IResolvable],
) -> None:
jsii.set(self, "description", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="enableResourcePropertyConstraint")
def enable_resource_property_constraint(self) -> builtins.bool:
return typing.cast(builtins.bool, jsii.get(self, "enableResourcePropertyConstraint"))
@enable_resource_property_constraint.setter
def enable_resource_property_constraint(self, value: builtins.bool) -> None:
jsii.set(self, "enableResourcePropertyConstraint", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="ossBucket")
def oss_bucket(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: ossBucket: Bucket name in your OSS.
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "ossBucket"))
@oss_bucket.setter
def oss_bucket(
self,
value: typing.Union[builtins.str, ros_cdk_core.IResolvable],
) -> None:
jsii.set(self, "ossBucket", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="vSwitchId")
def v_switch_id(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: vSwitchId: VSwitch ID.
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "vSwitchId"))
@v_switch_id.setter
def v_switch_id(
self,
value: typing.Union[builtins.str, ros_cdk_core.IResolvable],
) -> None:
jsii.set(self, "vSwitchId", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="order")
def order(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosCluster.OrderProperty"]]:
'''
:Property:
order: Order detail. Only one of property Order or OrderId can be specified.
Order is not suggested.
Policy AliyunBSSFullAccess must be granted to StreamDefaultRole in RAM console.
The order can not be cancelled.
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosCluster.OrderProperty"]], jsii.get(self, "order"))
@order.setter
def order(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosCluster.OrderProperty"]],
) -> None:
jsii.set(self, "order", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="orderId")
def order_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property:
orderId: Order Id. Only one of property Order or OrderId can be specified.
OrderId is suggested.
'''
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], jsii.get(self, "orderId"))
@order_id.setter
def order_id(
self,
value: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]],
) -> None:
jsii.set(self, "orderId", value)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-foas.RosCluster.OrderProperty",
jsii_struct_bases=[],
name_mapping={
"master_spec": "masterSpec",
"slave_spec": "slaveSpec",
"master_number": "masterNumber",
"pay_model": "payModel",
"period": "period",
"slave_number": "slaveNumber",
},
)
class OrderProperty:
def __init__(
self,
*,
master_spec: typing.Union[builtins.str, ros_cdk_core.IResolvable],
slave_spec: typing.Union[builtins.str, ros_cdk_core.IResolvable],
master_number: typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]] = None,
pay_model: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
period: typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]] = None,
slave_number: typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param master_spec:
:param slave_spec:
:param master_number:
:param pay_model:
:param period:
:param slave_number:
'''
self._values: typing.Dict[str, typing.Any] = {
"master_spec": master_spec,
"slave_spec": slave_spec,
}
if master_number is not None:
self._values["master_number"] = master_number
if pay_model is not None:
self._values["pay_model"] = pay_model
if period is not None:
self._values["period"] = period
if slave_number is not None:
self._values["slave_number"] = slave_number
@builtins.property
def master_spec(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: masterSpec: Master spec, such as Ecs_4c16g.
'''
result = self._values.get("master_spec")
assert result is not None, "Required property 'master_spec' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def slave_spec(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: slaveSpec: Slave spec, such as Ecs_4c16g.
'''
result = self._values.get("slave_spec")
assert result is not None, "Required property 'slave_spec' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def master_number(
self,
) -> typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]]:
'''
:Property: masterNumber: Number of masters. Valid values: 1, 3. Default to 3.
'''
result = self._values.get("master_number")
return typing.cast(typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]], result)
@builtins.property
def pay_model(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: payModel: Pay model. Valid values: pre, post. Default to post.
'''
result = self._values.get("pay_model")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def period(
self,
) -> typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]]:
'''
:Property: period: Pre paid time period. Unit is month. Default to 1.
'''
result = self._values.get("period")
return typing.cast(typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]], result)
@builtins.property
def slave_number(
self,
) -> typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]]:
'''
:Property: slaveNumber: Number of slaves. Valid values: 2-1000. Default to 2.
'''
result = self._values.get("slave_number")
return typing.cast(typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "OrderProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
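The `OrderProperty` docstrings pin down concrete constraints: `masterNumber` is 1 or 3 (default 3), `payModel` is `pre` or `post` (default `post`), and `slaveNumber` ranges 2-1000 (default 2). The generated class only stores the values, so a hypothetical client-side pre-check along those documented lines can catch mistakes before the template reaches ROS:

```python
def check_order(order):
    # Validate an order dict against the constraints documented on
    # RosCluster.OrderProperty. Defaults mirror the docstrings.
    errors = []
    if order.get("master_number", 3) not in (1, 3):
        errors.append("master_number must be 1 or 3")
    if order.get("pay_model", "post") not in ("pre", "post"):
        errors.append("pay_model must be 'pre' or 'post'")
    if not 2 <= order.get("slave_number", 2) <= 1000:
        errors.append("slave_number must be in 2-1000")
    return errors

ok = check_order({"master_spec": "Ecs_4c16g", "slave_spec": "Ecs_4c16g"})
bad = check_order({"master_number": 2, "pay_model": "monthly"})
```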
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-foas.RosClusterProps",
jsii_struct_bases=[],
name_mapping={
"cluster_name": "clusterName",
"description": "description",
"oss_bucket": "ossBucket",
"v_switch_id": "vSwitchId",
"order": "order",
"order_id": "orderId",
},
)
class RosClusterProps:
def __init__(
self,
*,
cluster_name: typing.Union[builtins.str, ros_cdk_core.IResolvable],
description: typing.Union[builtins.str, ros_cdk_core.IResolvable],
oss_bucket: typing.Union[builtins.str, ros_cdk_core.IResolvable],
v_switch_id: typing.Union[builtins.str, ros_cdk_core.IResolvable],
order: typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosCluster.OrderProperty]] = None,
order_id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''Properties for defining a ``ALIYUN::FOAS::Cluster``.
:param cluster_name:
:param description:
:param oss_bucket:
:param v_switch_id:
:param order:
:param order_id:
'''
self._values: typing.Dict[str, typing.Any] = {
"cluster_name": cluster_name,
"description": description,
"oss_bucket": oss_bucket,
"v_switch_id": v_switch_id,
}
if order is not None:
self._values["order"] = order
if order_id is not None:
self._values["order_id"] = order_id
@builtins.property
def cluster_name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: clusterName: Cluster name. It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
'''
result = self._values.get("cluster_name")
assert result is not None, "Required property 'cluster_name' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def description(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: description: Cluster description.
'''
result = self._values.get("description")
assert result is not None, "Required property 'description' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def oss_bucket(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: ossBucket: Bucket name in your OSS.
'''
result = self._values.get("oss_bucket")
assert result is not None, "Required property 'oss_bucket' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def v_switch_id(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: vSwitchId: VSwitch ID.
'''
result = self._values.get("v_switch_id")
assert result is not None, "Required property 'v_switch_id' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def order(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosCluster.OrderProperty]]:
'''
:Property:
order: Order detail. Only one of property Order or OrderId can be specified.
Order is not suggested.
Policy AliyunBSSFullAccess must be granted to StreamDefaultRole in RAM console.
The order can not be cancelled.
'''
result = self._values.get("order")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosCluster.OrderProperty]], result)
@builtins.property
def order_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property:
orderId: Order Id. Only one of property Order or OrderId can be specified.
OrderId is suggested.
'''
result = self._values.get("order_id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RosClusterProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
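The `_values` storage pattern used throughout these props classes — required keys stored eagerly, optional keys stored only when supplied, and an `assert` on every required read — can be distilled into a standalone sketch. The `Props` class below is hypothetical, not part of the generated module:

```python
import typing


class Props:
    """Minimal mimic of the generated *Props pattern: required keys are
    stored up front, optional keys only when supplied, and reads of
    required properties re-assert their presence."""

    def __init__(self, name: str, order_id: typing.Optional[str] = None) -> None:
        # Required properties always go into the dict.
        self._values: typing.Dict[str, typing.Any] = {"name": name}
        # Optional properties are stored only when actually provided.
        if order_id is not None:
            self._values["order_id"] = order_id

    @property
    def name(self) -> str:
        result = self._values.get("name")
        assert result is not None, "Required property 'name' is missing"
        return typing.cast(str, result)

    @property
    def order_id(self) -> typing.Optional[str]:
        # Optional: absent keys simply come back as None.
        return typing.cast(typing.Optional[str], self._values.get("order_id"))


p = Props(name="demo")
print(p.name)      # -> demo
print(p.order_id)  # -> None
```

Because optional keys are omitted rather than stored as `None`, the dict also serializes cleanly: absent properties never appear in the rendered template.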
class RosProject(
ros_cdk_core.RosResource,
metaclass=jsii.JSIIMeta,
jsii_type="@alicloud/ros-cdk-foas.RosProject",
):
'''A ROS template type: ``ALIYUN::FOAS::Project``.'''
def __init__(
self,
scope: ros_cdk_core.Construct,
id: builtins.str,
props: "RosProjectProps",
enable_resource_property_constraint: builtins.bool,
) -> None:
'''Create a new ``ALIYUN::FOAS::Project``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param enable_resource_property_constraint: -
'''
jsii.create(self.__class__, self, [scope, id, props, enable_resource_property_constraint])
@jsii.member(jsii_name="renderProperties")
def _render_properties(
self,
props: typing.Mapping[builtins.str, typing.Any],
) -> typing.Mapping[builtins.str, typing.Any]:
'''
:param props: -
'''
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))
@jsii.python.classproperty # type: ignore[misc]
@jsii.member(jsii_name="ROS_RESOURCE_TYPE_NAME")
def ROS_RESOURCE_TYPE_NAME(cls) -> builtins.str:
'''The resource type name for this resource class.'''
return typing.cast(builtins.str, jsii.sget(cls, "ROS_RESOURCE_TYPE_NAME"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrState")
def attr_state(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: State: Project state.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrState"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="rosProperties")
def _ros_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "rosProperties"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="deployType")
def deploy_type(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property:
deployType: Cluster type:
Exclusive cluster: cell
Shared cluster: public
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "deployType"))
@deploy_type.setter
def deploy_type(
self,
value: typing.Union[builtins.str, ros_cdk_core.IResolvable],
) -> None:
jsii.set(self, "deployType", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="enableResourcePropertyConstraint")
def enable_resource_property_constraint(self) -> builtins.bool:
return typing.cast(builtins.bool, jsii.get(self, "enableResourcePropertyConstraint"))
@enable_resource_property_constraint.setter
def enable_resource_property_constraint(self, value: builtins.bool) -> None:
jsii.set(self, "enableResourcePropertyConstraint", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="managerIds")
def manager_ids(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: managerIds: Comma delimited account Id list of managers.
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "managerIds"))
@manager_ids.setter
def manager_ids(
self,
value: typing.Union[builtins.str, ros_cdk_core.IResolvable],
) -> None:
jsii.set(self, "managerIds", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="name")
def name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: name: Project name. It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "name"))
@name.setter
def name(self, value: typing.Union[builtins.str, ros_cdk_core.IResolvable]) -> None:
jsii.set(self, "name", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="clusterId")
def cluster_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: clusterId: Cluster ID.
'''
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], jsii.get(self, "clusterId"))
@cluster_id.setter
def cluster_id(
self,
value: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]],
) -> None:
jsii.set(self, "clusterId", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="description")
def description(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: description: Project description.
'''
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], jsii.get(self, "description"))
@description.setter
def description(
self,
value: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]],
) -> None:
jsii.set(self, "description", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="orderId")
def order_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: orderId: Order Id of Shared cluster.
'''
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], jsii.get(self, "orderId"))
@order_id.setter
def order_id(
self,
value: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]],
) -> None:
jsii.set(self, "orderId", value)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-foas.RosProjectProps",
jsii_struct_bases=[],
name_mapping={
"deploy_type": "deployType",
"manager_ids": "managerIds",
"name": "name",
"cluster_id": "clusterId",
"description": "description",
"order_id": "orderId",
},
)
class RosProjectProps:
def __init__(
self,
*,
deploy_type: typing.Union[builtins.str, ros_cdk_core.IResolvable],
manager_ids: typing.Union[builtins.str, ros_cdk_core.IResolvable],
name: typing.Union[builtins.str, ros_cdk_core.IResolvable],
cluster_id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
description: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
order_id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''Properties for defining a ``ALIYUN::FOAS::Project``.
:param deploy_type:
:param manager_ids:
:param name:
:param cluster_id:
:param description:
:param order_id:
'''
self._values: typing.Dict[str, typing.Any] = {
"deploy_type": deploy_type,
"manager_ids": manager_ids,
"name": name,
}
if cluster_id is not None:
self._values["cluster_id"] = cluster_id
if description is not None:
self._values["description"] = description
if order_id is not None:
self._values["order_id"] = order_id
@builtins.property
def deploy_type(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property:
deployType: Cluster type:
Exclusive cluster: cell
Shared cluster: public
'''
result = self._values.get("deploy_type")
assert result is not None, "Required property 'deploy_type' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def manager_ids(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: managerIds: Comma delimited account Id list of managers.
'''
result = self._values.get("manager_ids")
assert result is not None, "Required property 'manager_ids' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: name: Project name. It begins with a letter, and contains only lowercase English letters, numbers, underscores (_), and is limited to 3-64 characters.
'''
result = self._values.get("name")
assert result is not None, "Required property 'name' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def cluster_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: clusterId: Cluster ID.
'''
result = self._values.get("cluster_id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def description(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: description: Project description.
'''
result = self._values.get("description")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def order_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: orderId: Order Id of Shared cluster.
'''
result = self._values.get("order_id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RosProjectProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
__all__ = [
"Cluster",
"ClusterProps",
"Project",
"ProjectProps",
"RosCluster",
"RosClusterProps",
"RosProject",
"RosProjectProps",
]
publication.publish()
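The `__eq__`/`__ne__`/`__repr__` trio that closes each props class above gives these objects value semantics: two instances compare equal exactly when their stored dictionaries match, regardless of identity. A minimal standalone mimic (the `ValueProps` name is hypothetical):

```python
class ValueProps:
    """Value-semantics sketch matching the generated props classes."""

    def __init__(self, **values):
        self._values = dict(values)

    def __eq__(self, rhs):
        # Equal iff same class and same stored values.
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs):
        return not (rhs == self)

    def __repr__(self):
        return "ValueProps(%s)" % ", ".join(
            k + "=" + repr(v) for k, v in self._values.items()
        )


a = ValueProps(name="p1")
b = ValueProps(name="p1")
print(a == b)   # -> True
print(repr(a))  # -> ValueProps(name='p1')
```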
# coding: utf-8
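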
"""
Account API
The <b>Account API</b> gives sellers the ability to configure their eBay seller accounts, including the seller's policies (the Fulfillment Policy, Payment Policy, and Return Policy), opt in and out of eBay seller programs, configure sales tax tables, and get account information. <br><br>For details on the availability of the methods in this API, see <a href=\"/api-docs/sell/account/overview.html#requirements\">Account API requirements and restrictions</a>. # noqa: E501
OpenAPI spec version: v1.6.3
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from ...sell_account.api_client import ApiClient
class FulfillmentPolicyApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_fulfillment_policy(self, body, **kwargs): # noqa: E501
"""create_fulfillment_policy # noqa: E501
This method creates a new fulfillment policy where the policy encapsulates seller's terms for fulfilling item purchases. Fulfillment policies include the shipment options that the seller offers to buyers. <br><br>Each policy targets a <b>marketplaceId</b> and <code>categoryTypes.</code><b>name</b> combination and you can create multiple policies for each combination. Be aware that some marketplaces require a specific fulfillment policy for vehicle listings. <br><br>A successful request returns the URI to the new policy in the <b>Location</b> response header and the ID for the new policy is returned in the response payload. <p class=\"tablenote\"><b>Tip:</b> For details on creating and using the business policies supported by the Account API, see <a href=\"/api-docs/sell/static/seller-accounts/business-policies.html\">eBay business policies</a>.</p> <p><b>Marketplaces and locales</b></p> <p>Policy instructions can be localized by providing a locale in the <code>Accept-Language</code> HTTP request header. For example, the following setting displays field values from the request body in German: <code>Accept-Language: de-DE</code>.</p> <p>Target the specific locale of a marketplace that supports multiple locales using the <code>Content-Language</code> request header. For example, target the French locale of the Canadian marketplace by specifying the <code>fr-CA</code> locale for <code>Content-Language</code>. Likewise, target the Dutch locale of the Belgium marketplace by setting <code>Content-Language: nl-BE</code>.</p> <p class=\"tablenote\"><b>Tip:</b> For details on headers, see <a href=\"/api-docs/static/rest-request-components.html#HTTP\">HTTP request headers</a>.</p><p><b>Using the eBay standard envelope service (eSE)</b></p> <p>The eBay standard envelope service (eSE) is a domestic envelope service with tracking through eBay. This service applies to specific Trading Cards categories (not all categories are supported), and to Coins & Paper Money, Postcards, and Stamps. See <a href=\"/api-docs/sell/static/seller-accounts/using-the-ebay-standard-envelope-service.html\" target=\"_blank\">Using the eBay standard envelope (eSE) service</a>.</p> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_fulfillment_policy(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param FulfillmentPolicyRequest body: Request to create a seller account fulfillment policy. (required)
:return: SetFulfillmentPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_fulfillment_policy_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.create_fulfillment_policy_with_http_info(body, **kwargs) # noqa: E501
return data
def create_fulfillment_policy_with_http_info(self, body, **kwargs): # noqa: E501
"""create_fulfillment_policy # noqa: E501
This method creates a new fulfillment policy where the policy encapsulates seller's terms for fulfilling item purchases. Fulfillment policies include the shipment options that the seller offers to buyers. <br><br>Each policy targets a <b>marketplaceId</b> and <code>categoryTypes.</code><b>name</b> combination and you can create multiple policies for each combination. Be aware that some marketplaces require a specific fulfillment policy for vehicle listings. <br><br>A successful request returns the URI to the new policy in the <b>Location</b> response header and the ID for the new policy is returned in the response payload. <p class=\"tablenote\"><b>Tip:</b> For details on creating and using the business policies supported by the Account API, see <a href=\"/api-docs/sell/static/seller-accounts/business-policies.html\">eBay business policies</a>.</p> <p><b>Marketplaces and locales</b></p> <p>Policy instructions can be localized by providing a locale in the <code>Accept-Language</code> HTTP request header. For example, the following setting displays field values from the request body in German: <code>Accept-Language: de-DE</code>.</p> <p>Target the specific locale of a marketplace that supports multiple locales using the <code>Content-Language</code> request header. For example, target the French locale of the Canadian marketplace by specifying the <code>fr-CA</code> locale for <code>Content-Language</code>. Likewise, target the Dutch locale of the Belgium marketplace by setting <code>Content-Language: nl-BE</code>.</p> <p class=\"tablenote\"><b>Tip:</b> For details on headers, see <a href=\"/api-docs/static/rest-request-components.html#HTTP\">HTTP request headers</a>.</p><p><b>Using the eBay standard envelope service (eSE)</b></p> <p>The eBay standard envelope service (eSE) is a domestic envelope service with tracking through eBay. This service applies to specific Trading Cards categories (not all categories are supported), and to Coins & Paper Money, Postcards, and Stamps. See <a href=\"/api-docs/sell/static/seller-accounts/using-the-ebay-standard-envelope-service.html\" target=\"_blank\">Using the eBay standard envelope (eSE) service</a>.</p> # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_fulfillment_policy_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param FulfillmentPolicyRequest body: Request to create a seller account fulfillment policy. (required)
:return: SetFulfillmentPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_fulfillment_policy" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `create_fulfillment_policy`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_auth'] # noqa: E501
return self.api_client.call_api(
'/fulfillment_policy', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SetFulfillmentPolicyResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
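Every generated `*_with_http_info` method screens its keyword arguments against a whitelist before doing any work, raising `TypeError` on anything unexpected. That screening step, distilled into a standalone helper (`screen_kwargs` is a hypothetical name, not part of this module):

```python
def screen_kwargs(method_name, all_params, **kwargs):
    """Mimic of the kwargs whitelist check in each *_with_http_info method."""
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
    return dict(kwargs)


# Accepted keyword survives the screen unchanged:
ok = screen_kwargs("create_fulfillment_policy", ["body", "async_req"], body={"name": "p"})
print(ok)  # -> {'body': {'name': 'p'}}
```

An unknown keyword such as `bogus=1` would raise `TypeError` with the offending name in the message, matching the generated code's behavior.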
def delete_fulfillment_policy(self, fulfillment_policy_id, **kwargs): # noqa: E501
"""delete_fulfillment_policy # noqa: E501
This method deletes a fulfillment policy. Supply the ID of the policy you want to delete in the <b>fulfillmentPolicyId</b> path parameter. Note that you cannot delete the default fulfillment policy. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_fulfillment_policy(fulfillment_policy_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fulfillment_policy_id: This path parameter specifies the ID of the fulfillment policy to delete. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_fulfillment_policy_with_http_info(fulfillment_policy_id, **kwargs) # noqa: E501
else:
(data) = self.delete_fulfillment_policy_with_http_info(fulfillment_policy_id, **kwargs) # noqa: E501
return data
def delete_fulfillment_policy_with_http_info(self, fulfillment_policy_id, **kwargs): # noqa: E501
"""delete_fulfillment_policy # noqa: E501
This method deletes a fulfillment policy. Supply the ID of the policy you want to delete in the <b>fulfillmentPolicyId</b> path parameter. Note that you cannot delete the default fulfillment policy. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_fulfillment_policy_with_http_info(fulfillment_policy_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fulfillment_policy_id: This path parameter specifies the ID of the fulfillment policy to delete. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fulfillment_policy_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_fulfillment_policy" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'fulfillment_policy_id' is set
if ('fulfillment_policy_id' not in params or
params['fulfillment_policy_id'] is None):
raise ValueError("Missing the required parameter `fulfillment_policy_id` when calling `delete_fulfillment_policy`") # noqa: E501
collection_formats = {}
path_params = {}
if 'fulfillment_policy_id' in params:
path_params['fulfillmentPolicyId'] = params['fulfillment_policy_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['api_auth'] # noqa: E501
return self.api_client.call_api(
'/fulfillment_policy/{fulfillmentPolicyId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
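The path template `'/fulfillment_policy/{fulfillmentPolicyId}'` passed to `call_api` above is expanded from `path_params` inside `ApiClient`. A rough sketch of that substitution (the real client also URL-escapes values; `render_path` is a hypothetical stand-in):

```python
def render_path(template, path_params):
    """Naive expansion of a swagger-style path template from path_params."""
    url = template
    for name, value in path_params.items():
        url = url.replace("{%s}" % name, str(value))
    return url


print(render_path("/fulfillment_policy/{fulfillmentPolicyId}",
                  {"fulfillmentPolicyId": "6103485000"}))
# -> /fulfillment_policy/6103485000
```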
def get_fulfillment_policies(self, marketplace_id, **kwargs): # noqa: E501
"""get_fulfillment_policies # noqa: E501
This method retrieves all the fulfillment policies configured for the marketplace you specify using the <code>marketplace_id</code> query parameter. <br><br><b>Marketplaces and locales</b> <br><br>Get the correct policies for a marketplace that supports multiple locales using the <code>Content-Language</code> request header. For example, get the policies for the French locale of the Canadian marketplace by specifying <code>fr-CA</code> for the <code>Content-Language</code> header. Likewise, target the Dutch locale of the Belgium marketplace by setting <code>Content-Language: nl-BE</code>. For details on header values, see <a href=\"/api-docs/static/rest-request-components.html#HTTP\" target=\"_blank\">HTTP request headers</a>. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fulfillment_policies(marketplace_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str marketplace_id: This query parameter specifies the eBay marketplace of the policies you want to retrieve. For implementation help, refer to eBay API documentation at https://developer.ebay.com/api-docs/sell/account/types/ba:MarketplaceIdEnum (required)
:return: FulfillmentPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_fulfillment_policies_with_http_info(marketplace_id, **kwargs) # noqa: E501
else:
(data) = self.get_fulfillment_policies_with_http_info(marketplace_id, **kwargs) # noqa: E501
return data
def get_fulfillment_policies_with_http_info(self, marketplace_id, **kwargs): # noqa: E501
"""get_fulfillment_policies # noqa: E501
This method retrieves all the fulfillment policies configured for the marketplace you specify using the <code>marketplace_id</code> query parameter. <br><br><b>Marketplaces and locales</b> <br><br>Get the correct policies for a marketplace that supports multiple locales using the <code>Content-Language</code> request header. For example, get the policies for the French locale of the Canadian marketplace by specifying <code>fr-CA</code> for the <code>Content-Language</code> header. Likewise, target the Dutch locale of the Belgium marketplace by setting <code>Content-Language: nl-BE</code>. For details on header values, see <a href=\"/api-docs/static/rest-request-components.html#HTTP\" target=\"_blank\">HTTP request headers</a>. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fulfillment_policies_with_http_info(marketplace_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str marketplace_id: This query parameter specifies the eBay marketplace of the policies you want to retrieve. For implementation help, refer to eBay API documentation at https://developer.ebay.com/api-docs/sell/account/types/ba:MarketplaceIdEnum (required)
:return: FulfillmentPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['marketplace_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_fulfillment_policies" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'marketplace_id' is set
if ('marketplace_id' not in params or
params['marketplace_id'] is None):
raise ValueError("Missing the required parameter `marketplace_id` when calling `get_fulfillment_policies`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'marketplace_id' in params:
query_params.append(('marketplace_id', params['marketplace_id'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_auth'] # noqa: E501
return self.api_client.call_api(
'/fulfillment_policy', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FulfillmentPolicyResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
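Query parameters are collected as a list of `(name, value)` tuples, a shape that maps directly onto standard URL encoding. A small illustration of what the collected list becomes on the wire:

```python
from urllib.parse import urlencode

# Same shape the generated method builds before handing off to call_api.
query_params = []
marketplace_id = "EBAY_US"
query_params.append(("marketplace_id", marketplace_id))

print(urlencode(query_params))  # -> marketplace_id=EBAY_US
```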
def get_fulfillment_policy(self, fulfillment_policy_id, **kwargs): # noqa: E501
"""get_fulfillment_policy # noqa: E501
This method retrieves the complete details of a fulfillment policy. Supply the ID of the policy you want to retrieve using the <b>fulfillmentPolicyId</b> path parameter. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fulfillment_policy(fulfillment_policy_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fulfillment_policy_id: This path parameter specifies the ID of the fulfillment policy you want to retrieve. (required)
:return: FulfillmentPolicy
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_fulfillment_policy_with_http_info(fulfillment_policy_id, **kwargs) # noqa: E501
else:
(data) = self.get_fulfillment_policy_with_http_info(fulfillment_policy_id, **kwargs) # noqa: E501
return data
def get_fulfillment_policy_with_http_info(self, fulfillment_policy_id, **kwargs): # noqa: E501
"""get_fulfillment_policy # noqa: E501
This method retrieves the complete details of a fulfillment policy. Supply the ID of the policy you want to retrieve using the <b>fulfillmentPolicyId</b> path parameter. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fulfillment_policy_with_http_info(fulfillment_policy_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fulfillment_policy_id: This path parameter specifies the ID of the fulfillment policy you want to retrieve. (required)
:return: FulfillmentPolicy
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fulfillment_policy_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_fulfillment_policy" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'fulfillment_policy_id' is set
if ('fulfillment_policy_id' not in params or
params['fulfillment_policy_id'] is None):
raise ValueError("Missing the required parameter `fulfillment_policy_id` when calling `get_fulfillment_policy`") # noqa: E501
collection_formats = {}
path_params = {}
if 'fulfillment_policy_id' in params:
path_params['fulfillmentPolicyId'] = params['fulfillment_policy_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_auth'] # noqa: E501
return self.api_client.call_api(
'/fulfillment_policy/{fulfillmentPolicyId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FulfillmentPolicy', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
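Each public method in this class is a thin wrapper that forces `_return_http_data_only`, then either hands back the deferred call (when `async_req` is set) or unwraps and returns the data directly. The dispatch shape, mimicked standalone (`Dispatcher` and its return values are hypothetical stand-ins, not real API objects):

```python
class Dispatcher:
    """Mimic of the sync/async dispatch in each public wrapper method."""

    def _call_with_http_info(self, policy_id, **kwargs):
        if kwargs.get("async_req"):
            # Stand-in for the AsyncResult thread handle the real client returns.
            return ("thread-for", policy_id)
        return {"fulfillmentPolicyId": policy_id}

    def call(self, policy_id, **kwargs):
        kwargs["_return_http_data_only"] = True
        if kwargs.get("async_req"):
            return self._call_with_http_info(policy_id, **kwargs)
        (data) = self._call_with_http_info(policy_id, **kwargs)
        return data


d = Dispatcher()
print(d.call("p1"))                  # -> {'fulfillmentPolicyId': 'p1'}
print(d.call("p1", async_req=True))  # -> ('thread-for', 'p1')
```

In the real client the async branch returns a thread-like handle whose `.get()` yields the response, which is why the docstrings show `thread = api.get_fulfillment_policy(..., async_req=True)` followed by `thread.get()`.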
def get_fulfillment_policy_by_name(self, marketplace_id, name, **kwargs): # noqa: E501
"""get_fulfillment_policy_by_name # noqa: E501
This method retrieves the complete details for a single fulfillment policy. In the request, supply both the policy <code>name</code> and its associated <code>marketplace_id</code> as query parameters. <br><br><b>Marketplaces and locales</b> <br><br>Get the correct policy for a marketplace that supports multiple locales using the <code>Content-Language</code> request header. For example, get a policy for the French locale of the Canadian marketplace by specifying <code>fr-CA</code> for the <code>Content-Language</code> header. Likewise, target the Dutch locale of the Belgium marketplace by setting <code>Content-Language: nl-BE</code>. For details on header values, see <a href=\"/api-docs/static/rest-request-components.html#HTTP\">HTTP request headers</a>. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fulfillment_policy_by_name(marketplace_id, name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str marketplace_id: This query parameter specifies the eBay marketplace of the policy you want to retrieve. For implementation help, refer to eBay API documentation at https://developer.ebay.com/api-docs/sell/account/types/ba:MarketplaceIdEnum (required)
:param str name: This query parameter specifies the user-defined name of the fulfillment policy you want to retrieve. (required)
:return: FulfillmentPolicy
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_fulfillment_policy_by_name_with_http_info(marketplace_id, name, **kwargs) # noqa: E501
else:
(data) = self.get_fulfillment_policy_by_name_with_http_info(marketplace_id, name, **kwargs) # noqa: E501
return data
def get_fulfillment_policy_by_name_with_http_info(self, marketplace_id, name, **kwargs): # noqa: E501
"""get_fulfillment_policy_by_name # noqa: E501
This method retrieves the complete details for a single fulfillment policy. In the request, supply both the policy <code>name</code> and its associated <code>marketplace_id</code> as query parameters. <br><br><b>Marketplaces and locales</b> <br><br>Get the correct policy for a marketplace that supports multiple locales using the <code>Content-Language</code> request header. For example, get a policy for the French locale of the Canadian marketplace by specifying <code>fr-CA</code> for the <code>Content-Language</code> header. Likewise, target the Dutch locale of the Belgium marketplace by setting <code>Content-Language: nl-BE</code>. For details on header values, see <a href=\"/api-docs/static/rest-request-components.html#HTTP\">HTTP request headers</a>. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_fulfillment_policy_by_name_with_http_info(marketplace_id, name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str marketplace_id: This query parameter specifies the eBay marketplace of the policy you want to retrieve. For implementation help, refer to eBay API documentation at https://developer.ebay.com/api-docs/sell/account/types/ba:MarketplaceIdEnum (required)
:param str name: This query parameter specifies the user-defined name of the fulfillment policy you want to retrieve. (required)
:return: FulfillmentPolicy
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['marketplace_id', 'name'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_fulfillment_policy_by_name" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'marketplace_id' is set
if ('marketplace_id' not in params or
params['marketplace_id'] is None):
raise ValueError("Missing the required parameter `marketplace_id` when calling `get_fulfillment_policy_by_name`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `get_fulfillment_policy_by_name`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'marketplace_id' in params:
query_params.append(('marketplace_id', params['marketplace_id'])) # noqa: E501
if 'name' in params:
query_params.append(('name', params['name'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_auth'] # noqa: E501
return self.api_client.call_api(
'/fulfillment_policy/get_by_policy_name', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FulfillmentPolicy', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
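Each `*_with_http_info` method above validates its keyword arguments the same way: the accepted names are collected in `all_params`, any unknown keyword raises `TypeError`, and a missing required parameter raises `ValueError` before the HTTP call is attempted. A minimal standalone sketch of that idiom (the function name and return value here are illustrative, not part of the SDK):

```python
def get_policy(fulfillment_policy_id=None, **kwargs):
    """Illustrative re-creation of the swagger-codegen validation idiom."""
    all_params = ['fulfillment_policy_id', 'async_req', '_request_timeout']
    # Reject keyword arguments the method does not understand.
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method get_policy" % key)
    # Required parameters must be present and non-None.
    if fulfillment_policy_id is None:
        raise ValueError(
            "Missing the required parameter `fulfillment_policy_id`"
            " when calling `get_policy`")
    # The real method would now build path/query params and call the API client.
    return {'fulfillmentPolicyId': fulfillment_policy_id}
```

Note that because required parameters arrive as ordinary positional arguments in the generated code, passing `None` explicitly fails the same check as omitting the argument.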
def update_fulfillment_policy(self, body, fulfillment_policy_id, **kwargs): # noqa: E501
"""update_fulfillment_policy # noqa: E501
This method updates an existing fulfillment policy. Specify the policy you want to update using the <b>fulfillment_policy_id</b> path parameter. Supply a complete policy payload with the updates you want to make; this call overwrites the existing policy with the new details specified in the payload. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_fulfillment_policy(body, fulfillment_policy_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param FulfillmentPolicyRequest body: Fulfillment policy request (required)
:param str fulfillment_policy_id: This path parameter specifies the ID of the fulfillment policy you want to update. (required)
:return: SetFulfillmentPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_fulfillment_policy_with_http_info(body, fulfillment_policy_id, **kwargs) # noqa: E501
else:
(data) = self.update_fulfillment_policy_with_http_info(body, fulfillment_policy_id, **kwargs) # noqa: E501
return data
def update_fulfillment_policy_with_http_info(self, body, fulfillment_policy_id, **kwargs): # noqa: E501
"""update_fulfillment_policy # noqa: E501
This method updates an existing fulfillment policy. Specify the policy you want to update using the <b>fulfillment_policy_id</b> path parameter. Supply a complete policy payload with the updates you want to make; this call overwrites the existing policy with the new details specified in the payload. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_fulfillment_policy_with_http_info(body, fulfillment_policy_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param FulfillmentPolicyRequest body: Fulfillment policy request (required)
:param str fulfillment_policy_id: This path parameter specifies the ID of the fulfillment policy you want to update. (required)
:return: SetFulfillmentPolicyResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'fulfillment_policy_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_fulfillment_policy" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_fulfillment_policy`") # noqa: E501
# verify the required parameter 'fulfillment_policy_id' is set
if ('fulfillment_policy_id' not in params or
params['fulfillment_policy_id'] is None):
raise ValueError("Missing the required parameter `fulfillment_policy_id` when calling `update_fulfillment_policy`") # noqa: E501
collection_formats = {}
path_params = {}
if 'fulfillment_policy_id' in params:
path_params['fulfillmentPolicyId'] = params['fulfillment_policy_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_auth'] # noqa: E501
return self.api_client.call_api(
'/fulfillment_policy/{fulfillmentPolicyId}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SetFulfillmentPolicyResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 57.195513 | 2,209 | 0.676072 | 4,467 | 35,690 | 5.216029 | 0.068502 | 0.096309 | 0.038326 | 0.018541 | 0.960215 | 0.95721 | 0.953176 | 0.944378 | 0.938798 | 0.93133 | 0 | 0.010261 | 0.238162 | 35,690 | 623 | 2,210 | 57.287319 | 0.846672 | 0.518521 | 0 | 0.792793 | 0 | 0 | 0.209437 | 0.088002 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039039 | false | 0 | 0.012012 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
81528ad8c001a72d882d39a1f9ae8fb08c040ef6 | 82,554 | py | Python | atom/nucleus/python/nucleus_api/api/questionnaire_api.py | sumit4-ttn/SDK | b3ae385e5415e47ac70abd0b3fdeeaeee9aa7cff | [
"Apache-2.0"
] | null | null | null | atom/nucleus/python/nucleus_api/api/questionnaire_api.py | sumit4-ttn/SDK | b3ae385e5415e47ac70abd0b3fdeeaeee9aa7cff | [
"Apache-2.0"
] | null | null | null | atom/nucleus/python/nucleus_api/api/questionnaire_api.py | sumit4-ttn/SDK | b3ae385e5415e47ac70abd0b3fdeeaeee9aa7cff | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
Hydrogen Atom API
The Hydrogen Atom API # noqa: E501
OpenAPI spec version: 1.7.0
Contact: info@hydrogenplatform.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from nucleus_api.api_client import ApiClient
class QuestionnaireApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_answer_using_post(self, answer, **kwargs): # noqa: E501
"""Create an answer # noqa: E501
Create a new answer for a question. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_answer_using_post(answer, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Answer answer: answer (required)
:return: Answer
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_answer_using_post_with_http_info(answer, **kwargs) # noqa: E501
else:
(data) = self.create_answer_using_post_with_http_info(answer, **kwargs) # noqa: E501
return data
def create_answer_using_post_with_http_info(self, answer, **kwargs): # noqa: E501
"""Create an answer # noqa: E501
Create a new answer for a question. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_answer_using_post_with_http_info(answer, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Answer answer: answer (required)
:return: Answer
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['answer'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_answer_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'answer' is set
if ('answer' not in params or
params['answer'] is None):
raise ValueError("Missing the required parameter `answer` when calling `create_answer_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'answer' in params:
body_params = params['answer']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/answer', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Answer', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
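The sync/async split visible in `create_answer_using_post` is the standard swagger-codegen shape: the public wrapper forces `_return_http_data_only`, then either calls the `_with_http_info` variant directly or returns a thread-like object whose `.get()` yields the result. A self-contained sketch of that dispatch, with a stub standing in for the real HTTP call:

```python
from multiprocessing.pool import ThreadPool

_pool = ThreadPool(1)

def create_answer(answer, async_req=False):
    # Stand-in for the underlying api_client.call_api() HTTP request.
    def work():
        return {'answer': answer}
    if async_req:
        # Returns an AsyncResult; callers retrieve the value via .get(),
        # matching the ">>> thread = api.create_answer_using_post(...,
        # async_req=True); result = thread.get()" pattern in the docstrings.
        return _pool.apply_async(work)
    return work()
```

The same pool-backed `apply_async` mechanism is what the generated `ApiClient` uses under the hood when `async_req=True` is passed.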
def create_client_response_using_post(self, client_response, **kwargs): # noqa: E501
"""Create a client response # noqa: E501
Create a new client response for a question as part of a questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_client_response_using_post(client_response, async_req=True)
>>> result = thread.get()
:param async_req bool
:param ClientResponse client_response: clientResponse (required)
:return: ClientResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_client_response_using_post_with_http_info(client_response, **kwargs) # noqa: E501
else:
(data) = self.create_client_response_using_post_with_http_info(client_response, **kwargs) # noqa: E501
return data
def create_client_response_using_post_with_http_info(self, client_response, **kwargs): # noqa: E501
"""Create a client response # noqa: E501
Create a new client response for a question as part of a questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_client_response_using_post_with_http_info(client_response, async_req=True)
>>> result = thread.get()
:param async_req bool
:param ClientResponse client_response: clientResponse (required)
:return: ClientResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['client_response'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_client_response_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'client_response' is set
if ('client_response' not in params or
params['client_response'] is None):
raise ValueError("Missing the required parameter `client_response` when calling `create_client_response_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'client_response' in params:
body_params = params['client_response']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/client_response', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ClientResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_question_using_post(self, question, **kwargs): # noqa: E501
"""Create a question # noqa: E501
Create a new question for a questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_question_using_post(question, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Question question: question (required)
:return: Question
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_question_using_post_with_http_info(question, **kwargs) # noqa: E501
else:
(data) = self.create_question_using_post_with_http_info(question, **kwargs) # noqa: E501
return data
def create_question_using_post_with_http_info(self, question, **kwargs): # noqa: E501
"""Create a question # noqa: E501
Create a new question for a questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_question_using_post_with_http_info(question, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Question question: question (required)
:return: Question
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['question'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_question_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'question' is set
if ('question' not in params or
params['question'] is None):
raise ValueError("Missing the required parameter `question` when calling `create_question_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'question' in params:
body_params = params['question']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/question', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Question', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_questionnaire_using_post(self, questionnaire, **kwargs): # noqa: E501
"""Create a questionnaire # noqa: E501
Create a new questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_questionnaire_using_post(questionnaire, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Questionnaire questionnaire: questionnaire (required)
:return: Questionnaire
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_questionnaire_using_post_with_http_info(questionnaire, **kwargs) # noqa: E501
else:
(data) = self.create_questionnaire_using_post_with_http_info(questionnaire, **kwargs) # noqa: E501
return data
def create_questionnaire_using_post_with_http_info(self, questionnaire, **kwargs): # noqa: E501
"""Create a questionnaire # noqa: E501
Create a new questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_questionnaire_using_post_with_http_info(questionnaire, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Questionnaire questionnaire: questionnaire (required)
:return: Questionnaire
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['questionnaire'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_questionnaire_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'questionnaire' is set
if ('questionnaire' not in params or
params['questionnaire'] is None):
raise ValueError("Missing the required parameter `questionnaire` when calling `create_questionnaire_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'questionnaire' in params:
body_params = params['questionnaire']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/questionnaire', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Questionnaire', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_answer_using_delete(self, answer_id, **kwargs): # noqa: E501
"""Delete an answer # noqa: E501
Delete an answer for a question. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_answer_using_delete(answer_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str answer_id: UUID answer_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_answer_using_delete_with_http_info(answer_id, **kwargs) # noqa: E501
else:
(data) = self.delete_answer_using_delete_with_http_info(answer_id, **kwargs) # noqa: E501
return data
def delete_answer_using_delete_with_http_info(self, answer_id, **kwargs): # noqa: E501
"""Delete an answer # noqa: E501
Delete an answer for a question. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_answer_using_delete_with_http_info(answer_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str answer_id: UUID answer_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['answer_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_answer_using_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'answer_id' is set
if ('answer_id' not in params or
params['answer_id'] is None):
raise ValueError("Missing the required parameter `answer_id` when calling `delete_answer_using_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'answer_id' in params:
path_params['answer_id'] = params['answer_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/answer/{answer_id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
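The DELETE endpoints above carry their identifier in the URL path: `path_params['answer_id']` is substituted into the `/answer/{answer_id}` template by the API client before the request is sent. The substitution itself is plain string replacement, roughly as sketched below (this helper is illustrative; the real logic lives inside the generated `ApiClient.call_api`):

```python
def build_path(template, path_params):
    # Mirrors how the generated client fills '/answer/{answer_id}'-style
    # URL templates from the path_params dict built in each method.
    path = template
    for name, value in path_params.items():
        path = path.replace('{%s}' % name, str(value))
    return path
```

In the real client the substituted value is also URL-encoded, so identifiers containing reserved characters remain safe in the path.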
def delete_client_response_using_delete(self, client_response_id, **kwargs): # noqa: E501
"""Delete a client response # noqa: E501
Permanently delete a client response for a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_client_response_using_delete(client_response_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str client_response_id: UUID client_response_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_client_response_using_delete_with_http_info(client_response_id, **kwargs) # noqa: E501
else:
(data) = self.delete_client_response_using_delete_with_http_info(client_response_id, **kwargs) # noqa: E501
return data
def delete_client_response_using_delete_with_http_info(self, client_response_id, **kwargs): # noqa: E501
"""Delete a client response # noqa: E501
Permanently delete a client response for a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_client_response_using_delete_with_http_info(client_response_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str client_response_id: UUID client_response_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['client_response_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_client_response_using_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'client_response_id' is set
if ('client_response_id' not in params or
params['client_response_id'] is None):
raise ValueError("Missing the required parameter `client_response_id` when calling `delete_client_response_using_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'client_response_id' in params:
path_params['client_response_id'] = params['client_response_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/client_response/{client_response_id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_question_using_delete(self, question_id, **kwargs): # noqa: E501
"""Delete a question # noqa: E501
Permanently delete a question from a questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_question_using_delete(question_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str question_id: UUID question_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_question_using_delete_with_http_info(question_id, **kwargs) # noqa: E501
else:
(data) = self.delete_question_using_delete_with_http_info(question_id, **kwargs) # noqa: E501
return data
def delete_question_using_delete_with_http_info(self, question_id, **kwargs): # noqa: E501
"""Delete a question # noqa: E501
Permanently delete a question from a questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_question_using_delete_with_http_info(question_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str question_id: UUID question_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['question_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_question_using_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'question_id' is set
if ('question_id' not in params or
params['question_id'] is None):
raise ValueError("Missing the required parameter `question_id` when calling `delete_question_using_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'question_id' in params:
path_params['question_id'] = params['question_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/question/{question_id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_questionnaire_using_delete(self, questionnaire_id, **kwargs): # noqa: E501
"""Delete a questionnaire # noqa: E501
Permanently delete a questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_questionnaire_using_delete(questionnaire_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str questionnaire_id: UUID questionnaire_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_questionnaire_using_delete_with_http_info(questionnaire_id, **kwargs) # noqa: E501
else:
data = self.delete_questionnaire_using_delete_with_http_info(questionnaire_id, **kwargs)  # noqa: E501
return data
def delete_questionnaire_using_delete_with_http_info(self, questionnaire_id, **kwargs): # noqa: E501
"""Delete a questionnaire # noqa: E501
Permanently delete a questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_questionnaire_using_delete_with_http_info(questionnaire_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str questionnaire_id: UUID questionnaire_id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['questionnaire_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_questionnaire_using_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'questionnaire_id' is set
if ('questionnaire_id' not in params or
params['questionnaire_id'] is None):
raise ValueError("Missing the required parameter `questionnaire_id` when calling `delete_questionnaire_using_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'questionnaire_id' in params:
path_params['questionnaire_id'] = params['questionnaire_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/questionnaire/{questionnaire_id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_answer_all_using_get(self, **kwargs): # noqa: E501
"""List all Answers # noqa: E501
Get information for all Answers # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_answer_all_using_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageAnswer
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_answer_all_using_get_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_answer_all_using_get_with_http_info(**kwargs)  # noqa: E501
return data
def get_answer_all_using_get_with_http_info(self, **kwargs): # noqa: E501
"""List all Answers # noqa: E501
Get information for all Answers # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_answer_all_using_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageAnswer
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['ascending', 'filter', 'order_by', 'page', 'size'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_answer_all_using_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'ascending' in params:
query_params.append(('ascending', params['ascending'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
if 'order_by' in params:
query_params.append(('order_by', params['order_by'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/answer', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageAnswer', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_answer_using_get(self, answer_id, **kwargs): # noqa: E501
"""Retrieve an answer # noqa: E501
Retrieve the information for an answer to a question # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_answer_using_get(answer_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str answer_id: UUID answer_id (required)
:return: Answer
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_answer_using_get_with_http_info(answer_id, **kwargs) # noqa: E501
else:
data = self.get_answer_using_get_with_http_info(answer_id, **kwargs)  # noqa: E501
return data
def get_answer_using_get_with_http_info(self, answer_id, **kwargs): # noqa: E501
"""Retrieve an answer # noqa: E501
Retrieve the information for an answer to a question # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_answer_using_get_with_http_info(answer_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str answer_id: UUID answer_id (required)
:return: Answer
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['answer_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_answer_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'answer_id' is set
if ('answer_id' not in params or
params['answer_id'] is None):
raise ValueError("Missing the required parameter `answer_id` when calling `get_answer_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'answer_id' in params:
path_params['answer_id'] = params['answer_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/answer/{answer_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Answer', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_client_response_all_using_get(self, **kwargs): # noqa: E501
"""List all client responses # noqa: E501
Get all the client responses for questions as part of a questionnaire defined by your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_client_response_all_using_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageClientResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_client_response_all_using_get_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_client_response_all_using_get_with_http_info(**kwargs)  # noqa: E501
return data
def get_client_response_all_using_get_with_http_info(self, **kwargs): # noqa: E501
"""List all client responses # noqa: E501
Get all the client responses for questions as part of a questionnaire defined by your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_client_response_all_using_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageClientResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['ascending', 'filter', 'order_by', 'page', 'size'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_client_response_all_using_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'ascending' in params:
query_params.append(('ascending', params['ascending'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
if 'order_by' in params:
query_params.append(('order_by', params['order_by'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/client_response', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageClientResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_client_response_using_get(self, client_response_id, **kwargs): # noqa: E501
"""Retrieve a client response # noqa: E501
Retrieve the information for a client response for a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_client_response_using_get(client_response_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str client_response_id: UUID client_response_id (required)
:return: ClientResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_client_response_using_get_with_http_info(client_response_id, **kwargs) # noqa: E501
else:
data = self.get_client_response_using_get_with_http_info(client_response_id, **kwargs)  # noqa: E501
return data
def get_client_response_using_get_with_http_info(self, client_response_id, **kwargs): # noqa: E501
"""Retrieve a client response # noqa: E501
Retrieve the information for a client response for a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_client_response_using_get_with_http_info(client_response_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str client_response_id: UUID client_response_id (required)
:return: ClientResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['client_response_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_client_response_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'client_response_id' is set
if ('client_response_id' not in params or
params['client_response_id'] is None):
raise ValueError("Missing the required parameter `client_response_id` when calling `get_client_response_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'client_response_id' in params:
path_params['client_response_id'] = params['client_response_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/client_response/{client_response_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ClientResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_question_all_using_get(self, **kwargs): # noqa: E501
"""List all Questions # noqa: E501
Get information for all Questions # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_question_all_using_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageQuestion
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_question_all_using_get_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_question_all_using_get_with_http_info(**kwargs)  # noqa: E501
return data
def get_question_all_using_get_with_http_info(self, **kwargs): # noqa: E501
"""List all Questions # noqa: E501
Get information for all Questions # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_question_all_using_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageQuestion
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['ascending', 'filter', 'order_by', 'page', 'size'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_question_all_using_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'ascending' in params:
query_params.append(('ascending', params['ascending'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
if 'order_by' in params:
query_params.append(('order_by', params['order_by'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/question', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageQuestion', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_question_using_get(self, question_id, **kwargs): # noqa: E501
"""Retrieve a question # noqa: E501
Retrieve the information for a question in a questionnaire # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_question_using_get(question_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str question_id: UUID question_id (required)
:return: Question
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_question_using_get_with_http_info(question_id, **kwargs) # noqa: E501
else:
data = self.get_question_using_get_with_http_info(question_id, **kwargs)  # noqa: E501
return data
def get_question_using_get_with_http_info(self, question_id, **kwargs): # noqa: E501
"""Retrieve a question # noqa: E501
Retrieve the information for a question in a questionnaire # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_question_using_get_with_http_info(question_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str question_id: UUID question_id (required)
:return: Question
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['question_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_question_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'question_id' is set
if ('question_id' not in params or
params['question_id'] is None):
raise ValueError("Missing the required parameter `question_id` when calling `get_question_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'question_id' in params:
path_params['question_id'] = params['question_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/question/{question_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Question', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_questionnaire_all_using_get(self, **kwargs): # noqa: E501
"""List all questionnaires # noqa: E501
Get the information for all questionnaires defined for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_questionnaire_all_using_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageQuestionnaire
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_questionnaire_all_using_get_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_questionnaire_all_using_get_with_http_info(**kwargs)  # noqa: E501
return data
def get_questionnaire_all_using_get_with_http_info(self, **kwargs): # noqa: E501
"""List all questionnaires # noqa: E501
Get the information for all questionnaires defined for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_questionnaire_all_using_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str filter: filter
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageQuestionnaire
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['ascending', 'filter', 'order_by', 'page', 'size'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_questionnaire_all_using_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'ascending' in params:
query_params.append(('ascending', params['ascending'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
if 'order_by' in params:
query_params.append(('order_by', params['order_by'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/questionnaire', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageQuestionnaire', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_questionnaire_using_get(self, questionnaire_id, **kwargs): # noqa: E501
"""Retrieve a questionnaire # noqa: E501
Retrieve the information for a questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_questionnaire_using_get(questionnaire_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str questionnaire_id: UUID questionnaire_id (required)
:return: Questionnaire
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_questionnaire_using_get_with_http_info(questionnaire_id, **kwargs) # noqa: E501
else:
data = self.get_questionnaire_using_get_with_http_info(questionnaire_id, **kwargs)  # noqa: E501
return data
def get_questionnaire_using_get_with_http_info(self, questionnaire_id, **kwargs): # noqa: E501
"""Retrieve a questionnaire # noqa: E501
Retrieve the information for a questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_questionnaire_using_get_with_http_info(questionnaire_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str questionnaire_id: UUID questionnaire_id (required)
:return: Questionnaire
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['questionnaire_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_questionnaire_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'questionnaire_id' is set
if ('questionnaire_id' not in params or
params['questionnaire_id'] is None):
raise ValueError("Missing the required parameter `questionnaire_id` when calling `get_questionnaire_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'questionnaire_id' in params:
path_params['questionnaire_id'] = params['questionnaire_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/questionnaire/{questionnaire_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Questionnaire', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_answer_using_put(self, answer, answer_id, **kwargs): # noqa: E501
"""Update an answer # noqa: E501
Update an answer to a question. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_answer_using_put(answer, answer_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Answer answer: answer (required)
:param str answer_id: UUID answer_id (required)
:return: Answer
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_answer_using_put_with_http_info(answer, answer_id, **kwargs) # noqa: E501
else:
data = self.update_answer_using_put_with_http_info(answer, answer_id, **kwargs)  # noqa: E501
return data
def update_answer_using_put_with_http_info(self, answer, answer_id, **kwargs): # noqa: E501
"""Update an answer # noqa: E501
Update an answer to a question. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_answer_using_put_with_http_info(answer, answer_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Answer answer: answer (required)
:param str answer_id: UUID answer_id (required)
:return: Answer
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['answer', 'answer_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_answer_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'answer' is set
if ('answer' not in params or
params['answer'] is None):
raise ValueError("Missing the required parameter `answer` when calling `update_answer_using_put`") # noqa: E501
# verify the required parameter 'answer_id' is set
if ('answer_id' not in params or
params['answer_id'] is None):
raise ValueError("Missing the required parameter `answer_id` when calling `update_answer_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'answer_id' in params:
path_params['answer_id'] = params['answer_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'answer' in params:
body_params = params['answer']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/answer/{answer_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Answer', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_client_response_using_put(self, client_response, client_response_id, **kwargs): # noqa: E501
"""Update a client response # noqa: E501
Update a client response for a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_client_response_using_put(client_response, client_response_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param ClientResponse client_response: client_response (required)
:param str client_response_id: UUID client_response_id (required)
:return: ClientResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_client_response_using_put_with_http_info(client_response, client_response_id, **kwargs) # noqa: E501
else:
data = self.update_client_response_using_put_with_http_info(client_response, client_response_id, **kwargs) # noqa: E501
return data
def update_client_response_using_put_with_http_info(self, client_response, client_response_id, **kwargs): # noqa: E501
"""Update a client response # noqa: E501
Update a client response for a client. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_client_response_using_put_with_http_info(client_response, client_response_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param ClientResponse client_response: client_response (required)
:param str client_response_id: UUID client_response_id (required)
:return: ClientResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['client_response', 'client_response_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_client_response_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'client_response' is set
if ('client_response' not in params or
params['client_response'] is None):
raise ValueError("Missing the required parameter `client_response` when calling `update_client_response_using_put`") # noqa: E501
# verify the required parameter 'client_response_id' is set
if ('client_response_id' not in params or
params['client_response_id'] is None):
raise ValueError("Missing the required parameter `client_response_id` when calling `update_client_response_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'client_response_id' in params:
path_params['client_response_id'] = params['client_response_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'client_response' in params:
body_params = params['client_response']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/client_response/{client_response_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ClientResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_question_using_put(self, question, question_id, **kwargs): # noqa: E501
"""Update a question. # noqa: E501
Update a question for questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_question_using_put(question, question_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param Question question: question (required)
:param str question_id: UUID question_id (required)
:return: Question
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_question_using_put_with_http_info(question, question_id, **kwargs) # noqa: E501
else:
data = self.update_question_using_put_with_http_info(question, question_id, **kwargs) # noqa: E501
return data
def update_question_using_put_with_http_info(self, question, question_id, **kwargs): # noqa: E501
"""Update a question. # noqa: E501
Update a question for questionnaire. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_question_using_put_with_http_info(question, question_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param Question question: question (required)
:param str question_id: UUID question_id (required)
:return: Question
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['question', 'question_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_question_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'question' is set
if ('question' not in params or
params['question'] is None):
raise ValueError("Missing the required parameter `question` when calling `update_question_using_put`") # noqa: E501
# verify the required parameter 'question_id' is set
if ('question_id' not in params or
params['question_id'] is None):
raise ValueError("Missing the required parameter `question_id` when calling `update_question_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'question_id' in params:
path_params['question_id'] = params['question_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'question' in params:
body_params = params['question']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/question/{question_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Question', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_questionnaire_using_put(self, questionnaire, questionnaire_id, **kwargs): # noqa: E501
"""Update a questionnaire # noqa: E501
Update a questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_questionnaire_using_put(questionnaire, questionnaire_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param Questionnaire questionnaire: questionnaire (required)
:param str questionnaire_id: UUID questionnaire_id (required)
:return: Questionnaire
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_questionnaire_using_put_with_http_info(questionnaire, questionnaire_id, **kwargs) # noqa: E501
else:
data = self.update_questionnaire_using_put_with_http_info(questionnaire, questionnaire_id, **kwargs) # noqa: E501
return data
def update_questionnaire_using_put_with_http_info(self, questionnaire, questionnaire_id, **kwargs): # noqa: E501
"""Update a questionnaire # noqa: E501
Update a questionnaire for your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_questionnaire_using_put_with_http_info(questionnaire, questionnaire_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param Questionnaire questionnaire: questionnaire (required)
:param str questionnaire_id: UUID questionnaire_id (required)
:return: Questionnaire
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['questionnaire', 'questionnaire_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_questionnaire_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'questionnaire' is set
if ('questionnaire' not in params or
params['questionnaire'] is None):
raise ValueError("Missing the required parameter `questionnaire` when calling `update_questionnaire_using_put`") # noqa: E501
# verify the required parameter 'questionnaire_id' is set
if ('questionnaire_id' not in params or
params['questionnaire_id'] is None):
raise ValueError("Missing the required parameter `questionnaire_id` when calling `update_questionnaire_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'questionnaire_id' in params:
path_params['questionnaire_id'] = params['questionnaire_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'questionnaire' in params:
body_params = params['questionnaire']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/questionnaire/{questionnaire_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Questionnaire', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
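All of the public `update_*` methods above share the same dispatch shape: force `_return_http_data_only=True`, then either hand back the worker thread (when `async_req=True`) or return the unwrapped data from the `*_with_http_info` variant. A hypothetical stand-in class sketching that flow (no real HTTP is performed; names and return values are illustrative):

```python
class ApiSketch:
    """Hypothetical stand-in for a generated *Api class."""

    def update_question(self, question, question_id, **kwargs):
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            # The real client returns a thread whose .get() yields the
            # result; this sketch returns a tag instead.
            return ('thread', question_id)
        data = self.update_question_with_http_info(
            question, question_id, **kwargs)
        return data

    def update_question_with_http_info(self, question, question_id, **kwargs):
        # Stand-in for building and issuing the PUT request.
        return {'path': '/question/%s' % question_id, 'body': question}
```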
# File: ofagent/loxi/of14/oxm.py (repo: jonohart/voltha, license: Apache-2.0)
# Copyright (c) 2011, 2012 Open Networking Foundation
# Copyright (c) 2012, 2013 Big Switch Networks, Inc.
# See the file LICENSE.pyloxi which should have been included in the source distribution
# Automatically generated by LOXI from template module.py
# Do not modify
import struct
import loxi
import util
import loxi.generic_util
import sys
ofp = sys.modules['loxi.of14']
class oxm(loxi.OFObject):
subtypes = {}
def __init__(self, type_len=None):
if type_len != None:
self.type_len = type_len
else:
self.type_len = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
return ''.join(packed)
@staticmethod
def unpack(reader):
subtype, = reader.peek('!L', 0)
subclass = oxm.subtypes.get(subtype)
if subclass:
return subclass.unpack(reader)
obj = oxm()
obj.type_len = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.type_len != other.type_len: return False
return True
def pretty_print(self, q):
q.text("oxm {")
with q.group():
with q.indent(2):
q.breakable()
q.breakable()
q.text('}')
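`oxm.unpack` peeks at the leading 32-bit `type_len` and, when a subclass has registered itself in `oxm.subtypes`, delegates decoding to that subclass; otherwise it falls back to a generic `oxm`. The registry pattern can be sketched standalone (the decoder functions and payload formats here are illustrative):

```python
import struct

SUBTYPES = {}  # type_len -> decoder, mirroring oxm.subtypes


def register(type_len):
    def wrap(fn):
        SUBTYPES[type_len] = fn
        return fn
    return wrap


@register(2147494402)  # arp_op: a 16-bit value after the 4-byte header
def _decode_arp_op(buf):
    return ('arp_op', struct.unpack_from('!H', buf, 4)[0])


def unpack_oxm(buf):
    # Peek at the header without consuming it, as reader.peek does.
    type_len, = struct.unpack_from('!L', buf, 0)
    decoder = SUBTYPES.get(type_len)
    if decoder:
        return decoder(buf)
    return ('generic', type_len)
```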
class arp_op(oxm):
type_len = 2147494402
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_op()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147494402)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("arp_op {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147494402] = arp_op
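The magic `type_len` constants follow the standard OpenFlow OXM header layout: a 16-bit OXM class, 7-bit field, 1-bit has-mask flag, and 8-bit payload length. For example, `arp_op`'s 2147494402 is 0x80002A02: class 0x8000 (openflow basic), field 21, no mask, 2-byte value. A small decoder makes this checkable:

```python
def decode_oxm_header(type_len):
    # Standard OXM header: class(16) | field(7) | hasmask(1) | length(8).
    oxm_class = type_len >> 16
    field = (type_len >> 9) & 0x7F
    hasmask = (type_len >> 8) & 1
    length = type_len & 0xFF
    return oxm_class, field, hasmask, length
```

The masked variant's 2147494660 (0x80002B04) decodes to the same class and field with the has-mask bit set and a 4-byte payload (value plus mask).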
class arp_op_masked(oxm):
type_len = 2147494660
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_op_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147494660)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("arp_op_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147494660] = arp_op_masked
class arp_sha(oxm):
type_len = 2147495942
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_sha()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147495942)
obj.value = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("arp_sha {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.breakable()
q.text('}')
oxm.subtypes[2147495942] = arp_sha
class arp_sha_masked(oxm):
type_len = 2147496204
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
packed.append(struct.pack("!6B", *self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_sha_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147496204)
obj.value = list(reader.read('!6B'))
obj.value_mask = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("arp_sha_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_mac(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[2147496204] = arp_sha_masked
class arp_spa(oxm):
type_len = 2147494916
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_spa()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147494916)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("arp_spa {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147494916] = arp_spa
class arp_spa_masked(oxm):
type_len = 2147495176
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_spa_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147495176)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("arp_spa_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147495176] = arp_spa_masked
class arp_tha(oxm):
type_len = 2147496454
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_tha()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147496454)
obj.value = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("arp_tha {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.breakable()
q.text('}')
oxm.subtypes[2147496454] = arp_tha
class arp_tha_masked(oxm):
type_len = 2147496716
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
packed.append(struct.pack("!6B", *self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_tha_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147496716)
obj.value = list(reader.read('!6B'))
obj.value_mask = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("arp_tha_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_mac(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[2147496716] = arp_tha_masked
class arp_tpa(oxm):
type_len = 2147495428
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_tpa()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147495428)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("arp_tpa {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147495428] = arp_tpa
class arp_tpa_masked(oxm):
type_len = 2147495688
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = arp_tpa_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147495688)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("arp_tpa_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147495688] = arp_tpa_masked
class bsn_egr_port_group_id(oxm):
type_len = 200196
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_egr_port_group_id()
_type_len = reader.read("!L")[0]
assert(_type_len == 200196)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_egr_port_group_id {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[200196] = bsn_egr_port_group_id
class bsn_egr_port_group_id_masked(oxm):
type_len = 200456
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_egr_port_group_id_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 200456)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_egr_port_group_id_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[200456] = bsn_egr_port_group_id_masked
class bsn_in_ports_128(oxm):
type_len = 196624
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = set()
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(util.pack_bitmap_128(self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_in_ports_128()
_type_len = reader.read("!L")[0]
assert(_type_len == 196624)
obj.value = util.unpack_bitmap_128(reader)
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_in_ports_128 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.breakable()
q.text('}')
oxm.subtypes[196624] = bsn_in_ports_128
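Unlike the fixed-width fields above, `bsn_in_ports_128` stores a `set` of port numbers and relies on `util.pack_bitmap_128` / `util.unpack_bitmap_128` to serialize it as a 128-bit bitmap. A plausible sketch of such a helper pair (the bit and word ordering here is an assumption; the real `util` helpers may differ):

```python
import struct


def pack_bitmap_128(ports):
    # Set bit i for each port i, then emit two big-endian 64-bit
    # words, high word first.
    value = 0
    for port in ports:
        value |= 1 << port
    return struct.pack('!QQ', value >> 64, value & ((1 << 64) - 1))


def unpack_bitmap_128(data):
    hi, lo = struct.unpack('!QQ', data)
    value = (hi << 64) | lo
    return {i for i in range(128) if value & (1 << i)}
```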
class bsn_in_ports_128_masked(oxm):
type_len = 196896
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = set()
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = set()
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(util.pack_bitmap_128(self.value))
packed.append(util.pack_bitmap_128(self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_in_ports_128_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 196896)
obj.value = util.unpack_bitmap_128(reader)
obj.value_mask = util.unpack_bitmap_128(reader)
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_in_ports_128_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.pp(self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[196896] = bsn_in_ports_128_masked
class bsn_in_ports_512(oxm):
type_len = 206400
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = set()
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(util.pack_bitmap_512(self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_in_ports_512()
_type_len = reader.read("!L")[0]
assert(_type_len == 206400)
obj.value = util.unpack_bitmap_512(reader)
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_in_ports_512 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.breakable()
q.text('}')
oxm.subtypes[206400] = bsn_in_ports_512
class bsn_in_ports_512_masked(oxm):
type_len = 206720
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = set()
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = set()
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(util.pack_bitmap_512(self.value))
packed.append(util.pack_bitmap_512(self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_in_ports_512_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 206720)
obj.value = util.unpack_bitmap_512(reader)
obj.value_mask = util.unpack_bitmap_512(reader)
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_in_ports_512_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.pp(self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[206720] = bsn_in_ports_512_masked
class bsn_ingress_port_group_id(oxm):
type_len = 206852
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_ingress_port_group_id()
_type_len = reader.read("!L")[0]
assert(_type_len == 206852)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_ingress_port_group_id {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[206852] = bsn_ingress_port_group_id
class bsn_ingress_port_group_id_masked(oxm):
type_len = 207112
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_ingress_port_group_id_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 207112)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_ingress_port_group_id_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[207112] = bsn_ingress_port_group_id_masked
class bsn_inner_eth_dst(oxm):
type_len = 207878
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_inner_eth_dst()
_type_len = reader.read("!L")[0]
assert(_type_len == 207878)
obj.value = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_inner_eth_dst {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.breakable()
q.text('}')
oxm.subtypes[207878] = bsn_inner_eth_dst
class bsn_inner_eth_dst_masked(oxm):
type_len = 208140
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
packed.append(struct.pack("!6B", *self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_inner_eth_dst_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 208140)
obj.value = list(reader.read('!6B'))
obj.value_mask = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_inner_eth_dst_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_mac(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[208140] = bsn_inner_eth_dst_masked
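# The pack()/unpack() pairs above are plain struct serialization: a 4-byte
# big-endian header followed by the fixed-size payload. A standalone sketch
# of the same round trip for bsn_inner_eth_dst_masked, using struct directly
# (the reader object used by unpack() is not needed for the illustration):

```python
import struct

TYPE_LEN = 208140  # bsn_inner_eth_dst_masked header
value = [0x00, 0x11, 0x22, 0x33, 0x44, 0x55]
value_mask = [0xff, 0xff, 0xff, 0xff, 0xff, 0x00]

# Mirror pack(): header, then 6-byte value, then 6-byte mask.
buf = struct.pack("!L6B6B", TYPE_LEN, *(value + value_mask))

# Mirror unpack(): read the header back, check it, then the two fields.
header, = struct.unpack_from("!L", buf, 0)
assert header == TYPE_LEN
decoded_value = list(struct.unpack_from("!6B", buf, 4))
decoded_mask = list(struct.unpack_from("!6B", buf, 10))
assert decoded_value == value and decoded_mask == value_mask
```

# The 16-byte wire size matches the header's length field: 4 header bytes
# plus the 12-byte payload encoded in type_len above.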
class bsn_inner_eth_src(oxm):
type_len = 208390
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_inner_eth_src()
_type_len = reader.read("!L")[0]
assert(_type_len == 208390)
obj.value = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_inner_eth_src {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.breakable()
q.text('}')
oxm.subtypes[208390] = bsn_inner_eth_src
class bsn_inner_eth_src_masked(oxm):
type_len = 208652
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
packed.append(struct.pack("!6B", *self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_inner_eth_src_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 208652)
obj.value = list(reader.read('!6B'))
obj.value_mask = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_inner_eth_src_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_mac(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[208652] = bsn_inner_eth_src_masked
class bsn_inner_vlan_vid(oxm):
type_len = 208898
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_inner_vlan_vid()
_type_len = reader.read("!L")[0]
assert(_type_len == 208898)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_inner_vlan_vid {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[208898] = bsn_inner_vlan_vid
class bsn_inner_vlan_vid_masked(oxm):
type_len = 209156
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_inner_vlan_vid_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 209156)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_inner_vlan_vid_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[209156] = bsn_inner_vlan_vid_masked
class bsn_l2_cache_hit(oxm):
type_len = 205825
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_l2_cache_hit()
_type_len = reader.read("!L")[0]
assert(_type_len == 205825)
obj.value = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_l2_cache_hit {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[205825] = bsn_l2_cache_hit
class bsn_l2_cache_hit_masked(oxm):
type_len = 206082
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
packed.append(struct.pack("!B", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_l2_cache_hit_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 206082)
obj.value = reader.read("!B")[0]
obj.value_mask = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_l2_cache_hit_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[206082] = bsn_l2_cache_hit_masked
class bsn_l3_interface_class_id(oxm):
type_len = 198660
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_l3_interface_class_id()
_type_len = reader.read("!L")[0]
assert(_type_len == 198660)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_l3_interface_class_id {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[198660] = bsn_l3_interface_class_id
class bsn_l3_interface_class_id_masked(oxm):
type_len = 198920
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_l3_interface_class_id_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 198920)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_l3_interface_class_id_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[198920] = bsn_l3_interface_class_id_masked
class bsn_l3_src_class_id(oxm):
type_len = 199172
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_l3_src_class_id()
_type_len = reader.read("!L")[0]
assert(_type_len == 199172)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_l3_src_class_id {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[199172] = bsn_l3_src_class_id
class bsn_l3_src_class_id_masked(oxm):
type_len = 199432
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_l3_src_class_id_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 199432)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_l3_src_class_id_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[199432] = bsn_l3_src_class_id_masked
class bsn_lag_id(oxm):
type_len = 197124
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_lag_id()
_type_len = reader.read("!L")[0]
assert(_type_len == 197124)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_lag_id {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[197124] = bsn_lag_id
class bsn_lag_id_masked(oxm):
type_len = 197384
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_lag_id_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 197384)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_lag_id_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[197384] = bsn_lag_id_masked
class bsn_tcp_flags(oxm):
type_len = 204802
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_tcp_flags()
_type_len = reader.read("!L")[0]
assert(_type_len == 204802)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_tcp_flags {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[204802] = bsn_tcp_flags
class bsn_tcp_flags_masked(oxm):
type_len = 205060
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_tcp_flags_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 205060)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_tcp_flags_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[205060] = bsn_tcp_flags_masked
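# Semantically, a masked OXM such as bsn_tcp_flags_masked matches a packet
# when the packet's field agrees with `value` on every bit set in
# `value_mask`; bits outside the mask are wildcarded. A sketch of that rule
# (masked_match is a hypothetical helper for illustration; the match itself
# happens on the switch, not in this module):

```python
def masked_match(packet_field, value, value_mask):
    """True if packet_field matches value on all bits covered by value_mask."""
    return (packet_field & value_mask) == (value & value_mask)

SYN, ACK = 0x02, 0x10
# Match packets with SYN set and ACK clear, ignoring all other TCP flags:
assert masked_match(SYN, value=SYN, value_mask=SYN | ACK)
assert not masked_match(SYN | ACK, value=SYN, value_mask=SYN | ACK)
```

# This is why the plain classes carry only `value` while the *_masked
# variants serialize `value` and `value_mask` back to back.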
class bsn_udf0(oxm):
type_len = 200708
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf0()
_type_len = reader.read("!L")[0]
assert(_type_len == 200708)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf0 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[200708] = bsn_udf0
class bsn_udf0_masked(oxm):
type_len = 200968
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf0_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 200968)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf0_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[200968] = bsn_udf0_masked
class bsn_udf1(oxm):
type_len = 201220
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf1()
_type_len = reader.read("!L")[0]
assert(_type_len == 201220)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf1 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[201220] = bsn_udf1
class bsn_udf1_masked(oxm):
type_len = 201480
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf1_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 201480)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf1_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[201480] = bsn_udf1_masked
class bsn_udf2(oxm):
type_len = 201732
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf2()
_type_len = reader.read("!L")[0]
assert(_type_len == 201732)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf2 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[201732] = bsn_udf2
class bsn_udf2_masked(oxm):
type_len = 201992
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf2_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 201992)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf2_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[201992] = bsn_udf2_masked
class bsn_udf3(oxm):
type_len = 202244
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf3()
_type_len = reader.read("!L")[0]
assert(_type_len == 202244)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf3 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[202244] = bsn_udf3
class bsn_udf3_masked(oxm):
type_len = 202504
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf3_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 202504)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf3_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[202504] = bsn_udf3_masked
class bsn_udf4(oxm):
type_len = 202756
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf4()
_type_len = reader.read("!L")[0]
assert(_type_len == 202756)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf4 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[202756] = bsn_udf4
class bsn_udf4_masked(oxm):
type_len = 203016
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf4_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 203016)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf4_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[203016] = bsn_udf4_masked
class bsn_udf5(oxm):
type_len = 203268
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf5()
_type_len = reader.read("!L")[0]
assert(_type_len == 203268)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf5 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[203268] = bsn_udf5
class bsn_udf5_masked(oxm):
type_len = 203528
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf5_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 203528)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf5_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[203528] = bsn_udf5_masked
class bsn_udf6(oxm):
type_len = 203780
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf6()
_type_len = reader.read("!L")[0]
assert(_type_len == 203780)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf6 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[203780] = bsn_udf6
class bsn_udf6_masked(oxm):
type_len = 204040
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf6_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 204040)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf6_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[204040] = bsn_udf6_masked
class bsn_udf7(oxm):
type_len = 204292
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf7()
_type_len = reader.read("!L")[0]
assert(_type_len == 204292)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_udf7 {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[204292] = bsn_udf7
class bsn_udf7_masked(oxm):
type_len = 204552
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_udf7_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 204552)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("bsn_udf7_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[204552] = bsn_udf7_masked
class bsn_vfi(oxm):
type_len = 209410
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = bsn_vfi()
_type_len = reader.read("!L")[0]
assert(_type_len == 209410)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("bsn_vfi {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[209410] = bsn_vfi

class bsn_vfi_masked(oxm):
    type_len = 209668

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!H", self.value))
        packed.append(struct.pack("!H", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = bsn_vfi_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 209668)
        obj.value = reader.read("!H")[0]
        obj.value_mask = reader.read("!H")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("bsn_vfi_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[209668] = bsn_vfi_masked

class bsn_vlan_xlate_port_group_id(oxm):
    type_len = 205316

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = bsn_vlan_xlate_port_group_id()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 205316)
        obj.value = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("bsn_vlan_xlate_port_group_id {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[205316] = bsn_vlan_xlate_port_group_id

class bsn_vlan_xlate_port_group_id_masked(oxm):
    type_len = 205576

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        packed.append(struct.pack("!L", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = bsn_vlan_xlate_port_group_id_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 205576)
        obj.value = reader.read("!L")[0]
        obj.value_mask = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("bsn_vlan_xlate_port_group_id_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[205576] = bsn_vlan_xlate_port_group_id_masked

class bsn_vrf(oxm):
    type_len = 197636

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = bsn_vrf()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 197636)
        obj.value = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("bsn_vrf {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[197636] = bsn_vrf

class bsn_vrf_masked(oxm):
    type_len = 197896

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        packed.append(struct.pack("!L", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = bsn_vrf_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 197896)
        obj.value = reader.read("!L")[0]
        obj.value_mask = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("bsn_vrf_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[197896] = bsn_vrf_masked

class bsn_vxlan_network_id(oxm):
    type_len = 207364

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = bsn_vxlan_network_id()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 207364)
        obj.value = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("bsn_vxlan_network_id {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[207364] = bsn_vxlan_network_id

class bsn_vxlan_network_id_masked(oxm):
    type_len = 207624

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        packed.append(struct.pack("!L", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = bsn_vxlan_network_id_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 207624)
        obj.value = reader.read("!L")[0]
        obj.value_mask = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("bsn_vxlan_network_id_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[207624] = bsn_vxlan_network_id_masked

class eth_dst(oxm):
    type_len = 2147485190

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = [0,0,0,0,0,0]
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!6B", *self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = eth_dst()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147485190)
        obj.value = list(reader.read('!6B'))
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("eth_dst {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_mac(self.value))
            q.breakable()
        q.text('}')

oxm.subtypes[2147485190] = eth_dst

class eth_dst_masked(oxm):
    type_len = 2147485452

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = [0,0,0,0,0,0]
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = [0,0,0,0,0,0]
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!6B", *self.value))
        packed.append(struct.pack("!6B", *self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = eth_dst_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147485452)
        obj.value = list(reader.read('!6B'))
        obj.value_mask = list(reader.read('!6B'))
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("eth_dst_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_mac(self.value))
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text(util.pretty_mac(self.value_mask))
            q.breakable()
        q.text('}')

oxm.subtypes[2147485452] = eth_dst_masked

class eth_src(oxm):
    type_len = 2147485702

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = [0,0,0,0,0,0]
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!6B", *self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = eth_src()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147485702)
        obj.value = list(reader.read('!6B'))
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("eth_src {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_mac(self.value))
            q.breakable()
        q.text('}')

oxm.subtypes[2147485702] = eth_src

class eth_src_masked(oxm):
    type_len = 2147485964

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = [0,0,0,0,0,0]
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = [0,0,0,0,0,0]
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!6B", *self.value))
        packed.append(struct.pack("!6B", *self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = eth_src_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147485964)
        obj.value = list(reader.read('!6B'))
        obj.value_mask = list(reader.read('!6B'))
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("eth_src_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_mac(self.value))
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text(util.pretty_mac(self.value_mask))
            q.breakable()
        q.text('}')

oxm.subtypes[2147485964] = eth_src_masked

class eth_type(oxm):
    type_len = 2147486210

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!H", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = eth_type()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147486210)
        obj.value = reader.read("!H")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("eth_type {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147486210] = eth_type

class eth_type_masked(oxm):
    type_len = 2147486468

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!H", self.value))
        packed.append(struct.pack("!H", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = eth_type_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147486468)
        obj.value = reader.read("!H")[0]
        obj.value_mask = reader.read("!H")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("eth_type_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147486468] = eth_type_masked

class icmpv4_code(oxm):
    type_len = 2147493889

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv4_code()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147493889)
        obj.value = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv4_code {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147493889] = icmpv4_code

class icmpv4_code_masked(oxm):
    type_len = 2147494146

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        packed.append(struct.pack("!B", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv4_code_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147494146)
        obj.value = reader.read("!B")[0]
        obj.value_mask = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv4_code_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147494146] = icmpv4_code_masked

class icmpv4_type(oxm):
    type_len = 2147493377

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv4_type()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147493377)
        obj.value = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv4_type {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147493377] = icmpv4_type

class icmpv4_type_masked(oxm):
    type_len = 2147493634

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        packed.append(struct.pack("!B", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv4_type_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147493634)
        obj.value = reader.read("!B")[0]
        obj.value_mask = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv4_type_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147493634] = icmpv4_type_masked

class icmpv6_code(oxm):
    type_len = 2147499009

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv6_code()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147499009)
        obj.value = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv6_code {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147499009] = icmpv6_code

class icmpv6_code_masked(oxm):
    type_len = 2147499266

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        packed.append(struct.pack("!B", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv6_code_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147499266)
        obj.value = reader.read("!B")[0]
        obj.value_mask = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv6_code_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147499266] = icmpv6_code_masked

class icmpv6_type(oxm):
    type_len = 2147498497

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv6_type()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147498497)
        obj.value = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv6_type {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147498497] = icmpv6_type

class icmpv6_type_masked(oxm):
    type_len = 2147498754

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        packed.append(struct.pack("!B", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = icmpv6_type_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147498754)
        obj.value = reader.read("!B")[0]
        obj.value_mask = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("icmpv6_type_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147498754] = icmpv6_type_masked

class in_phy_port(oxm):
    type_len = 2147484164

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(util.pack_port_no(self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = in_phy_port()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147484164)
        obj.value = util.unpack_port_no(reader)
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("in_phy_port {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_port(self.value))
            q.breakable()
        q.text('}')

oxm.subtypes[2147484164] = in_phy_port

class in_phy_port_masked(oxm):
    type_len = 2147484424

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(util.pack_port_no(self.value))
        packed.append(util.pack_port_no(self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = in_phy_port_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147484424)
        obj.value = util.unpack_port_no(reader)
        obj.value_mask = util.unpack_port_no(reader)
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("in_phy_port_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_port(self.value))
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text(util.pretty_port(self.value_mask))
            q.breakable()
        q.text('}')

oxm.subtypes[2147484424] = in_phy_port_masked

class in_port(oxm):
    type_len = 2147483652

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(util.pack_port_no(self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = in_port()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147483652)
        obj.value = util.unpack_port_no(reader)
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("in_port {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_port(self.value))
            q.breakable()
        q.text('}')

oxm.subtypes[2147483652] = in_port

class in_port_masked(oxm):
    type_len = 2147483912

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(util.pack_port_no(self.value))
        packed.append(util.pack_port_no(self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = in_port_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147483912)
        obj.value = util.unpack_port_no(reader)
        obj.value_mask = util.unpack_port_no(reader)
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("in_port_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_port(self.value))
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text(util.pretty_port(self.value_mask))
            q.breakable()
        q.text('}')

oxm.subtypes[2147483912] = in_port_masked

class ip_dscp(oxm):
    type_len = 2147487745

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ip_dscp()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147487745)
        obj.value = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("ip_dscp {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147487745] = ip_dscp

class ip_dscp_masked(oxm):
    type_len = 2147488002

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        packed.append(struct.pack("!B", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ip_dscp_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147488002)
        obj.value = reader.read("!B")[0]
        obj.value_mask = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("ip_dscp_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147488002] = ip_dscp_masked

class ip_ecn(oxm):
    type_len = 2147488257

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ip_ecn()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147488257)
        obj.value = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("ip_ecn {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147488257] = ip_ecn

class ip_ecn_masked(oxm):
    type_len = 2147488514

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        packed.append(struct.pack("!B", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ip_ecn_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147488514)
        obj.value = reader.read("!B")[0]
        obj.value_mask = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("ip_ecn_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147488514] = ip_ecn_masked

class ip_proto(oxm):
    type_len = 2147488769

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ip_proto()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147488769)
        obj.value = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("ip_proto {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147488769] = ip_proto

class ip_proto_masked(oxm):
    type_len = 2147489026

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!B", self.value))
        packed.append(struct.pack("!B", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ip_proto_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147489026)
        obj.value = reader.read("!B")[0]
        obj.value_mask = reader.read("!B")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("ip_proto_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text("%#x" % self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text("%#x" % self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147489026] = ip_proto_masked

class ipv4_dst(oxm):
    type_len = 2147489796

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ipv4_dst()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147489796)
        obj.value = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("ipv4_dst {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_ipv4(self.value))
            q.breakable()
        q.text('}')

oxm.subtypes[2147489796] = ipv4_dst

class ipv4_dst_masked(oxm):
    type_len = 2147490056

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        packed.append(struct.pack("!L", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ipv4_dst_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147490056)
        obj.value = reader.read("!L")[0]
        obj.value_mask = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("ipv4_dst_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_ipv4(self.value))
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text(util.pretty_ipv4(self.value_mask))
            q.breakable()
        q.text('}')

oxm.subtypes[2147490056] = ipv4_dst_masked

class ipv4_src(oxm):
    type_len = 2147489284

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ipv4_src()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147489284)
        obj.value = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("ipv4_src {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_ipv4(self.value))
            q.breakable()
        q.text('}')

oxm.subtypes[2147489284] = ipv4_src

class ipv4_src_masked(oxm):
    type_len = 2147489544

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = 0
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = 0
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!L", self.value))
        packed.append(struct.pack("!L", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ipv4_src_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147489544)
        obj.value = reader.read("!L")[0]
        obj.value_mask = reader.read("!L")[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("ipv4_src_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.text(util.pretty_ipv4(self.value))
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.text(util.pretty_ipv4(self.value_mask))
            q.breakable()
        q.text('}')

oxm.subtypes[2147489544] = ipv4_src_masked

class ipv6_dst(oxm):
    type_len = 2147497488

    def __init__(self, value=None):
        if value != None:
            self.value = value
        else:
            self.value = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!16s", self.value))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ipv6_dst()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147497488)
        obj.value = reader.read('!16s')[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        return True

    def pretty_print(self, q):
        q.text("ipv6_dst {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.pp(self.value)
            q.breakable()
        q.text('}')

oxm.subtypes[2147497488] = ipv6_dst

class ipv6_dst_masked(oxm):
    type_len = 2147497760

    def __init__(self, value=None, value_mask=None):
        if value != None:
            self.value = value
        else:
            self.value = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        if value_mask != None:
            self.value_mask = value_mask
        else:
            self.value_mask = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
        return

    def pack(self):
        packed = []
        packed.append(struct.pack("!L", self.type_len))
        packed.append(struct.pack("!16s", self.value))
        packed.append(struct.pack("!16s", self.value_mask))
        return ''.join(packed)

    @staticmethod
    def unpack(reader):
        obj = ipv6_dst_masked()
        _type_len = reader.read("!L")[0]
        assert(_type_len == 2147497760)
        obj.value = reader.read('!16s')[0]
        obj.value_mask = reader.read('!16s')[0]
        return obj

    def __eq__(self, other):
        if type(self) != type(other): return False
        if self.value != other.value: return False
        if self.value_mask != other.value_mask: return False
        return True

    def pretty_print(self, q):
        q.text("ipv6_dst_masked {")
        with q.group():
            with q.indent(2):
                q.breakable()
                q.text("value = ");
                q.pp(self.value)
                q.text(","); q.breakable()
                q.text("value_mask = ");
                q.pp(self.value_mask)
            q.breakable()
        q.text('}')

oxm.subtypes[2147497760] = ipv6_dst_masked

class ipv6_exthdr(oxm):
    type_len = 2147503618
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_exthdr()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147503618)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("ipv6_exthdr {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147503618] = ipv6_exthdr
class ipv6_exthdr_masked(oxm):
type_len = 2147503876
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_exthdr_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147503876)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("ipv6_exthdr_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147503876] = ipv6_exthdr_masked
class ipv6_flabel(oxm):
type_len = 2147497988
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_flabel()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147497988)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("ipv6_flabel {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147497988] = ipv6_flabel
class ipv6_flabel_masked(oxm):
type_len = 2147498248
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_flabel_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147498248)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("ipv6_flabel_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147498248] = ipv6_flabel_masked
class ipv6_nd_sll(oxm):
type_len = 2147500038
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_nd_sll()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147500038)
obj.value = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("ipv6_nd_sll {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.breakable()
q.text('}')
oxm.subtypes[2147500038] = ipv6_nd_sll
class ipv6_nd_sll_masked(oxm):
type_len = 2147500300
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
packed.append(struct.pack("!6B", *self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_nd_sll_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147500300)
obj.value = list(reader.read('!6B'))
obj.value_mask = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("ipv6_nd_sll_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_mac(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[2147500300] = ipv6_nd_sll_masked
class ipv6_nd_target(oxm):
type_len = 2147499536
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!16s", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_nd_target()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147499536)
obj.value = reader.read('!16s')[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("ipv6_nd_target {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147499536] = ipv6_nd_target
class ipv6_nd_target_masked(oxm):
type_len = 2147499808
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!16s", self.value))
packed.append(struct.pack("!16s", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_nd_target_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147499808)
obj.value = reader.read('!16s')[0]
obj.value_mask = reader.read('!16s')[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("ipv6_nd_target_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.pp(self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147499808] = ipv6_nd_target_masked
class ipv6_nd_tll(oxm):
type_len = 2147500550
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_nd_tll()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147500550)
obj.value = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("ipv6_nd_tll {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.breakable()
q.text('}')
oxm.subtypes[2147500550] = ipv6_nd_tll
class ipv6_nd_tll_masked(oxm):
type_len = 2147500812
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = [0,0,0,0,0,0]
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = [0,0,0,0,0,0]
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!6B", *self.value))
packed.append(struct.pack("!6B", *self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_nd_tll_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147500812)
obj.value = list(reader.read('!6B'))
obj.value_mask = list(reader.read('!6B'))
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("ipv6_nd_tll_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_mac(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_mac(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[2147500812] = ipv6_nd_tll_masked
class ipv6_src(oxm):
type_len = 2147496976
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!16s", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_src()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147496976)
obj.value = reader.read('!16s')[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("ipv6_src {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147496976] = ipv6_src
class ipv6_src_masked(oxm):
type_len = 2147497248
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!16s", self.value))
packed.append(struct.pack("!16s", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = ipv6_src_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147497248)
obj.value = reader.read('!16s')[0]
obj.value_mask = reader.read('!16s')[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("ipv6_src_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.pp(self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.pp(self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147497248] = ipv6_src_masked
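# Note (illustrative): the IPv6-valued fields above (ipv6_src, ipv6_dst,
# ipv6_nd_target, and their masked variants) expect value/value_mask as
# 16-byte packed strings, e.g. the result of
# socket.inet_pton(socket.AF_INET6, "2001:db8::1"), not integers or
# textual addresses; struct.pack("!16s", ...) serializes them verbatim.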
class metadata(oxm):
type_len = 2147484680
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!Q", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = metadata()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147484680)
obj.value = reader.read("!Q")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("metadata {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147484680] = metadata
class metadata_masked(oxm):
type_len = 2147484944
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!Q", self.value))
packed.append(struct.pack("!Q", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = metadata_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147484944)
obj.value = reader.read("!Q")[0]
obj.value_mask = reader.read("!Q")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("metadata_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147484944] = metadata_masked
class mpls_bos(oxm):
type_len = 2147502081
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = mpls_bos()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147502081)
obj.value = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("mpls_bos {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147502081] = mpls_bos
class mpls_bos_masked(oxm):
type_len = 2147502338
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
packed.append(struct.pack("!B", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = mpls_bos_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147502338)
obj.value = reader.read("!B")[0]
obj.value_mask = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("mpls_bos_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147502338] = mpls_bos_masked
class mpls_label(oxm):
type_len = 2147501060
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = mpls_label()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147501060)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("mpls_label {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147501060] = mpls_label
class mpls_label_masked(oxm):
type_len = 2147501320
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = mpls_label_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147501320)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("mpls_label_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147501320] = mpls_label_masked
class mpls_tc(oxm):
type_len = 2147501569
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = mpls_tc()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147501569)
obj.value = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("mpls_tc {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147501569] = mpls_tc
class mpls_tc_masked(oxm):
type_len = 2147501826
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
packed.append(struct.pack("!B", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = mpls_tc_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147501826)
obj.value = reader.read("!B")[0]
obj.value_mask = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("mpls_tc_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147501826] = mpls_tc_masked
class pbb_uca(oxm):
type_len = 2147504641
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = pbb_uca()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147504641)
obj.value = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("pbb_uca {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147504641] = pbb_uca
class pbb_uca_masked(oxm):
type_len = 2147504898
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
packed.append(struct.pack("!B", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = pbb_uca_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147504898)
obj.value = reader.read("!B")[0]
obj.value_mask = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("pbb_uca_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147504898] = pbb_uca_masked
class sctp_dst(oxm):
type_len = 2147492866
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = sctp_dst()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147492866)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("sctp_dst {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147492866] = sctp_dst
class sctp_dst_masked(oxm):
type_len = 2147493124
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = sctp_dst_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147493124)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("sctp_dst_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147493124] = sctp_dst_masked
class sctp_src(oxm):
type_len = 2147492354
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = sctp_src()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147492354)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("sctp_src {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147492354] = sctp_src
class sctp_src_masked(oxm):
type_len = 2147492612
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = sctp_src_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147492612)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("sctp_src_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147492612] = sctp_src_masked
class tcp_dst(oxm):
type_len = 2147490818
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tcp_dst()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147490818)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("tcp_dst {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147490818] = tcp_dst
class tcp_dst_masked(oxm):
type_len = 2147491076
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tcp_dst_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147491076)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("tcp_dst_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147491076] = tcp_dst_masked
class tcp_src(oxm):
type_len = 2147490306
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tcp_src()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147490306)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("tcp_src {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147490306] = tcp_src
class tcp_src_masked(oxm):
type_len = 2147490564
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tcp_src_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147490564)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("tcp_src_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147490564] = tcp_src_masked
class tunnel_id(oxm):
type_len = 2147503112
def __init__(self, value=None):
if value != None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!Q", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tunnel_id()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147503112)
obj.value = reader.read("!Q")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("tunnel_id {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147503112] = tunnel_id
class tunnel_id_masked(oxm):
type_len = 2147503376
def __init__(self, value=None, value_mask=None):
if value != None:
self.value = value
else:
self.value = 0
if value_mask != None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!Q", self.value))
packed.append(struct.pack("!Q", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tunnel_id_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147503376)
obj.value = reader.read("!Q")[0]
obj.value_mask = reader.read("!Q")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("tunnel_id_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147503376] = tunnel_id_masked
class tunnel_ipv4_dst(oxm):
type_len = 81924
def __init__(self, value=None):
if value is not None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tunnel_ipv4_dst()
_type_len = reader.read("!L")[0]
assert(_type_len == 81924)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("tunnel_ipv4_dst {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_ipv4(self.value))
q.breakable()
q.text('}')
oxm.subtypes[81924] = tunnel_ipv4_dst
class tunnel_ipv4_dst_masked(oxm):
type_len = 82184
def __init__(self, value=None, value_mask=None):
if value is not None:
self.value = value
else:
self.value = 0
if value_mask is not None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tunnel_ipv4_dst_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 82184)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("tunnel_ipv4_dst_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_ipv4(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_ipv4(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[82184] = tunnel_ipv4_dst_masked
class tunnel_ipv4_src(oxm):
type_len = 81412
def __init__(self, value=None):
if value is not None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tunnel_ipv4_src()
_type_len = reader.read("!L")[0]
assert(_type_len == 81412)
obj.value = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("tunnel_ipv4_src {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_ipv4(self.value))
q.breakable()
q.text('}')
oxm.subtypes[81412] = tunnel_ipv4_src
class tunnel_ipv4_src_masked(oxm):
type_len = 81672
def __init__(self, value=None, value_mask=None):
if value is not None:
self.value = value
else:
self.value = 0
if value_mask is not None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!L", self.value))
packed.append(struct.pack("!L", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = tunnel_ipv4_src_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 81672)
obj.value = reader.read("!L")[0]
obj.value_mask = reader.read("!L")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("tunnel_ipv4_src_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text(util.pretty_ipv4(self.value))
q.text(","); q.breakable()
q.text("value_mask = ");
q.text(util.pretty_ipv4(self.value_mask))
q.breakable()
q.text('}')
oxm.subtypes[81672] = tunnel_ipv4_src_masked
class udp_dst(oxm):
type_len = 2147491842
def __init__(self, value=None):
if value is not None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = udp_dst()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147491842)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("udp_dst {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147491842] = udp_dst
class udp_dst_masked(oxm):
type_len = 2147492100
def __init__(self, value=None, value_mask=None):
if value is not None:
self.value = value
else:
self.value = 0
if value_mask is not None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = udp_dst_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147492100)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("udp_dst_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147492100] = udp_dst_masked
class udp_src(oxm):
type_len = 2147491330
def __init__(self, value=None):
if value is not None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = udp_src()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147491330)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("udp_src {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147491330] = udp_src
class udp_src_masked(oxm):
type_len = 2147491588
def __init__(self, value=None, value_mask=None):
if value is not None:
self.value = value
else:
self.value = 0
if value_mask is not None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = udp_src_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147491588)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("udp_src_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147491588] = udp_src_masked
class vlan_pcp(oxm):
type_len = 2147487233
def __init__(self, value=None):
if value is not None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = vlan_pcp()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147487233)
obj.value = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("vlan_pcp {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147487233] = vlan_pcp
class vlan_pcp_masked(oxm):
type_len = 2147487490
def __init__(self, value=None, value_mask=None):
if value is not None:
self.value = value
else:
self.value = 0
if value_mask is not None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!B", self.value))
packed.append(struct.pack("!B", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = vlan_pcp_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147487490)
obj.value = reader.read("!B")[0]
obj.value_mask = reader.read("!B")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("vlan_pcp_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147487490] = vlan_pcp_masked
class vlan_vid(oxm):
type_len = 2147486722
def __init__(self, value=None):
if value is not None:
self.value = value
else:
self.value = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = vlan_vid()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147486722)
obj.value = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
return True
def pretty_print(self, q):
q.text("vlan_vid {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.breakable()
q.text('}')
oxm.subtypes[2147486722] = vlan_vid
class vlan_vid_masked(oxm):
type_len = 2147486980
def __init__(self, value=None, value_mask=None):
if value is not None:
self.value = value
else:
self.value = 0
if value_mask is not None:
self.value_mask = value_mask
else:
self.value_mask = 0
return
def pack(self):
packed = []
packed.append(struct.pack("!L", self.type_len))
packed.append(struct.pack("!H", self.value))
packed.append(struct.pack("!H", self.value_mask))
return ''.join(packed)
@staticmethod
def unpack(reader):
obj = vlan_vid_masked()
_type_len = reader.read("!L")[0]
assert(_type_len == 2147486980)
obj.value = reader.read("!H")[0]
obj.value_mask = reader.read("!H")[0]
return obj
def __eq__(self, other):
if type(self) != type(other): return False
if self.value != other.value: return False
if self.value_mask != other.value_mask: return False
return True
def pretty_print(self, q):
q.text("vlan_vid_masked {")
with q.group():
with q.indent(2):
q.breakable()
q.text("value = ");
q.text("%#x" % self.value)
q.text(","); q.breakable()
q.text("value_mask = ");
q.text("%#x" % self.value_mask)
q.breakable()
q.text('}')
oxm.subtypes[2147486980] = vlan_vid_masked
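Every class in this family round-trips through the same network-order `struct` formats. A minimal standalone sketch of the `vlan_vid_masked` wire layout, added for illustration (plain `struct` only, no `reader` object):

```python
import struct

# pack: 4-byte type_len header, then 2-byte value and 2-byte mask (network order)
buf = struct.pack("!LHH", 2147486980, 0x0abc, 0x0fff)
assert len(buf) == 8

# unpacking recovers the same triple; the real unpack() also asserts on type_len
type_len, value, value_mask = struct.unpack("!LHH", buf)
assert (type_len, value, value_mask) == (2147486980, 0x0abc, 0x0fff)
```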
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# =============================================================================
## @file ostap/fitting/models_2d.py
# Smooth non-factorizable 2D-models to describe background distributions
# @author Vanya BELYAEV Ivan.Belyaeve@itep.ru
# @date 2011-07-25
# =============================================================================
""" Set of useful non-factorizable 2D-models to describe background distributions
"""
# =============================================================================
__version__ = "$Revision:"
__author__ = "Vanya BELYAEV Ivan.Belyaev@itep.ru"
__date__ = "2011-07-25"
__all__ = (
'PolyPos2D_pdf' , ## A positive polynomial in 2D
'PolyPos2Dsym_pdf', ## A positive symmetric polynomial in 2D
'PSPol2D_pdf' , ## Product of phase spaces, modulated with 2D polynomial
'PSPol2D2_pdf' , ## Product of phase spaces, modulated with 2D polynomial
'PSPol2D3_pdf' , ## Product of phase spaces, modulated with 2D polynomial
'PSPol2Dsym_pdf' , ## Symmetric product of phase spaces, modulated with 2D polynomial
'PSPol2D2sym_pdf' , ## Symmetric product of phase spaces, modulated with 2D polynomial
'PSPol2D3sym_pdf' , ## Symmetric product of phase spaces, modulated with 2D polynomial
'ExpoPSPol2D_pdf' , ## Exponential times phase space times positive 2D-polynomial
'ExpoPol2D_pdf' , ## Product of exponents times positive 2D-polynomial
'ExpoPol2Dsym_pdf', ## Symmetric version of above
##
'Spline2D_pdf' , ## 2D generic positive spline
'Spline2Dsym_pdf' , ## 2D symmetric positive spline
#
'make_B2D' , ## create 2D "background" function
'make_B2Dsym' , ## create symmetric 2D "background" function
)
# =============================================================================
import ROOT, math
from ostap.core.core import cpp, Ostap
from ostap.math.base import iszero
from ostap.fitting.utils import Phases
from ostap.fitting.fit2d import PDF2, Flat2D
from ostap.fitting.signals import Gauss_pdf, CB2_pdf
# =============================================================================
from ostap.logger.logger import getLogger
if '__main__' == __name__ : logger = getLogger ( 'ostap.fitting.models_2d' )
else : logger = getLogger ( __name__ )
# =============================================================================
models = []
# =============================================================================
## @class PolyBase2
# helper base class to implement various polynomial-like shapes
class PolyBase2(PDF2,Phases) :
"""Helper base class to implement various polynomial-like shapes
"""
def __init__ ( self , name , xvar , yvar , power , the_phis = None ) :
PDF2 .__init__ ( self , name , xvar , yvar )
Phases.__init__ ( self , power , the_phis )
# =============================================================================
## @class PolyPos2D_pdf
# positive polynomial in 2D:
# \f[ P(x,y) =
# \sum^{n}_{i=0} \sum^{k}_{j=0} a_{ij} B^n_i(x) B^k_j(y)
# \f]
# where
# - \f$ B^n_i(x)\f$ denotes the basic Bernstein polynomial
# - \f$ a_{ij} \ge 0 \f$
# - \f$ \sum_{i,j} a_{ij} = 1 \f$
# @see Ostap::Models::Poly2DPositive
# @see Ostap::Math::Poly2DPositive
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PolyPos2D_pdf(PolyBase2) :
r"""Positive (non-factorizable!) polynomial in 2D:
P_{n,k}(x,y) = \sum^{n}_{i=0}\sum^{k}_{j=0} a_{ij} B^n_i(x) B^k_j(y)
where:
- B^n_i - are Bernstein polynomials
- a_{ij} >= 0
- \sum_{ij} a_{ij} = 1
Note:
- f(x,y)>=0 for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
nx = 2 , ## polynomial degree in X
ny = 2 , ## polynomial degree in Y
the_phis = None ) :
## check arguments
assert isinstance ( nx , int ) and 0 <= nx < 100 , "``nx''-parameter is illegal: %s" % nx
assert isinstance ( ny , int ) and 0 <= ny < 100 , "``ny''-parameter is illegal: %s" % ny
##
PolyBase2.__init__ ( self , name , xvar , yvar , ( nx + 1 ) * ( ny + 1 ) - 1 , the_phis )
self.__nx = nx
self.__ny = ny
#
## finally build PDF
#
self.pdf = Ostap.Models.Poly2DPositive (
'p2Dp_%s' % name ,
'Poly2DPositive(%s)' % name ,
self.xvar ,
self.yvar ,
self.nx ,
self.ny ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'nx' : self.nx ,
'ny' : self.ny ,
'the_phis' : self.phis ,
}
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__nx
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__ny
models.append ( PolyPos2D_pdf )
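The expansion in the comment above is a convex combination of Bernstein basis products. A toy pure-Python evaluator (ours, not the `Ostap.Math.Poly2DPositive` implementation; `math.comb` needs Python 3.8+):

```python
from math import comb

def bernstein(n, i, x):
    """Basic Bernstein polynomial B^n_i on [0, 1]."""
    return comb(n, i) * x**i * (1.0 - x)**(n - i)

def poly2d(a, x, y):
    """P(x,y) = sum_ij a[i][j] B^nx_i(x) B^ny_j(y), with a[i][j] >= 0, sum a = 1."""
    nx, ny = len(a) - 1, len(a[0]) - 1
    return sum(a[i][j] * bernstein(nx, i, x) * bernstein(ny, j, y)
               for i in range(nx + 1) for j in range(ny + 1))

# partition of unity: uniform weights give a constant (flat) density
a = [[0.25, 0.25], [0.25, 0.25]]
assert abs(poly2d(a, 0.3, 0.7) - 0.25) < 1e-12
```

Non-negative coefficients guarantee f(x,y) >= 0 over the whole 2D range, which is the point of the positive parameterization.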
# =============================================================================
## @class PolyPos2Dsym_pdf
# Positive symmetric polynomial in 2D. Symmetrized version of PolyPos2D_pdf
# \f[ P_{n}(x,y) =
# \sum^{n}_{i=0} \sum^{n}_{j=0} a_{ij} B^n_i(x) B^n_j(y)
# \f]
# where
# - \f$ B^n_i(x)\f$ denotes the basic Bernstein polynomial
# - \f$ a_{ij} \ge 0 \f$
# - \f$ a_{ji}=a_{i,j} \f$
# - \f$ \sum_{i,j} a_{ij} = 1 \f$
# @see Ostap::Models::Poly2DSymPositive
# @see Ostap::Math::Poly2DSymPositive
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PolyPos2Dsym_pdf(PolyBase2) :
r"""Positive (non-factorizable!) SYMMETRIC polynomial in 2D:
P_{n}(x,y) = \sum^{n}_{i=0} \sum^{n}_{j=0} a_{ij} B^n_i(x) B^n_j(y)
where:
- B^n_i - are Bernstein polynomials
- a_{ij} = a_{ji}
- a_{ij} \ge 0
- \sum_{i,j} a_{ij} = 1
Note:
- f(x,y)>=0 for whole 2D-range
- f(x,y) = f(y,x)
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
n = 2 , ## polynomial degree
the_phis = None ) :
## check arguments
assert isinstance ( n , int ) and 0 <= n < 100 , "``n''-parameter is illegal: %s" % n
##
self.__n = n
PolyBase2.__init__ ( self , name , xvar , yvar , ( n + 1 ) * ( n + 2 ) // 2 , the_phis )
if self.xminmax() != self.yminmax() :
logger.warning( 'PolyPos2Dsym: x&y have different edges %s vs %s' % ( self.xminmax() , self.yminmax() ) )
## finally build PDF
self.pdf = Ostap.Models.Poly2DSymPositive (
'p2Dsp_%s' % name ,
'Poly2DSymPositive(%s)' % name ,
self.x ,
self.y ,
self.n ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'n' : self.n ,
'the_phis' : self.phis ,
}
@property
def n ( self ) :
"""``n''-parameter - order/degree of 2D-polynom in x&y-directions"""
return self.__n
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__n
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__n
models.append ( PolyPos2Dsym_pdf )
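The phase count passed to `PolyBase2.__init__` above is the triangular number of independent coefficients a_{ij} = a_{ji} on an (n+1) x (n+1) grid; a quick cross-check (the helper name is ours, for illustration only):

```python
def n_sym_coeffs(n):
    """Independent a_{ij} with a_{ij} = a_{ji} on an (n+1) x (n+1) grid."""
    return (n + 1) * (n + 2) // 2

# brute-force cross-check against explicit enumeration of i <= j pairs
assert n_sym_coeffs(2) == len([(i, j) for i in range(3) for j in range(3) if i <= j])
```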
# =============================================================================
## @class PSPol2D_pdf
# The 2D-function that represents a cross-product of two phase-space factors,
#
# The function is:
# \f[ f(x,y) =
# \Phi_{k,l}(x;x_{low}, x_{high})
# \Phi_{m,n}(y;y_{low}, y_{high})
# P_{N,M}(x,y) \f]
# where
# - \f$ \Phi_{k,l}(x;x_{low},x_{high}) \f$ is a phase-space function for x-axis
# - \f$ \Phi_{m,n}(y;y_{low},y_{high}) \f$ is a phase-space function for y-axis
# - \f$ P_{N,M}(x,y) \f$ is 2D positive Bernstein polynomial
#
# @see Ostap::Models::PS2DPol
# @see Ostap::Math::PS2DPol
# @see Ostap::Math::PhaseSpaceNL
# @see Ostap::Math::Positive2D
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PSPol2D_pdf(PolyBase2) :
r"""Product of phase space factors, modulated by the positive polynomial in 2D
f (x,y) =
= \Phi_{k,l}(x;x_{low}, x_{high})
* \Phi_{m,n}(y;y_{low}, y_{high})
* P_{N,M}(x,y)
where:
- \Phi_{k,l}(x;x_{low},x_{high}) is a phase-space function for x-axis
- \Phi_{m,n}(y;y_{low},y_{high}) is a phase-space function for y-axis
- P_{N,M}(x,y) is 2D positive Bernstein polynomial
Note:
- f(x,y)>=0 for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
psx , ## phase space in X, Ostap::Math::PhaseSpaceNL
psy , ## phase space in Y, Ostap::Math::PhaseSpaceNL
nx = 2 , ## polynomial degree in X
ny = 2 , ## polynomial degree in Y
the_phis = None ) :
## check arguments
assert isinstance ( nx , int ) and 0 <= nx < 100 , "``nx''-parameter is illegal: %s" % nx
assert isinstance ( ny , int ) and 0 <= ny < 100 , "``ny''-parameter is illegal: %s" % ny
## the base
PolyBase2.__init__ ( self , name , xvar , yvar ,
( nx + 1 ) * ( ny + 1 ) - 1 , the_phis )
self.__phasespacex = psx
self.__phasespacey = psy
self.__nx = nx
self.__ny = ny
#
## finally build PDF
#
self.pdf = Ostap.Models.PS2DPol (
'ps2D_%s' % name ,
'PS2DPol(%s)' % name ,
self.x ,
self.y ,
self.phasespacex ,
self.phasespacey ,
self.nx ,
self.ny ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'psx' : self.phasespacex ,
'psy' : self.phasespacey ,
'nx' : self.nx ,
'ny' : self.ny ,
'the_phis' : self.phis ,
}
@property
def mass1 ( self ) :
"""``mass1''-variable for the fit (alias for ``x'' or ``xvar'')"""
return self.xvar
@property
def mass2 ( self ) :
"""``mass2''-variable for the fit (alias for ``y'' or ``yvar'')"""
return self.yvar
@property
def phasespacex( self ) :
"""``x-phasespace''-function for PSPol2D-function"""
return self.__phasespacex
@property
def phasespacey( self ) :
"""``y-phasespace''-function for PSPol2D-function"""
return self.__phasespacey
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__nx
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__ny
models.append ( PSPol2D_pdf )
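The resulting shape is the product of two 1D phase-space envelopes and the positive polynomial. A numeric toy sketch (the power-law envelope below is a stand-in for illustration, not `Ostap::Math::PhaseSpaceNL`):

```python
def toy_ps(z, lo, hi):
    """Toy phase-space-like envelope: vanishes at both edges of [lo, hi]."""
    if not (lo < z < hi):
        return 0.0
    t = (z - lo) / (hi - lo)
    return t**0.5 * (1.0 - t)**1.5

def f(x, y, poly=lambda x, y: 1.0):
    # product of x- and y-envelopes, modulated by a positive polynomial
    return toy_ps(x, 0.0, 1.0) * toy_ps(y, 0.0, 1.0) * poly(x, y)

assert f(0.0, 0.5) == 0.0   # zero at the x threshold
assert f(0.5, 0.5) > 0.0    # positive inside the allowed range
```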
# =============================================================================
## @class PSPol2D2_pdf
# The 2D-function that represents a non-factorizable "product" of
# phase-space functions modulated by the 2D-positive polynomial.
# The function is useful to describe e.g. 2D-distributions of
# \f$ m_{23}\f$ vs \f$m_{45}\f$ from 5-body decays.
#
# The function is:
# \f[ f(x,y) = \frac{1}{2}
# \left( \Phi_{k,n}(x;x_{low},x_{high}) \Phi_{l,m-1}(y,y_{low},m_{max}-x)
# + \Phi_{l,m}(y;y_{low},y_{high}) \Phi_{k,n-1}(x,x_{low},m_{max}-y)
# \right) P_{N^{x},N^{y}}(x,y) \f]
# where
# - \f$ \Phi_{i,j}(z;z_{low},z_{high})\f$ are normalized phase space functions
# for mass of \f$i\f$-particles from \f$j\f$-body decays
# - \f$ P_{N^{x},N^{y}}(x,y) \f$ is 2D positive Bernstein polynomial
# - \f$m_{max}\f$ is a maximal allowed mass for \f$x+y\f$
#
# @see Ostap::Models::PS2DPol2
# @see Ostap::Math::PS2DPol2
# @see Ostap::Math::PhaseSpaceNL
# @see Ostap::Math::Positive2D
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PSPol2D2_pdf(PolyBase2) :
r"""Product of phase space factors, modulated by the positive polynomial in 2D
f(x,y) =
\frac{1}{2} \left(
\Phi_{k,n}(x;x_{low},x_{high}) \Phi_{l,m-1}(y,y_{low},m_{max}-x)
+ \Phi_{l,m}(y;y_{low},y_{high}) \Phi_{k,n-1}(x,x_{low},m_{max}-y) \right)
P_{N^{x},N^{y}}(x,y)
where
- \Phi_{i,j}(z;z_{low},z_{high}) are normalized phase space functions
for mass of i-particles from j-body decays
- P_{N^{x},N^{y}}(x,y) is 2D positive Bernstein polynomial
- m_{max} is a maximal allowed mass for x+y
Note:
- f(x,y)>=0 for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
psx , ## phase space in X, Ostap::Math::PhaseSpaceNL
psy , ## phase space in Y, Ostap::Math::PhaseSpaceNL
mmax , ## max-mass
nx = 2 , ## polynomial degree in X
ny = 2 , ## polynomial degree in Y
the_phis = None ) :
## check arguments
assert isinstance ( nx , int ) and 0 <= nx < 100 , "``nx''-parameter is illegal: %s" % nx
assert isinstance ( ny , int ) and 0 <= ny < 100 , "``ny''-parameter is illegal: %s" % ny
assert isinstance ( mmax , float ) , "``mmax''-parameter is illegal: %s" % mmax
## the base
PolyBase2.__init__ ( self , name , xvar , yvar ,
( nx + 1 ) * ( ny + 1 ) - 1 , the_phis )
self.__phasespacex = psx
self.__phasespacey = psy
self.__nx = nx
self.__ny = ny
self.__mmax = mmax
#
## finally build PDF
#
self.pdf = Ostap.Models.PS2DPol2 (
'ps2D2_%s' % name ,
'PS2DPol2(%s)' % name ,
self.x ,
self.y ,
self.phasespacex ,
self.phasespacey ,
self.mmax ,
self.nx ,
self.ny ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'psx' : self.phasespacex ,
'psy' : self.phasespacey ,
'mmax' : self.mmax ,
'nx' : self.nx ,
'ny' : self.ny ,
'the_phis' : self.phis ,
}
@property
def mass1 ( self ) :
"""``mass1''-variable for the fit (alias for ``x'' or ``xvar'')"""
return self.xvar
@property
def mass2 ( self ) :
"""``mass2''-variable for the fit (alias for ``y'' or ``yvar'')"""
return self.yvar
@property
def phasespacex( self ) :
"""``x-phasespace''-function for PSPol2D-function"""
return self.__phasespacex
@property
def phasespacey( self ) :
"""``y-phasespace''-function for PSPol2D-function"""
return self.__phasespacey
@property
def mmax ( self ) :
"""``mmax''-parameter - the maximal allowed mass"""
return self.__mmax
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__nx
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__ny
models.append ( PSPol2D2_pdf )
# =============================================================================
## @class PSPol2D3_pdf
# The 2D-function that represents a non-factorizable "product" of
# two modulated phase-space functions.
# It can be considered as a simpler alternative for class PSPol2D2_pdf
# The function is useful to describe e.g. 2D-distributions of
# \f$ m_{23}\f$ vs \f$m_{45}\f$ from 5-body decays.
#
# The function is:
# \f[ f(x,y) = \frac{1}{2}
# \left( \Phi^{(N^{x})}_{k,n}(x;x_{low},x_{high}) \Phi_{l,m-1}(y,y_{low},m_{max}-x)
# + \Phi^{(N^{y})}_{l,m}(y;y_{low},y_{high}) \Phi_{k,n-1}(x,x_{low},m_{max}-y)
# \right) \f]
# where
# - \f$\Phi_{i,j}(z;z_{low},z_{high})\f$ are normalized phase space functions
# for mass of \f$i\f$-particles from \f$j\f$-body decays
# - \f$\Phi^{(N)}_{i,j}(z;z_{low},z_{high})\f$ are normalized phase space functions
# for mass of \f$i\f$-particles from \f$j\f$-body decays, modulated by
# 1D positive Bernstein polynomial of degree \f$N\f$
# - \f$m_{max}\f$ is a maximal allowed mass for \f$x+y\f$
#
# @see Ostap::Models::PS2DPol3
# @see Ostap::Math::PS2DPol3
# @see Ostap::Math::PhaseSpacePol
# @see Ostap::Math::PhaseSpaceNL
# @see Ostap::Math::Positive2D
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PSPol2D3_pdf(PolyBase2) :
r"""Product of phase space factors, modulated by the positive polynomial in 2D
The function is:
f(x,y) = \frac{1}{2}
\left( \Phi^{(N^{x})}_{k,n}(x;x_{low},x_{high}) \Phi_{l,m-1}(y,y_{low},m_{max}-x)
+ \Phi^{(N^{y})}_{l,m}(y;y_{low},y_{high}) \Phi_{k,n-1}(x,x_{low},m_{max}-y)
\right)
where
- \Phi_{i,j}(z;z_{low},z_{high}) are normalized phase space functions
for mass of i-particles from j-body decays
- \Phi^{(N)}_{i,j}(z;z_{low},z_{high}) are normalized phase space functions
for mass of i-particles from j-body decays, modulated by
1D positive Bernstein polynomial of degree N
- m_{max} is a maximal allowed mass for x+y
Note:
- f(x,y)>=0 for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
psx , ## phase space in X, Ostap::Math::PhaseSpaceNL
psy , ## phase space in Y, Ostap::Math::PhaseSpaceNL
mmax , ## max-mass
nx = 2 , ## polynomial degree in X
ny = 2 , ## polynomial degree in Y
the_phis = None ) :
## check arguments
assert isinstance ( nx , int ) and 0 <= nx < 100 , "``nx''-parameter is illegal: %s" % nx
assert isinstance ( ny , int ) and 0 <= ny < 100 , "``ny''-parameter is illegal: %s" % ny
assert isinstance ( mmax , float ) , "``mmax''-parameter is illegal: %s" % mmax
## the base
PolyBase2.__init__ ( self , name , xvar , yvar , nx + ny , the_phis )
self.__phasespacex = psx
self.__phasespacey = psy
self.__nx = nx
self.__ny = ny
self.__mmax = mmax
#
## finally build PDF
#
self.pdf = Ostap.Models.PS2DPol3 (
'ps2D3_%s' % name ,
'PS2DPol3(%s)' % name ,
self.x ,
self.y ,
self.phasespacex ,
self.phasespacey ,
self.mmax ,
self.nx ,
self.ny ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'psx' : self.phasespacex ,
'psy' : self.phasespacey ,
'mmax' : self.mmax ,
'nx' : self.nx ,
'ny' : self.ny ,
'the_phis' : self.phis ,
}
@property
def mass1 ( self ) :
"""``mass1''-variable for the fit (alias for ``x'' or ``xvar'')"""
return self.xvar
@property
def mass2 ( self ) :
"""``mass2''-variable for the fit (alias for ``y'' or ``yvar'')"""
return self.yvar
@property
def phasespacex( self ) :
"""``x-phasespace''-function for PSPol2D-function"""
return self.__phasespacex
@property
def phasespacey( self ) :
"""``y-phasespace''-function for PSPol2D-function"""
return self.__phasespacey
@property
def mmax ( self ) :
"""``mmax''-parameter - the maximal allowed mass"""
return self.__mmax
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__nx
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__ny
models.append ( PSPol2D3_pdf )
# =============================================================================
## @class PSPol2Dsym_pdf
# The symmetric 2D-function that represents a cross-product
# \f$ \Phi_{k,l}(x)\f$ and \f$ \Phi_{m,n}(y)\f$,
# modulated by the 2D-positive symmetric polynomial.
# It is a "symmetrised" version of class PSPol2D
#
# The function is:
# \f[ f(x,y) =
# \Phi_{k,l}(x;x_{low}, x_{high})
# \Phi_{k,l}(y;y_{low}, y_{high})
# P_{N,N}(x,y) \f]
# where
# - \f$ \Phi_{k,l}(x;x_{low},x_{high}) \f$ is a phase-space function,
# \f$ y_{low}=x_{low}\f$ and \f$y_{high}=x_{high}\f$
# - \f$ P_{N,N}(x,y) \f$ is 2D positive symmetric Bernstein polynomial
#
# Clearly the function is symmetric: \f$f(x,y) = f(y,x) \f$
# @see Ostap::Models::PS2DPolSym
# @see Ostap::Math::PS2DPolSym
# @see Ostap::Math::PhaseSpaceNL
# @see Ostap::Math::Positive2DSym
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PSPol2Dsym_pdf(PolyBase2) :
r"""Symmetric product of phase space factors, modulated by the positive polynomial in 2D
The function is:
f(x,y) =
\Phi_{k,l}(x;x_{low}, x_{high})
* \Phi_{k,l}(y;y_{low}, y_{high})
* P_{N,N}(x,y)
where:
- \Phi_{k,l}(x;x_{low},x_{high}) is a phase-space function,
with y_{low}=x_{low} and y_{high}=x_{high}
- P_{N,N}(x,y) is 2D positive symmetric Bernstein polynomial
Note:
- f(x,y)>=0 for whole 2D-range
- f(x,y)=f(y,x) for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
ps , ## phase space in X, Ostap::Math::PhaseSpaceNL
n = 2 , ## polynomial degree in Y
the_phis = None ) :
## check arguments
assert isinstance ( n , int ) and 0 <= n < 100 , "``n''-parameter is illegal: %s" % n
##
## the base
PolyBase2.__init__ ( self , name , xvar , yvar , ( n + 1 ) * ( n + 2 ) // 2 - 1 , the_phis )
if self.xminmax() != self.yminmax() :
logger.warning( 'PSPol2Dsym_pdf: x&y have different edges %s vs %s' % ( self.xminmax() , self.yminmax() ) )
self.__phasespace = ps
self.__n = n
#
## finally build PDF
#
self.pdf = Ostap.Models.PS2DPolSym (
'ps2Ds_%s' % name ,
'PS2DPolSym(%s)' % name ,
self.x ,
self.y ,
self.phasespace ,
self.n ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'ps' : self.phasespace ,
'n' : self.n ,
'the_phis' : self.phis ,
}
@property
def mass1 ( self ) :
"""``mass1''-variable for the fit (alias for ``x'' or ``xvar'')"""
return self.xvar
@property
def mass2 ( self ) :
"""``mass2''-variable for the fit (alias for ``y'' or ``yvar'')"""
return self.yvar
@property
def phasespace ( self ) :
"""``phasespace''-function for PSPol2DSym-function"""
return self.__phasespace
@property
def phasespacex( self ) :
"""``x-phasespace''-function for PSPol2Dsym-function"""
return self.__phasespace
@property
def phasespacey( self ) :
"""``y-phasespace''-function for PSPol2Dsym-function"""
return self.__phasespace
@property
def n ( self ) :
"""``n''-parameter - order/degree of 2D-polynom in x&y-directions"""
return self.__n
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__n
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__n
models.append ( PSPol2Dsym_pdf )
# =============================================================================
## @class PSPol2D2sym_pdf
#
# The symmetric 2D-function, that represents a non-factorizable "product" of
# phase-space functions modulated by the 2D-positive polynomial.
# It is a symmetrised version of class PSPol2D2.
# The function is useful to describe e.g. 2D-distributions of
# \f$ m_{23}\f$ vs \f$m_{45}\f$ from 5-body decays.
#
# The function is:
# \f[ f(x,y) = \frac{1}{2}
# \left( \Phi_{k,n}(x;x_{low},x_{high}) \Phi_{k,n-1}(y,y_{low},m_{max}-x)
# + \Phi_{k,n}(y;y_{low},y_{high}) \Phi_{k,n-1}(x,x_{low},m_{max}-y)
# \right)
# P_{N,N}(x,y) \f]
# where
# - \f$ \Phi_{i,j}(x;x_{low},x_{high})\f$ are normalized phase space functions
# for mass of \f$i\f$-particles from \f$j\f$-body decays;
# - \f$ y_{low}=x_{low}\f$ and \f$y_{high}=x_{high}\f$
# - \f$ P_{N,N}(x,y) \f$ is 2D positive symmetric Bernstein polynomial
# - \f$m_{max}\f$ is a maximal allowed mass for \f$x+y\f$
#
# Clearly the function is symmetric \f$f(x,y) = f(y,x) \f$
# @see Ostap::Models::PS2DPolSym
# @see Ostap::Math::PS2DPolSym
# @see Ostap::Math::PhaseSpaceNL
# @see Ostap::Math::Positive2DSym
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PSPol2D2sym_pdf(PolyBase2) :
"""Symmetric product of phase space factors, modulated by the positive polynom in 2D
f(x,y) = PS(x) * PS(y) * Pn(x,y)
where
- PS(x) is a phase space function (Ostap::Math::PhaseSpaceNL)
- Pnk(x,y) is positive symmetric non-factorizable polynom (Ostap::Math::Positive2DSym)
-- Pn(x,y) = sum^{i=n}_{i=0}sum^{j=n}_{j=0} a^2_{ij} B^n_i(x) B^n_j(y), where
--- B^n_i - are Bernstein polynomials
--- a_{ij}=a_{ji}
Note:
- f(x,y)>=0 for whole 2D-range
- f(x,y)=f(y,x) for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
ps , ## phase space in X, Ostap::Math::PhaseSpaceNL
mmax , ## max-mass
n = 2 , ## polynomial degree in Y
the_phis = None ) :
## check arguments
assert isinstance ( n , int ) and 0 <= n < 100 , "``n''-parameter is illegal: %s" % n
assert isinstance ( mmax , float ) , "``mmax''-parameter is illegal: %s" % mmax
##
## the base
PolyBase2.__init__ ( self , name , xvar , yvar , ( n + 1 ) * ( n + 2 ) // 2 - 1 , the_phis )
if self.xminmax() != self.yminmax() :
logger.warning( 'PSPol2D2sym_pdf: x&y have different edges %s vs %s' % ( self.xminmax() , self.yminmax() ) )
self.__phasespace = ps
self.__n = n
self.__mmax = mmax
#
## finally build PDF
#
self.pdf = Ostap.Models.PS2DPol2Sym (
'ps2D2s_%s' % name ,
'PS2DPol2Sym(%s)' % name ,
self.x ,
self.y ,
self.phasespace ,
self.mmax ,
self.n ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'ps' : self.phasespace ,
'mmax' : self.mmax ,
'n' : self.n ,
'the_phis' : self.phis ,
}
@property
def mass1 ( self ) :
"""``mass1''-variable for the fit (alias for ``x'' or ``xvar'')"""
return self.xvar
@property
def mass2 ( self ) :
"""``mass2''-variable for the fit (alias for ``y'' or ``yvar'')"""
return self.yvar
@property
def phasespace ( self ) :
"""``phasespace''-function for PSPol2DSym-function"""
return self.__phasespace
@property
def phasespacex( self ) :
"""``x-phasespace''-function for PSPol2Dsym-function"""
return self.__phasespace
@property
def phasespacey( self ) :
"""``y-phasespace''-function for PSPol2Dsym-function"""
return self.__phasespace
@property
def mmax ( self ) :
"""``mmax''-parameter - the maximal allowed mass"""
return self.__mmax
@property
def n ( self ) :
"""``n''-parameter - order/degree of 2D-polynom in x&y-directions"""
return self.__n
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__n
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__n
models.append ( PSPol2D2sym_pdf )
# =============================================================================
## @class PSPol2D3sym_pdf
#
# The symmetric 2D-function, that represents a non-factorizable "product" of
# two modulated phase-space functions.
# It is a symmetrized version of PSPol2D3_pdf
# The function is useful to describe e.g. 2D-distributions of
# \f$ m_{23}\f$ vs \f$m_{45}\f$ from 5-body decays.
#
# The function is:
# \f[ f(x,y) = \frac{1}{2}
# \left( \Phi^{(N)}_{k,n}(x;x_{low},x_{high}) \Phi_{k,n-1}(y,y_{low},m_{max}-x)
# + \Phi^{(N)}_{k,n}(y;y_{low},y_{high}) \Phi_{k,n-1}(x,x_{low},m_{max}-y)
# \right) \f]
# where
# - \f$ \Phi_{i,j}(z;z_{low},z_{high})\f$ are normalized phase space functions
# for mass of \f$i\f$-particles from \f$j\f$-body decays
# - \f$\Phi^{(N)}_{i,j}(z;z_{low},z_{high})\f$ are normalized phase space functions
# for mass of \f$i\f$-particles from \f$j\f$-body decays, modulated by
# 1D positive Bernstein polynomial of degree \f$N\f$
# - \f$m_{max}\f$ is a maximal allowed mass for \f$x+y\f$
#
# Clearly the function is symmetric: \f$f(y,x)=f(x,y)\f$
# @see Ostap::Models::PS2DPolSym
# @see Ostap::Math::PS2DPolSym
# @see Ostap::Math::PhaseSpaceNL
# @see Ostap::Math::Positive2DSym
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class PSPol2D3sym_pdf(PolyBase2) :
"""Symmetric product of phase space factors, modulated by the positive polynom in 2D
f(x,y) = PS(x) * PS(y) * Pn(x,y)
where
- PS(x) is a phase space function (Ostap::Math::PhaseSpaceNL)
- Pnk(x,y) is positive symmetric non-factorizable polynom (Ostap::Math::Positive2DSym)
-- Pn(x,y) = sum^{i=n}_{i=0}sum^{j=n}_{j=0} a^2_{ij} B^n_i(x) B^n_j(y), where
--- B^n_i - are Bernstein polynomials
--- a_{ij}=a_{ji}
Note:
- f(x,y)>=0 for whole 2D-range
- f(x,y)=f(y,x) for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
ps , ## phase space in X, Ostap::Math::PhaseSpaceNL
mmax , ## max-mass
n = 2 , ## polynomial degree in Y
the_phis = None ) :
## check arguments
assert isinstance ( n , int ) and 0 <= n < 100 , "``n''-parameter is illegal: %s" % n
assert isinstance ( mmax , float ) , "``mmax''-parameter is illegal: %s" % mmax
##
## the base
PolyBase2.__init__ ( self , name , xvar , yvar , n , the_phis )
if self.xminmax() != self.yminmax() :
logger.warning( 'PSPol2D3sym_pdf: x&y have different edges %s vs %s' % ( self.xminmax() , self.yminmax() ) )
self.__phasespace = ps
self.__n = n
self.__mmax = mmax
#
## finally build PDF
#
self.pdf = Ostap.Models.PS2DPol3Sym (
'ps2D3s_%s' % name ,
'PS2DPol3Sym(%s)' % name ,
self.x ,
self.y ,
self.phasespace ,
self.mmax ,
self.n ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'ps' : self.phasespace ,
'mmax' : self.mmax ,
'n' : self.n ,
'the_phis' : self.phis ,
}
@property
def mass1 ( self ) :
"""``mass1''-variable for the fit (alias for ``x'' or ``xvar'')"""
return self.xvar
@property
def mass2 ( self ) :
"""``mass2''-variable for the fit (alias for ``y'' or ``yvar'')"""
return self.yvar
@property
def phasespace ( self ) :
"""``phasespace''-function for PSPol2DSym-function"""
return self.__phasespace
@property
def phasespacex( self ) :
"""``x-phasespace''-function for PSPol2Dsym-function"""
return self.__phasespace
@property
def phasespacey( self ) :
"""``y-phasespace''-function for PSPol2Dsym-function"""
return self.__phasespace
@property
def mmax ( self ) :
"""``mmax''-parameter - the maximal allowed mass"""
return self.__mmax
@property
def n ( self ) :
"""``n''-parameter - order/degree of 2D-polynom in x&y-directions"""
return self.__n
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__n
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__n
models.append ( PSPol2D3sym_pdf )
# =============================================================================
## @class ExpoPSPol2D_pdf
# Product of an exponential and a phase-space factor, modulated by a positive polynomial in 2D
# \f$ f(x,y) = exp(\tau x) \times \Phi (y) \times P^+(x,y) \f$,
# where \f$ P^+(x,y)\f$ denotes the positive polynomial,
# @see Ostap::Models::ExpoPS2DPol
# @see Ostap::Math::ExpoPS2DPol
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class ExpoPSPol2D_pdf(PolyBase2) :
"""Product of exponential and phase space factor,
modulated by the positive polynom in 2D
f(x,y) = exp(tau*x) * PS(y) * Pnk(x,y)
where
- PS (y) is a phase space function for y-axis (Ostap::Math::PhaseSpaceNL)
- Pnk(x,y) is positive non-factorizable polynom
Pnk(x,y) = sum^{i=n}_{i=0}sum^{j=k}_{j=0} a^2_{ij} B^n_i(x) B^k_j(y)
where:
- B^n_i - are Bernstein polynomials
Note:
- f(x,y)>=0 for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
psy = None , ## phase space in Y, Ostap::Math::PhaseSpaceNL
nx = 2 , ## polynomial degree in X
ny = 2 , ## polynomial degree in Y
tau = None , ## the exponent
the_phis = None ) :
## check arguments
assert isinstance ( nx , int ) and 0 <= nx < 100 , "``nx''-parameter is illegal: %s" % nx
assert isinstance ( ny , int ) and 0 <= ny < 100 , "``ny''-parameter is illegal: %s" % ny
## the base
PolyBase2.__init__ ( self , name , xvar , yvar , ( nx + 1 ) * ( ny + 1 ) - 1 , the_phis )
limits_tau = ()
if self.xminmax() :
mn , mx = self.xminmax()
mmax = max ( abs ( mn ) , abs ( mx ) )
limits_tau = -500. / mmax , 500. / mmax
## the exponential slope
self.__tau = self.make_var ( tau ,
"tau_%s" % name ,
"tau(%s)" % name , tau , 0 , *limits_tau )
#
self.__phasespace = psy
self.__nx = nx
self.__ny = ny
#
## finally build PDF
#
self.pdf = Ostap.Models.ExpoPS2DPol (
'ps2D_%s' % name ,
'PS2DPol(%s)' % name ,
self.x ,
self.y ,
self.tau ,
self.phasespacey ,
self.nx ,
self.ny ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'psy' : self.phasespacey ,
'nx' : self.nx ,
'ny' : self.ny ,
'tau' : self.tau ,
'the_phis' : self.phis ,
}
@property
def mass1 ( self ) :
"""``mass1''-variable for the fit (alias for ``x'' or ``xvar'')"""
return self.xvar
@property
def mass2 ( self ) :
"""``mass2''-variable for the fit (alias for ``y'' or ``yvar'')"""
return self.yvar
@property
def tau ( self ) :
"""``tau''-parameters, the exponential slope for x-dimension"""
return self.__tau
@tau.setter
def tau ( self , value ) :
value = float ( value )
self.__tau.setVal ( value )
@property
def phasespace ( self ) :
"""``phasespace''-function for ExpoPSPol2D-function"""
return self.__phasespace
@property
def phasespacey( self ) :
"""``y-phasespace''-function for ExpoPSPol2D-function"""
return self.phasespace
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__nx
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__ny
models.append ( ExpoPSPol2D_pdf )
# =============================================================================
## @class ExpoPol2D_pdf
# Product of exponential factors, modulated by a positive polynomial in 2D
# \f$ f(x,y) = exp(\tau_x x) \times exp ( \tau_y y) \times P^+(x,y) \f$,
# where \f$ P^+(x,y)\f$ denotes the positive polynomial,
# @see Ostap::Models::Expo2DPol
# @see Ostap::Math::Expo2DPol
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class ExpoPol2D_pdf(PolyBase2) :
"""Product of exponential factors
modulated by the positive polynom in 2D
f(x,y) = exp(tau_x*x) * exp(tau_y*y) * Pnk(x,y)
where
- Pnk(x,y) is positive non-factorizable polynom
Pnk(x,y) = sum^{i=n}_{i=0}sum^{j=k}_{j=0} a^2_{ij} B^n_i(x) B^k_j(y)
where:
- B^n_i - are Bernstein polynomials
Note:
- f(x,y)>=0 for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
nx = 2 , ## polynomial degree in X
ny = 2 , ## polynomial degree in Y
taux = None , ## the exponent in X
tauy = None , ## the exponent in Y
the_phis = None ) :
## check arguments
assert isinstance ( nx , int ) and 0 <= nx < 100 , "``nx''-parameter is illegal: %s" % nx
assert isinstance ( ny , int ) and 0 <= ny < 100 , "``ny''-parameter is illegal: %s" % ny
PolyBase2.__init__ ( self , name , xvar , yvar ,
( nx + 1 ) * ( ny + 1 ) - 1 , the_phis )
limits_taux = ()
if self.xminmax() :
mn , mx = self.xminmax()
mmax = max ( abs ( mn ) , abs ( mx ) )
limits_taux = -500. / mmax , 500. / mmax
limits_tauy = ()
if self.yminmax() :
mn , mx = self.yminmax()
mmax = max ( abs ( mn ) , abs ( mx ) )
limits_tauy = -500. / mmax , 500. / mmax
self.__nx = nx
self.__ny = ny
#
## the exponential slopes
#
self.__taux = self.make_var ( taux ,
"taux_%s" % name ,
"taux(%s)" % name , taux , 0 , *limits_taux )
#
self.__tauy = self.make_var ( tauy ,
"tauy_%s" % name ,
"tauy(%s)" % name , tauy , 0 , *limits_tauy )
#
## finally build PDF
#
self.pdf = Ostap.Models.Expo2DPol (
'exp2D_%s' % name ,
'Expo2DPol(%s)' % name ,
self.x ,
self.y ,
self.taux ,
self.tauy ,
nx ,
ny ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'nx' : self.nx ,
'ny' : self.ny ,
'taux' : self.taux ,
'tauy' : self.tauy ,
'the_phis' : self.phis ,
}
@property
def taux ( self ) :
"""``tau-x''-parameters, the exponential slope for x-dimension"""
return self.__taux
@taux.setter
def taux ( self , value ) :
value = float ( value )
self.__taux.setVal ( value )
@property
def tauy ( self ) :
"""``tau-y''-parameters, the exponential slope for y-dimension"""
return self.__tauy
@tauy.setter
def tauy ( self , value ) :
value = float ( value )
self.__tauy.setVal ( value )
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__nx
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__ny
models.append ( ExpoPol2D_pdf )
# =============================================================================
## @class ExpoPol2Dsym_pdf
# Product of exponential factors, modulated by a symmetric positive polynomial in 2D
# \f$ f(x,y) = exp(\tau x) \times exp ( \tau y) \times P^+_{sym}(x,y) \f$,
# where \f$ P^+_{sym}(x,y)\f$ denotes the symmetric positive polynomial,
# @see Ostap::Models::Expo2DPolSym
# @see Ostap::Math::Expo2DPolSym
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class ExpoPol2Dsym_pdf(PolyBase2) :
"""Symmetric product of exponential factors modulated by the positive polynom in 2D
f(x,y) = exp(tau*x) * exp(tau*y) * Sn(x,y)
where
- Sn(x,y) is positive symmetric non-factorizable polynom
Sn(x,y) = sum^{i=n}_{i=0}sum^{j=n}_{j=0} a^2_{ij} B^n_i(x) B^n_j(y)
where:
- B^n_i - are Bernstein polynomials
- a_{ij}=a_{ji}
Note:
- f(x,y)>=0 for whole 2D-range
- f(x,y)=f(y,x)
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
n = 2 , ## polynomial degree in X and Y
tau = None , ## the exponent
the_phis = None ) :
## check arguments
assert isinstance ( n , int ) and 0 <= n < 100 , "``n''-parameter is illegal: %s" % n
##
PolyBase2.__init__ ( self , name , xvar , yvar ,
( n + 1 ) * ( n + 2 ) // 2 - 1 , the_phis )
if self.xminmax() != self.yminmax() :
logger.warning( 'ExpoPol2Dsym_pdf: x&y have different edges %s vs %s' % ( self.xminmax() , self.yminmax() ) )
limits_tau = ()
if self.xminmax() :
mn , mx = self.xminmax()
mmax = max ( abs ( mn ) , abs ( mx ) )
limits_tau = -500. / mmax , 500. / mmax
self.__n = n
#
## the exponential slopes
#
self.__tau = self.make_var ( tau ,
"tau_%s" % name ,
"tau(%s)" % name , tau , 0 , *limits_tau )
#
## finally build PDF
#
self.pdf = Ostap.Models.Expo2DPolSym (
'exp2Ds_%s' % name ,
'Expo2DPolSym(%s)' % name ,
self.x ,
self.y ,
self.tau ,
self.n ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'n' : self.n ,
'tau' : self.tau ,
'the_phis' : self.phis
}
@property
def tau ( self ) :
"""``tau''-parameter, the exponential slope for x&y-dimensions"""
return self.__tau
@tau.setter
def tau ( self , value ) :
value = float ( value )
self.__tau.setVal ( value )
@property
def taux ( self ) :
"""``tau-x''-parameters, the exponential slope for x-dimension"""
return self.__tau
@taux.setter
def taux ( self , value ) :
value = float ( value )
self.__tau.setVal ( value )
@property
def tauy ( self ) :
"""``tau-y''-parameters, the exponential slope for y-dimension"""
return self.__tau
@tauy.setter
def tauy ( self , value ) :
value = float ( value )
self.__tau.setVal ( value )
@property
def n ( self ) :
"""``n''-parameter - order/degree of 2D-polynom in x&y-directions"""
return self.__n
@property
def nx ( self ) :
"""``nx''-parameter - order/degree of 2D-polynom in x-direction"""
return self.__n
@property
def ny ( self ) :
"""``ny''-parameter - order/degree of 2D-polynom in y-direction"""
return self.__n
models.append ( ExpoPol2Dsym_pdf )
# =============================================================================
## @class Spline2D_pdf
# positive spline in 2D:
# \f{displaymath} f(x,y) = \sum^{i=n}_{i=0}\sum^{j=k}_{j=0} a^2_{ij} M^n_i(x) M^k_j(y) \f},
# where \f$ M^n_i(x)\f$ denotes the M-splines
# @see Ostap::Models::Spline2D
# @see Ostap::Math::Spline2D
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class Spline2D_pdf(PolyBase2) :
"""Positive non-factorizable spline in 2D
f(x,y) = sum_i sum_j a^2_{i,j} Nx_i(x) * Ny_j(y),
where
- Nx_i and Ny_j are normalized B-splines for x and y-axes
Note:
- f(x,y)>=0 for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
spline , ## the spline: Ostap.Math.PositiveSpline2D
the_phis = None ) : ##
PolyBase2.__init__ ( self , name , xvar , yvar , spline.npars() , the_phis )
self.__spline = spline
#
## finally build PDF
#
self.pdf = Ostap.Models.Spline2D (
's2Dp_%s' % name ,
'Spline2D(%s)' % name ,
self.x ,
self.y ,
self.spline ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'spline' : self.spline ,
'the_phis' : self.phis
}
@property
def spline ( self ) :
"""``spline''-function for Spline2D PDF"""
return self.__spline
models.append ( Spline2D_pdf )
# =============================================================================
## @class Spline2Dsym_pdf
# symmetric positive spline in 2D:
# \f{displaymath} f(x,y) = \sum^{i=n}_{i=0}\sum^{j=k}_{j=0} a^2_{ij} M^n_i(x) M^k_j(y) \f},
# where \f$ M^n_i(x)\f$ denotes the M-splines
# @see Ostap::Models::Spline2DSym
# @see Ostap::Math::Spline2DSym
# @author Vanya BELYAEV Ivan.Belyaev@itep.ru
# @date 2013-01-10
class Spline2Dsym_pdf(PolyBase2) :
"""SYMMETRIC positive non-factorizable spline in 2D
f(x,y) = sum_i sum_j a^2_{i,j} N_i(x) * N_j(y),
where
- N_i are normalized B-splines
Note:
- f(x,y)>=0 for whole 2D-range
- f(x,y)=f(y,x) for whole 2D-range
"""
def __init__ ( self ,
name ,
xvar , ## the first dimension
yvar , ## the second dimension
spline , ## the spline: Ostap.Math.PositiveSpline2DSym
the_phis = None ) :
PolyBase2.__init__ ( self , name , xvar , yvar , spline.npars() , the_phis )
if self.xminmax() != self.yminmax() :
logger.warning( 'Spline2Dsym_pdf: x&y have different edges %s vs %s' % ( self.xminmax() , self.yminmax() ) )
self.__spline = spline
#
## finally build PDF
#
self.pdf = Ostap.Models.Spline2DSym (
's2Dp_%s' % name ,
'Spline2DSym(%s)' % name ,
self.x ,
self.y ,
self.spline ,
self.phi_list )
## save configuration
self.config = {
'name' : self.name ,
'xvar' : self.xvar ,
'yvar' : self.yvar ,
'spline' : self.spline ,
'the_phis' : self.phis
}
@property
def spline ( self ) :
"""``spline''-function for Spline2Dsym PDF"""
return self.__spline
models.append ( Spline2Dsym_pdf )
# =============================================================================
# some tiny decoration of underlying classes
# =============================================================================
_rv = ROOT.gROOT.GetVersionInt() // 10000
def _2d_get_pars_ ( self ) :
"""
Get parameters of the underlying positive Bernstein polynomial
"""
if hasattr ( self , 'pdf' ) :
return _2d_get_pars_ ( self.pdf )
elif hasattr ( self , 'function' ) :
return _2d_get_pars_ ( self.function () )
elif hasattr ( self , 'positive' ) :
return _2d_get_pars_ ( self.positive () )
elif hasattr ( self , 'polynom' ) :
return _2d_get_pars_ ( self. polynom () )
elif hasattr ( self , 'bernstein' ) :
b = self.bernstein()
m = ROOT.TMatrix ( b.nX() + 1 , b.nY() + 1 )
for i in range ( 0 , b.nX() + 1 ) :
for j in range ( 0 , b.nY() + 1 ) :
if _rv < 6 : m[i][j] = b.par(i,j)
else : m[i, j] = b.par(i,j)
return m
return ROOT.TMatrix()
for t in ( PolyPos2D_pdf ,
PolyPos2Dsym_pdf ,
PSPol2D_pdf ,
PSPol2Dsym_pdf ,
ExpoPSPol2D_pdf ,
ExpoPol2D_pdf ,
ExpoPol2Dsym_pdf ) :
t.pars = _2d_get_pars_
# =============================================================================
# ==============================================================================
## Easy creation of 2D function for background
# @code
# xvar = ...
# yvar = ...
# bkg = make_B2D ( 'BB' , xvar , yvar , -1 , -1 ) ## create PolyPos2D_pdf
# bkg = make_B2D ( 'BB' , xvar , yvar , 1 , 1 ) ## create ExpoPol2D_pdf
# bkg = make_B2D ( 'BB' , xvar , yvar , 1 , -1 ) ## create ExpoPol2D_pdf, fix tau_y
# bkg = make_B2D ( 'BB' , xvar , yvar , -1 , 1 ) ## create ExpoPol2D_pdf, fix tau_x
# bkg = make_B2D ( 'BB' , xvar , yvar , 0 , 0 ) ## create Flat2D
# @endcode
def make_B2D ( name , xvar , yvar , nx , ny ) :
"""Easy creation of 2D function for background
>>> xvar = ...
>>> yvar = ...
>>> bkg = make_B2D ( 'BB' , xvar , yvar , -1 , -1 ) ## create PolyPos2D_pdf
>>> bkg = make_B2D ( 'BB' , xvar , yvar , 1 , 1 ) ## create ExpoPol2D_pdf
>>> bkg = make_B2D ( 'BB' , xvar , yvar , 1 , -1 ) ## create ExpoPol2D_pdf, fix tau_y
>>> bkg = make_B2D ( 'BB' , xvar , yvar , -1 , 1 ) ## create ExpoPol2D_pdf, fix tau_x
>>> bkg = make_B2D ( 'BB' , xvar , yvar , 0 , 0 ) ## create Flat2D
"""
if 0 == nx and 0 == ny :
return Flat2D ( name = name , xvar = xvar , yvar = yvar )
elif 0 >= nx and 0 >= ny :
return PolyPos2D_pdf ( name = name , xvar = xvar , yvar = yvar , nx = abs ( nx ) , ny = abs ( ny ) )
fun2 = ExpoPol2D_pdf ( name = name , xvar = xvar , yvar = yvar , nx = abs ( nx ) , ny = abs ( ny ) )
if 0 > nx : fun2.taux.fix ( 0 )
if 0 > ny : fun2.tauy.fix ( 0 )
return fun2
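# A ROOT-free sketch of the make_B2D sign convention above: the signs of
# (nx, ny) pick the background family, the magnitudes give the polynomial
# degrees, and a negative value means "fix the corresponding slope to 0".
# The returned tuples are illustrative only, not real ostap objects.

```python
def b2d_choice(nx, ny):
    if nx == 0 and ny == 0:
        return ('Flat2D', 0, 0, ())                    # flat background
    if nx <= 0 and ny <= 0:
        return ('PolyPos2D_pdf', abs(nx), abs(ny), ()) # pure polynomial
    # at least one positive degree: exponential x polynomial shape,
    # fixing the slope of every dimension given with a negative degree
    fixed = tuple(t for t, v in (('taux', nx), ('tauy', ny)) if v < 0)
    return ('ExpoPol2D_pdf', abs(nx), abs(ny), fixed)

print(b2d_choice(-1, -1))   # ('PolyPos2D_pdf', 1, 1, ())
print(b2d_choice(1, -1))    # ('ExpoPol2D_pdf', 1, 1, ('tauy',))
print(b2d_choice(0, 0))     # ('Flat2D', 0, 0, ())
```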
# ==============================================================================
## Easy creation of symmetric 2D function for background
# @code
# xvar = ...
# yvar = ...
# bkg = make_B2Dsym ( 'BB' , xvar , yvar , -1 ) ## create PolyPos2Dsym_pdf
# bkg = make_B2Dsym ( 'BB' , xvar , yvar , 1 ) ## create ExpoPol2Dsym_pdf
# bkg = make_B2Dsym ( 'BB' , xvar , yvar , 0 ) ## create Flat2D
# @endcode
def make_B2Dsym ( name , xvar , yvar , n ) :
"""Easy creation of symmetric 2D function for background
>>> xvar = ...
>>> yvar = ...
>>> bkg = make_B2Dsym ( 'BB' , xvar , yvar , -1 ) ## create PolyPos2Dsym_pdf
>>> bkg = make_B2Dsym ( 'BB' , xvar , yvar , 1 ) ## create ExpoPol2Dsym_pdf
>>> bkg = make_B2Dsym ( 'BB' , xvar , yvar , 0 ) ## create Flat2D
"""
if 0 == n :
return Flat2D ( name = name , xvar = xvar , yvar = yvar )
elif 0 >= n :
return PolyPos2Dsym_pdf ( name = name , xvar = xvar , yvar = yvar , n = abs ( n ) )
fun2 = ExpoPol2Dsym_pdf ( name = name , xvar = xvar , yvar = yvar , n = abs ( n ) )
return fun2
# =============================================================================
if '__main__' == __name__ :
from ostap.utils.docme import docme
docme ( __name__ , logger = logger , symbols = models )
# =============================================================================
# The END
# =============================================================================
| 35.829208 | 120 | 0.474301 | 6,736 | 57,900 | 3.947892 | 0.048397 | 0.007596 | 0.005641 | 0.016847 | 0.856654 | 0.83725 | 0.825969 | 0.811717 | 0.785507 | 0.764901 | 0 | 0.022348 | 0.357012 | 57,900 | 1,615 | 121 | 35.851393 | 0.691961 | 0.43304 | 0 | 0.761364 | 0 | 0 | 0.063677 | 0.001419 | 0 | 0 | 0 | 0 | 0.026515 | 1 | 0.117424 | false | 0 | 0.010101 | 0 | 0.248737 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
81be216290e6d7c708a4dfd94d240dd8ed6a564f | 139 | py | Python | Mundo01/Python/aula11a.py | molonti/CursoemVideo---Python | 4f6a7af648f7f619d11e95fa3dc7a33b28fcfa11 | [
"MIT"
] | null | null | null | Mundo01/Python/aula11a.py | molonti/CursoemVideo---Python | 4f6a7af648f7f619d11e95fa3dc7a33b28fcfa11 | [
"MIT"
] | null | null | null | Mundo01/Python/aula11a.py | molonti/CursoemVideo---Python | 4f6a7af648f7f619d11e95fa3dc7a33b28fcfa11 | [
"MIT"
] | null | null | null | print('Ola, Mundo')
print('\033[1;31;43mOlá, Mundo!\033[m')
print('\033[4;30;45mOlá, Mundo!\033[m')
print('\033[0;33;44mOlá, Mundo!\033[m') | 34.75 | 39 | 0.654676 | 27 | 139 | 3.37037 | 0.518519 | 0.263736 | 0.296703 | 0.307692 | 0.373626 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0.05036 | 139 | 4 | 40 | 34.75 | 0.439394 | 0 | 0 | 0 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
81df01a288eb4529ad94cef2c6b3cd4e39a040d9 | 6,468 | py | Python | user_details/migrations/0001_initial.py | Shreyanshsachan/College-Predictor | 87068aa1d1a889ced586ff155bc2b5d9a78340f7 | [
"MIT"
] | null | null | null | user_details/migrations/0001_initial.py | Shreyanshsachan/College-Predictor | 87068aa1d1a889ced586ff155bc2b5d9a78340f7 | [
"MIT"
] | null | null | null | user_details/migrations/0001_initial.py | Shreyanshsachan/College-Predictor | 87068aa1d1a889ced586ff155bc2b5d9a78340f7 | [
"MIT"
] | null | null | null | # Generated by Django 2.2a1 on 2019-03-20 09:59
from django.conf import settings
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
import multiselectfield.db.fields
import user_details.validation
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='FillProfile',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('adv_air', models.PositiveIntegerField(validators=[django.core.validators.MaxValueValidator(100), django.core.validators.MinValueValidator(1), user_details.validation.validate_zero])),
('mains_air', models.PositiveIntegerField(validators=[django.core.validators.MaxValueValidator(100), django.core.validators.MinValueValidator(1), user_details.validation.validate_zero])),
('state', models.CharField(choices=[('andhra pradesh', 'Andhra Pradesh'), ('arrunachal pradesh', 'Arrunachal Pradesh'), ('assam', 'Assam'), ('bihar', 'Bihar'), ('chattisgarh', 'Chattisgarh'), ('goa', 'Goa'), ('gujrat', 'Gujrat'), ('haryana', 'Haryana'), ('himachal pradesh', 'Himachal Pradesh'), ('jammu & kashmir', 'Jammu & Kashmir'), ('jharkhand', 'Jharkhand'), ('karnataka', 'Karnataka'), ('kerala', 'Kerala'), ('madhya pradesh', 'Madhya Pradesh'), ('maharashtra', 'Maharashtra'), ('manipur', 'Manipur'), ('meghalaya', 'Meghalaya'), ('mizoram', 'Mizoram'), ('nagaland', 'Nagaland'), ('odisha', 'Odisha'), ('rajasthan', 'Rajasthan'), ('sikkim', 'Sikkim'), ('tamil nadu', 'Tamil Nadu'), ('telangana', 'Telangana'), ('tripura', 'Tripura'), ('uttarakhand', 'Uttharakhand'), ('uttar pradesh', 'Uttar Pradesh'), ('west bengal', 'West Bengal')], max_length=100)),
('category', models.CharField(choices=[('general', 'General'), ('general-pwd', 'General-PwD'), ('obc-ncl', 'OBC-NCL'), ('obc-ncl-pwd', 'OBC-NCL-PWD'), ('sc', 'SC'), ('st', 'ST')], max_length=100)),
('gender', models.BooleanField(choices=[(True, 'Male'), (False, 'Female')], default=True)),
('Logged_in_user', models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='FillPrefer',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('college_selected', multiselectfield.db.fields.MultiSelectField(choices=[('Indian Institute of Technology Bombay, Aerospace Engineering', 'Indian Institute of Technology Bombay, Aerospace Engineering'), ('Indian Institute of Technology Bombay, Chemical Engineering', 'Indian Institute of Technology Bombay, Chemical Engineering'), ('Indian Institute of Technology Bombay, Civil Engineering', 'Indian Institute of Technology Bombay, Civil Engineering'), ('Indian Institute of Technology Bombay, Computer Science and Engineering', 'Indian Institute of Technology Bombay, Computer Science and Engineering'), ('Indian Institute of Technology Bombay, Electrical Engineering', 'Indian Institute of Technology Bombay, Electrical Engineering'), ('Indian Institute of Technology Bombay, Electronics and Communication', 'Indian Institute of Technology Bombay, Electronics and Communication'), ('Indian Institute of Technology Bombay, Mechanical Engineering', 'Indian Institute of Technology Bombay, Mechanical Engineering'), ('Indian Institute of Technology Delhi, Aerospace Engineering', 'Indian Institute of Technology Delhi, Aerospace Engineering'), ('Indian Institute of Technology Delhi, Chemical Engineering', 'Indian Institute of Technology Delhi, Chemical Engineering'), ('Indian Institute of Technology Delhi, Civil Engineering', 'Indian Institute of Technology Delhi, Civil Engineering'), ('Indian Institute of Technology Delhi, Computer Science and Engineering', 'Indian Institute of Technology Delhi, Computer Science and Engineering'), ('Indian Institute of Technology Delhi, Electrical Engineering', 'Indian Institute of Technology Delhi, Electrical Engineering'), ('Indian Institute of Technology Delhi, Electronics and Communication', 'Indian Institute of Technology Delhi, Electronics and Communication'), ('Indian Institute of Technology Delhi, Mechanical Engineering', 'Indian Institute of Technology Delhi, Mechanical Engineering'), ('National Institute of Technology Surathkal, Chemical 
Engineering', 'National Institute of Technology Surathkal, Chemical Engineering'), ('National Institute of Technology Surathkal, Civil Engineering', 'National Institute of Technology Surathkal, Civil Engineering'), ('National Institute of Technology Surathkal, Computer Science and Engineering', 'National Institute of Technology Surathkal, Computer Science and Engineering'), ('National Institute of Technology Surathkal, Electrical Engineering', 'National Institute of Technology Surathkal, Electrical Engineering'), ('National Institute of Technology Surathkal, Electronics and Communication', 'National Institute of Technology Surathkal, Electronics and Communication'), ('National Institute of Technology Surathkal, Information Technology', 'National Institute of Technology Surathkal, Information Technology'), ('National Institute of Technology Surathkal, Mechanical Engineering', 'National Institute of Technology Surathkal, Mechanical Engineering'), ('National Institute of Technology Trichy, Chemical Engineering', 'National Institute of Technology Trichy, Chemical Engineering'), ('National Institute of Technology Trichy, Civil Engineering', 'National Institute of Technology Trichy, Civil Engineering'), ('National Institute of Technology Trichy, Computer Science and Engineering', 'National Institute of Technology Trichy, Computer Science and Engineering'), ('National Institute of Technology Trichy, Electrical Engineering', 'National Institute of Technology Trichy, Electrical Engineering'), ('National Institute of Technology Trichy, Electronics and Communication', 'National Institute of Technology Trichy, Electronics and Communication')], max_length=1687)),
('Logged_in_user', models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
]
# src/Test/test_fuzzy_parse.py (CelineDknp/SemiParsingCFG, MIT)
import pytest
from Utils.config import *
from main import process_and_parse
#Testing the base parsing for if/else
def test_fuzzy_parse_base_if():
node_array = process_and_parse("TestFiles/pytest/if_normal_test_file.COB")
assert len(node_array) == 3
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[2].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 0
#Testing the base parsing for if/else without END-IFs
def test_fuzzy_parse_base_if_no_end_if():
node_array = process_and_parse("TestFiles/pytest/if_no_end-if_test_file.COB")
assert len(node_array) == 3
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[2].get_type() == NODE_COND_END_ANY
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 0
#Testing simple branches for if/else
def test_fuzzy_parse_base_if_left_branch():
node_array = process_and_parse("TestFiles/pytest/if_left_branch_test_file.COB")
assert len(node_array) == 4
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_SQL
assert node_array[2].get_type() == NODE_COND_BRANCH
assert node_array[3].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 1
assert node_array[2].get_depth() == 0
assert node_array[3].get_depth() == 0
def test_fuzzy_parse_base_if_right_branch():
node_array = process_and_parse("TestFiles/pytest/if_right_branch_test_file.COB")
assert len(node_array) == 4
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[2].get_type() == NODE_SQL
assert node_array[3].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 1
assert node_array[3].get_depth() == 0
def test_fuzzy_parse_base_if_both_branch():
node_array = process_and_parse("TestFiles/pytest/if_both_branch_test_file.COB")
assert len(node_array) == 5
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_SQL
assert node_array[2].get_type() == NODE_COND_BRANCH
assert node_array[3].get_type() == NODE_SQL
assert node_array[4].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 1
assert node_array[2].get_depth() == 0
assert node_array[3].get_depth() == 1
assert node_array[4].get_depth() == 0
#Testing nested branches for if/else
def test_fuzzy_parse_base_if_nested_left_branch():
node_array = process_and_parse("TestFiles/pytest/if_nested_left_branch_test_file.COB")
assert len(node_array) == 5
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_START
assert node_array[2].get_type() == NODE_COND_END
assert node_array[3].get_type() == NODE_COND_BRANCH
assert node_array[4].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 1
assert node_array[2].get_depth() == 1
assert node_array[3].get_depth() == 0
	assert node_array[4].get_depth() == 0
def test_fuzzy_parse_base_if_nested_right_branch():
node_array = process_and_parse("TestFiles/pytest/if_nested_right_branch_test_file.COB")
assert len(node_array) == 5
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[2].get_type() == NODE_COND_START
assert node_array[3].get_type() == NODE_COND_END
assert node_array[4].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 1
assert node_array[3].get_depth() == 1
assert node_array[4].get_depth() == 0
def test_fuzzy_parse_base_if_nested_both_branch():
node_array = process_and_parse("TestFiles/pytest/if_nested_both_branch_test_file.COB")
assert len(node_array) == 7
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_START
assert node_array[2].get_type() == NODE_COND_END
assert node_array[3].get_type() == NODE_COND_BRANCH
assert node_array[4].get_type() == NODE_COND_START
assert node_array[5].get_type() == NODE_COND_END
assert node_array[6].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 1
assert node_array[2].get_depth() == 1
assert node_array[3].get_depth() == 0
assert node_array[4].get_depth() == 1
assert node_array[5].get_depth() == 1
assert node_array[6].get_depth() == 0
#Testing the conditions of ifs
def test_fuzzy_parse_base_if_condition():
node_array = process_and_parse("TestFiles/pytest/if_condition_base_test_file.COB")
assert len(node_array) == 2
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_END
assert node_array[0].get_condition() == "A > 0"
def test_fuzzy_parse_if_condition_two_lines():
node_array = process_and_parse("TestFiles/pytest/if_condition_two_lines_test_file.COB")
assert len(node_array) == 2
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_END
assert node_array[0].get_condition() == "A > 0 AND B = 10"
def test_fuzzy_parse_if_condition_three_lines():
node_array = process_and_parse("TestFiles/pytest/if_condition_three_lines_test_file.COB")
assert len(node_array) == 2
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_END
assert node_array[0].get_condition() == "A > 0 AND B = 10 AND C"
def test_fuzzy_parse_if_condition_case_insensitive():
node_array = process_and_parse("TestFiles/pytest/if_condition_case_insensitive_test_file.COB")
assert len(node_array) == 2
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_END
assert node_array[0].get_condition() == "A > 0 and B > 0"
#Tests for evaluate (multiple branch conditions)
def test_fuzzy_parse_simple_evaluate():
node_array = process_and_parse("TestFiles/pytest/evaluate_simple_test_file.COB")
assert len(node_array) == 8
assert node_array[0].get_type() == NODE_COND_START
assert node_array[0].get_condition() == "TRUE"
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[1].get_condition() == "A > 0"
assert node_array[3].get_type() == NODE_COND_BRANCH
assert node_array[3].get_condition() == "A < 0"
assert node_array[5].get_type() == NODE_COND_BRANCH
assert node_array[5].get_condition() == "A = 0"
assert node_array[7].get_type() == NODE_COND_END
def test_fuzzy_parse_mixed_evaluate():
node_array = process_and_parse("TestFiles/pytest/evaluate_if_mix_test_file.COB")
assert len(node_array) == 14
assert node_array[0].get_type() == NODE_COND_START
assert node_array[0].get_condition() == "TRUE"
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[1].get_condition() == "A > 0"
assert node_array[2].get_type() == NODE_COND_START
assert node_array[2].get_condition() == "B > 1"
assert node_array[4].get_type() == NODE_COND_BRANCH
assert node_array[5].get_type() == NODE_COND_END
assert node_array[6].get_type() == NODE_COND_BRANCH
assert node_array[6].get_condition() == "A < 0"
assert node_array[7].get_type() == NODE_COND_START
assert node_array[7].get_condition() == "B > 2"
assert node_array[8].get_type() == NODE_COND_BRANCH
assert node_array[10].get_type() == NODE_COND_END
assert node_array[11].get_type() == NODE_COND_BRANCH
assert node_array[11].get_condition() == "A = 0"
assert node_array[13].get_type() == NODE_COND_END
#Testing the comment part of parsing
def test_fuzzy_parse_comment():
node_array = process_and_parse("TestFiles/pytest/comment_normal_test_file.COB")
assert len(node_array) == 5
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[2].get_type() == NODE_COND_END
assert node_array[3].get_type() == NODE_COND_START
assert node_array[4].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 0
assert node_array[3].get_depth() == 0
assert node_array[4].get_depth() == 0
#Testing the string part of parsing, normal case
def test_fuzzy_parse_string():
node_array = process_and_parse("TestFiles/pytest/string_normal_test_file.COB")
assert len(node_array) == 3
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[2].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 0
#Testing the string part of parsing, special case with multiple line strings
def test_fuzzy_parse_string_on_multiple_lines():
node_array = process_and_parse("TestFiles/pytest/string_on_two_lines_test_file.COB")
assert len(node_array) == 3
assert node_array[0].get_type() == NODE_COND_START
assert node_array[1].get_type() == NODE_COND_BRANCH
assert node_array[2].get_type() == NODE_COND_END
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 0
#Testing the exec SQL part of parsing, all case insensitive possibilities
def test_fuzzy_parse_exec_case_insensitive():
node_array = process_and_parse("TestFiles/pytest/exec_sql_all_forms_test_file.COB")
assert len(node_array) == 4
assert node_array[0].get_type() == NODE_SQL
assert node_array[1].get_type() == NODE_SQL
assert node_array[2].get_type() == NODE_SQL
assert node_array[3].get_type() == NODE_SQL
assert node_array[0].get_depth() == 0
assert node_array[1].get_depth() == 0
assert node_array[2].get_depth() == 0
assert node_array[3].get_depth() == 0
#Testing the next sentence instruction (control loop)
def test_fuzzy_parse_next_sentence():
node_array = process_and_parse("TestFiles/pytest/next_sentence_test_file.COB")
assert len(node_array) == 7
assert node_array[0].get_type() == NODE_COND_START
assert node_array[0].get_condition() == "A > 0"
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_COND_BRANCH
assert node_array[3].get_type() == NODE_COND_END
assert node_array[4].get_type() == NODE_COND_END_ANY
assert node_array[5].get_type() == NODE_COND_START
assert node_array[6].get_type() == NODE_COND_END
def test_fuzzy_parse_next_sentence_v2():
node_array = process_and_parse("TestFiles/pytest/next_sentence_v2_test_file.COB")
assert len(node_array) == 7
assert node_array[0].get_type() == NODE_COND_START
assert node_array[0].get_condition() == "A > 0"
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_COND_BRANCH
assert node_array[3].get_type() == NODE_COND_END
assert node_array[4].get_type() == NODE_COND_START
assert node_array[5].get_type() == NODE_COND_END
assert node_array[6].get_type() == NODE_COND_END_ANY
#Testing the simple (single label) perform
def test_fuzzy_parse_base_perform():
node_array = process_and_parse("TestFiles/pytest/perform_base_test_file.COB")
assert len(node_array) == 8
assert node_array[0].get_type() == NODE_LABEL
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_LABEL
assert node_array[3].get_type() == NODE_LABEL
assert node_array[4].get_type() == NODE_LABEL
assert node_array[5].get_type() == NODE_LABEL
assert node_array[6].get_type() == NODE_LABEL
assert node_array[7].get_type() == NODE_LABEL
def test_fuzzy_parse_chained_perform():
node_array = process_and_parse("TestFiles/pytest/perform_chained_test_file.COB")
assert len(node_array) == 9
assert node_array[0].get_type() == NODE_LABEL
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_LABEL
assert node_array[3].get_type() == NODE_LABEL
assert node_array[4].get_type() == NODE_LOOP
assert node_array[5].get_type() == NODE_LABEL
	assert node_array[6].get_type() == NODE_LABEL
assert node_array[7].get_type() == NODE_LABEL
assert node_array[8].get_type() == NODE_LABEL
def test_fuzzy_parse_broken_perform():
node_array = process_and_parse("TestFiles/pytest/perform_broken_goto_test_file.COB")
assert len(node_array) == 9
assert node_array[0].get_type() == NODE_LABEL
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_LABEL
assert node_array[3].get_type() == NODE_LABEL
assert node_array[4].get_type() == NODE_LOOP
assert node_array[5].get_type() == NODE_LABEL
	assert node_array[6].get_type() == NODE_LABEL
assert node_array[7].get_type() == NODE_LABEL
assert node_array[8].get_type() == NODE_LABEL
#Testing the perform thru (multiple label goback)
def test_fuzzy_parse_base_perform_thru():
node_array = process_and_parse("TestFiles/pytest/performThru_base_test_file.COB")
assert len(node_array) == 8
assert node_array[0].get_type() == NODE_LABEL
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_LABEL
assert node_array[3].get_type() == NODE_LABEL
assert node_array[4].get_type() == NODE_LABEL
assert node_array[5].get_type() == NODE_LABEL
assert node_array[6].get_type() == NODE_LABEL
assert node_array[7].get_type() == NODE_LABEL
def test_fuzzy_parse_chained_perform_thru():
node_array = process_and_parse("TestFiles/pytest/performThru_chained_test_file.COB")
assert len(node_array) == 9
assert node_array[0].get_type() == NODE_LABEL
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_LABEL
assert node_array[3].get_type() == NODE_LABEL
assert node_array[4].get_type() == NODE_LOOP
assert node_array[5].get_type() == NODE_LABEL
	assert node_array[6].get_type() == NODE_LABEL
assert node_array[7].get_type() == NODE_LABEL
assert node_array[8].get_type() == NODE_LABEL
def test_fuzzy_parse_goto_perform_thru():
node_array = process_and_parse("TestFiles/pytest/performThru_goto_test_file.COB")
assert len(node_array) == 9
assert node_array[0].get_type() == NODE_LABEL
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_LABEL
assert node_array[3].get_type() == NODE_LABEL
assert node_array[4].get_type() == NODE_LOOP
assert node_array[5].get_type() == NODE_LABEL
	assert node_array[6].get_type() == NODE_LABEL
assert node_array[7].get_type() == NODE_LABEL
assert node_array[8].get_type() == NODE_LABEL
def test_fuzzy_parse_broken_perform_thru():
node_array = process_and_parse("TestFiles/pytest/performThru_broken_goto_test_file.COB")
assert len(node_array) == 9
assert node_array[0].get_type() == NODE_LABEL
assert node_array[1].get_type() == NODE_LOOP
assert node_array[2].get_type() == NODE_LABEL
assert node_array[3].get_type() == NODE_LABEL
assert node_array[4].get_type() == NODE_LOOP
assert node_array[5].get_type() == NODE_LABEL
	assert node_array[6].get_type() == NODE_LABEL
assert node_array[7].get_type() == NODE_LABEL
assert node_array[8].get_type() == NODE_LABEL
# apis.py (caiden3/cslq, Apache-2.0)
print("452")
# tests/test_fs_mod.py (jpfxgood/bkp, MIT)
import os
from bkp_test_util import fs_testdir
from bkp_core import fs_mod
from io import StringIO
import time
import re
import math
def test_fs_mod_ssh(fs_testdir):
""" test suite for the fs_mod module covering sftp functionality """
def get_config():
""" return config settings """
return fs_testdir
def ssh_path(filename):
""" return the ssh remote path for a given filename """
return fs_testdir["ssh_basepath"]+'/'+filename
def local_path(filename):
""" return the local path for a given filename """
return os.path.join(fs_testdir["local_path"],filename)
for f in fs_testdir["remote_files"]:
assert(fs_mod.fs_stat(ssh_path(f),get_config) == fs_testdir["remote_files_stats"][f])
test_filename = local_path(fs_testdir["local_files"][1])
test_remote_filename = ssh_path(fs_testdir["local_files"][1])
assert(fs_mod.fs_test(test_remote_filename,False,get_config))
assert(fs_mod.fs_test(test_filename,False,get_config))
file_stat = fs_mod.fs_stat(test_filename)
fs_mod.fs_put(test_filename,test_remote_filename,get_config)
fs_mod.fs_utime(test_remote_filename,(file_stat[0],file_stat[0]),get_config)
assert(fs_mod.fs_stat(test_remote_filename,get_config) == file_stat)
fs_mod.fs_del(test_remote_filename,False,get_config)
assert(fs_mod.fs_stat(test_remote_filename,get_config) == (-1,-1))
test_filename = local_path(fs_testdir["remote_files"][2])
test_remote_filename = ssh_path(fs_testdir["remote_files"][2])
file_stat = fs_mod.fs_stat(test_remote_filename,get_config)
fs_mod.fs_get(test_remote_filename, test_filename, get_config)
fs_mod.fs_utime(test_filename,(file_stat[0],file_stat[0]))
assert(fs_mod.fs_stat(test_filename) == file_stat)
remote_count = 0
local_count = 0
remote_list = StringIO(fs_mod.fs_ls(fs_testdir["ssh_basepath"], False, get_config))
for l in remote_list:
mtime = math.floor(time.mktime(time.strptime(l[:16],"%Y-%m-%d %H:%M")))
parts = re.split(r"\s+",l,3)
size = int(parts[2])
file_name = os.path.basename(parts[3]).rstrip()
if file_name in fs_testdir["remote_files"]:
assert(fs_testdir["remote_files_stats"][file_name][1] == size)
remote_count += 1
elif file_name in fs_testdir["local_files"]:
local_count += 1
assert(remote_count == 5 and local_count == 0)
def test_fs_mod_s3(fs_testdir):
""" test suite for the fs_mod module covering s3 functionality """
def get_config():
""" return config settings """
return fs_testdir
def s3_path(filename):
""" return the s3 remote path for a given filename """
return fs_testdir["s3_basepath"]+'/'+filename
def local_path(filename):
""" return the local path for a given filename """
return os.path.join(fs_testdir["local_path"],filename)
for f in fs_testdir["remote_files"]:
assert(fs_mod.fs_stat(s3_path(f),get_config) == fs_testdir["remote_files_stats"][f])
test_filename = local_path(fs_testdir["local_files"][1])
test_remote_filename = s3_path(fs_testdir["local_files"][1])
assert(fs_mod.fs_test(test_remote_filename,False,get_config))
assert(fs_mod.fs_test(test_filename,False,get_config))
file_stat = fs_mod.fs_stat(test_filename)
fs_mod.fs_put(test_filename,test_remote_filename,get_config)
fs_mod.fs_utime(test_remote_filename,(file_stat[0],file_stat[0]),get_config)
assert(fs_mod.fs_stat(test_remote_filename,get_config) == file_stat)
fs_mod.fs_del(test_remote_filename,False,get_config)
assert(fs_mod.fs_stat(test_remote_filename,get_config) == (-1,-1))
test_filename = local_path(fs_testdir["remote_files"][2])
test_remote_filename = s3_path(fs_testdir["remote_files"][2])
file_stat = fs_mod.fs_stat(test_remote_filename,get_config)
fs_mod.fs_get(test_remote_filename, test_filename, get_config)
fs_mod.fs_utime(test_filename,(file_stat[0],file_stat[0]))
assert(fs_mod.fs_stat(test_filename) == file_stat)
remote_count = 0
local_count = 0
remote_list = StringIO(fs_mod.fs_ls(fs_testdir["s3_basepath"]+'/', False, get_config))
for l in remote_list:
mtime = math.floor(time.mktime(time.strptime(l[:16],"%Y-%m-%d %H:%M")))
parts = re.split(r"\s+",l,3)
size = int(parts[2])
file_name = os.path.basename(parts[3]).rstrip()
if file_name in fs_testdir["remote_files"]:
assert(fs_testdir["remote_files_stats"][file_name][1] == size)
remote_count += 1
elif file_name in fs_testdir["local_files"]:
local_count += 1
assert(remote_count == 5 and local_count == 0)
def test_fs_mod_file(fs_testdir):
""" test suite for the fs_mod module covering local file system functionality """
def get_config():
""" return config settings """
return fs_testdir
def remote_fs_path(filename):
""" return the remote filesystem path for a given filename """
return fs_testdir["remote_fs_basepath"]+'/'+filename
def local_path(filename):
""" return the local path for a given filename """
return os.path.join(fs_testdir["local_path"],filename)
for f in fs_testdir["remote_files"]:
assert(fs_mod.fs_stat(remote_fs_path(f),get_config) == fs_testdir["remote_files_stats"][f])
test_filename = local_path(fs_testdir["local_files"][1])
test_remote_filename = remote_fs_path(fs_testdir["local_files"][1])
assert(fs_mod.fs_test(test_remote_filename,False,get_config))
assert(fs_mod.fs_test(test_filename,False,get_config))
file_stat = fs_mod.fs_stat(test_filename)
fs_mod.fs_put(test_filename,test_remote_filename,get_config)
fs_mod.fs_utime(test_remote_filename,(file_stat[0],file_stat[0]),get_config)
assert(fs_mod.fs_stat(test_remote_filename,get_config) == file_stat)
fs_mod.fs_del(test_remote_filename,False,get_config)
assert(fs_mod.fs_stat(test_remote_filename,get_config) == (-1,-1))
test_filename = local_path(fs_testdir["remote_files"][2])
test_remote_filename = remote_fs_path(fs_testdir["remote_files"][2])
file_stat = fs_mod.fs_stat(test_remote_filename,get_config)
fs_mod.fs_get(test_remote_filename, test_filename, get_config)
fs_mod.fs_utime(test_filename,(file_stat[0],file_stat[0]))
assert(fs_mod.fs_stat(test_filename) == file_stat)
remote_count = 0
local_count = 0
remote_list = StringIO(fs_mod.fs_ls(fs_testdir["remote_fs_basepath"], False, get_config))
for l in remote_list:
mtime = math.floor(time.mktime(time.strptime(l[:16],"%Y-%m-%d %H:%M")))
parts = re.split(r"\s+",l,3)
size = int(parts[2])
file_name = os.path.basename(parts[3]).rstrip()
if file_name in fs_testdir["remote_files"]:
assert(fs_testdir["remote_files_stats"][file_name][1] == size)
remote_count += 1
elif file_name in fs_testdir["local_files"]:
local_count += 1
assert(remote_count == 5 and local_count == 0)
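
# Note: the three suites above (ssh, s3, local fs) repeat the same body and
# differ only in how the remote path prefix is chosen. A hedged sketch of
# collapsing that choice into one dispatch table follows; the builder keys and
# the cfg dict here are illustrative stand-ins, not the real fs_testdir fixture.

```python
# Sketch: one remote-path builder instead of three copied helpers.
# Keys mirror the fixture fields used above (assumed names, not fixture API).
PATH_BUILDERS = {
    "ssh": lambda cfg, f: cfg["ssh_basepath"] + "/" + f,
    "s3": lambda cfg, f: cfg["s3_basepath"] + "/" + f,
    "fs": lambda cfg, f: cfg["remote_fs_basepath"] + "/" + f,
}

def remote_path(kind, cfg, filename):
    """Build the remote path for the given backend kind ('ssh', 's3', 'fs')."""
    return PATH_BUILDERS[kind](cfg, filename)
```

# With pytest, the shared test body could then be parametrized over the three
# kinds (e.g. @pytest.mark.parametrize("kind", ["ssh", "s3", "fs"])) rather
# than duplicated per backend.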
# relevanceai/operations/cluster/__init__.py (RelevanceAI/RelevanceAI, Apache-2.0)
"""ClusterOps
"""
from relevanceai.operations.cluster.cluster import ClusterOps
from relevanceai.operations.cluster.sub import SubClusterOps
# stamper_targets/netronome/sdk6_rte/RunTimeEnvironment.py (TuDatTr/P4STA, Apache-2.0)
#
# Autogenerated by Thrift Compiler (0.12.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(object):
"""
************************************************************
Service *
************************************************************
"""
def sys_ping(self):
pass
def sys_echo(self, echo_msg):
"""
Parameters:
- echo_msg
"""
pass
def sys_shutdown(self):
pass
def sys_version_get(self):
pass
def sys_log_level_get(self):
pass
def sys_log_level_set(self, level):
"""
Parameters:
- level
"""
pass
def design_load(self, arguments):
"""
Parameters:
- arguments
"""
pass
def design_unload(self):
pass
def design_reconfig(self, pif_config_json):
"""
Parameters:
- pif_config_json
"""
pass
def design_load_status(self):
pass
def ports_info_retrieve(self):
pass
def table_list_all(self):
pass
def table_entry_add(self, tbl_id, entry):
"""
Parameters:
- tbl_id
- entry
"""
pass
def table_entry_edit(self, tbl_id, entry):
"""
Parameters:
- tbl_id
- entry
"""
pass
def table_entry_delete(self, tbl_id, entry):
"""
Parameters:
- tbl_id
- entry
"""
pass
def table_retrieve(self, tbl_id):
"""
Parameters:
- tbl_id
"""
pass
def table_version_get(self):
pass
def parser_value_set_list_all(self):
pass
def parser_value_set_add(self, pvs_id, entries):
"""
Parameters:
- pvs_id
- entries
"""
pass
def parser_value_set_clear(self, pvs_id):
"""
Parameters:
- pvs_id
"""
pass
def parser_value_set_retrieve(self, pvs_id):
"""
Parameters:
- pvs_id
"""
pass
def p4_counter_list_all(self):
pass
def p4_counter_retrieve(self, counter_id):
"""
Parameters:
- counter_id
"""
pass
def p4_counter_clear(self, counter_id):
"""
Parameters:
- counter_id
"""
pass
def p4_counter_clear_all(self):
pass
def register_list_all(self):
pass
def register_retrieve(self, regarr):
"""
Parameters:
- regarr
"""
pass
def register_clear(self, regarr):
"""
Parameters:
- regarr
"""
pass
def register_field_set(self, regarr, field_id, value):
"""
Parameters:
- regarr
- field_id
- value
"""
pass
def register_set(self, regarr, values):
"""
Parameters:
- regarr
- values
"""
pass
def sys_counter_retrieve_all(self):
pass
def sys_counter_clear_all(self):
pass
def mcast_config_get_all(self):
pass
def mcast_config_get(self, mcast_group):
"""
Parameters:
- mcast_group
"""
pass
def mcast_config_set(self, cfg):
"""
Parameters:
- cfg
"""
pass
def meter_list_all(self):
pass
def meter_config_get(self, meter_id):
"""
Parameters:
- meter_id
"""
pass
def meter_config_set(self, meter_id, cfgs):
"""
Parameters:
- meter_id
- cfgs
"""
pass
def digest_list_all(self):
pass
def digest_register(self, digest_id):
"""
Parameters:
- digest_id
"""
pass
def digest_deregister(self, digest_regid):
"""
Parameters:
- digest_regid
"""
pass
def digest_retrieve(self, digest_regid):
"""
Parameters:
- digest_regid
"""
pass
def traffic_class_set(self, port_id, cfgs):
"""
Parameters:
- port_id
- cfgs
"""
pass
def traffic_class_commit(self, port_id):
"""
Parameters:
- port_id
"""
pass
def traffic_class_get(self, port_id):
"""
Parameters:
- port_id
"""
pass
def debugctl(self, debug_id, debug_data):
"""
Parameters:
- debug_id
- debug_data
"""
pass
class Client(Iface):
"""
************************************************************
Service *
************************************************************
"""
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def sys_ping(self):
self.send_sys_ping()
return self.recv_sys_ping()
def send_sys_ping(self):
self._oprot.writeMessageBegin('sys_ping', TMessageType.CALL, self._seqid)
args = sys_ping_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sys_ping(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sys_ping_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "sys_ping failed: unknown result")
def sys_echo(self, echo_msg):
"""
Parameters:
- echo_msg
"""
self.send_sys_echo(echo_msg)
self.recv_sys_echo()
def send_sys_echo(self, echo_msg):
self._oprot.writeMessageBegin('sys_echo', TMessageType.CALL, self._seqid)
args = sys_echo_args()
args.echo_msg = echo_msg
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sys_echo(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sys_echo_result()
result.read(iprot)
iprot.readMessageEnd()
return
def sys_shutdown(self):
self.send_sys_shutdown()
def send_sys_shutdown(self):
self._oprot.writeMessageBegin('sys_shutdown', TMessageType.ONEWAY, self._seqid)
args = sys_shutdown_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def sys_version_get(self):
self.send_sys_version_get()
return self.recv_sys_version_get()
def send_sys_version_get(self):
self._oprot.writeMessageBegin('sys_version_get', TMessageType.CALL, self._seqid)
args = sys_version_get_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sys_version_get(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sys_version_get_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "sys_version_get failed: unknown result")
def sys_log_level_get(self):
self.send_sys_log_level_get()
return self.recv_sys_log_level_get()
def send_sys_log_level_get(self):
self._oprot.writeMessageBegin('sys_log_level_get', TMessageType.CALL, self._seqid)
args = sys_log_level_get_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sys_log_level_get(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sys_log_level_get_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "sys_log_level_get failed: unknown result")
def sys_log_level_set(self, level):
"""
Parameters:
- level
"""
self.send_sys_log_level_set(level)
return self.recv_sys_log_level_set()
def send_sys_log_level_set(self, level):
self._oprot.writeMessageBegin('sys_log_level_set', TMessageType.CALL, self._seqid)
args = sys_log_level_set_args()
args.level = level
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sys_log_level_set(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sys_log_level_set_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "sys_log_level_set failed: unknown result")
def design_load(self, arguments):
"""
Parameters:
- arguments
"""
self.send_design_load(arguments)
return self.recv_design_load()
def send_design_load(self, arguments):
self._oprot.writeMessageBegin('design_load', TMessageType.CALL, self._seqid)
args = design_load_args()
args.arguments = arguments
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_design_load(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = design_load_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "design_load failed: unknown result")
def design_unload(self):
self.send_design_unload()
return self.recv_design_unload()
def send_design_unload(self):
self._oprot.writeMessageBegin('design_unload', TMessageType.CALL, self._seqid)
args = design_unload_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_design_unload(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = design_unload_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "design_unload failed: unknown result")
def design_reconfig(self, pif_config_json):
"""
Parameters:
- pif_config_json
"""
self.send_design_reconfig(pif_config_json)
return self.recv_design_reconfig()
def send_design_reconfig(self, pif_config_json):
self._oprot.writeMessageBegin('design_reconfig', TMessageType.CALL, self._seqid)
args = design_reconfig_args()
args.pif_config_json = pif_config_json
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_design_reconfig(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = design_reconfig_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "design_reconfig failed: unknown result")
def design_load_status(self):
self.send_design_load_status()
return self.recv_design_load_status()
def send_design_load_status(self):
self._oprot.writeMessageBegin('design_load_status', TMessageType.CALL, self._seqid)
args = design_load_status_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_design_load_status(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = design_load_status_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "design_load_status failed: unknown result")
def ports_info_retrieve(self):
self.send_ports_info_retrieve()
return self.recv_ports_info_retrieve()
def send_ports_info_retrieve(self):
self._oprot.writeMessageBegin('ports_info_retrieve', TMessageType.CALL, self._seqid)
args = ports_info_retrieve_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_ports_info_retrieve(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = ports_info_retrieve_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "ports_info_retrieve failed: unknown result")
def table_list_all(self):
self.send_table_list_all()
return self.recv_table_list_all()
def send_table_list_all(self):
self._oprot.writeMessageBegin('table_list_all', TMessageType.CALL, self._seqid)
args = table_list_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_table_list_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = table_list_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "table_list_all failed: unknown result")
def table_entry_add(self, tbl_id, entry):
"""
Parameters:
- tbl_id
- entry
"""
self.send_table_entry_add(tbl_id, entry)
return self.recv_table_entry_add()
def send_table_entry_add(self, tbl_id, entry):
self._oprot.writeMessageBegin('table_entry_add', TMessageType.CALL, self._seqid)
args = table_entry_add_args()
args.tbl_id = tbl_id
args.entry = entry
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_table_entry_add(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = table_entry_add_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "table_entry_add failed: unknown result")
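# A hedged usage sketch for the table_* family ('client' is assumed to be a
# connected Client; the tbl_id value and the entry struct's type and fields
# come from the matching generated ttypes module and are assumptions here):
#
#   tables = client.table_list_all()          # enumerate available tables
#   client.table_entry_add(tbl_id, entry)     # insert one match/action entry
#   client.table_entry_delete(tbl_id, entry)  # remove it again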
def table_entry_edit(self, tbl_id, entry):
"""
Parameters:
- tbl_id
- entry
"""
self.send_table_entry_edit(tbl_id, entry)
return self.recv_table_entry_edit()
def send_table_entry_edit(self, tbl_id, entry):
self._oprot.writeMessageBegin('table_entry_edit', TMessageType.CALL, self._seqid)
args = table_entry_edit_args()
args.tbl_id = tbl_id
args.entry = entry
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_table_entry_edit(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = table_entry_edit_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "table_entry_edit failed: unknown result")
def table_entry_delete(self, tbl_id, entry):
"""
Parameters:
- tbl_id
- entry
"""
self.send_table_entry_delete(tbl_id, entry)
return self.recv_table_entry_delete()
def send_table_entry_delete(self, tbl_id, entry):
self._oprot.writeMessageBegin('table_entry_delete', TMessageType.CALL, self._seqid)
args = table_entry_delete_args()
args.tbl_id = tbl_id
args.entry = entry
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_table_entry_delete(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = table_entry_delete_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "table_entry_delete failed: unknown result")
def table_retrieve(self, tbl_id):
"""
Parameters:
- tbl_id
"""
self.send_table_retrieve(tbl_id)
return self.recv_table_retrieve()
def send_table_retrieve(self, tbl_id):
self._oprot.writeMessageBegin('table_retrieve', TMessageType.CALL, self._seqid)
args = table_retrieve_args()
args.tbl_id = tbl_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_table_retrieve(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = table_retrieve_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "table_retrieve failed: unknown result")
def table_version_get(self):
self.send_table_version_get()
return self.recv_table_version_get()
def send_table_version_get(self):
self._oprot.writeMessageBegin('table_version_get', TMessageType.CALL, self._seqid)
args = table_version_get_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_table_version_get(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = table_version_get_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "table_version_get failed: unknown result")
def parser_value_set_list_all(self):
self.send_parser_value_set_list_all()
return self.recv_parser_value_set_list_all()
def send_parser_value_set_list_all(self):
self._oprot.writeMessageBegin('parser_value_set_list_all', TMessageType.CALL, self._seqid)
args = parser_value_set_list_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_parser_value_set_list_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = parser_value_set_list_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "parser_value_set_list_all failed: unknown result")
def parser_value_set_add(self, pvs_id, entries):
"""
Parameters:
- pvs_id
- entries
"""
self.send_parser_value_set_add(pvs_id, entries)
return self.recv_parser_value_set_add()
def send_parser_value_set_add(self, pvs_id, entries):
self._oprot.writeMessageBegin('parser_value_set_add', TMessageType.CALL, self._seqid)
args = parser_value_set_add_args()
args.pvs_id = pvs_id
args.entries = entries
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_parser_value_set_add(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = parser_value_set_add_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "parser_value_set_add failed: unknown result")
def parser_value_set_clear(self, pvs_id):
"""
Parameters:
- pvs_id
"""
self.send_parser_value_set_clear(pvs_id)
return self.recv_parser_value_set_clear()
def send_parser_value_set_clear(self, pvs_id):
self._oprot.writeMessageBegin('parser_value_set_clear', TMessageType.CALL, self._seqid)
args = parser_value_set_clear_args()
args.pvs_id = pvs_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_parser_value_set_clear(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = parser_value_set_clear_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "parser_value_set_clear failed: unknown result")
def parser_value_set_retrieve(self, pvs_id):
"""
Parameters:
- pvs_id
"""
self.send_parser_value_set_retrieve(pvs_id)
return self.recv_parser_value_set_retrieve()
def send_parser_value_set_retrieve(self, pvs_id):
self._oprot.writeMessageBegin('parser_value_set_retrieve', TMessageType.CALL, self._seqid)
args = parser_value_set_retrieve_args()
args.pvs_id = pvs_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_parser_value_set_retrieve(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = parser_value_set_retrieve_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "parser_value_set_retrieve failed: unknown result")
def p4_counter_list_all(self):
self.send_p4_counter_list_all()
return self.recv_p4_counter_list_all()
def send_p4_counter_list_all(self):
self._oprot.writeMessageBegin('p4_counter_list_all', TMessageType.CALL, self._seqid)
args = p4_counter_list_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_p4_counter_list_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = p4_counter_list_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "p4_counter_list_all failed: unknown result")
def p4_counter_retrieve(self, counter_id):
"""
Parameters:
- counter_id
"""
self.send_p4_counter_retrieve(counter_id)
return self.recv_p4_counter_retrieve()
def send_p4_counter_retrieve(self, counter_id):
self._oprot.writeMessageBegin('p4_counter_retrieve', TMessageType.CALL, self._seqid)
args = p4_counter_retrieve_args()
args.counter_id = counter_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_p4_counter_retrieve(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = p4_counter_retrieve_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "p4_counter_retrieve failed: unknown result")
def p4_counter_clear(self, counter_id):
"""
Parameters:
- counter_id
"""
self.send_p4_counter_clear(counter_id)
return self.recv_p4_counter_clear()
def send_p4_counter_clear(self, counter_id):
self._oprot.writeMessageBegin('p4_counter_clear', TMessageType.CALL, self._seqid)
args = p4_counter_clear_args()
args.counter_id = counter_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_p4_counter_clear(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = p4_counter_clear_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "p4_counter_clear failed: unknown result")
def p4_counter_clear_all(self):
self.send_p4_counter_clear_all()
return self.recv_p4_counter_clear_all()
def send_p4_counter_clear_all(self):
self._oprot.writeMessageBegin('p4_counter_clear_all', TMessageType.CALL, self._seqid)
args = p4_counter_clear_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_p4_counter_clear_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = p4_counter_clear_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "p4_counter_clear_all failed: unknown result")
def register_list_all(self):
self.send_register_list_all()
return self.recv_register_list_all()
def send_register_list_all(self):
self._oprot.writeMessageBegin('register_list_all', TMessageType.CALL, self._seqid)
args = register_list_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_register_list_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = register_list_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "register_list_all failed: unknown result")
def register_retrieve(self, regarr):
"""
Parameters:
- regarr
"""
self.send_register_retrieve(regarr)
return self.recv_register_retrieve()
def send_register_retrieve(self, regarr):
self._oprot.writeMessageBegin('register_retrieve', TMessageType.CALL, self._seqid)
args = register_retrieve_args()
args.regarr = regarr
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_register_retrieve(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = register_retrieve_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "register_retrieve failed: unknown result")
def register_clear(self, regarr):
"""
Parameters:
- regarr
"""
self.send_register_clear(regarr)
return self.recv_register_clear()
def send_register_clear(self, regarr):
self._oprot.writeMessageBegin('register_clear', TMessageType.CALL, self._seqid)
args = register_clear_args()
args.regarr = regarr
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_register_clear(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = register_clear_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "register_clear failed: unknown result")
def register_field_set(self, regarr, field_id, value):
"""
Parameters:
- regarr
- field_id
- value
"""
self.send_register_field_set(regarr, field_id, value)
return self.recv_register_field_set()
def send_register_field_set(self, regarr, field_id, value):
self._oprot.writeMessageBegin('register_field_set', TMessageType.CALL, self._seqid)
args = register_field_set_args()
args.regarr = regarr
args.field_id = field_id
args.value = value
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_register_field_set(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = register_field_set_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "register_field_set failed: unknown result")
def register_set(self, regarr, values):
"""
Parameters:
- regarr
- values
"""
self.send_register_set(regarr, values)
return self.recv_register_set()
def send_register_set(self, regarr, values):
self._oprot.writeMessageBegin('register_set', TMessageType.CALL, self._seqid)
args = register_set_args()
args.regarr = regarr
args.values = values
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_register_set(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = register_set_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "register_set failed: unknown result")
def sys_counter_retrieve_all(self):
self.send_sys_counter_retrieve_all()
return self.recv_sys_counter_retrieve_all()
def send_sys_counter_retrieve_all(self):
self._oprot.writeMessageBegin('sys_counter_retrieve_all', TMessageType.CALL, self._seqid)
args = sys_counter_retrieve_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sys_counter_retrieve_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sys_counter_retrieve_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "sys_counter_retrieve_all failed: unknown result")
def sys_counter_clear_all(self):
self.send_sys_counter_clear_all()
return self.recv_sys_counter_clear_all()
def send_sys_counter_clear_all(self):
self._oprot.writeMessageBegin('sys_counter_clear_all', TMessageType.CALL, self._seqid)
args = sys_counter_clear_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sys_counter_clear_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sys_counter_clear_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "sys_counter_clear_all failed: unknown result")
def mcast_config_get_all(self):
self.send_mcast_config_get_all()
return self.recv_mcast_config_get_all()
def send_mcast_config_get_all(self):
self._oprot.writeMessageBegin('mcast_config_get_all', TMessageType.CALL, self._seqid)
args = mcast_config_get_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_mcast_config_get_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = mcast_config_get_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "mcast_config_get_all failed: unknown result")
def mcast_config_get(self, mcast_group):
"""
Parameters:
- mcast_group
"""
self.send_mcast_config_get(mcast_group)
return self.recv_mcast_config_get()
def send_mcast_config_get(self, mcast_group):
self._oprot.writeMessageBegin('mcast_config_get', TMessageType.CALL, self._seqid)
args = mcast_config_get_args()
args.mcast_group = mcast_group
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_mcast_config_get(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = mcast_config_get_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "mcast_config_get failed: unknown result")
def mcast_config_set(self, cfg):
"""
Parameters:
- cfg
"""
self.send_mcast_config_set(cfg)
return self.recv_mcast_config_set()
def send_mcast_config_set(self, cfg):
self._oprot.writeMessageBegin('mcast_config_set', TMessageType.CALL, self._seqid)
args = mcast_config_set_args()
args.cfg = cfg
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_mcast_config_set(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = mcast_config_set_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "mcast_config_set failed: unknown result")
def meter_list_all(self):
self.send_meter_list_all()
return self.recv_meter_list_all()
def send_meter_list_all(self):
self._oprot.writeMessageBegin('meter_list_all', TMessageType.CALL, self._seqid)
args = meter_list_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_meter_list_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = meter_list_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "meter_list_all failed: unknown result")
def meter_config_get(self, meter_id):
"""
Parameters:
- meter_id
"""
self.send_meter_config_get(meter_id)
return self.recv_meter_config_get()
def send_meter_config_get(self, meter_id):
self._oprot.writeMessageBegin('meter_config_get', TMessageType.CALL, self._seqid)
args = meter_config_get_args()
args.meter_id = meter_id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_meter_config_get(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = meter_config_get_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "meter_config_get failed: unknown result")
def meter_config_set(self, meter_id, cfgs):
"""
Parameters:
- meter_id
- cfgs
"""
self.send_meter_config_set(meter_id, cfgs)
return self.recv_meter_config_set()
def send_meter_config_set(self, meter_id, cfgs):
self._oprot.writeMessageBegin('meter_config_set', TMessageType.CALL, self._seqid)
args = meter_config_set_args()
args.meter_id = meter_id
args.cfgs = cfgs
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_meter_config_set(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = meter_config_set_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "meter_config_set failed: unknown result")
def digest_list_all(self):
self.send_digest_list_all()
return self.recv_digest_list_all()
def send_digest_list_all(self):
self._oprot.writeMessageBegin('digest_list_all', TMessageType.CALL, self._seqid)
args = digest_list_all_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_digest_list_all(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = digest_list_all_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "digest_list_all failed: unknown result")
    def digest_register(self, digest_id):
        """
        Parameters:
         - digest_id
        """
        self.send_digest_register(digest_id)
        return self.recv_digest_register()

    def send_digest_register(self, digest_id):
        self._oprot.writeMessageBegin('digest_register', TMessageType.CALL, self._seqid)
        args = digest_register_args()
        args.digest_id = digest_id
        args.write(self._oprot)
        self._oprot.writeMessageEnd()
        self._oprot.trans.flush()

    def recv_digest_register(self):
        iprot = self._iprot
        (fname, mtype, rseqid) = iprot.readMessageBegin()
        if mtype == TMessageType.EXCEPTION:
            x = TApplicationException()
            x.read(iprot)
            iprot.readMessageEnd()
            raise x
        result = digest_register_result()
        result.read(iprot)
        iprot.readMessageEnd()
        if result.success is not None:
            return result.success
        raise TApplicationException(TApplicationException.MISSING_RESULT, "digest_register failed: unknown result")

    def digest_deregister(self, digest_regid):
        """
        Parameters:
         - digest_regid
        """
        self.send_digest_deregister(digest_regid)
        return self.recv_digest_deregister()

    def send_digest_deregister(self, digest_regid):
        self._oprot.writeMessageBegin('digest_deregister', TMessageType.CALL, self._seqid)
        args = digest_deregister_args()
        args.digest_regid = digest_regid
        args.write(self._oprot)
        self._oprot.writeMessageEnd()
        self._oprot.trans.flush()

    def recv_digest_deregister(self):
        iprot = self._iprot
        (fname, mtype, rseqid) = iprot.readMessageBegin()
        if mtype == TMessageType.EXCEPTION:
            x = TApplicationException()
            x.read(iprot)
            iprot.readMessageEnd()
            raise x
        result = digest_deregister_result()
        result.read(iprot)
        iprot.readMessageEnd()
        if result.success is not None:
            return result.success
        raise TApplicationException(TApplicationException.MISSING_RESULT, "digest_deregister failed: unknown result")

    def digest_retrieve(self, digest_regid):
        """
        Parameters:
         - digest_regid
        """
        self.send_digest_retrieve(digest_regid)
        return self.recv_digest_retrieve()

    def send_digest_retrieve(self, digest_regid):
        self._oprot.writeMessageBegin('digest_retrieve', TMessageType.CALL, self._seqid)
        args = digest_retrieve_args()
        args.digest_regid = digest_regid
        args.write(self._oprot)
        self._oprot.writeMessageEnd()
        self._oprot.trans.flush()

    def recv_digest_retrieve(self):
        iprot = self._iprot
        (fname, mtype, rseqid) = iprot.readMessageBegin()
        if mtype == TMessageType.EXCEPTION:
            x = TApplicationException()
            x.read(iprot)
            iprot.readMessageEnd()
            raise x
        result = digest_retrieve_result()
        result.read(iprot)
        iprot.readMessageEnd()
        if result.success is not None:
            return result.success
        raise TApplicationException(TApplicationException.MISSING_RESULT, "digest_retrieve failed: unknown result")

    def traffic_class_set(self, port_id, cfgs):
        """
        Parameters:
         - port_id
         - cfgs
        """
        self.send_traffic_class_set(port_id, cfgs)
        return self.recv_traffic_class_set()

    def send_traffic_class_set(self, port_id, cfgs):
        self._oprot.writeMessageBegin('traffic_class_set', TMessageType.CALL, self._seqid)
        args = traffic_class_set_args()
        args.port_id = port_id
        args.cfgs = cfgs
        args.write(self._oprot)
        self._oprot.writeMessageEnd()
        self._oprot.trans.flush()

    def recv_traffic_class_set(self):
        iprot = self._iprot
        (fname, mtype, rseqid) = iprot.readMessageBegin()
        if mtype == TMessageType.EXCEPTION:
            x = TApplicationException()
            x.read(iprot)
            iprot.readMessageEnd()
            raise x
        result = traffic_class_set_result()
        result.read(iprot)
        iprot.readMessageEnd()
        if result.success is not None:
            return result.success
        raise TApplicationException(TApplicationException.MISSING_RESULT, "traffic_class_set failed: unknown result")

    def traffic_class_commit(self, port_id):
        """
        Parameters:
         - port_id
        """
        self.send_traffic_class_commit(port_id)
        return self.recv_traffic_class_commit()

    def send_traffic_class_commit(self, port_id):
        self._oprot.writeMessageBegin('traffic_class_commit', TMessageType.CALL, self._seqid)
        args = traffic_class_commit_args()
        args.port_id = port_id
        args.write(self._oprot)
        self._oprot.writeMessageEnd()
        self._oprot.trans.flush()

    def recv_traffic_class_commit(self):
        iprot = self._iprot
        (fname, mtype, rseqid) = iprot.readMessageBegin()
        if mtype == TMessageType.EXCEPTION:
            x = TApplicationException()
            x.read(iprot)
            iprot.readMessageEnd()
            raise x
        result = traffic_class_commit_result()
        result.read(iprot)
        iprot.readMessageEnd()
        if result.success is not None:
            return result.success
        raise TApplicationException(TApplicationException.MISSING_RESULT, "traffic_class_commit failed: unknown result")

    def traffic_class_get(self, port_id):
        """
        Parameters:
         - port_id
        """
        self.send_traffic_class_get(port_id)
        return self.recv_traffic_class_get()

    def send_traffic_class_get(self, port_id):
        self._oprot.writeMessageBegin('traffic_class_get', TMessageType.CALL, self._seqid)
        args = traffic_class_get_args()
        args.port_id = port_id
        args.write(self._oprot)
        self._oprot.writeMessageEnd()
        self._oprot.trans.flush()

    def recv_traffic_class_get(self):
        iprot = self._iprot
        (fname, mtype, rseqid) = iprot.readMessageBegin()
        if mtype == TMessageType.EXCEPTION:
            x = TApplicationException()
            x.read(iprot)
            iprot.readMessageEnd()
            raise x
        result = traffic_class_get_result()
        result.read(iprot)
        iprot.readMessageEnd()
        if result.success is not None:
            return result.success
        raise TApplicationException(TApplicationException.MISSING_RESULT, "traffic_class_get failed: unknown result")

    def debugctl(self, debug_id, debug_data):
        """
        Parameters:
         - debug_id
         - debug_data
        """
        self.send_debugctl(debug_id, debug_data)
        return self.recv_debugctl()

    def send_debugctl(self, debug_id, debug_data):
        self._oprot.writeMessageBegin('debugctl', TMessageType.CALL, self._seqid)
        args = debugctl_args()
        args.debug_id = debug_id
        args.debug_data = debug_data
        args.write(self._oprot)
        self._oprot.writeMessageEnd()
        self._oprot.trans.flush()

    def recv_debugctl(self):
        iprot = self._iprot
        (fname, mtype, rseqid) = iprot.readMessageBegin()
        if mtype == TMessageType.EXCEPTION:
            x = TApplicationException()
            x.read(iprot)
            iprot.readMessageEnd()
            raise x
        result = debugctl_result()
        result.read(iprot)
        iprot.readMessageEnd()
        if result.success is not None:
            return result.success
        raise TApplicationException(TApplicationException.MISSING_RESULT, "debugctl failed: unknown result")
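
# ---------------------------------------------------------------------------
# NOTE (illustrative sketch, not generated code): each client method above is
# split into a send_<name>() half that serializes the request and flushes the
# transport, and a recv_<name>() half that blocks on the reply and either
# returns result.success or raises when no result came back. The toy class
# below mirrors that round-trip with an in-memory list standing in for the
# transport; EchoClient, _wire, and echo are invented names, not Thrift APIs.
# ---------------------------------------------------------------------------
class EchoClient(object):
    def __init__(self):
        self._wire = []  # stands in for the buffered transport

    def echo(self, msg):
        # Same shape as e.g. digest_register(): send the request, then
        # block on the response.
        self.send_echo(msg)
        return self.recv_echo()

    def send_echo(self, msg):
        # A real client would serialize args and flush here; this fake
        # "server" answers immediately by placing a reply on the wire.
        self._wire.append(msg.upper())

    def recv_echo(self):
        result = self._wire.pop() if self._wire else None
        if result is not None:
            return result
        # cf. TApplicationException.MISSING_RESULT in the real client
        raise RuntimeError("echo failed: unknown result")
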
class Processor(Iface, TProcessor):
    def __init__(self, handler):
        self._handler = handler
        self._processMap = {}
        self._processMap["sys_ping"] = Processor.process_sys_ping
        self._processMap["sys_echo"] = Processor.process_sys_echo
        self._processMap["sys_shutdown"] = Processor.process_sys_shutdown
        self._processMap["sys_version_get"] = Processor.process_sys_version_get
        self._processMap["sys_log_level_get"] = Processor.process_sys_log_level_get
        self._processMap["sys_log_level_set"] = Processor.process_sys_log_level_set
        self._processMap["design_load"] = Processor.process_design_load
        self._processMap["design_unload"] = Processor.process_design_unload
        self._processMap["design_reconfig"] = Processor.process_design_reconfig
        self._processMap["design_load_status"] = Processor.process_design_load_status
        self._processMap["ports_info_retrieve"] = Processor.process_ports_info_retrieve
        self._processMap["table_list_all"] = Processor.process_table_list_all
        self._processMap["table_entry_add"] = Processor.process_table_entry_add
        self._processMap["table_entry_edit"] = Processor.process_table_entry_edit
        self._processMap["table_entry_delete"] = Processor.process_table_entry_delete
        self._processMap["table_retrieve"] = Processor.process_table_retrieve
        self._processMap["table_version_get"] = Processor.process_table_version_get
        self._processMap["parser_value_set_list_all"] = Processor.process_parser_value_set_list_all
        self._processMap["parser_value_set_add"] = Processor.process_parser_value_set_add
        self._processMap["parser_value_set_clear"] = Processor.process_parser_value_set_clear
        self._processMap["parser_value_set_retrieve"] = Processor.process_parser_value_set_retrieve
        self._processMap["p4_counter_list_all"] = Processor.process_p4_counter_list_all
        self._processMap["p4_counter_retrieve"] = Processor.process_p4_counter_retrieve
        self._processMap["p4_counter_clear"] = Processor.process_p4_counter_clear
        self._processMap["p4_counter_clear_all"] = Processor.process_p4_counter_clear_all
        self._processMap["register_list_all"] = Processor.process_register_list_all
        self._processMap["register_retrieve"] = Processor.process_register_retrieve
        self._processMap["register_clear"] = Processor.process_register_clear
        self._processMap["register_field_set"] = Processor.process_register_field_set
        self._processMap["register_set"] = Processor.process_register_set
        self._processMap["sys_counter_retrieve_all"] = Processor.process_sys_counter_retrieve_all
        self._processMap["sys_counter_clear_all"] = Processor.process_sys_counter_clear_all
        self._processMap["mcast_config_get_all"] = Processor.process_mcast_config_get_all
        self._processMap["mcast_config_get"] = Processor.process_mcast_config_get
        self._processMap["mcast_config_set"] = Processor.process_mcast_config_set
        self._processMap["meter_list_all"] = Processor.process_meter_list_all
        self._processMap["meter_config_get"] = Processor.process_meter_config_get
        self._processMap["meter_config_set"] = Processor.process_meter_config_set
        self._processMap["digest_list_all"] = Processor.process_digest_list_all
        self._processMap["digest_register"] = Processor.process_digest_register
        self._processMap["digest_deregister"] = Processor.process_digest_deregister
        self._processMap["digest_retrieve"] = Processor.process_digest_retrieve
        self._processMap["traffic_class_set"] = Processor.process_traffic_class_set
        self._processMap["traffic_class_commit"] = Processor.process_traffic_class_commit
        self._processMap["traffic_class_get"] = Processor.process_traffic_class_get
        self._processMap["debugctl"] = Processor.process_debugctl

    def process(self, iprot, oprot):
        (name, type, seqid) = iprot.readMessageBegin()
        if name not in self._processMap:
            iprot.skip(TType.STRUCT)
            iprot.readMessageEnd()
            x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
            oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
            x.write(oprot)
            oprot.writeMessageEnd()
            oprot.trans.flush()
            return
        else:
            self._processMap[name](self, seqid, iprot, oprot)
        return True
    def process_sys_ping(self, seqid, iprot, oprot):
        args = sys_ping_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = sys_ping_result()
        try:
            result.success = self._handler.sys_ping()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("sys_ping", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_sys_echo(self, seqid, iprot, oprot):
        args = sys_echo_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = sys_echo_result()
        try:
            self._handler.sys_echo(args.echo_msg)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("sys_echo", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_sys_shutdown(self, seqid, iprot, oprot):
        args = sys_shutdown_args()
        args.read(iprot)
        iprot.readMessageEnd()
        try:
            self._handler.sys_shutdown()
        except TTransport.TTransportException:
            raise
        except Exception:
            logging.exception('Exception in oneway handler')
    def process_sys_version_get(self, seqid, iprot, oprot):
        args = sys_version_get_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = sys_version_get_result()
        try:
            result.success = self._handler.sys_version_get()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("sys_version_get", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_sys_log_level_get(self, seqid, iprot, oprot):
        args = sys_log_level_get_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = sys_log_level_get_result()
        try:
            result.success = self._handler.sys_log_level_get()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("sys_log_level_get", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_sys_log_level_set(self, seqid, iprot, oprot):
        args = sys_log_level_set_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = sys_log_level_set_result()
        try:
            result.success = self._handler.sys_log_level_set(args.level)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("sys_log_level_set", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_design_load(self, seqid, iprot, oprot):
        args = design_load_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = design_load_result()
        try:
            result.success = self._handler.design_load(args.arguments)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("design_load", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_design_unload(self, seqid, iprot, oprot):
        args = design_unload_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = design_unload_result()
        try:
            result.success = self._handler.design_unload()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("design_unload", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_design_reconfig(self, seqid, iprot, oprot):
        args = design_reconfig_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = design_reconfig_result()
        try:
            result.success = self._handler.design_reconfig(args.pif_config_json)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("design_reconfig", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_design_load_status(self, seqid, iprot, oprot):
        args = design_load_status_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = design_load_status_result()
        try:
            result.success = self._handler.design_load_status()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("design_load_status", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_ports_info_retrieve(self, seqid, iprot, oprot):
        args = ports_info_retrieve_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = ports_info_retrieve_result()
        try:
            result.success = self._handler.ports_info_retrieve()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("ports_info_retrieve", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_table_list_all(self, seqid, iprot, oprot):
        args = table_list_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = table_list_all_result()
        try:
            result.success = self._handler.table_list_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("table_list_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_table_entry_add(self, seqid, iprot, oprot):
        args = table_entry_add_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = table_entry_add_result()
        try:
            result.success = self._handler.table_entry_add(args.tbl_id, args.entry)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("table_entry_add", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_table_entry_edit(self, seqid, iprot, oprot):
        args = table_entry_edit_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = table_entry_edit_result()
        try:
            result.success = self._handler.table_entry_edit(args.tbl_id, args.entry)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("table_entry_edit", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_table_entry_delete(self, seqid, iprot, oprot):
        args = table_entry_delete_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = table_entry_delete_result()
        try:
            result.success = self._handler.table_entry_delete(args.tbl_id, args.entry)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("table_entry_delete", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_table_retrieve(self, seqid, iprot, oprot):
        args = table_retrieve_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = table_retrieve_result()
        try:
            result.success = self._handler.table_retrieve(args.tbl_id)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("table_retrieve", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_table_version_get(self, seqid, iprot, oprot):
        args = table_version_get_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = table_version_get_result()
        try:
            result.success = self._handler.table_version_get()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("table_version_get", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_parser_value_set_list_all(self, seqid, iprot, oprot):
        args = parser_value_set_list_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = parser_value_set_list_all_result()
        try:
            result.success = self._handler.parser_value_set_list_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("parser_value_set_list_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_parser_value_set_add(self, seqid, iprot, oprot):
        args = parser_value_set_add_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = parser_value_set_add_result()
        try:
            result.success = self._handler.parser_value_set_add(args.pvs_id, args.entries)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("parser_value_set_add", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_parser_value_set_clear(self, seqid, iprot, oprot):
        args = parser_value_set_clear_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = parser_value_set_clear_result()
        try:
            result.success = self._handler.parser_value_set_clear(args.pvs_id)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("parser_value_set_clear", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_parser_value_set_retrieve(self, seqid, iprot, oprot):
        args = parser_value_set_retrieve_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = parser_value_set_retrieve_result()
        try:
            result.success = self._handler.parser_value_set_retrieve(args.pvs_id)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("parser_value_set_retrieve", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_p4_counter_list_all(self, seqid, iprot, oprot):
        args = p4_counter_list_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = p4_counter_list_all_result()
        try:
            result.success = self._handler.p4_counter_list_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("p4_counter_list_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_p4_counter_retrieve(self, seqid, iprot, oprot):
        args = p4_counter_retrieve_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = p4_counter_retrieve_result()
        try:
            result.success = self._handler.p4_counter_retrieve(args.counter_id)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("p4_counter_retrieve", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_p4_counter_clear(self, seqid, iprot, oprot):
        args = p4_counter_clear_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = p4_counter_clear_result()
        try:
            result.success = self._handler.p4_counter_clear(args.counter_id)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("p4_counter_clear", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_p4_counter_clear_all(self, seqid, iprot, oprot):
        args = p4_counter_clear_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = p4_counter_clear_all_result()
        try:
            result.success = self._handler.p4_counter_clear_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("p4_counter_clear_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_register_list_all(self, seqid, iprot, oprot):
        args = register_list_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = register_list_all_result()
        try:
            result.success = self._handler.register_list_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("register_list_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_register_retrieve(self, seqid, iprot, oprot):
        args = register_retrieve_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = register_retrieve_result()
        try:
            result.success = self._handler.register_retrieve(args.regarr)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("register_retrieve", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_register_clear(self, seqid, iprot, oprot):
        args = register_clear_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = register_clear_result()
        try:
            result.success = self._handler.register_clear(args.regarr)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("register_clear", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_register_field_set(self, seqid, iprot, oprot):
        args = register_field_set_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = register_field_set_result()
        try:
            result.success = self._handler.register_field_set(args.regarr, args.field_id, args.value)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("register_field_set", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_register_set(self, seqid, iprot, oprot):
        args = register_set_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = register_set_result()
        try:
            result.success = self._handler.register_set(args.regarr, args.values)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("register_set", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_sys_counter_retrieve_all(self, seqid, iprot, oprot):
        args = sys_counter_retrieve_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = sys_counter_retrieve_all_result()
        try:
            result.success = self._handler.sys_counter_retrieve_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("sys_counter_retrieve_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_sys_counter_clear_all(self, seqid, iprot, oprot):
        args = sys_counter_clear_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = sys_counter_clear_all_result()
        try:
            result.success = self._handler.sys_counter_clear_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("sys_counter_clear_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_mcast_config_get_all(self, seqid, iprot, oprot):
        args = mcast_config_get_all_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = mcast_config_get_all_result()
        try:
            result.success = self._handler.mcast_config_get_all()
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("mcast_config_get_all", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_mcast_config_get(self, seqid, iprot, oprot):
        args = mcast_config_get_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = mcast_config_get_result()
        try:
            result.success = self._handler.mcast_config_get(args.mcast_group)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("mcast_config_get", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_mcast_config_set(self, seqid, iprot, oprot):
        args = mcast_config_set_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = mcast_config_set_result()
        try:
            result.success = self._handler.mcast_config_set(args.cfg)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("mcast_config_set", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_meter_list_all(self, seqid, iprot, oprot):
args = meter_list_all_args()
args.read(iprot)
iprot.readMessageEnd()
result = meter_list_all_result()
try:
result.success = self._handler.meter_list_all()
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("meter_list_all", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_meter_config_get(self, seqid, iprot, oprot):
args = meter_config_get_args()
args.read(iprot)
iprot.readMessageEnd()
result = meter_config_get_result()
try:
result.success = self._handler.meter_config_get(args.meter_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("meter_config_get", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_meter_config_set(self, seqid, iprot, oprot):
args = meter_config_set_args()
args.read(iprot)
iprot.readMessageEnd()
result = meter_config_set_result()
try:
result.success = self._handler.meter_config_set(args.meter_id, args.cfgs)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("meter_config_set", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_digest_list_all(self, seqid, iprot, oprot):
args = digest_list_all_args()
args.read(iprot)
iprot.readMessageEnd()
result = digest_list_all_result()
try:
result.success = self._handler.digest_list_all()
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("digest_list_all", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_digest_register(self, seqid, iprot, oprot):
args = digest_register_args()
args.read(iprot)
iprot.readMessageEnd()
result = digest_register_result()
try:
result.success = self._handler.digest_register(args.digest_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("digest_register", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_digest_deregister(self, seqid, iprot, oprot):
args = digest_deregister_args()
args.read(iprot)
iprot.readMessageEnd()
result = digest_deregister_result()
try:
result.success = self._handler.digest_deregister(args.digest_regid)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("digest_deregister", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_digest_retrieve(self, seqid, iprot, oprot):
args = digest_retrieve_args()
args.read(iprot)
iprot.readMessageEnd()
result = digest_retrieve_result()
try:
result.success = self._handler.digest_retrieve(args.digest_regid)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("digest_retrieve", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_traffic_class_set(self, seqid, iprot, oprot):
args = traffic_class_set_args()
args.read(iprot)
iprot.readMessageEnd()
result = traffic_class_set_result()
try:
result.success = self._handler.traffic_class_set(args.port_id, args.cfgs)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("traffic_class_set", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_traffic_class_commit(self, seqid, iprot, oprot):
args = traffic_class_commit_args()
args.read(iprot)
iprot.readMessageEnd()
result = traffic_class_commit_result()
try:
result.success = self._handler.traffic_class_commit(args.port_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("traffic_class_commit", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_traffic_class_get(self, seqid, iprot, oprot):
args = traffic_class_get_args()
args.read(iprot)
iprot.readMessageEnd()
result = traffic_class_get_result()
try:
result.success = self._handler.traffic_class_get(args.port_id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("traffic_class_get", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_debugctl(self, seqid, iprot, oprot):
args = debugctl_args()
args.read(iprot)
iprot.readMessageEnd()
result = debugctl_result()
try:
result.success = self._handler.debugctl(args.debug_id, args.debug_data)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("debugctl", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
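# Every process_* method above follows the same pattern: decode the args
# struct, invoke the handler, then map the outcome to a REPLY or EXCEPTION
# message. A minimal standalone sketch of that exception-mapping core, using
# stand-in TMessageType/TApplicationException classes (assumptions; the real
# generated code uses the thrift runtime's versions):

```python
import logging


class TMessageType:
    # stand-in for thrift.Thrift.TMessageType (values match the runtime)
    REPLY = 2
    EXCEPTION = 3


class TApplicationException(Exception):
    # stand-in for thrift.Thrift.TApplicationException
    INTERNAL_ERROR = 6

    def __init__(self, type=0, message=None):
        super().__init__(message)
        self.type = type


def dispatch(handler_fn):
    """Run handler_fn and return (msg_type, result), as process_* does."""
    try:
        result = handler_fn()
        msg_type = TMessageType.REPLY
    except TApplicationException as ex:
        # declared application errors are sent back verbatim
        logging.exception('TApplication exception in handler')
        msg_type = TMessageType.EXCEPTION
        result = ex
    except Exception:
        # anything undeclared is masked as a generic INTERNAL_ERROR
        logging.exception('Unexpected exception in handler')
        msg_type = TMessageType.EXCEPTION
        result = TApplicationException(TApplicationException.INTERNAL_ERROR,
                                       'Internal error')
    return msg_type, result


mt_ok, res_ok = dispatch(lambda: 'pong')        # normal reply
mt_err, res_err = dispatch(lambda: 1 / 0)       # undeclared error
```

Note that transport errors (`TTransport.TTransportException`) are deliberately re-raised rather than mapped, since there is no usable connection to write an EXCEPTION message to.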
# HELPER FUNCTIONS AND STRUCTURES


class sys_ping_args(object):

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_ping_args')
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_ping_args)
sys_ping_args.thrift_spec = (
)


class sys_ping_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRING:
                    self.success = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_ping_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRING, 0)
            oprot.writeString(self.success.encode('utf-8') if sys.version_info[0] == 2 else self.success)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_ping_result)
sys_ping_result.thrift_spec = (
    (0, TType.STRING, 'success', 'UTF8', None, ),  # 0
)
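# The struct read() loops above all share one forward-compatibility trick:
# decode field ids the reader knows, skip() everything else, and stop at
# TType.STOP. A toy illustration of that loop (hypothetical protocol, not
# the thrift runtime; field streams are modeled as (ftype, fid, value)
# tuples for simplicity):

```python
# stand-in type tags, matching thrift's TType values
TSTOP, TI32, TSTRING = 0, 8, 11


def read_success_field(fields):
    """Return the value of the known field (fid 0, string), skipping the rest.

    fields: iterable of (ftype, fid, value) tuples, playing the role of
    repeated readFieldBegin() calls on an input protocol.
    """
    success = None
    for ftype, fid, value in fields:
        if ftype == TSTOP:
            break                    # end of struct
        if fid == 0 and ftype == TSTRING:
            success = value          # the one field this reader understands
        # any other (fid, ftype): the real code calls iprot.skip(ftype),
        # i.e. the unknown field is consumed and ignored


        
    return success


# an old reader skips the unknown i32 field 7 and still decodes 'success'
decoded = read_success_field([(TI32, 7, 99), (TSTRING, 0, 'PONG'), (TSTOP, 0, None)])
```

This skipping behavior is what lets an old server tolerate messages from a newer IDL that added fields.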
class sys_echo_args(object):
    """
    Attributes:
     - echo_msg

    """

    def __init__(self, echo_msg=None,):
        self.echo_msg = echo_msg

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.echo_msg = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_echo_args')
        if self.echo_msg is not None:
            oprot.writeFieldBegin('echo_msg', TType.STRING, 1)
            oprot.writeString(self.echo_msg.encode('utf-8') if sys.version_info[0] == 2 else self.echo_msg)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_echo_args)
sys_echo_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'echo_msg', 'UTF8', None, ),  # 1
)


class sys_echo_result(object):

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_echo_result')
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_echo_result)
sys_echo_result.thrift_spec = (
)


class sys_shutdown_args(object):

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_shutdown_args')
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_shutdown_args)
sys_shutdown_args.thrift_spec = (
)


class sys_version_get_args(object):

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_version_get_args')
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_version_get_args)
sys_version_get_args.thrift_spec = (
)
class sys_version_get_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRING:
                    self.success = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_version_get_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRING, 0)
            oprot.writeString(self.success.encode('utf-8') if sys.version_info[0] == 2 else self.success)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_version_get_result)
sys_version_get_result.thrift_spec = (
    (0, TType.STRING, 'success', 'UTF8', None, ),  # 0
)


class sys_log_level_get_args(object):

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_log_level_get_args')
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_log_level_get_args)
sys_log_level_get_args.thrift_spec = (
)


class sys_log_level_get_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.I32:
                    self.success = iprot.readI32()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_log_level_get_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.I32, 0)
            oprot.writeI32(self.success)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_log_level_get_result)
sys_log_level_get_result.thrift_spec = (
    (0, TType.I32, 'success', None, None, ),  # 0
)
class sys_log_level_set_args(object):
    """
    Attributes:
     - level

    """

    def __init__(self, level=None,):
        self.level = level

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I32:
                    self.level = iprot.readI32()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_log_level_set_args')
        if self.level is not None:
            oprot.writeFieldBegin('level', TType.I32, 1)
            oprot.writeI32(self.level)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_log_level_set_args)
sys_log_level_set_args.thrift_spec = (
    None,  # 0
    (1, TType.I32, 'level', None, None, ),  # 1
)


class sys_log_level_set_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = RteReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('sys_log_level_set_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(sys_log_level_set_result)
sys_log_level_set_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [RteReturn, None], None, ),  # 0
)


class design_load_args(object):
    """
    Attributes:
     - arguments

    """

    def __init__(self, arguments=None,):
        self.arguments = arguments

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRUCT:
                    self.arguments = DesignLoadArgs()
                    self.arguments.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('design_load_args')
        if self.arguments is not None:
            oprot.writeFieldBegin('arguments', TType.STRUCT, 1)
            self.arguments.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(design_load_args)
design_load_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'arguments', [DesignLoadArgs, None], None, ),  # 1
)
class design_load_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = RteReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('design_load_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(design_load_result)
design_load_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [RteReturn, None], None, ),  # 0
)


class design_unload_args(object):

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('design_unload_args')
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(design_unload_args)
design_unload_args.thrift_spec = (
)


class design_unload_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = RteReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('design_unload_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(design_unload_result)
design_unload_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [RteReturn, None], None, ),  # 0
)
class design_reconfig_args(object):
    """
    Attributes:
     - pif_config_json

    """

    def __init__(self, pif_config_json=None,):
        self.pif_config_json = pif_config_json

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.pif_config_json = iprot.readBinary()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('design_reconfig_args')
        if self.pif_config_json is not None:
            oprot.writeFieldBegin('pif_config_json', TType.STRING, 1)
            oprot.writeBinary(self.pif_config_json)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(design_reconfig_args)
design_reconfig_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'pif_config_json', 'BINARY', None, ),  # 1
)


class design_reconfig_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = RteReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('design_reconfig_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(design_reconfig_result)
design_reconfig_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class design_load_status_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('design_load_status_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(design_load_status_args)
design_load_status_args.thrift_spec = (
)
class design_load_status_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = DesignLoadStatus()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('design_load_status_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(design_load_status_result)
design_load_status_result.thrift_spec = (
(0, TType.STRUCT, 'success', [DesignLoadStatus, None], None, ), # 0
)
class ports_info_retrieve_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('ports_info_retrieve_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(ports_info_retrieve_args)
ports_info_retrieve_args.thrift_spec = (
)
class ports_info_retrieve_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype31, _size28) = iprot.readListBegin()
for _i32 in range(_size28):
_elem33 = PortInfo()
_elem33.read(iprot)
self.success.append(_elem33)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('ports_info_retrieve_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter34 in self.success:
iter34.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(ports_info_retrieve_result)
ports_info_retrieve_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [PortInfo, None], False), None, ), # 0
)
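The `write()` method above emits its `success` list using Thrift's standard LIST-of-STRUCT sequence: `writeFieldBegin` → `writeListBegin(elem_type, n)` → one `write()` per element → `writeListEnd` → `writeFieldEnd`. A minimal standalone sketch of that call shape — `RecordingProtocol` and `FakePortInfo` are hypothetical stand-ins, not part of the thrift library or this generated module:

```python
# Thrift wire-type codes used by the generated code (values from TType).
LIST, STRUCT = 15, 12

class RecordingProtocol:
    """Hypothetical protocol that records write calls instead of serializing."""
    def __init__(self):
        self.ops = []
    def writeFieldBegin(self, name, ftype, fid):
        self.ops.append(("field", name, ftype, fid))
    def writeListBegin(self, etype, size):
        self.ops.append(("list", etype, size))
    def writeListEnd(self):
        self.ops.append(("list_end",))
    def writeFieldEnd(self):
        self.ops.append(("field_end",))

class FakePortInfo:
    """Stand-in element type; real PortInfo serializes its own fields here."""
    def __init__(self, port):
        self.port = port
    def write(self, oprot):
        oprot.ops.append(("struct", self.port))

def write_success(oprot, success):
    """Mirrors the success-list branch of ports_info_retrieve_result.write()."""
    if success is not None:
        oprot.writeFieldBegin("success", LIST, 0)
        oprot.writeListBegin(STRUCT, len(success))
        for item in success:
            item.write(oprot)
        oprot.writeListEnd()
        oprot.writeFieldEnd()

proto = RecordingProtocol()
write_success(proto, [FakePortInfo(1), FakePortInfo(2)])
```

The element count is written up front in `writeListBegin`, which is why the generated reader can preallocate its loop with the `_size` value returned by `readListBegin`.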
class table_list_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_list_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_list_all_args)
table_list_all_args.thrift_spec = (
)
class table_list_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype38, _size35) = iprot.readListBegin()
for _i39 in range(_size35):
_elem40 = TableDesc()
_elem40.read(iprot)
self.success.append(_elem40)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_list_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter41 in self.success:
iter41.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_list_all_result)
table_list_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [TableDesc, None], False), None, ), # 0
)
class table_entry_add_args(object):
"""
Attributes:
- tbl_id
- entry
"""
def __init__(self, tbl_id=None, entry=None,):
self.tbl_id = tbl_id
self.entry = entry
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.tbl_id = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.entry = TableEntry()
self.entry.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_entry_add_args')
if self.tbl_id is not None:
oprot.writeFieldBegin('tbl_id', TType.I32, 1)
oprot.writeI32(self.tbl_id)
oprot.writeFieldEnd()
if self.entry is not None:
oprot.writeFieldBegin('entry', TType.STRUCT, 2)
self.entry.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_entry_add_args)
table_entry_add_args.thrift_spec = (
None, # 0
(1, TType.I32, 'tbl_id', None, None, ), # 1
(2, TType.STRUCT, 'entry', [TableEntry, None], None, ), # 2
)
class table_entry_add_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_entry_add_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_entry_add_result)
table_entry_add_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class table_entry_edit_args(object):
"""
Attributes:
- tbl_id
- entry
"""
def __init__(self, tbl_id=None, entry=None,):
self.tbl_id = tbl_id
self.entry = entry
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.tbl_id = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.entry = TableEntry()
self.entry.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_entry_edit_args')
if self.tbl_id is not None:
oprot.writeFieldBegin('tbl_id', TType.I32, 1)
oprot.writeI32(self.tbl_id)
oprot.writeFieldEnd()
if self.entry is not None:
oprot.writeFieldBegin('entry', TType.STRUCT, 2)
self.entry.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_entry_edit_args)
table_entry_edit_args.thrift_spec = (
None, # 0
(1, TType.I32, 'tbl_id', None, None, ), # 1
(2, TType.STRUCT, 'entry', [TableEntry, None], None, ), # 2
)
class table_entry_edit_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_entry_edit_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_entry_edit_result)
table_entry_edit_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class table_entry_delete_args(object):
"""
Attributes:
- tbl_id
- entry
"""
def __init__(self, tbl_id=None, entry=None,):
self.tbl_id = tbl_id
self.entry = entry
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.tbl_id = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.entry = TableEntry()
self.entry.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_entry_delete_args')
if self.tbl_id is not None:
oprot.writeFieldBegin('tbl_id', TType.I32, 1)
oprot.writeI32(self.tbl_id)
oprot.writeFieldEnd()
if self.entry is not None:
oprot.writeFieldBegin('entry', TType.STRUCT, 2)
self.entry.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_entry_delete_args)
table_entry_delete_args.thrift_spec = (
None, # 0
(1, TType.I32, 'tbl_id', None, None, ), # 1
(2, TType.STRUCT, 'entry', [TableEntry, None], None, ), # 2
)
class table_entry_delete_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_entry_delete_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_entry_delete_result)
table_entry_delete_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class table_retrieve_args(object):
"""
Attributes:
- tbl_id
"""
def __init__(self, tbl_id=None,):
self.tbl_id = tbl_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.tbl_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_retrieve_args')
if self.tbl_id is not None:
oprot.writeFieldBegin('tbl_id', TType.I32, 1)
oprot.writeI32(self.tbl_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_retrieve_args)
table_retrieve_args.thrift_spec = (
None, # 0
(1, TType.I32, 'tbl_id', None, None, ), # 1
)
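Every generated `read()` in this file follows the same dispatch loop as `table_retrieve_args.read()` above: read a field header, match on `(fid, ftype)`, and `skip()` anything unrecognised so older stubs tolerate fields added by newer schemas. A standalone sketch of that loop — `ReplayProtocol` and the tuple-based field records are simplified, hypothetical stand-ins for a real transport-backed protocol:

```python
# Simplified TType codes, matching the generated code's constants.
STOP, I32, STRUCT = 0, 8, 12

class ReplayProtocol:
    """Hypothetical protocol that replays canned (ftype, fid, value) records."""
    def __init__(self, records):
        self.records = list(records)
        self.skipped = []
    def read_field_begin(self):
        # A real protocol decodes this header from the transport.
        return self.records.pop(0) if self.records else (STOP, 0, None)
    def skip(self, ftype):
        # A real protocol consumes the field's bytes; here we just note it.
        self.skipped.append(ftype)

def read_args(proto):
    """Mirrors table_retrieve_args.read(): keep fid 1 as tbl_id, skip the rest."""
    tbl_id = None
    while True:
        ftype, fid, value = proto.read_field_begin()
        if ftype == STOP:
            break
        if fid == 1 and ftype == I32:
            tbl_id = value
        else:
            proto.skip(ftype)
    return tbl_id

# An unknown field (fid 9) is skipped; the known tbl_id field survives.
p = ReplayProtocol([(I32, 1, 42), (STRUCT, 9, None)])
tbl_id = read_args(p)
```

Skipping by `ftype` rather than erroring is what makes the generated stubs forward-compatible with schemas that add new optional fields.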
class table_retrieve_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype45, _size42) = iprot.readListBegin()
for _i46 in range(_size42):
_elem47 = TableEntry()
_elem47.read(iprot)
self.success.append(_elem47)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_retrieve_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter48 in self.success:
iter48.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_retrieve_result)
table_retrieve_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [TableEntry, None], False), None, ), # 0
)
class table_version_get_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_version_get_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_version_get_args)
table_version_get_args.thrift_spec = (
)
class table_version_get_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I64:
self.success = iprot.readI64()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('table_version_get_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I64, 0)
oprot.writeI64(self.success)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(table_version_get_result)
table_version_get_result.thrift_spec = (
(0, TType.I64, 'success', None, None, ), # 0
)
class parser_value_set_list_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_list_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_list_all_args)
parser_value_set_list_all_args.thrift_spec = (
)
class parser_value_set_list_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype52, _size49) = iprot.readListBegin()
for _i53 in range(_size49):
_elem54 = ParserValueSetDesc()
_elem54.read(iprot)
self.success.append(_elem54)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_list_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter55 in self.success:
iter55.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_list_all_result)
parser_value_set_list_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [ParserValueSetDesc, None], False), None, ), # 0
)
class parser_value_set_add_args(object):
"""
Attributes:
- pvs_id
- entries
"""
def __init__(self, pvs_id=None, entries=None,):
self.pvs_id = pvs_id
self.entries = entries
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.pvs_id = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.LIST:
self.entries = []
(_etype59, _size56) = iprot.readListBegin()
for _i60 in range(_size56):
_elem61 = ParserValueSetEntry()
_elem61.read(iprot)
self.entries.append(_elem61)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_add_args')
if self.pvs_id is not None:
oprot.writeFieldBegin('pvs_id', TType.I32, 1)
oprot.writeI32(self.pvs_id)
oprot.writeFieldEnd()
if self.entries is not None:
oprot.writeFieldBegin('entries', TType.LIST, 2)
oprot.writeListBegin(TType.STRUCT, len(self.entries))
for iter62 in self.entries:
iter62.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_add_args)
parser_value_set_add_args.thrift_spec = (
None, # 0
(1, TType.I32, 'pvs_id', None, None, ), # 1
(2, TType.LIST, 'entries', (TType.STRUCT, [ParserValueSetEntry, None], False), None, ), # 2
)
class parser_value_set_add_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_add_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_add_result)
parser_value_set_add_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class parser_value_set_clear_args(object):
"""
Attributes:
- pvs_id
"""
def __init__(self, pvs_id=None,):
self.pvs_id = pvs_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.pvs_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_clear_args')
if self.pvs_id is not None:
oprot.writeFieldBegin('pvs_id', TType.I32, 1)
oprot.writeI32(self.pvs_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_clear_args)
parser_value_set_clear_args.thrift_spec = (
None, # 0
(1, TType.I32, 'pvs_id', None, None, ), # 1
)
class parser_value_set_clear_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_clear_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_clear_result)
parser_value_set_clear_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class parser_value_set_retrieve_args(object):
"""
Attributes:
- pvs_id
"""
def __init__(self, pvs_id=None,):
self.pvs_id = pvs_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.pvs_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_retrieve_args')
if self.pvs_id is not None:
oprot.writeFieldBegin('pvs_id', TType.I32, 1)
oprot.writeI32(self.pvs_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_retrieve_args)
parser_value_set_retrieve_args.thrift_spec = (
None, # 0
(1, TType.I32, 'pvs_id', None, None, ), # 1
)
class parser_value_set_retrieve_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype66, _size63) = iprot.readListBegin()
for _i67 in range(_size63):
_elem68 = ParserValueSetEntry()
_elem68.read(iprot)
self.success.append(_elem68)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('parser_value_set_retrieve_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter69 in self.success:
iter69.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(parser_value_set_retrieve_result)
parser_value_set_retrieve_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [ParserValueSetEntry, None], False), None, ), # 0
)
class p4_counter_list_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_list_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_list_all_args)
p4_counter_list_all_args.thrift_spec = (
)
class p4_counter_list_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype73, _size70) = iprot.readListBegin()
for _i74 in range(_size70):
_elem75 = P4CounterDesc()
_elem75.read(iprot)
self.success.append(_elem75)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_list_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter76 in self.success:
iter76.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_list_all_result)
p4_counter_list_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [P4CounterDesc, None], False), None, ), # 0
)
class p4_counter_retrieve_args(object):
"""
Attributes:
- counter_id
"""
def __init__(self, counter_id=None,):
self.counter_id = counter_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.counter_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_retrieve_args')
if self.counter_id is not None:
oprot.writeFieldBegin('counter_id', TType.I32, 1)
oprot.writeI32(self.counter_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_retrieve_args)
p4_counter_retrieve_args.thrift_spec = (
None, # 0
(1, TType.I32, 'counter_id', None, None, ), # 1
)
class p4_counter_retrieve_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = P4CounterReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_retrieve_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_retrieve_result)
p4_counter_retrieve_result.thrift_spec = (
(0, TType.STRUCT, 'success', [P4CounterReturn, None], None, ), # 0
)
class p4_counter_clear_args(object):
"""
Attributes:
- counter_id
"""
def __init__(self, counter_id=None,):
self.counter_id = counter_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.counter_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_clear_args')
if self.counter_id is not None:
oprot.writeFieldBegin('counter_id', TType.I32, 1)
oprot.writeI32(self.counter_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_clear_args)
p4_counter_clear_args.thrift_spec = (
None, # 0
(1, TType.I32, 'counter_id', None, None, ), # 1
)
class p4_counter_clear_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_clear_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_clear_result)
p4_counter_clear_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
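# The thrift_spec tuples assigned above are indexed by field id: slot 0 is
# None for *_args structs (field ids start at 1), and each non-None entry is
# (fid, wire_type, name, type_args, default). A hedged sketch of a helper
# that summarizes such a tuple (describe_spec and the inlined TType constants
# are assumptions made for this example, mirroring thrift.Thrift.TType):

```python
# Common Thrift wire-type codes, as defined by thrift.Thrift.TType.
I32, STRING, STRUCT, LIST = 8, 11, 12, 15
TYPE_NAMES = {8: 'i32', 11: 'string', 12: 'struct', 15: 'list'}

def describe_spec(thrift_spec):
    """Yield 'fid name: type' strings, skipping the None placeholders
    that pad unused field-id slots."""
    for entry in thrift_spec:
        if entry is None:
            continue
        fid, ttype, name = entry[0], entry[1], entry[2]
        yield '%d %s: %s' % (fid, name, TYPE_NAMES.get(ttype, str(ttype)))

# A spec shaped like p4_counter_clear_args.thrift_spec above:
spec = (None, (1, I32, 'counter_id', None, None))
```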
class p4_counter_clear_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_clear_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_clear_all_args)
p4_counter_clear_all_args.thrift_spec = (
)
class p4_counter_clear_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('p4_counter_clear_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(p4_counter_clear_all_result)
p4_counter_clear_all_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class register_list_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_list_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_list_all_args)
register_list_all_args.thrift_spec = (
)
class register_list_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype80, _size77) = iprot.readListBegin()
for _i81 in range(_size77):
_elem82 = RegisterDesc()
_elem82.read(iprot)
self.success.append(_elem82)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_list_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter83 in self.success:
iter83.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_list_all_result)
register_list_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [RegisterDesc, None], False), None, ), # 0
)
class register_retrieve_args(object):
"""
Attributes:
- regarr
"""
def __init__(self, regarr=None,):
self.regarr = regarr
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.regarr = RegisterArrayArg()
self.regarr.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_retrieve_args')
if self.regarr is not None:
oprot.writeFieldBegin('regarr', TType.STRUCT, 1)
self.regarr.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_retrieve_args)
register_retrieve_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'regarr', [RegisterArrayArg, None], None, ), # 1
)
class register_retrieve_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype87, _size84) = iprot.readListBegin()
for _i88 in range(_size84):
_elem89 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.success.append(_elem89)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_retrieve_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRING, len(self.success))
for iter90 in self.success:
oprot.writeString(iter90.encode('utf-8') if sys.version_info[0] == 2 else iter90)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_retrieve_result)
register_retrieve_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRING, 'UTF8', False), None, ), # 0
)
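# The read/write methods above guard every string element with a
# sys.version_info check: on Python 2 the protocol yields bytes that must be
# decoded to unicode (and re-encoded before writing), while on Python 3
# readString already returns str. A minimal sketch of that idiom (the helper
# names are hypothetical, introduced only for this example):

```python
import sys

def decode_wire(raw):
    # Mirrors what read() does with each string element off the protocol.
    return raw.decode('utf-8') if sys.version_info[0] == 2 else raw

def encode_wire(value):
    # Mirrors what write() does before handing the element back.
    return value.encode('utf-8') if sys.version_info[0] == 2 else value
```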
class register_clear_args(object):
"""
Attributes:
- regarr
"""
def __init__(self, regarr=None,):
self.regarr = regarr
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.regarr = RegisterArrayArg()
self.regarr.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_clear_args')
if self.regarr is not None:
oprot.writeFieldBegin('regarr', TType.STRUCT, 1)
self.regarr.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_clear_args)
register_clear_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'regarr', [RegisterArrayArg, None], None, ), # 1
)
class register_clear_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_clear_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_clear_result)
register_clear_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class register_field_set_args(object):
"""
Attributes:
- regarr
- field_id
- value
"""
def __init__(self, regarr=None, field_id=None, value=None,):
self.regarr = regarr
self.field_id = field_id
self.value = value
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.regarr = RegisterArrayArg()
self.regarr.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.field_id = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.value = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_field_set_args')
if self.regarr is not None:
oprot.writeFieldBegin('regarr', TType.STRUCT, 1)
self.regarr.write(oprot)
oprot.writeFieldEnd()
if self.field_id is not None:
oprot.writeFieldBegin('field_id', TType.I32, 2)
oprot.writeI32(self.field_id)
oprot.writeFieldEnd()
if self.value is not None:
oprot.writeFieldBegin('value', TType.STRING, 3)
oprot.writeString(self.value.encode('utf-8') if sys.version_info[0] == 2 else self.value)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_field_set_args)
register_field_set_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'regarr', [RegisterArrayArg, None], None, ), # 1
(2, TType.I32, 'field_id', None, None, ), # 2
(3, TType.STRING, 'value', 'UTF8', None, ), # 3
)
class register_field_set_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_field_set_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_field_set_result)
register_field_set_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class register_set_args(object):
"""
Attributes:
- regarr
- values
"""
def __init__(self, regarr=None, values=None,):
self.regarr = regarr
self.values = values
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.regarr = RegisterArrayArg()
self.regarr.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.LIST:
self.values = []
(_etype94, _size91) = iprot.readListBegin()
for _i95 in range(_size91):
_elem96 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.values.append(_elem96)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_set_args')
if self.regarr is not None:
oprot.writeFieldBegin('regarr', TType.STRUCT, 1)
self.regarr.write(oprot)
oprot.writeFieldEnd()
if self.values is not None:
oprot.writeFieldBegin('values', TType.LIST, 2)
oprot.writeListBegin(TType.STRING, len(self.values))
for iter97 in self.values:
oprot.writeString(iter97.encode('utf-8') if sys.version_info[0] == 2 else iter97)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_set_args)
register_set_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'regarr', [RegisterArrayArg, None], None, ), # 1
(2, TType.LIST, 'values', (TType.STRING, 'UTF8', False), None, ), # 2
)
class register_set_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('register_set_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(register_set_result)
register_set_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class sys_counter_retrieve_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('sys_counter_retrieve_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(sys_counter_retrieve_all_args)
sys_counter_retrieve_all_args.thrift_spec = (
)
class sys_counter_retrieve_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype101, _size98) = iprot.readListBegin()
for _i102 in range(_size98):
_elem103 = SysCounterValue()
_elem103.read(iprot)
self.success.append(_elem103)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('sys_counter_retrieve_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter104 in self.success:
iter104.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(sys_counter_retrieve_all_result)
sys_counter_retrieve_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [SysCounterValue, None], False), None, ), # 0
)
class sys_counter_clear_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('sys_counter_clear_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(sys_counter_clear_all_args)
sys_counter_clear_all_args.thrift_spec = (
)
class sys_counter_clear_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('sys_counter_clear_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(sys_counter_clear_all_result)
sys_counter_clear_all_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class mcast_config_get_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('mcast_config_get_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(mcast_config_get_all_args)
mcast_config_get_all_args.thrift_spec = (
)
class mcast_config_get_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype108, _size105) = iprot.readListBegin()
for _i109 in range(_size105):
_elem110 = McastCfgEntry()
_elem110.read(iprot)
self.success.append(_elem110)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('mcast_config_get_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter111 in self.success:
iter111.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(mcast_config_get_all_result)
mcast_config_get_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [McastCfgEntry, None], False), None, ), # 0
)
class mcast_config_get_args(object):
"""
Attributes:
- mcast_group
"""
def __init__(self, mcast_group=None,):
self.mcast_group = mcast_group
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.mcast_group = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('mcast_config_get_args')
if self.mcast_group is not None:
oprot.writeFieldBegin('mcast_group', TType.I32, 1)
oprot.writeI32(self.mcast_group)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(mcast_config_get_args)
mcast_config_get_args.thrift_spec = (
None, # 0
(1, TType.I32, 'mcast_group', None, None, ), # 1
)
class mcast_config_get_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = McastCfgEntry()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('mcast_config_get_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(mcast_config_get_result)
mcast_config_get_result.thrift_spec = (
(0, TType.STRUCT, 'success', [McastCfgEntry, None], None, ), # 0
)
class mcast_config_set_args(object):
"""
Attributes:
- cfg
"""
def __init__(self, cfg=None,):
self.cfg = cfg
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.cfg = McastCfgEntry()
self.cfg.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('mcast_config_set_args')
if self.cfg is not None:
oprot.writeFieldBegin('cfg', TType.STRUCT, 1)
self.cfg.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(mcast_config_set_args)
mcast_config_set_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'cfg', [McastCfgEntry, None], None, ), # 1
)
class mcast_config_set_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('mcast_config_set_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(mcast_config_set_result)
mcast_config_set_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class meter_list_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('meter_list_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(meter_list_all_args)
meter_list_all_args.thrift_spec = (
)
class meter_list_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype115, _size112) = iprot.readListBegin()
for _i116 in range(_size112):
_elem117 = MeterDesc()
_elem117.read(iprot)
self.success.append(_elem117)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('meter_list_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter118 in self.success:
iter118.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(meter_list_all_result)
meter_list_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [MeterDesc, None], False), None, ), # 0
)
class meter_config_get_args(object):
"""
Attributes:
- meter_id
"""
def __init__(self, meter_id=None,):
self.meter_id = meter_id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.meter_id = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('meter_config_get_args')
if self.meter_id is not None:
oprot.writeFieldBegin('meter_id', TType.I32, 1)
oprot.writeI32(self.meter_id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(meter_config_get_args)
meter_config_get_args.thrift_spec = (
None, # 0
(1, TType.I32, 'meter_id', None, None, ), # 1
)
class meter_config_get_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype122, _size119) = iprot.readListBegin()
for _i123 in range(_size119):
_elem124 = MeterCfg()
_elem124.read(iprot)
self.success.append(_elem124)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('meter_config_get_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter125 in self.success:
iter125.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(meter_config_get_result)
meter_config_get_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [MeterCfg, None], False), None, ), # 0
)
class meter_config_set_args(object):
"""
Attributes:
- meter_id
- cfgs
"""
def __init__(self, meter_id=None, cfgs=None,):
self.meter_id = meter_id
self.cfgs = cfgs
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.meter_id = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.LIST:
self.cfgs = []
(_etype129, _size126) = iprot.readListBegin()
for _i130 in range(_size126):
_elem131 = MeterCfg()
_elem131.read(iprot)
self.cfgs.append(_elem131)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('meter_config_set_args')
if self.meter_id is not None:
oprot.writeFieldBegin('meter_id', TType.I32, 1)
oprot.writeI32(self.meter_id)
oprot.writeFieldEnd()
if self.cfgs is not None:
oprot.writeFieldBegin('cfgs', TType.LIST, 2)
oprot.writeListBegin(TType.STRUCT, len(self.cfgs))
for iter132 in self.cfgs:
iter132.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(meter_config_set_args)
meter_config_set_args.thrift_spec = (
None, # 0
(1, TType.I32, 'meter_id', None, None, ), # 1
(2, TType.LIST, 'cfgs', (TType.STRUCT, [MeterCfg, None], False), None, ), # 2
)
class meter_config_set_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RteReturn()
self.success.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('meter_config_set_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(meter_config_set_result)
meter_config_set_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)
class digest_list_all_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('digest_list_all_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(digest_list_all_args)
digest_list_all_args.thrift_spec = (
)
class digest_list_all_result(object):
"""
Attributes:
- success
"""
def __init__(self, success=None,):
self.success = success
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype136, _size133) = iprot.readListBegin()
for _i137 in range(_size133):
_elem138 = DigestDesc()
_elem138.read(iprot)
self.success.append(_elem138)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('digest_list_all_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter139 in self.success:
iter139.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(digest_list_all_result)
digest_list_all_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [DigestDesc, None], False), None, ), # 0
)

class digest_register_args(object):
    """
    Attributes:
     - digest_id

    """

    def __init__(self, digest_id=None,):
        self.digest_id = digest_id

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I32:
                    self.digest_id = iprot.readI32()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('digest_register_args')
        if self.digest_id is not None:
            oprot.writeFieldBegin('digest_id', TType.I32, 1)
            oprot.writeI32(self.digest_id)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(digest_register_args)
digest_register_args.thrift_spec = (
    None, # 0
    (1, TType.I32, 'digest_id', None, None, ), # 1
)

class digest_register_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.I64:
                    self.success = iprot.readI64()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('digest_register_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.I64, 0)
            oprot.writeI64(self.success)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(digest_register_result)
digest_register_result.thrift_spec = (
    (0, TType.I64, 'success', None, None, ), # 0
)

class digest_deregister_args(object):
    """
    Attributes:
     - digest_regid

    """

    def __init__(self, digest_regid=None,):
        self.digest_regid = digest_regid

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I64:
                    self.digest_regid = iprot.readI64()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('digest_deregister_args')
        if self.digest_regid is not None:
            oprot.writeFieldBegin('digest_regid', TType.I64, 1)
            oprot.writeI64(self.digest_regid)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(digest_deregister_args)
digest_deregister_args.thrift_spec = (
    None, # 0
    (1, TType.I64, 'digest_regid', None, None, ), # 1
)

class digest_deregister_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = RteReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('digest_deregister_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(digest_deregister_result)
digest_deregister_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)

class digest_retrieve_args(object):
    """
    Attributes:
     - digest_regid

    """

    def __init__(self, digest_regid=None,):
        self.digest_regid = digest_regid

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I64:
                    self.digest_regid = iprot.readI64()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('digest_retrieve_args')
        if self.digest_regid is not None:
            oprot.writeFieldBegin('digest_regid', TType.I64, 1)
            oprot.writeI64(self.digest_regid)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(digest_retrieve_args)
digest_retrieve_args.thrift_spec = (
    None, # 0
    (1, TType.I64, 'digest_regid', None, None, ), # 1
)

class digest_retrieve_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.LIST:
                    self.success = []
                    (_etype143, _size140) = iprot.readListBegin()
                    for _i144 in range(_size140):
                        _elem145 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                        self.success.append(_elem145)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('digest_retrieve_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.LIST, 0)
            oprot.writeListBegin(TType.STRING, len(self.success))
            for iter146 in self.success:
                oprot.writeString(iter146.encode('utf-8') if sys.version_info[0] == 2 else iter146)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(digest_retrieve_result)
digest_retrieve_result.thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRING, 'UTF8', False), None, ), # 0
)

class traffic_class_set_args(object):
    """
    Attributes:
     - port_id
     - cfgs

    """

    def __init__(self, port_id=None, cfgs=None,):
        self.port_id = port_id
        self.cfgs = cfgs

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I64:
                    self.port_id = iprot.readI64()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.LIST:
                    self.cfgs = []
                    (_etype150, _size147) = iprot.readListBegin()
                    for _i151 in range(_size147):
                        _elem152 = TrafficClassCfg()
                        _elem152.read(iprot)
                        self.cfgs.append(_elem152)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('traffic_class_set_args')
        if self.port_id is not None:
            oprot.writeFieldBegin('port_id', TType.I64, 1)
            oprot.writeI64(self.port_id)
            oprot.writeFieldEnd()
        if self.cfgs is not None:
            oprot.writeFieldBegin('cfgs', TType.LIST, 2)
            oprot.writeListBegin(TType.STRUCT, len(self.cfgs))
            for iter153 in self.cfgs:
                iter153.write(oprot)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(traffic_class_set_args)
traffic_class_set_args.thrift_spec = (
    None, # 0
    (1, TType.I64, 'port_id', None, None, ), # 1
    (2, TType.LIST, 'cfgs', (TType.STRUCT, [TrafficClassCfg, None], False), None, ), # 2
)

class traffic_class_set_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = RteReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('traffic_class_set_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(traffic_class_set_result)
traffic_class_set_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)

class traffic_class_commit_args(object):
    """
    Attributes:
     - port_id

    """

    def __init__(self, port_id=None,):
        self.port_id = port_id

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I64:
                    self.port_id = iprot.readI64()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('traffic_class_commit_args')
        if self.port_id is not None:
            oprot.writeFieldBegin('port_id', TType.I64, 1)
            oprot.writeI64(self.port_id)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(traffic_class_commit_args)
traffic_class_commit_args.thrift_spec = (
    None, # 0
    (1, TType.I64, 'port_id', None, None, ), # 1
)

class traffic_class_commit_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = RteReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('traffic_class_commit_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(traffic_class_commit_result)
traffic_class_commit_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [RteReturn, None], None, ), # 0
)

class traffic_class_get_args(object):
    """
    Attributes:
     - port_id

    """

    def __init__(self, port_id=None,):
        self.port_id = port_id

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I64:
                    self.port_id = iprot.readI64()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('traffic_class_get_args')
        if self.port_id is not None:
            oprot.writeFieldBegin('port_id', TType.I64, 1)
            oprot.writeI64(self.port_id)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(traffic_class_get_args)
traffic_class_get_args.thrift_spec = (
    None, # 0
    (1, TType.I64, 'port_id', None, None, ), # 1
)

class traffic_class_get_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.LIST:
                    self.success = []
                    (_etype157, _size154) = iprot.readListBegin()
                    for _i158 in range(_size154):
                        _elem159 = TrafficClassCfg()
                        _elem159.read(iprot)
                        self.success.append(_elem159)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('traffic_class_get_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.LIST, 0)
            oprot.writeListBegin(TType.STRUCT, len(self.success))
            for iter160 in self.success:
                iter160.write(oprot)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(traffic_class_get_result)
traffic_class_get_result.thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRUCT, [TrafficClassCfg, None], False), None, ), # 0
)

class debugctl_args(object):
    """
    Attributes:
     - debug_id
     - debug_data

    """

    def __init__(self, debug_id=None, debug_data=None,):
        self.debug_id = debug_id
        self.debug_data = debug_data

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.debug_id = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRING:
                    self.debug_data = iprot.readBinary()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('debugctl_args')
        if self.debug_id is not None:
            oprot.writeFieldBegin('debug_id', TType.STRING, 1)
            oprot.writeString(self.debug_id.encode('utf-8') if sys.version_info[0] == 2 else self.debug_id)
            oprot.writeFieldEnd()
        if self.debug_data is not None:
            oprot.writeFieldBegin('debug_data', TType.STRING, 2)
            oprot.writeBinary(self.debug_data)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(debugctl_args)
debugctl_args.thrift_spec = (
    None, # 0
    (1, TType.STRING, 'debug_id', 'UTF8', None, ), # 1
    (2, TType.STRING, 'debug_data', 'BINARY', None, ), # 2
)

class debugctl_result(object):
    """
    Attributes:
     - success

    """

    def __init__(self, success=None,):
        self.success = success

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = DebugCtlReturn()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('debugctl_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)
all_structs.append(debugctl_result)
debugctl_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [DebugCtlReturn, None], None, ), # 0
)
fix_spec(all_structs)
del all_structs

# File: inpaint_covid/unets.py (repo: octaviomtz/inpaint_covid, license: Apache-2.0)
# AUTOGENERATED! DO NOT EDIT! File to edit: 01_unets.ipynb (unless otherwise specified).
__all__ = ['unet5', 'unet4', 'unet3', 'unet2', 'get_architecture']
# Cell
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Model, load_model
from tensorflow.keras.layers import Input
from tensorflow.keras.layers import Dropout, Lambda
from tensorflow.keras.layers import Conv2D, Conv2DTranspose, BatchNormalization
from tensorflow.keras.layers import MaxPooling2D
from tensorflow.keras.layers import concatenate
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint
from tensorflow.keras import backend as K
from tensorflow.keras.layers import GaussianNoise
from scipy.ndimage import binary_erosion, binary_dilation
from tensorflow.keras.layers import LeakyReLU, ReLU
from tqdm.keras import TqdmCallback
# Cell
def unet5(ct_small, ch=32, g_noise= 0.3, act_max_value = 1, act_out_max_value = 1):
    IMG_CHANNELS = np.shape(ct_small)[-1]
    inputs = Input(np.shape(ct_small))

    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (inputs)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    # c1 = Dropout(0.1) (c1)
    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c1)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    p1 = MaxPooling2D((2, 2)) (c1)

    c2 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (p1)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    # c2 = Dropout(0.1) (c2)
    c2 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (c2)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    p2 = MaxPooling2D((2, 2)) (c2)

    c3 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (p2)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    # c3 = Dropout(0.2) (c3)
    c3 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (c3)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    p3 = MaxPooling2D((2, 2)) (c3)

    c4 = Conv2D(ch*8, (3, 3), kernel_initializer='he_normal', padding='same') (p3)
    if (g_noise > 0): c4 = GaussianNoise(g_noise) (c4)
    c4 = BatchNormalization()(c4)
    c4 = ReLU(max_value=act_max_value)(c4)
    # c4 = Dropout(0.2) (c4)
    c4 = Conv2D(ch*8, (3, 3), kernel_initializer='he_normal', padding='same') (c4)
    if (g_noise > 0): c4 = GaussianNoise(g_noise) (c4)
    c4 = BatchNormalization()(c4)
    c4 = ReLU(max_value=act_max_value)(c4)
    p4 = MaxPooling2D(pool_size=(2, 2)) (c4)

    c5 = Conv2D(ch*16, (3, 3), kernel_initializer='he_normal', padding='same') (p4)
    if (g_noise > 0): c5 = GaussianNoise(g_noise) (c5)
    c5 = BatchNormalization()(c5)
    c5 = ReLU(max_value=act_max_value)(c5)
    # c5 = Dropout(0.3) (c5)
    c5 = Conv2D(ch*16, (3, 3), kernel_initializer='he_normal', padding='same') (c5)
    if (g_noise > 0): c5 = GaussianNoise(g_noise) (c5)
    c5 = BatchNormalization()(c5)
    c5 = ReLU(max_value=act_max_value)(c5)
    out_inter = c5

    u6 = Conv2DTranspose(ch*4, (2, 2), strides=(2, 2), padding='same') (c5)
    u6 = concatenate([u6, c4])
    c6 = Conv2D(ch*8, (3, 3), kernel_initializer='he_normal', padding='same') (u6)
    if (g_noise > 0): c6 = GaussianNoise(g_noise) (c6)
    c6 = BatchNormalization()(c6)
    c6 = ReLU(max_value=act_max_value)(c6)
    # c6 = Dropout(0.2) (c6)
    c6 = Conv2D(ch*8, (3, 3), kernel_initializer='he_normal', padding='same') (c6)
    if (g_noise > 0): c6 = GaussianNoise(g_noise) (c6)
    c6 = BatchNormalization()(c6)
    c6 = ReLU(max_value=act_max_value)(c6)

    u7 = Conv2DTranspose(ch*2, (2, 2), strides=(2, 2), padding='same') (c6)
    u7 = concatenate([u7, c3])
    c7 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (u7)
    if (g_noise > 0): c7 = GaussianNoise(g_noise) (c7)
    c7 = BatchNormalization()(c7)
    c7 = ReLU(max_value=act_max_value)(c7)
    # c7 = Dropout(0.2) (c7)
    c7 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (c7)
    if (g_noise > 0): c7 = GaussianNoise(g_noise) (c7)
    c7 = BatchNormalization()(c7)
    c7 = ReLU(max_value=act_max_value)(c7)

    u8 = Conv2DTranspose(ch, (2, 2), strides=(2, 2), padding='same') (c7)
    u8 = concatenate([u8, c2])
    c8 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (u8)
    if (g_noise > 0): c8 = GaussianNoise(g_noise) (c8)
    c8 = BatchNormalization()(c8)
    c8 = ReLU(max_value=act_max_value)(c8)
    # c8 = Dropout(0.1) (c8)
    c8 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (c8)
    if (g_noise > 0): c8 = GaussianNoise(g_noise) (c8)
    c8 = BatchNormalization()(c8)
    c8 = ReLU(max_value=act_max_value)(c8)

    u9 = Conv2DTranspose(ch//2, (2, 2), strides=(2, 2), padding='same') (c8)
    u9 = concatenate([u9, c1], axis=3)
    c9 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (u9)
    if (g_noise > 0): c9 = GaussianNoise(g_noise) (c9)
    c9 = BatchNormalization()(c9)
    c9 = ReLU(max_value=act_max_value)(c9)
    # c9 = Dropout(0.1) (c9)
    c9 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c9)
    if (g_noise > 0): c9 = GaussianNoise(g_noise) (c9)
    c9 = BatchNormalization()(c9)
    c9 = ReLU(max_value=act_max_value)(c9)

    outputs = Conv2D(IMG_CHANNELS, (1, 1)) (c9)
    outputs = ReLU(max_value=act_out_max_value)(outputs)
    model = Model(inputs=[inputs], outputs=[outputs])
    return model
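`unet5` applies four rounds of 2x2 max-pooling before mirroring them with 2x2 transposed convolutions, so the skip concatenations only line up when both spatial dimensions of `ct_small` are divisible by 2**4 = 16 (likewise 2**3 for `unet4` and 2**2 for `unet3`). A standalone sanity check of that constraint, independent of TensorFlow (`valid_unet_input` is an illustrative helper, not part of the original module):

```python
def valid_unet_input(height, width, depth=4):
    """Return True if a (height, width) image survives `depth` rounds of
    2x2 pooling followed by 2x2 upsampling with the skip connections
    still aligned, i.e. both dimensions are divisible by 2**depth."""
    factor = 2 ** depth
    return height % factor == 0 and width % factor == 0

print(valid_unet_input(64, 64))       # True: fine for unet5 (depth 4)
print(valid_unet_input(100, 100))     # False: 100 is not a multiple of 16
print(valid_unet_input(100, 100, 2))  # True: unet3 only pools twice
```

Inputs that fail this check raise a shape-mismatch error inside `concatenate` when the model is built, so it is cheaper to validate patch sizes up front.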
# Cell
def unet4(ct_small, ch=32, g_noise= 0.3, act_max_value = 1, act_out_max_value = 1):
    IMG_CHANNELS = np.shape(ct_small)[-1]
    inputs = Input(np.shape(ct_small))

    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (inputs)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    # c1 = Dropout(0.1) (c1)
    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c1)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    p1 = MaxPooling2D((2, 2)) (c1)

    c2 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (p1)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    # c2 = Dropout(0.1) (c2)
    c2 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (c2)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    p2 = MaxPooling2D((2, 2)) (c2)

    c3 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (p2)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    # c3 = Dropout(0.2) (c3)
    c3 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (c3)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    p3 = MaxPooling2D((2, 2)) (c3)

    c4 = Conv2D(ch*8, (3, 3), kernel_initializer='he_normal', padding='same') (p3)
    if (g_noise > 0): c4 = GaussianNoise(g_noise) (c4)
    c4 = BatchNormalization()(c4)
    c4 = ReLU(max_value=act_max_value)(c4)
    # c4 = Dropout(0.3) (c4)
    c4 = Conv2D(ch*8, (3, 3), kernel_initializer='he_normal', padding='same') (c4)
    if (g_noise > 0): c4 = GaussianNoise(g_noise) (c4)
    c4 = BatchNormalization()(c4)
    c4 = ReLU(max_value=act_max_value)(c4)
    out_inter = c4

    u5 = Conv2DTranspose(ch*2, (2, 2), strides=(2, 2), padding='same') (c4)
    u5 = concatenate([u5, c3])
    c5 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (u5)
    if (g_noise > 0): c5 = GaussianNoise(g_noise) (c5)
    c5 = BatchNormalization()(c5)
    c5 = ReLU(max_value=act_max_value)(c5)
    # c5 = Dropout(0.2) (c5)
    c5 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (c5)
    if (g_noise > 0): c5 = GaussianNoise(g_noise) (c5)
    c5 = BatchNormalization()(c5)
    c5 = ReLU(max_value=act_max_value)(c5)

    u6 = Conv2DTranspose(ch, (2, 2), strides=(2, 2), padding='same') (c5)
    u6 = concatenate([u6, c2])
    c6 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (u6)
    if (g_noise > 0): c6 = GaussianNoise(g_noise) (c6)
    c6 = BatchNormalization()(c6)
    c6 = ReLU(max_value=act_max_value)(c6)
    # c6 = Dropout(0.1) (c6)
    c6 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (c6)
    if (g_noise > 0): c6 = GaussianNoise(g_noise) (c6)
    c6 = BatchNormalization()(c6)
    c6 = ReLU(max_value=act_max_value)(c6)

    u7 = Conv2DTranspose(ch//2, (2, 2), strides=(2, 2), padding='same') (c6)
    u7 = concatenate([u7, c1], axis=3)
    c7 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (u7)
    if (g_noise > 0): c7 = GaussianNoise(g_noise) (c7)
    c7 = BatchNormalization()(c7)
    c7 = ReLU(max_value=act_max_value)(c7)
    # c7 = Dropout(0.1) (c7)
    c7 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c7)
    if (g_noise > 0): c7 = GaussianNoise(g_noise) (c7)
    c7 = BatchNormalization()(c7)
    c7 = ReLU(max_value=act_max_value)(c7)

    outputs = Conv2D(IMG_CHANNELS, (1, 1)) (c7)
    outputs = ReLU(max_value=act_out_max_value)(outputs)
    model = Model(inputs=[inputs], outputs=[outputs])
    return model
# Cell
def unet3(ct_small, ch=32, g_noise=0.3, act_max_value=1, act_out_max_value=1):
    IMG_CHANNELS = np.shape(ct_small)[-1]
    inputs = Input(np.shape(ct_small))
    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (inputs)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    # c1 = Dropout(0.1) (c1)
    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c1)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    p1 = MaxPooling2D((2, 2)) (c1)
    c2 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (p1)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    # c2 = Dropout(0.1) (c2)
    c2 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (c2)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    p2 = MaxPooling2D((2, 2)) (c2)
    c3 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (p2)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    # c3 = Dropout(0.3) (c3)
    c3 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (c3)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    out_inter = c3
    u4 = Conv2DTranspose(ch, (2, 2), strides=(2, 2), padding='same') (c3)
    u4 = concatenate([u4, c2])
    c4 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (u4)
    if (g_noise > 0): c4 = GaussianNoise(g_noise) (c4)
    c4 = BatchNormalization()(c4)
    c4 = ReLU(max_value=act_max_value)(c4)
    # c4 = Dropout(0.1) (c4)
    c4 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (c4)
    if (g_noise > 0): c4 = GaussianNoise(g_noise) (c4)
    c4 = BatchNormalization()(c4)
    c4 = ReLU(max_value=act_max_value)(c4)
    u5 = Conv2DTranspose(ch//2, (2, 2), strides=(2, 2), padding='same') (c4)
    u5 = concatenate([u5, c1], axis=3)
    c5 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (u5)
    if (g_noise > 0): c5 = GaussianNoise(g_noise) (c5)
    c5 = BatchNormalization()(c5)
    c5 = ReLU(max_value=act_max_value)(c5)
    # c5 = Dropout(0.1) (c5)
    c5 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c5)
    if (g_noise > 0): c5 = GaussianNoise(g_noise) (c5)
    c5 = BatchNormalization()(c5)
    c5 = ReLU(max_value=act_max_value)(c5)
    outputs = Conv2D(IMG_CHANNELS, (1, 1)) (c5)
    outputs = ReLU(max_value=act_out_max_value)(outputs)
    model = Model(inputs=[inputs], outputs=[outputs])
    return model
# Cell
def unet2(ct_small, ch=32, g_noise=0.3, act_max_value=1, act_out_max_value=1):
    IMG_CHANNELS = np.shape(ct_small)[-1]
    inputs = Input(np.shape(ct_small))
    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (inputs)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    # c1 = Dropout(0.1) (c1)
    c1 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c1)
    if (g_noise > 0): c1 = GaussianNoise(g_noise) (c1)
    c1 = BatchNormalization()(c1)
    c1 = ReLU(max_value=act_max_value)(c1)
    p1 = MaxPooling2D((2, 2)) (c1)
    c2 = Conv2D(ch*2, (3, 3), kernel_initializer='he_normal', padding='same') (p1)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    # c2 = Dropout(0.3) (c2)
    c2 = Conv2D(ch*4, (3, 3), kernel_initializer='he_normal', padding='same') (c2)
    if (g_noise > 0): c2 = GaussianNoise(g_noise) (c2)
    c2 = BatchNormalization()(c2)
    c2 = ReLU(max_value=act_max_value)(c2)
    out_inter = c2
    u3 = Conv2DTranspose(ch//2, (2, 2), strides=(2, 2), padding='same') (c2)
    u3 = concatenate([u3, c1], axis=3)
    c3 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (u3)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    # c3 = Dropout(0.1) (c3)
    c3 = Conv2D(ch, (3, 3), kernel_initializer='he_normal', padding='same') (c3)
    if (g_noise > 0): c3 = GaussianNoise(g_noise) (c3)
    c3 = BatchNormalization()(c3)
    c3 = ReLU(max_value=act_max_value)(c3)
    outputs = Conv2D(IMG_CHANNELS, (1, 1)) (c3)
    outputs = ReLU(max_value=act_out_max_value)(outputs)
    model = Model(inputs=[inputs], outputs=[outputs])
    return model
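# Editor's sketch (assumption, not from the original notebook): every
# MaxPooling2D above halves the spatial dims and every Conv2DTranspose doubles
# them back, so the skip-connection concatenates only line up when height and
# width are divisible by 2**n_pools (unet2 pools once, unet3 twice).
def fits_unet_depth(height, width, n_pools):
    factor = 2 ** n_pools
    return height % factor == 0 and width % factor == 0

assert fits_unet_depth(256, 256, 2)      # 256 % 4 == 0: safe for unet3
assert not fits_unet_depth(250, 250, 2)  # 250 % 4 == 2: shapes would mismatch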
# Cell
def get_architecture(ct_small, archi=5, ch=32, g_noise=0.3, act_max_value=1, act_out_max_value=1):
    # elif chain: with plain ifs, a match on archi 2 or 3 fell through to the
    # final else and was silently overwritten by unet5
    if archi == 2:
        model = unet2(ct_small, ch, g_noise, act_max_value, act_out_max_value)
    elif archi == 3:
        model = unet3(ct_small, ch, g_noise, act_max_value, act_out_max_value)
    elif archi == 4:
        model = unet4(ct_small, ch, g_noise, act_max_value, act_out_max_value)
    else:
        model = unet5(ct_small, ch, g_noise, act_max_value, act_out_max_value)
    return model
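# Editor's sketch (hypothetical refactor, not part of the original code): the
# archi dispatch above can also be a dict lookup with unet5 as the default;
# strings stand in here for the real Keras builder functions.
_BUILDERS = {2: "unet2", 3: "unet3", 4: "unet4"}

def pick_builder(archi):
    return _BUILDERS.get(archi, "unet5")

assert pick_builder(3) == "unet3"
assert pick_builder(99) == "unet5"  # anything unrecognized falls back to unet5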
# File: envi/tests/msp430/imov.py
# Repo: rnui2k/vivisect (licenses: ECL-2.0, Apache-2.0)
from envi.archs.msp430.regs import *
checks = [
# MOV
(
'MOV r14, r15',
{ 'regs': [(REG_R14, 0x1122), (REG_R15, 0x3344)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "0f4e", 'data': "" },
{ 'regs': [(REG_R14, 0x1122), (REG_R15, 0x1122)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "0f4e", 'data': "" }
),
(
'MOV #0xaabb, r15',
{ 'regs': [], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f40bbaa", 'data': "" },
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f40bbaa", 'data': "" }
),
(
'MOV @r14, r15',
{ 'regs': [(REG_R14, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f4e", 'data': "00112233445566" },
{ 'regs': [(REG_R14, 0x1002), (REG_R15, 0x3322)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f4e", 'data': "00112233445566" }
),
(
'MOV r14, @r15',
{ 'regs': [(REG_R14, 0xaabb), (REG_R15, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "8f4e0000", 'data': "00112233445566" },
{ 'regs': [(REG_R14, 0xaabb), (REG_R15, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "8f4e0000", 'data': "0011bbaa445566" }
),
(
'MOV @r14+, r15',
{ 'regs': [(REG_R14, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f4e", 'data': "00112233445566" },
{ 'regs': [(REG_R14, 0x1004), (REG_R15, 0x3322)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f4e", 'data': "00112233445566" }
),
# PC
(
'MOV pc, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "0f40", 'data': "" },
{ 'regs': [(REG_R15, 0x4402)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "0f40", 'data': "" }
),
(
'MOV @pc, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f40aabb", 'data': "" },
{ 'regs': [(REG_R15, 0xbbaa)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f40aabb", 'data': "" }
),
(
'MOV r15, @pc',
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "804f0000", 'data': "" },
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "804f0000bbaa", 'data': "" }
),
# Constant Generators
# SR X(Rn) (0)
(
'MOV 0(sr), r15',
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "1f420000", 'data': "" },
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "1f420000", 'data': "" }
),
# SR @Rn 4
(
'MOV @sr, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f42", 'data': "" },
{ 'regs': [(REG_R15, 0x4)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f42", 'data': "" }
),
# SR @Rn+ 8
(
'MOV @sr+, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f42", 'data': "" },
{ 'regs': [(REG_R15, 0x8)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f42", 'data': "" }
),
# CG Rn 0
(
'MOV cg, r15',
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "0f43", 'data': "" },
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "0f43", 'data': "" }
),
# CG X(Rn) 1
(
'MOV 0(cg), r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "1f430000", 'data': "" },
{ 'regs': [(REG_R15, 0x1)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "1f430000", 'data': "" }
),
# CG @Rn 2
(
'MOV @cg, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f43", 'data': "" },
{ 'regs': [(REG_R15, 0x2)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "2f43", 'data': "" }
),
# CG @Rn+ -1
(
'MOV @cg+, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f43", 'data': "" },
{ 'regs': [(REG_R15, 0xffff)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "3f43", 'data': "" }
),
# MOV.b
(
'MOV.b r14, r15',
{ 'regs': [(REG_R14, 0x1122), (REG_R15, 0x3344)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f4e", 'data': "" },
{ 'regs': [(REG_R14, 0x1122), (REG_R15, 0x22)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f4e", 'data': "" }
),
(
'MOV.b pc, r15',
{ 'regs': [], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f40", 'data': "" },
{ 'regs': [(REG_R15, 0x02)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f40", 'data': "" }
),
(
'MOV.b #0xaabb, r15',
{ 'regs': [], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f40bb00", 'data': "" },
{ 'regs': [(REG_R15, 0xbb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f40bb00", 'data': "" }
),
(
'MOV.b @r14, r15',
{ 'regs': [(REG_R14, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f4e", 'data': "00112233445566" },
{ 'regs': [(REG_R14, 0x1002), (REG_R15, 0x22)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f4e", 'data': "00112233445566" }
),
(
'MOV.b r14, @r15',
{ 'regs': [(REG_R14, 0xaabb), (REG_R15, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "cf4e0000", 'data': "00112233445566" },
{ 'regs': [(REG_R14, 0xaabb), (REG_R15, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "cf4e0000", 'data': "0011bb33445566" }
),
(
'MOV.b @r14+, r15',
{ 'regs': [(REG_R14, 0x1002)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f4e", 'data': "00112233445566" },
{ 'regs': [(REG_R14, 0x1003), (REG_R15, 0x22)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f4e", 'data': "00112233445566" }
),
# PC
(
'MOV.b pc, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f40", 'data': "" },
{ 'regs': [(REG_R15, 0x02)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f40", 'data': "" }
),
(
'MOV.b @pc, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f40aabb", 'data': "" },
{ 'regs': [(REG_R15, 0xaa)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f40aabb", 'data': "" }
),
(
'MOV.b r15, @pc',
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "c04f0000", 'data': "" },
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "c04f0000bb", 'data': "" }
),
# Constant Generators
# SR X(Rn) (0)
(
'MOV.b 0(sr), r15',
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "5f420000", 'data': "" },
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "5f420000", 'data': "" }
),
# SR @Rn 4
(
'MOV.b @sr, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f42", 'data': "" },
{ 'regs': [(REG_R15, 0x4)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f42", 'data': "" }
),
# SR @Rn+ 8
(
'MOV.b @sr+, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f42", 'data': "" },
{ 'regs': [(REG_R15, 0x8)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f42", 'data': "" }
),
# CG Rn 0
(
'MOV.b cg, r15',
{ 'regs': [(REG_R15, 0xaabb)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f43", 'data': "" },
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "4f43", 'data': "" }
),
# CG X(Rn) 1
(
'MOV.b 0(cg), r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "5f430000", 'data': "" },
{ 'regs': [(REG_R15, 0x1)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "5f430000", 'data': "" }
),
# CG @Rn 2
(
'MOV.b @cg, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f43", 'data': "" },
{ 'regs': [(REG_R15, 0x2)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "6f43", 'data': "" }
),
# CG @Rn+ -1
(
'MOV.b @cg+, r15',
{ 'regs': [(REG_R15, 0x0)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f43", 'data': "" },
{ 'regs': [(REG_R15, 0xff)], 'flags': [(SR_N, 0), (SR_Z, 0), (SR_C, 0), (SR_V, 0)], 'code': "7f43", 'data': "" }
),
]
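# Editor's sketch: how the little-endian 'code' words above decode. For a
# double-operand MSP430 instruction the opcode sits in bits 15-12 (0x4 is
# MOV), the source register in bits 11-8, the byte/word flag in bit 6 and the
# destination register in bits 3-0. Field names are mine; the test vectors
# above are the ground truth.
def decode_first_word(code_hex):
    word = int.from_bytes(bytes.fromhex(code_hex[:4]), "little")
    return {
        "opcode": (word >> 12) & 0xF,
        "src": (word >> 8) & 0xF,
        "byte_mode": bool((word >> 6) & 1),
        "dst": word & 0xF,
    }

assert decode_first_word("0f4e") == {
    "opcode": 0x4, "src": 14, "byte_mode": False, "dst": 15}  # MOV r14, r15
assert decode_first_word("4f4e")["byte_mode"]                 # MOV.b r14, r15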
# File: dear/io/__init__.py
# Repo: dongying/dear (license: MIT)
#-*- coding: utf-8 -*-
import _decoder as decoder
from _decoder import get_decoder
from _decoder import open as open_audio
# File: python/grit/common/model/usage/__init__.py
# Repo: ground-context/grit (license: Apache-2.0)
import grit.common.model.usage.lineage_edge
import grit.common.model.usage.lineage_edge_version
import grit.common.model.usage.lineage_graph
import grit.common.model.usage.lineage_graph_version
# File: gooddata-metadata-client/gooddata_metadata_client/api/workspace_object_controller_api.py
# Repo: jaceksan/gooddata-python-sdk (license: MIT)
"""
    OpenAPI definition

    No description provided (generated by Openapi Generator https://github.com/openapitools/openapi-generator)  # noqa: E501

    The version of the OpenAPI document: v0
    Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from gooddata_metadata_client.api_client import ApiClient, Endpoint as _Endpoint
from gooddata_metadata_client.model_utils import (  # noqa: F401
    check_allowed_values,
    check_validations,
    date,
    datetime,
    file_type,
    none_type,
    validate_and_convert_types
)
from gooddata_metadata_client.model.json_api_analytical_dashboard_in_document import JsonApiAnalyticalDashboardInDocument
from gooddata_metadata_client.model.json_api_analytical_dashboard_out_document import JsonApiAnalyticalDashboardOutDocument
from gooddata_metadata_client.model.json_api_analytical_dashboard_out_list import JsonApiAnalyticalDashboardOutList
from gooddata_metadata_client.model.json_api_attribute_out_document import JsonApiAttributeOutDocument
from gooddata_metadata_client.model.json_api_attribute_out_list import JsonApiAttributeOutList
from gooddata_metadata_client.model.json_api_dataset_out_document import JsonApiDatasetOutDocument
from gooddata_metadata_client.model.json_api_dataset_out_list import JsonApiDatasetOutList
from gooddata_metadata_client.model.json_api_fact_out_document import JsonApiFactOutDocument
from gooddata_metadata_client.model.json_api_fact_out_list import JsonApiFactOutList
from gooddata_metadata_client.model.json_api_filter_context_in_document import JsonApiFilterContextInDocument
from gooddata_metadata_client.model.json_api_filter_context_out_document import JsonApiFilterContextOutDocument
from gooddata_metadata_client.model.json_api_filter_context_out_list import JsonApiFilterContextOutList
from gooddata_metadata_client.model.json_api_label_out_document import JsonApiLabelOutDocument
from gooddata_metadata_client.model.json_api_label_out_list import JsonApiLabelOutList
from gooddata_metadata_client.model.json_api_metric_in_document import JsonApiMetricInDocument
from gooddata_metadata_client.model.json_api_metric_out_document import JsonApiMetricOutDocument
from gooddata_metadata_client.model.json_api_metric_out_list import JsonApiMetricOutList
from gooddata_metadata_client.model.json_api_visualization_object_in_document import JsonApiVisualizationObjectInDocument
from gooddata_metadata_client.model.json_api_visualization_object_out_document import JsonApiVisualizationObjectOutDocument
from gooddata_metadata_client.model.json_api_visualization_object_out_list import JsonApiVisualizationObjectOutList
from gooddata_metadata_client.model.json_api_workspace_data_filter_in_document import JsonApiWorkspaceDataFilterInDocument
from gooddata_metadata_client.model.json_api_workspace_data_filter_out_document import JsonApiWorkspaceDataFilterOutDocument
from gooddata_metadata_client.model.json_api_workspace_data_filter_out_list import JsonApiWorkspaceDataFilterOutList
from gooddata_metadata_client.model.json_api_workspace_data_filter_setting_out_document import JsonApiWorkspaceDataFilterSettingOutDocument
from gooddata_metadata_client.model.json_api_workspace_data_filter_setting_out_list import JsonApiWorkspaceDataFilterSettingOutList
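# Editor's sketch (simplified, not the real _Endpoint/ApiClient classes):
# every generated endpoint below first back-fills the transport-control
# kwargs with kwargs.get(name, default) before delegating the HTTP call.
_CONTROL_DEFAULTS = {
    "async_req": False,
    "_return_http_data_only": True,
    "_preload_content": True,
    "_request_timeout": None,
    "_check_input_type": True,
    "_check_return_type": True,
}

def _fill_control_kwargs(kwargs):
    for name, default in _CONTROL_DEFAULTS.items():
        kwargs[name] = kwargs.get(name, default)
    return kwargs

assert _fill_control_kwargs({})["async_req"] is False
assert _fill_control_kwargs({"async_req": True})["async_req"] is True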
class WorkspaceObjectControllerApi(object):
    """NOTE: This class is auto generated by OpenAPI Generator
    Ref: https://openapi-generator.tech

    Do not edit the class manually.
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
        def __create_entity_analytical_dashboards(
            self,
            workspace_id,
            json_api_analytical_dashboard_in_document,
            **kwargs
        ):
            """create_entity_analytical_dashboards  # noqa: E501

            This method makes a synchronous HTTP request by default. To make an
            asynchronous HTTP request, please pass async_req=True

            >>> thread = api.create_entity_analytical_dashboards(workspace_id, json_api_analytical_dashboard_in_document, async_req=True)
            >>> result = thread.get()

            Args:
                workspace_id (str):
                json_api_analytical_dashboard_in_document (JsonApiAnalyticalDashboardInDocument):

            Keyword Args:
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together.. [optional]
                _return_http_data_only (bool): response data without head status
                    code and headers. Default is True.
                _preload_content (bool): if False, the urllib3.HTTPResponse object
                    will be returned without reading/decoding response data.
                    Default is True.
                _request_timeout (int/float/tuple): timeout setting for this request. If
                    one number provided, it will be total request timeout. It can also
                    be a pair (tuple) of (connection, read) timeouts.
                    Default is None.
                _check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
                    Default is True.
                _check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
                    Default is True.
                _host_index (int/None): specifies the index of the server
                    that we want to use.
                    Default is read from the configuration.
                async_req (bool): execute request asynchronously

            Returns:
                JsonApiAnalyticalDashboardOutDocument
                    If the method is called asynchronously, returns the request
                    thread.
            """
            kwargs['async_req'] = kwargs.get(
                'async_req', False
            )
            kwargs['_return_http_data_only'] = kwargs.get(
                '_return_http_data_only', True
            )
            kwargs['_preload_content'] = kwargs.get(
                '_preload_content', True
            )
            kwargs['_request_timeout'] = kwargs.get(
                '_request_timeout', None
            )
            kwargs['_check_input_type'] = kwargs.get(
                '_check_input_type', True
            )
            kwargs['_check_return_type'] = kwargs.get(
                '_check_return_type', True
            )
            kwargs['_host_index'] = kwargs.get('_host_index')
            kwargs['workspace_id'] = \
                workspace_id
            kwargs['json_api_analytical_dashboard_in_document'] = \
                json_api_analytical_dashboard_in_document
            return self.call_with_http_info(**kwargs)

        self.create_entity_analytical_dashboards = _Endpoint(
            settings={
                'response_type': (JsonApiAnalyticalDashboardOutDocument,),
                'auth': [],
                'endpoint_path': '/api/entities/workspaces/{workspaceId}/analyticalDashboards',
                'operation_id': 'create_entity_analytical_dashboards',
                'http_method': 'POST',
                'servers': None,
            },
            params_map={
                'all': [
                    'workspace_id',
                    'json_api_analytical_dashboard_in_document',
                    'include',
                ],
                'required': [
                    'workspace_id',
                    'json_api_analytical_dashboard_in_document',
                ],
                'nullable': [
                ],
                'enum': [
                    'include',
                ],
                'validation': [
                ]
            },
            root_map={
                'validations': {
                },
                'allowed_values': {
                    ('include',): {
                        "VISUALIZATIONOBJECTS": "visualizationObjects",
                        "ANALYTICALDASHBOARDS": "analyticalDashboards",
                        "LABELS": "labels",
                        "METRICS": "metrics",
                        "DATASETS": "datasets",
                        "FILTERCONTEXTS": "filterContexts",
                        "ALL": "ALL"
                    },
                },
                'openapi_types': {
                    'workspace_id':
                        (str,),
                    'json_api_analytical_dashboard_in_document':
                        (JsonApiAnalyticalDashboardInDocument,),
                    'include':
                        ([str],),
                },
                'attribute_map': {
                    'workspace_id': 'workspaceId',
                    'include': 'include',
                },
                'location_map': {
                    'workspace_id': 'path',
                    'json_api_analytical_dashboard_in_document': 'body',
                    'include': 'query',
                },
                'collection_format_map': {
                    'include': 'csv',
                }
            },
            headers_map={
                'accept': [
                    'application/vnd.gooddata.api+json'
                ],
                'content_type': [
                    'application/vnd.gooddata.api+json'
                ]
            },
            api_client=api_client,
            callable=__create_entity_analytical_dashboards
        )
        def __create_entity_filter_contexts(
            self,
            workspace_id,
            json_api_filter_context_in_document,
            **kwargs
        ):
            """create_entity_filter_contexts  # noqa: E501

            This method makes a synchronous HTTP request by default. To make an
            asynchronous HTTP request, please pass async_req=True

            >>> thread = api.create_entity_filter_contexts(workspace_id, json_api_filter_context_in_document, async_req=True)
            >>> result = thread.get()

            Args:
                workspace_id (str):
                json_api_filter_context_in_document (JsonApiFilterContextInDocument):

            Keyword Args:
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together.. [optional]
                _return_http_data_only (bool): response data without head status
                    code and headers. Default is True.
                _preload_content (bool): if False, the urllib3.HTTPResponse object
                    will be returned without reading/decoding response data.
                    Default is True.
                _request_timeout (int/float/tuple): timeout setting for this request. If
                    one number provided, it will be total request timeout. It can also
                    be a pair (tuple) of (connection, read) timeouts.
                    Default is None.
                _check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
                    Default is True.
                _check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
                    Default is True.
                _host_index (int/None): specifies the index of the server
                    that we want to use.
                    Default is read from the configuration.
                async_req (bool): execute request asynchronously

            Returns:
                JsonApiFilterContextOutDocument
                    If the method is called asynchronously, returns the request
                    thread.
            """
            kwargs['async_req'] = kwargs.get(
                'async_req', False
            )
            kwargs['_return_http_data_only'] = kwargs.get(
                '_return_http_data_only', True
            )
            kwargs['_preload_content'] = kwargs.get(
                '_preload_content', True
            )
            kwargs['_request_timeout'] = kwargs.get(
                '_request_timeout', None
            )
            kwargs['_check_input_type'] = kwargs.get(
                '_check_input_type', True
            )
            kwargs['_check_return_type'] = kwargs.get(
                '_check_return_type', True
            )
            kwargs['_host_index'] = kwargs.get('_host_index')
            kwargs['workspace_id'] = \
                workspace_id
            kwargs['json_api_filter_context_in_document'] = \
                json_api_filter_context_in_document
            return self.call_with_http_info(**kwargs)

        self.create_entity_filter_contexts = _Endpoint(
            settings={
                'response_type': (JsonApiFilterContextOutDocument,),
                'auth': [],
                'endpoint_path': '/api/entities/workspaces/{workspaceId}/filterContexts',
                'operation_id': 'create_entity_filter_contexts',
                'http_method': 'POST',
                'servers': None,
            },
            params_map={
                'all': [
                    'workspace_id',
                    'json_api_filter_context_in_document',
                    'include',
                ],
                'required': [
                    'workspace_id',
                    'json_api_filter_context_in_document',
                ],
                'nullable': [
                ],
                'enum': [
                    'include',
                ],
                'validation': [
                ]
            },
            root_map={
                'validations': {
                },
                'allowed_values': {
                    ('include',): {
                        "ATTRIBUTES": "attributes",
                        "DATASETS": "datasets",
                        "LABELS": "labels",
                        "ALL": "ALL"
                    },
                },
                'openapi_types': {
                    'workspace_id':
                        (str,),
                    'json_api_filter_context_in_document':
                        (JsonApiFilterContextInDocument,),
                    'include':
                        ([str],),
                },
                'attribute_map': {
                    'workspace_id': 'workspaceId',
                    'include': 'include',
                },
                'location_map': {
                    'workspace_id': 'path',
                    'json_api_filter_context_in_document': 'body',
                    'include': 'query',
                },
                'collection_format_map': {
                    'include': 'csv',
                }
            },
            headers_map={
                'accept': [
                    'application/vnd.gooddata.api+json'
                ],
                'content_type': [
                    'application/vnd.gooddata.api+json'
                ]
            },
            api_client=api_client,
            callable=__create_entity_filter_contexts
        )
def __create_entity_metrics(
self,
workspace_id,
json_api_metric_in_document,
**kwargs
):
"""create_entity_metrics # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_entity_metrics(workspace_id, json_api_metric_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
json_api_metric_in_document (JsonApiMetricInDocument):
Keyword Args:
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together.. [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiMetricOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['json_api_metric_in_document'] = \
json_api_metric_in_document
return self.call_with_http_info(**kwargs)
self.create_entity_metrics = _Endpoint(
settings={
'response_type': (JsonApiMetricOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/metrics',
'operation_id': 'create_entity_metrics',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'json_api_metric_in_document',
'include',
],
'required': [
'workspace_id',
'json_api_metric_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'json_api_metric_in_document':
(JsonApiMetricInDocument,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'json_api_metric_in_document': 'body',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__create_entity_metrics
)
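# Each generated callable above starts with the same block of
# ``kwargs.get(...)`` calls to fill in standard request options. The helper
# below is a hypothetical, standalone re-statement of that defaulting
# pattern for illustration only; it is not part of the generated client.
#
# def apply_default_call_options(kwargs):
#     ...

The defaulting pattern repeated at the top of every private callable can be sketched in isolation. The helper name below is an assumption for illustration, not a function the generator emits:

```python
def apply_default_call_options(kwargs):
    """Fill in the standard request options only when the caller omitted them,
    mirroring the repeated kwargs.get(...) blocks in each generated callable."""
    defaults = {
        'async_req': False,
        '_return_http_data_only': True,
        '_preload_content': True,
        '_request_timeout': None,
        '_check_input_type': True,
        '_check_return_type': True,
    }
    for key, value in defaults.items():
        # Caller-supplied values win; otherwise the documented default applies.
        kwargs[key] = kwargs.get(key, value)
    # _host_index has no default other than None (read from configuration later).
    kwargs['_host_index'] = kwargs.get('_host_index')
    return kwargs
```

Explicitly passed options are preserved, so callers can, for example, set `async_req=True` without disturbing the other defaults.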
def __create_entity_visualization_objects(
self,
workspace_id,
json_api_visualization_object_in_document,
**kwargs
):
"""create_entity_visualization_objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_entity_visualization_objects(workspace_id, json_api_visualization_object_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
json_api_visualization_object_in_document (JsonApiVisualizationObjectInDocument):
Keyword Args:
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiVisualizationObjectOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['json_api_visualization_object_in_document'] = \
json_api_visualization_object_in_document
return self.call_with_http_info(**kwargs)
self.create_entity_visualization_objects = _Endpoint(
settings={
'response_type': (JsonApiVisualizationObjectOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/visualizationObjects',
'operation_id': 'create_entity_visualization_objects',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'json_api_visualization_object_in_document',
'include',
],
'required': [
'workspace_id',
'json_api_visualization_object_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"DATASETS": "datasets",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'json_api_visualization_object_in_document':
(JsonApiVisualizationObjectInDocument,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'json_api_visualization_object_in_document': 'body',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__create_entity_visualization_objects
)
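The `collection_format_map` entries above mark `include` as `csv`, meaning a list value is sent as a single comma-separated query parameter. A minimal sketch of that serialization; the function name is an assumption, not the generator's real helper:

```python
def serialize_csv_param(values):
    """Join a list-valued query parameter the way the 'csv' collection
    format does: one parameter, values separated by commas."""
    return ",".join(values)
```

So `include=['facts', 'labels']` becomes the query string fragment `include=facts,labels`, matching the `include=entity1s,entity2s` example in the docstrings.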
def __create_entity_workspace_data_filters(
self,
workspace_id,
json_api_workspace_data_filter_in_document,
**kwargs
):
"""create_entity_workspace_data_filters # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_entity_workspace_data_filters(workspace_id, json_api_workspace_data_filter_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
json_api_workspace_data_filter_in_document (JsonApiWorkspaceDataFilterInDocument):
Keyword Args:
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiWorkspaceDataFilterOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['json_api_workspace_data_filter_in_document'] = \
json_api_workspace_data_filter_in_document
return self.call_with_http_info(**kwargs)
self.create_entity_workspace_data_filters = _Endpoint(
settings={
'response_type': (JsonApiWorkspaceDataFilterOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/workspaceDataFilters',
'operation_id': 'create_entity_workspace_data_filters',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'json_api_workspace_data_filter_in_document',
'include',
],
'required': [
'workspace_id',
'json_api_workspace_data_filter_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"WORKSPACEDATAFILTERSETTINGS": "workspaceDataFilterSettings",
"FILTERSETTINGS": "filterSettings",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'json_api_workspace_data_filter_in_document':
(JsonApiWorkspaceDataFilterInDocument,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'json_api_workspace_data_filter_in_document': 'body',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__create_entity_workspace_data_filters
)
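Enum query parameters such as `include` are validated against the endpoint's `allowed_values` before a request is sent. A hypothetical standalone version of that check, reusing the values from the workspace-data-filter endpoint above (the validation function itself is an illustrative assumption):

```python
# Allowed 'include' values copied from the endpoint definition above.
ALLOWED_WDF_INCLUDE = {"workspaceDataFilterSettings", "filterSettings", "ALL"}

def validate_include(values, allowed=ALLOWED_WDF_INCLUDE):
    """Reject any include value not listed in the endpoint's allowed_values."""
    invalid = [v for v in values if v not in allowed]
    if invalid:
        raise ValueError("invalid include values: %s" % ", ".join(invalid))
    return values
```

Per the docstring warning, `ALL` stands alone and should not be combined with individual include types; the sketch above only checks membership, as the generated client does.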
def __delete_entity_analytical_dashboards(
self,
workspace_id,
object_id,
**kwargs
):
"""delete_entity_analytical_dashboards # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_entity_analytical_dashboards(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.delete_entity_analytical_dashboards = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/analyticalDashboards/{objectId}',
'operation_id': 'delete_entity_analytical_dashboards',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__delete_entity_analytical_dashboards
)
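The `filter` parameter documented above takes an RSQL expression, where `;` joins conditions with logical AND. A tiny hypothetical helper (not part of the client) that builds such an expression, matching the docstring's own sample:

```python
def rsql_and(*conditions):
    """Combine RSQL conditions with ';' (logical AND)."""
    return ";".join(conditions)
```

For instance, combining a title and a description condition reproduces the documented example string exactly.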
def __delete_entity_filter_contexts(
self,
workspace_id,
object_id,
**kwargs
):
"""delete_entity_filter_contexts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_entity_filter_contexts(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.delete_entity_filter_contexts = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/filterContexts/{objectId}',
'operation_id': 'delete_entity_filter_contexts',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__delete_entity_filter_contexts
)
def __delete_entity_metrics(
self,
workspace_id,
object_id,
**kwargs
):
"""delete_entity_metrics # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_entity_metrics(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.delete_entity_metrics = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/metrics/{objectId}',
'operation_id': 'delete_entity_metrics',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__delete_entity_metrics
)
def __delete_entity_visualization_objects(
self,
workspace_id,
object_id,
**kwargs
):
"""delete_entity_visualization_objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_entity_visualization_objects(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.delete_entity_visualization_objects = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/visualizationObjects/{objectId}',
'operation_id': 'delete_entity_visualization_objects',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__delete_entity_visualization_objects
)
def __delete_entity_workspace_data_filters(
self,
workspace_id,
object_id,
**kwargs
):
"""delete_entity_workspace_data_filters # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_entity_workspace_data_filters(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.delete_entity_workspace_data_filters = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/workspaceDataFilters/{objectId}',
'operation_id': 'delete_entity_workspace_data_filters',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__delete_entity_workspace_data_filters
)
def __get_all_entities_analytical_dashboards(
self,
workspace_id,
**kwargs
):
"""get_all_entities_analytical_dashboards # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_analytical_dashboards(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiAnalyticalDashboardOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_analytical_dashboards = _Endpoint(
settings={
'response_type': (JsonApiAnalyticalDashboardOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/analyticalDashboards',
'operation_id': 'get_all_entities_analytical_dashboards',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"VISUALIZATIONOBJECTS": "visualizationObjects",
"ANALYTICALDASHBOARDS": "analyticalDashboards",
"LABELS": "labels",
"METRICS": "metrics",
"DATASETS": "datasets",
"FILTERCONTEXTS": "filterContexts",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_analytical_dashboards
)
def __get_all_entities_attributes(
self,
workspace_id,
**kwargs
):
"""get_all_entities_attributes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_attributes(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiAttributeOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_attributes = _Endpoint(
settings={
'response_type': (JsonApiAttributeOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/attributes',
'operation_id': 'get_all_entities_attributes',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"DATASETS": "datasets",
"LABELS": "labels",
"DATASET": "dataset",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_attributes
)
def __get_all_entities_datasets(
self,
workspace_id,
**kwargs
):
"""get_all_entities_datasets # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_datasets(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiDatasetOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_datasets = _Endpoint(
settings={
'response_type': (JsonApiDatasetOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/datasets',
'operation_id': 'get_all_entities_datasets',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"ATTRIBUTES": "attributes",
"FACTS": "facts",
"DATASETS": "datasets",
"REFERENCES": "references",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_datasets
)
def __get_all_entities_facts(
self,
workspace_id,
**kwargs
):
"""get_all_entities_facts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_facts(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiFactOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_facts = _Endpoint(
settings={
'response_type': (JsonApiFactOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/facts',
'operation_id': 'get_all_entities_facts',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"DATASETS": "datasets",
"DATASET": "dataset",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_facts
)
def __get_all_entities_filter_contexts(
self,
workspace_id,
**kwargs
):
"""get_all_entities_filter_contexts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_filter_contexts(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiFilterContextOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_filter_contexts = _Endpoint(
settings={
'response_type': (JsonApiFilterContextOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/filterContexts',
'operation_id': 'get_all_entities_filter_contexts',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"ATTRIBUTES": "attributes",
"DATASETS": "datasets",
"LABELS": "labels",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_filter_contexts
)
def __get_all_entities_labels(
self,
workspace_id,
**kwargs
):
"""get_all_entities_labels # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_labels(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiLabelOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_labels = _Endpoint(
settings={
'response_type': (JsonApiLabelOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/labels',
'operation_id': 'get_all_entities_labels',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"ATTRIBUTES": "attributes",
"ATTRIBUTE": "attribute",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_labels
)
def __get_all_entities_metrics(
self,
workspace_id,
**kwargs
):
"""get_all_entities_metrics # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_metrics(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): return the response data only,
without the status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiMetricOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_metrics = _Endpoint(
settings={
'response_type': (JsonApiMetricOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/metrics',
'operation_id': 'get_all_entities_metrics',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_metrics
)
def __get_all_entities_visualization_objects(
self,
workspace_id,
**kwargs
):
"""get_all_entities_visualization_objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_visualization_objects(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
                predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
                filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
                sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
                _return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiVisualizationObjectOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_visualization_objects = _Endpoint(
settings={
'response_type': (JsonApiVisualizationObjectOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/visualizationObjects',
'operation_id': 'get_all_entities_visualization_objects',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"DATASETS": "datasets",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_visualization_objects
)
def __get_all_entities_workspace_data_filter_settings(
self,
workspace_id,
**kwargs
):
"""get_all_entities_workspace_data_filter_settings # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_workspace_data_filter_settings(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
                predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
                filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
                sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
                _return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiWorkspaceDataFilterSettingOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_workspace_data_filter_settings = _Endpoint(
settings={
'response_type': (JsonApiWorkspaceDataFilterSettingOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/workspaceDataFilterSettings',
'operation_id': 'get_all_entities_workspace_data_filter_settings',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"WORKSPACEDATAFILTERS": "workspaceDataFilters",
"WORKSPACEDATAFILTER": "workspaceDataFilter",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_workspace_data_filter_settings
)
def __get_all_entities_workspace_data_filters(
self,
workspace_id,
**kwargs
):
"""get_all_entities_workspace_data_filters # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_entities_workspace_data_filters(workspace_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
Keyword Args:
                predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
                filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
page (int): Zero-based page index (0..N). [optional] if omitted the server will use the default value of 0
size (int): The size of the page to be returned. [optional] if omitted the server will use the default value of 20
                sort ([str]): Sorting criteria in the format: property(,asc|desc). Default sort order is ascending. Multiple sort criteria are supported. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
                _return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiWorkspaceDataFilterOutList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
return self.call_with_http_info(**kwargs)
self.get_all_entities_workspace_data_filters = _Endpoint(
settings={
'response_type': (JsonApiWorkspaceDataFilterOutList,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/workspaceDataFilters',
'operation_id': 'get_all_entities_workspace_data_filters',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'predicate',
'filter',
'include',
'page',
'size',
'sort',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"WORKSPACEDATAFILTERSETTINGS": "workspaceDataFilterSettings",
"FILTERSETTINGS": "filterSettings",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'page':
(int,),
'size':
(int,),
'sort':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'page': 'page',
'size': 'size',
'sort': 'sort',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'page': 'query',
'size': 'query',
'sort': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
'sort': 'multi',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_all_entities_workspace_data_filters
)
def __get_entity_analytical_dashboards(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_analytical_dashboards # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_analytical_dashboards(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
                predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
                filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
                _return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiAnalyticalDashboardOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_analytical_dashboards = _Endpoint(
settings={
'response_type': (JsonApiAnalyticalDashboardOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/analyticalDashboards/{objectId}',
'operation_id': 'get_entity_analytical_dashboards',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"VISUALIZATIONOBJECTS": "visualizationObjects",
"ANALYTICALDASHBOARDS": "analyticalDashboards",
"LABELS": "labels",
"METRICS": "metrics",
"DATASETS": "datasets",
"FILTERCONTEXTS": "filterContexts",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_analytical_dashboards
)
def __get_entity_attributes(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_attributes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_attributes(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
                predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
                filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
                _return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiAttributeOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_attributes = _Endpoint(
settings={
'response_type': (JsonApiAttributeOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/attributes/{objectId}',
'operation_id': 'get_entity_attributes',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"DATASETS": "datasets",
"LABELS": "labels",
"DATASET": "dataset",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_attributes
)
def __get_entity_datasets(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_datasets # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_datasets(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
                predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
                filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
                _return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
                    should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
                    should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiDatasetOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_datasets = _Endpoint(
settings={
'response_type': (JsonApiDatasetOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/datasets/{objectId}',
'operation_id': 'get_entity_datasets',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"ATTRIBUTES": "attributes",
"FACTS": "facts",
"DATASETS": "datasets",
"REFERENCES": "references",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_datasets
)
def __get_entity_facts(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_facts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_facts(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
                predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description,...). You can specify any object parameter and parameter of related entity up to 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
                filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and parameter of related entity up to 2nd level (for example title=='Some Title';description=='desc'). [optional]
                include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). Collection include represents the inclusion of every relationship between this entity and the given collection. Relationship include represents the inclusion of the particular relationships only. If single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiFactOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_facts = _Endpoint(
settings={
'response_type': (JsonApiFactOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/facts/{objectId}',
'operation_id': 'get_entity_facts',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"DATASETS": "datasets",
"DATASET": "dataset",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_facts
)
def __get_entity_filter_contexts(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_filter_contexts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_filter_contexts(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships, separated by commas (e.g. include=entity1s,entity2s). A collection include adds every relationship between this entity and the given collection; a relationship include adds only the particular relationships named. If the single value \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiFilterContextOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_filter_contexts = _Endpoint(
settings={
'response_type': (JsonApiFilterContextOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/filterContexts/{objectId}',
'operation_id': 'get_entity_filter_contexts',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"ATTRIBUTES": "attributes",
"DATASETS": "datasets",
"LABELS": "labels",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_filter_contexts
)
def __get_entity_labels(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_labels # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_labels(workspace_id, object_id, async_req=True)
>>> result = thread.get()
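Results can also be narrowed with the RSQL filter parameter; a sketch
with placeholder identifiers and a placeholder title:
>>> label = api.get_entity_labels(
...     'my-workspace', 'my-label',
...     filter="title=='Country'",
... )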
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships, separated by commas (e.g. include=entity1s,entity2s). A collection include adds every relationship between this entity and the given collection; a relationship include adds only the particular relationships named. If the single value \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiLabelOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_labels = _Endpoint(
settings={
'response_type': (JsonApiLabelOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/labels/{objectId}',
'operation_id': 'get_entity_labels',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"ATTRIBUTES": "attributes",
"ATTRIBUTE": "attribute",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_labels
)
def __get_entity_metrics(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_metrics # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_metrics(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships, separated by commas (e.g. include=entity1s,entity2s). A collection include adds every relationship between this entity and the given collection; a relationship include adds only the particular relationships named. If the single value \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiMetricOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_metrics = _Endpoint(
settings={
'response_type': (JsonApiMetricOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/metrics/{objectId}',
'operation_id': 'get_entity_metrics',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_metrics
)
def __get_entity_visualization_objects(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_visualization_objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_visualization_objects(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships, separated by commas (e.g. include=entity1s,entity2s). A collection include adds every relationship between this entity and the given collection; a relationship include adds only the particular relationships named. If the single value \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiVisualizationObjectOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_visualization_objects = _Endpoint(
settings={
'response_type': (JsonApiVisualizationObjectOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/visualizationObjects/{objectId}',
'operation_id': 'get_entity_visualization_objects',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"DATASETS": "datasets",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_visualization_objects
)
def __get_entity_workspace_data_filter_settings(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_workspace_data_filter_settings # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_workspace_data_filter_settings(workspace_id, object_id, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships, separated by commas (e.g. include=entity1s,entity2s). A collection include adds every relationship between this entity and the given collection; a relationship include adds only the particular relationships named. If the single value \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiWorkspaceDataFilterSettingOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_workspace_data_filter_settings = _Endpoint(
settings={
'response_type': (JsonApiWorkspaceDataFilterSettingOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/workspaceDataFilterSettings/{objectId}',
'operation_id': 'get_entity_workspace_data_filter_settings',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"WORKSPACEDATAFILTERS": "workspaceDataFilters",
"WORKSPACEDATAFILTER": "workspaceDataFilter",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_workspace_data_filter_settings
)
def __get_entity_workspace_data_filters(
self,
workspace_id,
object_id,
**kwargs
):
"""get_entity_workspace_data_filters # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_entity_workspace_data_filters(workspace_id, object_id, async_req=True)
>>> result = thread.get()
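To inspect the raw urllib3.HTTPResponse instead of a deserialized
document, a sketch with placeholder identifiers:
>>> raw = api.get_entity_workspace_data_filters(
...     'my-workspace', 'my-filter',
...     _preload_content=False,
... )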
Args:
workspace_id (str):
object_id (str):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships, separated by commas (e.g. include=entity1s,entity2s). A collection include adds every relationship between this entity and the given collection; a relationship include adds only the particular relationships named. If the single value \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
x_gdc_validate_relations (bool): [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiWorkspaceDataFilterOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
return self.call_with_http_info(**kwargs)
self.get_entity_workspace_data_filters = _Endpoint(
settings={
'response_type': (JsonApiWorkspaceDataFilterOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/workspaceDataFilters/{objectId}',
'operation_id': 'get_entity_workspace_data_filters',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'predicate',
'filter',
'include',
'x_gdc_validate_relations',
],
'required': [
'workspace_id',
'object_id',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"WORKSPACEDATAFILTERSETTINGS": "workspaceDataFilterSettings",
"FILTERSETTINGS": "filterSettings",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
'x_gdc_validate_relations':
(bool,),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
'x_gdc_validate_relations': 'X-GDC-VALIDATE-RELATIONS',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'predicate': 'query',
'filter': 'query',
'include': 'query',
'x_gdc_validate_relations': 'header',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_entity_workspace_data_filters
)
def __update_entity_analytical_dashboards(
self,
workspace_id,
object_id,
json_api_analytical_dashboard_in_document,
**kwargs
):
"""update_entity_analytical_dashboards # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_entity_analytical_dashboards(workspace_id, object_id, json_api_analytical_dashboard_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
json_api_analytical_dashboard_in_document (JsonApiAnalyticalDashboardInDocument):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiAnalyticalDashboardOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
kwargs['json_api_analytical_dashboard_in_document'] = \
json_api_analytical_dashboard_in_document
return self.call_with_http_info(**kwargs)
self.update_entity_analytical_dashboards = _Endpoint(
settings={
'response_type': (JsonApiAnalyticalDashboardOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/analyticalDashboards/{objectId}',
'operation_id': 'update_entity_analytical_dashboards',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'json_api_analytical_dashboard_in_document',
'predicate',
'filter',
'include',
],
'required': [
'workspace_id',
'object_id',
'json_api_analytical_dashboard_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"VISUALIZATIONOBJECTS": "visualizationObjects",
"ANALYTICALDASHBOARDS": "analyticalDashboards",
"LABELS": "labels",
"METRICS": "metrics",
"DATASETS": "datasets",
"FILTERCONTEXTS": "filterContexts",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'json_api_analytical_dashboard_in_document':
(JsonApiAnalyticalDashboardInDocument,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'json_api_analytical_dashboard_in_document': 'body',
'predicate': 'query',
'filter': 'query',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__update_entity_analytical_dashboards
)
def __update_entity_filter_contexts(
self,
workspace_id,
object_id,
json_api_filter_context_in_document,
**kwargs
):
"""update_entity_filter_contexts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_entity_filter_contexts(workspace_id, object_id, json_api_filter_context_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
json_api_filter_context_in_document (JsonApiFilterContextInDocument):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiFilterContextOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
kwargs['json_api_filter_context_in_document'] = \
json_api_filter_context_in_document
return self.call_with_http_info(**kwargs)
self.update_entity_filter_contexts = _Endpoint(
settings={
'response_type': (JsonApiFilterContextOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/filterContexts/{objectId}',
'operation_id': 'update_entity_filter_contexts',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'json_api_filter_context_in_document',
'predicate',
'filter',
'include',
],
'required': [
'workspace_id',
'object_id',
'json_api_filter_context_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"ATTRIBUTES": "attributes",
"DATASETS": "datasets",
"LABELS": "labels",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'json_api_filter_context_in_document':
(JsonApiFilterContextInDocument,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'json_api_filter_context_in_document': 'body',
'predicate': 'query',
'filter': 'query',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__update_entity_filter_contexts
)
def __update_entity_metrics(
self,
workspace_id,
object_id,
json_api_metric_in_document,
**kwargs
):
"""update_entity_metrics # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_entity_metrics(workspace_id, object_id, json_api_metric_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
json_api_metric_in_document (JsonApiMetricInDocument):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiMetricOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
kwargs['json_api_metric_in_document'] = \
json_api_metric_in_document
return self.call_with_http_info(**kwargs)
self.update_entity_metrics = _Endpoint(
settings={
'response_type': (JsonApiMetricOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/metrics/{objectId}',
'operation_id': 'update_entity_metrics',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'json_api_metric_in_document',
'predicate',
'filter',
'include',
],
'required': [
'workspace_id',
'object_id',
'json_api_metric_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'json_api_metric_in_document':
(JsonApiMetricInDocument,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'json_api_metric_in_document': 'body',
'predicate': 'query',
'filter': 'query',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__update_entity_metrics
)
def __update_entity_visualization_objects(
self,
workspace_id,
object_id,
json_api_visualization_object_in_document,
**kwargs
):
"""update_entity_visualization_objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_entity_visualization_objects(workspace_id, object_id, json_api_visualization_object_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
json_api_visualization_object_in_document (JsonApiVisualizationObjectInDocument):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiVisualizationObjectOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
kwargs['json_api_visualization_object_in_document'] = \
json_api_visualization_object_in_document
return self.call_with_http_info(**kwargs)
self.update_entity_visualization_objects = _Endpoint(
settings={
'response_type': (JsonApiVisualizationObjectOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/visualizationObjects/{objectId}',
'operation_id': 'update_entity_visualization_objects',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'json_api_visualization_object_in_document',
'predicate',
'filter',
'include',
],
'required': [
'workspace_id',
'object_id',
'json_api_visualization_object_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"FACTS": "facts",
"ATTRIBUTES": "attributes",
"LABELS": "labels",
"METRICS": "metrics",
"DATASETS": "datasets",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'json_api_visualization_object_in_document':
(JsonApiVisualizationObjectInDocument,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'json_api_visualization_object_in_document': 'body',
'predicate': 'query',
'filter': 'query',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__update_entity_visualization_objects
)
def __update_entity_workspace_data_filters(
self,
workspace_id,
object_id,
json_api_workspace_data_filter_in_document,
**kwargs
):
"""update_entity_workspace_data_filters # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_entity_workspace_data_filters(workspace_id, object_id, json_api_workspace_data_filter_in_document, async_req=True)
>>> result = thread.get()
Args:
workspace_id (str):
object_id (str):
json_api_workspace_data_filter_in_document (JsonApiWorkspaceDataFilterInDocument):
Keyword Args:
predicate ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Composed query parameters used for filtering. The 'id' parameter can be used for all objects. Other parameters are present according to object type (title, description, ...). You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example name=John&language=english,czech&address.city=London&father.id=123). [optional]
filter (str): Filtering parameter in RSQL. See https://github.com/jirutka/rsql-parser. You can specify any object parameter and any parameter of a related entity up to the 2nd level (for example title=='Some Title';description=='desc'). [optional]
include ([str]): Array of included collections or individual relationships. Includes are separated by commas (e.g. include=entity1s,entity2s). A collection include represents the inclusion of every relationship between this entity and the given collection. A relationship include represents the inclusion of the particular relationships only. If the single parameter \"ALL\" is present, all possible includes are used (include=ALL). __WARNING:__ Individual include types (collection, relationship or ALL) cannot be combined together. [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding the response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
a single number is provided, it is used as the total request timeout.
It can also be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies whether type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies whether type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
to use. Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
JsonApiWorkspaceDataFilterOutDocument
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['workspace_id'] = \
workspace_id
kwargs['object_id'] = \
object_id
kwargs['json_api_workspace_data_filter_in_document'] = \
json_api_workspace_data_filter_in_document
return self.call_with_http_info(**kwargs)
self.update_entity_workspace_data_filters = _Endpoint(
settings={
'response_type': (JsonApiWorkspaceDataFilterOutDocument,),
'auth': [],
'endpoint_path': '/api/entities/workspaces/{workspaceId}/workspaceDataFilters/{objectId}',
'operation_id': 'update_entity_workspace_data_filters',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'workspace_id',
'object_id',
'json_api_workspace_data_filter_in_document',
'predicate',
'filter',
'include',
],
'required': [
'workspace_id',
'object_id',
'json_api_workspace_data_filter_in_document',
],
'nullable': [
],
'enum': [
'include',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('include',): {
"WORKSPACEDATAFILTERSETTINGS": "workspaceDataFilterSettings",
"FILTERSETTINGS": "filterSettings",
"ALL": "ALL"
},
},
'openapi_types': {
'workspace_id':
(str,),
'object_id':
(str,),
'json_api_workspace_data_filter_in_document':
(JsonApiWorkspaceDataFilterInDocument,),
'predicate':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
'filter':
(str,),
'include':
([str],),
},
'attribute_map': {
'workspace_id': 'workspaceId',
'object_id': 'objectId',
'predicate': 'predicate',
'filter': 'filter',
'include': 'include',
},
'location_map': {
'workspace_id': 'path',
'object_id': 'path',
'json_api_workspace_data_filter_in_document': 'body',
'predicate': 'query',
'filter': 'query',
'include': 'query',
},
'collection_format_map': {
'include': 'csv',
}
},
headers_map={
'accept': [
'application/vnd.gooddata.api+json'
],
'content_type': [
'application/vnd.gooddata.api+json'
]
},
api_client=api_client,
callable=__update_entity_workspace_data_filters
)
| 45.485426 | 554 | 0.500803 | 22,394 | 252,808 | 5.428731 | 0.014379 | 0.031669 | 0.014971 | 0.020729 | 0.973736 | 0.962269 | 0.958477 | 0.952612 | 0.939994 | 0.928692 | 0 | 0.002714 | 0.412689 | 252,808 | 5,557 | 555 | 45.493612 | 0.816073 | 0.374347 | 0 | 0.77926 | 1 | 0 | 0.248291 | 0.07802 | 0 | 0 | 0 | 0 | 0 | 1 | 0.009061 | false | 0 | 0.007299 | 0 | 0.025422 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# Source: packages/augur-core/tests/trading/test_orders.py (joeykrug/augur, MIT license)
from eth_tester.exceptions import TransactionFailed
from pytest import mark, raises
from utils import fix, longTo32Bytes, longToHexString, nullAddress, stringToBytes
from constants import BID, ASK, YES, NO
WEI_TO_ETH = 10**18
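The tests below exercise the Orders contract's `descendOrderList`/`ascendOrderList` helpers, which return the (better, worse) neighbour pair for a new price in a price-sorted order list: bids are kept best-first in descending price order, asks in ascending order, with `longTo32Bytes(0)` marking a missing neighbour. A minimal pure-Python sketch of that lookup (illustrative only; `find_neighbors` is a hypothetical helper, not part of the contracts):

```python
def find_neighbors(prices, new_price, is_bid):
    """Return (better, worse) indices for inserting new_price into prices.

    prices holds existing order prices in book order, best first:
    descending for bids, ascending for asks. None plays the role of
    longTo32Bytes(0) in the contract tests below.
    """
    better = None
    for i, price in enumerate(prices):
        # For bids, ties and higher prices count as "better"; for asks, lower ones.
        if (price >= new_price) if is_bid else (price <= new_price):
            better = i
        else:
            break
    worse = 0 if better is None else better + 1
    if worse >= len(prices):
        worse = None
    return better, worse

# Mirrors descendOrderList(BID, 5950, ...) with bids resting at 6000 and 5900:
# the new bid slots between them, so both neighbours are returned.
assert find_neighbors([6000, 5900], 5950, is_bid=True) == (0, 1)
```

In the contract the same walk is done over linked order IDs rather than list indices, which is why the asserts compare against `[orderId, longTo32Bytes(0)]` pairs.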
def test_walkOrderList_bids(contractsFixture, market):
orders = contractsFixture.contracts['Orders']
outcomeID = 1
order = {
"orderID": longTo32Bytes(5),
"type": BID,
"amount": fix('1'),
"price": 6000,
"sender": contractsFixture.accounts[0],
"outcome": outcomeID,
"moneyEscrowed": fix('6000'),
"sharesEscrowed": 0,
"betterOrderID": longTo32Bytes(0),
"worseOrderID": longTo32Bytes(0),
"tradeGroupID": stringToBytes("0"),
"kycToken": nullAddress
}
uints = [order["amount"], order["price"], order["outcome"], order["moneyEscrowed"], order["sharesEscrowed"]]
bytes32s = [order["betterOrderID"], order["worseOrderID"], order["tradeGroupID"], stringToBytes("0")]
orderId5 = orders.testSaveOrder(uints, bytes32s, order["type"], market.address, order["sender"], order["kycToken"])
assert(orderId5 != bytearray(32)), "Save order"
bestOrderID = orders.getBestOrderId(BID, market.address, outcomeID, nullAddress)
worstOrderID = orders.getWorstOrderId(BID, market.address, outcomeID, nullAddress)
assert(bestOrderID == orderId5)
assert(worstOrderID == orderId5)
# walk down order list starting from bestOrderID
assert(orders.descendOrderList(BID, 6000, bestOrderID) == [orderId5, longTo32Bytes(0)])
assert(orders.descendOrderList(BID, 5900, bestOrderID) == [orderId5, longTo32Bytes(0)])
assert(orders.descendOrderList(BID, 6100, bestOrderID) == [longTo32Bytes(0), orderId5])
assert(orders.descendOrderList(BID, 5800, bestOrderID) == [orderId5, longTo32Bytes(0)])
assert(orders.descendOrderList(BID, 5950, bestOrderID) == [orderId5, longTo32Bytes(0)])
# walk up order list starting from worstOrderID
assert(orders.ascendOrderList(BID, 6000, worstOrderID) == [orderId5, longTo32Bytes(0)])
assert(orders.ascendOrderList(BID, 5900, worstOrderID) == [orderId5, longTo32Bytes(0)])
assert(orders.ascendOrderList(BID, 6100, worstOrderID) == [longTo32Bytes(0), orderId5])
assert(orders.ascendOrderList(BID, 5800, worstOrderID) == [orderId5, longTo32Bytes(0)])
assert(orders.ascendOrderList(BID, 5950, bestOrderID) == [orderId5, longTo32Bytes(0)])
order = {
"orderID": longTo32Bytes(6),
"type": BID,
"amount": fix('1'),
"price": 5900,
"sender": contractsFixture.accounts[0],
"outcome": outcomeID,
"moneyEscrowed": fix('5900'),
"sharesEscrowed": 0,
"betterOrderID": longTo32Bytes(0),
"worseOrderID": longTo32Bytes(0),
"tradeGroupID": stringToBytes("0"),
"kycToken": nullAddress
}
uints = [order["amount"], order["price"], order["outcome"], order["moneyEscrowed"], order["sharesEscrowed"]]
bytes32s = [order["betterOrderID"], order["worseOrderID"], order["tradeGroupID"], stringToBytes("0")]
orderId6 = orders.testSaveOrder(uints, bytes32s, order["type"], market.address, order["sender"], order["kycToken"])
assert(orderId6 != bytearray(32)), "Save order"
bestOrderID = orders.getBestOrderId(BID, market.address, outcomeID, nullAddress)
worstOrderID = orders.getWorstOrderId(BID, market.address, outcomeID, nullAddress)
assert(bestOrderID == orderId5)
assert(worstOrderID == orderId6)
# walk down order list starting from bestOrderID
assert(orders.descendOrderList(BID, 6000, bestOrderID) == [orderId5, orderId6])
assert(orders.descendOrderList(BID, 5900, bestOrderID) == [orderId6, longTo32Bytes(0)])
assert(orders.descendOrderList(BID, 6100, bestOrderID) == [longTo32Bytes(0), orderId5])
assert(orders.descendOrderList(BID, 5800, bestOrderID) == [orderId6, longTo32Bytes(0)])
assert(orders.descendOrderList(BID, 5950, bestOrderID) == [orderId5, orderId6])
# walk up order list starting from worstOrderID
assert(orders.ascendOrderList(BID, 6000, worstOrderID) == [orderId5, orderId6])
assert(orders.ascendOrderList(BID, 5900, worstOrderID) == [orderId6, longTo32Bytes(0)])
assert(orders.ascendOrderList(BID, 6100, worstOrderID) == [longTo32Bytes(0), orderId5])
assert(orders.ascendOrderList(BID, 5800, worstOrderID) == [orderId6, longTo32Bytes(0)])
assert(orders.ascendOrderList(BID, 5950, bestOrderID) == [orderId5, orderId6])
order = {
"orderID": longTo32Bytes(7),
"type": BID,
"amount": fix('1'),
"price": 5950,
"sender": contractsFixture.accounts[0],
"outcome": outcomeID,
"moneyEscrowed": fix('5950'),
"sharesEscrowed": 0,
"betterOrderID": longTo32Bytes(0),
"worseOrderID": longTo32Bytes(0),
"tradeGroupID": stringToBytes("0"),
"kycToken": nullAddress
}
uints = [order["amount"], order["price"], order["outcome"], order["moneyEscrowed"], order["sharesEscrowed"]]
bytes32s = [order["betterOrderID"], order["worseOrderID"], order["tradeGroupID"], stringToBytes("0")]
orderId7 = orders.testSaveOrder(uints, bytes32s, order["type"], market.address, order["sender"], order["kycToken"])
assert(orderId7 != bytearray(32)), "Save order"
bestOrderID = orders.getBestOrderId(BID, market.address, outcomeID, nullAddress)
worstOrderID = orders.getWorstOrderId(BID, market.address, outcomeID, nullAddress)
assert(bestOrderID == orderId5)
assert(worstOrderID == orderId6)
# walk down order list starting from bestOrderID
assert(orders.descendOrderList(BID, 6000, bestOrderID) == [orderId5, orderId7])
assert(orders.descendOrderList(BID, 5900, bestOrderID) == [orderId6, longTo32Bytes(0)])
assert(orders.descendOrderList(BID, 6100, bestOrderID) == [longTo32Bytes(0), orderId5])
assert(orders.descendOrderList(BID, 5800, bestOrderID) == [orderId6, longTo32Bytes(0)])
# walk up order list starting from worstOrderID
assert(orders.ascendOrderList(BID, 6000, worstOrderID) == [orderId5, orderId7])
assert(orders.ascendOrderList(BID, 5900, worstOrderID) == [orderId6, longTo32Bytes(0)])
assert(orders.ascendOrderList(BID, 6100, worstOrderID) == [longTo32Bytes(0), orderId5])
assert(orders.ascendOrderList(BID, 5800, worstOrderID) == [orderId6, longTo32Bytes(0)])
assert(orders.testRemoveOrder(orderId5) == 1), "Remove order 5"
assert(orders.testRemoveOrder(orderId6) == 1), "Remove order 6"
assert(orders.testRemoveOrder(orderId7) == 1), "Remove order 7"

def test_walkOrderList_asks(contractsFixture, market):
    orders = contractsFixture.contracts['Orders']
    outcomeID = 1
    order = {
        "orderID": longTo32Bytes(8),
        "type": ASK,
        "amount": fix('1'),
        "price": 6000,
        "sender": contractsFixture.accounts[0],
        "outcome": outcomeID,
        "moneyEscrowed": fix('6000'),
        "sharesEscrowed": 0,
        "betterOrderID": longTo32Bytes(0),
        "worseOrderID": longTo32Bytes(0),
        "tradeGroupID": stringToBytes("0"),
        "kycToken": nullAddress
    }
    uints = [order["amount"], order["price"], order["outcome"], order["moneyEscrowed"], order["sharesEscrowed"]]
    bytes32s = [order["betterOrderID"], order["worseOrderID"], order["tradeGroupID"], stringToBytes("0")]
    orderId8 = orders.testSaveOrder(uints, bytes32s, order["type"], market.address, order["sender"], order["kycToken"])
    assert(orderId8 != bytearray(32)), "Save order"
    bestOrderID = orders.getBestOrderId(ASK, market.address, outcomeID, nullAddress)
    worstOrderID = orders.getWorstOrderId(ASK, market.address, outcomeID, nullAddress)
    assert(bestOrderID == orderId8)
    assert(worstOrderID == orderId8)
    # walk down order list starting from bestOrderID
    assert(orders.descendOrderList(ASK, 6000, bestOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.descendOrderList(ASK, 5900, bestOrderID) == [longTo32Bytes(0), orderId8])
    assert(orders.descendOrderList(ASK, 6100, bestOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.descendOrderList(ASK, 5800, bestOrderID) == [longTo32Bytes(0), orderId8])
    # walk up order list starting from worstOrderID
    assert(orders.ascendOrderList(ASK, 6000, worstOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.ascendOrderList(ASK, 5900, worstOrderID) == [longTo32Bytes(0), orderId8])
    assert(orders.ascendOrderList(ASK, 6100, worstOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.ascendOrderList(ASK, 5800, worstOrderID) == [longTo32Bytes(0), orderId8])
    order = {
        "orderID": longTo32Bytes(9),
        "type": ASK,
        "amount": fix('1'),
        "price": 5900,
        "sender": contractsFixture.accounts[0],
        "outcome": outcomeID,
        "moneyEscrowed": fix('5900'),
        "sharesEscrowed": 0,
        "betterOrderID": longTo32Bytes(0),
        "worseOrderID": longTo32Bytes(0),
        "tradeGroupID": stringToBytes("0"),
        "kycToken": nullAddress
    }
    uints = [order["amount"], order["price"], order["outcome"], order["moneyEscrowed"], order["sharesEscrowed"]]
    bytes32s = [order["betterOrderID"], order["worseOrderID"], order["tradeGroupID"], stringToBytes("0")]
    orderId9 = orders.testSaveOrder(uints, bytes32s, order["type"], market.address, order["sender"], order["kycToken"])
    assert(orderId9 != bytearray(32)), "Save order"
    bestOrderID = orders.getBestOrderId(ASK, market.address, outcomeID, nullAddress)
    worstOrderID = orders.getWorstOrderId(ASK, market.address, outcomeID, nullAddress)
    assert(bestOrderID == orderId9)
    assert(worstOrderID == orderId8)
    # walk down order list starting from bestOrderID
    assert(orders.descendOrderList(ASK, 6000, bestOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.descendOrderList(ASK, 5900, bestOrderID) == [orderId9, orderId8])
    assert(orders.descendOrderList(ASK, 6100, bestOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.descendOrderList(ASK, 5800, bestOrderID) == [longTo32Bytes(0), orderId9])
    assert(orders.descendOrderList(ASK, 5950, bestOrderID) == [orderId9, orderId8])
    # walk up order list starting from worstOrderID
    assert(orders.ascendOrderList(ASK, 6000, worstOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.ascendOrderList(ASK, 5900, worstOrderID) == [orderId9, orderId8])
    assert(orders.ascendOrderList(ASK, 6100, worstOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.ascendOrderList(ASK, 5800, worstOrderID) == [longTo32Bytes(0), orderId9])
    assert(orders.ascendOrderList(ASK, 5950, bestOrderID) == [orderId9, orderId8])
    order = {
        "orderID": longTo32Bytes(10),
        "type": ASK,
        "amount": fix('1'),
        "price": 5950,
        "sender": contractsFixture.accounts[0],
        "outcome": outcomeID,
        "moneyEscrowed": fix('5950'),
        "sharesEscrowed": 0,
        "betterOrderID": longTo32Bytes(0),
        "worseOrderID": longTo32Bytes(0),
        "tradeGroupID": stringToBytes("0"),
        "kycToken": nullAddress
    }
    uints = [order["amount"], order["price"], order["outcome"], order["moneyEscrowed"], order["sharesEscrowed"]]
    bytes32s = [order["betterOrderID"], order["worseOrderID"], order["tradeGroupID"], stringToBytes("0")]
    orderId10 = orders.testSaveOrder(uints, bytes32s, order["type"], market.address, order["sender"], order["kycToken"])
    assert(orderId10 != bytearray(32)), "Save order"
    bestOrderID = orders.getBestOrderId(ASK, market.address, outcomeID, nullAddress)
    worstOrderID = orders.getWorstOrderId(ASK, market.address, outcomeID, nullAddress)
    assert(bestOrderID == orderId9)
    assert(worstOrderID == orderId8)
    # walk down order list starting from bestOrderID
    assert(orders.descendOrderList(ASK, 6000, bestOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.descendOrderList(ASK, 5900, bestOrderID) == [orderId9, orderId10])
    assert(orders.descendOrderList(ASK, 6100, bestOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.descendOrderList(ASK, 5800, bestOrderID) == [longTo32Bytes(0), orderId9])
    # walk up order list starting from worstOrderID
    assert(orders.ascendOrderList(ASK, 6000, worstOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.ascendOrderList(ASK, 5900, worstOrderID) == [orderId9, orderId10])
    assert(orders.ascendOrderList(ASK, 6100, worstOrderID) == [orderId8, longTo32Bytes(0)])
    assert(orders.ascendOrderList(ASK, 5800, worstOrderID) == [longTo32Bytes(0), orderId9])
    assert(orders.testRemoveOrder(orderId8) == 1), "Remove order 8"
    assert(orders.testRemoveOrder(orderId9) == 1), "Remove order 9"
    assert(orders.testRemoveOrder(orderId10) == 1), "Remove order 10"

@mark.parametrize('where, orderType, hints', [
    ('best', BID, True),
    ('middle', BID, True),
    ('worst', BID, True),
    ('best', BID, False),
    ('middle', BID, False),
    ('worst', BID, False),
    ('best', ASK, True),
    ('middle', ASK, True),
    ('worst', ASK, True),
    ('best', ASK, False),
    ('middle', ASK, False),
    ('worst', ASK, False),
])
def test_orderBidSorting(where, orderType, hints, contractsFixture, market):
    orders = contractsFixture.contracts['Orders']
    # setup pre-existing orders
    worstPrice = 6000 if orderType == BID else 6600
    bestPrice = 6600 if orderType == BID else 6000
    uints = [fix('1'), worstPrice, YES, worstPrice, 0]
    bytes32s = [longTo32Bytes(0), longTo32Bytes(0), stringToBytes("0"), stringToBytes("0")]
    worstOrderId = orders.testSaveOrder(uints, bytes32s, orderType, market.address, contractsFixture.accounts[0], nullAddress)
    uints = [fix('1'), bestPrice, YES, bestPrice, 0]
    bytes32s = [longTo32Bytes(0), longTo32Bytes(0), stringToBytes("0"), stringToBytes("0")]
    bestOrderId = orders.testSaveOrder(uints, bytes32s, orderType, market.address, contractsFixture.accounts[0], nullAddress)
    # validate that our setup went smoothly
    assert orders.getBestOrderId(orderType, market.address, YES, nullAddress) == bestOrderId
    assert orders.getWorstOrderId(orderType, market.address, YES, nullAddress) == worstOrderId
    assert orders.getWorseOrderId(bestOrderId) == worstOrderId
    assert orders.getWorseOrderId(worstOrderId) == longTo32Bytes(0)
    assert orders.getBetterOrderId(worstOrderId) == bestOrderId
    assert orders.getBetterOrderId(bestOrderId) == longTo32Bytes(0)
    # insert our new order
    if where == 'best':
        orderPrice = 6700 if orderType == BID else 5900
        betterOrderId = longTo32Bytes(0)
        worseOrderId = bestOrderId if hints else longTo32Bytes(0)
    if where == 'middle':
        orderPrice = 6300
        betterOrderId = bestOrderId if hints else longTo32Bytes(0)
        worseOrderId = worstOrderId if hints else longTo32Bytes(0)
    if where == 'worst':
        orderPrice = 5900 if orderType == BID else 6700
        betterOrderId = worstOrderId if hints else longTo32Bytes(0)
        worseOrderId = longTo32Bytes(0)
    uints = [fix('1'), orderPrice, YES, orderPrice, 0]
    bytes32s = [betterOrderId, worseOrderId, stringToBytes("0"), stringToBytes("0")]
    insertedOrder = orders.testSaveOrder(uints, bytes32s, orderType, market.address, contractsFixture.accounts[0], nullAddress)
    # validate the new order was inserted correctly: check the inserted order's
    # neighbors per position, since `a == b if cond else c` would compare nothing
    # in the else branch due to conditional-expression precedence
    if where == 'best':
        assert orders.getBetterOrderId(insertedOrder) == longTo32Bytes(0)
        assert orders.getWorseOrderId(insertedOrder) == bestOrderId
    elif where == 'middle':
        assert orders.getBetterOrderId(insertedOrder) == bestOrderId
        assert orders.getWorseOrderId(insertedOrder) == worstOrderId
    else:  # worst
        assert orders.getBetterOrderId(insertedOrder) == worstOrderId
        assert orders.getWorseOrderId(insertedOrder) == longTo32Bytes(0)
    assert orders.getBestOrderId(orderType, market.address, YES, nullAddress) == (insertedOrder if where == 'best' else bestOrderId)
    assert orders.getWorstOrderId(orderType, market.address, YES, nullAddress) == (insertedOrder if where == 'worst' else worstOrderId)

def test_saveOrder(contractsFixture, market):
    orders = contractsFixture.contracts['Orders']
    uints = [fix(10), 5000, NO, 0, fix(10)]
    bytes32s = [longTo32Bytes(0), longTo32Bytes(0), stringToBytes("1"), stringToBytes("0")]
    orderId1 = orders.testSaveOrder(uints, bytes32s, BID, market.address, contractsFixture.accounts[1], nullAddress)
    assert(orderId1 != bytearray(32)), "saveOrder wasn't executed successfully"
    uints = [fix(10), 5000, NO, fix('10', '5000'), 0]
    bytes32s = [longTo32Bytes(0), longTo32Bytes(0), stringToBytes("1"), stringToBytes("0")]
    orderId2 = orders.testSaveOrder(uints, bytes32s, ASK, market.address, contractsFixture.accounts[2], nullAddress)
    assert(orderId2 != bytearray(32)), "saveOrder wasn't executed successfully"
    assert(orders.getAmount(orderId1) == fix(10)), "amount for order1 should be set to 10"
    assert(orders.getAmount(orderId2) == fix(10)), "amount for order2 should be set to 10"
    assert(orders.getPrice(orderId1) == 5000), "price for order1 should be set to 5000 wei"
    assert(orders.getPrice(orderId2) == 5000), "price for order2 should be set to 5000 wei"
    assert(orders.getOrderCreator(orderId1) == contractsFixture.accounts[1]), "orderOwner for order1 should be contractsFixture.accounts[1]"
    assert(orders.getOrderCreator(orderId2) == contractsFixture.accounts[2]), "orderOwner for order2 should be contractsFixture.accounts[2]"
    assert orders.getOrderMoneyEscrowed(orderId1) == 0, "money escrowed should be 0"
    assert orders.getOrderMoneyEscrowed(orderId2) == fix('10', '5000'), "money escrowed should be 50000 ETH"
    assert orders.getOrderSharesEscrowed(orderId1) == fix(10), "shares escrowed should be fix(10)"
    assert orders.getOrderSharesEscrowed(orderId2) == 0, "shares escrowed should be 0"
    assert orders.getBetterOrderId(orderId1) == longTo32Bytes(0), "better order id should be 0"
    assert orders.getBetterOrderId(orderId2) == longTo32Bytes(0), "better order id should be 0"
    assert orders.getWorseOrderId(orderId1) == longTo32Bytes(0), "worse order id should be 0"
    assert orders.getWorseOrderId(orderId2) == longTo32Bytes(0), "worse order id should be 0"
    assert(orders.testRemoveOrder(orderId1) == 1), "Remove order 1"
    assert(orders.testRemoveOrder(orderId2) == 1), "Remove order 2"

def test_removeOrder(contractsFixture, market):
    orders = contractsFixture.contracts['Orders']
    uints = (fix('10'), 5000, NO, 0, fix('10'))
    bytes32s = (longTo32Bytes(0), longTo32Bytes(0), stringToBytes("1"), stringToBytes("0"))
    orderId1 = orders.testSaveOrder(uints, bytes32s, BID, market.address, contractsFixture.accounts[1], nullAddress)
    assert(orderId1 != bytearray(32)), "saveOrder wasn't executed successfully"
    uints = (fix('10'), 5000, NO, fix('10', '5000'), 0)
    bytes32s = (longTo32Bytes(0), longTo32Bytes(0), stringToBytes("1"), stringToBytes("0"))
    orderId2 = orders.testSaveOrder(uints, bytes32s, BID, market.address, contractsFixture.accounts[2], nullAddress)
    assert(orderId2 != bytearray(32)), "saveOrder wasn't executed successfully"
    uints = (fix('10'), 5000, YES, 0, fix('10'))
    bytes32s = (longTo32Bytes(0), longTo32Bytes(0), stringToBytes("1"), stringToBytes("0"))
    orderId3 = orders.testSaveOrder(uints, bytes32s, BID, market.address, contractsFixture.accounts[1], nullAddress)
    assert(orderId3 != bytearray(32)), "saveOrder wasn't executed successfully"
    assert orders.getAmount(orderId3) == fix('10')
    assert orders.getPrice(orderId3) == 5000
    assert orders.getOrderCreator(orderId3) == contractsFixture.accounts[1]
    assert orders.getOrderMoneyEscrowed(orderId3) == 0
    assert orders.getOrderSharesEscrowed(orderId3) == fix('10')
    assert orders.getBetterOrderId(orderId3) == longTo32Bytes(0)
    assert orders.getWorseOrderId(orderId3) == longTo32Bytes(0)
    assert(orders.testRemoveOrder(orderId3) == 1), "removeOrder wasn't executed successfully"
    assert orders.getAmount(orderId3) == 0
    assert orders.getPrice(orderId3) == 0
    assert orders.getOrderCreator(orderId3) == longToHexString(0)
    assert orders.getOrderMoneyEscrowed(orderId3) == 0
    assert orders.getOrderSharesEscrowed(orderId3) == 0
    assert orders.getBetterOrderId(orderId3) == longTo32Bytes(0)
    assert orders.getWorseOrderId(orderId3) == longTo32Bytes(0)
    assert(orders.testRemoveOrder(orderId1) == 1), "Remove order 1"
    assert(orders.testRemoveOrder(orderId2) == 1), "Remove order 2"
| 57.632768 | 140 | 0.701353 | 2,058 | 20,402 | 6.948008 | 0.075802 | 0.08644 | 0.038184 | 0.054549 | 0.819987 | 0.796419 | 0.770613 | 0.759144 | 0.726694 | 0.6883 | 0 | 0.059043 | 0.15822 | 20,402 | 353 | 141 | 57.796034 | 0.773553 | 0.034751 | 0 | 0.537217 | 0 | 0 | 0.122789 | 0.002846 | 0 | 0 | 0 | 0 | 0.407767 | 1 | 0.016181 | false | 0 | 0.012945 | 0 | 0.029126 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
737736f35937ceabda93ee21c0408b6935b38d5d | 156 | py | Python | hdmi/cores/__init__.py | srivatsan-ramesh/HDMI-Source-Sink-Modules | 00b99db3d50f9f88f74f0d1685cddcbe35ba1933 | [
"MIT"
] | 10 | 2016-05-08T11:41:40.000Z | 2021-11-16T08:28:06.000Z | hdmi/cores/__init__.py | srivatsan-ramesh/HDMI-Source-Sink-Modules | 00b99db3d50f9f88f74f0d1685cddcbe35ba1933 | [
"MIT"
] | 3 | 2016-05-18T17:36:14.000Z | 2016-06-23T17:42:37.000Z | hdmi/cores/__init__.py | srivatsan-ramesh/HDMI-Source-Sink-Modules | 00b99db3d50f9f88f74f0d1685cddcbe35ba1933 | [
"MIT"
] | 2 | 2016-05-21T13:53:20.000Z | 2016-06-21T22:05:42.000Z | from .constants import control_token_0
from .constants import control_token_1
from .constants import control_token_2
from .constants import control_token_3
| 31.2 | 38 | 0.871795 | 24 | 156 | 5.333333 | 0.375 | 0.40625 | 0.59375 | 0.8125 | 0.96875 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.102564 | 156 | 4 | 39 | 39 | 0.885714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
738e5e488f637e736d557e65c5121b1bf0a03377 | 198 | py | Python | challenges/challenge011_Generators_for_Fun_and_Profit/test_challenge011.py | alex-vegan/100daysofcode-with-python-course | b6c12316abe18274b7963371b8f0ed2fd549ef07 | [
"MIT"
] | 2 | 2018-10-28T17:12:37.000Z | 2018-10-28T17:12:39.000Z | challenges/challenge011_Generators_for_Fun_and_Profit/test_challenge011.py | alex-vegan/100daysofcode-with-python-course | b6c12316abe18274b7963371b8f0ed2fd549ef07 | [
"MIT"
] | 3 | 2018-10-28T17:11:04.000Z | 2018-10-29T22:36:36.000Z | challenges/challenge011_Generators_for_Fun_and_Profit/test_challenge011.py | alex-vegan/100daysofcode-with-python-course | b6c12316abe18274b7963371b8f0ed2fd549ef07 | [
"MIT"
] | null | null | null | from challenge011 import gen_files, gen_lines, gen_grep, gen_count

def test_gen_files():
    pass


def test_gen_lines():
    pass


def test_gen_grep():
    pass


def test_gen_count():
    pass
| 11 | 66 | 0.707071 | 31 | 198 | 4.129032 | 0.354839 | 0.21875 | 0.3125 | 0.328125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019355 | 0.217172 | 198 | 17 | 67 | 11.647059 | 0.806452 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | true | 0.444444 | 0.111111 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
7394e91e0d9606fb97aa09a343732fee51aa2564 | 9,024 | py | Python | Week5/hw5_1/validate.py | italoag/M101P | 708bdd793735228f820f3f50f57c44ce8fc637ef | [
"MIT"
] | 1 | 2015-06-20T21:10:50.000Z | 2015-06-20T21:10:50.000Z | Week5/hw5_1/validate.py | italoag/M101P | 708bdd793735228f820f3f50f57c44ce8fc637ef | [
"MIT"
] | null | null | null | Week5/hw5_1/validate.py | italoag/M101P | 708bdd793735228f820f3f50f57c44ce8fc637ef | [
"MIT"
] | null | null | null | import base64
code = r"""
import pymongo
import random
import re
import string
import sys
import getopt
import pprint

# Copyright 2012
# 10gen, Inc.
# Author: Andrew Erlichson   aje@10gen.com
#
# If you are a student and reading this code, turn back now, before
# the MongoDB gods smite you.

connection = None
db = None
webhost = "localhost:8082"
mongostr = "mongodb://localhost:27017"
db_name = "blog"

# this script will check homework 4.3

# command line arg parsing to make folks happy who want to run at mongolabs or mongohq
# this function uses global vars to communicate. forgive me.
def arg_parsing(argv):

    global webhost
    global mongostr
    global db_name

    try:
        opts, args = getopt.getopt(argv, "-p:-m:-d:")
    except getopt.GetoptError:
        print "usage validate.py -p webhost -m mongoConnectString -d databaseName"
        print "\tmongoConnectionString defaults to {0}".format(mongostr)
        print "\tdatabaseName defaults to {0}".format(db_name)
        sys.exit(2)
    for opt, arg in opts:
        if (opt == '-h'):
            print "usage validate.py -m mongoConnectString -d databaseName"
            sys.exit(2)
        elif opt in ("-m"):
            mongostr = arg
            print "Overriding MongoDB connection string to be ", mongostr
        elif opt in ("-d"):
            db_name = arg
            print "Overriding MongoDB database to be ", db_name


# check to see if they loaded the data set
def check_for_data_integrity():

    posts = db.posts
    try:
        count = posts.count()
    except:
        print "can't query MongoDB..is it running?"
        raise

    if (count != 1000):
        print "There are supposed to be 1000 documents. you have ", count
        return False

    # find the most popular tags
    try:
        result = db.posts.aggregate([{'$project': {'tags': 1}},
                                     {'$unwind': '$tags'},
                                     {'$group': {'_id': '$tags',
                                                 'count': {'$sum': 1}}},
                                     {'$sort': {'count': -1}},
                                     {'$limit': 10}])
    except:
        print "can't query MongoDB..is it running?"
        raise

    if (result['result'][0]['count'] != 13 or
            result['result'][0]['_id'] != "sphynx"):
        print "The dataset is not properly loaded. The distribution of post tags is wrong."
        return False

    print "Data looks like it is properly loaded into the posts collection"

    return True


def check_for_fast_blog_home_page():

    posts = db.posts

    try:
        explain = posts.find().sort('date', direction=-1).limit(10).explain()
    except:
        print "can't query MongoDB..is it running?"
        raise

    if (explain['nscannedObjects'] > 10):
        print "Sorry, executing the query to display the home page is too slow."
        print "We should be scanning no more than 10 documents. You scanned", explain['nscannedObjects']
        print "here is the output from explain"

        pp = pprint.PrettyPrinter(depth=6)
        pp.pprint(explain)
        return False

    print "Home page is super fast. Nice job.\n"
    return True


def get_the_middle_permalink():
    posts = db.posts
    try:
        c = posts.find().skip(500).limit(1)
        for doc in c:
            permalink = doc['permalink']
            return permalink
    except:
        print "can't query MongoDB..is it running?"
        raise
    return ""


def check_for_fast_blog_entry_page():

    posts = db.posts

    permalink = get_the_middle_permalink()
    try:
        explain = posts.find({'permalink': permalink}).explain()
    except:
        print "can't query MongoDB..is it running?"
        raise

    if (explain['nscannedObjects'] > 1):
        print "Sorry, executing the query to retrieve a post by permalink is too slow."
        print "We should be scanning no more than 1 document. You scanned", explain['nscannedObjects']
        print "here is the output from explain"

        pp = pprint.PrettyPrinter(depth=6)
        pp.pprint(explain)
        return False

    print "Blog retrieval by permalink is super fast. Nice job.\n"
    return True


def check_for_fast_posts_by_tag_page():
    posts = db.posts

    tag = "sphynx"
    try:
        explain = posts.find({'tags': tag}).sort('date', direction=-1).limit(10).explain()
    except:
        print "can't query MongoDB..is it running?"
        raise

    if (explain['nscannedObjects'] > 10):
        print "Sorry, executing the query to retrieve posts by tag is too slow."
        print "We should be scanning no more than 10 documents. You scanned", explain['nscannedObjects']
        print "here is the output from explain"

        pp = pprint.PrettyPrinter(depth=6)
        pp.pprint(explain)
        return False

    print "Blog retrieval by tag is super fast. Nice job.\n"
    return True


# main section of the code
def main(argv):

    arg_parsing(argv)
    global connection
    global db

    print "Welcome to the HW 4.3 Checker. My job is to make sure you added the indexes"
    print "that make the blog fast in the following three situations"
    print "\tWhen showing the home page"
    print "\tWhen fetching a particular post"
    print "\tWhen showing all posts for a particular tag"

    # connect to the db (mongostr was set in arg_parsing)
    try:
        connection = pymongo.Connection(mongostr, safe=True)
        db = connection[db_name]
    except:
        print "can't connect to MongoDB using", mongostr, ". Is it running?"
        sys.exit(1)

    if (not check_for_data_integrity()):
        print "Sorry, the data set is not loaded correctly in the posts collection"
        sys.exit(1)

    if (not check_for_fast_blog_home_page()):
        print "Sorry, the query to display the blog home page is too slow."
        sys.exit(1)

    if (not check_for_fast_blog_entry_page()):
        print "Sorry, the query to retrieve a blog post by permalink is too slow."
        sys.exit(1)

    if (not check_for_fast_posts_by_tag_page()):
        print "Sorry, the query to retrieve all posts with a certain tag is too slow"
        sys.exit(1)

    # if you are reading this in cleartext, you are violating the honor code.
    # You can still redeem yourself. Get it working and don't submit the validation code until you do.
    # All a man has at the end of the day is his word.
    print "Tests Passed for HW 4.3. Your HW 4.3 validation code is 893jfns29f728fn29f20f2"


if __name__ == "__main__":
    main(sys.argv[1:])
"""
eval(compile(code, "<string>", 'exec'))
| 2,256 | 8,951 | 0.996786 | 16 | 9,024 | 562.1875 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098581 | 0.000665 | 9,024 | 3 | 8,952 | 3,008 | 0.898869 | 0 | 0 | 0 | 0 | 0 | 0.992465 | 0.991135 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 9 |
73968c9ae10bb40af998012be22a09b8e6018f8f | 38,480 | py | Python | tests/wasp1/AllAnswerSets/checker_14__backtracking_model_checks_.test.py | bernardocuteri/wasp | 05c8f961776dbdbf7afbf905ee00fc262eba51ad | [
"Apache-2.0"
] | 19 | 2015-12-03T08:53:45.000Z | 2022-03-31T02:09:43.000Z | tests/wasp1/AllAnswerSets/checker_14__backtracking_model_checks_.test.py | bernardocuteri/wasp | 05c8f961776dbdbf7afbf905ee00fc262eba51ad | [
"Apache-2.0"
] | 80 | 2017-11-25T07:57:32.000Z | 2018-06-10T19:03:30.000Z | tests/wasp1/AllAnswerSets/checker_14__backtracking_model_checks_.test.py | bernardocuteri/wasp | 05c8f961776dbdbf7afbf905ee00fc262eba51ad | [
"Apache-2.0"
] | 6 | 2015-01-15T07:51:48.000Z | 2020-06-18T14:47:48.000Z | input = """
% This used to generate incorrect results (models were missing) and was
% provided by the Potsdam group.
p14|p6|p6|p24:-not p14,p23.
p14|p6|p6|p24:-p22,p23.
p14|p6|p6|p7:-not p14,p23.
p14|p6|p6|p7:-p22,p23.
p14|p6|p9|p24:-not p14,p23.
p14|p6|p9|p24:-p22,p23.
p14|p6|p9|p7:-not p14,p23.
p14|p6|p9|p7:-p22,p23.
p14|p11|p6|p24:-not p14,p23.
p14|p11|p6|p24:-p22,p23.
p14|p11|p6|p7:-not p14,p23.
p14|p11|p6|p7:-p22,p23.
p14|p11|p9|p24:-not p14,p23.
p14|p11|p9|p24:-p22,p23.
p14|p11|p9|p7:-not p14,p23.
p14|p11|p9|p7:-p22,p23.
p2|p6|p6|p24:-not p14,p23.
p2|p6|p6|p24:-p22,p23.
p2|p6|p6|p7:-not p14,p23.
p2|p6|p6|p7:-p22,p23.
p2|p6|p9|p24:-not p14,p23.
p2|p6|p9|p24:-p22,p23.
p2|p6|p9|p7:-not p14,p23.
p2|p6|p9|p7:-p22,p23.
p2|p11|p6|p24:-not p14,p23.
p2|p11|p6|p24:-p22,p23.
p2|p11|p6|p7:-not p14,p23.
p2|p11|p6|p7:-p22,p23.
p2|p11|p9|p24:-not p14,p23.
p2|p11|p9|p24:-p22,p23.
p2|p11|p9|p7:-not p14,p23.
p2|p11|p9|p7:-p22,p23.
p22|not_p23|p15|p5|not_p8:-p16.
p22|not_p23|p15|p5|not_p8:-p5.
p22|not_p23|p15|p9|not_p8:-p16.
p22|not_p23|p15|p9|not_p8:-p5.
p22|not_p23|p16|p5|not_p8:-p16.
p22|not_p23|p16|p5|not_p8:-p5.
p22|not_p23|p16|p9|not_p8:-p16.
p22|not_p23|p16|p9|not_p8:-p5.
p22|p24|p15|p5|not_p8:-p16.
p22|p24|p15|p5|not_p8:-p5.
p22|p24|p15|p9|not_p8:-p16.
p22|p24|p15|p9|not_p8:-p5.
p22|p24|p16|p5|not_p8:-p16.
p22|p24|p16|p5|not_p8:-p5.
p22|p24|p16|p9|not_p8:-p16.
p22|p24|p16|p9|not_p8:-p5.
not_p6|not_p23|p15|p5|not_p8:-p16.
not_p6|not_p23|p15|p5|not_p8:-p5.
not_p6|not_p23|p15|p9|not_p8:-p16.
not_p6|not_p23|p15|p9|not_p8:-p5.
not_p6|not_p23|p16|p5|not_p8:-p16.
not_p6|not_p23|p16|p5|not_p8:-p5.
not_p6|not_p23|p16|p9|not_p8:-p16.
not_p6|not_p23|p16|p9|not_p8:-p5.
not_p6|p24|p15|p5|not_p8:-p16.
not_p6|p24|p15|p5|not_p8:-p5.
not_p6|p24|p15|p9|not_p8:-p16.
not_p6|p24|p15|p9|not_p8:-p5.
not_p6|p24|p16|p5|not_p8:-p16.
not_p6|p24|p16|p5|not_p8:-p5.
not_p6|p24|p16|p9|not_p8:-p16.
not_p6|p24|p16|p9|not_p8:-p5.
not_p20|p11|p5|not_p5:-not p13.
not_p20|p11|p5:-not p13.
not_p20|p11|p5|p21|not_p5:-not p13.
not_p20|p11|p5|p21:-not p13.
not_p20|p11|p4|p5|not_p5:-not p13.
not_p20|p11|p4|p5:-not p13.
not_p20|p11|p4|p21|not_p5:-not p13.
not_p20|p11|p4|p21:-not p13.
not_p20|p19|p5|not_p5:-not p13.
not_p20|p19|p5:-not p13.
not_p20|p19|p5|p21|not_p5:-not p13.
not_p20|p19|p5|p21:-not p13.
not_p20|p19|p4|p5|not_p5:-not p13.
not_p20|p19|p4|p5:-not p13.
not_p20|p19|p4|p21|not_p5:-not p13.
not_p20|p19|p4|p21:-not p13.
p11|p5|not_p5:-not p13,not p15.
p11|p5:-not p13,not p15.
p11|p5|p21|not_p5:-not p13,not p15.
p11|p5|p21:-not p13,not p15.
p11|p4|p5|not_p5:-not p13,not p15.
p11|p4|p5:-not p13,not p15.
p11|p4|p21|not_p5:-not p13,not p15.
p11|p4|p21:-not p13,not p15.
p19|p5|not_p5:-not p13,not p15.
p19|p5:-not p13,not p15.
p19|p5|p21|not_p5:-not p13,not p15.
p19|p5|p21:-not p13,not p15.
p19|p4|p5|not_p5:-not p13,not p15.
p19|p4|p5:-not p13,not p15.
p19|p4|p21|not_p5:-not p13,not p15.
p19|p4|p21:-not p13,not p15.
p15|not_p3|p21:-p1,p23.
p15|not_p3|p21:-p8,p23.
p15|not_p3|p21:-p1,p23,not p16.
p15|not_p3|p21:-p8,p23,not p16.
p15|p16|p21:-p1,p23.
p15|p16|p21:-p8,p23.
p15|p16|p21:-p1,p23,not p16.
p15|p16|p21:-p8,p23,not p16.
p2|not_p3|p21:-p1,p23.
p2|not_p3|p21:-p8,p23.
p2|not_p3|p21:-p1,p23,not p16.
p2|not_p3|p21:-p8,p23,not p16.
p2|p16|p21:-p1,p23.
p2|p16|p21:-p8,p23.
p2|p16|p21:-p1,p23,not p16.
p2|p16|p21:-p8,p23,not p16.
p8|p5|p10|not_p19:-p23,p6.
p8|p5|p10|not_p19:-p11,p6.
p8|p5|p10|not_p4:-p23,p6.
p8|p5|p10|not_p4:-p11,p6.
p8|p5|p4|not_p19:-p23,p6.
p8|p5|p4|not_p19:-p11,p6.
p8|p5|p4|not_p4:-p23,p6.
p8|p5|p4|not_p4:-p11,p6.
p8|p9|p10|not_p19:-p23,p6.
p8|p9|p10|not_p19:-p11,p6.
p8|p9|p10|not_p4:-p23,p6.
p8|p9|p10|not_p4:-p11,p6.
p8|p9|p4|not_p19:-p23,p6.
p8|p9|p4|not_p19:-p11,p6.
p8|p9|p4|not_p4:-p23,p6.
p8|p9|p4|not_p4:-p11,p6.
p18|p5|p10|not_p19:-p23,p6.
p18|p5|p10|not_p19:-p11,p6.
p18|p5|p10|not_p4:-p23,p6.
p18|p5|p10|not_p4:-p11,p6.
p18|p5|p4|not_p19:-p23,p6.
p18|p5|p4|not_p19:-p11,p6.
p18|p5|p4|not_p4:-p23,p6.
p18|p5|p4|not_p4:-p11,p6.
p18|p9|p10|not_p19:-p23,p6.
p18|p9|p10|not_p19:-p11,p6.
p18|p9|p10|not_p4:-p23,p6.
p18|p9|p10|not_p4:-p11,p6.
p18|p9|p4|not_p19:-p23,p6.
p18|p9|p4|not_p19:-p11,p6.
p18|p9|p4|not_p4:-p23,p6.
p18|p9|p4|not_p4:-p11,p6.
p18|p5|not_p1|p9:-p1,p12.
p18|p5|not_p1|p9:-p16,p12.
p18|p5|not_p1|p4:-p1,p12.
p18|p5|not_p1|p4:-p16,p12.
p18|p5|not_p24|p9:-p1,p12.
p18|p5|not_p24|p9:-p16,p12.
p18|p5|not_p24|p4:-p1,p12.
p18|p5|not_p24|p4:-p16,p12.
p18|not_p1|p9:-p1,p12,not p11.
p18|not_p1|p9:-p16,p12,not p11.
p18|not_p1|p4:-p1,p12,not p11.
p18|not_p1|p4:-p16,p12,not p11.
p18|not_p24|p9:-p1,p12,not p11.
p18|not_p24|p9:-p16,p12,not p11.
p18|not_p24|p4:-p1,p12,not p11.
p18|not_p24|p4:-p16,p12,not p11.
not_p2|p5|not_p1|p9:-p1,p12.
not_p2|p5|not_p1|p9:-p16,p12.
not_p2|p5|not_p1|p4:-p1,p12.
not_p2|p5|not_p1|p4:-p16,p12.
not_p2|p5|not_p24|p9:-p1,p12.
not_p2|p5|not_p24|p9:-p16,p12.
not_p2|p5|not_p24|p4:-p1,p12.
not_p2|p5|not_p24|p4:-p16,p12.
not_p2|not_p1|p9:-p1,p12,not p11.
not_p2|not_p1|p9:-p16,p12,not p11.
not_p2|not_p1|p4:-p1,p12,not p11.
not_p2|not_p1|p4:-p16,p12,not p11.
not_p2|not_p24|p9:-p1,p12,not p11.
not_p2|not_p24|p9:-p16,p12,not p11.
not_p2|not_p24|p4:-p1,p12,not p11.
not_p2|not_p24|p4:-p16,p12,not p11.
p12|not_p15|p16|p17:-p6,p1.
p12|not_p15|p16|p17:-p9,p1.
p12|not_p15|p16|p18:-p6,p1.
p12|not_p15|p16|p18:-p9,p1.
p12|not_p15|p7|p17:-p6,p1.
p12|not_p15|p7|p17:-p9,p1.
p12|not_p15|p7|p18:-p6,p1.
p12|not_p15|p7|p18:-p9,p1.
p12|p4|p16|p17:-p6,p1.
p12|p4|p16|p17:-p9,p1.
p12|p4|p16|p18:-p6,p1.
p12|p4|p16|p18:-p9,p1.
p12|p4|p7|p17:-p6,p1.
p12|p4|p7|p17:-p9,p1.
p12|p4|p7|p18:-p6,p1.
p12|p4|p7|p18:-p9,p1.
p6|not_p15|p16|p17:-p6,p1.
p6|not_p15|p16|p17:-p9,p1.
p6|not_p15|p16|p18:-p6,p1.
p6|not_p15|p16|p18:-p9,p1.
p6|not_p15|p7|p17:-p6,p1.
p6|not_p15|p7|p17:-p9,p1.
p6|not_p15|p7|p18:-p6,p1.
p6|not_p15|p7|p18:-p9,p1.
p6|p4|p16|p17:-p6,p1.
p6|p4|p16|p17:-p9,p1.
p6|p4|p16|p18:-p6,p1.
p6|p4|p16|p18:-p9,p1.
p6|p4|p7|p17:-p6,p1.
p6|p4|p7|p17:-p9,p1.
p6|p4|p7|p18:-p6,p1.
p6|p4|p7|p18:-p9,p1.
p1|p12|p2|not_p16:-not p23,not p3.
p1|p12|p2:-p19,not p23,not p3.
p1|p12|not_p16:-not p23,not p3,not p11.
p1|p12:-p19,not p23,not p3,not p11.
p1|p7|p2|not_p16:-not p23,not p3.
p1|p7|p2:-p19,not p23,not p3.
p1|p7|not_p16:-not p23,not p3,not p11.
p1|p7:-p19,not p23,not p3,not p11.
p6|p12|p2|not_p16:-not p23,not p3.
p6|p12|p2:-p19,not p23,not p3.
p6|p12|not_p16:-not p23,not p3,not p11.
p6|p12:-p19,not p23,not p3,not p11.
p6|p7|p2|not_p16:-not p23,not p3.
p6|p7|p2:-p19,not p23,not p3.
p6|p7|not_p16:-not p23,not p3,not p11.
p6|p7:-p19,not p23,not p3,not p11.
p19|p1|p12|p2|not_p16:-not p23.
p19|p1|p12|p2:-p19,not p23.
p19|p1|p12|not_p16:-not p23,not p11.
p19|p1|p12:-p19,not p23,not p11.
p19|p1|p7|p2|not_p16:-not p23.
p19|p1|p7|p2:-p19,not p23.
p19|p1|p7|not_p16:-not p23,not p11.
p19|p1|p7:-p19,not p23,not p11.
p19|p6|p12|p2|not_p16:-not p23.
p19|p6|p12|p2:-p19,not p23.
p19|p6|p12|not_p16:-not p23,not p11.
p19|p6|p12:-p19,not p23,not p11.
p19|p6|p7|p2|not_p16:-not p23.
p19|p6|p7|p2:-p19,not p23.
p19|p6|p7|not_p16:-not p23,not p11.
p19|p6|p7:-p19,not p23,not p11.
not_p19|not_p9|not_p25|p13:-p11,p21.
not_p19|not_p9|not_p25|p13:-p13,p21.
not_p19|not_p9|not_p25|not_p9:-p11,p21.
not_p19|not_p9|not_p25|not_p9:-p13,p21.
not_p19|not_p9|not_p3|p13:-p11,p21.
not_p19|not_p9|not_p3|p13:-p13,p21.
not_p19|not_p9|not_p3|not_p9:-p11,p21.
not_p19|not_p9|not_p3|not_p9:-p13,p21.
not_p19|p22|not_p25|p13:-p11,p21.
not_p19|p22|not_p25|p13:-p13,p21.
not_p19|p22|not_p25|not_p9:-p11,p21.
not_p19|p22|not_p25|not_p9:-p13,p21.
not_p19|p22|not_p3|p13:-p11,p21.
not_p19|p22|not_p3|p13:-p13,p21.
not_p19|p22|not_p3|not_p9:-p11,p21.
not_p19|p22|not_p3|not_p9:-p13,p21.
p16|not_p9|not_p25|p13:-p11,p21.
p16|not_p9|not_p25|p13:-p13,p21.
p16|not_p9|not_p25|not_p9:-p11,p21.
p16|not_p9|not_p25|not_p9:-p13,p21.
p16|not_p9|not_p3|p13:-p11,p21.
p16|not_p9|not_p3|p13:-p13,p21.
p16|not_p9|not_p3|not_p9:-p11,p21.
p16|not_p9|not_p3|not_p9:-p13,p21.
p16|p22|not_p25|p13:-p11,p21.
p16|p22|not_p25|p13:-p13,p21.
p16|p22|not_p25|not_p9:-p11,p21.
p16|p22|not_p25|not_p9:-p13,p21.
p16|p22|not_p3|p13:-p11,p21.
p16|p22|not_p3|p13:-p13,p21.
p16|p22|not_p3|not_p9:-p11,p21.
p16|p22|not_p3|not_p9:-p13,p21.
p4|p23|p21|not_p25|not_p20:-not p24.
p4|p23|p21|not_p25:-p25,not p24.
p4|p23|p21|p13|not_p20:-not p24.
p4|p23|p21|p13:-p25,not p24.
p4|p23|p2|not_p25|not_p20:-not p24.
p4|p23|p2|not_p25:-p25,not p24.
p4|p23|p2|p13|not_p20:-not p24.
p4|p23|p2|p13:-p25,not p24.
p4|p21|p21|not_p25|not_p20:-not p24.
p4|p21|p21|not_p25:-p25,not p24.
p4|p21|p21|p13|not_p20:-not p24.
p4|p21|p21|p13:-p25,not p24.
p4|p21|p2|not_p25|not_p20:-not p24.
p4|p21|p2|not_p25:-p25,not p24.
p4|p21|p2|p13|not_p20:-not p24.
p4|p21|p2|p13:-p25,not p24.
not_p18|p23|p21|not_p25|not_p20:-not p24.
not_p18|p23|p21|not_p25:-p25,not p24.
not_p18|p23|p21|p13|not_p20:-not p24.
not_p18|p23|p21|p13:-p25,not p24.
not_p18|p23|p2|not_p25|not_p20:-not p24.
not_p18|p23|p2|not_p25:-p25,not p24.
not_p18|p23|p2|p13|not_p20:-not p24.
not_p18|p23|p2|p13:-p25,not p24.
not_p18|p21|p21|not_p25|not_p20:-not p24.
not_p18|p21|p21|not_p25:-p25,not p24.
not_p18|p21|p21|p13|not_p20:-not p24.
not_p18|p21|p21|p13:-p25,not p24.
not_p18|p21|p2|not_p25|not_p20:-not p24.
not_p18|p21|p2|not_p25:-p25,not p24.
not_p18|p21|p2|p13|not_p20:-not p24.
not_p18|p21|p2|p13:-p25,not p24.
not_p12|p14|p15|not_p18:-p6,p24.
not_p12|p14|p15|not_p18:-not p5,p24.
not_p12|p14|p15|p11:-p6,p24.
not_p12|p14|p15|p11:-not p5,p24.
not_p12|p14|p2|not_p18:-p6,p24.
not_p12|p14|p2|not_p18:-not p5,p24.
not_p12|p14|p2|p11:-p6,p24.
not_p12|p14|p2|p11:-not p5,p24.
not_p12|p15|p15|not_p18:-p6,p24.
not_p12|p15|p15|not_p18:-not p5,p24.
not_p12|p15|p15|p11:-p6,p24.
not_p12|p15|p15|p11:-not p5,p24.
not_p12|p15|p2|not_p18:-p6,p24.
not_p12|p15|p2|not_p18:-not p5,p24.
not_p12|p15|p2|p11:-p6,p24.
not_p12|p15|p2|p11:-not p5,p24.
p1|p14|p15|not_p18:-p6,p24.
p1|p14|p15|not_p18:-not p5,p24.
p1|p14|p15|p11:-p6,p24.
p1|p14|p15|p11:-not p5,p24.
p1|p14|p2|not_p18:-p6,p24.
p1|p14|p2|not_p18:-not p5,p24.
p1|p14|p2|p11:-p6,p24.
p1|p14|p2|p11:-not p5,p24.
p1|p15|p15|not_p18:-p6,p24.
p1|p15|p15|not_p18:-not p5,p24.
p1|p15|p15|p11:-p6,p24.
p1|p15|p15|p11:-not p5,p24.
p1|p15|p2|not_p18:-p6,p24.
p1|p15|p2|not_p18:-not p5,p24.
p1|p15|p2|p11:-p6,p24.
p1|p15|p2|p11:-not p5,p24.
p11|p24|p8|not_p4:-p24,p2.
p11|p24|p8|not_p4:-p7,p2.
p11|p24|p8|not_p15:-p24,p2.
p11|p24|p8|not_p15:-p7,p2.
p11|p24|not_p1|not_p4:-p24,p2.
p11|p24|not_p1|not_p4:-p7,p2.
p11|p24|not_p1|not_p15:-p24,p2.
p11|p24|not_p1|not_p15:-p7,p2.
p11|not_p20|p8|not_p4:-p24,p2.
p11|not_p20|p8|not_p4:-p7,p2.
p11|not_p20|p8|not_p15:-p24,p2.
p11|not_p20|p8|not_p15:-p7,p2.
p11|not_p20|not_p1|not_p4:-p24,p2.
p11|not_p20|not_p1|not_p4:-p7,p2.
p11|not_p20|not_p1|not_p15:-p24,p2.
p11|not_p20|not_p1|not_p15:-p7,p2.
p24|p8|not_p4:-p24,p2,not p25.
p24|p8|not_p4:-p7,p2,not p25.
p24|p8|not_p15:-p24,p2,not p25.
p24|p8|not_p15:-p7,p2,not p25.
p24|not_p1|not_p4:-p24,p2,not p25.
p24|not_p1|not_p4:-p7,p2,not p25.
p24|not_p1|not_p15:-p24,p2,not p25.
p24|not_p1|not_p15:-p7,p2,not p25.
not_p20|p8|not_p4:-p24,p2,not p25.
not_p20|p8|not_p4:-p7,p2,not p25.
not_p20|p8|not_p15:-p24,p2,not p25.
not_p20|p8|not_p15:-p7,p2,not p25.
not_p20|not_p1|not_p4:-p24,p2,not p25.
not_p20|not_p1|not_p4:-p7,p2,not p25.
not_p20|not_p1|not_p15:-p24,p2,not p25.
not_p20|not_p1|not_p15:-p7,p2,not p25.
p24|p9|p6|p10:-p19,p23.
p24|p9|p6|p10:-p12,p23.
p24|p9|p6|not_p22:-p19,p23.
p24|p9|p6|not_p22:-p12,p23.
p24|p9|p24|p10:-p19,p23.
p24|p9|p24|p10:-p12,p23.
p24|p9|p24|not_p22:-p19,p23.
p24|p9|p24|not_p22:-p12,p23.
p24|not_p8|p6|p10:-p19,p23.
p24|not_p8|p6|p10:-p12,p23.
p24|not_p8|p6|not_p22:-p19,p23.
p24|not_p8|p6|not_p22:-p12,p23.
p24|not_p8|p24|p10:-p19,p23.
p24|not_p8|p24|p10:-p12,p23.
p24|not_p8|p24|not_p22:-p19,p23.
p24|not_p8|p24|not_p22:-p12,p23.
p6|p9|p6|p10:-p19,p23.
p6|p9|p6|p10:-p12,p23.
p6|p9|p6|not_p22:-p19,p23.
p6|p9|p6|not_p22:-p12,p23.
p6|p9|p24|p10:-p19,p23.
p6|p9|p24|p10:-p12,p23.
p6|p9|p24|not_p22:-p19,p23.
p6|p9|p24|not_p22:-p12,p23.
p6|not_p8|p6|p10:-p19,p23.
p6|not_p8|p6|p10:-p12,p23.
p6|not_p8|p6|not_p22:-p19,p23.
p6|not_p8|p6|not_p22:-p12,p23.
p6|not_p8|p24|p10:-p19,p23.
p6|not_p8|p24|p10:-p12,p23.
p6|not_p8|p24|not_p22:-p19,p23.
p6|not_p8|p24|not_p22:-p12,p23.
p24|p10|not_p18|not_p20:-p9,not p21.
p24|p10|not_p18|not_p20:-p19,not p21.
p24|p10|not_p18|not_p21:-p9,not p21.
p24|p10|not_p18|not_p21:-p19,not p21.
p24|p10|not_p7|not_p20:-p9,not p21.
p24|p10|not_p7|not_p20:-p19,not p21.
p24|p10|not_p7|not_p21:-p9,not p21.
p24|p10|not_p7|not_p21:-p19,not p21.
p24|p3|not_p18|not_p20:-p9,not p21.
p24|p3|not_p18|not_p20:-p19,not p21.
p24|p3|not_p18|not_p21:-p9,not p21.
p24|p3|not_p18|not_p21:-p19,not p21.
p24|p3|not_p7|not_p20:-p9,not p21.
p24|p3|not_p7|not_p20:-p19,not p21.
p24|p3|not_p7|not_p21:-p9,not p21.
p24|p3|not_p7|not_p21:-p19,not p21.
p9|p10|not_p18|not_p20:-p9,not p21.
p9|p10|not_p18|not_p20:-p19,not p21.
p9|p10|not_p18|not_p21:-p9,not p21.
p9|p10|not_p18|not_p21:-p19,not p21.
p9|p10|not_p7|not_p20:-p9,not p21.
p9|p10|not_p7|not_p20:-p19,not p21.
p9|p10|not_p7|not_p21:-p9,not p21.
p9|p10|not_p7|not_p21:-p19,not p21.
p9|p3|not_p18|not_p20:-p9,not p21.
p9|p3|not_p18|not_p20:-p19,not p21.
p9|p3|not_p18|not_p21:-p9,not p21.
p9|p3|not_p18|not_p21:-p19,not p21.
p9|p3|not_p7|not_p20:-p9,not p21.
p9|p3|not_p7|not_p20:-p19,not p21.
p9|p3|not_p7|not_p21:-p9,not p21.
p9|p3|not_p7|not_p21:-p19,not p21.
p18|not_p23|p11|p1:-p6,p23.
p18|not_p23|p11|p1:-p5,p23.
p18|not_p23|p11|not_p20:-p6,p23.
p18|not_p23|p11|not_p20:-p5,p23.
p18|not_p23|p21|p1:-p6,p23.
p18|not_p23|p21|p1:-p5,p23.
p18|not_p23|p21|not_p20:-p6,p23.
p18|not_p23|p21|not_p20:-p5,p23.
p18|p11|p1:-p6,p23,not p4.
p18|p11|p1:-p5,p23,not p4.
p18|p11|not_p20:-p6,p23,not p4.
p18|p11|not_p20:-p5,p23,not p4.
p18|p21|p1:-p6,p23,not p4.
p18|p21|p1:-p5,p23,not p4.
p18|p21|not_p20:-p6,p23,not p4.
p18|p21|not_p20:-p5,p23,not p4.
p3|not_p23|p11|p1:-p6,p23.
p3|not_p23|p11|p1:-p5,p23.
p3|not_p23|p11|not_p20:-p6,p23.
p3|not_p23|p11|not_p20:-p5,p23.
p3|not_p23|p21|p1:-p6,p23.
p3|not_p23|p21|p1:-p5,p23.
p3|not_p23|p21|not_p20:-p6,p23.
p3|not_p23|p21|not_p20:-p5,p23.
p3|p11|p1:-p6,p23,not p4.
p3|p11|p1:-p5,p23,not p4.
p3|p11|not_p20:-p6,p23,not p4.
p3|p11|not_p20:-p5,p23,not p4.
p3|p21|p1:-p6,p23,not p4.
p3|p21|p1:-p5,p23,not p4.
p3|p21|not_p20:-p6,p23,not p4.
p3|p21|not_p20:-p5,p23,not p4.
p3|p20|not_p20|p3:-p13,not p22.
p3|p20|not_p20|p3:-p1,not p22.
p3|p20|not_p20|p18:-p13,not p22.
p3|p20|not_p20|p18:-p1,not p22.
p3|p20|p18|p3:-p13,not p22.
p3|p20|p18|p3:-p1,not p22.
p3|p20|p18:-p13,not p22.
p3|p20|p18:-p1,not p22.
p3|not_p20|p3:-p13,not p22,not p14.
p3|not_p20|p3:-p1,not p22,not p14.
p3|not_p20|p18:-p13,not p22,not p14.
p3|not_p20|p18:-p1,not p22,not p14.
p3|p18|p3:-p13,not p22,not p14.
p3|p18|p3:-p1,not p22,not p14.
p3|p18:-p13,not p22,not p14.
p3|p18:-p1,not p22,not p14.
not_p21|p20|not_p20|p3:-p13,not p22.
not_p21|p20|not_p20|p3:-p1,not p22.
not_p21|p20|not_p20|p18:-p13,not p22.
not_p21|p20|not_p20|p18:-p1,not p22.
not_p21|p20|p18|p3:-p13,not p22.
not_p21|p20|p18|p3:-p1,not p22.
not_p21|p20|p18:-p13,not p22.
not_p21|p20|p18:-p1,not p22.
not_p21|not_p20|p3:-p13,not p22,not p14.
not_p21|not_p20|p3:-p1,not p22,not p14.
not_p21|not_p20|p18:-p13,not p22,not p14.
not_p21|not_p20|p18:-p1,not p22,not p14.
not_p21|p18|p3:-p13,not p22,not p14.
not_p21|p18|p3:-p1,not p22,not p14.
not_p21|p18:-p13,not p22,not p14.
not_p21|p18:-p1,not p22,not p14.
p5|p12|not_p20:-p17,not p7.
p5|p12|not_p20:-p14,not p7.
p5|p12|p25|not_p20:-p17,not p7.
p5|p12|p25|not_p20:-p14,not p7.
p5|p11|p12|not_p20:-p17,not p7.
p5|p11|p12|not_p20:-p14,not p7.
p5|p11|p25|not_p20:-p17,not p7.
p5|p11|p25|not_p20:-p14,not p7.
p22|p12|not_p20:-p17,not p7.
p22|p12|not_p20:-p14,not p7.
p22|p12|p25|not_p20:-p17,not p7.
p22|p12|p25|not_p20:-p14,not p7.
p22|p11|p12|not_p20:-p17,not p7.
p22|p11|p12|not_p20:-p14,not p7.
p22|p11|p25|not_p20:-p17,not p7.
p22|p11|p25|not_p20:-p14,not p7.
p21|p5|p12|not_p20:-p17.
p21|p5|p12|not_p20:-p14.
p21|p5|p12|p25|not_p20:-p17.
p21|p5|p12|p25|not_p20:-p14.
p21|p5|p11|p12|not_p20:-p17.
p21|p5|p11|p12|not_p20:-p14.
p21|p5|p11|p25|not_p20:-p17.
p21|p5|p11|p25|not_p20:-p14.
p21|p22|p12|not_p20:-p17.
p21|p22|p12|not_p20:-p14.
p21|p22|p12|p25|not_p20:-p17.
p21|p22|p12|p25|not_p20:-p14.
p21|p22|p11|p12|not_p20:-p17.
p21|p22|p11|p12|not_p20:-p14.
p21|p22|p11|p25|not_p20:-p17.
p21|p22|p11|p25|not_p20:-p14.
not_p23|not_p7|p21|p5:-not p24,not p3.
not_p23|not_p7|p21|p5:-p13,not p3.
not_p23|not_p7|p21|p11:-not p24,not p3.
not_p23|not_p7|p21|p11:-p13,not p3.
not_p23|not_p7|not_p17|p5:-not p24,not p3.
not_p23|not_p7|not_p17|p5:-p13,not p3.
not_p23|not_p7|not_p17|p11:-not p24,not p3.
not_p23|not_p7|not_p17|p11:-p13,not p3.
not_p23|p7|p21|p5:-not p24,not p3.
not_p23|p7|p21|p5:-p13,not p3.
not_p23|p7|p21|p11:-not p24,not p3.
not_p23|p7|p21|p11:-p13,not p3.
not_p23|p7|not_p17|p5:-not p24,not p3.
not_p23|p7|not_p17|p5:-p13,not p3.
not_p23|p7|not_p17|p11:-not p24,not p3.
not_p23|p7|not_p17|p11:-p13,not p3.
p5|not_p7|p21|p5:-not p24,not p3.
p5|not_p7|p21|p5:-p13,not p3.
p5|not_p7|p21|p11:-not p24,not p3.
p5|not_p7|p21|p11:-p13,not p3.
p5|not_p7|not_p17|p5:-not p24,not p3.
p5|not_p7|not_p17|p5:-p13,not p3.
p5|not_p7|not_p17|p11:-not p24,not p3.
p5|not_p7|not_p17|p11:-p13,not p3.
p5|p7|p21|p5:-not p24,not p3.
p5|p7|p21|p5:-p13,not p3.
p5|p7|p21|p11:-not p24,not p3.
p5|p7|p21|p11:-p13,not p3.
p5|p7|not_p17|p5:-not p24,not p3.
p5|p7|not_p17|p5:-p13,not p3.
p5|p7|not_p17|p11:-not p24,not p3.
p5|p7|not_p17|p11:-p13,not p3.
not_p14|not_p8|p10:-p17,p5,not p2.
not_p14|not_p8|p10|not_p20:-p17,p5.
not_p14|not_p8|not_p23:-p17,p5,not p2.
not_p14|not_p8|not_p23|not_p20:-p17,p5.
not_p14|p4|p10:-p17,p5,not p2.
not_p14|p4|p10|not_p20:-p17,p5.
not_p14|p4|not_p23:-p17,p5,not p2.
not_p14|p4|not_p23|not_p20:-p17,p5.
not_p5|not_p8|p10:-p17,p5,not p2.
not_p5|not_p8|p10|not_p20:-p17,p5.
not_p5|not_p8|not_p23:-p17,p5,not p2.
not_p5|not_p8|not_p23|not_p20:-p17,p5.
not_p5|p4|p10:-p17,p5,not p2.
not_p5|p4|p10|not_p20:-p17,p5.
not_p5|p4|not_p23:-p17,p5,not p2.
not_p5|p4|not_p23|not_p20:-p17,p5.
p7|p3|p1:-p12,p5,not p3.
p7|p3|p1:-p19,p5,not p3.
p7|p3|p6:-p12,p5,not p3.
p7|p3|p6:-p19,p5,not p3.
p7|p1:-p12,p5,not p3,not p7.
p7|p1:-p19,p5,not p3,not p7.
p7|p6:-p12,p5,not p3,not p7.
p7|p6:-p19,p5,not p3,not p7.
p7|p4|p3|p1:-p12,p5.
p7|p4|p3|p1:-p19,p5.
p7|p4|p3|p6:-p12,p5.
p7|p4|p3|p6:-p19,p5.
p7|p4|p1:-p12,p5,not p7.
p7|p4|p1:-p19,p5,not p7.
p7|p4|p6:-p12,p5,not p7.
p7|p4|p6:-p19,p5,not p7.
p15|p3|p1:-p12,p5,not p3.
p15|p3|p1:-p19,p5,not p3.
p15|p3|p6:-p12,p5,not p3.
p15|p3|p6:-p19,p5,not p3.
p15|p1:-p12,p5,not p3,not p7.
p15|p1:-p19,p5,not p3,not p7.
p15|p6:-p12,p5,not p3,not p7.
p15|p6:-p19,p5,not p3,not p7.
p15|p4|p3|p1:-p12,p5.
p15|p4|p3|p1:-p19,p5.
p15|p4|p3|p6:-p12,p5.
p15|p4|p3|p6:-p19,p5.
p15|p4|p1:-p12,p5,not p7.
p15|p4|p1:-p19,p5,not p7.
p15|p4|p6:-p12,p5,not p7.
p15|p4|p6:-p19,p5,not p7.
p11|p8|p4|p16.
p11|p8|p4|p18.
p11|p8|not_p23|p16.
p11|p8|not_p23|p18.
p11|p1|p4|p16.
p11|p1|p4|p18.
p11|p1|not_p23|p16.
p11|p1|not_p23|p18.
p15|p8|p4|p16.
p15|p8|p4|p18.
p15|p8|not_p23|p16.
p15|p8|not_p23|p18.
p15|p1|p4|p16.
p15|p1|p4|p18.
p15|p1|not_p23|p16.
p15|p1|not_p23|p18.
p17|p6|p8|p25.
p17|p6|p8|not_p8.
p17|p6|not_p19|p25.
p17|p6|not_p19|not_p8.
p17|p12|p8|p25.
p17|p12|p8|not_p8.
p17|p12|not_p19|p25.
p17|p12|not_p19|not_p8.
p14|p6|p8|p25.
p14|p6|p8|not_p8.
p14|p6|not_p19|p25.
p14|p6|not_p19|not_p8.
p14|p12|p8|p25.
p14|p12|p8|not_p8.
p14|p12|not_p19|p25.
p14|p12|not_p19|not_p8.
p15|p14|p24|not_p3.
p15|p14|p24|p3.
p15|p14|p9|not_p3.
p15|p14|p9|p3.
p15|p11|p24|not_p3.
p15|p11|p24|p3.
p15|p11|p9|not_p3.
p15|p11|p9|p3.
not_p25|p14|p24|not_p3.
not_p25|p14|p24|p3.
not_p25|p14|p9|not_p3.
not_p25|p14|p9|p3.
not_p25|p11|p24|not_p3.
not_p25|p11|p24|p3.
not_p25|p11|p9|not_p3.
not_p25|p11|p9|p3.
p15|p2|p23|not_p12.
p15|p2|p23|p15.
p15|p2|p24|not_p12.
p15|p2|p24|p15.
p15|not_p22|p23|not_p12.
p15|not_p22|p23|p15.
p15|not_p22|p24|not_p12.
p15|not_p22|p24|p15.
p2|p23|not_p12:-not p23.
p2|p23|p15:-not p23.
p2|p24|not_p12:-not p23.
p2|p24|p15:-not p23.
not_p22|p23|not_p12:-not p23.
not_p22|p23|p15:-not p23.
not_p22|p24|not_p12:-not p23.
not_p22|p24|p15:-not p23.
not_p5|p15|not_p19|p22.
not_p5|p15|not_p19|not_p12.
not_p5|p15|p22.
not_p5|p15|p22|not_p12.
not_p5|not_p5|not_p19|p22.
not_p5|not_p5|not_p19|not_p12.
not_p5|not_p5|p22.
not_p5|not_p5|p22|not_p12.
not_p21|p15|not_p19|p22.
not_p21|p15|not_p19|not_p12.
not_p21|p15|p22.
not_p21|p15|p22|not_p12.
not_p21|not_p5|not_p19|p22.
not_p21|not_p5|not_p19|not_p12.
not_p21|not_p5|p22.
not_p21|not_p5|p22|not_p12.
not_p6:-not p6.
:-p6,not_p6.
not_p2:-not p2.
:-p2,not_p2.
not_p24:-not p24.
:-p24,not_p24.
not_p16:-not p16.
:-p16,not_p16.
not_p9:-not p9.
:-p9,not_p9.
not_p4:-not p4.
:-p4,not_p4.
not_p1:-not p1.
:-p1,not_p1.
not_p15:-not p15.
:-p15,not_p15.
not_p18:-not p18.
:-p18,not_p18.
not_p7:-not p7.
:-p7,not_p7.
not_p17:-not p17.
:-p17,not_p17.
not_p14:-not p14.
:-p14,not_p14.
not_p20:-not p20.
:-p20,not_p20.
not_p23:-not p23.
:-p23,not_p23.
not_p8:-not p8.
:-p8,not_p8.
not_p3:-not p3.
:-p3,not_p3.
not_p25:-not p25.
:-p25,not_p25.
not_p22:-not p22.
:-p22,not_p22.
not_p19:-not p19.
:-p19,not_p19.
not_p21:-not p21.
:-p21,not_p21.
not_p5:-not p5.
:-p5,not_p5.
not_p12:-not p12.
:-p12,not_p12.
"""
output = """
{not_p1, not_p12, not_p14, not_p15, not_p16, not_p17, not_p18, not_p20, not_p21, not_p22, not_p23, not_p3, not_p5, not_p6, not_p7, not_p8, p11, p19, p2, p24, p25, p4, p9}
{not_p1, not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p21, not_p6, not_p8, p16, p18, p22, p23, p24, p25, p3, p4, p5, p7, p9}
{not_p1, not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p25, not_p3, not_p4, not_p6, p11, p16, p18, p21, p22, p23, p24, p5, p7, p8, p9}
{not_p1, not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p25, not_p4, not_p6, p16, p18, p21, p22, p23, p24, p3, p5, p7, p8, p9}
{not_p1, not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p3, not_p4, not_p6, not_p8, p11, p16, p18, p21, p22, p23, p24, p25, p5, p7, p9}
{not_p1, not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p4, not_p6, not_p8, p16, p18, p21, p22, p23, p24, p25, p3, p5, p7, p9}
{not_p1, not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, p11, p15, p24, p25, p9}
{not_p1, not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p4, not_p6, not_p7, not_p9, p11, p15, p22, p5, p8}
{not_p1, not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p6, not_p7, not_p9, p15, p22, p4, p5, p8}
{not_p1, not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p22, not_p23, not_p24, not_p25, not_p3, not_p4, not_p5, not_p6, not_p7, not_p9, p11, p15, p21, p8}
{not_p1, not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p22, not_p23, not_p24, not_p25, not_p3, not_p5, not_p6, not_p7, not_p9, p15, p21, p4, p8}
{not_p1, not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p20, not_p21, not_p22, not_p23, not_p3, not_p5, not_p6, not_p7, not_p8, p15, p2, p24, p25, p4, p9}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p2, not_p20, not_p21, not_p6, not_p8, not_p9, p11, p14, p16, p18, p22, p23, p24, p25, p3, p4, p5, p7}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p2, not_p20, not_p25, not_p3, not_p4, not_p6, not_p9, p11, p14, p16, p18, p21, p22, p23, p24, p5, p7, p8}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p2, not_p20, not_p3, not_p4, not_p6, not_p8, not_p9, p11, p14, p16, p18, p21, p22, p23, p24, p25, p5, p7}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p20, not_p21, not_p6, not_p7, not_p8, not_p9, p11, p14, p16, p18, p2, p22, p23, p24, p25, p3, p4, p5}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p20, not_p21, not_p6, not_p7, not_p8, p14, p16, p18, p2, p22, p23, p24, p25, p3, p4, p5, p9}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p20, not_p25, not_p3, not_p4, not_p6, not_p7, not_p9, p11, p14, p16, p18, p2, p21, p22, p23, p24, p5, p8}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p20, not_p25, not_p4, not_p6, not_p7, p14, p16, p18, p2, p21, p22, p23, p24, p3, p5, p8, p9}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p20, not_p3, not_p4, not_p6, not_p7, not_p8, not_p9, p11, p14, p16, p18, p2, p21, p22, p23, p24, p25, p5}
{not_p1, not_p12, not_p15, not_p17, not_p19, not_p20, not_p4, not_p6, not_p7, not_p8, p14, p16, p18, p2, p21, p22, p23, p24, p25, p3, p5, p9}
{not_p1, not_p12, not_p15, not_p19, not_p2, not_p20, not_p21, not_p25, not_p6, not_p8, not_p9, p11, p14, p16, p17, p18, p22, p23, p24, p3, p4, p5, p7}
{not_p1, not_p12, not_p15, not_p19, not_p2, not_p20, not_p21, not_p25, not_p6, not_p8, p14, p16, p17, p18, p22, p23, p24, p3, p4, p5, p7, p9}
{not_p1, not_p12, not_p15, not_p19, not_p2, not_p20, not_p25, not_p3, not_p6, not_p8, not_p9, p11, p14, p16, p17, p18, p21, p22, p23, p24, p4, p5, p7}
{not_p1, not_p12, not_p15, not_p19, not_p20, not_p21, not_p25, not_p6, not_p7, not_p8, not_p9, p11, p14, p16, p17, p18, p2, p22, p23, p24, p3, p4, p5}
{not_p1, not_p12, not_p15, not_p19, not_p20, not_p21, not_p25, not_p6, not_p7, not_p8, p14, p16, p17, p18, p2, p22, p23, p24, p3, p4, p5, p9}
{not_p1, not_p12, not_p15, not_p19, not_p20, not_p25, not_p3, not_p4, not_p6, not_p7, not_p8, not_p9, p11, p14, p16, p17, p18, p2, p21, p22, p23, p24, p5}
{not_p1, not_p12, not_p15, not_p19, not_p20, not_p25, not_p4, not_p6, not_p7, not_p8, p14, p16, p17, p18, p2, p21, p22, p23, p24, p3, p5, p9}
{not_p1, not_p12, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p3, not_p5, not_p6, not_p7, not_p8, p14, p15, p24, p25, p4, p9}
{not_p1, not_p12, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p3, not_p6, not_p7, not_p8, not_p9, p11, p14, p15, p22, p25, p4, p5}
{not_p1, not_p12, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p23, not_p24, not_p3, not_p5, not_p6, not_p7, not_p8, not_p9, p11, p14, p15, p21, p22, p25, p4}
{not_p1, not_p12, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p24, not_p3, not_p4, not_p5, not_p7, not_p8, not_p9, p11, p14, p15, p21, p22, p23, p25, p6}
{not_p1, not_p12, not_p16, not_p17, not_p18, not_p19, not_p20, not_p23, not_p24, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p11, p14, p15, p2, p21, p22, p25}
{not_p1, not_p12, not_p16, not_p17, not_p19, not_p2, not_p20, not_p22, not_p24, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p11, p13, p14, p15, p18, p21, p23, p25}
{not_p1, not_p12, not_p16, not_p17, not_p19, not_p20, not_p22, not_p23, not_p24, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p11, p13, p14, p15, p18, p2, p21, p25}
{not_p1, not_p12, not_p16, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p4, not_p6, not_p7, not_p8, not_p9, p10, p11, p14, p15, p17, p22, p5}
{not_p1, not_p12, not_p16, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p6, not_p7, not_p8, not_p9, p14, p15, p17, p22, p4, p5}
{not_p1, not_p12, not_p17, not_p18, not_p19, not_p2, not_p20, not_p22, not_p24, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p11, p14, p15, p16, p21, p23, p25}
{not_p1, not_p12, not_p17, not_p18, not_p19, not_p20, not_p22, not_p23, not_p24, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p11, p14, p15, p16, p2, p21, p25}
{not_p1, not_p14, not_p15, not_p16, not_p17, not_p18, not_p20, not_p21, not_p22, not_p23, not_p25, not_p3, not_p5, not_p7, not_p8, p10, p11, p12, p19, p2, p24, p4, p6, p9}
{not_p1, not_p14, not_p16, not_p17, not_p18, not_p19, not_p20, not_p22, not_p23, not_p24, not_p25, not_p3, not_p4, not_p5, not_p7, not_p8, not_p9, p11, p12, p15, p2, p21, p6}
{not_p1, not_p14, not_p16, not_p17, not_p18, not_p19, not_p20, not_p22, not_p23, not_p24, not_p25, not_p3, not_p5, not_p7, not_p8, not_p9, p12, p15, p2, p21, p4, p6}
{not_p1, not_p14, not_p17, not_p19, not_p20, not_p22, not_p23, not_p24, not_p25, not_p3, not_p4, not_p5, not_p6, not_p9, p12, p15, p16, p18, p2, p21, p7, p8}
{not_p1, not_p16, not_p17, not_p18, not_p19, not_p2, not_p22, not_p24, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p10, p11, p12, p13, p14, p15, p20, p21, p23, p25}
{not_p1, not_p16, not_p17, not_p18, not_p19, not_p22, not_p23, not_p24, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p11, p12, p13, p14, p15, p2, p20, p21, p25}
{not_p12, not_p14, not_p15, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p25, not_p3, not_p6, p1, p21, p22, p23, p24, p4, p5, p7, p8, p9}
{not_p12, not_p14, not_p15, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p25, not_p3, not_p7, p1, p21, p22, p23, p24, p4, p5, p6, p8, p9}
{not_p12, not_p14, not_p15, not_p16, not_p17, not_p20, not_p21, not_p22, not_p23, not_p3, not_p5, not_p6, not_p7, not_p8, p1, p11, p18, p19, p2, p24, p25, p4, p9}
{not_p12, not_p14, not_p15, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p25, not_p3, not_p6, p1, p16, p22, p23, p24, p4, p5, p7, p8, p9}
{not_p12, not_p14, not_p15, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p25, not_p3, not_p7, p1, p16, p22, p23, p24, p4, p5, p6, p8, p9}
{not_p12, not_p14, not_p15, not_p17, not_p18, not_p19, not_p2, not_p20, not_p25, not_p3, not_p4, not_p6, p1, p16, p21, p22, p23, p24, p5, p7, p8, p9}
{not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p21, not_p3, not_p6, not_p8, p1, p16, p18, p22, p23, p24, p25, p4, p5, p7, p9}
{not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p21, not_p3, not_p7, not_p8, p1, p16, p18, p22, p23, p24, p25, p4, p5, p6, p9}
{not_p12, not_p14, not_p15, not_p17, not_p19, not_p2, not_p20, not_p3, not_p4, not_p6, not_p8, p1, p16, p18, p21, p22, p23, p24, p25, p5, p7, p9}
{not_p12, not_p14, not_p15, not_p17, not_p2, not_p20, not_p21, not_p22, not_p3, not_p4, not_p5, not_p6, not_p8, p1, p11, p16, p18, p19, p23, p24, p25, p7, p9}
{not_p12, not_p14, not_p15, not_p17, not_p2, not_p20, not_p21, not_p22, not_p3, not_p5, not_p7, not_p8, p1, p10, p11, p16, p18, p19, p23, p24, p25, p4, p6, p9}
{not_p12, not_p14, not_p15, not_p17, not_p20, not_p21, not_p22, not_p23, not_p3, not_p4, not_p5, not_p6, not_p8, p1, p11, p16, p18, p19, p2, p24, p25, p7, p9}
{not_p12, not_p14, not_p15, not_p20, not_p21, not_p22, not_p23, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, p1, p11, p16, p17, p18, p19, p2, p24, p25, p9}
{not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p24, not_p25, not_p4, not_p5, not_p6, not_p7, not_p9, p1, p15, p3, p8}
{not_p12, not_p14, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p4, not_p6, not_p7, not_p9, p1, p15, p22, p5, p8}
{not_p12, not_p14, not_p16, not_p17, not_p19, not_p2, not_p20, not_p22, not_p23, not_p24, not_p25, not_p3, not_p4, not_p5, not_p6, not_p7, not_p9, p1, p15, p18, p21, p8}
{not_p12, not_p14, not_p16, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, p1, p11, p15, p17, p18, p24, p25, p9}
{not_p12, not_p14, not_p16, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p3, not_p5, not_p6, not_p7, not_p8, p1, p15, p17, p18, p24, p25, p4, p9}
{not_p12, not_p14, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p24, not_p25, not_p4, not_p5, not_p9, p1, p15, p16, p3, p6, p7, p8}
{not_p12, not_p14, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p4, not_p5, not_p8, p1, p11, p15, p16, p24, p25, p3, p6, p7, p9}
{not_p12, not_p14, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p5, not_p8, p1, p15, p16, p24, p25, p3, p4, p6, p7, p9}
{not_p12, not_p14, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p4, not_p9, p1, p15, p16, p22, p5, p6, p7, p8}
{not_p12, not_p14, not_p17, not_p18, not_p2, not_p20, not_p21, not_p22, not_p23, not_p5, not_p6, not_p8, p1, p15, p16, p19, p24, p25, p3, p4, p7, p9}
{not_p12, not_p14, not_p17, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p3, not_p4, not_p5, not_p8, p1, p15, p16, p18, p24, p25, p6, p7, p9}
{not_p12, not_p14, not_p17, not_p19, not_p2, not_p20, not_p22, not_p23, not_p24, not_p25, not_p3, not_p4, not_p5, not_p9, p1, p15, p16, p18, p21, p6, p7, p8}
{not_p12, not_p14, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p3, not_p4, not_p5, not_p7, not_p8, p1, p15, p16, p17, p18, p24, p25, p6, p9}
{not_p12, not_p14, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p4, not_p7, not_p9, p1, p10, p15, p16, p17, p18, p22, p5, p6, p8}
{not_p12, not_p15, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p25, not_p3, not_p4, not_p6, not_p9, p1, p11, p14, p21, p22, p23, p24, p5, p7, p8}
{not_p12, not_p15, not_p16, not_p17, not_p18, not_p19, not_p20, not_p25, not_p3, not_p4, not_p6, not_p7, not_p9, p1, p11, p14, p2, p21, p22, p23, p24, p5, p8}
{not_p12, not_p15, not_p16, not_p17, not_p18, not_p19, not_p20, not_p25, not_p3, not_p6, not_p7, p1, p14, p2, p21, p22, p23, p24, p4, p5, p8, p9}
{not_p12, not_p15, not_p16, not_p18, not_p2, not_p20, not_p25, not_p3, not_p5, not_p7, not_p9, p1, p10, p11, p14, p17, p19, p21, p22, p23, p24, p4, p6, p8}
{not_p12, not_p15, not_p16, not_p18, not_p20, not_p21, not_p22, not_p23, not_p24, not_p25, not_p4, not_p5, not_p6, not_p7, not_p9, p1, p11, p14, p17, p19, p2, p3, p8}
{not_p12, not_p15, not_p16, not_p19, not_p20, not_p25, not_p3, not_p4, not_p6, not_p7, p1, p14, p17, p18, p2, p21, p22, p23, p24, p5, p8, p9}
{not_p12, not_p15, not_p16, not_p2, not_p20, not_p25, not_p3, not_p4, not_p5, not_p7, not_p9, p1, p11, p14, p17, p18, p19, p21, p22, p23, p24, p6, p8}
{not_p12, not_p15, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p25, not_p3, not_p6, not_p9, p1, p11, p14, p16, p22, p23, p24, p4, p5, p7, p8}
{not_p12, not_p15, not_p17, not_p18, not_p19, not_p20, not_p21, not_p25, not_p3, not_p6, not_p7, not_p9, p1, p11, p14, p16, p2, p22, p23, p24, p4, p5, p8}
{not_p12, not_p15, not_p17, not_p18, not_p19, not_p20, not_p21, not_p25, not_p3, not_p6, not_p7, p1, p14, p16, p2, p22, p23, p24, p4, p5, p8, p9}
{not_p12, not_p15, not_p17, not_p19, not_p2, not_p20, not_p21, not_p3, not_p6, not_p8, not_p9, p1, p11, p14, p16, p18, p22, p23, p24, p25, p4, p5, p7}
{not_p12, not_p15, not_p17, not_p19, not_p2, not_p20, not_p21, not_p3, not_p7, not_p8, not_p9, p1, p10, p11, p14, p16, p18, p22, p23, p24, p25, p4, p5, p6}
{not_p12, not_p15, not_p17, not_p19, not_p20, not_p21, not_p3, not_p6, not_p7, not_p8, not_p9, p1, p11, p14, p16, p18, p2, p22, p23, p24, p25, p4, p5}
{not_p12, not_p15, not_p17, not_p19, not_p20, not_p21, not_p3, not_p6, not_p7, not_p8, p1, p14, p16, p18, p2, p22, p23, p24, p25, p4, p5, p9}
{not_p12, not_p15, not_p17, not_p2, not_p20, not_p21, not_p22, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p1, p10, p11, p14, p16, p18, p19, p23, p24, p25}
{not_p12, not_p15, not_p19, not_p2, not_p20, not_p21, not_p25, not_p3, not_p6, not_p8, not_p9, p1, p11, p14, p16, p17, p18, p22, p23, p24, p4, p5, p7}
{not_p12, not_p15, not_p19, not_p2, not_p20, not_p21, not_p25, not_p3, not_p6, not_p8, p1, p14, p16, p17, p18, p22, p23, p24, p4, p5, p7, p9}
{not_p12, not_p15, not_p19, not_p2, not_p20, not_p21, not_p25, not_p3, not_p7, not_p8, not_p9, p1, p10, p11, p14, p16, p17, p18, p22, p23, p24, p4, p5, p6}
{not_p12, not_p15, not_p19, not_p2, not_p20, not_p21, not_p25, not_p3, not_p7, not_p8, p1, p14, p16, p17, p18, p22, p23, p24, p4, p5, p6, p9}
{not_p12, not_p15, not_p19, not_p20, not_p21, not_p25, not_p3, not_p6, not_p7, not_p8, not_p9, p1, p11, p14, p16, p17, p18, p2, p22, p23, p24, p4, p5}
{not_p12, not_p15, not_p19, not_p20, not_p21, not_p25, not_p3, not_p6, not_p7, not_p8, p1, p14, p16, p17, p18, p2, p22, p23, p24, p4, p5, p9}
{not_p12, not_p15, not_p19, not_p20, not_p25, not_p3, not_p4, not_p6, not_p7, not_p8, p1, p14, p16, p17, p18, p2, p21, p22, p23, p24, p5, p9}
{not_p12, not_p15, not_p2, not_p20, not_p21, not_p22, not_p25, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, not_p9, p1, p10, p11, p14, p16, p17, p18, p19, p23, p24}
{not_p12, not_p15, not_p2, not_p20, not_p21, not_p22, not_p25, not_p3, not_p4, not_p5, not_p6, not_p7, not_p8, p1, p11, p14, p16, p17, p18, p19, p23, p24, p9}
{not_p12, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p24, not_p5, not_p8, not_p9, p1, p10, p11, p14, p15, p16, p25, p3, p4, p6, p7}
{not_p12, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p3, not_p8, not_p9, p1, p10, p11, p14, p15, p16, p22, p25, p4, p5, p6, p7}
{not_p12, not_p17, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p3, not_p8, p1, p11, p14, p15, p16, p22, p25, p4, p5, p6, p7, p9}
{not_p12, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p24, not_p25, not_p4, not_p5, not_p8, not_p9, p1, p11, p14, p15, p16, p17, p3, p6, p7}
{not_p12, not_p18, not_p19, not_p2, not_p20, not_p21, not_p22, not_p23, not_p24, not_p25, not_p5, not_p8, not_p9, p1, p14, p15, p16, p17, p3, p4, p6, p7}
{not_p12, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p4, not_p8, not_p9, p1, p10, p11, p14, p15, p16, p17, p22, p5, p6, p7}
{not_p12, not_p18, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p8, not_p9, p1, p14, p15, p16, p17, p22, p4, p5, p6, p7}
{not_p12, not_p19, not_p2, not_p20, not_p21, not_p23, not_p24, not_p25, not_p3, not_p4, not_p7, not_p8, not_p9, p1, p10, p14, p15, p16, p17, p18, p22, p5, p6}
{not_p14, not_p15, not_p16, not_p17, not_p18, not_p19, not_p2, not_p20, not_p25, not_p3, not_p4, not_p7, p1, p11, p12, p21, p22, p23, p24, p5, p6, p8, p9}
"""
| 45.484634 | 182 | 0.700624 | 9,083 | 38,480 | 2.722779 | 0.004844 | 0.058712 | 0.041729 | 0.031135 | 0.949577 | 0.894384 | 0.752254 | 0.592455 | 0.432049 | 0.366261 | 0 | 0.299303 | 0.11237 | 38,480 | 845 | 183 | 45.538462 | 0.424757 | 0 | 0 | 0.002372 | 1 | 0.124555 | 0.999168 | 0.384641 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
73b66e973675814900d4b8fdce35c0fcc2e55e74 | 179 | py | Python | code/Rate.py | PPeltola/tiralabra | c41a9cf76eafe78e23706aeae02ba6e906adba6f | [
"MIT"
] | null | null | null | code/Rate.py | PPeltola/tiralabra | c41a9cf76eafe78e23706aeae02ba6e906adba6f | [
"MIT"
] | 1 | 2021-08-25T08:46:43.000Z | 2021-08-25T08:46:43.000Z | code/Rate.py | PPeltola/tiralabra | c41a9cf76eafe78e23706aeae02ba6e906adba6f | [
"MIT"
] | null | null | null | from math import exp
def decaying(x, n, decay_rate):
    return x / (1 + decay_rate * (n - 1))
def exponential(n, initial, decay_rate):
    return initial * exp(-decay_rate * n) | 25.571429 | 41 | 0.664804 | 29 | 179 | 3.965517 | 0.482759 | 0.313043 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014085 | 0.206704 | 179 | 7 | 42 | 25.571429 | 0.795775 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7
73e61f613d17d793d388666544d0a3a0b64c2804 | 122 | py | Python | roboball2d/ball_gun/__init__.py | dtrb/roboball2d | 9e49ec89d3a8ce2b3b249dc8f3a975e9a412977a | [
"BSD-3-Clause"
] | 2 | 2020-04-17T19:52:32.000Z | 2020-06-14T13:27:42.000Z | roboball2d/ball_gun/__init__.py | dtrb/roboball2d | 9e49ec89d3a8ce2b3b249dc8f3a975e9a412977a | [
"BSD-3-Clause"
] | 4 | 2020-05-04T11:27:03.000Z | 2020-07-08T15:40:54.000Z | roboball2d/ball_gun/__init__.py | dtrb/roboball2d | 9e49ec89d3a8ce2b3b249dc8f3a975e9a412977a | [
"BSD-3-Clause"
] | 4 | 2020-04-09T13:21:24.000Z | 2021-06-14T11:23:09.000Z | from roboball2d.ball_gun.default_ball_gun import DefaultBallGun
from roboball2d.ball_gun.drop_ball_gun import DropBallGun
| 40.666667 | 63 | 0.901639 | 18 | 122 | 5.777778 | 0.5 | 0.269231 | 0.346154 | 0.403846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017544 | 0.065574 | 122 | 2 | 64 | 61 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
fb3fd4c68fe77d402f66861e2c7275e48e1c59cf | 2,873 | py | Python | oxe-api/test/resource/public/test_get_public_company.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/public/test_get_public_company.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/public/test_get_public_company.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | from test.BaseCase import BaseCase
class TestGetPublicCompany(BaseCase):
@BaseCase.login
def test_ok(self, token):
self.db.insert({"id": 2, "name": "My Company"}, self.db.tables["Company"])
response = self.application.get('/public/get_public_company/2',
headers=self.get_standard_header(token))
self.assertEqual(200, response.status_code)
self.assertEqual(response.json, {
'id': 2,
'image': None,
'name': 'My Company',
'status': 'ACTIVE',
'is_startup': 0,
'is_cybersecurity_core_business': 0,
'creation_date': None,
'description': None,
'sync_address': None,
'sync_global': None,
'sync_id': None,
'sync_node': None,
'sync_status': "OK",
'trade_register_number': None,
'website': None,
'linkedin_url': None,
'discord_url': None,
'twitter_url': None,
'youtube_url': None
})
@BaseCase.login
def test_ok_with_assignments(self, token):
self.db.insert({"id": 2, "name": "My Company"}, self.db.tables["Company"])
response = self.application.get('/public/get_public_company/2?include_assignments=True',
headers=self.get_standard_header(token))
self.assertEqual(200, response.status_code)
self.assertEqual(response.json, {
'id': 2,
'image': None,
'name': 'My Company',
'status': 'ACTIVE',
'is_startup': 0,
'is_cybersecurity_core_business': 0,
'creation_date': None,
'description': None,
'sync_address': None,
'sync_global': None,
'sync_id': None,
'sync_node': None,
'sync_status': "OK",
'taxonomy_assignment': [],
'trade_register_number': None,
'website': None,
'linkedin_url': None,
'discord_url': None,
'twitter_url': None,
'youtube_url': None
})
@BaseCase.login
def test_unexisting_id(self, token):
response = self.application.get('/public/get_public_company/4',
headers=self.get_standard_header(token))
self.assertEqual("422 Object not found", response.status)
@BaseCase.login
def test_ko_deleted_status_company(self, token):
self.db.insert({"id": 4, "name": "My Company", "status": "DELETED"}, self.db.tables["Company"])
response = self.application.get('/public/get_public_company/4',
headers=self.get_standard_header(token))
self.assertEqual("422 Object not found", response.status)
| 35.036585 | 103 | 0.540898 | 291 | 2,873 | 5.127148 | 0.24055 | 0.053619 | 0.043566 | 0.053619 | 0.86059 | 0.844504 | 0.829088 | 0.829088 | 0.829088 | 0.829088 | 0 | 0.013007 | 0.331013 | 2,873 | 81 | 104 | 35.469136 | 0.763267 | 0 | 0 | 0.852941 | 0 | 0 | 0.251305 | 0.083188 | 0 | 0 | 0 | 0 | 0.088235 | 1 | 0.058824 | false | 0 | 0.014706 | 0 | 0.088235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fba5d91c46452dfdd2540c7c5520c4346511eb64 | 8,065 | py | Python | tests/commands/test_cooldowncommand.py | matthew-robertson/banned-word-tracker | 32defe7936114258325ef8ba2f740648d43d4abf | [
"MIT"
] | 11 | 2019-03-10T18:31:59.000Z | 2021-02-13T12:42:44.000Z | tests/commands/test_cooldowncommand.py | matthew-robertson/banned-word-tracker | 32defe7936114258325ef8ba2f740648d43d4abf | [
"MIT"
] | 51 | 2019-02-21T21:21:59.000Z | 2022-03-09T01:29:55.000Z | tests/commands/test_cooldowncommand.py | matthew-robertson/vore-tracker | c35807612397ae7bc540cb0a1af6bf3ec1f98593 | [
"MIT"
] | 5 | 2018-07-12T06:36:29.000Z | 2019-01-09T04:11:19.000Z | import unittest
from unittest.mock import patch
import discord
import datetime
from commands import CooldownCommand
from serverobjects.server import DiscordServer
class TestCooldownCommand(unittest.TestCase):
def setUp(self):
self.command = CooldownCommand()
def test_is_command_authorized__no_permissions_allowed(self):
result = self.command.is_command_authorized()
self.assertTrue(result)
def test_is_command_authorized__non_admin_allowed(self):
permissions = discord.Permissions()
result = self.command.is_command_authorized(permissions)
self.assertTrue(result)
def test_is_command_authorized__admin_allowed(self):
permissions = discord.Permissions.all()
result = self.command.is_command_authorized(permissions)
self.assertTrue(result)
def test_execute__one_word_cooldown(self):
time = datetime.datetime.now()
server_json = {
'server_id' : 1,
'awake' : True,
'timeout_duration_seconds': 1800,
'prefix': '!vt',
'banned_words': [{
'rowid': 1,
'server_id': 1,
'banned_word': 'vore',
'infracted_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
}]
}
server = DiscordServer(server_json, time, None)
self.assertEqual(
self.command.execute(server, time, "!vtct", None),
"The cooldown period is 30 minutes and 0 seconds.\n" +
"I'll be able to issue another alert for 'vore' in 9 minutes and 59 seconds.")
def test_execute__one_word_available(self):
time = datetime.datetime.now()
server_json = {
'server_id' : 1,
'awake' : True,
'timeout_duration_seconds': 1800,
'prefix': '!vt',
'banned_words': [{
'rowid': 1,
'server_id': 1,
'banned_word': 'vore',
'infracted_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
}]
}
server = DiscordServer(server_json, time, None)
self.assertEqual(
self.command.execute(server, time, "!vtct", None),
"The cooldown period is 30 minutes and 0 seconds.\nI'm ready to issue another warning for 'vore' now.")
def test_execute__multiple_words_mixed(self):
time = datetime.datetime.now()
server_json = {
'server_id' : 1,
'awake' : True,
'timeout_duration_seconds': 1800,
'prefix': '!vt',
'banned_words': [{
'rowid': 1,
'server_id': 1,
'banned_word': 'vore',
'infracted_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
},
{
'rowid': 2,
'server_id': 1,
'banned_word': 'test',
'infracted_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
}]
}
server = DiscordServer(server_json, time, None)
self.assertEqual(
self.command.execute(server, time, "!vtct", None),
"The cooldown period is 30 minutes and 0 seconds.\n" +
"I'm ready to issue another warning for 'vore' now.\n" +
"I'll be able to issue another alert for 'test' in 9 minutes and 59 seconds.")
def test_execute__one_of_multiple(self):
time = datetime.datetime.now()
server_json = {
'server_id' : 1,
'awake' : True,
'timeout_duration_seconds': 1800,
'prefix': '!vt',
'banned_words': [{
'rowid': 1,
'server_id': 1,
'banned_word': 'vore',
'infracted_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
},
{
'rowid': 2,
'server_id': 1,
'banned_word': 'test',
'infracted_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
}]
}
server = DiscordServer(server_json, time, None)
self.assertEqual(
self.command.execute(server, time, "!vtct 1", None),
"The cooldown period is 30 minutes and 0 seconds.\n" +
"I'm ready to issue another warning for 'vore' now.")
self.assertEqual(
self.command.execute(server, time, "!vtct 2", None),
"The cooldown period is 30 minutes and 0 seconds.\n" +
"I'll be able to issue another alert for 'test' in 9 minutes and 59 seconds.")
def test_execute__out_of_range(self):
time = datetime.datetime.now()
server_json = {
'server_id' : 1,
'awake' : True,
'timeout_duration_seconds': 1800,
'prefix': '!vt',
'banned_words': [{
'rowid': 1,
'server_id': 1,
'banned_word': 'vore',
'infracted_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=40)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
},
{
'rowid': 2,
'server_id': 1,
'banned_word': 'test',
'infracted_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'calledout_at': (time - datetime.timedelta(minutes=20)).strftime("%Y-%m-%d %H:%M:%S"),
'record': {
'record_seconds': 2400,
'infraction_count': 0
}
}]
}
server = DiscordServer(server_json, time, None)
self.assertEqual(
self.command.execute(server, time, "!vtct -1", None),
"The cooldown period is 30 minutes and 0 seconds.\n" +
"I'm ready to issue another warning for 'vore' now.\n" +
"I'll be able to issue another alert for 'test' in 9 minutes and 59 seconds.")
self.assertEqual(
self.command.execute(server, time, "!vtct 5", None),
"The cooldown period is 30 minutes and 0 seconds.\n" +
"I'm ready to issue another warning for 'vore' now.\n" +
"I'll be able to issue another alert for 'test' in 9 minutes and 59 seconds.") | 41.787565 | 115 | 0.509361 | 865 | 8,065 | 4.6 | 0.115607 | 0.063333 | 0.056296 | 0.092486 | 0.917567 | 0.906509 | 0.874843 | 0.874843 | 0.834883 | 0.825584 | 0 | 0.029406 | 0.354867 | 8,065 | 193 | 116 | 41.787565 | 0.735345 | 0 | 0 | 0.748634 | 0 | 0.005464 | 0.285643 | 0.014877 | 0 | 0 | 0 | 0 | 0.054645 | 1 | 0.04918 | false | 0 | 0.032787 | 0 | 0.087432 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
83a804c72b793f39bdcf59dab27a35da825f9935 | 96 | py | Python | Martijn/secret_password.py | ArtezGDA/text-IO | b9ed7f2433c0eda08fb45d125ea22a5fdeaef667 | [
"MIT"
] | null | null | null | Martijn/secret_password.py | ArtezGDA/text-IO | b9ed7f2433c0eda08fb45d125ea22a5fdeaef667 | [
"MIT"
] | null | null | null | Martijn/secret_password.py | ArtezGDA/text-IO | b9ed7f2433c0eda08fb45d125ea22a5fdeaef667 | [
"MIT"
] | null | null | null | github_account = {'user' : "martang", 'password' : "c1a72fdf282521203e8141324c6103c8491fc429"}
| 32 | 94 | 0.760417 | 6 | 96 | 12 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.344828 | 0.09375 | 96 | 2 | 95 | 48 | 0.482759 | 0 | 0 | 0 | 0 | 0 | 0.621053 | 0.421053 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
83b5348b0d4cf79d41dc92502b03d8b1d08d0daf | 21,631 | py | Python | Archive/vfi_2.py | madsankern/DynamicProgramming | 0812b844068c33b2529d4b11940f9c89582bc374 | [
"MIT"
] | null | null | null | Archive/vfi_2.py | madsankern/DynamicProgramming | 0812b844068c33b2529d4b11940f9c89582bc374 | [
"MIT"
] | null | null | null | Archive/vfi_2.py | madsankern/DynamicProgramming | 0812b844068c33b2529d4b11940f9c89582bc374 | [
"MIT"
] | null | null | null | # Solve the model using value function iteration
# To do:
# Rewrite value_of_choice to vectorize
# Vectorize inner loop in solve_VFI
import numpy as np
import tools
import scipy.optimize as optimize
import utility as util
import quantecon as qe # Package for Nelder-Mead algorithm
from numba import njit # Package for Nelder-Mead algorithm
def solve_VFI(par):
# Initialize solution class
class sol: pass
sol.c = par.grid_a.copy() # Initial guess is to consume everything
sol.v = util.u(sol.c,par) # Utility of consumption
    sol.a = par.grid_a.copy() # Copy the exogenous asset grid for consistency (with the EGM algorithm) -- I can see that this initialization is necessary to be able to
    # plot on an a-grid, but the grid is already initialized in setup(), so maybe par.grid_a can be used directly when plotting? Tried that, but it complains.
sol.it = 0 # Iteration counter
sol.delta = 1000.0 # Difference between two iterations
# Iterate value function until convergence or break if no convergence
while (sol.delta >= par.tol_vfi and sol.it < par.max_iter):
# Use last iteration as the continuation value. See slides if confused
v_next = sol.v.copy()
# Loop over asset grid
for i_a,a in enumerate(par.grid_a):
            # Minimize the negative of the value function wrt consumption
obj_fun = lambda x : - value_of_choice(x,a,par.grid_a,v_next,par)
res = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
# Unpack solution
sol.v[i_a] = -res.fun
sol.c[i_a] = res.x
# Update iteration parameters
sol.it += 1
sol.delta = max(abs(sol.v - v_next))
return sol
# Function that returns value of consumption choice
def value_of_choice(x,a,a_next,v_next,par):
# Unpack consumption (choice variable)
c = x
    # Initialize expected continuation value
Ev_next = 0.0
# Loop over each possible state
for i in [0,1]:
# Next periods state for each income level
a_plus = par.y[i] + (1+par.r)*(a - c)
#Interpolate continuation given state a_plus
v_plus = tools.interp_linear_1d_scalar(a_next,v_next,a_plus)
# Append continuation value to calculate expected value
Ev_next += par.Pi[i] * v_plus
# Value of choice
v_guess = util.u(c,par) + par.beta * Ev_next
return v_guess
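As a hedged aside (not part of the original module): `tools.interp_linear_1d_scalar` is a project-specific helper, but for interior grid points its behavior should match NumPy's `np.interp`, which can be used to sanity-check the interpolation step above:

```python
import numpy as np

# Continuation value v_next tabulated on a coarse asset grid a_next.
a_next = np.array([0.0, 1.0, 2.0, 3.0])
v_next = np.array([0.0, 1.0, 1.5, 1.75])

# Linearly interpolate the continuation value at an off-grid asset level a_plus.
v_plus = np.interp(1.5, a_next, v_next)
print(v_plus)  # 1.25, halfway between v_next[1]=1.0 and v_next[2]=1.5
```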
def solve_VFI_2d(par):
# Initialize solution class
class sol: pass
shape = (np.size(par.y),1)
sol.c = np.tile(par.grid_a.copy(), shape) # Initial guess is to consume everything for each state
sol.v = util.u(sol.c,par) # Utility of consumption
    sol.a = par.grid_a.copy() # Copy the exogenous asset grid for consistency (with the EGM algorithm)
state1 = 1 # UNEMPLOYMENT STATE Used as boolean in "value_of_choice" - Defined here for readability
state2 = 0 # EMPLOYMENT STATE Used as boolean in "value_of_choice" - Defined here for readability
sol.it = 0 # Iteration counter
sol.delta = 1000.0 # Difference between two iterations
# Iterate value function until convergence or break if no convergence
while (sol.delta >= par.tol_vfi and sol.it < par.max_iter):
# Use last iteration as the continuation value. See slides if confused
v_next = sol.v.copy()
# Loop over asset grid
for i_a,a in enumerate(par.grid_a):
# FUNCTIONS BELOW CAN BE WRITTEN AS LOOP - for i=0,1 - AND BE STORED IN AN ARRAY/LIST WITH TWO ENTRIES - a la res[i]=optimize.minimize....
            # Minimize the negative of the value function wrt consumption conditional on unemployment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[0,:],par,state1)
res_1 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
            # Minimize the negative of the value function wrt consumption conditional on employment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[1,:],par,state2)
res_2 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
# Unpack solutions
# State 1
sol.v[0,i_a] = -res_1.fun
sol.c[0,i_a] = res_1.x
# State 2
sol.v[1,i_a] = -res_2.fun
sol.c[1,i_a] = res_2.x
# Update iteration parameters
sol.it += 1
sol.delta = max( max(abs(sol.v[0] - v_next[0])), max(abs(sol.v[1] - v_next[1]))) # check this, is this optimal
return sol
# Function that returns value of consumption choice conditional on the state
def value_of_choice_2d(x,a,a_next,v_next,par,state):
# Unpack consumption (choice variable)
c = x
    # Initialize expected continuation value
Ev_next = 0.0
# Compute value of choice conditional on being in state 1 (unemployment state)
###### VECTORIZE THIS
if state==1:
# Loop over each possible state
for i in [0,1]:
# Next periods state for each income level
a_plus = par.y[i] + (1+par.r)*(a - c)
#Interpolate continuation given state a_plus
v_plus = tools.interp_linear_1d_scalar(a_next,v_next,a_plus)
# Append continuation value to calculate expected value
Ev_next += par.P[0,i] * v_plus
# Compute value of choice conditional on being in state 2 (employment state)
else:
# Loop over each possible state
###### VECTORIZE THIS
for i in [0,1]:
# Next periods state for each income level
a_plus = par.y[i] + (1+par.r)*(a - c)
#Interpolate continuation given state a_plus
v_plus = tools.interp_linear_1d_scalar(a_next,v_next,a_plus)
# Append continuation value to calculate expected value
Ev_next += par.P[1,i] * v_plus
# Value of choice
v_guess = util.u(c,par) + par.beta * Ev_next
return v_guess
# Copy below into solve.model() file to plot policy functions:
# Plot some stuff
#fig = plt.figure(figsize=(14,5))
#ax = fig.add_subplot(1,2,1)
#ax.plot(sol_vfi_2d.a, sol_vfi_2d.c[0,:], linestyle = ':', color = 'red', label = '$y_1$')
#ax.plot(sol_vfi_2d.a, sol_vfi_2d.c[1,:], linestyle = ':', color = 'blue', label = '$y_2$')
#ax.plot(sol_vfi_2d.a[:10], sol_vfi_2d.a[:10], linestyle = '--', color = '0.6') # Check with 45 degree line. Seems correct
#ax.set_xlabel(f"Assets, $a_t$")
#ax.set_ylabel(f"Consumption, $c^\star_t$")
#ax.set_title(f'Policy function')
#ax.set_xlim([-1,20])
#ax.legend(frameon=True)
#plt.show()
# Attempt with 2 choice variables
#### Notes: Because the problem must be solved in two parts, we need an if-statement so that one of the problems only runs when h_min>a. If
#### h_min>a and we ALSO try to solve the problem with h bounded between h_min and a, we cannot get any solutions that satisfy the budget constraint, since the choice
#### of housing is not feasible, e.g. if we only have a=2 and h_min=4.
#### The idea has therefore been to run only the problem where consumption is chosen as long as h_min > a, and otherwise run both problems and compare them in the latter scenario.
def solve_VFI_2dfull(par):
# Initialize solution class
class sol: pass
shape = (np.size(par.y),1)
sol.c = np.tile(par.grid_a.copy(), shape) # Initial guess is to consume everything for each state
sol.h = np.zeros(np.shape(sol.c)) # Initial guess for housing is therefore zero for each state
sol.v = util.u_with_housing(sol.c,sol.h,par) # Utility of consumption
    sol.a = par.grid_a.copy() # Copy the exogenous asset grid for consistency (with the EGM algorithm)
state1 = 1 # UNEMPLOYMENT STATE Used as boolean in "value_of_choice" - Defined here for readability
state2 = 0 # EMPLOYMENT STATE Used as boolean in "value_of_choice" - Defined here for readability
sol.it = 0 # Iteration counter
sol.delta = 1000.0 # Difference between two iterations
# Iterate value function until convergence or break if no convergence
while (sol.delta >= par.tol_vfi and sol.it < par.max_iter):
# Use last iteration as the continuation value. See slides if confused
v_next = sol.v.copy()
# Loop over asset grid
for i_a,a in enumerate(par.grid_a):
if par.h_min > a:
### THIS IS ESSENTIALLY THE SAME PROBLEM AS BEFORE IN SCALAR-CASE. (Maybe we should use minimize_scalar, if it is faster)
                # Minimize the negative of the value function wrt consumption conditional on unemployment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[0,:],par,state1)
res_1 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
                # Minimize the negative of the value function wrt consumption conditional on employment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[1,:],par,state2)
res_2 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
# Unpack solutions
# State 1
sol.v[0,i_a] = -res_1.fun
sol.c[0,i_a] = res_1.x
sol.h[0,i_a] = 0
# State 2
sol.v[1,i_a] = -res_2.fun
sol.c[1,i_a] = res_2.x
sol.h[1,i_a] = 0
else:
### COMPUTE SOLUTION WITH h=0, AND h element of h_min and a, AND COMPARE SOLUTIONS
                # Minimize the negative of the value function wrt consumption conditional on unemployment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[0,:],par,state1)
res_1 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
                # Minimize the negative of the value function wrt consumption conditional on employment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[1,:],par,state2)
res_2 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
# IMPLEMENT OPTIMIZER BELOW.
###
### DEBUG SOMEWHERE AROUND HERE ###
###
                arguments = [a,par] #### See the link for the idea: https://stackoverflow.com/questions/54611746/scipy-minimize-how-to-pass-args-to-both-the-objective-and-the-constraint
constraint = {'type': 'ineq', 'fun': feasibility_constraint, 'args': arguments}
# Initial guess, x0
#### MULTISTART GUESSES
#####
                ## NOTE: It sometimes complains that "Values in x were outside bounds during a" (it seems to complain in the first iteration) - cannot see why.
#x0 = np.array([np.zeros(2)])+np.array([1.0e-3, par.h_min+1.0e-4])
#x0 = np.array([np.zeros(2)])+np.array([1.0e-6, a-1.0e-4]) # It usually converges for Quasi utility with these initial values.
x0 = np.array([np.zeros(2)])+np.array([1.0e-3, a-1.0e-4]) # It usually converges for Quasi utility with these initial values.
#x0 = np.array([np.zeros(2)])+np.array([1.0e-3, (a+par.h_min)/2])
#x0 = np.array([np.zeros(2)])+np.array([(a-par.h_min)/2, (a+par.h_min)/2])
#x0 = np.array([np.zeros(2)])+np.array([a-par.h_min-1.0e-6, (a+par.h_min)/2])
#x0 = np.array([np.zeros(2)])+np.array([a-par.h_min-1.0e-2, par.h_min+1.0e-4])
                # Minimize the negative of the value function wrt consumption conditional on unemployment state
#res_3 = optimize.minimize(objective_fun, x0, method='SLSQP', args = (a,par.grid_a,v_next[0,:],par,1), bounds = np.array([[1e-8,a+1.0e-4],[par.h_min,a+1.0e-4]]), constraints = constraint)
res_3 = optimize.minimize(objective_fun, x0, method='SLSQP', args = (a,par.grid_a,v_next[0,:],par,state1), bounds = np.array([[1.0e-4,a],[par.h_min,a+1.0e-4]]), constraints = constraint)
                # Minimize the negative of the value function wrt consumption conditional on employment state
#res_4 = optimize.minimize(objective_fun, x0, method='SLSQP', args = (a,par.grid_a,v_next[0,:],par,0), bounds = np.array([[1e-8,a+1.0e-4],[par.h_min,a+1.0e-4]]), constraints = constraint)
res_4 = optimize.minimize(objective_fun, x0, method='SLSQP', args = (a,par.grid_a,v_next[1,:],par,state2), bounds = np.array([[1.0e-4,a],[par.h_min,a+1.0e-4]]), constraints = constraint)
# Unpack solutions
# State 1
                # NOTE: Can maybe store res_1, res_2, res_3, and res_4 in an array/list and run the code below in a loop.
sol.v[0,i_a] = max(-res_1.fun,-res_3.fun)
if -res_1.fun>=-res_3.fun:
sol.c[0,i_a] = res_1.x
sol.h[0,i_a] = 0
else:
sol.c[0,i_a] = res_3.x[0]
sol.h[0,i_a] = res_3.x[1]
# State 2
sol.v[1,i_a] = max(-res_2.fun,-res_4.fun)
if -res_2.fun>=-res_4.fun:
sol.c[1,i_a] = res_2.x
sol.h[1,i_a] = 0
else:
sol.c[1,i_a] = res_4.x[0]
sol.h[1,i_a] = res_4.x[1]
# Update iteration parameters
sol.it += 1
sol.delta = max( max(abs(sol.v[0] - v_next[0])), max(abs(sol.v[1] - v_next[1]))) # check this, is this optimal
print(sol.delta)
return sol
# Function that returns value of consumption choice conditional on the state
def value_of_choice_2dfull(c,h,a,a_next,v_next,par,state):
# Unpack consumption (choice variable)
#c = x[0]
#h = x[1]
    # Initialize expected continuation value
Ev_next = 0.0
# Compute value of choice conditional on being in state 1 (unemployment state)
if state==1:
# Loop over each possible state
for i in [0,1]:
# Next periods state for each income level
a_plus = par.y[i] + (1+par.r)*(a - par.hp*h - c)
#Interpolate continuation given state a_plus
v_plus = tools.interp_linear_1d_scalar(a_next,v_next,a_plus)
# Append continuation value to calculate expected value
Ev_next += par.P[0,i] * v_plus
# Compute value of choice conditional on being in state 2 (employment state)
else:
# Loop over each possible state
for i in [0,1]:
# Next periods state for each income level
a_plus = par.y[i] + (1+par.r)*(a - par.hp*h - c)
#Interpolate continuation given state a_plus
v_plus = tools.interp_linear_1d_scalar(a_next,v_next,a_plus)
# Append continuation value to calculate expected value
Ev_next += par.P[1,i] * v_plus
# Value of choice
v_guess = util.u_with_housing(c,h,par) + par.beta * Ev_next
return v_guess
def objective_fun(x,a,a_next,v_next,par,state):
# Unpack consumption (choice variable)
c = x[0]
h = x[1]
return -value_of_choice_2dfull(c,h,a,a_next,v_next,par,state)
def feasibility_constraint(x,a,par):
# Ensure that consumption and housing jointly cannot exceed cash on hands
c = x[0]
h = x[1]
return a-c-par.hp*h
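A minimal sanity check of the constraint above (the `_Par` stub is hypothetical; only the house price attribute `hp` is assumed): the function returns the budget slack a - c - hp*h, which SLSQP requires to be non-negative for a feasible (c, h) choice.

```python
class _Par:
    hp = 2.0  # hypothetical house price; only this attribute is assumed

def feasibility_constraint(x, a, par):
    # Consumption plus housing expenditure may not exceed cash on hand a.
    c = x[0]
    h = x[1]
    return a - c - par.hp * h

# With a=10, c=4, h=2 and hp=2 the slack is 10 - 4 - 2*2 = 2 (feasible).
slack = feasibility_constraint([4.0, 2.0], 10.0, _Par())
print(slack)  # 2.0
```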
###########################
####### NELDER-MEAD ####### - Does not work yet!
###########################
@njit
def objective_fun_NELDER(x,a,a_next,v_next,par,state):
# Unpack consumption (choice variable)
c = x[0]
h = x[1]
penalty = 0
if c+h > a:
penalty = 10_000*(c+h-a)
#c /= (c+h)/a
#d /= (c+h)/a
return value_of_choice_2dfull(c,h,a,a_next,v_next,par,state) - penalty # maximization
@njit(parallel=True)
def solve_VFI_2dfull_NELDER(par):
    #### NOTE: THE FUNCTION MUST BE INITIALIZED DIFFERENTLY WHEN USING NJIT - A CLASS CANNOT BE USED.
# Initialize solution class
class sol: pass
shape = (np.size(par.y),1)
sol.c = np.tile(par.grid_a.copy(), shape) # Initial guess is to consume everything for each state
sol.h = np.zeros(np.shape(sol.c)) # Initial guess for housing is therefore zero for each state
sol.v = util.u_with_housing(sol.c,sol.h,par) # Utility of consumption
    sol.a = par.grid_a.copy() # Copy the exogenous asset grid for consistency (with the EGM algorithm)
state1 = 1 # UNEMPLOYMENT STATE Used as boolean in "value_of_choice" - Defined here for readability
state2 = 0 # EMPLOYMENT STATE Used as boolean in "value_of_choice" - Defined here for readability
sol.it = 0 # Iteration counter
sol.delta = 1000.0 # Difference between two iterations
# Iterate value function until convergence or break if no convergence
# Use last iteration as the continuation value. See slides if confused
v_next = sol.v.copy()
# Loop over asset grid
for i_a,a in enumerate(par.grid_a):
if par.h_min > a:
### THIS IS ESSENTIALLY THE SAME PROBLEM AS BEFORE IN SCALAR-CASE. (Maybe we should use minimize_scalar, if it is faster)
            # Minimize the negative of the value function wrt consumption conditional on unemployment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[0,:],par,state1)
res_1 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
            # Minimize the negative of the value function wrt consumption conditional on employment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[1,:],par,state2)
res_2 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
# Unpack solutions
# State 1
sol.v[0,i_a] = -res_1.fun
sol.c[0,i_a] = res_1.x
sol.h[0,i_a] = 0
# State 2
sol.v[1,i_a] = -res_2.fun
sol.c[1,i_a] = res_2.x
sol.h[1,i_a] = 0
else:
### COMPUTE SOLUTION WITH h=0, AND h element of h_min and a, AND COMPARE SOLUTIONS
            # Minimize the negative of the value function wrt consumption conditional on unemployment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[0,:],par,state1)
res_1 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
            # Minimize the negative of the value function wrt consumption conditional on employment state
obj_fun = lambda x : - value_of_choice_2d(x,a,par.grid_a,v_next[1,:],par,state2)
res_2 = optimize.minimize_scalar(obj_fun, bounds=[0,a+1.0e-4], method='bounded')
# IMPLEMENT OPTIMIZER BELOW.
###
### DEBUG SOMEWHERE AROUND HERE ###
###
            arguments = [a,par] #### See the link for the idea: https://stackoverflow.com/questions/54611746/scipy-minimize-how-to-pass-args-to-both-the-objective-and-the-constraint
constraint = {'type': 'ineq', 'fun': feasibility_constraint, 'args': arguments}
# Initial guess, x0
#### MULTISTART GUESSES
#x0 = np.array([np.zeros(2)])+np.array([1.0e-3, par.h_min+1.0e-4])
x0 = np.array([np.zeros(2)])+np.array([1.0e-6, a-1.0e-4]) # It usually converges for Quasi utility with these initial values.
#x0 = np.array([np.zeros(2)])+np.array([1.0e-3, (a+par.h_min)/2])
#x0 = np.array([np.zeros(2)])+np.array([(a-par.h_min)/2, (a+par.h_min)/2])
#x0 = np.array([np.zeros(2)])+np.array([a-par.h_min-1.0e-6, (a+par.h_min)/2])
#x0 = np.array([np.zeros(2)])+np.array([a-par.h_min-1.0e-2, par.h_min+1.0e-4])
#res_3 = optimize.minimize(objective_fun, x0, method='SLSQP', args = (a,par.grid_a,v_next[0,:],par,1), bounds = np.array([[1.0e-4,a],[par.h_min,a+1.0e-4]]), constraints = constraint)
res_3 = qe.optimize.nelder_mead(objective_fun_NELDER,x0,
bounds = np.array([[1.0e-4,a],[par.h_min,a+1.0e-4]]),
args = (a,par.grid_a,v_next[0,:],par,1),
tol_x=par.tol_vfi,
max_iter=1000)
#res_4 = optimize.minimize(objective_fun, x0, method='SLSQP', args = (a,par.grid_a,v_next[0,:],par,0), bounds = np.array([[1.0e-4,a],[par.h_min,a+1.0e-4]]), constraints = constraint)
res_4 = qe.optimize.nelder_mead(objective_fun_NELDER,x0,
bounds = np.array([[1.0e-4,a],[par.h_min,a+1.0e-4]]),
args = (a,par.grid_a,v_next[1,:],par,0),
tol_x=par.tol_vfi,
max_iter=1000)
sol.v[0,i_a] = max(-res_1.fun,-res_3.fun)
if -res_1.fun>=-res_3.fun:
sol.c[0,i_a] = res_1.x
sol.h[0,i_a] = 0
else:
sol.c[0,i_a] = res_3.x[0]
sol.h[0,i_a] = res_3.x[1]
# State 2
sol.v[1,i_a] = max(-res_2.fun,-res_4.fun)
if -res_2.fun>=-res_4.fun:
sol.c[1,i_a] = res_2.x
sol.h[1,i_a] = 0
else:
sol.c[1,i_a] = res_4.x[0]
sol.h[1,i_a] = res_4.x[1]
return sol
| 46.319058 | 203 | 0.599048 | 3,384 | 21,631 | 3.699173 | 0.109929 | 0.014699 | 0.010864 | 0.009586 | 0.845343 | 0.838632 | 0.834638 | 0.82617 | 0.818661 | 0.810193 | 0 | 0.033911 | 0.2802 | 21,631 | 466 | 204 | 46.418455 | 0.770071 | 0.463085 | 0 | 0.806604 | 0 | 0 | 0.010365 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04717 | false | 0.018868 | 0.028302 | 0 | 0.141509 | 0.004717 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
83d771be7e49dc2334705ae3ae9568eff6ca71ca | 1,748 | py | Python | tests/test_cards/test_actions/test_artisan.py | evanofslack/pyminion | 0d0bfc6d8e84e9f33e617c7d01b6edb649166290 | [
"MIT"
] | 5 | 2021-12-17T20:34:55.000Z | 2022-01-24T15:18:05.000Z | tests/test_cards/test_actions/test_artisan.py | evanofslack/pyminion | 0d0bfc6d8e84e9f33e617c7d01b6edb649166290 | [
"MIT"
] | 31 | 2021-10-29T21:05:00.000Z | 2022-03-22T03:27:14.000Z | tests/test_cards/test_actions/test_artisan.py | evanofslack/pyminion | 0d0bfc6d8e84e9f33e617c7d01b6edb649166290 | [
"MIT"
] | 1 | 2021-12-23T18:32:47.000Z | 2021-12-23T18:32:47.000Z | from pyminion.expansions.base import artisan, silver
from pyminion.game import Game
from pyminion.players import Human
def test_artisan_valid_gain_same_topdeck(human: Human, game: Game, monkeypatch):
    human.hand.add(artisan)
    assert len(game.supply.piles[1]) == 40
    assert len(human.hand) == 1
    responses = iter(["silver", "silver"])
    monkeypatch.setattr("builtins.input", lambda prompt: next(responses))
    human.hand.cards[0].play(human, game)
    assert len(human.hand) == 0
    assert human.deck.cards[-1] is silver
    assert len(human.playmat) == 1
    assert human.state.actions == 0
    assert len(game.supply.piles[1]) == 39


def test_artisan_invalid_gain(human: Human, game: Game, monkeypatch):
    human.hand.add(artisan)
    assert len(game.supply.piles[1]) == 40
    assert len(human.hand) == 1
    responses = iter(["gold", "silver", "silver"])
    monkeypatch.setattr("builtins.input", lambda prompt: next(responses))
    human.hand.cards[0].play(human, game)
    assert len(human.hand) == 0
    assert human.deck.cards[-1] is silver
    assert len(human.playmat) == 1
    assert human.state.actions == 0
    assert len(game.supply.piles[1]) == 39


def test_artisan_valid_gain_diff_topdeck(human: Human, game: Game, monkeypatch):
    human.hand.add(artisan)
    human.hand.add(artisan)
    assert len(game.supply.piles[1]) == 40
    assert len(human.hand) == 2
    responses = iter(["silver", "artisan"])
    monkeypatch.setattr("builtins.input", lambda prompt: next(responses))
    human.hand.cards[0].play(human, game)
    assert len(human.hand) == 1
    assert human.deck.cards[-1] is artisan
    assert len(human.playmat) == 1
    assert human.state.actions == 0
    assert len(game.supply.piles[1]) == 39
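The tests above replace `builtins.input` with a lambda that pops canned responses off an iterator. A minimal standalone sketch of that stubbing pattern (no pyminion or pytest needed; `fake_input` is a name introduced here, standing in for the lambda passed to `monkeypatch.setattr`):

```python
# Canned responses, consumed one per input() call, as in the tests above.
responses = iter(["silver", "silver"])

def fake_input(prompt=""):
    # Stand-in for: monkeypatch.setattr("builtins.input", lambda prompt: next(responses))
    return next(responses)

first = fake_input("Gain a card costing up to 5 money: ")
second = fake_input("Topdeck a card from your hand: ")
```

Each call consumes one response; a test that supplies too few responses fails loudly with `StopIteration` instead of hanging on real input.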
| 32.981132 | 80 | 0.687071 | 248 | 1,748 | 4.790323 | 0.181452 | 0.113636 | 0.106061 | 0.09596 | 0.867003 | 0.839226 | 0.819865 | 0.819865 | 0.819865 | 0.819865 | 0 | 0.024828 | 0.170481 | 1,748 | 52 | 81 | 33.615385 | 0.794483 | 0 | 0 | 0.725 | 0 | 0 | 0.047483 | 0 | 0 | 0 | 0 | 0 | 0.525 | 1 | 0.075 | false | 0 | 0.075 | 0 | 0.15 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
83f82e9c5eafa3861b5e6fb99aabcd062a63ce76 | 170 | py | Python | testsuite/six-dir/using_vendored_six.py | xoviat/modulegraph2 | 766d00bdb40e5b2fe206b53a87b1bce3f9dc9c2a | [
"MIT"
] | 9 | 2020-03-22T14:48:01.000Z | 2021-05-30T12:18:12.000Z | testsuite/six-dir/using_vendored_six.py | xoviat/modulegraph2 | 766d00bdb40e5b2fe206b53a87b1bce3f9dc9c2a | [
"MIT"
] | 15 | 2020-01-06T10:02:32.000Z | 2021-05-28T12:22:44.000Z | testsuite/six-dir/using_vendored_six.py | ronaldoussoren/modulegraph2 | b6ab1766b0098651b51083235ff8a18a5639128b | [
"MIT"
] | 4 | 2020-05-10T18:51:41.000Z | 2021-04-07T14:03:12.000Z | from vendored.six.moves import html_parser
from vendored.six.moves import reload_module
from vendored.six.moves import reduce
from six.moves.urllib.error import URLError
| 34 | 44 | 0.852941 | 27 | 170 | 5.296296 | 0.481481 | 0.223776 | 0.314685 | 0.41958 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094118 | 170 | 4 | 45 | 42.5 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
f7b9d32e9bf656a8ceb7b2151b621b1e3aaa979d | 3,150 | py | Python | fixture/contact.py | Popravka22/python_training | c5f834ac920f09a4c709c55d41b5de57a9fe8fcd | [
"Apache-2.0"
] | null | null | null | fixture/contact.py | Popravka22/python_training | c5f834ac920f09a4c709c55d41b5de57a9fe8fcd | [
"Apache-2.0"
] | null | null | null | fixture/contact.py | Popravka22/python_training | c5f834ac920f09a4c709c55d41b5de57a9fe8fcd | [
"Apache-2.0"
] | null | null | null | from selenium.webdriver.common.by import By
import time
class ContactHelper:
    def __init__(self, app):
        self.app = app

    def open_contact_form(self):
        wd = self.app.wd
        wd.find_element(By.LINK_TEXT, "add new").click()

    def create_contact(self, contact):
        wd = self.app.wd
        self.open_contact_form()
        # fill contact form
        wd.find_element(By.NAME, "firstname").click()
        wd.find_element(By.NAME, "firstname").clear()
        wd.find_element(By.NAME, "firstname").send_keys(contact.firstname)
        wd.find_element(By.NAME, "lastname").click()
        wd.find_element(By.NAME, "lastname").clear()
        wd.find_element(By.NAME, "lastname").send_keys(contact.lastname)
        wd.find_element(By.NAME, "company").click()
        wd.find_element(By.NAME, "company").clear()
        wd.find_element(By.NAME, "company").send_keys(contact.company)
        wd.find_element(By.NAME, "mobile").click()
        wd.find_element(By.NAME, "mobile").clear()
        wd.find_element(By.NAME, "mobile").send_keys(contact.mobile)
        wd.find_element(By.NAME, "email").click()
        wd.find_element(By.NAME, "email").clear()
        wd.find_element(By.NAME, "email").send_keys(contact.email)
        wd.find_element(By.NAME, "submit").click()
        self.return_to_home_page()

    def edit_first(self, contact):
        wd = self.app.wd
        # select first contact
        wd.find_element(By.LINK_TEXT, "home").click()
        wd.find_element(By.NAME, "selected[]").click()
        wd.find_element(By.XPATH, "//img[@alt='Edit']").click()
        # edit entry
        wd.find_element(By.NAME, "firstname").click()
        wd.find_element(By.NAME, "firstname").clear()
        wd.find_element(By.NAME, "firstname").send_keys(contact.firstname)
        wd.find_element(By.NAME, "lastname").click()
        wd.find_element(By.NAME, "lastname").clear()
        wd.find_element(By.NAME, "lastname").send_keys(contact.lastname)
        wd.find_element(By.NAME, "company").click()
        wd.find_element(By.NAME, "company").clear()
        wd.find_element(By.NAME, "company").send_keys(contact.company)
        wd.find_element(By.NAME, "mobile").click()
        wd.find_element(By.NAME, "mobile").clear()
        wd.find_element(By.NAME, "mobile").send_keys(contact.mobile)
        wd.find_element(By.NAME, "email").click()
        wd.find_element(By.NAME, "email").clear()
        wd.find_element(By.NAME, "email").send_keys(contact.email)
        wd.find_element(By.NAME, "update").click()
        self.return_to_home_page()

    def delete_first_contact(self):
        wd = self.app.wd
        # select first contact
        wd.find_element(By.LINK_TEXT, "home").click()
        wd.find_element(By.NAME, "selected[]").click()
        # submit deletion of contact
        wd.find_element(By.XPATH, "//input[@value='Delete']").click()
        # self.assertRegexpMatches(self.close_alert_and_get_its_text(), r"^Delete 1 addresses[\s\S]$")
        wd.switch_to.alert.accept()
        time.sleep(5)  # sleep for 5 seconds

    def return_to_home_page(self):
        wd = self.app.wd
        wd.find_element(By.LINK_TEXT, "home page").click()
790d685e82700fc2d434189758494db076f50329 | 6,133 | py | Python | predict/ensemble.py | DataArk/CHIP2021-Task1-Top1 | e352198d96d31c60541e4a271f20cc23b3ab6b92 | [
"Apache-2.0"
] | 15 | 2021-12-18T06:08:55.000Z | 2022-03-30T00:41:45.000Z | predict/ensemble.py | confstantine/nlp-task | cb152e885bc6f6f1243a12ad90b1c715eb548736 | [
"Apache-2.0"
] | 1 | 2021-12-20T05:57:37.000Z | 2021-12-20T13:43:07.000Z | predict/ensemble.py | DataArk/CHIP2021-Task1-Top1 | e352198d96d31c60541e4a271f20cc23b3ab6b92 | [
"Apache-2.0"
] | 1 | 2021-12-27T04:49:35.000Z | 2021-12-27T04:49:35.000Z | import codecs
import json
from tqdm import tqdm
import copy
submit_result2 = []
with codecs.open('dialog_chinese-macbert.txt', mode='r', encoding='utf8') as f:
    reader = f.readlines()
    for dialogue_idx_, dialogue_ in enumerate(tqdm(reader)):
        dialogue_ = json.loads(dialogue_)
        submit_result2.append(dialogue_)

submit_result4 = []
with codecs.open('macbert2-f-f.txt', mode='r', encoding='utf8') as f:
    reader = f.readlines()
    for dialogue_idx_, dialogue_ in enumerate(tqdm(reader)):
        dialogue_ = json.loads(dialogue_)
        submit_result4.append(dialogue_)

submit_result3 = []
with codecs.open('macbert2-f.txt', mode='r', encoding='utf8') as f:
    reader = f.readlines()
    for dialogue_idx_, dialogue_ in enumerate(tqdm(reader)):
        dialogue_ = json.loads(dialogue_)
        submit_result3.append(dialogue_)

submit_result5 = []
with codecs.open('mcbert.txt', mode='r', encoding='utf8') as f:
    reader = f.readlines()
    for dialogue_idx_, dialogue_ in enumerate(tqdm(reader)):
        dialogue_ = json.loads(dialogue_)
        submit_result5.append(dialogue_)

submit_result6 = []
with codecs.open('medbert.txt', mode='r', encoding='utf8') as f:
    reader = f.readlines()
    for dialogue_idx_, dialogue_ in enumerate(tqdm(reader)):
        dialogue_ = json.loads(dialogue_)
        submit_result6.append(dialogue_)

submit_result = []
with codecs.open('macbert2-f.txt', mode='r', encoding='utf8') as f:
    reader = f.readlines()
    for dialogue_idx_, dialogue_ in enumerate(tqdm(reader)):
        dialogue_ = json.loads(dialogue_)
        for content_idx_, contents_ in enumerate(dialogue_['dialog_info']):
            terms_ = contents_['ner']
            if len(terms_) != 0:
                idx_ = 0
                for _ner_idx, term_ in enumerate(terms_):
                    if dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '阳性' and dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] != submit_result3[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']:
                        dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result3[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
                    elif dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '阴性' and dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] != submit_result3[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']:
                        dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result3[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
                    elif dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] != submit_result2[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']:
                        if submit_result2[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '不标注':
                            dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result2[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
                        elif dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '阳性':
                            if submit_result2[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '其他':
                                dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result2[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
                    elif dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] != submit_result4[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']:
                        if dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '阴性':
                            if submit_result4[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '不标注':
                                dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result4[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
                    elif dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] != submit_result5[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']:
                        if dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '阴性':
                            if submit_result5[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '不标注':
                                dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result5[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
                            # elif submit_result5[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '其他':
                            #     dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result5[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
                    elif dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] != submit_result6[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']:
                        if dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '阳性':
                            if submit_result6[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] == '其他':
                                dialogue_['dialog_info'][content_idx_]['ner'][_ner_idx]['attr'] = submit_result6[dialogue_idx_]['dialog_info'][content_idx_]['ner'][_ner_idx]['attr']
        submit_result.append(dialogue_)

with open('./result.txt', 'w', encoding='utf-8') as output_data:
    for json_content in submit_result:
        output_data.write(json.dumps(json_content, ensure_ascii=False) + '\n')
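The six near-identical read loops above differ only in the filename and target list; they could collapse into one helper. A sketch (`load_jsonl` is a name introduced here, not part of the repo):

```python
import json

def load_jsonl(path):
    # One JSON object per line, as each codecs.open block above assumes
    with open(path, mode='r', encoding='utf8') as f:
        return [json.loads(line) for line in f]

# Hypothetical usage mirroring the script:
# submit_result2 = load_jsonl('dialog_chinese-macbert.txt')
# submit_result3 = load_jsonl('macbert2-f.txt')
```

Collapsing the loops would also remove the chance of the copy-paste bug the originals shared (`f.readlines(f)` passes the file object where `readlines` expects an optional size hint).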
f7261ecb34d10b321947c71f2d29ac44db652fb3 | 20,071 | py | Python | vis/visualize_court.py | szhaofelicia/sgan | ead42d4bb3b1278c4c9ffcae8fa9c2dc036a52ff | [
"MIT"
] | 3 | 2022-01-02T16:58:39.000Z | 2022-02-07T08:29:48.000Z | vis/visualize_court.py | szhaofelicia/sgan | ead42d4bb3b1278c4c9ffcae8fa9c2dc036a52ff | [
"MIT"
] | null | null | null | vis/visualize_court.py | szhaofelicia/sgan | ead42d4bb3b1278c4c9ffcae8fa9c2dc036a52ff | [
"MIT"
] | null | null | null | import numpy as np
# import plotly
import plotly.graph_objects as go
def draw_plotly_half_court(fig, fig_width=600, margins=10):
    # From: https://community.plot.ly/t/arc-shape-with-path/7205/5
    def ellipse_arc(x_center=0.0, y_center=0.0, a=10.5, b=10.5, start_angle=0.0, end_angle=2 * np.pi, N=200, closed=False):
        t = np.linspace(start_angle, end_angle, N)
        x = x_center + a * np.cos(t)
        y = y_center + b * np.sin(t)
        path = f'M {x[0]}, {y[0]}'
        for k in range(1, len(t)):
            path += f'L{x[k]}, {y[k]}'
        if closed:
            path += ' Z'
        return path

    fig_height = fig_width * (470 + 2 * margins) / (500 + 2 * margins)
    fig.update_layout(width=fig_width, height=fig_height)

    # Set axes ranges
    fig.update_xaxes(range=[-250 - margins, 250 + margins])
    fig.update_yaxes(range=[-52.5 - margins, 417.5 + margins])

    threept_break_y = 89.47765084
    three_line_col = "#777777"
    main_line_col = "#777777"
    fig.update_layout(
        # Line Horizontal
        margin=dict(l=20, r=20, t=20, b=20),
        paper_bgcolor="white",
        plot_bgcolor="white",
        yaxis=dict(
            scaleanchor="x",
            scaleratio=1,
            showgrid=False,
            zeroline=False,
            showline=False,
            ticks='',
            showticklabels=False,
            fixedrange=True,
        ),
        xaxis=dict(
            showgrid=False,
            zeroline=False,
            showline=False,
            ticks='',
            showticklabels=False,
            fixedrange=True,
        ),
        shapes=[
            dict(
                type="rect", x0=-250, y0=-52.5, x1=250, y1=417.5,
                line=dict(color=main_line_col, width=1),
                # fillcolor='#333333',
                layer='below'
            ),  # sideline rect
            dict(
                type="rect", x0=-80, y0=-52.5, x1=80, y1=137.5,
                line=dict(color=main_line_col, width=1),
                # fillcolor='#333333',
                layer='below'
            ),  # lane line rect
            dict(
                type="rect", x0=-60, y0=-52.5, x1=60, y1=137.5,
                line=dict(color=main_line_col, width=1),
                # fillcolor='#333333',
                layer='below'
            ),  # foul line rect
            dict(
                type="circle", x0=-60, y0=77.5, x1=60, y1=197.5, xref="x", yref="y",
                line=dict(color=main_line_col, width=1),
                # fillcolor='#dddddd',
                layer='below'
            ),  # free-throw circle
            dict(
                type="line", x0=-60, y0=137.5, x1=60, y1=137.5,
                line=dict(color=main_line_col, width=1),
                layer='below'
            ),  # foul line
            dict(
                type="rect", x0=-2, y0=-7.25, x1=2, y1=-12.5,
                line=dict(color="#ec7607", width=1),
                fillcolor='#ec7607',
            ),  # hoop rect
            dict(
                type="circle", x0=-7.5, y0=-7.5, x1=7.5, y1=7.5, xref="x", yref="y",
                line=dict(color="#ec7607", width=1),
            ),  # hoop circle
            dict(
                type="line", x0=-30, y0=-12.5, x1=30, y1=-12.5,
                line=dict(color="#ec7607", width=1),
            ),  # backboard
            dict(type="path",
                 path=ellipse_arc(a=40, b=40, start_angle=0, end_angle=np.pi),
                 line=dict(color=main_line_col, width=1), layer='below'),  # no-charge semi-circle
            dict(type="path",
                 path=ellipse_arc(a=237.5, b=237.5, start_angle=0.386283101, end_angle=np.pi - 0.386283101),
                 line=dict(color=main_line_col, width=1), layer='below'),  # three-point line: arc
            dict(
                type="line", x0=-220, y0=-52.5, x1=-220, y1=threept_break_y,
                line=dict(color=three_line_col, width=1), layer='below'
            ),  # three-point line: left edge
            dict(
                type="line", x0=220, y0=-52.5, x1=220, y1=threept_break_y,
                line=dict(color=three_line_col, width=1), layer='below'
            ),  # three-point line: right edge
            dict(
                type="line", x0=-250, y0=227.5, x1=-220, y1=227.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # midcourt area marker: left
            dict(
                type="line", x0=250, y0=227.5, x1=220, y1=227.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # midcourt area marker: right
            dict(
                type="line", x0=-90, y0=17.5, x1=-80, y1=17.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=-90, y0=27.5, x1=-80, y1=27.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=-90, y0=57.5, x1=-80, y1=57.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=-90, y0=87.5, x1=-80, y1=87.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=17.5, x1=80, y1=17.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=27.5, x1=80, y1=27.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=57.5, x1=80, y1=57.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=87.5, x1=80, y1=87.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(type="path",
                 path=ellipse_arc(y_center=417.5, a=60, b=60, start_angle=-0, end_angle=-np.pi),
                 line=dict(color=main_line_col, width=1), layer='below'),  # center circle: half
        ]
    )
    return True
def draw_plotly_whole_court(fig, fig_width=600, margins=10):
    # From: https://community.plot.ly/t/arc-shape-with-path/7205/5
    def ellipse_arc(x_center=0.0, y_center=0.0, a=10.5, b=10.5, start_angle=0.0, end_angle=2 * np.pi, N=200, closed=False):
        t = np.linspace(start_angle, end_angle, N)
        x = x_center + a * np.cos(t)
        y = y_center + b * np.sin(t)
        path = f'M {x[0]}, {y[0]}'
        for k in range(1, len(t)):
            path += f'L{x[k]}, {y[k]}'
        if closed:
            path += ' Z'
        return path

    fig_height = fig_width * (470 * 2 + 2 * margins) / (500 + 2 * margins)
    fig.update_layout(width=fig_width, height=fig_height)

    # Set axes ranges
    fig.update_xaxes(range=[-250 - margins, 250 + margins])
    fig.update_yaxes(range=[-52.5 - margins, 417.5 + 470 + margins])
    # fig.update_xaxes(range=[margins, 500 + margins])
    # fig.update_yaxes(range=[margins, 470 * 2 + margins])

    threept_break_y = 89.47765084
    three_line_col = "#777777"
    main_line_col = "#777777"
    fig.update_layout(
        # Line Horizontal
        margin=dict(l=20, r=20, t=20, b=20),
        paper_bgcolor="white",
        plot_bgcolor="white",
        yaxis=dict(
            scaleanchor="x",
            scaleratio=1,
            showgrid=False,
            zeroline=False,
            showline=False,
            ticks='',
            showticklabels=False,
            fixedrange=True,
        ),
        xaxis=dict(
            showgrid=False,
            zeroline=False,
            showline=False,
            ticks='',
            showticklabels=False,
            fixedrange=True,
        ),
        # width: 500, height: 470
        shapes=[
            dict(
                type="rect", x0=-250, y0=-52.5, x1=250, y1=417.5 + 470,
                line=dict(color=main_line_col, width=1),
                # fillcolor='#333333',
                layer='below'
            ),  # sideline rect
            dict(
                type="rect", x0=-80, y0=-52.5, x1=80, y1=137.5,
                line=dict(color=main_line_col, width=1),
                # fillcolor='#333333',
                layer='below'
            ),  # lane line rect
            dict(
                type="rect", x0=-60, y0=-52.5, x1=60, y1=137.5,
                line=dict(color=main_line_col, width=1),
                # fillcolor='#333333',
                layer='below'
            ),  # foul line rect
            dict(
                type="circle", x0=-60, y0=77.5, x1=60, y1=197.5, xref="x", yref="y",
                line=dict(color=main_line_col, width=1),
                # fillcolor='#dddddd',
                layer='below'
            ),  # free-throw circle
            dict(
                type="line", x0=-60, y0=137.5, x1=60, y1=137.5,
                line=dict(color=main_line_col, width=1),
                layer='below'
            ),  # foul line
            dict(
                type="rect", x0=-2, y0=-7.25, x1=2, y1=-12.5,
                line=dict(color="#ec7607", width=1),
                fillcolor='#ec7607',
            ),  # hoop rect
            dict(
                type="circle", x0=-7.5, y0=-7.5, x1=7.5, y1=7.5, xref="x", yref="y",
                line=dict(color="#ec7607", width=1),
            ),  # hoop circle
            dict(
                type="line", x0=-30, y0=-12.5, x1=30, y1=-12.5,
                line=dict(color="#ec7607", width=1),
            ),  # backboard
            dict(type="path",
                 path=ellipse_arc(a=40, b=40, start_angle=0, end_angle=np.pi),
                 line=dict(color=main_line_col, width=1), layer='below'),  # no-charge semi-circle
            dict(type="path",
                 path=ellipse_arc(a=237.5, b=237.5, start_angle=0.386283101, end_angle=np.pi - 0.386283101),
                 line=dict(color=main_line_col, width=1), layer='below'),  # three-point line: arc
            dict(
                type="line", x0=-220, y0=-52.5, x1=-220, y1=threept_break_y,
                line=dict(color=three_line_col, width=1), layer='below'
            ),  # three-point line: left edge
            dict(
                type="line", x0=220, y0=-52.5, x1=220, y1=threept_break_y,
                line=dict(color=three_line_col, width=1), layer='below'
            ),  # three-point line: right edge
            dict(
                type="line", x0=-250, y0=227.5, x1=-220, y1=227.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # midcourt area marker: left
            dict(
                type="line", x0=250, y0=227.5, x1=220, y1=227.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # midcourt area marker: right
            dict(
                type="line", x0=-90, y0=17.5, x1=-80, y1=17.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=-90, y0=27.5, x1=-80, y1=27.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=-90, y0=57.5, x1=-80, y1=57.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=-90, y0=87.5, x1=-80, y1=87.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=17.5, x1=80, y1=17.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=27.5, x1=80, y1=27.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=57.5, x1=80, y1=57.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(
                type="line", x0=90, y0=87.5, x1=80, y1=87.5,
                line=dict(color=main_line_col, width=1), layer='below'
            ),  # lane line marker
            dict(type="path",
                 path=ellipse_arc(y_center=417.5, a=60, b=60, start_angle=-0, end_angle=-np.pi),
                 line=dict(color=main_line_col, width=1), layer='below'),  # center circle: half
        ]
    )
    return True
max_freq = 0.002
# freq_by_hex = np.array([min(max_freq, i) for i in league_hexbin_stats['freq_by_hex']])
colorscale = 'YlOrRd'
marker_cmin = 0.1
marker_cmax = 0.6
ticktexts = [str(marker_cmin*100)+'%-', "", str(marker_cmax*100)+'%+']
fig = go.Figure()
# draw_plotly_half_court(fig)
draw_plotly_whole_court(fig)
# fig.add_trace(go.Scatter(
# x=xlocs, y=ylocs, mode='markers', name='markers',
# marker=dict(
# size=freq_by_hex, sizemode='area', sizeref=2. * max(freq_by_hex) / (11. ** 2), sizemin=2.5,
# color=accs_by_hex, colorscale=colorscale,
# colorbar=dict(
# thickness=15,
# x=0.84,
# y=0.87,
# yanchor='middle',
# len=0.2,
# title=dict(
# text="<B>Accuracy</B>",
# font=dict(
# size=11,
# color='#4d4d4d'
# ),
# ),
# tickvals=[marker_cmin, (marker_cmin + marker_cmax) / 2, marker_cmax],
# ticktext=ticktexts,
# tickfont=dict(
# size=11,
# color='#4d4d4d'
# )
# ),
# cmin=marker_cmin, cmax=marker_cmax,
# line=dict(width=1, color='#333333'), symbol='hexagon',
# ),
# ))
# fig.show(config=dict(displayModeBar=False))
# fig.show()
vis_dir='/media/felicia/Data/sgan_results/vis/'
fig.write_image(vis_dir+"court.svg")
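The SVG path built by the `ellipse_arc` helper above can be exercised without Plotly or numpy; this is a standalone rewrite of the same construction (the names match the inner helper, but this sketch is not part of the repo):

```python
import math

def ellipse_arc(x_center=0.0, y_center=0.0, a=10.5, b=10.5,
                start_angle=0.0, end_angle=2 * math.pi, N=200, closed=False):
    # Sample N points on the ellipse and join them into an SVG path string,
    # mirroring the np.linspace + cos/sin construction in the original helper.
    ts = [start_angle + (end_angle - start_angle) * k / (N - 1) for k in range(N)]
    xs = [x_center + a * math.cos(t) for t in ts]
    ys = [y_center + b * math.sin(t) for t in ts]
    path = f'M {xs[0]}, {ys[0]}'
    for k in range(1, N):
        path += f'L{xs[k]}, {ys[k]}'
    if closed:
        path += ' Z'
    return path

# Same call the court code uses for the no-charge semi-circle
arc = ellipse_arc(a=40, b=40, start_angle=0, end_angle=math.pi)
```

The path is one `M` move followed by N-1 `L` line segments, with a trailing `Z` only when `closed=True`; Plotly accepts it directly as a `type="path"` shape.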
| 40.061876 | 123 | 0.478202 | 2,621 | 20,071 | 3.557802 | 0.083556 | 0.066059 | 0.101769 | 0.089223 | 0.914853 | 0.9037 | 0.899303 | 0.899303 | 0.899303 | 0.899303 | 0 | 0.114297 | 0.362264 | 20,071 | 500 | 124 | 40.142 | 0.614219 | 0.322356 | 0 | 0.920415 | 0 | 0 | 0.04714 | 0.002777 | 0 | 0 | 0 | 0 | 0 | 1 | 0.013841 | false | 0 | 0.00692 | 0 | 0.034602 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f74b1b0545cd6c36deb693a8d2f598ea78d8a726 | 11,414 | py | Python | tests/ludwig/encoders/test_text_encoders.py | jimthompson5802/ludwig | 8a369328a3f839d9cdb3710be315952c7891d7c0 | [
"Apache-2.0"
] | null | null | null | tests/ludwig/encoders/test_text_encoders.py | jimthompson5802/ludwig | 8a369328a3f839d9cdb3710be315952c7891d7c0 | [
"Apache-2.0"
] | null | null | null | tests/ludwig/encoders/test_text_encoders.py | jimthompson5802/ludwig | 8a369328a3f839d9cdb3710be315952c7891d7c0 | [
"Apache-2.0"
] | null | null | null | import pytest
import torch
from ludwig.encoders import text_encoders
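This file's tests stack several `@pytest.mark.parametrize` decorators; pytest runs each test once per combination of the parameter lists. A stdlib sketch of the resulting case count for the first test:

```python
import itertools

# Parameter lists from the stacked decorators on test_albert_encoder
use_pretrained = [False]
reduce_output = [None, "sum"]
max_sequence_length = [20]

# Stacked parametrize decorators generate the cartesian product of their lists
cases = list(itertools.product(use_pretrained, reduce_output, max_sequence_length))
```

With one, two, and one values respectively, pytest collects two parametrized cases for that test.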
@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_albert_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
albert_encoder = text_encoders.ALBERTEncoder(
use_pretrained=use_pretrained,
reduce_output=reduce_output,
max_sequence_length=max_sequence_length,
)
inputs = torch.rand((2, max_sequence_length)).type(albert_encoder.input_dtype)
inputs = torch.rand((2, max_sequence_length)).type(albert_encoder.input_dtype)
outputs = albert_encoder(inputs)
assert outputs["encoder_output"].shape[1:] == albert_encoder.output_shape
@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "cls_pooled", "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_bert_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
bert = text_encoders.BERTEncoder(
use_pretrained=use_pretrained,
reduce_output=reduce_output,
max_sequence_length=max_sequence_length,
)
inputs = torch.rand((2, max_sequence_length)).type(bert.input_dtype)
outputs = bert(inputs)
assert outputs["encoder_output"].shape[1:] == bert.output_shape
@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", ["last", "sum", "mean"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_xlm_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
xlm_encoder = text_encoders.XLMEncoder(
use_pretrained=use_pretrained,
reduce_output=reduce_output,
max_sequence_length=max_sequence_length,
)
    inputs = torch.rand((2, max_sequence_length)).type(xlm_encoder.input_dtype)
    outputs = xlm_encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == xlm_encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_gpt_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    gpt_encoder = text_encoders.GPTEncoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(gpt_encoder.input_dtype)
    outputs = gpt_encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == gpt_encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", ["cls_pooled", "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_roberta_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    roberta_encoder = text_encoders.RoBERTaEncoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(roberta_encoder.input_dtype)
    outputs = roberta_encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == roberta_encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [True, False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_gpt2_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    gpt_encoder = text_encoders.GPT2Encoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(gpt_encoder.input_dtype)
    outputs = gpt_encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == gpt_encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_distil_bert(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    distil_bert_encoder = text_encoders.DistilBERTEncoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(distil_bert_encoder.input_dtype)
    outputs = distil_bert_encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == distil_bert_encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_transfoxl_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    transfo = text_encoders.TransformerXLEncoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.randint(10, (2, max_sequence_length)).type(transfo.input_dtype)
    outputs = transfo(inputs)
    assert outputs["encoder_output"].shape[1:] == transfo.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_ctrl_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    encoder = text_encoders.CTRLEncoder(
        max_sequence_length,
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "cls_pooled"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_camembert_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    encoder = text_encoders.CamemBERTEncoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "cls_pooled"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_longformer_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    encoder = text_encoders.LongformerEncoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_mt5_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    mt5_encoder = text_encoders.MT5Encoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(mt5_encoder.input_dtype)
    outputs = mt5_encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == mt5_encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_xlmroberta_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    xlmroberta_encoder = text_encoders.XLMRoBERTaEncoder(
        use_pretrained=use_pretrained,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(xlmroberta_encoder.input_dtype)
    outputs = xlmroberta_encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == xlmroberta_encoder.output_shape
@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_electra_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    encoder = text_encoders.ELECTRAEncoder(
        use_pretrained=use_pretrained, reduce_output=reduce_output, max_sequence_length=max_sequence_length
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape


@pytest.mark.parametrize("pretrained_model_name_or_path", ["bert-base-uncased"])
@pytest.mark.parametrize("reduce_output", [None, "sum", "cls_pooled"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_auto_transformer_encoder(pretrained_model_name_or_path: str, reduce_output: str, max_sequence_length: int):
    encoder = text_encoders.AutoTransformerEncoder(
        pretrained_model_name_or_path=pretrained_model_name_or_path,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_flaubert_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    encoder = text_encoders.FlauBERTEncoder(
        use_pretrained=use_pretrained, reduce_output=reduce_output, max_sequence_length=max_sequence_length
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape


@pytest.mark.parametrize("use_pretrained", [False])
@pytest.mark.parametrize("reduce_output", [None, "sum"])
@pytest.mark.parametrize("max_sequence_length", [20])
def test_t5_encoder(use_pretrained: bool, reduce_output: str, max_sequence_length: int):
    encoder = text_encoders.T5Encoder(
        use_pretrained=use_pretrained, reduce_output=reduce_output, max_sequence_length=max_sequence_length
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape
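# Every test above repeats the same construct/forward/assert body, so one possible
# refactor is a single helper parametrized over encoder classes. This is a hedged
# sketch only: DummyEncoder below is a hypothetical stand-in for the ludwig
# text_encoders classes (which share the constructor/forward contract used here),
# kept self-contained so the pattern can run without loading any pretrained model.

```python
import torch


class DummyEncoder:
    """Hypothetical stand-in mirroring the encoder interface the tests rely on."""

    input_dtype = torch.float32
    output_shape = torch.Size([8])

    def __init__(self, use_pretrained=False, reduce_output=None, max_sequence_length=20):
        self.max_sequence_length = max_sequence_length

    def __call__(self, inputs):
        # Real encoders return a dict with an "encoder_output" tensor whose
        # trailing dimensions match `output_shape`.
        batch_size = inputs.shape[0]
        return {"encoder_output": torch.zeros(batch_size, *self.output_shape)}


def check_encoder(encoder_cls, reduce_output, max_sequence_length):
    # Shared body of the tests above: build the encoder, run a random batch,
    # and compare trailing dimensions against the advertised output shape.
    encoder = encoder_cls(
        use_pretrained=False,
        reduce_output=reduce_output,
        max_sequence_length=max_sequence_length,
    )
    inputs = torch.rand((2, max_sequence_length)).type(encoder.input_dtype)
    outputs = encoder(inputs)
    assert outputs["encoder_output"].shape[1:] == encoder.output_shape
    return outputs["encoder_output"].shape
```

# In the real suite this helper could back one test parametrized over
# [GPTEncoder, RoBERTaEncoder, ...] instead of a dozen near-identical functions.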
"""Add registration form to event acls

Revision ID: b6dd0a4ed40d
Revises: c0fc1e46888b
Create Date: 2020-06-19 15:26:01.961716
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = 'b6dd0a4ed40d'
down_revision = 'c0fc1e46888b'
branch_labels = None
depends_on = None


def upgrade():
    op.add_column('attachment_principals', sa.Column('registration_form_id', sa.Integer()), schema='attachments')
    op.add_column('folder_principals', sa.Column('registration_form_id', sa.Integer()), schema='attachments')
    op.add_column('contribution_principals', sa.Column('registration_form_id', sa.Integer()), schema='events')
    op.add_column('principals', sa.Column('registration_form_id', sa.Integer(), nullable=True), schema='events')
    op.add_column('session_principals', sa.Column('registration_form_id', sa.Integer(), nullable=True), schema='events')
    op.create_index(None, 'attachment_principals', ['registration_form_id'], schema='attachments')
    op.create_index(None, 'folder_principals', ['registration_form_id'], schema='attachments')
    op.create_index(None, 'contribution_principals', ['registration_form_id'], schema='events')
    op.create_index(None, 'principals', ['registration_form_id'], schema='events')
    op.create_index(None, 'session_principals', ['registration_form_id'], schema='events')
    op.create_foreign_key(None, 'attachment_principals', 'forms',
                          ['registration_form_id'], ['id'],
                          source_schema='attachments', referent_schema='event_registration')
    op.create_foreign_key(None, 'folder_principals', 'forms',
                          ['registration_form_id'], ['id'],
                          source_schema='attachments', referent_schema='event_registration')
    op.create_foreign_key(None, 'contribution_principals', 'forms',
                          ['registration_form_id'], ['id'],
                          source_schema='events', referent_schema='event_registration')
    op.create_foreign_key(None, 'principals', 'forms',
                          ['registration_form_id'], ['id'],
                          source_schema='events', referent_schema='event_registration')
    op.create_foreign_key(None, 'session_principals', 'forms',
                          ['registration_form_id'], ['id'],
                          source_schema='events', referent_schema='event_registration')
    op.execute('''
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_category_role";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_enum_type";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_event_role";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_local_group";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_multipass_group";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_user";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_category_role";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_enum_type";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_event_role";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_local_group";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_multipass_group";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_user";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_category_role";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_email";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_enum_type";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_event_role";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_local_group";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_multipass_group";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_user";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_category_role";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_email";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_enum_type";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_event_role";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_local_group";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_multipass_group";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_network";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_user";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_category_role";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_email";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_enum_type";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_event_role";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_local_group";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_multipass_group";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_user";
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_registration_form" CHECK (((type <> 8) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (registration_form_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_registration_form" CHECK (((type <> 8) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (registration_form_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_registration_form_read_only" CHECK (((type <> 8) OR ((NOT full_access) AND (array_length(permissions, 1) IS NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_registration_form" CHECK (((type <> 8) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (registration_form_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_registration_form_read_only" CHECK (((type <> 8) OR ((NOT full_access) AND (array_length(permissions, 1) IS NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_registration_form" CHECK (((type <> 8) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (registration_form_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_registration_form_read_only" CHECK (((type <> 8) OR ((NOT full_access) AND (array_length(permissions, 1) IS NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_registration_form" CHECK (((type <> 8) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (registration_form_id IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_category_role" CHECK (((type <> 7) OR ((event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 6, 7, 8])));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_category_role" CHECK (((type <> 7) OR ((event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 6, 7, 8])));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_category_role" CHECK (((type <> 7) OR ((email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_email" CHECK (((type <> 4) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (email IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 4, 6, 7, 8])));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (email IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_category_role" CHECK (((type <> 7) OR ((email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_email" CHECK (((type <> 4) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (email IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 4, 5, 6, 7, 8])));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (email IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_network" CHECK (((type <> 5) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (ip_network_group_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_category_role" CHECK (((type <> 7) OR ((email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_email" CHECK (((type <> 4) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (email IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 4, 6, 7, 8])));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (email IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (registration_form_id IS NULL) AND (user_id IS NOT NULL))));
    ''')
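# The CHECK constraints added above all encode one rule shape: a principal row of
# type 8 (registration form) must carry registration_form_id and no other identity
# column. The sketch below restates that invariant in plain Python for illustration;
# it is hypothetical (column names mirror events.principals, rows are plain dicts)
# and is not part of the migration itself.

```python
REGISTRATION_FORM = 8  # the enum value behind the `type <> 8` checks above

# Identity columns that are mutually exclusive across principal types.
PRINCIPAL_ID_COLUMNS = (
    "user_id",
    "local_group_id",
    "mp_group_name",
    "mp_group_provider",
    "email",
    "ip_network_group_id",
    "event_role_id",
    "category_role_id",
    "registration_form_id",
)


def valid_registration_form_principal(row: dict) -> bool:
    """Mirror ck_principals_valid_registration_form: for a type-8 row, only
    registration_form_id may be non-NULL among the identity columns."""
    if row.get("type") != REGISTRATION_FORM:
        return True  # the constraint only restricts type-8 rows
    for col in PRINCIPAL_ID_COLUMNS:
        if col == "registration_form_id":
            if row.get(col) is None:
                return False  # registration_form_id IS NOT NULL is required
        elif row.get(col) is not None:
            return False  # every other identity column must be NULL
    return True
```

# The companion *_registration_form_read_only checks add a second restriction not
# modeled here: type-8 rows may not have full_access or any explicit permissions.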


def downgrade():
    op.execute('''
DELETE FROM attachments.attachment_principals WHERE type = 8;
DELETE FROM attachments.folder_principals WHERE type = 8;
DELETE FROM events.principals WHERE type = 8;
DELETE FROM events.session_principals WHERE type = 8;
DELETE FROM events.contribution_principals WHERE type = 8;
    ''')
    op.execute('''
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_registration_form";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_category_role";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_enum_type";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_event_role";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_local_group";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_multipass_group";
ALTER TABLE "attachments"."attachment_principals" DROP CONSTRAINT "ck_attachment_principals_valid_user";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_registration_form";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_category_role";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_enum_type";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_event_role";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_local_group";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_multipass_group";
ALTER TABLE "attachments"."folder_principals" DROP CONSTRAINT "ck_folder_principals_valid_user";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_registration_form_read_only";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_registration_form";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_category_role";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_email";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_enum_type";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_event_role";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_local_group";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_multipass_group";
ALTER TABLE "events"."contribution_principals" DROP CONSTRAINT "ck_contribution_principals_valid_user";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_registration_form_read_only";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_registration_form";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_category_role";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_email";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_enum_type";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_event_role";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_local_group";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_multipass_group";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_network";
ALTER TABLE "events"."principals" DROP CONSTRAINT "ck_principals_valid_user";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_registration_form_read_only";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_registration_form";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_category_role";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_email";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_enum_type";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_event_role";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_local_group";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_multipass_group";
ALTER TABLE "events"."session_principals" DROP CONSTRAINT "ck_session_principals_valid_user";
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_category_role" CHECK (((type <> 7) OR ((event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 6, 7])));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "attachments"."attachment_principals" ADD CONSTRAINT "ck_attachment_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_category_role" CHECK (((type <> 7) OR ((event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 6, 7])));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "attachments"."folder_principals" ADD CONSTRAINT "ck_folder_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_category_role" CHECK (((type <> 7) OR ((email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_email" CHECK (((type <> 4) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (email IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 4, 6, 7])));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (email IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "events"."contribution_principals" ADD CONSTRAINT "ck_contribution_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_category_role" CHECK (((type <> 7) OR ((email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_email" CHECK (((type <> 4) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (email IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 4, 5, 6, 7])));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (email IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_network" CHECK (((type <> 5) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (ip_network_group_id IS NOT NULL))));
ALTER TABLE "events"."principals" ADD CONSTRAINT "ck_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (ip_network_group_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_category_role" CHECK (((type <> 7) OR ((email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (category_role_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_email" CHECK (((type <> 4) OR ((category_role_id IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (email IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_enum_type" CHECK ((type = ANY (ARRAY[1, 2, 3, 4, 6, 7])));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_event_role" CHECK (((type <> 6) OR ((category_role_id IS NULL) AND (email IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (event_role_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_local_group" CHECK (((type <> 2) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NULL) AND (local_group_id IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_multipass_group" CHECK (((type <> 3) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (user_id IS NULL) AND (mp_group_name IS NOT NULL) AND (mp_group_provider IS NOT NULL))));
ALTER TABLE "events"."session_principals" ADD CONSTRAINT "ck_session_principals_valid_user" CHECK (((type <> 1) OR ((category_role_id IS NULL) AND (email IS NULL) AND (event_role_id IS NULL) AND (local_group_id IS NULL) AND (mp_group_name IS NULL) AND (mp_group_provider IS NULL) AND (user_id IS NOT NULL))));
''')
op.drop_column('session_principals', 'registration_form_id', schema='events')
op.drop_column('principals', 'registration_form_id', schema='events')
op.drop_column('contribution_principals', 'registration_form_id', schema='events')
op.drop_column('folder_principals', 'registration_form_id', schema='attachments')
op.drop_column('attachment_principals', 'registration_form_id', schema='attachments')
# test_control_de_flujo.py (EUD-curso-python/control-de-flujo-helvermaldonado, MIT)
from utils_test import eval_test
import pytest
import control_de_flujo
d = {
'naturales': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100],
'acumulado': [' '.join(str(j) for j in range(1, n + 1)) for n in range(1, 51)],  # '1', '1 2', ..., '1 2 ... 50'
'suma100': 5050,
'tabla100': '134,268,402,536,670,804,938,1072,1206,1340',
'multiplos3': 17,
'regresivo50': [' '.join(str(j) for j in range(k, 0, -1)) for k in range(50, 0, -1)],  # '50 49 ... 1', ..., '1'
'invertido': [66, 61, 56, 51, 46, 41, 36, 31, 26, 21, 16, 11, 6, 1],
'primos': [37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97, 101, 103, 107, 109, 113, 127, 131, 137, 139, 149, 151, 157, 163, 167, 173, 179, 181, 191, 193, 197, 199, 211, 223, 227, 229, 233, 239, 241, 251, 257, 263, 269, 271, 277, 281, 283, 293],
'fibonacci': [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987, 1597, 2584, 4181, 6765, 10946, 17711, 28657, 46368, 75025, 121393, 196418, 317811, 514229, 832040, 1346269, 2178309, 3524578, 5702887, 9227465, 14930352, 24157817, 39088169, 63245986, 102334155, 165580141, 267914296, 433494437, 701408733, 1134903170, 1836311903, 2971215073, 4807526976, 7778742049, 12586269025, 20365011074, 32951280099, 53316291173, 86267571272, 139583862445, 225851433717, 365435296162, 591286729879, 956722026041],
'factorial': 265252859812191058636308480000000,
'pares': [941, 672, 99, 749, 251, 889, 836, 512, 674, 272, 859, 898, 119, 798, 348, 33, 460, 168, 929, 233, 48, 890, 484, 831, 366, 271, 870, 449, 347, 519, 242, 985, 490, 999, 355, 416, 214, 834, 698, 217, 334],
'cubos': [1, 8, 27, 64, 125, 216, 343, 512, 729, 1000, 1331, 1728, 2197, 2744, 3375, 4096, 4913, 5832, 6859, 8000, 9261, 10648, 12167, 13824, 15625, 17576, 19683, 21952, 24389, 27000, 29791, 32768, 35937, 39304, 42875, 46656, 50653, 54872, 59319, 64000, 68921, 74088, 79507, 85184, 91125, 97336, 103823, 110592, 117649, 125000, 132651, 140608, 148877, 157464, 166375, 175616, 185193, 195112, 205379, 216000, 226981, 238328, 250047, 262144, 274625, 287496, 300763, 314432, 328509, 343000, 357911, 373248, 389017, 405224, 421875, 438976, 456533, 474552, 493039, 512000, 531441, 551368, 571787, 592704, 614125, 636056, 658503, 681472, 704969, 729000, 753571, 778688, 804357, 830584, 857375, 884736, 912673, 941192, 970299, 1000000],
'suma_2s': 2469135800,
'patron': '*\n**\n***\n****\n*****\n******\n*******\n********\n*********\n**********\n***********\n************\n*************\n**************\n***************\n****************\n*****************\n******************\n*******************\n********************\n*********************\n**********************\n***********************\n************************\n*************************\n**************************\n***************************\n****************************\n*****************************\n******************************\n*****************************\n****************************\n***************************\n**************************\n*************************\n************************\n***********************\n**********************\n*********************\n********************\n*******************\n******************\n*****************\n****************\n***************\n**************\n*************\n************\n***********\n**********\n*********\n********\n*******\n******\n*****\n****\n***\n**\n*'
}
@pytest.mark.parametrize("var", list(d.keys()))
def test_eval(var):
    eval_test(d, var, control_de_flujo)
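For context (my illustration, not part of the repo): `pytest.mark.parametrize` above expands into one independent test per key of `d`. A self-contained miniature of the same pattern, with a made-up `solve` function standing in for the real `eval_test`/`control_de_flujo` pair:

```python
import pytest

# Hypothetical expected values, playing the role of `d` above.
expected = {'suma': 3, 'producto': 2}

def solve(name):
    # Stand-in for the student module being graded.
    return {'suma': 1 + 2, 'producto': 1 * 2}[name]

# One test case is generated per dictionary key.
@pytest.mark.parametrize("var", list(expected.keys()))
def test_eval(var):
    assert solve(var) == expected[var]
```

Running `pytest` on this file reports two passing tests, `test_eval[suma]` and `test_eval[producto]`.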
# calc.py (tutsplus/taming-python-with-unit-tests, BSD-2-Clause)
class Calculator(object):
    def __init__(self):
        self.total = 0

    def add(self, x, y=None):
        if y is None:
            self.total += x
        else:
            self.total = x + y

    def subtract(self, x, y=None):
        if y is None:
            self.total -= x
        else:
            self.total = x - y

    def multiply(self, x, y=None):
        if y is None:
            self.total *= x
        else:
            self.total = x * y

    def divide(self, x, y=None):
        if y is None:
            self.total /= x
        else:
            self.total = x / y
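A quick usage sketch (my addition, not part of the original repo). The class is restated so the example runs standalone, with the `multipy`/`multiply` typo fixed; note that the two-argument forms overwrite the running total instead of accumulating into it:

```python
class Calculator(object):
    def __init__(self):
        self.total = 0

    def add(self, x, y=None):
        if y is None:
            self.total += x      # one-argument form accumulates
        else:
            self.total = x + y   # two-argument form overwrites the total

    def subtract(self, x, y=None):
        if y is None:
            self.total -= x
        else:
            self.total = x - y

    def multiply(self, x, y=None):
        if y is None:
            self.total *= x
        else:
            self.total = x * y

    def divide(self, x, y=None):
        if y is None:
            self.total /= x
        else:
            self.total = x / y


calc = Calculator()
calc.add(5)        # total = 5
calc.subtract(1)   # total = 4
calc.multiply(3)   # total = 12
calc.divide(4)     # total = 3.0 (true division on Python 3)
print(calc.total)  # → 3.0
```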
# data-purifying-GCN/feature-extraction/datasets/__init__.py (lulujianjie/efficient-person-generation-for-reid, RSA-MD)
from .make_dataloader import make_dataloader
# Tournament/__init__.py (YuMurata/Tournament, MIT)
from .tournament import Tournament, TournamentException
from .tournament import GameWin, CompleteException
from .player import Player, PlayerList
# dataset/list_cls.py (jianpengz/MB-DCNN, MIT)
import pandas as pd
import os
path = 'ISIC/'
if not os.path.isdir(path):
    os.mkdir(path)
# train classification
class_p = 'Training_Add'
txtName = 'ISIC/'+class_p+'_cls.txt'
f = open(txtName, 'a+')
path = 'cls_data/'+class_p+'_resize_crop_cls/'
path_gt = pd.read_csv('cls_data/ISIC-2017_'+class_p+'_Part3_GroundTruth_crop_cls.csv')
labels = path_gt.Labels
names = path_gt.ID
for i in range(len(names)):
    trainIMG = names[i]
    trainGT = str(labels[i])
    result = trainIMG + ' ' + trainGT + '\n'
    f.write(result)
f.close()
aug_num = 9
# val classification
class_p = 'Validation'
txtName = 'ISIC/'+class_p+'_crop'+str(aug_num)+'_cls.txt'
f = open(txtName, 'a+')
path = 'cls_data/'+class_p+'_resize_crop'+str(aug_num)+'_cls/'
path_gt = pd.read_csv('cls_data/ISIC-2017_'+class_p+'_Part3_GroundTruth_crop'+str(aug_num)+'_cls.csv')
labels = path_gt.Labels
names = path_gt.ID
for i in range(len(names)):
    trainIMG = names[i]
    trainGT = str(labels[i])
    result = trainIMG + ' ' + trainGT + '\n'
    f.write(result)
f.close()
# test classification
class_p = 'Testing'
txtName = 'ISIC/'+class_p+'_crop'+str(aug_num)+'_cls.txt'
f = open(txtName, 'a+')
path = 'cls_data/'+class_p+'_resize_crop'+str(aug_num)+'_cls/'
path_gt = pd.read_csv('cls_data/ISIC-2017_'+class_p+'_Part3_GroundTruth_crop'+str(aug_num)+'_cls.csv')
labels = path_gt.Labels
names = path_gt.ID
for i in range(len(names)):
    trainIMG = names[i]
    trainGT = str(labels[i])
    result = trainIMG + ' ' + trainGT + '\n'
    f.write(result)
f.close()
# site-packages/pskf/tools/plot/pa/dual.py (jjokella/pyshemkf, MIT)
# Dual
wavewell_dats = {50: '2018_03_13', 70: '2018_03_13',
100: '2018_03_13', 250: '2018_03_13',
500: '2018_03_13', 1000: '2018_03_13',
2000: '2018_03_13'}
wavewell_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
wavewell_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
wavewell_obss = {50: 60, 70: 60, 100: 60, 250: 60,
500: 60, 1000: 60, 2000: 60}
wavewell2_dats = {50: '2019_03_04', 70: '2019_03_04',
100: '2019_03_04', 250: '2019_03_04',
500: '2019_03_04', 1000: '2019_03_04',
2000: '2019_03_04'}
wavewell2_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
wavewell2_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
wavewell2_obss = {50: 60, 70: 60, 100: 60, 250: 60,
500: 60, 1000: 60, 2000: 60}
wavereal_dats = {50: '2018_02_07', 70: '2018_02_07',
100: '2018_02_07', 250: '2018_02_07',
500: '2018_02_07', 1000: '2018_02_07',
2000: '2018_02_07'}
wavereal_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
wavereal_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
wavereal_obss = {50: 100, 70: 100, 100: 100, 250: 100,
500: 100, 1000: 100, 2000: 100}
wavereal2_dats = {50: '2019_03_04', 70: '2019_03_04',
100: '2019_03_04', 250: '2019_03_04',
500: '2019_03_04', 1000: '2019_03_04',
2000: '2019_03_04'}
wavereal2_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
wavereal2_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
wavereal2_obss = {50: 100, 70: 100, 100: 100, 250: 100,
500: 100, 1000: 100, 2000: 100}
corrsmall_wavereal_dats = {50: '2018_09_04', 70: '2018_09_04',
100: '2018_09_04', 250: '2018_09_04',
500: '2018_09_04', 1000: '2018_09_04',
2000: '2018_09_04'}
corrsmall_wavereal_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
corrsmall_wavereal_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
corrsmall_wavereal_obss = {50: 100, 70: 100, 100: 100, 250: 100,
500: 100, 1000: 100, 2000: 100}
corrlarge_wavereal_dats = {50: '2018_09_14', 70: '2018_09_14',
100: '2018_09_14', 250: '2018_09_14',
500: '2018_09_14', 1000: '2018_09_14',
2000: '2018_09_14'}
corrlarge_wavereal_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
corrlarge_wavereal_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
corrlarge_wavereal_obss = {50: 100, 70: 100, 100: 100, 250: 100,
500: 100, 1000: 100, 2000: 100}
corrsmall_wavewell_dats = {50: '2018_09_04', 70: '2018_09_04',
100: '2018_09_04', 250: '2018_09_04',
500: '2018_09_04', 1000: '2018_09_04',
2000: '2018_09_04'}
corrsmall_wavewell_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
corrsmall_wavewell_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
corrsmall_wavewell_obss = {50: 60, 70: 60, 100: 60, 250: 60,
500: 60, 1000: 60, 2000: 60}
corrlarge_wavewell_dats = {50: '2018_09_14', 70: '2018_09_14',
100: '2018_09_14', 250: '2018_09_14',
500: '2018_09_14', 1000: '2018_09_14',
2000: '2018_09_14'}
corrlarge_wavewell_lets = {50: 'b', 70: 'aln', 100: 'bxz', 250: 'dkl',
500: 'ewx', 1000: 'gjj', 2000: 'hvv'}
corrlarge_wavewell_nums = {50: 1000, 70: 1000, 100: 1000, 250: 1000,
500: 100, 1000: 100, 2000: 100}
corrlarge_wavewell_obss = {50: 60, 70: 60, 100: 60, 250: 60,
500: 60, 1000: 60, 2000: 60}
# Codes/Word_count/predict_return.py (repo: ZhouXing19/ContAnalysisFinalProj, MIT license)
###############################################
# Use all word count to predict stock returns
###############################################
import pandas as pd
import pickle
hb = pickle.load(open('dictionary_r.pkl', 'rb'))
hb.filter_extremes(no_below=10, no_above=0.5)
word_ids = hb.keys()
df_main = pickle.load(open('df_bow_ret.pkl', 'rb'))
df_main['FDATE'] = pd.to_datetime(df_main['FDATE'].astype(str))
import datetime as dt
df_train = df_main[df_main['FDATE']<dt.datetime(2019,1,1)].copy()
df_test = df_main[df_main['FDATE']>=dt.datetime(2019,1,1)].copy()
df_train = df_train[['bow', 'ma_ret']]
# df_main = df_main[['bow' ,'RET']]
df_train.columns = ['words', 'ret']
df_train['words'] = df_train['words'].apply(lambda x: [y[0] for y in x])
df_train['ret'] = pd.to_numeric(df_train['ret'], errors = 'coerce')
df_main = df_train
import numpy as np
fv_arr = np.zeros((len(word_ids), 2), dtype=int)
for i in range(len(df_main)):
words = df_main.iloc[i,0]
ret = df_main.iloc[i,1]
for word in words:
fv_arr[word][0] += 1
if ret > 0:
fv_arr[word][1] += 1
FValue = pd.DataFrame(fv_arr)
FValue.columns = ['Denom', 'Nom']
FValue['index'] = FValue.index
FValue['word'] = FValue['index'].apply(lambda x: hb[x])
FValue = FValue[FValue['Denom']!=0]
FValue['FValue'] = FValue['Nom']/FValue['Denom']
pd.set_option('display.max_rows', 200)
neg_df = FValue[FValue['Denom']>2000].sort_values(by='FValue').head(200)[['word', 'FValue']]
pos_df = FValue[FValue['Denom']>2000].sort_values(by='FValue', ascending = False).head(100)[['word', 'FValue']]
pos_words = pos_df['word'].values.tolist()
neg_words = neg_df['word'].values.tolist()
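The block above screens vocabulary by what the script calls an F-value: for each word, the share of filings containing it whose return was positive (`Nom`/`Denom`). A minimal sketch of the same screening on toy data (hypothetical words and returns, no pandas):

```python
from collections import defaultdict

def fvalue_screen(docs, rets, min_count=2):
    """Score each word by the share of containing docs with a positive return."""
    denom = defaultdict(int)  # number of documents containing the word
    nom = defaultdict(int)    # ... of which had a positive return
    for words, ret in zip(docs, rets):
        for w in set(words):
            denom[w] += 1
            if ret > 0:
                nom[w] += 1
    return {w: nom[w] / denom[w] for w in denom if denom[w] >= min_count}

docs = [["gain", "risk"], ["gain"], ["risk", "loss"], ["loss"]]
rets = [0.02, 0.01, -0.03, -0.01]
scores = fvalue_screen(docs, rets)
# "gain" co-occurs only with positive returns -> 1.0; "loss" only with negative -> 0.0
```

Sorting `scores` ascending or descending and taking the head reproduces the `neg_df`/`pos_df` selection above.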
df_main = pickle.load(open('df_bow_ret.pkl', 'rb'))
df_main['FDATE'] = pd.to_datetime(df_main['FDATE'].astype(str))
df_train = df_main[df_main['FDATE']<dt.datetime(2019,1,1)].copy()
df_train['words'] = df_train['bow'].apply(dict)
pos_id = [hb.token2id[i] for i in pos_words]
neg_id = [hb.token2id[i] for i in neg_words]
all_id = neg_id + pos_id
def get_vec(x):
vec = [0]*len(all_id)
for i, word in enumerate(all_id):
if word in x:
vec[i] += x[word]
return vec
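`get_vec` above projects a sparse `{token_id: count}` mapping onto the fixed `all_id` ordering. A standalone equivalent with hypothetical ids:

```python
def project_bow(bow, vocab_ids):
    """Densify a sparse {token_id: count} dict onto a fixed id ordering."""
    return [bow.get(token_id, 0) for token_id in vocab_ids]

vocab_ids = [7, 3, 42]               # stand-in for the neg_id + pos_id order
bow = {3: 2, 42: 5, 99: 1}           # token 99 falls outside the screened vocabulary
vec = project_bow(bow, vocab_ids)    # -> [0, 2, 5]
```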
df_train['vect'] = df_train['words'].apply(get_vec)
df_test['words'] = df_test['bow'].apply(dict)
df_test['vect'] = df_test['words'].apply(get_vec)
df_train['label'] = np.where(df_train['ma_ret']>=0, 1, 0)
df_test['label'] = np.where(df_test['ma_ret']>=0, 1, 0)
from sklearn.linear_model import LogisticRegression
logistic_ret = LogisticRegression(penalty='l2')
logistic_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Logistic Regression:')
print('training: ', logistic_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', logistic_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.ensemble import RandomForestClassifier
rf_ret = RandomForestClassifier()
rf_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Random Forest')
print('training: ', rf_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', rf_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.neural_network import MLPClassifier
nn_ret = MLPClassifier()
nn_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Multi-layer perceptron')
print('training: ', nn_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', nn_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
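The three classifier blocks above repeat the same fit/score/print pattern; it can be factored into one helper. A sketch assuming scikit-learn style estimators, exercised on a trivially separable toy set (not the filings data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_and_report(name, clf, X_train, y_train, X_test, y_test):
    """Fit a classifier and print train/test accuracy, mirroring the blocks above."""
    clf.fit(X_train, y_train)
    print(name)
    print('training: ', clf.score(X_train, y_train))
    print('testing: ', clf.score(X_test, y_test))
    return clf.score(X_test, y_test)

X_train = np.array([[0.0], [1.0], [2.0], [10.0], [11.0], [12.0]])
y_train = np.array([0, 0, 0, 1, 1, 1])
X_test = np.array([[1.5], [11.5]])
y_test = np.array([0, 1])
test_acc = fit_and_report('Logistic Regression:', LogisticRegression(penalty='l2'),
                          X_train, y_train, X_test, y_test)
```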
###############################################
# Use financial dictionary word count to predict stock returns
###############################################
import warnings
warnings.filterwarnings("ignore")
df_main = pickle.load(open('df_bow_ret.pkl', 'rb'))
df_main['FDATE'] = pd.to_datetime(df_main['FDATE'].astype(str))
LM_neg = pd.read_excel('LoughranMcDonald_SentimentWordLists_2018.xlsx', sheet_name = 'Negative', header = None)
LM_neg.columns = ['neg']
LM_neg['neg'] = LM_neg['neg'].apply(lambda x: x.lower())
LM_pos = pd.read_excel('LoughranMcDonald_SentimentWordLists_2018.xlsx', sheet_name = 'Positive', header = None)
LM_pos.columns = ['pos']
LM_pos['pos'] = LM_pos['pos'].apply(lambda x: x.lower())
LM_words = LM_neg['neg'].values.tolist() + LM_pos['pos'].values.tolist()
LM_lst = []
word_lst = []
for i in LM_words:
try:
LM_lst.append(hb.token2id[i])
word_lst.append(i)
except KeyError:
continue
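The try/except loop above keeps only the Loughran-McDonald words that survive in the corpus vocabulary (`hb.token2id`). The same filtering, sketched with a plain dict standing in for the gensim dictionary:

```python
token2id = {"loss": 0, "gain": 1, "impairment": 2}  # stand-in for hb.token2id
lm_words = ["loss", "gain", "windfall"]             # "windfall" is absent from the corpus

lm_ids, kept_words = [], []
for w in lm_words:
    if w in token2id:        # membership test instead of catching KeyError
        lm_ids.append(token2id[w])
        kept_words.append(w)
# lm_ids == [0, 1]; "windfall" is silently dropped
```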
import datetime as dt
df_train = df_main[df_main['FDATE']<dt.datetime(2019,1,1)].copy()
df_test = df_main[df_main['FDATE']>=dt.datetime(2019,1,1)].copy()
def get_vec(x):
vec = [0]*len(LM_lst)
for i, word in enumerate(LM_lst):
if word in x:
vec[i] += x[word]
return vec
df_train['words'] = df_train['bow'].apply(dict)
df_train['vect'] = df_train['words'].apply(get_vec)
df_test['words'] = df_test['bow'].apply(dict)
df_test['vect'] = df_test['words'].apply(get_vec)
df_train['label'] = np.where(df_train['ma_ret']>=0, 1, 0)
df_test['label'] = np.where(df_test['ma_ret']>=0, 1, 0)
logistic_ret = LogisticRegression(penalty='l2')
logistic_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('logistic regression')
print('training: ', logistic_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', logistic_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.ensemble import RandomForestClassifier
rf_ret = RandomForestClassifier()
rf_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('random forest')
print('training: ', rf_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', rf_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.neural_network import MLPClassifier
nn_ret = MLPClassifier()
nn_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('multi-layer perceptron')
print('training: ', nn_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', nn_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
###############################################
# Use all word tf-idf to predict stock returns
###############################################
import pandas as pd
import pickle
hb = pickle.load(open('dictionary_r.pkl', 'rb'))
hb.filter_extremes(no_below=10, no_above=0.5)
word_ids = hb.keys()
df_main = pickle.load(open('df_bow_ret.pkl', 'rb'))
from gensim.models import TfidfModel
corpus = df_main['bow'].values.tolist()
model = TfidfModel(corpus)
df_main['bow_tfidf'] = df_main['bow'].apply(lambda x: model[x])
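`TfidfModel` reweights each bag-of-words: roughly term frequency times log2(N/df), L2-normalised per document under gensim's defaults. A hand-rolled sketch of that scheme on a toy corpus (no gensim, and without gensim's pruning of zero weights):

```python
import math

def tfidf_weight(corpus):
    """Reweight [(token_id, count)] docs by tf * log2(N/df), then L2-normalise."""
    n_docs = len(corpus)
    df = {}
    for doc in corpus:
        for token_id, _ in doc:
            df[token_id] = df.get(token_id, 0) + 1
    weighted = []
    for doc in corpus:
        w = [(t, tf * math.log2(n_docs / df[t])) for t, tf in doc]
        norm = math.sqrt(sum(x * x for _, x in w)) or 1.0
        weighted.append([(t, x / norm) for t, x in w])
    return weighted

bows = [[(0, 2), (1, 1)], [(0, 1)], [(1, 3), (2, 1)]]
out = tfidf_weight(bows)
# a single-token document always normalises to weight 1.0
```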
df_main['FDATE'] = pd.to_datetime(df_main['FDATE'].astype(str))
import datetime as dt
df_train = df_main[df_main['FDATE']<dt.datetime(2019,1,1)].copy()
df_test = df_main[df_main['FDATE']>=dt.datetime(2019,1,1)].copy()
df_FValue = df_train[['bow', 'ma_ret']].copy()
df_FValue.columns = ['words', 'ret']
df_FValue['words'] = df_FValue['words'].apply(lambda x: [y[0] for y in x])
df_FValue['ret'] = pd.to_numeric(df_FValue['ret'], errors = 'coerce')
import numpy as np
fv_arr = np.zeros((len(word_ids), 2), dtype=int)
for i in range(len(df_FValue)):
words = df_FValue.iloc[i,0]
ret = df_FValue.iloc[i,1]
for word in words:
fv_arr[word][0] += 1
if ret > 0:
fv_arr[word][1] += 1
FValue = pd.DataFrame(fv_arr)
FValue.columns = ['Denom', 'Nom']
FValue['index'] = FValue.index
FValue['word'] = FValue['index'].apply(lambda x: hb[x])
FValue = FValue[FValue['Denom']!=0]
FValue['FValue'] = FValue['Nom']/FValue['Denom']
pd.set_option('display.max_rows', 200)
neg_df = FValue[FValue['Denom']>2000].sort_values(by='FValue').head(200)[['word', 'FValue']]
pos_df = FValue[FValue['Denom']>2000].sort_values(by='FValue', ascending = False).head(100)[['word', 'FValue']]
pos_words = pos_df['word'].values.tolist()
neg_words = neg_df['word'].values.tolist()
pos_id = [hb.token2id[i] for i in pos_words]
neg_id = [hb.token2id[i] for i in neg_words]
all_id = neg_id + pos_id
def get_vec(x):
vec = [0]*len(all_id)
for i, word in enumerate(all_id):
if word in x:
vec[i] += x[word]
return vec
df_train['words'] = df_train['bow_tfidf'].apply(dict)
df_train['vect'] = df_train['words'].apply(get_vec)
df_test['words'] = df_test['bow_tfidf'].apply(dict)
df_test['vect'] = df_test['words'].apply(get_vec)
df_train['label'] = np.where(df_train['ma_ret']>=0, 1, 0)
df_test['label'] = np.where(df_test['ma_ret']>=0, 1, 0)
from sklearn.linear_model import LogisticRegression
logistic_ret = LogisticRegression(penalty='l2')
logistic_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Logistic Regression:')
print('training: ', logistic_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', logistic_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.ensemble import RandomForestClassifier
rf_ret = RandomForestClassifier()
rf_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Random Forest')
print('training: ', rf_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', rf_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.neural_network import MLPClassifier
nn_ret = MLPClassifier()
nn_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Multi-layer perceptron')
print('training: ', nn_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', nn_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
###############################################
# Use LM dictionary tf-idf to predict stock returns
###############################################
import warnings
warnings.filterwarnings("ignore")
df_main = pickle.load(open('df_bow_ret.pkl', 'rb'))
df_main['FDATE'] = pd.to_datetime(df_main['FDATE'].astype(str))
from gensim.models import TfidfModel
corpus = df_main['bow'].values.tolist()
model = TfidfModel(corpus)
df_main['bow_tfidf'] = df_main['bow'].apply(lambda x: model[x])
LM_neg = pd.read_excel('LoughranMcDonald_SentimentWordLists_2018.xlsx', sheet_name = 'Negative', header = None)
LM_neg.columns = ['neg']
LM_neg['neg'] = LM_neg['neg'].apply(lambda x: x.lower())
LM_pos = pd.read_excel('LoughranMcDonald_SentimentWordLists_2018.xlsx', sheet_name = 'Positive', header = None)
LM_pos.columns = ['pos']
LM_pos['pos'] = LM_pos['pos'].apply(lambda x: x.lower())
LM_words = LM_neg['neg'].values.tolist() + LM_pos['pos'].values.tolist()
LM_lst = []
word_lst = []
for i in LM_words:
try:
LM_lst.append(hb.token2id[i])
word_lst.append(i)
except KeyError:
continue
import datetime as dt
df_train = df_main[df_main['FDATE']<dt.datetime(2019,1,1)].copy()
df_test = df_main[df_main['FDATE']>=dt.datetime(2019,1,1)].copy()
def get_vec(x):
vec = [0]*len(LM_lst)
for i, word in enumerate(LM_lst):
if word in x:
vec[i] += x[word]
return vec
df_train['words'] = df_train['bow_tfidf'].apply(dict)
df_train['vect'] = df_train['words'].apply(get_vec)
df_test['words'] = df_test['bow_tfidf'].apply(dict)
df_test['vect'] = df_test['words'].apply(get_vec)
df_train['label'] = np.where(df_train['ma_ret']>=0, 1, 0)
df_test['label'] = np.where(df_test['ma_ret']>=0, 1, 0)
logistic_ret = LogisticRegression(penalty='l2')
logistic_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('logistic regression')
print('training: ', logistic_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', logistic_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.ensemble import RandomForestClassifier
rf_ret = RandomForestClassifier()
rf_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('random forest')
print('training: ', rf_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', rf_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.neural_network import MLPClassifier
nn_ret = MLPClassifier()
nn_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('multi-layer perceptron')
print('training: ', nn_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', nn_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
###############################################
# Use Topic Attention to predict stock returns
###############################################
import pandas as pd
import numpy as np
topic_df = pd.read_pickle('topic_change_10_topics.pickle')
def get_vect(x):
lst = [0]*10
y = dict(x)
for i in range(10):
if i in y:
lst[i]+=y[i]
return lst
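`get_vect` densifies a sparse `[(topic_id, weight)]` list into a fixed 10-slot vector. A standalone equivalent with an explicit `n_topics` parameter (hypothetical weights):

```python
def densify_topics(topic_pairs, n_topics=10):
    """Turn sparse (topic_id, weight) pairs into a fixed-length dense vector."""
    dense = [0.0] * n_topics
    for topic_id, weight in dict(topic_pairs).items():
        if 0 <= topic_id < n_topics:
            dense[topic_id] += weight
    return dense

vect = densify_topics([(0, 0.6), (7, 0.4)])
# index 0 holds 0.6, index 7 holds 0.4, every other slot stays 0.0
```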
topic_df['vect'] = topic_df['topics'].apply(get_vect)
df = pd.read_pickle('df_ind_bow.pkl')
df['vect'] = topic_df['vect']
df['RET'] = pd.to_numeric(df['RET'], errors = 'coerce')
df['label'] = np.where(df['ma_ret']>=0, 1, 0)
df['FDATE'] = pd.to_datetime(df['FDATE'].astype(str))
import datetime as dt
df_train = df[df['FDATE']<dt.datetime(2019,1,1)]
df_test = df[df['FDATE']>=dt.datetime(2019,1,1)]
from sklearn.linear_model import LogisticRegression
logistic_ret = LogisticRegression(penalty='l2')
logistic_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Logistic Regression:')
print('training: ', logistic_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', logistic_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
from sklearn.ensemble import RandomForestClassifier
rf_ret = RandomForestClassifier()
rf_ret.fit(np.stack(df_train['vect'], axis=0), df_train['label'])
print('Random Forest')
print('training: ', rf_ret.score(np.stack(df_train['vect'], axis=0), df_train['label']))
print('testing: ', rf_ret.score(np.stack(df_test['vect'], axis=0), df_test['label']))
# tests/test_audioplayer.py (repo: tangb/cleepapp-audioplayer, MIT license)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import unittest
import logging
import sys
sys.path.append('../')
from backend.audioplayer import Audioplayer
from backend.audioplayer import Gst
from backend.audioplayerplaybackupdateevent import AudioplayerPlaybackUpdateEvent
from cleep.exception import InvalidParameter, MissingParameter, CommandError, Unauthorized, CommandInfo
from cleep.libs.tests import session
from mock import Mock, patch, MagicMock
class GstreamerMsg:
pass
class ParseUrlResult:
pass
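Throughout the test class below, private methods are reached as `self.module._Audioplayer__reset_player` and similar; this relies on Python's name mangling of double-underscore attributes. A minimal illustration with a hypothetical class:

```python
class Player:
    def __reset(self):
        # stored on the class as _Player__reset by name mangling
        return "reset"

p = Player()
result = p._Player__reset()   # outside the class, the mangled name is required
# p.__reset would raise AttributeError
```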
class TestAudioplayer(unittest.TestCase):
def setUp(self):
logging.basicConfig(level=logging.FATAL, format=u'%(asctime)s %(name)s:%(lineno)d %(levelname)s : %(message)s')
self.session = session.TestSession(self)
def tearDown(self):
self.session.clean()
def init(self, start=True):
self.module = self.session.setup(Audioplayer)
if start:
self.session.start_module(self.module)
def test_configure(self):
self.init(False)
with patch('backend.audioplayer.Gst') as gstMock:
self.session.start_module(self.module)
gstMock.init.assert_called_with(None)
def test_on_stop(self):
self.init()
player = {
'uuid': 'the-uuid',
'player': 'player-stuff',
}
self.module.players = {'the-uuid': player}
self.module._Audioplayer__destroy_player = Mock()
self.module._on_stop()
self.module._Audioplayer__destroy_player.assert_called_with(player)
self.assertTrue(len(self.module.players.keys()), 0)
def test__prepare_player(self):
self.init()
player = {
'uuid': 'the-uuid',
'player': 'player-stuff',
}
self.module.players = {'the-uuid': player}
self.module._Audioplayer__reset_player = Mock()
self.module._Audioplayer__build_pipeline = Mock()
result = self.module._Audioplayer__prepare_player('the-uuid', 'source', 'audio-format')
self.assertEqual(result, player)
self.module._Audioplayer__reset_player.assert_called_with(player)
self.module._Audioplayer__build_pipeline.assert_called_with('source', 'audio-format', player)
def test__prepare_player_no_player(self):
self.init()
with self.assertRaises(Exception) as cm:
self.module._Audioplayer__prepare_player('the-uuid', 'source', 'audio-format')
self.assertEqual(str(cm.exception), 'Player "the-uuid" does not exist')
def test__create_player(self):
self.init()
self.module._get_unique_id = Mock(return_value='the-uuid')
player = {
"uuid": 'the-uuid',
"playlist": {
"index": None,
"tracks": [],
"repeat": False,
"shuffle": False,
"volume": 0,
"metadata": None,
"duration": None,
},
"player": None,
"source": None,
"volume": None,
"pipeline": [],
"internal": {
"to_destroy": False,
"tags_sent": False,
# NOT TESTED
# "last_state": Gst.State.NULL,
},
}
result = self.module._Audioplayer__create_player()
del result['internal']['last_state']
self.assertEqual(result, player)
def test__reset_player(self):
self.init()
player = Mock()
pipeline_elt1 = Mock()
pipeline_elt2 = Mock()
pipeline_elt3 = Mock()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': ['track1', 'track2', 'track3'],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': 'source',
'volume': 'volume',
'pipeline': [
pipeline_elt1,
pipeline_elt2,
pipeline_elt3,
],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__reset_player(player_data)
player.set_state.assert_called()
pipeline_elt1.unlink.assert_called()
pipeline_elt2.unlink.assert_called()
self.assertFalse(pipeline_elt3.unlink.called)
player.remove.assert_any_call(pipeline_elt1)
player.remove.assert_any_call(pipeline_elt2)
player.remove.assert_any_call(pipeline_elt3)
self.assertIsNone(player_data['player'])
self.assertIsNone(player_data['source'])
self.assertIsNone(player_data['volume'])
self.assertEqual(len(player_data['pipeline']), 0)
self.assertEqual(player_data['playlist']['index'], 1)
self.assertEqual(len(player_data['playlist']['tracks']), 3)
self.assertEqual(player_data['playlist']['volume'], 55)
self.assertEqual(player_data['internal']['to_destroy'], False)
self.assertEqual(player_data['internal']['tags_sent'], False)
def test_destroy_player(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': ['track1', 'track2', 'track3'],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': 'source',
'volume': 'volume',
'pipeline': [Mock()],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._destroy_player(player_data)
self.assertTrue(player_data['internal']['to_destroy'])
def test__destroy_player(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': ['track1', 'track2', 'track3'],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': 'source',
'volume': 'volume',
'pipeline': [Mock()],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__reset_player = Mock()
self.module._Audioplayer__destroy_player(player_data)
self.module._Audioplayer__reset_player.assert_called_with(player_data)
self.assertEqual(len(self.module.players), 0)
@patch('backend.audioplayer.Gst.Pipeline')
@patch('backend.audioplayer.Gst.ElementFactory')
def test__build_pipeline(self, elementFactoryMock, pipelineMock):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': ['track1', 'track2', 'track3'],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': None,
'source': None,
'volume': None,
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
sourceMock = Mock()
self.module._Audioplayer__build_pipeline(sourceMock, 'audio/mpeg', player_data)
pipelineMock.new.assert_called_once_with('the-uuid')
self.assertEqual(len(player_data['pipeline']), len(Audioplayer.AUDIO_PIPELINE_ELEMENTS["audio/mpeg"])+4)
self.assertIsNotNone(player_data['player'])
self.assertIsNotNone(player_data['source'])
self.assertIsNotNone(player_data['volume'])
self.assertEqual(elementFactoryMock.make.call_count, len(player_data['pipeline'])-1) # -1 because source element is created elsewhere
@patch('backend.audioplayer.Gst.Pipeline')
@patch('backend.audioplayer.Gst.ElementFactory')
def test__build_pipeline_exception(self, elementFactoryMock, pipelineMock):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': ['track1', 'track2', 'track3'],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': None,
'source': None,
'volume': None,
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
sourceMock = Mock()
elementFactoryMock.make.side_effect = [Mock(), Mock(), Mock(), None]
with self.assertRaises(Exception) as cm:
self.module._Audioplayer__build_pipeline(sourceMock, 'audio/mpeg', player_data)
self.assertEqual(str(cm.exception), 'Error configuring audio player')
player_data['pipeline'].clear()
def test_on_process(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': ['track1', 'track2', 'track3'],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': None,
'source': None,
'volume': None,
'pipeline': [],
'internal': {
'to_destroy': True,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__process_players_messages = Mock()
self.module._Audioplayer__destroy_player = Mock()
self.module._on_process()
self.module._Audioplayer__process_players_messages.assert_called_once()
self.module._Audioplayer__destroy_player.assert_called_once_with(player_data)
def test_on_process_no_player_to_destroy(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': ['track1', 'track2', 'track3'],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': None,
'source': None,
'volume': None,
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__process_players_messages = Mock()
self.module._Audioplayer__destroy_player = Mock()
self.module._on_process()
self.module._Audioplayer__process_players_messages.assert_called_once()
self.assertEqual(self.module._Audioplayer__destroy_player.call_count, 0)
def test_on_process_no_player(self):
self.init()
self.module._Audioplayer__process_players_messages = Mock()
self.module._Audioplayer__destroy_player = Mock()
self.module._on_process()
self.module._Audioplayer__process_players_messages.assert_called_once()
self.assertEqual(self.module._Audioplayer__destroy_player.call_count, 0)
def test__process_players_messages(self):
self.init()
player1_mock = Mock()
player1_mock.get_bus.return_value.pop.side_effect = ['msg1', 'msg2', None]
player2_mock = Mock()
player2_mock.get_bus.return_value.pop.side_effect = ['msg3', None]
self.module.players = {
'uuid1': {
'uuid': 'uuid1',
'player': player1_mock,
'pipeline': [],
'internal': {
'to_destroy': False,
},
},
'uuid2': {
'uuid': 'uuid2',
'player': player2_mock,
'pipeline': [],
'internal': {
'to_destroy': False,
},
},
}
self.module._Audioplayer__process_gstreamer_message = Mock()
self.module._Audioplayer__process_players_messages()
self.module._Audioplayer__process_gstreamer_message.assert_any_call('uuid1', session.AnyArg(), 'msg1')
self.module._Audioplayer__process_gstreamer_message.assert_any_call('uuid1', session.AnyArg(), 'msg2')
self.module._Audioplayer__process_gstreamer_message.assert_any_call('uuid2', session.AnyArg(), 'msg3')
def test__process_players_messages_no_player(self):
self.init()
self.module._Audioplayer__process_gstreamer_message = Mock()
self.module._Audioplayer__process_players_messages()
self.module._Audioplayer__process_gstreamer_message.assert_not_called()
def test__process_players_messages_exception(self):
self.init()
player1_mock = Mock()
player1_mock.get_bus.return_value.pop.side_effect = Exception('Test exception')
self.module.players = {
'uuid1': {
'uuid': 'uuid1',
'player': player1_mock,
'pipeline': [],
'internal': {
'to_destroy': False,
},
},
}
self.module._Audioplayer__process_gstreamer_message = Mock()
self.module.logger.exception = Mock()
self.module._Audioplayer__process_players_messages()
self.module.logger.exception.assert_called_with('Error processing player "%s" messages', 'uuid1')
def test__process_gstreamer_message_eos(self):
self.init()
msg = GstreamerMsg()
msg.type = Gst.MessageType.EOS
player = Mock()
self.module._Audioplayer__play_next_track = Mock()
self.module._Audioplayer__send_playback_event = Mock()
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
player.set_state.assert_called_with(Gst.State.NULL)
self.module._Audioplayer__play_next_track.assert_called_with('the-uuid')
self.module._Audioplayer__send_playback_event.assert_called_with('the-uuid', player)
def test__process_gstreamer_message_state_changed(self):
self.init()
msg = GstreamerMsg()
msg.type = Gst.MessageType.STATE_CHANGED
player = Mock()
self.module._Audioplayer__play_next_track = Mock()
self.module._Audioplayer__send_playback_event = Mock()
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
player.set_state.assert_not_called()
self.module._Audioplayer__play_next_track.assert_not_called()
self.module._Audioplayer__send_playback_event.assert_called_with('the-uuid', player)
def test__process_gstreamer_message_error(self):
self.init()
msg = GstreamerMsg()
msg.type = Gst.MessageType.ERROR
msg.parse_error = Mock(return_value=('error', 'debug'))
player = Mock()
self.module._Audioplayer__play_next_track = Mock()
self.module._Audioplayer__send_playback_event = Mock()
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
player.set_state.assert_called_with(Gst.State.NULL)
msg.parse_error.assert_called()
self.module._Audioplayer__play_next_track.assert_not_called()
self.module._Audioplayer__send_playback_event.assert_called_with('the-uuid', player)
def test__process_gstreamer_message_tag_with_metadata_complete(self):
self.init()
msg = GstreamerMsg()
msg.type = Gst.MessageType.TAG
tag = {'album': 'dummy'}
msg.parse_tag = Mock(return_value=tag)
player = Mock()
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'playlist': {
'metadata': {}
},
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
}
}
}
self.module._Audioplayer__play_next_track = Mock()
self.module._Audioplayer__send_playback_event = Mock()
self.module._Audioplayer__get_audio_metadata = Mock(return_value=(True, tag))
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
player.set_state.assert_not_called()
msg.parse_tag.assert_called()
self.module._Audioplayer__get_audio_metadata.assert_called_with(tag)
self.module._Audioplayer__play_next_track.assert_not_called()
self.module._Audioplayer__send_playback_event.assert_called()
self.assertTrue(self.module.players['the-uuid']['internal']['tags_sent'])
# call another time to check tags are not read again
msg.parse_tag.reset_mock()
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
msg.parse_tag.assert_not_called()
def test__process_gstreamer_message_tag_with_metadata_incomplete(self):
self.init()
msg = GstreamerMsg()
msg.type = Gst.MessageType.TAG
tag = {'album': 'dummy'}
msg.parse_tag = Mock(return_value=tag)
player = Mock()
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'playlist': {
'metadata': {}
},
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
}
}
}
self.module._Audioplayer__play_next_track = Mock()
self.module._Audioplayer__send_playback_event = Mock()
self.module._Audioplayer__get_audio_metadata = Mock(return_value=(False, tag))
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
player.set_state.assert_not_called()
msg.parse_tag.assert_called()
self.module._Audioplayer__get_audio_metadata.assert_called_with(tag)
self.module._Audioplayer__play_next_track.assert_not_called()
self.module._Audioplayer__send_playback_event.assert_not_called()
self.assertFalse(self.module.players['the-uuid']['internal']['tags_sent'])
# call another time to check tags are parsed again while metadata is incomplete
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
msg.parse_tag.assert_called()
self.assertFalse(self.module.players['the-uuid']['internal']['tags_sent'])
def test__process_gstreamer_message_duration_changed(self):
self.init()
msg = GstreamerMsg()
msg.type = Gst.MessageType.DURATION_CHANGED
player = Mock()
self.module._Audioplayer__play_next_track = Mock()
self.module._Audioplayer__send_playback_event = Mock()
self.module._Audioplayer__process_gstreamer_message('the-uuid', player, msg)
player.set_state.assert_not_called()
self.module._Audioplayer__play_next_track.assert_not_called()
self.module._Audioplayer__send_playback_event.assert_called()
def test__send_playback_event(self):
self.init()
player = Mock()
player.get_state.return_value = ('dummy', Gst.State.PAUSED, 'dummy')
player.query_duration.return_value = (True, 666000000000)
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
'last_state': None,
},
'playlist': {
'index': 0,
'metadata': {},
'tracks': ['track1'],
'duration': 123,
}
}
}
self.module._Audioplayer__send_playback_event('the-uuid', player)
self.assertEqual(self.module.players['the-uuid']['internal']['last_state'], Gst.State.PAUSED)
self.session.assert_event_called_with('audioplayer.playback.update', {
'playeruuid': 'the-uuid',
'state': 'paused',
'index': 0,
'duration': 666,
'metadata': {},
'track': 'track1',
})
def test__send_playback_event_no_duration(self):
self.init()
player = Mock()
player.get_state.return_value = ('dummy', Gst.State.PAUSED, 'dummy')
player.query_duration.return_value = (False, 0)
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
'last_state': None,
},
'playlist': {
'index': 0,
'metadata': {},
'tracks': ['track1'],
'duration': 123,
}
}
}
self.module._Audioplayer__send_playback_event('the-uuid', player)
self.assertEqual(self.module.players['the-uuid']['internal']['last_state'], Gst.State.PAUSED)
self.session.assert_event_called_with('audioplayer.playback.update', {
'playeruuid': 'the-uuid',
'state': 'paused',
'index': 0,
'duration': 123,
'metadata': {},
'track': 'track1',
})
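# The two tests above exercise a duration fallback: GStreamer reports
# durations in nanoseconds, and when query_duration() fails (first tuple
# element False) the stored playlist duration is used instead. A minimal
# standalone sketch of that logic (hypothetical helper, not part of the
# module under test):
def effective_duration(ok, nanoseconds, fallback_seconds):
    # Convert a successful query from nanoseconds to whole seconds,
    # otherwise return the fallback value unchanged.
    return nanoseconds // 1_000_000_000 if ok else fallback_seconds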
def test__send_playback_event_same_state(self):
self.init()
player = Mock()
player.get_state = Mock(return_value=('dummy', Gst.State.PAUSED, 'dummy'))
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
'last_state': Gst.State.PAUSED,
}
}
}
self.module._Audioplayer__send_playback_event('the-uuid', player)
self.assertEqual(self.module.players['the-uuid']['internal']['last_state'], Gst.State.PAUSED)
self.assertEqual(self.session.event_call_count('audioplayer.playback.update'), 0)
def test__send_playback_event_same_state_but_forced(self):
self.init()
player = Mock()
player.get_state = Mock(return_value=('dummy', Gst.State.PAUSED, 'dummy'))
player.query_duration.return_value = (False, 0)
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
'last_state': Gst.State.PAUSED,
},
'playlist': {
'metadata': {},
'index': 0,
'tracks': ['track1'],
'volume': 12,
'duration': 123,
},
}
}
self.module._Audioplayer__send_playback_event('the-uuid', player, force=True)
self.assertEqual(self.module.players['the-uuid']['internal']['last_state'], Gst.State.PAUSED)
self.assertEqual(self.session.event_call_count('audioplayer.playback.update'), 1)
def test__send_playback_event_ready_state(self):
self.init()
player = Mock()
player.get_state = Mock(return_value=('dummy', Gst.State.READY, 'dummy'))
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
'last_state': Gst.State.PAUSED,
}
}
}
self.module._Audioplayer__send_playback_event('the-uuid', player)
self.assertEqual(self.module.players['the-uuid']['internal']['last_state'], Gst.State.PAUSED)
self.assertEqual(self.session.event_call_count('audioplayer.playback.update'), 0)
def test__get_playback_info(self):
self.init()
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
'last_state': Gst.State.PAUSED,
},
'playlist': {
'duration': 123,
'index': 0,
'tracks': ['track1'],
'volume': 50,
'metadata': {},
},
}
}
result = self.module._Audioplayer__get_playback_info('the-uuid')
logging.debug('Playback info: %s', result)
self.assertDictEqual(result, {
'index': 0,
'playeruuid': 'the-uuid',
'track': 'track1',
'metadata': {},
'state': 'paused',
'duration': 123
})
def test__get_playback_info_player_not_found(self):
self.init()
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'tags_sent': False,
'to_destroy': False,
'last_state': Gst.State.PAUSED,
},
'playlist': {
'duration': 123,
'index': 0,
'tracks': ['track1'],
'volume': 50,
'metadata': {},
},
}
}
result = self.module._Audioplayer__get_playback_info('dummy')
logging.debug('Playback info: %s', result)
self.assertDictEqual(result, {
'index': 0,
'playeruuid': 'dummy',
'track': None,
'metadata': {},
'state': 'stopped',
'duration': 0
})
def test__get_audio_metadata(self):
self.init()
tags = Mock()
tags.to_string.return_value = 'all-tags'
tags.nth_tag_name.side_effect = [
'artist',
'album-artist',
'album',
'title',
'genre',
'track-number',
'datetime',
'channel-mode',
'minimum-bitrate',
'maximum-bitrate',
'bitrate',
]
tags.get_string.side_effect = [
(True, '[artist]'),
(True, '[album-artist]'),
(True, '[album]'),
(True, '[title]'),
(True, '[genre]'),
(True, '[channel-mode]'),
]
tags.get_uint.side_effect = [
(True, 2), # track-number
(True, 333), # min bitrate
(True, 999), # max bitrate
(True, 666), # bitrate
]
date_time = Mock()
date_time.has_year.return_value = True
date_time.get_year.return_value = 2021
tags.get_date_time.return_value = (True, date_time)
tags.n_tags.return_value = 11
complete, metadata = self.module._Audioplayer__get_audio_metadata(tags)
logging.debug('Metadata: %s', metadata)
self.assertTrue(complete)
self.assertDictEqual(metadata, {
'artist': '[album-artist]',
'album': '[album]',
'title': '[title]',
'genre': '[genre]',
'year': 2021,
'track': 2,
'channels': '[channel-mode]',
'bitratemin': 333,
'bitratemax': 999,
'bitrateavg': 666,
})
def test__get_audio_metadata_track_string(self):
self.init()
tags = Mock()
tags.to_string.return_value = 'all-tags'
tags.nth_tag_name.side_effect = [
'track-number',
]
tags.get_string.side_effect = [
(True, '3'),
]
tags.get_uint.side_effect = [
(False, 2), # track-number
]
tags.n_tags.return_value = 1
complete, metadata = self.module._Audioplayer__get_audio_metadata(tags)
logging.debug('Metadata: %s', metadata)
self.assertFalse(complete)
self.assertDictEqual(metadata, {
'artist': None,
'album': None,
'title': None,
'genre': None,
'year': None,
'track': '3',
'channels': None,
'bitratemin': None,
'bitratemax': None,
'bitrateavg': None,
})
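# The metadata tests above rely on GStreamer-style getters that return
# (ok, value) tuples, with a string fallback when get_uint() fails for
# 'track-number'. A minimal standalone sketch of that fallback
# (hypothetical helper, not the module's real code):
def read_track_number(tags):
    # Prefer the numeric tag value when available.
    ok, number = tags.get_uint('track-number')
    if ok:
        return number
    # Fall back to the string representation, or None if neither works.
    ok, text = tags.get_string('track-number')
    return text if ok else None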
@patch('backend.audioplayer.magic.from_file')
def test__get_file_audio_format(self, mock_from_file):
self.init()
mock_from_file.return_value = 'audio/mpeg'
result = self.module._Audioplayer__get_file_audio_format('/audio/file/path.mp3')
logging.debug('Format: %s', result)
self.assertEqual(result, 'audio/mpeg')
@patch('backend.audioplayer.magic.from_file')
def test__get_file_audio_format_unknown_format(self, mock_from_file):
self.init()
mock_from_file.return_value = 'audio/dummy'
result = self.module._Audioplayer__get_file_audio_format('/audio/file/path.mp3')
logging.debug('Format: %s', result)
self.assertIsNone(result)
@patch('backend.audioplayer.magic.from_file')
def test__get_file_audio_format_exception(self, mock_from_file):
self.init()
mock_from_file.side_effect = Exception('Test exception')
result = self.module._Audioplayer__get_file_audio_format('/audio/file/path.mp3')
logging.debug('Format: %s', result)
self.assertIsNone(result)
def test_is_filepath(self):
self.init()
with patch('backend.audioplayer.os.path.exists') as exists_mock:
exists_mock.return_value = True
self.assertTrue(self.module._is_filepath('/dummy/resource'))
with patch('backend.audioplayer.os.path.exists') as exists_mock:
with patch('backend.audioplayer.parse_url') as parse_url_mock:
exists_mock.return_value = False
result = ParseUrlResult()
result.scheme = 'https'
parse_url_mock.return_value = result
self.assertFalse(self.module._is_filepath('/dummy/resource'))
with patch('backend.audioplayer.os.path.exists') as exists_mock:
with patch('backend.audioplayer.parse_url') as parse_url_mock:
exists_mock.return_value = False
result = ParseUrlResult()
result.scheme = 'dummy'
parse_url_mock.return_value = result
with self.assertRaises(Exception) as cm:
self.assertFalse(self.module._is_filepath('/dummy/resource'))
self.assertEqual(str(cm.exception), 'Resource is invalid (file may not exist)')
def test_make_track(self):
self.init()
result = self.module._make_track('/dummy/resource', 'audio/dummy')
self.assertDictEqual(result, {
'resource': '/dummy/resource',
'audio_format': 'audio/dummy',
})
def test_add_track(self):
self.init()
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': ['track1'],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
track = self.module._make_track('/dummy/resource', 'audio/mpeg')
with patch('backend.audioplayer.os.path.exists') as exists_mock:
exists_mock.return_value = True
self.module.add_track('the-uuid', '/dummy/resource', 'audio/mpeg')
self.assertDictEqual(self.module.players['the-uuid']['playlist']['tracks'][-1], track)
def test_add_track_playlist_limit_reached(self):
self.init()
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': ['track1'],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
self.module.MAX_PLAYLIST_TRACKS = 3
with patch('backend.audioplayer.os.path.exists') as exists_mock:
exists_mock.return_value = True
self.assertTrue(self.module.add_track('the-uuid', '/dummy/resource', 'audio/mpeg'))
self.assertTrue(self.module.add_track('the-uuid', '/dummy/resource', 'audio/mpeg'))
self.assertTrue(self.module.add_track('the-uuid', '/dummy/resource', 'audio/mpeg'))
self.assertFalse(self.module.add_track('the-uuid', '/dummy/resource', 'audio/mpeg'))
def test_add_track_exception(self):
self.init()
self.module.players = {}
with self.assertRaises(Exception) as cm:
self.module.add_track('the-uuid', '/dummy/resource', 'audio/dummy')
self.assertEqual(str(cm.exception), 'Player "the-uuid" does not exist')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'internal': {
'to_destroy': False
}
}
}
with patch('backend.audioplayer.os.path.exists') as exists_mock:
with patch('backend.audioplayer.parse_url') as parse_url_mock:
exists_mock.return_value = False
result = ParseUrlResult()
result.scheme = 'http'
parse_url_mock.return_value = result
with self.assertRaises(MissingParameter) as cm:
self.module.add_track('the-uuid', '/dummy/resource/url')
self.assertEqual(str(cm.exception), 'Url resource must have audio_format specified')
with self.assertRaises(Exception) as cm:
self.module.add_track('the-uuid', '/dummy/resource', 'audio/dummy')
self.assertEqual(str(cm.exception), 'Audio format "audio/dummy" is not supported')
def test_add_tracks(self):
self.init()
track = self.module._make_track('/dummy/resource', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
self.module.add_track = Mock(return_value=True)
self.module.add_tracks('the-uuid', [track, track, track])
self.assertEqual(self.module.add_track.call_count, 3)
def test_add_tracks_playlist_limit_reached(self):
self.init()
track = self.module._make_track('/dummy/resource', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
self.module.add_track = Mock(side_effect=[True, True, False])
with self.assertRaises(CommandInfo) as cm:
self.module.add_tracks('the-uuid', [track, track, track, track])
self.assertEqual(str(cm.exception), 'All tracks were not added (playlist limit reached)')
def test_remove_track_middle(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track1, track2, track3],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
self.module.remove_track('the-uuid', 1)
logging.debug('Playlist tracks: %s', self.module.players['the-uuid']['playlist']['tracks'])
self.assertListEqual(self.module.players['the-uuid']['playlist']['tracks'], [track1, track3])
def test_remove_track_last(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track1, track2, track3],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
self.module.remove_track('the-uuid', 2)
logging.debug('Playlist tracks: %s', self.module.players['the-uuid']['playlist']['tracks'])
self.assertListEqual(self.module.players['the-uuid']['playlist']['tracks'], [track1, track2])
def test_remove_track_first(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track1, track2, track3],
'index': 1,
},
'internal': {
'to_destroy': False,
},
}
}
self.module.remove_track('the-uuid', 0)
logging.debug('Playlist tracks: %s', self.module.players['the-uuid']['playlist']['tracks'])
self.assertListEqual(self.module.players['the-uuid']['playlist']['tracks'], [track2, track3])
def test_remove_track_exception(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track1, track2, track3],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
with self.assertRaises(InvalidParameter) as cm:
self.module.remove_track('the-uuid', 0)
self.assertEqual(str(cm.exception), "You can't remove current track")
with self.assertRaises(InvalidParameter) as cm:
self.module.remove_track('the-uuid', -1)
self.assertEqual(str(cm.exception), 'Track index is invalid')
with self.assertRaises(InvalidParameter) as cm:
self.module.remove_track('the-uuid', 3)
self.assertEqual(str(cm.exception), 'Track index is invalid')
with self.assertRaises(InvalidParameter) as cm:
self.module.remove_track('dummy', 1)
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
def test_start_playback(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [],
'repeat': False,
'volume': None,
'metadata': {},
},
'player': None,
'source': None,
'volume': None,
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': False,
'last_state': None,
},
}
self.module._Audioplayer__play_track = Mock()
self.module._Audioplayer__destroy_player = Mock()
self.module._Audioplayer__create_player = Mock(return_value=player_data)
result = self.module.start_playback('/resource/dummy')
self.module._Audioplayer__create_player.assert_called()
self.module._Audioplayer__play_track.assert_called_with({'resource': '/resource/dummy', 'audio_format': None}, 'the-uuid', 100, False)
self.assertEqual(result, player_data['uuid'])
self.module._Audioplayer__destroy_player.assert_not_called()
def test_start_playback_exception(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [],
'repeat': False,
'volume': None,
'metadata': {},
},
'player': None,
'source': None,
'volume': None,
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': False,
'last_state': None,
},
}
self.module._Audioplayer__play_track = Mock(side_effect=Exception('Test exception'))
self.module._Audioplayer__destroy_player = Mock()
self.module._Audioplayer__create_player = Mock(return_value=player_data)
with self.assertRaises(CommandError) as cm:
self.module.start_playback('/resource/dummy')
self.assertEqual(str(cm.exception), 'Unable to play resource')
self.module._Audioplayer__create_player.assert_called()
self.module._Audioplayer__play_track.assert_called_with({'resource': '/resource/dummy', 'audio_format': None}, 'the-uuid', 100, False)
self.module._Audioplayer__destroy_player.assert_called_with(player_data)
@patch('backend.audioplayer.Gst.ElementFactory')
@patch('backend.audioplayer.Audioplayer._is_filepath')
def test__play_track_with_file(self, is_filepath_mock, element_factory_mock):
self.init()
is_filepath_mock.return_value = True
player = MagicMock()
self.module._Audioplayer__prepare_player = Mock(return_value=player)
track = self.module._make_track('/resource/dummy', 'audio/mpeg')
self.module._Audioplayer__get_file_audio_format = Mock(return_value='audio/mpeg')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [],
'index': 0,
'volume': 50,
},
'internal': {
'to_destroy': False,
},
}
}
self.module._Audioplayer__play_track(track, 'the-uuid')
logging.debug('Players: %s', self.module.players)
self.module._Audioplayer__get_file_audio_format.assert_called_with('/resource/dummy')
player['source'].set_property.assert_any_call('location', '/resource/dummy')
player['volume'].set_property.assert_any_call('volume', 0.5)
player['player'].set_state.assert_called_with(Gst.State.PLAYING)
self.assertEqual(player['volume'].call_count, 0)
@patch('backend.audioplayer.Gst.ElementFactory')
@patch('backend.audioplayer.Audioplayer._is_filepath')
def test__play_track_with_url(self, is_filepath_mock, element_factory_mock):
self.init()
is_filepath_mock.return_value = False
player = MagicMock()
self.module._Audioplayer__prepare_player = Mock(return_value=player)
track = self.module._make_track('/resource/dummy', 'audio/mpeg')
self.module._Audioplayer__get_file_audio_format = Mock(return_value='audio/mpeg')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [],
'index': 0,
'volume': 50,
},
'internal': {
'to_destroy': False,
},
}
}
self.module._Audioplayer__play_track(track, 'the-uuid')
logging.debug('Players: %s', self.module.players)
self.module._Audioplayer__get_file_audio_format.assert_not_called()
player['source'].set_property.assert_any_call('location', '/resource/dummy')
player['volume'].set_property.assert_any_call('volume', 0.5)
player['player'].set_state.assert_called_with(Gst.State.PLAYING)
self.assertEqual(player['volume'].call_count, 0)
@patch('backend.audioplayer.Gst.ElementFactory')
@patch('backend.audioplayer.Audioplayer._is_filepath')
def test__play_track_with_volume(self, is_filepath_mock, element_factory_mock):
self.init()
is_filepath_mock.return_value = True
player = MagicMock()
self.module._Audioplayer__prepare_player = Mock(return_value=player)
track = self.module._make_track('/resource/dummy', 'audio/mpeg')
self.module._Audioplayer__get_file_audio_format = Mock(return_value='audio/mpeg')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
self.module._Audioplayer__play_track(track, 'the-uuid', 66)
logging.debug('Players: %s', self.module.players)
player['volume'].set_property.assert_called_with('volume', 0.66)
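# The assertion above expects volume 66 to reach GStreamer as 0.66:
# playlist volumes are held as percentages while the gst 'volume'
# property is a 0.0-1.0 float. Hypothetical conversion helper shown for
# illustration only:
def to_gst_volume(percent):
    # Map a 0-100 percentage onto GStreamer's 0.0-1.0 volume scale.
    return percent / 100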
@patch('backend.audioplayer.Gst.ElementFactory')
@patch('backend.audioplayer.Audioplayer._is_filepath')
def test__play_track_get_audio_format_failed(self, is_filepath_mock, element_factory_mock):
self.init()
is_filepath_mock.return_value = True
player = MagicMock()
self.module._Audioplayer__prepare_player = Mock(return_value=player)
track = self.module._make_track('/resource/dummy', 'audio/mpeg')
self.module._Audioplayer__get_file_audio_format = Mock(return_value=None)
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
with self.assertRaises(CommandError) as cm:
self.module._Audioplayer__play_track(track, 'the-uuid')
self.assertEqual(str(cm.exception), 'Audio file not supported')
@patch('backend.audioplayer.Gst.ElementFactory')
@patch('backend.audioplayer.Audioplayer._is_filepath')
def test__play_track_exception(self, is_filepath_mock, element_factory_mock):
self.init()
is_filepath_mock.return_value = True
player = MagicMock()
player['source'].set_property.side_effect = Exception('Test exception')
self.module._Audioplayer__prepare_player = Mock(return_value=player)
track = self.module._make_track('/resource/dummy', 'audio/mpeg')
self.module._Audioplayer__get_file_audio_format = Mock(return_value='audio/mpeg')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
with self.assertRaises(Exception) as cm:
self.module._Audioplayer__play_track(track, 'the-uuid')
self.assertEqual(str(cm.exception), 'Test exception')
def test_get_track_index(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track1, track2, track3],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
index = self.module._get_track_index('the-uuid', track1)
self.assertEqual(index, 0)
index = self.module._get_track_index('the-uuid', track3)
self.assertEqual(index, 2)
index = self.module._get_track_index('the-uuid', track2)
self.assertEqual(index, 1)
def test_get_track_index_unknown_track(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
self.module.players = {
'the-uuid': {
'uuid': 'the-uuid',
'player': None,
'pipeline': [],
'playlist': {
'tracks': [track1, track3],
'index': 0,
},
'internal': {
'to_destroy': False,
},
}
}
index = self.module._get_track_index('the-uuid', track2)
self.assertEqual(index, 0)
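# The test above documents the fallback behaviour: looking up a track
# that is not in the playlist yields index 0 rather than raising. A
# hypothetical sketch of such a lookup (not the module's real
# implementation):
def safe_track_index(tracks, track):
    # list.index() raises ValueError for a missing item; swallow it and
    # default to the first track.
    try:
        return tracks.index(track)
    except ValueError:
        return 0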
def test_pause_playback_while_playing(self):
self.init()
player = Mock()
player.get_state.return_value = ('dummy', Gst.State.PLAYING, 'dummy')
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._set_volume = Mock()
self.module.pause_playback('the-uuid')
player.get_state.assert_called_with(1)
player.set_state.assert_called_with(Gst.State.PAUSED)
self.module._set_volume.assert_not_called()
def test_pause_playback_with_volume(self):
self.init()
player = Mock()
player.get_state.return_value = ('dummy', Gst.State.PLAYING, 'dummy')
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._set_volume = Mock()
self.module.pause_playback('the-uuid', volume=66)
self.module._set_volume.assert_called_with('the-uuid', 66)
def test_pause_playback_force_play(self):
self.init()
player = Mock()
player.get_state.return_value = ('dummy', Gst.State.PLAYING, 'dummy')
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module.pause_playback('the-uuid', force_play=True)
player.set_state.assert_called_with(Gst.State.PLAYING)
def test_pause_playback_force_pause(self):
self.init()
player = Mock()
player.get_state.return_value = ('dummy', Gst.State.PLAYING, 'dummy')
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module.pause_playback('the-uuid', force_pause=True)
player.set_state.assert_called_with(Gst.State.PAUSED)
def test_pause_playback_while_paused(self):
self.init()
player = Mock()
player.get_state.return_value = ('dummy', Gst.State.PAUSED, 'dummy')
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module.pause_playback('the-uuid')
player.get_state.assert_called_with(1)
player.set_state.assert_called_with(Gst.State.PLAYING)
def test_pause_playback_invalid_params(self):
self.init()
with self.assertRaises(InvalidParameter) as cm:
self.module.pause_playback('dummy')
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
def test_stop_playback(self):
self.init()
player = Mock()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._destroy_player = Mock()
self.module.stop_playback('the-uuid')
player.set_state.assert_called_with(Gst.State.NULL)
self.module._destroy_player.assert_called_with(player_data)
self.session.assert_event_called_with('audioplayer.playback.update', {
'playeruuid': 'the-uuid',
'state': Gst.State.NULL,
})
def test_stop_playback_invalid_params(self):
self.init()
with self.assertRaises(InvalidParameter) as cm:
self.module.stop_playback('dummy')
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
def test_play_next_track(self):
self.init()
player = Mock()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__play_next_track = Mock(return_value=True)
result = self.module.play_next_track('the-uuid')
self.assertTrue(result)
self.module._Audioplayer__play_next_track.assert_called_with('the-uuid')
def test_play_next_track_no_more_track(self):
self.init()
player = Mock()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': False,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__play_next_track = Mock(return_value=True)
result = self.module.play_next_track('the-uuid')
self.assertFalse(result)
self.module._Audioplayer__play_next_track.assert_not_called()
def test_play_next_track_no_more_track_but_repeat_enabled(self):
self.init()
player = Mock()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__play_next_track = Mock(return_value=True)
result = self.module.play_next_track('the-uuid')
self.assertTrue(result)
self.module._Audioplayer__play_next_track.assert_called_with('the-uuid')
def test_play_next_track_error_playing_track(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__play_next_track = Mock(return_value=False)
with self.assertRaises(CommandError) as cm:
self.module.play_next_track('the-uuid')
self.assertEqual(str(cm.exception), 'Error playing next track')
def test_play_next_track_invalid_params(self):
self.init()
with self.assertRaises(InvalidParameter) as cm:
self.module.play_next_track('dummy')
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
def test__play_next_track(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [track1, track2],
'repeat': False,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__handle_end_of_playlist = Mock()
self.module._Audioplayer__play_track = Mock()
result = self.module._Audioplayer__play_next_track('the-uuid')
self.assertTrue(result)
self.module._Audioplayer__handle_end_of_playlist.assert_not_called()
self.module._Audioplayer__play_track.assert_called_with(track2, 'the-uuid')
self.assertEqual(player_data['playlist']['index'], 1)
def test__play_next_track_exception(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [track1, track2],
'repeat': False,
'volume': 55,
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__handle_end_of_playlist = Mock()
self.module._Audioplayer__play_track = Mock(side_effect=Exception('Test exception'))
result = self.module._Audioplayer__play_next_track('the-uuid')
self.assertFalse(result)
self.module._Audioplayer__handle_end_of_playlist.assert_not_called()
def test__play_next_track_handle_end_of_playlist(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': False,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__handle_end_of_playlist = Mock(return_value=False)
result = self.module._Audioplayer__play_next_track('the-uuid')
self.assertFalse(result)
self.module._Audioplayer__handle_end_of_playlist.return_value = True
result = self.module._Audioplayer__play_next_track('the-uuid')
self.assertTrue(result)
def test__play_next_track_invalid_params(self):
self.init()
self.assertFalse(self.module._Audioplayer__play_next_track('dummy'))
def test__handle_end_of_playlist(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': False,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._destroy_player = Mock()
result = self.module._Audioplayer__handle_end_of_playlist('the-uuid')
self.assertFalse(result)
self.module._destroy_player.assert_called_with(player_data)
def test__handle_end_of_playlist_repeat_enabled(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': True,
'shuffle': False,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._destroy_player = Mock()
self.module._Audioplayer__play_track = Mock()
self.module.shuffle_playlist = Mock()
result = self.module._Audioplayer__handle_end_of_playlist('the-uuid')
self.assertTrue(result)
self.module._destroy_player.assert_not_called()
self.module._Audioplayer__play_track.assert_called_with(track1, 'the-uuid')
self.module.shuffle_playlist.assert_not_called()
def test__handle_end_of_playlist_shuffle_enabled(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': True,
'shuffle': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._destroy_player = Mock()
self.module._Audioplayer__play_track = Mock()
self.module.shuffle_playlist = Mock()
result = self.module._Audioplayer__handle_end_of_playlist('the-uuid')
self.assertTrue(result)
self.module._destroy_player.assert_not_called()
self.module._Audioplayer__play_track.assert_called_with(track1, 'the-uuid')
self.module.shuffle_playlist.assert_called()
def test_play_previous_track(self):
self.init()
player = Mock()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__play_track = Mock()
result = self.module.play_previous_track('the-uuid')
self.assertTrue(result)
self.assertEqual(player_data['playlist']['index'], 0)
self.module._Audioplayer__play_track.assert_called_with(track1, 'the-uuid')
def test_play_previous_track_first_track(self):
self.init()
player = Mock()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': player,
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__play_track = Mock(return_value=True)
result = self.module.play_previous_track('the-uuid')
self.assertFalse(result)
self.assertEqual(player_data['playlist']['index'], 0)
self.module._Audioplayer__play_track.assert_not_called()
def test_play_previous_track_error_playing_track(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': 1,
},
}
self.module.players = {'the-uuid': player_data}
self.module._Audioplayer__play_track = Mock(side_effect=Exception('Test exception'))
with self.assertRaises(Exception) as cm:
self.module.play_previous_track('the-uuid')
self.assertEqual(str(cm.exception), 'Test exception')
def test_play_previous_track_invalid_params(self):
self.init()
with self.assertRaises(InvalidParameter) as cm:
self.module.play_previous_track('dummy')
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
def test_get_players(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
'duration': 666,
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': Gst.State.PLAYING,
},
}
self.module.players = {'the-uuid': player_data}
players = self.module.get_players()
logging.debug('Players: %s', players)
self.assertListEqual(players, [{
'playeruuid': 'the-uuid',
'track': track2,
'state': 'playing',
'duration': 666,
'index': 1,
'metadata': {},
}])
def test_get_playlist(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 1,
'tracks': [track1, track2],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': Gst.State.PLAYING,
},
}
self.module.players = {'the-uuid': player_data}
playlist = self.module.get_playlist('the-uuid')
logging.debug('Playlist: %s', playlist)
self.assertEqual(playlist['index'], 1)
self.assertListEqual(playlist['tracks'], [track1, track2])
def test_get_playlist_invalid_params(self):
self.init()
with self.assertRaises(InvalidParameter) as cm:
self.module.get_playlist('dummy')
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
def test_set_volume(self):
self.init()
volume = Mock()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [track1],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': volume,
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': Gst.State.PLAYING,
},
}
self.module.players = {'the-uuid': player_data}
self.module.set_volume('the-uuid', 1)
self.module.set_volume('the-uuid', 100)
self.module.set_volume('the-uuid', 66)
self.assertEqual(player_data['playlist']['volume'], 66)
volume.set_property.assert_called_with('volume', 0.66)
def test_set_volume_invalid_params(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [{}],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': Gst.State.PLAYING,
},
}
self.module.players = {'the-uuid': player_data}
with self.assertRaises(InvalidParameter) as cm:
self.module.set_volume('dummy', 50)
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
with self.assertRaises(InvalidParameter) as cm:
self.module.set_volume('the-uuid', -1)
self.assertEqual(str(cm.exception), 'Volume must be between 1 and 100')
with self.assertRaises(InvalidParameter) as cm:
self.module.set_volume('the-uuid', 101)
self.assertEqual(str(cm.exception), 'Volume must be between 1 and 100')
def test_set_repeat(self):
self.init()
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [{}],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': Gst.State.PLAYING,
},
}
self.module.players = {'the-uuid': player_data}
self.module.set_repeat('the-uuid', False)
self.assertFalse(player_data['playlist']['repeat'])
self.module.set_repeat('the-uuid', True)
self.assertTrue(player_data['playlist']['repeat'])
def test_set_repeat_invalid_params(self):
self.init()
with self.assertRaises(InvalidParameter) as cm:
self.module.set_repeat('dummy', True)
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
def test_shuffle_playlist_first_track_playing(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 0,
'tracks': [track1, track2, track3],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': Gst.State.PLAYING,
},
}
self.module.players = {'the-uuid': player_data}
self.module.shuffle_playlist('the-uuid')
self.assertDictEqual(player_data['playlist']['tracks'][0], track1)
self.assertEqual(player_data['playlist']['index'], 0)
def test_shuffle_playlist_last_track_playing(self):
self.init()
track1 = self.module._make_track('/resource/track1', 'audio/dummy')
track2 = self.module._make_track('/resource/track2', 'audio/dummy')
track3 = self.module._make_track('/resource/track3', 'audio/dummy')
player_data = {
'uuid': 'the-uuid',
'playlist': {
'index': 2,
'tracks': [track1, track2, track3],
'repeat': True,
'volume': 55,
'metadata': {},
},
'player': Mock(),
'source': Mock(),
'volume': Mock(),
'pipeline': [],
'internal': {
'to_destroy': False,
'tags_sent': True,
'last_state': Gst.State.PLAYING,
},
}
self.module.players = {'the-uuid': player_data}
self.module.shuffle_playlist('the-uuid')
self.assertDictEqual(player_data['playlist']['tracks'][0], track3)
self.assertEqual(player_data['playlist']['index'], 0)
def test_shuffle_playlist_invalid_params(self):
self.init()
with self.assertRaises(CommandError) as cm:
self.module.shuffle_playlist('dummy')
self.assertEqual(str(cm.exception), 'Player "dummy" does not exist')
class TestAudioplayerPlaybackUpdateEvent(unittest.TestCase):
def setUp(self):
logging.basicConfig(level=logging.FATAL, format='%(asctime)s %(name)s:%(lineno)d %(levelname)s : %(message)s')
self.session = session.TestSession(self)
self.event = self.session.setup_event(AudioplayerPlaybackUpdateEvent)
def test_event_params(self):
self.assertEqual(self.event.EVENT_PARAMS, [
"playeruuid",
"state",
"duration",
"track",
"metadata",
"index",
])
if __name__ == '__main__':
# coverage run --omit="*/lib/python*/*","test_*" --concurrency=thread test_audioplayer.py; coverage report -m -i
unittest.main()

# File: topnum/data/__init__.py
# Repo: machine-intelligence-laboratory/OptimalNumberOfTopics (MIT)
from .base_text_collection import BaseTextCollection
from .vowpal_wabbit_text_collection import VowpalWabbitTextCollection

# File: permissions.py
# Repo: Dwarf-Community/dwarf (MIT)
"""REST API permissions"""
from rest_framework.permissions import BasePermission
from .models import Member
class GuildPermissions(BasePermission):
    """Superusers have full access; staff may list guilds; members may retrieve guilds they belong to."""
def has_permission(self, request, view):
return (request.user.is_superuser or
(request.user.is_staff and view.action == 'list') or
(request.user.is_authenticated and view.action == 'retrieve'))
def has_object_permission(self, request, view, obj):
return (request.user.is_superuser or request.user.is_staff or
Member.objects.filter(user=request.user, guild=obj).exists())
class StringPermissions(BasePermission):
    """Anyone may list/retrieve; authenticated users may create; staff may destroy."""
def has_permission(self, request, view):
return (request.user.is_superuser or
(request.user.is_staff and view.action == 'destroy') or
(request.user.is_authenticated and view.action == 'create') or
view.action == 'list' or
view.action == 'retrieve')
class MessagePermissions(BasePermission):
    """Read-only access (list/retrieve) for staff and superusers."""
def has_permission(self, request, view):
return (request.user.is_superuser or
(request.user.is_staff and
(view.action == 'list' or view.action == 'retrieve')))
class UserPermissions(BasePermission):
    """Read-only access (list/retrieve) for staff and superusers."""
def has_permission(self, request, view):
return (request.user.is_superuser or
(request.user.is_staff and
(view.action == 'list' or view.action == 'retrieve')))
class MemberPermissions(BasePermission):
    """Staff may list; members may retrieve member entries of guilds they belong to."""
def has_permission(self, request, view):
return (request.user.is_superuser or
(request.user.is_staff and view.action == 'list') or
(request.user.is_authenticated and view.action == 'retrieve'))
def has_object_permission(self, request, view, obj):
return (request.user.is_superuser or
request.user.is_staff or
Member.objects.filter(user=request.user, guild=obj.guild).exists())
class RolePermissions(BasePermission):
    """Staff may list; members may retrieve roles of guilds they belong to."""
def has_permission(self, request, view):
return (request.user.is_superuser or
(request.user.is_staff and view.action == 'list') or
(request.user.is_authenticated and view.action == 'retrieve'))
def has_object_permission(self, request, view, obj):
return (request.user.is_superuser or
request.user.is_staff or
Member.objects.filter(user=request.user, guild=obj.guild).exists())
class ChannelPermissions(BasePermission):
    """Read-only access (list/retrieve) for staff and superusers."""
def has_permission(self, request, view):
return (request.user.is_superuser or
(request.user.is_staff and
(view.action == 'list' or view.action == 'retrieve')))
def has_object_permission(self, request, view, obj):
return request.user.is_superuser or request.user.is_staff
| 38.486486 | 83 | 0.651685 | 334 | 2,848 | 5.431138 | 0.125749 | 0.175854 | 0.186329 | 0.124035 | 0.870452 | 0.870452 | 0.870452 | 0.870452 | 0.826351 | 0.826351 | 0 | 0 | 0.241924 | 2,848 | 73 | 84 | 39.013699 | 0.840204 | 0.007022 | 0 | 0.692308 | 0 | 0 | 0.034373 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.211538 | false | 0 | 0.038462 | 0.211538 | 0.596154 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 9 |
549eb99c453325176a865ede212ef9996b9ed6f4 | 27,405 | py | Python | pyxform/tests_v1/test_xls2json.py | seadowg/pyxform | 038318c9a20303b45c7722466a550a0526a78da9 | [
"BSD-2-Clause"
] | null | null | null | pyxform/tests_v1/test_xls2json.py | seadowg/pyxform | 038318c9a20303b45c7722466a550a0526a78da9 | [
"BSD-2-Clause"
] | null | null | null | pyxform/tests_v1/test_xls2json.py | seadowg/pyxform | 038318c9a20303b45c7722466a550a0526a78da9 | [
"BSD-2-Clause"
] | null | null | null | import os
from pyxform.tests.utils import DIR as TESTS_DIR
from pyxform.tests_v1.pyxform_test_case import PyxformTestCase
from pyxform.xls2xform import xls2xform_convert
from pyxform.xls2json_backends import xls_to_dict
# Common XLSForms used in below TestCases
CHOICES = """
| survey | | | |
| | type | name | label |
| | select_one l1 | q1 | Q1 |
| {name} | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
"""
# Braces are doubled (${{q1}}) because this template is passed through str.format().
EXTERNAL_CHOICES = """
| survey | | | | |
| | type | name | label | choice_filter |
| | text | q1 | Q1 | |
| | select_one_external l1 | q2 | Q2 | q1=${{q1}} |
| {name} | | | | |
| | list_name | name | q1 | |
| | l1 | 1 | 1 | |
| | l1 | 2 | 2 | |
"""
SETTINGS = """
| survey | | | |
| | type | name | label |
| | text | q1 | Q1 |
| {name} | | | |
| | id_string | title | |
| | my_id | My Survey | |
"""
SURVEY = """
| {name} | | | |
| | type | name | label |
| | text | q1 | Q1 |
"""
class TestXLS2JSONSheetNameHeuristics(PyxformTestCase):
err_similar_found = "the following sheets with similar names were found"
err_survey_required = "You must have a sheet named 'survey'."
err_choices_required = "There should be a choices sheet in this xlsform."
err_ext_choices_required = (
"There should be an external_choices sheet in this xlsform."
)
def test_workbook_to_json__case_insensitive__choices(self):
"""Should not warn/error if optional sheets are not lowercase."""
test_names = ("choices", "Choices", "CHOICES")
for n in test_names:
self.assertPyxformXform(
name="test", md=CHOICES.format(name=n), errored=False, warnings_count=0,
)
def test_workbook_to_json__case_insensitive__external_choices(self):
"""Should not warn/error if optional sheets are not lowercase."""
test_names = ("external_choices", "External_Choices", "EXTERNAL_CHOICES")
for n in test_names:
self.assertPyxformXform(
name="test",
md=EXTERNAL_CHOICES.format(name=n),
errored=False,
warnings_count=0,
)
def test_workbook_to_json__case_insensitive__settings(self):
"""Should not warn/error if optional sheets are not lowercase."""
test_names = ("settings", "Settings", "SETTINGS")
for n in test_names:
self.assertPyxformXform(
name="test",
md=SETTINGS.format(name=n),
errored=False,
warnings_count=0,
)
def test_workbook_to_json__case_insensitive__survey(self):
"""Should not warn/error if the survey sheet is not lowercase."""
test_names = ("survey", "Survey", "SURVEY")
for n in test_names:
self.assertPyxformXform(
name="test", md=SURVEY.format(name=n), errored=False, warnings_count=0,
)
def test_workbook_to_json__ignore_prefixed_name__choices(self):
"""Should ignore sheet name for spelling if prefixed with underscore."""
test_names = ("_choice", "_chioces", "_choics")
for n in test_names:
self.assertPyxformXform(
name="test",
md=CHOICES.format(name=n),
errored=True,
error__contains=[self.err_choices_required],
error__not_contains=[self.err_similar_found, "'{}'".format(n)],
)
def test_workbook_to_json__ignore_prefixed_name__external_choices(self):
"""Should ignore sheet name for spelling if prefixed with underscore."""
test_names = ("_external_choice", "_extrenal_choices", "_externa_choics")
for n in test_names:
self.assertPyxformXform(
name="test",
md=EXTERNAL_CHOICES.format(name=n),
errored=True,
error__contains=[self.err_ext_choices_required],
error__not_contains=[self.err_similar_found, "'{}'".format(n)],
)
def test_workbook_to_json__ignore_prefixed_name__settings(self):
"""Should ignore sheet name for spelling if prefixed with underscore."""
test_names = ("_setting", "_stetings", "_setings")
for n in test_names:
self.assertPyxformXform(
name="test",
md=SETTINGS.format(name=n),
errored=False,
warnings_count=0,
)
def test_workbook_to_json__ignore_prefixed_name__survey(self):
"""Should ignore sheet name for spelling if prefixed with underscore."""
test_names = ("_surveys", "_surve", "_sruvey")
for n in test_names:
self.assertPyxformXform(
name="test",
md=SURVEY.format(name=n),
errored=True,
error__contains=[self.err_survey_required],
error__not_contains=[self.err_similar_found, "'{}'".format(n)],
)
def test_workbook_to_json__misspelled_found__choices(self):
"""Should mention misspellings if similar sheet names found."""
test_names = ("choice", "chioces", "choics")
for n in test_names:
self.assertPyxformXform(
name="test",
md=CHOICES.format(name=n),
errored=True,
error__contains=[
self.err_choices_required,
self.err_similar_found,
"'{}'".format(n),
],
)
def test_workbook_to_json__misspelled_found__choices_exists(self):
"""Should not mention misspellings if the sheet exists."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | |
| | type | name | label |
| | select_one l1 | q1 | Q1 |
| choices | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
| chioces | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
""",
errored=False,
warnings_count=0,
)
def test_workbook_to_json__misspelled_found__choices_multiple(self):
"""Should mention misspellings if similar sheet names found."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | |
| | type | name | label |
| | select_one l1 | q1 | Q1 |
| choice | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
| chioces | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
""",
errored=True,
error__contains=[
self.err_choices_required,
self.err_similar_found,
"'choice'",
"'chioces'",
],
)
def test_workbook_to_json__misspelled_found__external_choices(self):
"""Should mention misspellings if similar sheet names found."""
test_names = ("external_choice", "extrenal_choices", "externa_choics")
for n in test_names:
self.assertPyxformXform(
name="test",
md=EXTERNAL_CHOICES.format(name=n),
errored=True,
error__contains=[
self.err_ext_choices_required,
self.err_similar_found,
"'{}'".format(n),
],
)
def test_workbook_to_json__misspelled_found__external_choices_exists(self):
"""Should not mention misspellings if the sheet exists."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | | |
| | type | name | label | choice_filter |
| | text | q1 | Q1 | |
| | select_one_external l1 | q2 | Q2 | q1=${q1} |
| external_choices | | | | |
| | list_name | name | q1 | |
| | l1 | 1 | 1 | |
| | l1 | 2 | 2 | |
| extrenal_choices | | | | |
| | list_name | name | q1 | |
| | l1 | 1 | 1 | |
| | l1 | 2 | 2 | |
""",
errored=False,
warnings_count=0,
)
def test_workbook_to_json__misspelled_found__external_choices_multiple(self):
"""Should mention misspellings if similar sheet names found."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | | |
| | type | name | label | choice_filter |
| | text | q1 | Q1 | |
| | select_one_external l1 | q2 | Q2 | q1=${q1} |
| external_choice | | | | |
| | list_name | name | q1 | |
| | l1 | 1 | 1 | |
| | l1 | 2 | 2 | |
| extrenal_choices | | | | |
| | list_name | name | q1 | |
| | l1 | 1 | 1 | |
| | l1 | 2 | 2 | |
""",
errored=True,
error__contains=[
self.err_ext_choices_required,
self.err_similar_found,
"'external_choice'",
"'extrenal_choices'",
],
)
def test_workbook_to_json__misspelled_found__settings(self):
"""Should mention misspellings if similar sheet names found."""
test_names = ("setting", "stetings", "setings")
for n in test_names:
self.assertPyxformXform(
name="test",
md=SETTINGS.format(name=n),
errored=False,
warnings__contains=[self.err_similar_found, "'{}'".format(n)],
)
def test_workbook_to_json__misspelled_found__settings_exists(self):
"""Should not mention misspellings if the sheet exists."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | |
| | type | name | label |
| | text | q1 | Q1 |
| settings | | | |
| | id_string | title | |
| | my_id | My Survey | |
| stetings | | | |
| | id_string | title | |
| | my_id | My Survey | |
""",
errored=False,
warnings_count=0,
)
def test_workbook_to_json__misspelled_found__settings_multiple(self):
"""Should mention misspellings if similar sheet names found."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | |
| | type | name | label |
| | text | q1 | Q1 |
| setting | | | |
| | id_string | title | |
| | my_id | My Survey | |
| stetings | | | |
| | id_string | title | |
| | my_id | My Survey | |
""",
errored=False,
warnings__contains=[self.err_similar_found, "'setting'", "'stetings'"],
)
def test_workbook_to_json__misspelled_found__survey(self):
"""Should mention misspellings if similar sheet names found."""
test_names = ("surveys", "surve", "sruvey")
for n in test_names:
self.assertPyxformXform(
name="test",
md=SURVEY.format(name=n),
errored=True,
error__contains=[
self.err_survey_required,
self.err_similar_found,
"'{}'".format(n),
],
)
def test_workbook_to_json__misspelled_found__survey_exists(self):
"""Should not mention misspellings if the sheet exists."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | |
| | type | name | label |
| | text | q1 | Q1 |
| surve | | | |
| | type | name | label |
| | text | q1 | Q1 |
""",
errored=False,
warnings_count=0,
)
def test_workbook_to_json__misspelled_found__survey_multiple(self):
"""Should mention misspellings if similar sheet names found."""
self.assertPyxformXform(
name="test",
md="""
| surveys | | | |
| | type | name | label |
| | text | q1 | Q1 |
| Surve | | | |
| | type | name | label |
| | text | q1 | Q1 |
""",
errored=True,
error__contains=[
self.err_survey_required,
self.err_similar_found,
"'surveys'",
"'surve'",
],
)
def test_workbook_to_json__misspelled_not_found__choices(self):
"""Should not mention misspellings for dissimilar sheet names."""
test_names = ("cho", "ices", "choose")
for n in test_names:
self.assertPyxformXform(
name="test",
md=CHOICES.format(name=n),
errored=True,
error__not_contains=[self.err_similar_found, "'{}'".format(n)],
)
def test_workbook_to_json__misspelled_not_found__external_choices(self):
"""Should not mention misspellings for dissimilar sheet names."""
test_names = ("external", "choices", "eternal_choosey")
for n in test_names:
self.assertPyxformXform(
name="test",
md=EXTERNAL_CHOICES.format(name=n),
errored=True,
error__not_contains=[self.err_similar_found, "'{}'".format(n)],
)
def test_workbook_to_json__misspelled_not_found__settings(self):
"""Should not mention misspellings for dissimilar sheet names."""
test_names = ("hams", "spetltigs", "stetinsg")
for n in test_names:
self.assertPyxformXform(
name="test",
md=SETTINGS.format(name=n),
errored=False,
warnings_count=0,
)
def test_workbook_to_json__misspelled_not_found__survey(self):
"""Should not mention misspellings for dissimilar sheet names."""
test_names = ("hams", "suVVve", "settings")
for n in test_names:
self.assertPyxformXform(
name="test",
md=SURVEY.format(name=n),
errored=True,
error__not_contains=[self.err_similar_found],
)
def test_workbook_to_json__multiple_misspellings__all_ok(self):
"""Should not mention misspellings for complete example with correct spelling."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | | |
| | type | name | label | choice_filter |
| | select_one l1 | q1 | Q1 | |
| | select_one_external l2 | q2 | Q2 | q1=${q1} |
| choices | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
| external_choices | | | |
| | list_name | name | q1 |
| | l2 | 1 | 1 |
| | l2 | 2 | 2 |
| settings | | | |
| | id_string | title |
| | my_id | My Survey |
""",
errored=False,
warnings_count=0,
)
def test_workbook_to_json__multiple_misspellings__survey(self):
"""Should mention misspellings in processing order (su, se, ch, ex)."""
self.assertPyxformXform(
name="test",
md="""
| surveys | | | | |
| | type | name | label | choice_filter |
| | select_one l1 | q1 | Q1 | |
| | select_one_external l2 | q2 | Q2 | q1=${q1} |
| chooses | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
| external_choyces | | | |
| | list_name | name | q1 |
| | l2 | 1 | 1 |
| | l2 | 2 | 2 |
| settyngs | | | |
| | id_string | title |
| | my_id | My Survey |
""",
errored=True,
warnings__not_contains=[self.err_similar_found, "'settyngs'",],
error__contains=[
self.err_survey_required,
self.err_similar_found,
"'surveys'",
],
error__not_contains=[
self.err_choices_required,
"'chooses'",
self.err_ext_choices_required,
"'external_choyces'",
],
)
def test_workbook_to_json__multiple_misspellings__choices(self):
"""Should mention misspellings in processing order (su, se, ch, ex)."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | | |
| | type | name | label | choice_filter |
| | select_one l1 | q1 | Q1 | |
| | select_one_external l2 | q2 | Q2 | q1=${q1} |
| chooses | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
| external_choyces | | | |
| | list_name | name | q1 |
| | l2 | 1 | 1 |
| | l2 | 2 | 2 |
| settings | | | |
| | id_string | title |
| | my_id | My Survey |
""",
errored=True,
warnings__not_contains=[self.err_similar_found, "'settyngs'",],
error__contains=[self.err_choices_required, "'chooses'",],
error__not_contains=[
self.err_survey_required,
"'survey'",
# Not raised because the "select_one l1, q1" is checked first.
self.err_ext_choices_required,
"'external_choyces'",
],
)
def test_workbook_to_json__multiple_misspellings__external_choices(self):
"""Should mention misspellings in processing order (su, se, ch, ex)."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | | |
| | type | name | label | choice_filter |
| | select_one l1 | q1 | Q1 | |
| | select_one_external l2 | q2 | Q2 | q1=${q1} |
| choices | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
| external_choyces | | | |
| | list_name | name | q1 |
| | l2 | 1 | 1 |
| | l2 | 2 | 2 |
| settings | | | |
| | id_string | title |
| | my_id | My Survey |
""",
errored=True,
warnings__not_contains=[self.err_similar_found, "'settyngs'",],
error__contains=[self.err_ext_choices_required, "'external_choyces'",],
error__not_contains=[
self.err_survey_required,
"'survey'",
self.err_choices_required,
"'chooses'",
],
)
def test_workbook_to_json__multiple_misspellings__settings(self):
"""Should mention misspellings in processing order (su, se, ch, ex)."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | | |
| | type | name | label | choice_filter |
| | select_one l1 | q1 | Q1 | |
| | select_one_external l2 | q2 | Q2 | q1=${q1} |
| chooses | | | |
| | list_name | name | label |
| | l1 | 1 | C1 |
| external_choyces | | | |
| | list_name | name | q1 |
| | l2 | 1 | 1 |
| | l2 | 2 | 2 |
| settyngs | | | |
| | id_string | title |
| | my_id | My Survey |
""",
errored=True,
warnings__contains=[self.err_similar_found, "'settyngs'",],
error__contains=[self.err_choices_required, "'chooses'",],
error__not_contains=[
self.err_survey_required,
"'survey'",
# Not raised because the "select_one l1" row (q1) is checked first.
self.err_ext_choices_required,
"'external_choyces'",
],
)
def test_workbook_to_json__optional_sheets_ok(self):
"""Should not warn when valid optional sheet names are provided."""
self.assertPyxformXform(
name="test",
md="""
| survey | | | |
| | type | name | label |
| | text | q1 | Q1 |
| settings | | | |
| | id_string | title | |
| | my_id | My Survey | |
| choices | | | |
| | list_name | name | label |
| | l1 | c1 | One |
""",
errored=False,
warnings_count=0,
)
def test_xls2xform_convert__e2e_with_settings_misspelling(self):
"""Should warn about settings misspelling when running full pipeline."""
file_name = "extra_sheet_names"
warnings = xls2xform_convert(
xlsform_path=os.path.join(TESTS_DIR, "example_xls", file_name + ".xlsx"),
xform_path=os.path.join(TESTS_DIR, "test_output", file_name + ".xml"),
validate=False,
pretty_print=False,
enketo=False,
)
expected = (
"When looking for a sheet named 'settings', the following sheets "
"with similar names were found: 'stettings'"
)
self.assertIn(expected, "\n".join(warnings))
def test_xls_to_dict__extra_sheet_names_are_returned_by_parser(self):
"""Should return all sheet names so that later steps can do spellcheck."""
d = xls_to_dict(
os.path.join(TESTS_DIR, "example_xls", "extra_sheet_names.xlsx")
)
self.assertIn("survey", d)
self.assertIn("my_sheet", d)
self.assertIn("stettings", d)
self.assertIn("choices", d)
| 45.523256 | 89 | 0.405327 | 2,129 | 27,405 | 4.916862 | 0.088304 | 0.026748 | 0.042988 | 0.04872 | 0.866355 | 0.848491 | 0.830149 | 0.798242 | 0.76796 | 0.757642 | 0 | 0.015655 | 0.501186 | 27,405 | 601 | 90 | 45.599002 | 0.75011 | 0.079839 | 0 | 0.731061 | 0 | 0 | 0.516186 | 0.000878 | 0 | 0 | 0 | 0 | 0.066288 | 1 | 0.060606 | false | 0 | 0.00947 | 0 | 0.079545 | 0.001894 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
49d7362f8a70395684c69c0d768c365a91c09c09 | 92 | py | Python | Session-3/Strings/S3SS1.py | saianuragpeddu/python-assignemts | a6bb192f2c0ef8ea86531c1a98f1b76150fa474b | [
"MIT"
] | null | null | null | Session-3/Strings/S3SS1.py | saianuragpeddu/python-assignemts | a6bb192f2c0ef8ea86531c1a98f1b76150fa474b | [
"MIT"
] | null | null | null | Session-3/Strings/S3SS1.py | saianuragpeddu/python-assignemts | a6bb192f2c0ef8ea86531c1a98f1b76150fa474b | [
"MIT"
] | 1 | 2019-07-06T02:37:58.000Z | 2019-07-06T02:37:58.000Z | def countA(word):
return word.count('a')
print(countA("apple"))
print(countA("Apple"))
| 15.333333 | 26 | 0.663043 | 13 | 92 | 4.692308 | 0.615385 | 0.360656 | 0.52459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119565 | 92 | 5 | 27 | 18.4 | 0.753086 | 0 | 0 | 0 | 0 | 0 | 0.119565 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 7 |
49f578d8db09be172adccfd69984c7fb46efc7eb | 155 | py | Python | utils_demo/callbacks/__init__.py | IBM/nesa-demo | 4e87217f44ff66414f78df6962ee8633d89f0cf5 | [
"MIT"
] | 2 | 2021-12-16T13:16:56.000Z | 2022-01-19T14:23:18.000Z | utils_demo/callbacks/__init__.py | SocioProphet/nesa-demo | 4e87217f44ff66414f78df6962ee8633d89f0cf5 | [
"MIT"
] | null | null | null | utils_demo/callbacks/__init__.py | SocioProphet/nesa-demo | 4e87217f44ff66414f78df6962ee8633d89f0cf5 | [
"MIT"
] | 1 | 2022-03-07T19:57:59.000Z | 2022-03-07T19:57:59.000Z | from .download_callback import download_callback
from .select_all_callback import select_all_callback
from .reset_page_callback import reset_page_callback
| 38.75 | 52 | 0.903226 | 22 | 155 | 5.909091 | 0.363636 | 0.323077 | 0.261538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.077419 | 155 | 3 | 53 | 51.666667 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3f7dfe9872ff164590e515daf7c930e1c0804794 | 107,806 | py | Python | ME/srez_model.py | giladddd/MLN | 52876e75671d3fee8905b16931aadc9ecdc7bd4f | [
"MIT"
] | 2 | 2019-04-16T05:04:23.000Z | 2020-05-20T15:31:19.000Z | ME/srez_model.py | giladddd/MLN | 52876e75671d3fee8905b16931aadc9ecdc7bd4f | [
"MIT"
] | null | null | null | ME/srez_model.py | giladddd/MLN | 52876e75671d3fee8905b16931aadc9ecdc7bd4f | [
"MIT"
] | 2 | 2018-12-30T14:16:02.000Z | 2019-08-06T16:43:46.000Z | import sys
import numpy as np
import tensorflow as tf
import scipy.io
import GTools as GT
FLAGS = tf.app.flags.FLAGS
import copy
import myParams
from srez_modelBase import Model
import srez_modelBase
def ConstConvKernel(K1,K2,FCOut):
W=np.zeros([K1,K2,K1,K2,FCOut,FCOut])
for i in range(0,K1):
for j in range(0,K2):
for t in range(0,FCOut):
W[i,j,i,j,t,t]=1
W=np.reshape(W,[K1,K2,K1*K2*FCOut,FCOut])
return W
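Several NetMode branches below build DFT/IDFT matrices via `GT.DFT_matrix` / `GT.IDFT_matrix`. `GT` is project-specific, so the following numpy sketch of a standard N x N DFT matrix is an assumption about its convention (unnormalised, so it is unitary only up to a factor of N):

```python
import numpy as np

def dft_matrix_sketch(N):
    # Hypothetical stand-in for GT.DFT_matrix: entries exp(-2*pi*1j*j*k/N).
    n = np.arange(N)
    return np.exp(-2j * np.pi * np.outer(n, n) / N)

# With this convention, W @ W^H = N * I, so the inverse is conj(W) / N.
```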
def _generator_model(sess, features, labels, channels):
# Upside-down all-convolutional resnet
mapsize = 3
mapsize = myParams.myDict['MapSize']
res_units = [256, 128, 96]
old_vars = tf.global_variables()
# See Arxiv 1603.05027
model = Model('GEN', features)
# H=FLAGS.LabelsH;
# W=FLAGS.LabelsW;
H=myParams.myDict['LabelsH']
W=myParams.myDict['LabelsW']
channelsOut=myParams.myDict['channelsOut']
batch_size=myParams.myDict['batch_size']
DataH=myParams.myDict['DataH']
print("_generator_model")
print("%d %d %d" % (H, W,channels))
if myParams.myDict['NetMode'] == 'SPEN_Local':
print("SPEN_Local mode")
model.add_Split4thDim(2) # now (16, H, W, HNeighbors, 2)
model.add_PixelwiseMultC(1) #,NamePrefix='MapsForMat')
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SPEN_FC':
print("SPEN_FC mode")
model.add_5thDim()
model.add_Permute45()
model.add_Mult2DMCxC(H,1)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASH1DFTxyC_YCC':
print("SMASH1DFTxyC_YCC mode")
# model.print_size('AAA') # (16, 64, 128, 16)
model.add_Split4thDim(2) # now (16, 64, 128, 8, 2)
DFTM=GT.DFT_matrix(H)
IDFTM=GT.IDFT_matrix(H)
DFTM_Half=GT.DFT_matrix(64)
IDFTM_Half=GT.IDFT_matrix(64)
# back to image space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
# YCC: also on PE
model.add_Mult2DMCxCSharedOverFeat(64,1,Trainable=False,InitC=IDFTM_Half)
# CC part
ncc=myParams.myDict['CC_channels']
# CC: model.add_conv2dC(ncc,mapsize=1) # now (16, 64, 128, ncc, 2)
model.add_einsumC('abcd,bcdx->abcx',[64,128,8, ncc])
# back to k-space space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=DFTM)
# YCC: also on PE
model.add_Mult2DMCxCSharedOverFeat(64,1,Trainable=False,InitC=DFTM_Half)
# now conv, from 8 to 2
model.add_conv2dC(2,mapsize=3) # now (16, 64, 128, 2, 2)
# now combine the 2 with the 64
model.add_Permute([0, 1, 3, 2, 4]) # now (16, 64, 2, 128, 2)
model.add_Reshape([16, 128, 128, 1,2])
# model.add_Mult2DMCxCSharedOverFeat(H,1,NamePrefix='MapsForMat')
model.add_Mult2DMCxCSharedOverFeat(H,1,Trainable=False,InitC=IDFTM)
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASH1DFTxyC_XCC':
print("SMASH1DFTxyC_XCC mode")
# model.print_size('AAA') # (16, 64, 128, 16)
model.add_Split4thDim(2) # now (16, 64, 128, 8, 2)
DFTM=GT.DFT_matrix(H)
IDFTM=GT.IDFT_matrix(H)
# back to image space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
# CC part
ncc=myParams.myDict['CC_channels']
# CC: model.add_conv2dC(ncc,mapsize=1) # now (16, 64, 128, ncc, 2)
model.add_einsumC('abcd,bcdx->abcx',[64,128,8, ncc])
# back to k-space space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=DFTM)
# now conv, from 8 to 2
model.add_conv2dC(2,mapsize=3) # now (16, 64, 128, 2, 2)
# now combine the 2 with the 64
model.add_Permute([0, 1, 3, 2, 4]) # now (16, 64, 2, 128, 2)
model.add_Reshape([16, 128, 128, 1,2])
# model.add_Mult2DMCxCSharedOverFeat(H,1,NamePrefix='MapsForMat')
model.add_Mult2DMCxCSharedOverFeat(H,1,Trainable=False,InitC=IDFTM)
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASH1DFTxyC_GCC':
print("SMASH1DFTxyC_GCC mode")
# model.print_size('AAA') # (16, 64, 128, 16)
model.add_Split4thDim(2) # now (16, 64, 128, 8, 2)
DFTM=GT.DFT_matrix(H)
IDFTM=GT.IDFT_matrix(H)
# back to image space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
# CC part
ncc=myParams.myDict['CC_channels']
# CC: model.add_conv2dC(ncc,mapsize=1) # now (16, 64, 128, ncc, 2)
model.add_einsumC('abcd,cdx->abcx',[128,8, ncc])
# back to k-space space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=DFTM)
# now conv, from 8 to 2
model.add_conv2dC(2,mapsize=3) # now (16, 64, 128, 2, 2)
# now combine the 2 with the 64
model.add_Permute([0, 1, 3, 2, 4]) # now (16, 64, 2, 128, 2)
model.add_Reshape([16, 128, 128, 1,2])
# model.add_Mult2DMCxCSharedOverFeat(H,1,NamePrefix='MapsForMat')
model.add_Mult2DMCxCSharedOverFeat(H,1,Trainable=False,InitC=IDFTM)
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASH1DFTxyC_SCC':
print("SMASH1DFTxyC_SCC mode")
# model.print_size('AAA') # (16, 64, 128, 16)
model.add_Split4thDim(2) # now (16, 64, 128, 8, 2)
DFTM=GT.DFT_matrix(H)
IDFTM=GT.IDFT_matrix(H)
# back to image space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
# CC part
ncc=myParams.myDict['CC_channels']
model.add_conv2dC(ncc,mapsize=1) # now (16, 64, 128, ncc, 2)
# back to k-space space on RO
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=DFTM)
# now conv, from 8 to 2
model.add_conv2dC(2,mapsize=3) # now (16, 64, 128, 2, 2)
# now combine the 2 with the 64
model.add_Permute([0, 1, 3, 2, 4]) # now (16, 64, 2, 128, 2)
model.add_Reshape([16, 128, 128, 1,2])
# model.add_Mult2DMCxCSharedOverFeat(H,1,NamePrefix='MapsForMat')
model.add_Mult2DMCxCSharedOverFeat(H,1,Trainable=False,InitC=IDFTM)
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASH1DFTxyC':
print("1DFTxyCMaps mode")
# model.print_size('AAA') # (16, 64, 128, 16)
model.add_Split4thDim(2) # now (16, 64, 128, 8, 2)
# now conv, from 8 to 2
model.add_conv2dC(2,mapsize=3) # now (16, 64, 128, 2, 2)
# now combine the 2 with the 64
model.add_Permute([0, 1, 3, 2, 4]) # now (16, 64, 2, 128, 2)
model.add_Reshape([16, 128, 128, 1,2])
IDFTM=GT.IDFT_matrix(H)
# model.add_Mult2DMCxCSharedOverFeat(H,1,NamePrefix='MapsForMat')
model.add_Mult2DMCxCSharedOverFeat(H,1,Trainable=False,InitC=IDFTM)
model.add_Mult2DMCyCSharedOverFeat(W,1,Trainable=False,InitC=IDFTM)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '1DFTxyCMaps':
print("1DFTxyCMaps mode")
# model.print_size('AAA') # (16, 128, 128, 16)
model.add_Split4thDim(2) # now (16, 128, 128, 8, 2)
# model.print_size('CCC')
# model.add_Permute45()
model.add_Mult2DMCxCSharedOverFeat(H,1,NamePrefix='MapsForMat')
model.add_Mult2DMCyCSharedOverFeat(W,1)
model.add_PixelwiseMultC(1) #,NamePrefix='MapsForMat')
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '2DFTC':
print("2DFTC mode")
model.add_5thDim()
model.add_Permute45()
model.add_Mult2DMCxC(H*W,1)
model.remove_5thDim()
model.add_reshapeTo4D(H,W)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '1DFTxyC':
print("1DFTxyC mode")
model.add_5thDim()
model.add_Permute45()
model.add_Mult2DMCxC(H,1)
model.add_Mult2DMCyC(W,1)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '1DFTxC':
print("1DFTxC mode")
model.add_5thDim()
model.add_Permute45()
model.add_Mult2DMCxC(H,1)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '1DFTyC':
print("1DFTyC mode")
model.add_5thDim()
model.add_Permute45()
model.add_Mult2DMCyC(W,1)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '1DFTy':
print("1DFTy mode")
model.add_Mult2DMCy(W,channelsOut)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '1DFTx':
print("1DFTx mode")
model.add_Mult2DMCx(H,channelsOut)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == '2DFT':
print("2DFT mode")
model.add_Mult2DMCy(W,channelsOut)
model.add_Mult2DMCx(H,channelsOut)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
# if myParams.myDict['NetMode'] == 'RegridTry1':
# print("RegridTry1 mode")
# model.add_PixelwiseMult(2, stddev_factor=1.0)
# model.add_Mult2DMCy(W,channelsOut)
# model.add_Mult2DMCx(H,channelsOut)
# new_vars = tf.global_variables()
# gene_vars = list(set(new_vars) - set(old_vars))
# return model.get_output(), gene_vars
# if myParams.myDict['NetMode'] == 'RegridTry1C':
# print("RegridTry1C mode")
# addBias=myParams.myDict['CmplxBias']>0
# if addBias:
# print("with bias")
# else:
# print("without bias")
# model.add_PixelwiseMult(2, stddev_factor=1.0)
# model.add_5thDim()
# model.add_Permute45()
# model.add_Mult2DMCyC(W,1,add_bias=addBias)
# model.add_Mult2DMCxC(H,1,add_bias=addBias)
# model.remove_5thDim()
# new_vars = tf.global_variables()
# gene_vars = list(set(new_vars) - set(old_vars))
# return model.get_output(), gene_vars
# if myParams.myDict['NetMode'] == 'RegridTry1C2':
# print("RegridTry1C2 mode")
# addBias=myParams.myDict['CmplxBias']>0
# if addBias:
# print("with bias")
# else:
# print("without bias")
# model.add_Split4thDim(2)
# model.add_PixelwiseMultC(1, stddev_factor=1.0)
# model.add_Mult2DMCyC(W,1,add_bias=addBias)
# model.add_Mult2DMCxC(H,1,add_bias=addBias)
# model.remove_5thDim()
# new_vars = tf.global_variables()
# gene_vars = list(set(new_vars) - set(old_vars))
# return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'RegridTry3C2_TS_WithTSB':
print("RegridTry3C2_TS_WithTSB mode")
FullData=scipy.io.loadmat(myParams.myDict['NMAP_FN'])
NMapCR=FullData['NMapCR']
NMapCR = tf.constant(NMapCR)
aDataH=myParams.myDict['aDataH']
aDataW=myParams.myDict['aDataW']
achannelsIn=myParams.myDict['achannelsIn']
nTS=myParams.myDict['nTimeSegments']
nccInData=myParams.myDict['nccInData']
nTraj=myParams.myDict['nTraj']
HalfDataH=np.int32(DataH/2)
# 133068/2 = 66534
# model.print_shape('Start') # now 16,133068,1,1
# model.add_Permute([0,2,3,1])
# model.add_Split4thDim(2) # now 16,1,1,133068/2,2C
model.add_Reshape([batch_size,1,1,HalfDataH,2])
# Now do TSB
model.add_Permute([0,3,2,1,4]) # now 16,133068/2,1,1,2C
model.add_Reshape([batch_size,nTraj,nccInData,1,2])
model.add_Permute([0,2,1,3,4]) # now 16 13 5118 1 2
model.add_Reshape([batch_size*nccInData,1,nTraj,1,2])
model.add_PixelwiseMultC(nTS, stddev_factor=1.0) # This is TSB. After: 16*13,1,5118,nTS,2
model.add_Reshape([batch_size,nccInData,nTraj,nTS,2])
model.add_Permute([2,1,4,0,3]) # now 5118 13 2 16 nTS
model.add_Reshape([nTraj*nccInData*2,batch_size*nTS,1,1])
# model.add_Permute([1,0,2,3])
# model.print_shape()
feature=model.get_output()
feature=tf.gather(feature,NMapCR,validate_indices=None,name=None)
# feature = tf.reshape(feature, [aDataH, aDataW, achannelsIn])
model.add_PutInOutput(feature) # After we're 131,131,192,16*nTS
model.add_Permute([3,0,1,2,4,5]) # Now 16*nTS,131,131,192,1,1
# model.add_Reshape([batch_size,nTS,aDataH,aDataW,2,96])
# model.print_shape()
model.add_Reshape([batch_size*nTS,aDataH,aDataW,achannelsIn]) # Now 16*nTS,131,131,192
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
if addBias:
print("with bias")
else:
print("without bias")
model.add_Split4thDim(2) # Now we're batch_size*nTS, kH,kW, Neighbors(12)*Channels(8),2C
model.add_PixelwiseMultC(1, stddev_factor=1.0) # After we're batch_size*nTS,kH,kW,1,2C
# AfterRegrid_ForOut = tf.identity(model.get_output(), name="AfterRegrid_ForOut")
# model.add_PutInOutput(AfterRegrid_ForOut)
model.add_Reshape([batch_size,nTS,aDataH,aDataW,2])
model.add_Permute([0,2,3,1,4]) # After we're batch_size,kH,kW,nTS, 2C
# AfterRegridP_ForOut = tf.identity(model.get_output(), name="AfterRegridP_ForOut")
# model.add_PutInOutput(AfterRegridP_ForOut)
# Now continuing as without TSB
MM=GT.gDFT_matrix(np.linspace(-50,50,aDataH),H)
MM=np.transpose(MM,axes=[1,0])
if UseSharedWightesInRelaxedFT:
model.add_Mult2DMCyCSharedOverFeat(W,1,add_bias=addBias,Trainable=False,InitC=MM,NamePrefix='FTy')
model.add_Mult2DMCxCSharedOverFeat(H,1,add_bias=addBias,Trainable=False,InitC=MM,NamePrefix='FTx')
else:
model.add_Mult2DMCyC(W,1,add_bias=addBias)
model.add_Mult2DMCxC(H,1,add_bias=addBias)
# AfterFT_ForOut = tf.identity(model.get_output(), name="AfterFT_ForOut")
# model.add_PutInOutput(AfterFT_ForOut)
# now supposedly batch_size,H,W,nTS
model.add_PixelwiseMultC(1, stddev_factor=1.0,NamePrefix='TSC') # This collects the different time segments into the final image.
# AfterTSC_ForOut = tf.identity(model.get_output(), name="AfterTSC_ForOut")
# model.add_PutInOutput(AfterTSC_ForOut)
model.remove_5thDim()
# EndForOut = tf.identity(model.get_output(), name="EndForOut")
# model.add_PutInOutput(EndForOut)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'RegridTry3C2FT_TS':
print("RegridTry3C2_TS mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
# achannelsIn=myParams.myDict['achannelsIn']
nTS=myParams.myDict['nTimeSegments']
nTSI=myParams.myDict['nTimeSegmentsI']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
# FullData=scipy.io.loadmat(myParams.myDict['NMAP_FN'])
# NMapCR=FullData['NMapCR']
# NMapCR = tf.constant(NMapCR)
nccInData=myParams.myDict['nccInData']
# ncc=8
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
achannelsIn=ncc*nNeighbors*2
BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
Traj=NUFTData['Trajm2'][0:2,:]
NMapCR=GT.GenerateNeighborsMap(Traj,kMax,aDataH,nccInData,ncc,nNeighbors)
NMapCR = tf.constant(NMapCR)
tmp=model.get_output()
tmp1=tf.slice(tmp,[0,0,0,0],[batch_size,132964,1,1])
TSCR=tf.slice(tmp,[0,132964,0,0],[batch_size,128*128*12*1,1,1])
TSCI=tf.slice(tmp,[0,132964+128*128*12*1,0,0],[batch_size,128*128*12*1,1,1])
Sens1DR=tf.slice(tmp,[0,132964+128*128*12*2,0,0],[batch_size,128*128*13*1,1,1])
Sens1DI=tf.slice(tmp,[0,132964+128*128*12*2+128*128*13,0,0],[batch_size,128*128*13*1,1,1])
TSCR=tf.reshape(TSCR,[batch_size,128,128,12,1])
TSCI=tf.reshape(TSCI,[batch_size,128,128,12,1])
Sens1DR=tf.reshape(Sens1DR,[batch_size,128,128,13,1])
Sens1DI=tf.reshape(Sens1DI,[batch_size,128,128,13,1])
# TSCRI=tf.concat([tf.stack([TSCR],axis=4),tf.stack([TSCI],axis=4)],axis=4)
# SensRI=tf.concat([tf.stack([Sens1DR],axis=4),tf.stack([Sens1DI],axis=4)],axis=4)
TSCRI=tf.concat([TSCR,TSCI],axis=4)
SensRI=tf.concat([Sens1DR,Sens1DI],axis=4)
model.add_PutInOutput(tmp1)
model.add_Permute([1,0,2,3]) # now we're 133068,16,1,1
# model.print_shape()
feature=model.get_output()
feature=tf.gather(feature,NMapCR,validate_indices=None,name=None) # After 131,131,192,16
# feature = tf.reshape(feature, [aDataH, aDataW, achannelsIn])
model.add_PutInOutput(feature)
model.add_Permute([3,0,1,2,4,5]) # After 16,131,131,192,1,1
# model.print_shape()
model.add_Reshape([batch_size,aDataH,aDataW,achannelsIn]) # After 16,131,131,192
model.add_Split4thDim(2) # Now we're kH,kW, Neighbors(12)*Channels(8),2
# model.add_PixelwiseMultC(nTS, stddev_factor=1.0) # After we're batch_size,kH,kW,nTS
InitForRC=[]
if myParams.myDict['InitForRFN'] != 'None':
InitForRM=scipy.io.loadmat(myParams.myDict['InitForRFN'])
InitForRR=InitForRM['gene_GEN_L007_PixelwiseMultC_weightR_0']
InitForRI=InitForRM['gene_GEN_L007_PixelwiseMultC_weightI_0']
InitForRC=InitForRR + 1j * InitForRI
model.add_PixelwiseMultC(nTS, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForRC)
# model.print_shape('aaa')
model.add_CombineFeaturesAndC()
# or remove_5thDim
# k side here
model.add_ConvNetFromList(myParams.myDict['kSideNet'])
#model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
#model.add_elu()
#model.add_conv2dWithName(32, name="ggg", mapsize=1, stride=1, stddev_factor=2.)
#model.add_elu()
#model.add_conv2d(nTS*2, mapsize=5, stride=1, stddev_factor=2.)
# model.print_shape('bbb')
model.add_Split4thDim(2)
# model.print_shape('ccc')
model.add_2DFT()
# First I-side here
model.add_CombineFeaturesAndC()
model.add_ConvNetFromList(myParams.myDict['ISide1Net'])
#model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
#model.add_elu()
#model.add_conv2dWithName(32, name="ggg", mapsize=1, stride=1, stddev_factor=2.)
#model.add_elu()
#model.add_conv2d(nTS*2, mapsize=5, stride=1, stddev_factor=2.)
model.add_Split4thDim(2)
# Voxelwise I-side:
InitForLC=[]
if myParams.myDict['InitForLFN'] != 'None':
InitForLM=scipy.io.loadmat(myParams.myDict['InitForLFN'])
InitForLR=InitForLM['gene_GEN_L010_PixelwiseMultC_weightR_0']
InitForLI=InitForLM['gene_GEN_L010_PixelwiseMultC_weightI_0']
InitForLC=InitForLR + 1j * InitForLI
# model.add_PixelwiseMultC(1, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForLC) # This collecting the different TS to the final image.
if nTSI>0:
model.add_PixelwiseMultC(nTSI, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForLC) # This collects the different time segments into the final image.
# ConcatSensTSC=True
ConcatSensTSC=False
if ConcatSensTSC:
# Concat TSC and sens:
tmp=model.get_output() # 5D, [batch,H,W,Features,2]
# tmp = tf.Print(tmp,[],message='tmp shape: '+str(tmp.get_shape())+' '+str(tmp.dtype))
# tmp = tf.Print(tmp,[],message='TSCRI shape: '+str(TSCRI.get_shape())+' '+str(TSCRI.dtype))
# tmp = tf.Print(tmp,[],message='SensRI shape: '+str(SensRI.get_shape())+' '+str(SensRI.dtype))
tmp=tf.concat([tmp,TSCRI,SensRI],axis=3)
# tmp = tf.Print(tmp,[],message='tmpx shape: '+' '+str(tmp.get_shape()))
model.add_PutInOutput(tmp)
tmp=model.get_output() # 5D, [batch,H,W,Features,2]
# tmp = tf.Print(tmp,[],message='tmp shape: '+str(tmp.get_shape())+' '+str(tmp.dtype))
tmpR=tf.slice(tmp,[0,0,0,0,0],[batch_size,128,128,12,1])
tmpI=tf.slice(tmp,[0,0,0,0,1],[batch_size,128,128,12,1])
RR=tf.multiply(tmpR,TSCR)
RI=tf.multiply(tmpR,TSCI)
IR=tf.multiply(tmpI,TSCR)
II=tf.multiply(tmpI,TSCI)
R=RR+II
I=-RI+IR
tmp=tf.concat([R,I],axis=4)
tmp=tf.reduce_sum(tmp,axis=3)
model.add_PutInOutput(tmp)
# tmp=model.get_output() # 5D, [batch,H,W,Features,2]
# tmp2=tf.squeeze(tf.slice(TSCRI,[0,0,0,8,0],[batch_size,128,128,1,2]))
# tmp2=tf.squeeze(tf.slice(SensRI,[0,0,0,8,0],[batch_size,128,128,1,2]))
# tmp=tmp*0+tmp2
# model.add_PutInOutput(tmp)
# Final I-side here
model.add_CombineFeaturesAndC()
model.add_ConvNetFromList(myParams.myDict['ISide2Net'])
#model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
#model.add_elu()
#model.add_conv2dWithName(32, name="ggg", mapsize=1, stride=1, stddev_factor=2.)
#model.add_elu()
#model.add_conv2d(2, mapsize=5, stride=1, stddev_factor=2.)
# model.add_Split4thDim(2)
# model.remove_5thDim()
# if myParams.myDict['ISide2Net'][-2]==4:
# print("MB")
# model.add_Split4thDim(2)
# model.add_Permute34()
# model.add_Combine34(True)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'GenRegridCNN':
print("GenRegridCNN mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
nTS=myParams.myDict['nTimeSegments']
nTSI=myParams.myDict['nTimeSegmentsI']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
nccInData=myParams.myDict['nccInData']
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
DataH=myParams.myDict['DataH']
DataW=myParams.myDict['DataW']
LabelsH=myParams.myDict['LabelsH']
LabelsW=myParams.myDict['LabelsW']
H=LabelsH
W=LabelsW
achannelsIn=ncc*nNeighbors*2
BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
Traj=NUFTData['Trajm2'][0:2,:]
Kd=NUFTData['Kd']
P=NUFTData['P']
SN=NUFTData['SN']
# NMap,DMap=GT.GenerateNeighborsMapBaseExt(Traj,kMax,aDataH,nNeighbors)
NMap,DMap=GT.GenerateNeighborsMapBaseExt(Traj,64,aDataH,nNeighbors)
NMap=tf.constant(NMap)
# DMap H,W,nNeighbors,2
DMapx=np.reshape(DMap,(1,H,W,nNeighbors*2))
DMapx=np.tile(DMapx,(batch_size,1,1,1))/H
print('DMapx')
print(str(DMapx.shape))
DMapx=tf.constant(DMapx)
X=np.arange(-63,65)
X=np.tile(X,(H,1))
X=X.astype(np.float32)
X=np.reshape(X,(1,H,W))
Y=np.transpose(X,(0,2,1))
XY=np.stack((X,Y),axis=3)
XY=np.tile(XY,(batch_size,1,1,1))
XY=XY/H
XY=tf.constant(XY)
# print('XY')
# print(str(XY.shape))
nTraj=Traj.shape[1]
nCh=nccInData
tmp=model.get_output()
tmp1R=tf.slice(tmp,[0,0,0,0],[batch_size,nTraj*nCh,1,1])
tmp1I=tf.slice(tmp,[0,nTraj*nCh,0,0],[batch_size,nTraj*nCh,1,1])
SigC=tf.complex(tf.reshape(tmp1R,[batch_size,nCh,nTraj]),tf.reshape(tmp1I,[batch_size,nCh,nTraj]))
SigC=tf.transpose(SigC,[0,2,1])
# Traj.shape[0]
nTSC=12
# Sens1DR=tf.slice(tmp,[0,132964+H*W*nTSC*2,0,0],[batch_size,H*W*nCh*1,1,1])
# Sens1DI=tf.slice(tmp,[0,132964+H*W*nTSC*2+H*W*nCh,0,0],[batch_size,H*W*nCh*1,1,1])
# Sens1DR=tf.reshape(Sens1DR,[batch_size,H,W,nCh])
# Sens1DI=tf.reshape(Sens1DI,[batch_size,H,W,nCh])
# SensC=tf.complex(Sens1DR,Sens1DI) # batch_size, H, W, nCh
Reg=tf.gather(SigC,NMap,validate_indices=None,name=None,axis=1) #batch_size, H, W, nNeighbors, nCh
Reg=tf.reshape(Reg,(batch_size, H, W, nNeighbors*nCh))
RegRI=GT.ConcatCOnDim(Reg,3)
model.add_PutInOutput(RegRI)
model.add_concat(DMapx)
model.add_concat(XY)
UseBN=GT.getparam('UseBN')
model.add_ConvNetFromListWithNameAndScope( myParams.myDict['ISide1Net'],name='Net',scope='ConvNet',UseBN=UseBN)
F_RI=model.get_output()
def RItoCon4(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0],[batch_size,H,W,1]),tf.slice(X,[0,0,0,1],[batch_size,H,W,1])))
F_C=RItoCon4(F_RI)
# F_C=tf.squeeze( tf.complex(tf.slice(F_RI,[0,0,0,0],[batch_size,H,W,1]),tf.slice(F_RI,[0,0,0,1],[batch_size,H,W,1])) )
# F_C=tf.transpose(F_C,[0,2,1])
AfterFT=tf.ifft2d(F_C)
# AfterFT=F_C
initR = GT._glorot_initializer_g((1,H,W), stddev_factor=2)
initI = GT._glorot_initializer_g((1,H,W), stddev_factor=2)
weightR = tf.get_variable("AfterFTR", initializer=initR)
weightI = tf.get_variable("AfterFTI", initializer=initI)
weightC=tf.complex(weightR,weightI)
AfterFT=tf.multiply(AfterFT,weightC) # batch_size, H, W
# AfterFT=tf.transpose(AfterFT,[0,2,3,1]) # so batch_size,H,W,nCh
FinalRI=GT.ConcatCOnDimWithStack(AfterFT,3)
model.add_PutInOutput(FinalRI)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'PerChannelKer':
print("96 mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
nTS=myParams.myDict['nTimeSegments']
nTSI=myParams.myDict['nTimeSegmentsI']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
nccInData=myParams.myDict['nccInData']
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
DataH=myParams.myDict['DataH']
DataW=myParams.myDict['DataW']
LabelsH=myParams.myDict['LabelsH']
LabelsW=myParams.myDict['LabelsW']
H=LabelsH
W=LabelsW
achannelsIn=ncc*nNeighbors*2
BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
Traj=NUFTData['Trajm2'][0:2,:]
Kd=NUFTData['Kd']
P=NUFTData['P']
SN=NUFTData['SN']
NMap=GT.GenerateNeighborsMapBase(Traj,kMax,aDataH,nNeighbors)
NMap=tf.constant(NMap)
nTraj=Traj.shape[1]
nCh=nccInData
X=np.arange(-63,65)
X=np.tile(X,(H,1))
X=X.astype(np.float32)
X=np.reshape(X,(1,H,W))
Y=np.transpose(X,(0,2,1))
XY=np.stack((X,Y),axis=3)
XY=np.tile(XY,(batch_size,1,1,1))
XY=XY/H
XY=tf.constant(XY)
tmp=model.get_output()
tmp1R=tf.slice(tmp,[0,0,0,0],[batch_size,nTraj*nCh,1,1])
tmp1I=tf.slice(tmp,[0,nTraj*nCh,0,0],[batch_size,nTraj*nCh,1,1])
SigC=tf.complex(tf.reshape(tmp1R,[batch_size,nCh,nTraj]),tf.reshape(tmp1I,[batch_size,nCh,nTraj]))
SigC=tf.transpose(SigC,[0,2,1])
# Traj.shape[0]
nTSC=12
Sens1DR=tf.slice(tmp,[0,132964+H*W*nTSC*2,0,0],[batch_size,H*W*nCh*1,1,1])
Sens1DI=tf.slice(tmp,[0,132964+H*W*nTSC*2+H*W*nCh,0,0],[batch_size,H*W*nCh*1,1,1])
Sens1DR=tf.reshape(Sens1DR,[batch_size,H,W,nCh])
Sens1DI=tf.reshape(Sens1DI,[batch_size,H,W,nCh])
SensC=tf.complex(Sens1DR,Sens1DI) # batch_size, H, W, nCh
SensRI=tf.concat([Sens1DR,Sens1DI],axis=3)
Reg=tf.gather(SigC,NMap,validate_indices=None,name=None,axis=1) #batch_size, H, W, nNeighbors, nCh
initR = GT._glorot_initializer_g((1,H,W,nNeighbors,1), stddev_factor=2)
initI = GT._glorot_initializer_g((1,H,W,nNeighbors,1), stddev_factor=2)
weightR = tf.get_variable("PerChannelKernelR", initializer=initR)
weightI = tf.get_variable("PerChannelKernelI", initializer=initI)
weightC=tf.complex(weightR,weightI)
Res=tf.reduce_sum(tf.multiply(Reg,weightC) ,axis=3) # batch_size, H, W, nCh
HalfH=H//2
HalfW=W//2
IdH=tf.concat([tf.range(HalfH,H), tf.range(0,HalfH)],axis=0)
IdH=tf.cast(IdH,tf.int32)
IdW=tf.concat([tf.range(HalfW,W), tf.range(0,HalfW)],axis=0)
IdW=tf.cast(IdW,tf.int32)
C = tf.gather(Res,IdH,axis=1)
C = tf.gather(C,IdW,axis=2)
# C = tf.Print(C,[],message=message + ' C shape: '+str(C.get_shape())+' '+str(C.dtype))
BeforeFT=tf.transpose(C,[0,3,1,2]) # so batch_size,nCh,H,W
BeforeFT=tf.transpose(BeforeFT,[0,1,3,2]) # so batch_size,nCh,W,H
AfterFT=tf.ifft2d(BeforeFT)
AfterFT=tf.transpose(AfterFT,[0,2,3,1]) # so batch_size,H,W,nCh
AfterFTRI=GT.ConcatCOnDim(AfterFT,3)
model.add_PutInOutput(AfterFTRI)
model.add_concat(XY)
model.add_concat(SensRI)
UseBN=GT.getparam('UseBN')
model.add_ConvNetFromListWithNameAndScope( myParams.myDict['ISide1Net'],name='Net',scope='ConvNet',UseBN=UseBN)
# WithSens=tf.reduce_sum(tf.multiply(AfterFT,tf.conj(SensC)),axis=3) # batch_size, H, W
# FinalRI=GT.ConcatCOnDimWithStack(WithSens,3)
# model.add_PutInOutput(FinalRI)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'Cart_ISTA_ITS_MB':
print("Cart_ISTA_ITS_MB mode")
nTS=GT.getparam('nTimeSegments')
nTSI=GT.getparam('nTimeSegmentsI')
UseSharedWightesInRelaxedFT=GT.getparam('UseSharedWightesInRelaxedFT')>0
nccInData=GT.getparam('nccInData')
ncc=GT.getparam('nccToUse')
nTSC=GT.getparam('nTimeSegments')
LabelsH=GT.getparam('LabelsH')
LabelsW=GT.getparam('LabelsW')
H=LabelsH
W=LabelsW
nCh=ncc
kMax=H//2 # integer half-width of k-space (keeps np.arange bounds integral)
# kMax=GT.getparam('kMax')
MB=GT.getparam('MB')
# MB=1
X=np.arange(-kMax+1,kMax+1)
X=np.tile(X,(H,1))
X=X.astype(np.float32)
X=np.reshape(X,(1,H,W))
Y=np.transpose(X,(0,2,1))
XY=np.stack((X,Y),axis=3)
XY=np.tile(XY,(batch_size,1,1,1))
XY=XY/H
XY=tf.constant(XY)
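# The block above builds a normalized k-space coordinate grid: X is a ramp
# from -kMax+1 to kMax tiled over rows, Y is its transpose, and both are
# stacked and divided by H. A NumPy sketch with a hypothetical grid size:

```python
import numpy as np

H = W = 8                      # hypothetical; the model uses LabelsH/LabelsW
kMax = H // 2
X = np.arange(-kMax + 1, kMax + 1, dtype=np.float32)  # -kMax+1 .. kMax
X = np.tile(X, (H, 1)).reshape(1, H, W)
Y = np.transpose(X, (0, 2, 1))                        # same ramp, other axis
XY = np.stack((X, Y), axis=3) / H                     # (1,H,W,2) normalized

assert XY.shape == (1, H, W, 2)
assert XY[0, 0, -1, 0] == kMax / H                    # X ramp ends at kMax/H
assert XY[0, -1, 0, 1] == kMax / H                    # Y ramp ends at kMax/H
```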
CurLoc=0
tmp=model.get_output()
def ReadCFrom1D(Data,Loc,Sz):
ProdSz=np.prod(Sz)
tmp1R=Data[:,Loc:(Loc+ProdSz),:1,:1]
tmp1I=Data[:,Loc+ProdSz:Loc+2*ProdSz,:1,:1]
NewSz=np.concatenate(([-1],Sz),axis=0)
C=tf.complex(tf.reshape(tmp1R,NewSz),tf.reshape(tmp1I,NewSz))
AddedLoc=ProdSz*2
return C,AddedLoc
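# ReadCFrom1D unpacks a complex tensor that was flattened into the batch's
# 1-D input as a block of prod(Sz) real entries followed by prod(Sz)
# imaginary entries, starting at Loc. A NumPy sketch of the same packing
# convention (hypothetical shapes), with a round-trip check:

```python
import numpy as np

def read_c_from_1d(data, loc, sz):
    """Read a complex array of shape sz from a flat [N, L] real buffer.

    The buffer stores prod(sz) real entries followed by prod(sz)
    imaginary entries, starting at offset loc."""
    prod = int(np.prod(sz))
    re = data[:, loc:loc + prod].reshape((-1,) + tuple(sz))
    im = data[:, loc + prod:loc + 2 * prod].reshape((-1,) + tuple(sz))
    return re + 1j * im, 2 * prod   # value and how far the cursor advanced

# Round-trip: pack a known complex array, then read it back
rng = np.random.default_rng(0)
c = rng.standard_normal((1, 2, 3)) + 1j * rng.standard_normal((1, 2, 3))
flat = np.concatenate([c.real.reshape(1, -1), c.imag.reshape(1, -1)], axis=1)
out, step = read_c_from_1d(flat, 0, (2, 3))
assert np.allclose(out, c)
assert step == 12
```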
SensC6,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nCh,MB)) # batch_size, H, W, nCh,1
CurLoc=CurLoc+ToAddToLoc
AHA_ITS,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC*MB)) # batch_size, H, W, nTSC*MB
CurLoc=CurLoc+ToAddToLoc
SendTSCest=GT.getparam('SendTSCest')>0
if SendTSCest:
TSCest,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC,MB)) # batch_size, H,W,nTS,MB
CurLoc=CurLoc+ToAddToLoc
AHA_ITSRI=GT.ConcatCOnDim(AHA_ITS,3) # batch_size, H, W, nTSC*MB*RI
SendWarmStart=GT.getparam('SendWarmStart')>0
if SendWarmStart:
WarmStart,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC*MB)) # batch_size, H,W,nTS,MB
CurLoc=CurLoc+ToAddToLoc
y0RI=GT.ConcatCOnDim(WarmStart,3) # batch_size, H, W, nTSC*MB*RI
# ShapeA=AHA_ITSRI.get_shape()
# ShapeB=y0RI.get_shape()
# y0RI=tf.Print(y0RI,[],'AHA_ITSRI shape '+str(ShapeA))
# y0RI=tf.Print(y0RI,[],'y0RI shape '+str(ShapeB))
else:
y0RI=AHA_ITSRI*0
# y0RI=GT.ConcatCOnDim(y0,3) # batch_size, H, W, nTSC*MB*RI
# SensC6 is batch_size, H, W, nCh,MB
# SensC6 should be H,W,/nTSC/,nCh,MB,batch_size
SensC6=tf.transpose(GT.TF_addDim(SensC6),[1,2,5,3,4,0])
model.add_PutInOutput(y0RI)
# NetList=GT.getparam('ISide1Net')
NetList=copy.deepcopy(GT.getparam('ISide1Net'))
nExtraFeatures=NetList[-2]
# nExtraFeatures=0
# ExtraFeatures=tf.constant(tf.zeros([batch_size,H,W,nExtraFeatures]),tf.float32)
ExtraFeatures=tf.zeros([batch_size,H,W,nExtraFeatures],tf.float32)
model.add_concat(ExtraFeatures)
NetList[-2]=nTSC*MB*2+nExtraFeatures
# model.add_concat(XY)
# model.add_concat(SensRI)
UseBN=GT.getparam('UseBN')
# model.add_ConvNetFromListWithNameAndScope(myParams.myDict['ISide1Net'],name='Net',scope='ConvNet',UseBN=UseBN,AddDirectConnection=True)
Iters=GT.getparam('Iterations')
nIter=Iters.shape[0]
# nIter=1
# AllItersRes=tf.Variable(tf.zeros([batch_size,H*(nIter+1),W*nTS*MB,2],tf.float32))
ResList=[]
def RItoCon5(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0,0],[-1,-1,-1,-1,1]),tf.slice(X,[0,0,0,0,1],[-1,-1,-1,-1,1])),axis=4)
def RItoCon6(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0,0,0],[-1,-1,-1,-1,-1,1]),tf.slice(X,[0,0,0,0,0,1],[-1,-1,-1,-1,-1,1])),axis=5)
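# RItoCon5/RItoCon6 rebuild a complex tensor from a trailing real/imag axis
# of size 2 (slice plane 0 is the real part, plane 1 the imaginary part,
# then squeeze). A NumPy equivalent with hypothetical shapes:

```python
import numpy as np

def ri_to_complex(x):
    # x[..., 0] is the real plane, x[..., 1] the imaginary plane
    return x[..., 0] + 1j * x[..., 1]

x = np.zeros((1, 2, 2, 2), dtype=np.float32)
x[..., 0] = 3.0   # real part
x[..., 1] = 4.0   # imaginary part
c = ri_to_complex(x)
assert c.shape == (1, 2, 2)
assert np.allclose(np.abs(c), 5.0)   # |3+4j| == 5
```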
CartMask=GT.getparam('CartMask')
for Iter in range(0, nIter):
CurEstRIa=model.get_output() # batch_size,H,W,2*nTS*MB
Cury0FeatRI=CurEstRIa[:,:,:,:2*nTS*MB]
ExtraFeatures=CurEstRIa[:,:,:,2*nTS*MB:]
CurEstRIa=tf.reshape(Cury0FeatRI,[batch_size,H,W,2,nTS,MB])
CurEstRI=tf.transpose(CurEstRIa,[0,1,2,4,5,3])
CurEst=RItoCon6(CurEstRI) # batch_size,H,W,nTSC,MB
# new simpler approach
if SendTSCest:
CurEstForAHA=CurEst*TSCest
else:
CurEstForAHA=CurEst
# if SendTSCest:
# CurEstForAHA=CurEst*TSCest
# Finaln_ITS_RI=GT.ConcatCOnDimWithStack(CurEstForAHA,5) # batch_size,H,W,nTSC,MB, RI
# Finaln_ITS_RI=tf.transpose(Finaln_ITS_RI,[0,1,4,3,2,5]) # batch_size,H,MB,nTSC,W,RI
# else:
# new simpler approach removed this:
# CurEstForAHA=CurEst
# TSCest0=tf.Print(TSCest0,[tfrm(TSCest0)],'TSCest0 ')
# print_op = tf.print("tensors:", CurEstForAHA.get_shape(),output_stream=sys.stderr)
# with tf.control_dependencies([print_op]):
# CurEstForAHA = CurEstForAHA * 1
Finaln_ITS_RI=tf.transpose(CurEstRIa,[0,1,5,4,2,3]) # batch_size,H,MB,nTSC,W,RI
Finaln_ITS_RI=tf.reshape(Finaln_ITS_RI,[batch_size,H,W*nTSC*MB,2])
ResList.append(Finaln_ITS_RI)
# AllItersRes[:,Iter*H:(Iter+1)*H,:,:]=Finaln_ITS_RI
# AllItersRes=tf.assign(AllItersRes[:,Iter*H:(Iter+1)*H,:,:],Finaln_ITS_RI)
# AllItersRes = AllItersRes[:,Iter*H:(Iter+1)*H,:,:].assign(Finaln_ITS_RI)
# InImage is batch_size,H,W,nTSC,MB
# AHA_CurEst=GT.TS_NUFFT_OPHOP_ITS_MB(CurEstForAHA,SensC6,H,W,batch_size,paddingsYMB,nTSC,nCh,fftkernc7)
AHA_CurEst=GT.Cartesian_OPHOP_ITS_MB(CurEstForAHA,SensC6,CartMask)
# AHA_CurEst = tf.Print(AHA_CurEst,[],message='AAAAAAA')
# new simpler approach
if SendTSCest:
# print('Applying TSCest')
# AHA_CurEst = tf.Print(AHA_CurEst,[],message='xx Applying TSCest')
AHA_CurEst=AHA_CurEst*tf.conj(TSCest)
# batch_size,H,W,nTSC,MB?
AHA_CurEst=tf.reshape(AHA_CurEst,[batch_size,H,W,nTS*MB])
AHA_CurEstRI=GT.ConcatCOnDim(AHA_CurEst,3)
model.add_PutInOutput(Cury0FeatRI)
model.add_concat(AHA_ITSRI)
model.add_concat(AHA_CurEstRI)
model.add_concat(ExtraFeatures)
NetName='Net'+str(Iters[Iter])
model.add_ConvNetFromListWithNameAndScope(NetList,name=NetName,scope='ConvNet',UseBN=UseBN,AddDirectConnection=True) # , stddev_factor=0.3
# test to show warm start
# t0= tf.get_variable('t0', initializer=tf.cast(1.0,tf.float32))
# model.add_PutInOutput(y0RI+t0*0)
Iter=Iter+1 # advance past the unrolled loop (kept for the commented AllItersRes indexing below)
CurEstRIa=model.get_output() # batch_size,H,W,2*nTS*MB
CurEstRIa=CurEstRIa[:,:,:,:2*nTS*MB]
CurEstRIa=tf.reshape(CurEstRIa,[batch_size,H,W,2,nTS,MB])
CurEstRI=tf.transpose(CurEstRIa,[0,1,2,4,5,3])
CurEst=RItoCon6(CurEstRI) # batch_size,H,W,nTSC,MB
# if SendTSCest:
# CurEstForAHA=CurEst*TSCest
# Finaln_ITS_RI=GT.ConcatCOnDimWithStack(CurEstForAHA,5) # batch_size,H,W,nTSC,MB, RI
# Finaln_ITS_RI=tf.transpose(Finaln_ITS_RI,[0,1,4,3,2,5]) # batch_size,H,MB,nTSC,W,RI
# else:
# CurEstForAHA=CurEst
Finaln_ITS_RI=tf.transpose(CurEstRIa,[0,1,5,4,2,3]) # batch_size,H,MB,nTSC,W,RI
# Finaln_ITS_RI=model.get_output() # batch_size,H,W,2*nTS*MB
# Finaln_ITS_RI=Finaln_ITS_RI[:,:,:,:2*nTS*MB]
# Finaln_ITS_RI=tf.reshape(Finaln_ITS_RI,[batch_size,H,W,2,nTS,MB])
# Finaln_ITS_RI=tf.transpose(Finaln_ITS_RI,[0,1,5,4,2,3]) # batch_size,H,W,nTSC,MB,RI
Finaln_ITS_RI=tf.reshape(Finaln_ITS_RI,[batch_size,H,W*nTSC*MB,2])
# AllItersRes[:,Iter*H:(Iter+1)*H,:,:]=Finaln_ITS_RI
# AllItersRes=tf.assign(AllItersRes[:,Iter*H:(Iter+1)*H,:,:],Finaln_ITS_RI)
# AllItersRes = AllItersRes[:,Iter*H:(Iter+1)*H,:,:].assign(Finaln_ITS_RI)
ResList.append(Finaln_ITS_RI)
AllItersRes=tf.concat(ResList,axis=1)
# model.add_PutInOutput(Finaln_ITS_RI)
model.add_PutInOutput(AllItersRes)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'ISTA_ITS_MB':
print("ISTA_ITS_MB mode")
aDataH=GT.getparam('aDataH')
kMax=GT.getparam('kMax')
aDataW=GT.getparam('aDataW')
nTS=GT.getparam('nTimeSegments')
nTSI=GT.getparam('nTimeSegmentsI')
UseSharedWightesInRelaxedFT=GT.getparam('UseSharedWightesInRelaxedFT')>0
RelaxedFT=GT.getparam('RelaxedFT')>0
addBias=GT.getparam('CmplxBias')>0
nccInData=GT.getparam('nccInData')
ncc=GT.getparam('nccToUse')
nNeighbors=GT.getparam('nNeighbors')
nTSC=GT.getparam('nTimeSegments')
LabelsH=GT.getparam('LabelsH')
LabelsW=GT.getparam('LabelsW')
H=LabelsH
W=LabelsW
nCh=ncc
MB=GT.getparam('MB')
X=np.arange(-kMax+1,kMax+1)
X=np.tile(X,(H,1))
X=X.astype(np.float32)
X=np.reshape(X,(1,H,W))
Y=np.transpose(X,(0,2,1))
XY=np.stack((X,Y),axis=3)
XY=np.tile(XY,(batch_size,1,1,1))
XY=XY/H
XY=tf.constant(XY)
CurLoc=0
tmp=model.get_output()
def ReadCFrom1D(Data,Loc,Sz):
ProdSz=np.prod(Sz)
tmp1R=Data[:,Loc:(Loc+ProdSz),:1,:1]
tmp1I=Data[:,Loc+ProdSz:Loc+2*ProdSz,:1,:1]
NewSz=np.concatenate(([-1],Sz),axis=0)
C=tf.complex(tf.reshape(tmp1R,NewSz),tf.reshape(tmp1I,NewSz))
AddedLoc=ProdSz*2
return C,AddedLoc
SensC6,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nCh,MB)) # batch_size, H, W, nCh,1
CurLoc=CurLoc+ToAddToLoc
SendTSCest=GT.getparam('SendTSCest')>0
if SendTSCest:
TSCest,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC,MB)) # batch_size, H,W,nTS,MB
CurLoc=CurLoc+ToAddToLoc
SendSig=False # NOTE: the SendSig branch below references SensC5, which is not defined in this mode; keep False
if SendSig:
NMap=GT.getparam('NMap')
Kd=GT.getparam('Kd')
nTraj=GT.getparam('nTraj')
TSBF=GT.getparam('TSBF')
SN=GT.getparam('SN')
sp_C=GT.getparam('sp_C')
SigC,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(nCh,nTraj))
CurLoc=CurLoc+ToAddToLoc
SigC=tf.transpose(SigC,[0,2,1])
Reg=tf.gather(SigC,NMap,validate_indices=None,name=None,axis=1) #batch_size, H, W, nNeighbors, nCh
initR = GT._glorot_initializer_g((1,H,W,nNeighbors,1,nTS), stddev_factor=2)
initI = GT._glorot_initializer_g((1,H,W,nNeighbors,1,nTS), stddev_factor=2)
weightR = tf.get_variable("PerChannelKernelR", initializer=initR)
weightI = tf.get_variable("PerChannelKernelI", initializer=initI)
weightC=tf.complex(weightR,weightI)
Reg=GT.TF_5d_to_6d(Reg)
Res=tf.reduce_sum(tf.multiply(Reg,weightC),axis=3) # batch_size, H, W, nCh, nTS
HalfH=H//2 # integer division so the tf.range bounds stay integral under Python 3
HalfW=W//2
IdH=tf.concat([tf.range(HalfH,H), tf.range(0,HalfH)],axis=0)
IdH=tf.cast(IdH,tf.int32)
IdW=tf.concat([tf.range(HalfW,W), tf.range(0,HalfW)],axis=0)
IdW=tf.cast(IdW,tf.int32)
C = tf.gather(Res,IdH,axis=1)
C = tf.gather(C,IdW,axis=2)
BeforeFT=tf.transpose(C,[0,3,4,1,2]) # so batch_size,nCh,nTS,H,W
BeforeFT=tf.transpose(BeforeFT,[0,1,2,4,3]) # so batch_size,nCh,nTS,W,H
AfterFT=tf.ifft2d(BeforeFT)
AfterFT=tf.transpose(AfterFT,[0,3,4,1,2]) # so batch_size,H,W,nCh,nTS
y0=tf.reduce_sum( tf.multiply(AfterFT, tf.conj(SensC5)),axis=3) # batch_size,H,W,nTS
else:
y0,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC*MB)) # batch_size, H, W, nTSC*MB
CurLoc=CurLoc+ToAddToLoc
paddingsYMB=GT.getparam('paddingsYMB')
fftkernc7=GT.getparam('fftkernc7')
AHA_ITS,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC*MB)) # batch_size, H, W, nTSC*MB
CurLoc=CurLoc+ToAddToLoc
AHA_ITSRI=GT.ConcatCOnDim(AHA_ITS,3) # batch_size, H, W, nTSC*MB*RI
y0RI=GT.ConcatCOnDim(y0,3) # batch_size, H, W, nTSC*MB*RI
# SensC6 is batch_size, H, W, nCh,MB
# SensC6 should be H,W,/nTSC/,nCh,MB,batch_size
SensC6=tf.transpose(GT.TF_addDim(SensC6),[1,2,5,3,4,0])
model.add_PutInOutput(y0RI)
# NetList=GT.getparam('ISide1Net')
NetList=copy.deepcopy(GT.getparam('ISide1Net'))
nExtraFeatures=NetList[-2]
# nExtraFeatures=0
# ExtraFeatures=tf.constant(tf.zeros([batch_size,H,W,nExtraFeatures]),tf.float32)
ExtraFeatures=tf.zeros([batch_size,H,W,nExtraFeatures],tf.float32)
model.add_concat(ExtraFeatures)
NetList[-2]=nTSC*MB*2+nExtraFeatures
# model.add_concat(XY)
# model.add_concat(SensRI)
UseBN=GT.getparam('UseBN')
# model.add_ConvNetFromListWithNameAndScope(myParams.myDict['ISide1Net'],name='Net',scope='ConvNet',UseBN=UseBN,AddDirectConnection=True)
Iters=GT.getparam('Iterations')
nIter=Iters.shape[0]
# nIter=1
# AllItersRes=tf.Variable(tf.zeros([batch_size,H*(nIter+1),W*nTS*MB,2],tf.float32))
ResList=[]
def RItoCon5(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0,0],[-1,-1,-1,-1,1]),tf.slice(X,[0,0,0,0,1],[-1,-1,-1,-1,1])),axis=4)
def RItoCon6(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0,0,0],[-1,-1,-1,-1,-1,1]),tf.slice(X,[0,0,0,0,0,1],[-1,-1,-1,-1,-1,1])),axis=5)
for Iter in range(0, nIter):
CurEstRIa=model.get_output() # batch_size,H,W,2*nTS*MB
Cury0FeatRI=CurEstRIa[:,:,:,:2*nTS*MB]
ExtraFeatures=CurEstRIa[:,:,:,2*nTS*MB:]
CurEstRIa=tf.reshape(Cury0FeatRI,[batch_size,H,W,2,nTS,MB])
CurEstRI=tf.transpose(CurEstRIa,[0,1,2,4,5,3])
CurEst=RItoCon6(CurEstRI) # batch_size,H,W,nTSC,MB
if SendTSCest:
CurEstForAHA=CurEst*TSCest
Finaln_ITS_RI=GT.ConcatCOnDimWithStack(CurEstForAHA,5) # batch_size,H,W,nTSC,MB, RI
Finaln_ITS_RI=tf.transpose(Finaln_ITS_RI,[0,1,4,3,2,5]) # batch_size,H,MB,nTSC,W,RI
else:
CurEstForAHA=CurEst
Finaln_ITS_RI=tf.transpose(CurEstRIa,[0,1,5,4,2,3]) # batch_size,H,MB,nTSC,W,RI
Finaln_ITS_RI=tf.reshape(Finaln_ITS_RI,[batch_size,H,W*nTSC*MB,2])
ResList.append(Finaln_ITS_RI)
# AllItersRes[:,Iter*H:(Iter+1)*H,:,:]=Finaln_ITS_RI
# AllItersRes=tf.assign(AllItersRes[:,Iter*H:(Iter+1)*H,:,:],Finaln_ITS_RI)
# AllItersRes = AllItersRes[:,Iter*H:(Iter+1)*H,:,:].assign(Finaln_ITS_RI)
# InImage is batch_size,H,W,nTSC,MB
AHA_CurEst=GT.TS_NUFFT_OPHOP_ITS_MB(CurEstForAHA,SensC6,H,W,batch_size,paddingsYMB,nTSC,nCh,fftkernc7)
# batch_size,H,W,nTSC,MB?aux?
AHA_CurEst=tf.reshape(AHA_CurEst,[batch_size,H,W,nTS*MB])
AHA_CurEstRI=GT.ConcatCOnDim(AHA_CurEst,3)
model.add_PutInOutput(Cury0FeatRI)
model.add_concat(AHA_ITSRI)
model.add_concat(AHA_CurEstRI)
model.add_concat(ExtraFeatures)
NetName='Net'+str(Iters[Iter])
model.add_ConvNetFromListWithNameAndScope(NetList,name=NetName,scope='ConvNet',UseBN=UseBN,AddDirectConnection=True) # , stddev_factor=0.3
# test to show warm start
# t0= tf.get_variable('t0', initializer=tf.cast(1.0,tf.float32))
# model.add_PutInOutput(y0RI+t0*0)
Iter=Iter+1
CurEstRIa=model.get_output() # batch_size,H,W,2*nTS*MB
CurEstRIa=CurEstRIa[:,:,:,:2*nTS*MB]
CurEstRIa=tf.reshape(CurEstRIa,[batch_size,H,W,2,nTS,MB])
CurEstRI=tf.transpose(CurEstRIa,[0,1,2,4,5,3])
CurEst=RItoCon6(CurEstRI) # batch_size,H,W,nTSC,MB
if SendTSCest:
CurEstForAHA=CurEst*TSCest
Finaln_ITS_RI=GT.ConcatCOnDimWithStack(CurEstForAHA,5) # batch_size,H,W,nTSC,MB, RI
Finaln_ITS_RI=tf.transpose(Finaln_ITS_RI,[0,1,4,3,2,5]) # batch_size,H,MB,nTSC,W,RI
else:
CurEstForAHA=CurEst
Finaln_ITS_RI=tf.transpose(CurEstRIa,[0,1,5,4,2,3]) # batch_size,H,MB,nTSC,W,RI
# Finaln_ITS_RI=model.get_output() # batch_size,H,W,2*nTS*MB
# Finaln_ITS_RI=Finaln_ITS_RI[:,:,:,:2*nTS*MB]
# Finaln_ITS_RI=tf.reshape(Finaln_ITS_RI,[batch_size,H,W,2,nTS,MB])
# Finaln_ITS_RI=tf.transpose(Finaln_ITS_RI,[0,1,5,4,2,3]) # batch_size,H,W,nTSC,MB,RI
Finaln_ITS_RI=tf.reshape(Finaln_ITS_RI,[batch_size,H,W*nTSC*MB,2])
# AllItersRes[:,Iter*H:(Iter+1)*H,:,:]=Finaln_ITS_RI
# AllItersRes=tf.assign(AllItersRes[:,Iter*H:(Iter+1)*H,:,:],Finaln_ITS_RI)
# AllItersRes = AllItersRes[:,Iter*H:(Iter+1)*H,:,:].assign(Finaln_ITS_RI)
ResList.append(Finaln_ITS_RI)
AllItersRes=tf.concat(ResList,axis=1)
# model.add_PutInOutput(Finaln_ITS_RI)
model.add_PutInOutput(AllItersRes)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'ISTA_ITS':
print("ISTA_ITS mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
nTS=myParams.myDict['nTimeSegments']
nTSI=myParams.myDict['nTimeSegmentsI']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
nccInData=myParams.myDict['nccInData']
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
nTSC=GT.getparam('nTimeSegments')
# DataH=myParams.myDict['DataH']
# DataW=myParams.myDict['DataW']
LabelsH=myParams.myDict['LabelsH']
LabelsW=myParams.myDict['LabelsW']
H=LabelsH
W=LabelsW
# nCh=nccInData
nCh=ncc
X=np.arange(-63,65) # 128 samples; assumes H=W=128 (kMax=64)
X=np.tile(X,(H,1))
X=X.astype(np.float32)
X=np.reshape(X,(1,H,W))
Y=np.transpose(X,(0,2,1))
XY=np.stack((X,Y),axis=3)
XY=np.tile(XY,(batch_size,1,1,1))
XY=XY/H
XY=tf.constant(XY)
tmp=model.get_output()
CurLoc=0
def ReadCFrom1D(Data,Loc,Sz):
ProdSz=np.prod(Sz)
tmp1R=tf.slice(Data,[0,Loc,0,0],[batch_size,ProdSz,1,1])
tmp1I=tf.slice(Data,[0,Loc+ProdSz,0,0],[batch_size,ProdSz,1,1])
NewSz=np.concatenate(([-1],Sz),axis=0)
C=tf.complex(tf.reshape(tmp1R,NewSz),tf.reshape(tmp1I,NewSz))
AddedLoc=ProdSz*2
return C,AddedLoc
# Sens1DR=tf.slice(tmp,[0, CurLoc,0,0],[batch_size,H*W*nCh*1,1,1])
# Sens1DI=tf.slice(tmp,[0, CurLoc+H*W*nCh,0,0],[batch_size,H*W*nCh*1,1,1])
# Sens1DR=tf.reshape(Sens1DR,[batch_size,H,W,nCh])
# Sens1DI=tf.reshape(Sens1DI,[batch_size,H,W,nCh])
# SensC=tf.complex(Sens1DR,Sens1DI) # batch_size, H, W, nCh
# SensC5=GT.TF_4d_to_5d(SensC) # batch_size, H, W, nCh,1
SensC5,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nCh,1)) # batch_size, H, W, nCh,1
CurLoc=CurLoc+ToAddToLoc
# SensRI=tf.concat([Sens1DR,Sens1DI],axis=3)
# CurLoc=CurLoc+H*W*nCh*2
SendSig=False
if SendSig:
NMap=GT.getparam('NMap')
Kd=GT.getparam('Kd')
nTraj=GT.getparam('nTraj')
TSBF=GT.getparam('TSBF')
SN=GT.getparam('SN')
sp_C=GT.getparam('sp_C')
# tmp1R=tf.slice(tmp,[0,CurLoc,0,0],[batch_size,nTraj*nCh,1,1])
# tmp1I=tf.slice(tmp,[0,CurLoc+nTraj*nCh,0,0],[batch_size,nTraj*nCh,1,1])
# SigC=tf.complex(tf.reshape(tmp1R,[batch_size,nCh,nTraj]),tf.reshape(tmp1I,[batch_size,nCh,nTraj]))
# SigC=tf.transpose(SigC,[0,2,1])
# CurLoc=CurLoc+nTraj*nCh*2
SigC,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(nCh,nTraj))
CurLoc=CurLoc+ToAddToLoc
SigC=tf.transpose(SigC,[0,2,1])
Reg=tf.gather(SigC,NMap,validate_indices=None,name=None,axis=1) #batch_size, H, W, nNeighbors, nCh
initR = GT._glorot_initializer_g((1,H,W,nNeighbors,1,nTS), stddev_factor=2)
initI = GT._glorot_initializer_g((1,H,W,nNeighbors,1,nTS), stddev_factor=2)
weightR = tf.get_variable("PerChannelKernelR", initializer=initR)
weightI = tf.get_variable("PerChannelKernelI", initializer=initI)
weightC=tf.complex(weightR,weightI)
Reg=GT.TF_5d_to_6d(Reg)
Res=tf.reduce_sum(tf.multiply(Reg,weightC),axis=3) # batch_size, H, W, nCh, nTS
HalfH=H//2 # integer division so the tf.range bounds stay integral under Python 3
HalfW=W//2
IdH=tf.concat([tf.range(HalfH,H), tf.range(0,HalfH)],axis=0)
IdH=tf.cast(IdH,tf.int32)
IdW=tf.concat([tf.range(HalfW,W), tf.range(0,HalfW)],axis=0)
IdW=tf.cast(IdW,tf.int32)
C = tf.gather(Res,IdH,axis=1)
C = tf.gather(C,IdW,axis=2)
BeforeFT=tf.transpose(C,[0,3,4,1,2]) # so batch_size,nCh,nTS,H,W
BeforeFT=tf.transpose(BeforeFT,[0,1,2,4,3]) # so batch_size,nCh,nTS,W,H
AfterFT=tf.ifft2d(BeforeFT)
AfterFT=tf.transpose(AfterFT,[0,3,4,1,2]) # so batch_size,H,W,nCh,nTS
y0=tf.reduce_sum( tf.multiply(AfterFT, tf.conj(SensC5)),axis=3) # batch_size,H,W,nTS
else:
# y01DR=tf.slice(tmp,[0, CurLoc,0,0],[batch_size,H*W*nTSC*1,1,1])
# y01DI=tf.slice(tmp,[0, CurLoc+H*W*nTSC,0,0],[batch_size,H*W*nTSC*1,1,1])
# y0R=tf.reshape(y01DR,[batch_size,H,W,nTSC])
# y0I=tf.reshape(y01DI,[batch_size,H,W,nTSC])
# y0=tf.complex(y0R,y0I) # batch_size, H, W, nTSC
# CurLoc=CurLoc+H*W*nTSC*2
y0,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC))
CurLoc=CurLoc+ToAddToLoc
paddingsY=GT.getparam('paddingsY')
fftkernc5=GT.getparam('fftkernc5')
# AHA_ITS_1DR=tf.slice(tmp,[0, CurLoc,0,0],[batch_size,H*W*nTSC*1,1,1])
# AHA_ITS_1DI=tf.slice(tmp,[0, CurLoc+H*W*nTSC,0,0],[batch_size,H*W*nTSC*1,1,1])
# AHA_ITSR=tf.reshape(AHA_ITS_1DR,[batch_size,H,W,nTSC])
# AHA_ITSI=tf.reshape(AHA_ITS_1DI,[batch_size,H,W,nTSC])
# AHA_ITS=tf.complex(AHA_ITSR,AHA_ITSI) # batch_size, H, W, nTSC
# CurLoc=CurLoc+H*W*nTSC*2
AHA_ITS,ToAddToLoc=ReadCFrom1D(tmp,CurLoc,(H,W,nTSC))
CurLoc=CurLoc+ToAddToLoc
AHA_ITSRI=GT.ConcatCOnDim(AHA_ITS,3)
y0RI=GT.ConcatCOnDim(y0,3)
# Sens5 is batch_size, H, W, nCh,1
# Sens5 should be H,W,1,nCh,batch_size
SensC5=tf.transpose(SensC5,[1,2,4,3,0])
model.add_PutInOutput(y0RI)
# model.add_concat(XY)
# model.add_concat(SensRI)
UseBN=GT.getparam('UseBN')
# model.add_ConvNetFromListWithNameAndScope(myParams.myDict['ISide1Net'],name='Net',scope='ConvNet',UseBN=UseBN,AddDirectConnection=True)
Iters=GT.getparam('Iterations')
nIter=Iters.shape[0]
# nIter=1
def RItoCon5(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0,0],[-1,-1,-1,-1,1]),tf.slice(X,[0,0,0,0,1],[-1,-1,-1,-1,1])),axis=4)
for Iter in range(0, nIter):
CurEstRI=model.get_output() # batch_size,H,W,2*nTS
CurEstRI=tf.reshape(CurEstRI,[batch_size,H,W,2,nTS])
CurEstRI=tf.transpose(CurEstRI,[0,1,2,4,3])
CurEst=RItoCon5(CurEstRI) # batch_size,H,W,nTSC
# InImage is batch_size,H,W,nTSC
AHA_CurEst=GT.TS_NUFFT_OPHOP_ITS(CurEst,SensC5,H,W,1,paddingsY,nTSC,nCh,fftkernc5)
AHA_CurEstRI=GT.ConcatCOnDim(AHA_CurEst,3)
model.add_concat(AHA_ITSRI)
model.add_concat(AHA_CurEstRI)
model.add_ConvNetFromListWithNameAndScope( myParams.myDict['ISide1Net'],name='Net'+str(Iters[Iter]),scope='ConvNet',UseBN=UseBN,AddDirectConnection=True) # , stddev_factor=0.3
# model.add_ConvNetFromListWithNameAndScope( myParams.myDict['ISide1Net'],name='Net',scope='ConvNet',UseBN=UseBN,AddDirectConnection=True) # , stddev_factor=0.3
# test to show warm start
# t0= tf.get_variable('t0', initializer=tf.cast(1.0,tf.float32))
# model.add_PutInOutput(y0RI+t0*0)
# now reshape images
Finaln_ITS_RI=model.get_output() # batch_size,H,W,2*nTS
Finaln_ITS_RI=tf.transpose(Finaln_ITS_RI,[0,1,3,2]) # batch_size,H,2*nTS,W
Finaln_ITS=tf.reshape(Finaln_ITS_RI,[batch_size,H,2,W*nTS])
Finaln_ITS=tf.transpose(Finaln_ITS,[0,1,3,2]) # batch_size,H,W*nTS,2
model.add_PutInOutput(Finaln_ITS)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'ISTA_WithB0T2S_Try1':
print("ISTA_WithB0T2S_Try1 mode")
def tfrm(X): return tf.reduce_mean(tf.abs(X))
UseBN=GT.getparam('UseBN')
Iters=GT.getparam('Iterations')
nIter=Iters.shape[0]
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
nTS=myParams.myDict['nTimeSegments']
nTSI=myParams.myDict['nTimeSegmentsI']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
nccInData=myParams.myDict['nccInData']
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
DataH=myParams.myDict['DataH']
DataW=myParams.myDict['DataW']
LabelsH=myParams.myDict['LabelsH']
LabelsW=myParams.myDict['LabelsW']
H=LabelsH
W=LabelsW
achannelsIn=ncc*nNeighbors*2
# BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
# NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
# Traj=NUFTData['Trajm2'][0:2,:]
# Kd=NUFTData['Kd']
# P=NUFTData['P']
# SN=NUFTData['SN']
# NMapCR=GT.GenerateNeighborsMap(Traj,kMax,aDataH,nccInData,ncc,nNeighbors)
# NMapCR = tf.constant(NMapCR)
# nTraj=Traj.shape[1]
nCh=nccInData
nTSC=GT.getparam('nTimeSegments')
Kd=GT.getparam('Kd')
nTraj=GT.getparam('nTraj')
TSBF=GT.getparam('TSBF')
SN=GT.getparam('SN')
sp_C=GT.getparam('sp_C')
ToPad=[Kd[0,0]-H,Kd[0,1]-W]
paddings = tf.constant([[0, ToPad[0]], [0, ToPad[1]],[0,0]])
paddingsX=tf.gather(paddings,[0,1,2],axis=0)
paddingsY=tf.gather(paddings,[0,1,2,2,2],axis=0)
SNc=tf.stack([tf.stack([tf.stack([tf.constant(SN,dtype=tf.complex64)],axis=2)],axis=3)],axis=4)
# Idx=scipy.sparse.find(P)
# I2=np.vstack([Idx[0],Idx[1]]).T
# I2=tf.constant(np.int64(I2))
# ValC=tf.constant(np.complex64(Idx[2]))
# sp_C = tf.SparseTensor(I2, ValC, [P.shape[0],P.shape[1]])
# BaseTSDataP=GT.getparam('BaseTSDataP')
# B0Data=scipy.io.loadmat(BaseTSDataP + 'B0TS.mat')
# TSBF=B0Data['TSBF']
TSBFX=np.transpose(np.reshape(TSBF,(nTSC,1,nTraj)),axes=(2,0,1))
TSBFX=tf.constant(np.complex64(TSBFX))
TSBFXc=tf.stack([TSBFX],axis=3)
tmp=model.get_output()
# tmp1=tf.slice(tmp,[0,0,0,0],[batch_size,132964,1,1])
TSCR=tf.slice(tmp,[0,nTraj*nCh*2,0,0],[batch_size,H*W*nTSC*1,1,1])
TSCI=tf.slice(tmp,[0,nTraj*nCh*2+H*W*nTSC*1,0,0],[batch_size,H*W*nTSC*1,1,1])
SensR=tf.slice(tmp,[0,nTraj*nCh*2+H*W*nTSC*2,0,0],[batch_size,H*W*nCh*1,1,1])
SensI=tf.slice(tmp,[0,nTraj*nCh*2+H*W*nTSC*2+H*W*nCh,0,0],[batch_size,H*W*nCh*1,1,1])
TSCR4=tf.reshape(TSCR,[batch_size,H,W,nTSC])
TSCI4=tf.reshape(TSCI,[batch_size,H,W,nTSC])
TSCR=tf.stack([TSCR4],axis=4)
TSCI=tf.stack([TSCI4],axis=4)
# TSCR=tf.reshape(TSCR,[batch_size,H,W,nTSC,1])
# TSCI=tf.reshape(TSCI,[batch_size,H,W,nTSC,1])
SensR=tf.reshape(SensR,[batch_size,H,W,nCh,1])
SensI=tf.reshape(SensI,[batch_size,H,W,nCh,1])
TSCRI=tf.concat([TSCR,TSCI],axis=4)
SensRI=tf.concat([SensR,SensI],axis=4)
TSCRI4=tf.concat([TSCR4,TSCI4],axis=3)
TSCc=tf.complex(TSCR,TSCI)
Sensc=tf.transpose(tf.complex(SensR,SensI),[0,1,2,4,3])
tmp1R=tf.slice(tmp,[0,0,0,0],[batch_size,nTraj*nCh,1,1])
tmp1I=tf.slice(tmp,[0,nTraj*nCh,0,0],[batch_size,nTraj*nCh,1,1])
SigC=tf.complex(tf.reshape(tmp1R,[batch_size,nCh,nTraj]),tf.reshape(tmp1I,[batch_size,nCh,nTraj]))
SigC=tf.transpose(SigC,[0,2,1])
# model.add_PutInOutput(tmp1)
# model.add_Permute([1,0,2,3]) # now we're 133068,16,1,1
# TSCrepEst0=tf.ones_like(y0,dtype=tf.complex64)
TSCrepEst0=tf.ones([batch_size,H,W,1,1],dtype=tf.complex64)
PowN=tf.reshape(tf.cast(np.arange(0,nTSC),tf.complex64),[1,1,1,nTSC,1]) # [batch_size,H,W,nTSC,nCh]
# PowN4=tf.reshape(tf.cast(np.arange(0,nTSC),tf.complex64),[1,1,1,nTSC]) # [batch_size,H,W,nTSC,nCh]
TSCest0=tf.pow(TSCrepEst0,PowN)
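# TSCest0 raises a per-voxel complex base to the exponents 0..nTSC-1 (PowN),
# i.e. a geometric progression across time segments -- an exp(-t/T2* + i*w*t)
# style evolution when the base encodes one segment's decay and phase step.
# A NumPy sketch, assuming a scalar base purely for illustration:

```python
import numpy as np

nTSC = 4
base = 0.9 * np.exp(1j * 0.1)    # hypothetical per-segment decay + phase step
powers = np.arange(nTSC)         # the PowN exponents 0..nTSC-1
tsc = base ** powers             # geometric progression over segments

assert tsc[0] == 1.0                              # segment 0 is unmodified
assert np.allclose(np.abs(tsc), 0.9 ** powers)    # magnitude decays geometrically
```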
# TSCest0=tf.Print(TSCest0,[tfrm(TSCest0)],'TSCest0 ')
# TSCSens=tf.multiply(TSCc,Sensc)
# TSCSens=tf.transpose(TSCSens,[1,2,3,4,0])
TSCSens=tf.multiply(TSCest0,Sensc)
TSCSens=tf.transpose(TSCSens,[1,2,3,4,0])
# y0=GT.TS_NUFFT_OP_H(SigC,Sens,TSC,TSB,sp_C)
# SigC=tf.Print(SigC,[tfrm(SigC)],'SigC ')
# TSCSens=tf.Print(TSCSens,[tfrm(TSCSens)],'TSCSens ')
SigH=GT.TS_NUFFT_OP_H(SigC,TSCSens,SNc,H,W,batch_size,paddingsX,nTraj,nTSC,nCh,sp_C,TSBFXc, False)
# SigH=tf.Print(SigH,[tfrm(SigH)],'SigH ')
# print('SigH : ' + str(SigH.shape)) # [H,W, nTSC, nCh, batch_size]
TSC0=tf.reduce_sum(SigH,axis=3) # [H,W, nTSC, batch_size]
# print('TSC0a : ' + str(TSC0.shape)) # [H,W, nTSC, nCh, batch_size]
TSC0=tf.transpose(TSC0,[3,0,1,2]) # # [batch_size, H, W, nTSC]
# print('TSC0a : ' + str(TSC0.shape)) # [H,W, nTSC, nCh, batch_size]
y0=tf.reduce_sum(TSC0,axis=3) # [batch_size, H, W]
# print('y0a : ' + str(y0.shape)) # [H,W, nTSC, nCh, batch_size]
# print(y0.shape)
# y0=tf.Print(y0,[],'y0: ' + str(y0.shape))
t0= tf.get_variable('t0', initializer=tf.cast(1.0,tf.float32))
# y0=tf.Print(y0,[tfrm(y0)],'y0 ')
# y0=tf.multiply(tf.cast(t0,tf.complex64),y0)
# def ConcatCOnDim(X,dim): return tf.cast(tf.concat([tf.real(X),tf.imag(X)],axis=dim),tf.float32)
def ConcatCOnDimWithStack(X,dim): return tf.cast(tf.concat([tf.stack([tf.real(X)],axis=dim),tf.stack([tf.imag(X)],axis=dim)],axis=dim),tf.float32)
# y0RI=tf.concat([tf.stack([tf.real(y0)],axis=3),tf.stack([tf.imag(y0)],axis=3)],axis=3)
y0RI=ConcatCOnDimWithStack(y0,3)
# y0RI=tf.Print(y0RI,[tfrm(y0RI)],'y0RIa ')
y0RI=tf.multiply(t0,y0RI)
# y0RI=tf.Print(y0RI,[tfrm(y0RI)],'y0RIb ')
TSC0RI=GT.ConcatRIOn3(TSC0)
TSC0RI=TSC0RI*t0
# y0RI=tf.Print(y0RI,[tfrm(y0RI)],'y0RI ')
# TSC0RI=tf.Print(TSC0RI,[tfrm(TSC0RI)],'TSC0RI ')
model.add_PutInOutput(y0RI)
model.add_concat(TSC0RI)
# print(y0RI.shape)
# print(TSCRI4.shape)
# model.add_concat(TSCRI4)
model.add_ConvNetFromListWithNameAndScope(myParams.myDict['ISide1Net'],name='Net'+str(Iters[0]),scope='ConvNet',UseBN=UseBN)
# yNewDiff=model.get_output()
# y=y0RI+yNewDiff
# model.add_PutInOutput(y)
IAndTSC1x=model.get_output()
Ix=tf.slice(IAndTSC1x,[0,0,0,0],[-1,-1,-1,2])
TSC1x=tf.slice(IAndTSC1x,[0,0,0,2],[-1,-1,-1,2])
# Ix=tf.Print(Ix,[tfrm(Ix)],'Ix ')
# TSC1x=tf.Print(TSC1x,[tfrm(TSC1x)],'TSC1x ')
# model.add_sum(y0RI)
model.add_PutInOutput(y0RI+Ix)
model.add_concat( (tf.cast(1,tf.float32)+TSC1x) /100.0 )
fftkernTSF=scipy.io.loadmat('/media/a/H2/home/a/gUM/fftkernTS.mat')
fftkernTS=fftkernTSF['fftkernTS']
fftkernTS=tf.constant(fftkernTS)
fftkernc=tf.cast(fftkernTS,tf.complex64)
fftkernc5D=GT.TF_3d_to_5d(fftkernc)
# ConcatInsteadOfAdd=False
ConcatInsteadOfAdd=True
def RItoCon4(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0],[batch_size,H,W,1]),tf.slice(X,[0,0,0,1],[batch_size,H,W,1])))
for Iter in range(1, nIter):
# y1=model.get_output()
IAndTSC11=model.get_output()
# IAndTSC11=tf.Print(IAndTSC11,[tfrm(IAndTSC11)],'IAndTSC11 ')
y1=tf.slice(IAndTSC11,[0,0,0,0],[-1,-1,-1,2])
TSC11=tf.slice(IAndTSC11,[0,0,0,2],[-1,-1,-1,2])
# y=tf.squeeze(tf.complex(tf.slice(y1,[0,0,0,0],[batch_size,H,W,1]),tf.slice(y1,[0,0,0,1],[batch_size,H,W,1])))
# y=RItoCon4(y1)
y=RItoCon4(y1)
# y=tf.Print(y,[tfrm(y)],'y ')
TSC11c=RItoCon4(TSC11)
# TSC11c=tf.Print(TSC11c,[tfrm(TSC11c)],'TSC11c ')
def TFexpix(X): return tf.exp(tf.complex(tf.zeros_like(X),X))
TSC11cMag=tf.abs(TSC11c)
TSC11cPhi=tf.angle(TSC11c)
TSC11cMag=tf.minimum(TSC11cMag,1.0)
TSC11c=tf.cast(TSC11cMag,tf.complex64)*TFexpix(TSC11cPhi)
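# The three lines above project each complex value onto the closed unit
# disc: the magnitude is capped at 1 while the phase is preserved. A NumPy
# sketch of the same projection:

```python
import numpy as np

def clip_to_unit_disc(z):
    mag = np.minimum(np.abs(z), 1.0)      # cap the magnitude at 1
    phi = np.angle(z)                     # keep the phase untouched
    return mag * np.exp(1j * phi)

z = np.array([0.5 + 0.0j, 3.0 + 4.0j])
w = clip_to_unit_disc(z)
assert np.allclose(w[0], z[0])                       # inside the disc: unchanged
assert np.isclose(np.abs(w[1]), 1.0)                 # outside: magnitude clipped
assert np.isclose(np.angle(w[1]), np.angle(z[1]))    # phase preserved
```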
TSCest1=tf.pow(GT.TF_3d_to_5d(TSC11c),PowN)
# TSCest1=tf.Print(TSCest1,[tfrm(TSCest1)],'TSCest1 ')
TSCSens=tf.multiply(TSCest1,Sensc)
TSCSens=tf.transpose(TSCSens,[1,2,3,4,0])
# TSCSens=tf.Print(TSCSens,[tfrm(TSCSens)],'TSCSens ')
# y = tf.Print(y,[tfrm(y)],message='y ')
# SigCur=GT.TS_NUFFT_OP(y,Sens,TSC,TSB,sp_C)
# SigCur=GT.TS_NUFFT_OP(y,TSCSens,SNc,H,W,batch_size,paddingsX,nTraj,nTSC,nCh,sp_C,TSBFXc)
# # yPrime=GT.TS_NUFFT_OP_H(SigCur,Sens,TSC,TSB,sp_C)
# yPrime=GT.TS_NUFFT_OP_H(SigCur,TSCSens,SNc,H,W,batch_size,paddingsX,nTraj,nTSC,nCh,sp_C,TSBFXc)
yPrime=GT.TS_NUFFT_OPHOP(y,TSCSens,H,W,batch_size,paddingsY,nTSC,nCh,fftkernc5D,SumOver=False)
yPrime=yPrime/(H*W*2*2)
# yPrime=tf.Print(yPrime,[tfrm(yPrime)],'yPrimea ')
# yPrime is [H,W,nTSC,nCh,batch_size]
yPrime=tf.reduce_sum(yPrime,axis=3) # [H,W,nTSC,batch_size]
TSCPrime=tf.transpose(yPrime,[3,0,1,2]) # [batch_size,H,W,nTSC]
yPrime=tf.reduce_sum(TSCPrime,axis=3) # [batch_size,H,W]
# yPrime=tf.Print(yPrime,[tfrm(yPrime)],'yPrime ')
# yPrime = tf.Print(yPrime,[tfrm(yPrime)],message='yPrime ')
# yPrime=y
if ConcatInsteadOfAdd:
yPrimeRI=ConcatCOnDimWithStack(yPrime,3)
model.add_concat(yPrimeRI)
model.add_concat(y0RI)
yRI=y1
TSCPrimeRI=GT.ConcatCOnDim(TSCPrime,3)
TSCest1RI=GT.ConcatCOnDim(tf.squeeze(TSCest1,axis=4),3)
model.add_concat(TSCPrimeRI/100.0)
model.add_concat(TSCest1RI/100.0)
else:
with tf.variable_scope('tScope', reuse=tf.AUTO_REUSE):
tCur= tf.get_variable('t_Iter'+str(Iters[Iter]), initializer=tf.cast(-2.0,tf.float32))
Diff=yPrime-y0
DiffTSC=TSCPrime-TSC0
y=y+tf.multiply(tf.cast(tCur,tf.complex64),Diff)
TSCupdated=TSCest1+GT.TF_4d_to_5d(tf.multiply(tf.cast(tCur,tf.complex64),DiffTSC))
yRI=ConcatCOnDimWithStack(y,3)
TSCupdatedRI=GT.ConcatRIOn3(TSCupdated)
TSCupdatedRI=tf.squeeze(TSCupdatedRI,axis=4)
model.add_PutInOutput(yRI)
# print('yRI : ' + str(yRI.shape)) # [H,W, nTSC, nCh, batch_size]
# print('TSCupdatedRI : ' + str(TSCupdatedRI.shape)) # [H,W, nTSC, nCh, batch_size]
model.add_concat(TSCupdatedRI)
# model.add_concat(TSCRI4)
model.add_ConvNetFromListWithNameAndScope( myParams.myDict['ISide1Net'],name='Net'+str(Iters[Iter]),scope='ConvNet',UseBN=UseBN) # , stddev_factor=0.3
# model.add_sum(yRI)
IAndTSC12=model.get_output()
y2=tf.slice(IAndTSC12,[0,0,0,0],[-1,-1,-1,2])
TSC12=tf.slice(IAndTSC12,[0,0,0,2],[-1,-1,-1,2])
model.add_PutInOutput(y2+yRI)
# model.add_concat(TSCupdatedRI+TSC12)
# model.add_concat(TSC11+TSC12)
model.add_concat(TSC12)
IAndTSC1F=model.get_output()
yF=tf.slice(IAndTSC1F,[0,0,0,0],[-1,-1,-1,2])
TSC1F=tf.slice(IAndTSC1F,[0,0,0,2],[-1,-1,-1,2])
ResF=tf.concat([yF,TSC1F],axis=2)
model.add_PutInOutput(ResF)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'ISTA_Try1':
print("ISTA_Try1 mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
nTS=myParams.myDict['nTimeSegments']
nTSI=myParams.myDict['nTimeSegmentsI']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
nccInData=myParams.myDict['nccInData']
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
DataH=myParams.myDict['DataH']
DataW=myParams.myDict['DataW']
LabelsH=myParams.myDict['LabelsH']
LabelsW=myParams.myDict['LabelsW']
H=LabelsH
W=LabelsW
achannelsIn=ncc*nNeighbors*2
BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
Traj=NUFTData['Trajm2'][0:2,:]
Kd=NUFTData['Kd']
P=NUFTData['P']
SN=NUFTData['SN']
NMapCR=GT.GenerateNeighborsMap(Traj,kMax,aDataH,nccInData,ncc,nNeighbors)
NMapCR = tf.constant(NMapCR)
nTraj=Traj.shape[1]
nCh=nccInData
nTSC=12
ToPad=[Kd[0,0]-H,Kd[0,1]-W]
paddings = tf.constant([[0, ToPad[0]], [0, ToPad[1]],[0,0]])
paddingsX=tf.gather(paddings,[0,1,2],axis=0)
paddingsY=tf.gather(paddings,[0,1,2,2,2],axis=0)
SNc=tf.stack([tf.stack([tf.stack([tf.constant(SN,dtype=tf.complex64)],axis=2)],axis=3)],axis=4)
Idx=scipy.sparse.find(P)
I2=np.vstack([Idx[0],Idx[1]]).T
I2=tf.constant(np.int64(I2))
ValC=tf.constant(np.complex64(Idx[2]))
sp_C = tf.SparseTensor(I2, ValC, [P.shape[0],P.shape[1]])
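# scipy.sparse.find returns the (row, col, value) triplets of the
# interpolation matrix P, which are stacked into I2 and handed to
# tf.SparseTensor above. The triplet extraction can be sanity-checked in
# SciPy alone, with a hypothetical small matrix standing in for P:

```python
import numpy as np
import scipy.sparse

# Hypothetical small interpolation matrix standing in for P
P = scipy.sparse.csr_matrix(np.array([[0.0, 2.0], [3.0, 0.0]]))
rows, cols, vals = scipy.sparse.find(P)

# Rebuild the matrix from the triplets, as tf.SparseTensor does implicitly
I2 = np.vstack([rows, cols]).T
dense = np.zeros(P.shape)
dense[I2[:, 0], I2[:, 1]] = vals
assert np.array_equal(dense, P.toarray())
```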
BaseTSDataP=GT.getparam('BaseTSDataP')
B0Data=scipy.io.loadmat(BaseTSDataP + 'B0TS.mat')
TSBF=B0Data['TSBF']
TSBFX=np.transpose(np.reshape(TSBF,(nTSC,1,nTraj)),axes=(2,0,1))
TSBFX=tf.constant(np.complex64(TSBFX))
TSBFXc=tf.stack([TSBFX],axis=3)
tmp=model.get_output()
# tmp1=tf.slice(tmp,[0,0,0,0],[batch_size,132964,1,1])
TSCR=tf.slice(tmp,[0,nTraj*nCh*2,0,0],[batch_size,H*W*nTSC*1,1,1])
TSCI=tf.slice(tmp,[0,nTraj*nCh*2+H*W*nTSC*1,0,0],[batch_size,H*W*nTSC*1,1,1])
SensR=tf.slice(tmp,[0,nTraj*nCh*2+H*W*nTSC*2,0,0],[batch_size,H*W*nCh*1,1,1])
SensI=tf.slice(tmp,[0,nTraj*nCh*2+H*W*nTSC*2+H*W*nCh,0,0],[batch_size,H*W*nCh*1,1,1])
TSCR4=tf.reshape(TSCR,[batch_size,H,W,nTSC])
TSCI4=tf.reshape(TSCI,[batch_size,H,W,nTSC])
TSCR=tf.stack([TSCR4],axis=4)
TSCI=tf.stack([TSCI4],axis=4)
# TSCR=tf.reshape(TSCR,[batch_size,H,W,nTSC,1])
# TSCI=tf.reshape(TSCI,[batch_size,H,W,nTSC,1])
SensR=tf.reshape(SensR,[batch_size,H,W,nCh,1])
SensI=tf.reshape(SensI,[batch_size,H,W,nCh,1])
TSCRI=tf.concat([TSCR,TSCI],axis=4)
SensRI=tf.concat([SensR,SensI],axis=4)
TSCRI4=tf.concat([TSCR4,TSCI4],axis=3)
TSCc=tf.complex(TSCR,TSCI)
Sensc=tf.transpose(tf.complex(SensR,SensI),[0,1,2,4,3])
TSCSens=tf.multiply(TSCc,Sensc)
TSCSens=tf.transpose(TSCSens,[1,2,3,4,0])
tmp1R=tf.slice(tmp,[0,0,0,0],[batch_size,nTraj*nCh,1,1])
tmp1I=tf.slice(tmp,[0,nTraj*nCh,0,0],[batch_size,nTraj*nCh,1,1])
SigC=tf.complex(tf.reshape(tmp1R,[batch_size,nCh,nTraj]),tf.reshape(tmp1I,[batch_size,nCh,nTraj]))
SigC=tf.transpose(SigC,[0,2,1])
# model.add_PutInOutput(tmp1)
# model.add_Permute([1,0,2,3]) # now we're 133068,16,1,1
# y0=GT.TS_NUFFT_OP_H(SigC,Sens,TSC,TSB,sp_C)  # older call signature; Sens/TSC/TSB are not defined in this scope
y0=GT.TS_NUFFT_OP_H(SigC,TSCSens,SNc,H,W,batch_size,paddingsX,nTraj,nTSC,nCh,sp_C,TSBFXc)
t0= tf.get_variable('t0', initializer=tf.cast(1.0,tf.float32))
# y0=tf.multiply(tf.cast(t0,tf.complex64),y0)
def ConcatCOnDim(X,dim): return tf.cast(tf.concat([tf.real(X),tf.imag(X)],axis=dim),tf.float32)
def ConcatCOnDimWithStack(X,dim): return tf.cast(tf.concat([tf.stack([tf.real(X)],axis=dim),tf.stack([tf.imag(X)],axis=dim)],axis=dim),tf.float32)
# y0RI=tf.concat([tf.stack([tf.real(y0)],axis=3),tf.stack([tf.imag(y0)],axis=3)],axis=3)
y0RI=ConcatCOnDimWithStack(y0,3)
y0RI=tf.multiply(t0,y0RI)
model.add_PutInOutput(y0RI)
model.add_concat(TSCRI4)
# model.print_shape(message="y0 ")
UseBN=GT.getparam('UseBN')
# UseBN=False
Iters=GT.getparam('Iterations')
nIter=Iters.shape[0]
# UseSameNet=True
# if UseSameNet:
model.add_ConvNetFromListWithNameAndScope(myParams.myDict['ISide1Net'],name='Net'+str(Iters[0]),scope='ConvNet',UseBN=UseBN)
# else:
# model.add_ConvNetFromList(myParams.myDict['ISide1Net'],UseBN=UseBN)
# yNewDiff=model.get_output()
# y=y0RI+yNewDiff
# model.add_PutInOutput(y)
model.add_sum(y0RI)
# add_batch_norm
# nIter=GT.getparam('nIterations')
# Iter=0
# UseSamet=True
# UseSamet=False
# if UseSamet:
# with tf.variable_scope('tScope', reuse=tf.AUTO_REUSE):
# tCur= tf.get_variable('t'+str(Iters[0]), initializer=tf.cast(-2.0,tf.float32))
fftkernTSF=scipy.io.loadmat('/media/a/H2/home/a/gUM/fftkernTS.mat')
fftkernTS=fftkernTSF['fftkernTS']
# print('asasd')
# print(np.mean(np.abs(fftkernTS)))
fftkernTS=tf.constant(fftkernTS)
fftkernc=tf.cast(fftkernTS,tf.complex64)
fftkernc5D=GT.TF_3d_to_5d(fftkernc)
ConcatInsteadOfAdd=False
def tfrm(X): return tf.reduce_mean(tf.abs(X))
def RItoCon4(X): return tf.squeeze(tf.complex(tf.slice(X,[0,0,0,0],[batch_size,H,W,1]),tf.slice(X,[0,0,0,1],[batch_size,H,W,1])))
for Iter in range(1, nIter):
    y1=model.get_output()
    # y=tf.squeeze(tf.complex(tf.slice(y1,[0,0,0,0],[batch_size,H,W,1]),tf.slice(y1,[0,0,0,1],[batch_size,H,W,1])))
    y=RItoCon4(y1)
    # y = tf.Print(y,[tfrm(y)],message='y ')
    # SigCur=GT.TS_NUFFT_OP(y,Sens,TSC,TSB,sp_C)
    # SigCur=GT.TS_NUFFT_OP(y,TSCSens,SNc,H,W,batch_size,paddingsX,nTraj,nTSC,nCh,sp_C,TSBFXc)
    # yPrime=GT.TS_NUFFT_OP_H(SigCur,Sens,TSC,TSB,sp_C)
    # yPrime=GT.TS_NUFFT_OP_H(SigCur,TSCSens,SNc,H,W,batch_size,paddingsX,nTraj,nTSC,nCh,sp_C,TSBFXc)
    yPrime=GT.TS_NUFFT_OPHOP(y,TSCSens,H,W,batch_size,paddingsY,nTSC,nCh,fftkernc5D)
    yPrime=yPrime/(H*W*2*2)
    # yPrime = tf.Print(yPrime,[tfrm(yPrime)],message='yPrime ')
    # yPrime=y
    if ConcatInsteadOfAdd:
        yPrimeRI=ConcatCOnDimWithStack(yPrime,3)
        model.add_concat(yPrimeRI)
        model.add_concat(y0RI)
        yRI=y1
    else:
        with tf.variable_scope('tScope', reuse=tf.AUTO_REUSE):
            tCur= tf.get_variable('t_Iter'+str(Iters[Iter]), initializer=tf.cast(-2.0,tf.float32))
        Diff=yPrime-y0
        y=y+tf.multiply(tf.cast(tCur,tf.complex64),Diff)
        yRI=ConcatCOnDimWithStack(y,3)
        model.add_PutInOutput(yRI)
    model.add_concat(TSCRI4)
    model.add_ConvNetFromListWithNameAndScope( myParams.myDict['ISide1Net'],name='Net'+str(Iters[Iter]),scope='ConvNet',UseBN=UseBN) # , stddev_factor=0.3
    model.add_sum(yRI)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
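The unrolled loop above is essentially a learned gradient (ISTA-like) update: starting from the back-projection `y0 = A^H s`, each iteration adds `tCur * (A^H A y - y0)` and then applies a conv-net refinement. A minimal NumPy sketch of just the data-consistency step, with a toy explicit matrix standing in for the TS-NUFFT operator and a fixed step in place of the learned `tCur`:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4))  # toy forward operator
s = rng.standard_normal(8) + 1j * rng.standard_normal(8)            # measured signal

y0 = A.conj().T @ s                 # back-projection, as TS_NUFFT_OP_H above
L = np.linalg.norm(A, 2) ** 2       # Lipschitz bound; the network learns tCur instead
y = y0.copy()
for _ in range(50):
    y_prime = A.conj().T @ (A @ y)  # A^H A y, as TS_NUFFT_OPHOP above
    y = y + (-1.0 / L) * (y_prime - y0)   # Diff = yPrime - y0; y += t * Diff
```

With a sane step size the residual `||A y - s||` shrinks relative to the plain back-projection, which is the behaviour each unrolled iteration is trained to reproduce.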
if myParams.myDict['NetMode'] == 'RegridTry3C2_TS':
print("RegridTry3C2_TS mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
# achannelsIn=myParams.myDict['achannelsIn']
nTS=myParams.myDict['nTimeSegments']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
# FullData=scipy.io.loadmat(myParams.myDict['NMAP_FN'])
# NMapCR=FullData['NMapCR']
# NMapCR = tf.constant(NMapCR)
nccInData=myParams.myDict['nccInData']
# ncc=8
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
achannelsIn=ncc*nNeighbors*2
# T=scipy.io.loadmat('/media/a/DATA/180628_AK/meas_MID244_gBP_VD11_U19_G35S155_4min_FID22439/Traj.mat')
# Traj=T['Traj'][0:2,:]
# BaseNUFTDataP='/media/a/DATA/13May18/Me/meas_MID409_gBP_VD11_U19_7ADCs_FID17798/'
# BaseNUFTDataP='/media/a/DATA/11Jul18/RL/meas_MID149_gBP_VD11_U19_G35S155_FID23846/'
BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
Traj=NUFTData['Trajm2'][0:2,:]
# T=scipy.io.loadmat('/media/a/DATA/13May18/Me/meas_MID409_gBP_VD11_U19_7ADCs_FID17798/TrajForNUFT.mat')
# Traj=T['Trajm2'][0:2,:]
NMapCR=GT.GenerateNeighborsMap(Traj,kMax,aDataH,nccInData,ncc,nNeighbors)
NMapCR = tf.constant(NMapCR)
# nNeighbors=myParams.myDict['nNeighbors']
# nccInData=myParams.myDict['nccInData']
# CurBartTraj=scipy.io.loadmat('/media/a/DATA/180628_AK/meas_MID244_gBP_VD11_U19_G35S155_4min_FID22439/Traj.mat')
# CurBartTraj=CurBartTraj['BARTTrajMS'][0:2,2:]
# osN=aDataH
# nNeighbors=nNeighbors
# NMap=np.zeros([osN,osN,nNeighbors],dtype='int32')
# C=GT.linspaceWithHalfStep(-kMax,kMax,osN)
# nChToUseInNN=nccInData
# ncc=nccInData
# nTrajAct=CurBartTraj.shape[1]
# for i in np.arange(0,osN):
# for j in np.arange(0,osN):
# CurLoc=np.vstack([C[i], C[j]])
# D=CurBartTraj-CurLoc
# R=np.linalg.norm(D,ord=2,axis=0)/np.sqrt(2)
# Idx=np.argsort(R)
# NMap[i,j,:]=Idx[0:nNeighbors]
# a=np.reshape(np.arange(0,nChToUseInNN)*nTrajAct,(1,1,1,nChToUseInNN))
# NMapC=np.reshape(NMap,(NMap.shape[0],NMap.shape[1],NMap.shape[2],1))+a
# NMapC=np.transpose(NMapC,(0,1,2,3))
# NMapCX=np.reshape(NMapC,(osN,osN,nNeighbors*nChToUseInNN))
# NMapCR=np.concatenate((NMapCX,NMapCX+nTrajAct*ncc),axis=2)
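The commented-out block above is the reference construction behind `GT.GenerateNeighborsMap`: for every point of an `osN x osN` Cartesian k-space grid, record the indices of the `nNeighbors` nearest trajectory samples. A compact runnable distillation (the grid construction is an assumption — `linspaceWithHalfStep` is taken to be a half-step-offset linspace — and the `/sqrt(2)` scaling is dropped since it does not change the ordering):

```python
import numpy as np

def neighbors_map(traj, kmax, osn, n_neighbors):
    # grid coordinates with a half-step offset (assumed linspaceWithHalfStep)
    c = np.linspace(-kmax, kmax, osn, endpoint=False) + kmax / osn
    nmap = np.zeros((osn, osn, n_neighbors), dtype=np.int32)
    for i in range(osn):
        for j in range(osn):
            d = traj - np.array([[c[i]], [c[j]]])   # 2 x nTraj offsets
            r = np.linalg.norm(d, axis=0)           # distance to every sample
            nmap[i, j] = np.argsort(r)[:n_neighbors]
    return nmap
```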
# model.print_shape()
# model.add_Reshape([16*133068])
model.add_Permute([1,0,2,3]) # now we're 133068,16,1,1
# model.print_shape()
feature=model.get_output()
feature=tf.gather(feature,NMapCR,validate_indices=None,name=None) # After 131,131,192,16
# feature = tf.reshape(feature, [aDataH, aDataW, achannelsIn])
model.add_PutInOutput(feature)
model.add_Permute([3,0,1,2,4,5]) # After 16,131,131,192,1,1
# model.print_shape()
model.add_Reshape([batch_size,aDataH,aDataW,achannelsIn]) # After 16,131,131,192
model.add_Split4thDim(2) # Now we're kH,kW, Neighbors(12)*Channels(8),2
# model.add_PixelwiseMultC(nTS, stddev_factor=1.0) # After we're batch_size,kH,kW,nTS
InitForRC=[]
if myParams.myDict['InitForRFN'] != 'None':
    InitForRM=scipy.io.loadmat(myParams.myDict['InitForRFN'])
    InitForRR=InitForRM['gene_GEN_L007_PixelwiseMultC_weightR_0']
    InitForRI=InitForRM['gene_GEN_L007_PixelwiseMultC_weightI_0']
    InitForRC=InitForRR + 1j * InitForRI
model.add_PixelwiseMultC(nTS, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForRC)
# model.print_shape('aaa')
model.add_2DFT()
# MM=GT.gDFT_matrix(np.linspace(-kMax,kMax,aDataH),H)
# MM=np.transpose(MM,axes=[1,0])
# if UseSharedWightesInRelaxedFT:
# model.add_Mult2DMCyCSharedOverFeat(W,1,add_bias=addBias,Trainable=RelaxedFT,InitC=MM)
# model.add_Mult2DMCxCSharedOverFeat(H,1,add_bias=addBias,Trainable=RelaxedFT,InitC=MM)
# else:
# model.add_Mult2DMCyC(W,1,add_bias=addBias)
# model.add_Mult2DMCxC(H,1,add_bias=addBias)
# now supposedly batch_size,H,W,nTS
# model.add_PixelwiseMultC(1, stddev_factor=1.0) # This collecting the different TS to the final image.
InitForLC=[]
if myParams.myDict['InitForLFN'] != 'None':
    InitForLM=scipy.io.loadmat(myParams.myDict['InitForLFN'])
    InitForLR=InitForLM['gene_GEN_L010_PixelwiseMultC_weightR_0']
    InitForLI=InitForLM['gene_GEN_L010_PixelwiseMultC_weightI_0']
    InitForLC=InitForLR + 1j * InitForLI
model.add_PixelwiseMultC(1, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForLC) # This collects the different time segments into the final image.
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'RegridTry3C2_TSME':
print("RegridTry3C2_TSME mode")
# WhichEchosToRec=GT.getparam('WhichEchosToRec')
# nEchos=WhichEchosToRec.shape[0]
TimePointsForRec_ms=GT.getparam('TimePointsForRec_ms')
nEchos=TimePointsForRec_ms.shape[0]
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
# achannelsIn=myParams.myDict['achannelsIn']
nTS=myParams.myDict['nTimeSegments']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
nccInData=myParams.myDict['nccInData']
# ncc=8
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
achannelsIn=ncc*nNeighbors*2
BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
Traj=NUFTData['Trajm2'][0:2,:]
NMapCR=GT.GenerateNeighborsMap(Traj,kMax,aDataH,nccInData,ncc,nNeighbors)
NMapCR = tf.constant(NMapCR)
model.add_Permute([1,0,2,3]) # now we're 133068,16,1,1
# model.print_shape()
feature=model.get_output()
feature=tf.gather(feature,NMapCR,validate_indices=None,name=None) # After 131,131,192,16
# feature = tf.reshape(feature, [aDataH, aDataW, achannelsIn])
model.add_PutInOutput(feature)
model.add_Permute([3,0,1,2,4,5]) # After 16,131,131,192,1,1
# model.print_shape()
model.add_Reshape([batch_size,aDataH,aDataW,achannelsIn]) # After 16,131,131,192
model.add_Split4thDim(2) # Now we're kH,kW, Neighbors(12)*Channels(8),2
# model.add_PixelwiseMultC(nTS, stddev_factor=1.0) # After we're batch_size,kH,kW,nTS
InitForRC=[]
if myParams.myDict['InitForRFN'] != 'None':
    InitForRM=scipy.io.loadmat(myParams.myDict['InitForRFN'])
    InitForRR=InitForRM['gene_GEN_L007_PixelwiseMultC_weightR_0']
    InitForRI=InitForRM['gene_GEN_L007_PixelwiseMultC_weightI_0']
    InitForRC=InitForRR + 1j * InitForRI
model.add_PixelwiseMultC(nTS, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForRC)
model.add_2DFT()
InitForLC=[]
if myParams.myDict['InitForLFN'] != 'None':
    InitForLM=scipy.io.loadmat(myParams.myDict['InitForLFN'])
    InitForLR=InitForLM['gene_GEN_L010_PixelwiseMultC_weightR_0']
    InitForLI=InitForLM['gene_GEN_L010_PixelwiseMultC_weightI_0']
    InitForLC=InitForLR + 1j * InitForLI
# model.add_PixelwiseMultC(1, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForLC) # This collecting the different TS to the final image.
model.add_PixelwiseMultC(nEchos, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForLC) # This collects the different TS to the final image.
# model.print_shape('AfterL')
model.add_Permute34()
model.add_Combine34(True)
# model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'RegridTry3C2_TS_MB':
print("RegridTry3C2_TS_MB mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
aDataW=myParams.myDict['aDataW']
# achannelsIn=myParams.myDict['achannelsIn']
nTS=myParams.myDict['nTimeSegments']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
nccInData=myParams.myDict['nccInData']
ncc=myParams.myDict['nccToUse']
nNeighbors=myParams.myDict['nNeighbors']
achannelsIn=ncc*nNeighbors*2
BaseNUFTDataP=myParams.myDict['BaseNUFTDataP']
NUFTData=scipy.io.loadmat(BaseNUFTDataP + 'TrajForNUFT.mat')
Traj=NUFTData['Trajm2'][0:2,:]
NMapCR=GT.GenerateNeighborsMap(Traj,kMax,aDataH,nccInData,ncc,nNeighbors)
NMapCR = tf.constant(NMapCR)
model.add_Permute([1,0,2,3]) # now we're 133068,16,1,1
# model.print_shape()
feature=model.get_output()
feature=tf.gather(feature,NMapCR,validate_indices=None,name=None) # After 131,131,192,16
# feature = tf.reshape(feature, [aDataH, aDataW, achannelsIn])
model.add_PutInOutput(feature)
model.add_Permute([3,0,1,2,4,5]) # After 16,131,131,192,1,1
# model.print_shape()
model.add_Reshape([batch_size,aDataH,aDataW,achannelsIn]) # After 16,131,131,192
if addBias:
    print("with bias")
else:
    print("without bias")
model.add_Split4thDim(2) # Now we're kH,kW, Neighbors(12)*Channels(8),2
# model.add_PixelwiseMultC(nTS, stddev_factor=1.0) # After we're batch_size,kH,kW,nTS
InitForRC=[]
if myParams.myDict['InitForRFN'] != 'None':
    InitForRM=scipy.io.loadmat(myParams.myDict['InitForRFN'])
    InitForRR=InitForRM['gene_GEN_L007_PixelwiseMultC_weightR_0']
    InitForRI=InitForRM['gene_GEN_L007_PixelwiseMultC_weightI_0']
    InitForRC=InitForRR + 1j * InitForRI
model.add_PixelwiseMultC(nTS, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForRC)
MM=GT.gDFT_matrix(np.linspace(-kMax,kMax,aDataH),H)
MM=np.transpose(MM,axes=[1,0])
if UseSharedWightesInRelaxedFT:
    model.add_Mult2DMCyCSharedOverFeat(W,1,add_bias=addBias,Trainable=RelaxedFT,InitC=MM)
    model.add_Mult2DMCxCSharedOverFeat(H,1,add_bias=addBias,Trainable=RelaxedFT,InitC=MM)
else:
    model.add_Mult2DMCyC(W,1,add_bias=addBias)
    model.add_Mult2DMCxC(H,1,add_bias=addBias)
# now supposedly batch_size,H,W,nTS
# ggg: 2 here is MB
# model.add_PixelwiseMultC(2, stddev_factor=1.0) # This collecting the different TS to the final image.
# model.print_shape('BeforeL')
InitForLC=[]
if myParams.myDict['InitForLFN'] != 'None':
    InitForLM=scipy.io.loadmat(myParams.myDict['InitForLFN'])
    InitForLR=InitForLM['gene_GEN_L010_PixelwiseMultC_weightR_0']
    InitForLI=InitForLM['gene_GEN_L010_PixelwiseMultC_weightI_0']
    InitForLC=InitForLR + 1j * InitForLI
model.add_PixelwiseMultC(2, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForLC) # This collects the different TS to the final image.
# model.print_shape('AfterL')
model.add_Permute34()
model.add_Combine34(True)
# model.print_shape('After Combine34')
# model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'RegridTry1C2_TS':
print("RegridTry1C2_TS mode")
aDataH=myParams.myDict['aDataH']
kMax=myParams.myDict['kMax']
nTS=myParams.myDict['nTimeSegments']
UseSharedWightesInRelaxedFT=myParams.myDict['UseSharedWightesInRelaxedFT']>0
RelaxedFT=myParams.myDict['RelaxedFT']>0
addBias=myParams.myDict['CmplxBias']>0
model.add_Split4thDim(2)
# model.add_PixelwiseMultC(nTS, stddev_factor=1.0) # After we're batch_size,kH,kW,nTS
InitForRC=[]
print("InitForRC...")
print(myParams.myDict['InitForRFN'])
if myParams.myDict['InitForRFN'] != 'None':
    print("InitForRC From file")
    InitForRM=scipy.io.loadmat(myParams.myDict['InitForRFN'])
    InitForRR=InitForRM['gene_GEN_L007_PixelwiseMultC_weightR_0']
    InitForRI=InitForRM['gene_GEN_L007_PixelwiseMultC_weightI_0']
    InitForRC=InitForRR + 1j * InitForRI
print("InitForRC...")
model.add_PixelwiseMultC(nTS, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForRC)
MM=GT.gDFT_matrix(np.linspace(-kMax,kMax,aDataH),H)
MM=np.transpose(MM,axes=[1,0])
if UseSharedWightesInRelaxedFT:
    model.add_Mult2DMCyCSharedOverFeat(W,1,add_bias=addBias,Trainable=RelaxedFT,InitC=MM)
    model.add_Mult2DMCxCSharedOverFeat(H,1,add_bias=addBias,Trainable=RelaxedFT,InitC=MM)
else:
    model.add_Mult2DMCyC(W,1,add_bias=addBias)
    model.add_Mult2DMCxC(H,1,add_bias=addBias)
# now supposedly batch_size,H,W,nTS
# model.add_PixelwiseMultC(1, stddev_factor=1.0) # This collecting the different TS to the final image.
# add_PixelwiseMultC(self, numOutChannels, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=[]):
InitForLC=[]
if myParams.myDict['InitForLFN'] != 'None':
    InitForLM=scipy.io.loadmat(myParams.myDict['InitForLFN'])
    InitForLR=InitForLM['gene_GEN_L010_PixelwiseMultC_weightR_0']
    InitForLI=InitForLM['gene_GEN_L010_PixelwiseMultC_weightI_0']
    InitForLC=InitForLR + 1j * InitForLI
model.add_PixelwiseMultC(1, stddev_factor=1.0,NamePrefix='',Trainable=True,InitC=InitForLC) # This collects the different time segments into the final image.
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
# if myParams.myDict['NetMode'] == 'RegridTry1C2_TS2': # Shared features in relaxed FT
# print("RegridTry1C2_TS mode")
# addBias=myParams.myDict['CmplxBias']>0
# if addBias:
# print("with bias")
# else:
# print("without bias")
# nTS=7
# model.add_Split4thDim(2)
# model.add_PixelwiseMultC(nTS, stddev_factor=1.0)
# model.add_Mult2DMCyCSharedOverFeat(W,1,add_bias=addBias)
# model.add_Mult2DMCxCSharedOverFeat(H,1,add_bias=addBias)
# model.add_PixelwiseMultC(1, stddev_factor=1.0)
# model.remove_5thDim()
# new_vars = tf.global_variables()
# gene_vars = list(set(new_vars) - set(old_vars))
# return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASHTry1':
print("SMASHTry1 mode")
addBias=myParams.myDict['CmplxBias']>0
model.add_PixelwiseMultC(2, stddev_factor=1.0)
model.add_Combine34()
model.add_Mult2DMCyC(W,1,add_bias=addBias)
model.add_Mult2DMCxC(H,1,add_bias=addBias)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASHTry1_CC':
print("SMASHTry1_CC mode")
addBias=myParams.myDict['CmplxBias']>0
# we're [Batch, kH,kW,AllChannels*Neighbors*RI]
model.add_Split4thDim(2) # Now [Batch, kH,kW,AllChannels*Neighbors,RI]
model.add_Mult2DMCxCSharedOverFeat(DataH, 1) # Now [Batch, H,kW,AllChannels*Neighbors,RI]
model.add_Split4thDim(6) # Now [Batch, H,kW,AllChannels,Neighbors,RI]
ncc=4
model.add_einsumC('abcde,dx->abcxe',[8, ncc])
model.add_Combine45(squeeze=True) # Now [Batch, H,kW,CompressedChannels*Neighbors,RI]
model.add_Mult2DMCxCSharedOverFeat(DataH, 1) # Now [Batch, kH,kW,CompressedChannels*Neighbors,RI]
model.add_PixelwiseMultC(2, stddev_factor=1.0)
model.add_Combine34()
model.add_Mult2DMCyC(W,1,add_bias=addBias)
model.add_Mult2DMCxC(H,1,add_bias=addBias)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
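The `add_einsumC('abcde,dx->abcxe', ...)` step above is a coil-compression: a learned `AllChannels x CompressedChannels` matrix contracted against the channel axis only, with all other axes untouched (the `_GCC` variant, `'abcde,bdx->abcxe'`, uses a separate matrix per row `b`). In NumPy, with illustrative shapes:

```python
import numpy as np

# a=batch, b=H, c=kW, d=AllChannels, e=RI; x=CompressedChannels
rng = np.random.default_rng(1)
x = rng.standard_normal((2, 4, 4, 8, 2))   # [Batch, H, kW, AllChannels, RI]
w = rng.standard_normal((8, 4))            # AllChannels -> CompressedChannels
y = np.einsum('abcde,dx->abcxe', x, w)     # contract only the channel axis
```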
if myParams.myDict['NetMode'] == 'SMASHTry1_GCC':
print("SMASHTry1_GCC mode")
addBias=myParams.myDict['CmplxBias']>0
# we're [Batch, kH,kW,AllChannels*Neighbors*RI]
model.add_Split4thDim(2) # Now [Batch, kH,kW,AllChannels*Neighbors,RI]
model.add_Mult2DMCxCSharedOverFeat(DataH, 1) # Now [Batch, H,kW,AllChannels*Neighbors,RI]
model.add_Split4thDim(6) # Now [Batch, H,kW,AllChannels,Neighbors,RI]
ncc=4
model.add_einsumC('abcde,bdx->abcxe',[DataH,8, ncc])
model.add_Combine45(squeeze=True) # Now [Batch, H,kW,CompressedChannels*Neighbors,RI]
model.add_Mult2DMCxCSharedOverFeat(DataH, 1) # Now [Batch, kH,kW,CompressedChannels*Neighbors,RI]
model.add_PixelwiseMultC(2, stddev_factor=1.0)
model.add_Combine34()
model.add_Mult2DMCyC(W,1,add_bias=addBias)
model.add_Mult2DMCxC(H,1,add_bias=addBias)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'SMASHTry1_GCCF':
print("SMASHTry1_GCCF mode")
addBias=myParams.myDict['CmplxBias']>0
# we're [Batch, kH,kW,AllChannels*Neighbors*RI]
model.add_Split4thDim(2) # Now [Batch, H,kW,AllChannels*Neighbors,RI]
model.add_Split4thDim(6) # Now [Batch, H,kW,AllChannels,Neighbors,RI]
ncc=4
model.add_einsumC('abcde,bdx->abcxe',[DataH,8, ncc])
model.add_Combine45(squeeze=True) # Now [Batch, H,kW,CompressedChannels*Neighbors,RI]
DFTM=DFT_matrix(DataH)
model.add_Mult2DMCxCSharedOverFeat(DataH, 1,add_bias=addBias,Trainable=False,InitC=DFTM) # Now [Batch, kH,kW,CompressedChannels*Neighbors,RI]
model.add_PixelwiseMultC(2, stddev_factor=1.0)
model.add_Combine34()
model.add_Mult2DMCyC(W,1,add_bias=addBias)
model.add_Mult2DMCxC(H,1,add_bias=addBias)
model.remove_5thDim()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'Conv_3Layers':
print("Conv_3Layers mode")
# model.print_shape()
model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2dWithName(32, name="ggg", mapsize=1, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2d(channelsOut, mapsize=5, stride=1, stddev_factor=2.)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'Unet_v1_ForB0':
b=np.array([[64,0,0,128,128,0,0,64],[128,0,0,256,256,0,0,128],[512,0,0,0,0,0,0,512]])
model.add_UnetKsteps(b, mapsize=mapsize, stride=2, stddev_factor=1e-3)
# OutChannels=labels.shape[3]
model.add_conv2d(channelsOut, mapsize=mapsize, stride=1, stddev_factor=2.)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'Conv_v1_ForB0':
print("Conv_v1_ForB0 mode")
# model.print_shape()
model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2dWithName(128, name="ggg", mapsize=mapsize, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2d(128, mapsize=mapsize, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2d(channelsOut, mapsize=mapsize, stride=1, stddev_factor=2.)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
if myParams.myDict['NetMode'] == 'Conv_v1':
print("Conv_v1 mode")
# model.print_shape()
model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2dWithName(128, name="ggg", mapsize=mapsize, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2d(128, mapsize=mapsize, stride=1, stddev_factor=2.)
model.add_elu()
model.add_conv2d(1, mapsize=mapsize, stride=1, stddev_factor=2.)
# SAE
SAE = myParams.myDict['NetMode'] == 'SAE'
if SAE:
    model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
    model.add_elu()
    model.add_conv2dWithName(128, name="AE", mapsize=mapsize, stride=1, stddev_factor=2.)
    model.add_conv2d(64, mapsize=mapsize, stride=1, stddev_factor=2.)
    model.add_elu()
    model.add_conv2d(channels, mapsize=7, stride=1, stddev_factor=2.)
    model.add_sigmoid()
# model.add_tanh()
# kKick:
kKick= myParams.myDict['NetMode'] == 'kKick'
if kKick:
    model.add_conv2d(64, mapsize=1, stride=1, stddev_factor=2.)
    model.add_elu()
    b=np.array([[64,0,0,128,128,0,0,64],[128,0,0,256,256,0,0,128],[512,0,0,0,0,0,0,512]])
    model.add_UnetKsteps(b, mapsize=mapsize, stride=2, stddev_factor=1e-3)
    model.add_conv2dWithName(50, name="AE", mapsize=3, stride=1, stddev_factor=2.)
    model.add_elu()
    model.add_conv2d(channels, mapsize=1, stride=1, stddev_factor=2.)
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
return model.get_output(), gene_vars
# AUTOMAP
# AUTOMAP_units = [64, 64, channels]
# AUTOMAP_mapsize = [5, 5, 7]
# ggg option 1: FC
# model.add_flatten() # FC1
# model.add_dense(num_units=H*W*2)
# model.add_reshapeTo4D(H,W)
TSRECON = myParams.myDict['NetMode'] == 'TSRECON'
if TSRECON:
    # ggg option 2: FC per channel, and then dot multiplication per pixel, then conv
    ChannelsPerCoil=myParams.myDict['NumFeatPerChannel']
    NumTotalFeat=myParams.myDict['NumTotalFeat']
    model.add_Mult2DMC(H*W,ChannelsPerCoil)
    model.add_reshapeTo4D(H, W)
    model.add_PixelwiseMult(NumTotalFeat, stddev_factor=1.0)
    model.add_elu()
    #model.add_denseFromM('piMDR')
    #model.add_reshapeTo4D(FLAGS.LabelsH,FLAGS.LabelsW)
    # #model.add_tanh() # FC2
    #model.add_Unet1Step(128, mapsize=5, stride=2, num_layers=2, stddev_factor=1e-3)
    #model.add_conv2d(channels, mapsize=5, stride=1, stddev_factor=2.)
    b=np.array([[64,0,0,128,128,0,0,64],[128,0,0,256,256,0,0,128],[512,0,0,0,0,0,0,512]])
    #b=np.array([[64,0,0,128,128,0,0,64],[128,0,0,256,256,0,0,128]])
    #b=np.array([[64,0,0,0,0,0,0,64]])
    model.add_UnetKsteps(b, mapsize=mapsize, stride=2, stddev_factor=1e-3)
    # model.add_conv2d(channels, mapsize=1, stride=1, stddev_factor=2.)
    # ggg: Autoencode
    model.add_conv2dWithName(50, name="AE", mapsize=3, stride=1, stddev_factor=2.)
    model.add_elu()
    # ggg: Finish
    model.add_conv2d(channels, mapsize=1, stride=1, stddev_factor=2.)
# #model.add_flatten()
# #model.add_dense(num_units=H*W*1)
# model.add_reshapeTo4D(FLAGS.LabelsH,FLAGS.LabelsW)
# #model.add_batch_norm()
# #model.add_tanh() # TC3
# # model.add_conv2d(AUTOMAP_units[0], mapsize=AUTOMAP_mapsize[0], stride=1, stddev_factor=2.)
# # model.add_batch_norm()
# #model.add_relu()
# #model.add_conv2d(AUTOMAP_units[1], mapsize=AUTOMAP_mapsize[1], stride=1, stddev_factor=2.)
# # model.add_batch_norm()
# #model.add_relu()
# #model.add_conv2d(AUTOMAP_units[2], mapsize=AUTOMAP_mapsize[2], stride=1, stddev_factor=2.)
# # model.add_conv2d(AUTOMAP_units[2], mapsize=1, stride=1, stddev_factor=2.)
# # model.add_relu()
#model.add_constMatMul()
#for ru in range(len(res_units)-1):
# nunits = res_units[ru]
# for j in range(2):
# model.add_residual_block(nunits, mapsize=mapsize)
# Spatial upscale (see http://distill.pub/2016/deconv-checkerboard/)
# and transposed convolution
# model.add_upscale()
# model.add_batch_norm()
# model.add_relu()
# model.add_conv2d_transpose(nunits, mapsize=mapsize, stride=1, stddev_factor=1.)
# model.add_flatten()
# model.add_dense(num_units=H*W*4)
# model.add_reshapeTo4D(FLAGS.LabelsH,FLAGS.LabelsW)
# #model.add_Mult2D()
# #model.add_Mult3DComplexRI()
SrezOrigImagePartModel=False
if SrezOrigImagePartModel:
    nunits = res_units[0]
    for j in range(2):
        model.add_residual_block(nunits, mapsize=mapsize)
    #model.add_upscale()
    model.add_batch_norm()
    model.add_relu()
    model.add_conv2d_transpose(nunits, mapsize=mapsize, stride=1, stddev_factor=1.)
    nunits = res_units[1]
    for j in range(2):
        model.add_residual_block(nunits, mapsize=mapsize)
    #model.add_upscale()
    model.add_batch_norm()
    model.add_relu()
    model.add_conv2d_transpose(nunits, mapsize=mapsize, stride=1, stddev_factor=1.)
    # Finalization a la "all convolutional net"
    nunits = res_units[-1]
    model.add_conv2d(nunits, mapsize=mapsize, stride=1, stddev_factor=2.)
    # Worse: model.add_batch_norm()
    model.add_relu()
    model.add_conv2d(nunits, mapsize=1, stride=1, stddev_factor=2.)
    # Worse: model.add_batch_norm()
    model.add_relu()
    # Last layer is sigmoid with no batch normalization
    model.add_conv2d(channels, mapsize=1, stride=1, stddev_factor=1.)
    model.add_sigmoid()
new_vars = tf.global_variables()
gene_vars = list(set(new_vars) - set(old_vars))
# ggg = tf.identity(model.get_output(), name="ggg")
return model.get_output(), gene_vars
# ----------------------------------------------------------------------------
# q2_phylogeny/_iqtree.py -- from the turanoo/q2-phylogeny repository
# ----------------------------------------------------------------------------
# Copyright (c) 2016-2018, QIIME 2 development team.
#
# Distributed under the terms of the Modified BSD License.
#
# The full license is in the file LICENSE, distributed with this software.
# ----------------------------------------------------------------------------
import os
import tempfile
from q2_types.feature_data import AlignedDNAFASTAFormat
from q2_types.tree import NewickFormat
from q2_phylogeny._raxml import run_command
_iqtree_defaults = {
    'seed': None,
    'n_cores': 1,
    'n_runs': 1,
    'substitution_model': 'MFP',
    'run_prefix': 'q2iqtree',
    'dtype': 'DNA',
    'n_init_pars_trees': None,
    'n_top_init_trees': None,
    'n_best_retain_trees': None,
    'n_iter': None,
    'stop_iter': None,
    'perturb_nni_strength': None,
    'spr_radius': None,
    'fast': False,
    'alrt': None,
    'abayes': False,
    'lbp': None,
    'allnni': False,
    'bnni': False,
    'safe': False,
    'bootstrap_replicates': 1000,
    'n_max_ufboot_iter': None,
    'n_ufboot_steps': None,
    'min_cor_ufboot': None,
    'ep_break_ufboot': None,
}
def _build_iqtree_command(
        alignment,
        seed: int = _iqtree_defaults['seed'],
        n_cores: int = _iqtree_defaults['n_cores'],
        n_runs: int = _iqtree_defaults['n_runs'],
        substitution_model: str = _iqtree_defaults['substitution_model'],
        run_prefix: str = _iqtree_defaults['run_prefix'],
        dtype: str = _iqtree_defaults['dtype'],
        n_init_pars_trees: int = _iqtree_defaults['n_init_pars_trees'],
        n_top_init_trees: int = _iqtree_defaults['n_top_init_trees'],
        n_best_retain_trees: int = _iqtree_defaults['n_best_retain_trees'],
        n_iter: int = _iqtree_defaults['n_iter'],
        stop_iter: int = _iqtree_defaults['stop_iter'],
        perturb_nni_strength: float = _iqtree_defaults['perturb_nni_strength'],
        spr_radius: int = _iqtree_defaults['spr_radius'],
        allnni: bool = _iqtree_defaults['allnni'],
        fast: bool = _iqtree_defaults['fast'],
        alrt: int = _iqtree_defaults['alrt'],
        abayes: bool = _iqtree_defaults['abayes'],
        lbp: int = _iqtree_defaults['lbp'],
        safe: bool = _iqtree_defaults['safe']):
    cmd = ['iqtree']
    cmd += ['-st', str(dtype),
            '--runs', '%i' % n_runs,
            '-s', str(alignment),
            '-m', str(substitution_model),
            '-pre', str(run_prefix)]
    if n_cores == 0:
        cmd += ['-nt', 'AUTO']
    else:
        cmd += ['-nt', '%i' % n_cores]
    if seed:
        cmd += ['-seed', '%i' % seed]
    if safe:
        cmd += ['-safe']
    if fast:
        cmd += ['-fast']
    if alrt:
        cmd += ['-alrt', '%i' % alrt]
    if abayes:
        cmd += ['-abayes']
    if lbp:
        cmd += ['-lbp', '%i' % lbp]
    if allnni:
        cmd += ['-allnni']
    if n_init_pars_trees:
        cmd += ['-ninit', '%i' % n_init_pars_trees]
    if n_top_init_trees:
        cmd += ['-ntop', '%i' % n_top_init_trees]
    if n_best_retain_trees:
        cmd += ['-nbest', '%i' % n_best_retain_trees]
    if n_iter:
        cmd += ['-n', '%i' % n_iter]
    if stop_iter:
        cmd += ['-nstop', '%i' % stop_iter]
    if perturb_nni_strength:
        cmd += ['-pers', '%f' % perturb_nni_strength]
    if spr_radius:
        cmd += ['-sprrad', '%i' % spr_radius]
    return cmd
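`_build_iqtree_command` follows a simple conditional flag-assembly pattern: required flags are always appended, optional flags only when their value is set (with `n_cores == 0` mapped to IQ-TREE's `AUTO`). A reduced standalone sketch of the pattern, covering only a subset of the flags:

```python
# Minimal sketch of the flag-assembly pattern above (subset of flags,
# illustrative only).
def build_cmd(alignment, n_cores=1, seed=None, fast=False):
    cmd = ['iqtree', '-s', str(alignment)]
    # 0 cores means "let IQ-TREE decide" via AUTO
    cmd += ['-nt', 'AUTO' if n_cores == 0 else '%i' % n_cores]
    if seed is not None:
        cmd += ['-seed', '%i' % seed]
    if fast:
        cmd += ['-fast']
    return cmd
```

With the defaults this yields `['iqtree', '-s', 'aln.fasta', '-nt', '1']`; keeping the command as a list (rather than a shell string) avoids quoting issues when it is handed to `run_command`.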
def iqtree(
        alignment: AlignedDNAFASTAFormat,
        seed: int = _iqtree_defaults['seed'],
        n_cores: int = _iqtree_defaults['n_cores'],
        n_runs: int = _iqtree_defaults['n_runs'],
        substitution_model: str = _iqtree_defaults['substitution_model'],
        n_init_pars_trees: int = _iqtree_defaults['n_init_pars_trees'],
        n_top_init_trees: int = _iqtree_defaults['n_top_init_trees'],
        n_best_retain_trees: int = _iqtree_defaults['n_best_retain_trees'],
        n_iter: int = _iqtree_defaults['n_iter'],
        stop_iter: int = _iqtree_defaults['stop_iter'],
        perturb_nni_strength: float = _iqtree_defaults['perturb_nni_strength'],
        spr_radius: int = _iqtree_defaults['spr_radius'],
        allnni: bool = _iqtree_defaults['allnni'],
        fast: bool = _iqtree_defaults['fast'],
        alrt: int = _iqtree_defaults['alrt'],
        abayes: bool = _iqtree_defaults['abayes'],
        lbp: int = _iqtree_defaults['lbp'],
        safe: bool = _iqtree_defaults['safe'],
) -> NewickFormat:
    result = NewickFormat()
    with tempfile.TemporaryDirectory() as temp_dir:
        run_prefix = os.path.join(temp_dir, 'q2iqtree')
        cmd = _build_iqtree_command(alignment,
                                    seed=seed,
                                    n_cores=n_cores,
                                    n_runs=n_runs,
                                    substitution_model=substitution_model,
                                    run_prefix=run_prefix,
                                    n_init_pars_trees=n_init_pars_trees,
                                    n_top_init_trees=n_top_init_trees,
                                    n_best_retain_trees=n_best_retain_trees,
                                    n_iter=n_iter,
                                    stop_iter=stop_iter,
                                    perturb_nni_strength=perturb_nni_strength,
                                    spr_radius=spr_radius,
                                    allnni=allnni,
                                    fast=fast,
                                    alrt=alrt,
                                    abayes=abayes,
                                    lbp=lbp,
                                    safe=safe)
        run_command(cmd)
        tree_tmp_fp = os.path.join(temp_dir, '%s.treefile' % run_prefix)
        os.rename(tree_tmp_fp, str(result))
    return result


def _build_iqtree_ufbs_command(
        alignment,
        seed: int = _iqtree_defaults['seed'],
        n_cores: int = _iqtree_defaults['n_cores'],
        n_runs: int = _iqtree_defaults['n_runs'],
        substitution_model: str = _iqtree_defaults['substitution_model'],
        bootstrap_replicates: int = _iqtree_defaults['bootstrap_replicates'],
        run_prefix: str = _iqtree_defaults['run_prefix'],
        dtype: str = _iqtree_defaults['dtype'],
        n_init_pars_trees: int = _iqtree_defaults['n_init_pars_trees'],
        n_top_init_trees: int = _iqtree_defaults['n_top_init_trees'],
        n_best_retain_trees: int = _iqtree_defaults['n_best_retain_trees'],
        stop_iter: int = _iqtree_defaults['stop_iter'],
        perturb_nni_strength: float = _iqtree_defaults['perturb_nni_strength'],
        spr_radius: int = _iqtree_defaults['spr_radius'],
        n_max_ufboot_iter: int = _iqtree_defaults['n_max_ufboot_iter'],
        n_ufboot_steps: int = _iqtree_defaults['n_ufboot_steps'],
        min_cor_ufboot: float = _iqtree_defaults['min_cor_ufboot'],
        ep_break_ufboot: float = _iqtree_defaults['ep_break_ufboot'],
        allnni: bool = _iqtree_defaults['allnni'],
        alrt: int = _iqtree_defaults['alrt'],
        abayes: bool = _iqtree_defaults['abayes'],
        lbp: int = _iqtree_defaults['lbp'],
        bnni: bool = _iqtree_defaults['bnni'],
        safe: bool = _iqtree_defaults['safe']):
    # This is a separate command because there are several
    # bootstrap-specific options.
    cmd = ['iqtree', '-bb', '%i' % bootstrap_replicates]
    cmd += ['-st', str(dtype),
            '--runs', '%i' % n_runs,
            '-s', str(alignment),
            '-m', str(substitution_model),
            '-pre', str(run_prefix)]
    if n_cores == 0:
        cmd += ['-nt', 'AUTO']
    else:
        cmd += ['-nt', '%i' % n_cores]
    if seed:
        cmd += ['-seed', '%i' % seed]
    if safe:
        cmd += ['-safe']
    if allnni:
        cmd += ['-allnni']
    if alrt:
        cmd += ['-alrt', '%i' % alrt]
    if abayes:
        cmd += ['-abayes']
    if lbp:
        cmd += ['-lbp', '%i' % lbp]
    if bnni:
        cmd += ['-bnni']
    if n_init_pars_trees:
        cmd += ['-ninit', '%i' % n_init_pars_trees]
    if n_top_init_trees:
        cmd += ['-ntop', '%i' % n_top_init_trees]
    if n_best_retain_trees:
        cmd += ['-nbest', '%i' % n_best_retain_trees]
    if stop_iter:
        cmd += ['-nstop', '%i' % stop_iter]
    if perturb_nni_strength:
        cmd += ['-pers', '%f' % perturb_nni_strength]
    if spr_radius:
        cmd += ['-sprrad', '%i' % spr_radius]
    if n_max_ufboot_iter:
        cmd += ['-nm', '%i' % n_max_ufboot_iter]
    if n_ufboot_steps:
        cmd += ['-nstep', '%i' % n_ufboot_steps]
    if min_cor_ufboot:
        cmd += ['-bcor', '%f' % min_cor_ufboot]
    if ep_break_ufboot:
        cmd += ['-beps', '%f' % ep_break_ufboot]
    return cmd


def iqtree_ultrafast_bootstrap(
        alignment: AlignedDNAFASTAFormat,
        seed: int = _iqtree_defaults['seed'],
        n_cores: int = _iqtree_defaults['n_cores'],
        n_runs: int = _iqtree_defaults['n_runs'],
        substitution_model: str = _iqtree_defaults['substitution_model'],
        bootstrap_replicates: int = _iqtree_defaults['bootstrap_replicates'],
        n_init_pars_trees: int = _iqtree_defaults['n_init_pars_trees'],
        n_top_init_trees: int = _iqtree_defaults['n_top_init_trees'],
        n_best_retain_trees: int = _iqtree_defaults['n_best_retain_trees'],
        stop_iter: int = _iqtree_defaults['stop_iter'],
        perturb_nni_strength: float = _iqtree_defaults['perturb_nni_strength'],
        spr_radius: int = _iqtree_defaults['spr_radius'],
        n_max_ufboot_iter: int = _iqtree_defaults['n_max_ufboot_iter'],
        n_ufboot_steps: int = _iqtree_defaults['n_ufboot_steps'],
        min_cor_ufboot: float = _iqtree_defaults['min_cor_ufboot'],
        ep_break_ufboot: float = _iqtree_defaults['ep_break_ufboot'],
        allnni: bool = _iqtree_defaults['allnni'],
        alrt: int = _iqtree_defaults['alrt'],
        abayes: bool = _iqtree_defaults['abayes'],
        lbp: int = _iqtree_defaults['lbp'],
        bnni: bool = _iqtree_defaults['bnni'],
        safe: bool = _iqtree_defaults['safe']
) -> NewickFormat:
    # NOTE: the IQ-TREE options `-n` (called `n_iter` in the `iqtree`
    # method) and `-fast` are not compatible with ultrafast bootstrap `-bb`.
    result = NewickFormat()
    with tempfile.TemporaryDirectory() as temp_dir:
        run_prefix = os.path.join(temp_dir, 'q2iqtreeufboot')
        cmd = _build_iqtree_ufbs_command(
            alignment,
            seed=seed,
            n_cores=n_cores,
            n_runs=n_runs,
            substitution_model=substitution_model,
            bootstrap_replicates=bootstrap_replicates,
            run_prefix=run_prefix,
            n_init_pars_trees=n_init_pars_trees,
            n_top_init_trees=n_top_init_trees,
            n_best_retain_trees=n_best_retain_trees,
            stop_iter=stop_iter,
            perturb_nni_strength=perturb_nni_strength,
            spr_radius=spr_radius,
            n_max_ufboot_iter=n_max_ufboot_iter,
            n_ufboot_steps=n_ufboot_steps,
            min_cor_ufboot=min_cor_ufboot,
            ep_break_ufboot=ep_break_ufboot,
            allnni=allnni,
            alrt=alrt,
            abayes=abayes,
            lbp=lbp,
            bnni=bnni,
            safe=safe)
        run_command(cmd)
        tree_tmp_fp = os.path.join(temp_dir, '%s.treefile' % run_prefix)
        os.rename(tree_tmp_fp, str(result))
    return result
| 35.081818 | 79 | 0.576661 | 1,346 | 11,577 | 4.526746 | 0.107727 | 0.186115 | 0.133924 | 0.076809 | 0.803217 | 0.794682 | 0.766125 | 0.761858 | 0.761858 | 0.761858 | 0 | 0.002796 | 0.289367 | 11,577 | 329 | 80 | 35.18845 | 0.737815 | 0.047854 | 0 | 0.749049 | 0 | 0 | 0.129144 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.015209 | false | 0 | 0.019011 | 0 | 0.04943 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3f9d90eb7e6f19856a6d53c03f78b3849ec8e470 | 2,230 | py | Python | pipesmanufacturing/pipes_manufacturing/utils/batch.py | MohammadAhmad1990/pipesmanufacturing | 814e68b3a900ad576231d93103d20a7d93b4969c | [
"MIT"
] | 1 | 2020-12-24T05:03:04.000Z | 2020-12-24T05:03:04.000Z | pipesmanufacturing/pipes_manufacturing/utils/batch.py | MohammadAhmad1990/pipesmanufacturing | 814e68b3a900ad576231d93103d20a7d93b4969c | [
"MIT"
] | null | null | null | pipesmanufacturing/pipes_manufacturing/utils/batch.py | MohammadAhmad1990/pipesmanufacturing | 814e68b3a900ad576231d93103d20a7d93b4969c | [
"MIT"
] | 5 | 2019-11-12T06:05:23.000Z | 2020-04-21T19:59:41.000Z |
import frappe
from frappe.model.document import Document
from erpnext.stock.doctype.batch.batch import get_batch_qty


def _set_batch_stock_status(batch_no, status):
    batch = frappe.get_doc("Batch", batch_no)
    batch.batch_stock_status = status
    batch.save()


def update_batch_stock_status(self, cdt):
    for item in self.items:
        if "Strip-MS" not in str(item.item_code):
            continue
        rows = get_batch_qty(item.batch_no)
        total_batch_qty = sum(int(row.qty) for row in rows)
        if self.doctype == "Purchase Receipt":
            # On receipt the batch counts as stocked as soon as any
            # warehouse rows exist for it.
            available = bool(rows)
        else:
            # Delivery Note and every other stock document check the
            # remaining total quantity.
            available = total_batch_qty > 0
        _set_batch_stock_status(item.batch_no,
                                "Available" if available else "Empty")
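The decision made by `update_batch_stock_status` reduces to: sum the per-warehouse quantities that `get_batch_qty` returns and mark the batch `Available` when anything remains, otherwise `Empty` (the Purchase Receipt branch only checks that rows exist at all). Sketched without the Frappe dependencies, with plain dicts standing in for the row objects:

```python
def batch_status(rows):
    """Collapse per-warehouse rows (each carrying a 'qty' field, like
    the rows get_batch_qty returns) into a batch_stock_status value."""
    total_batch_qty = sum(int(row["qty"]) for row in rows)
    return "Available" if total_batch_qty > 0 else "Empty"


# batch_status([{"qty": "3"}, {"qty": 0}]) -> "Available"
# batch_status([])                         -> "Empty"
```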
| 40.545455 | 69 | 0.504036 | 261 | 2,230 | 4.019157 | 0.157088 | 0.137274 | 0.094376 | 0.097235 | 0.816015 | 0.816015 | 0.816015 | 0.816015 | 0.816015 | 0.816015 | 0 | 0.003831 | 0.414798 | 2,230 | 54 | 70 | 41.296296 | 0.8 | 0 | 0 | 0.857143 | 0 | 0 | 0.056104 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.020408 | false | 0 | 0.061224 | 0 | 0.081633 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3fa3de6c48d0439f84b7295a13fe13456d5e290b | 103 | py | Python | hms/models/__init__.py | MuhmdTaha/Odoo-Lab-HMS | 1717d3df983e35ae67d5c4ede0eb98b8acad17c1 | [
"MIT"
] | null | null | null | hms/models/__init__.py | MuhmdTaha/Odoo-Lab-HMS | 1717d3df983e35ae67d5c4ede0eb98b8acad17c1 | [
"MIT"
] | null | null | null | hms/models/__init__.py | MuhmdTaha/Odoo-Lab-HMS | 1717d3df983e35ae67d5c4ede0eb98b8acad17c1 | [
"MIT"
] | null | null | null |
from . import hms_patient
from . import hms_department
from . import hms_doctor
from . import hms_logs
| 20.6 | 28 | 0.805825 | 16 | 103 | 4.9375 | 0.4375 | 0.506329 | 0.658228 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15534 | 103 | 4 | 29 | 25.75 | 0.908046 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3fa81dc310661ef53b9ef6615c865f6f5ec1e17c | 121 | py | Python | controller/__init__.py | whaleygeek/mb_deathstar | f756b8b5b45927039c547d0f96f8e31a365b383b | [
"MIT"
] | null | null | null | controller/__init__.py | whaleygeek/mb_deathstar | f756b8b5b45927039c547d0f96f8e31a365b383b | [
"MIT"
] | null | null | null | controller/__init__.py | whaleygeek/mb_deathstar | f756b8b5b45927039c547d0f96f8e31a365b383b | [
"MIT"
] | null | null | null |
# The micro:bit is the controller
from mb_controller import *
# Pygame is the controller
##from pg_controller import *
| 17.285714 | 33 | 0.768595 | 18 | 121 | 5.055556 | 0.555556 | 0.10989 | 0.32967 | 0.417582 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173554 | 121 | 6 | 34 | 20.166667 | 0.91 | 0.68595 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3fea009d7bc22b572e34338e630a9c2e85dc7937 | 3,206 | py | Python | test/pyaz/netappfiles/volume/replication/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | test/pyaz/netappfiles/volume/replication/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | 9 | 2021-09-24T16:37:24.000Z | 2021-12-24T00:39:19.000Z | test/pyaz/netappfiles/volume/replication/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null |
import json, subprocess
from ....pyaz_utils import get_cli_name, get_params


def _run(command):
    """Echo and run an az CLI command, returning its parsed JSON output.

    An empty stdout is treated as failure and raises with the captured
    stderr; this mirrors the behaviour shared by every wrapper below.
    """
    print(command)
    output = subprocess.run(command, shell=True,
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout = output.stdout.decode("utf-8")
    stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    raise Exception(stderr)


def approve(resource_group, account_name, pool_name, volume_name,
            remote_volume_resource_id=None):
    params = get_params(locals())
    return _run("az netappfiles volume replication approve " + params)


def suspend(resource_group, account_name, pool_name, volume_name, force=None):
    params = get_params(locals())
    return _run("az netappfiles volume replication suspend " + params)


def resume(resource_group, account_name, pool_name, volume_name):
    params = get_params(locals())
    return _run("az netappfiles volume replication resume " + params)


def remove(resource_group, account_name, pool_name, volume_name):
    params = get_params(locals())
    return _run("az netappfiles volume replication remove " + params)


def status(resource_group, account_name, pool_name, volume_name):
    params = get_params(locals())
    return _run("az netappfiles volume replication status " + params)


def re_initialize(resource_group, account_name, pool_name, volume_name):
    params = get_params(locals())
    return _run("az netappfiles volume replication re-initialize " + params)
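Each wrapper above relies on `get_params(locals())` (imported from `pyaz_utils`, whose source is not shown here) to turn its keyword arguments into the tail of the `az` command line. Purely as an assumption about what such a helper does, here is a plausible sketch: snake_case names become `--kebab-case` flags and unset (`None`) values are dropped:

```python
def params_to_cli(params):
    """Hypothetical stand-in for pyaz_utils.get_params: skip unset
    values and map python_name -> --python-name flags."""
    parts = []
    for name, value in params.items():
        if value is None:
            continue
        parts += ["--" + name.replace("_", "-"), str(value)]
    return " ".join(parts)


# e.g. the arguments of a remove()-style call:
tail = params_to_cli({"resource_group": "rg1", "account_name": "acct",
                      "pool_name": None})
# tail == "--resource-group rg1 --account-name acct"
```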
| 36.431818 | 98 | 0.680287 | 387 | 3,206 | 5.537468 | 0.126615 | 0.078395 | 0.055996 | 0.067196 | 0.917872 | 0.917872 | 0.917872 | 0.917872 | 0.878675 | 0.878675 | 0 | 0.004747 | 0.211478 | 3,206 | 87 | 99 | 36.850575 | 0.842959 | 0 | 0 | 0.825 | 0 | 0 | 0.098253 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075 | false | 0 | 0.025 | 0 | 0.175 | 0.225 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b74cd64fb32e2df741554e6ac19f3a0e4620dbfc | 56,118 | py | Python | tools/device_file_generator/dfg/avr/avr_io.py | roboterclubaachen/xpcc | 010924901947381d20e83b838502880eb2ffea72 | [
"BSD-3-Clause"
] | 161 | 2015-01-13T15:52:06.000Z | 2020-02-13T01:26:04.000Z | tools/device_file_generator/dfg/avr/avr_io.py | salkinium/xpcc | 010924901947381d20e83b838502880eb2ffea72 | [
"BSD-3-Clause"
] | 281 | 2015-01-06T12:46:40.000Z | 2019-01-06T13:06:57.000Z | tools/device_file_generator/dfg/avr/avr_io.py | salkinium/xpcc | 010924901947381d20e83b838502880eb2ffea72 | [
"BSD-3-Clause"
] | 51 | 2015-03-03T19:56:12.000Z | 2020-03-22T02:13:36.000Z |
# -*- coding: utf-8 -*-
# Copyright (c) 2013, Roboterclub Aachen e.V.
# All rights reserved.
#
# The file is part of the xpcc library and is released under the 3-clause BSD
# license. See the file `LICENSE` for the full license governing this code.
# -----------------------------------------------------------------------------
xmega_pins = \
[
{
"type": "a1",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 255 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "E", "mask": 255 },
{ "port": "F", "mask": 255 },
{ "port": "H", "mask": 255 },
{ "port": "J", "mask": 255 },
{ "port": "K", "mask": 255 },
{ "port": "Q", "mask": 15 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "a3",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 255 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "E", "mask": 255 },
{ "port": "F", "mask": 255 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "a4",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 15 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "E", "mask": 15 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "b1",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 255 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 7 },
{ "port": "E", "mask": 255 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "b3",
"gpio":
[
{ "port": "B", "mask": 255 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 3 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "c3",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 255 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "E", "mask": 255 },
{ "port": "F", "mask": 255 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "c4",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 15 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "E", "mask": 15 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "d3",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 255 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "E", "mask": 255 },
{ "port": "F", "mask": 255 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "d4",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "B", "mask": 15 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "E", "mask": 15 },
{ "port": "R", "mask": 3 }
]
},
{
"type": "e5",
"gpio":
[
{ "port": "A", "mask": 255 },
{ "port": "C", "mask": 255 },
{ "port": "D", "mask": 255 },
{ "port": "R", "mask": 3 }
]
}
]
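In the table above each `mask` is a bit field of the pins present on a port: bit *n* set means pin *n* exists, so mask 255 is a full 8-pin port, mask 15 covers pins 0-3 (port Q of the "a1" type), and mask 3 covers pins 0-1 (port R). A generator consuming this table can expand a mask like so (the helper name is illustrative):

```python
def pins_in_mask(mask):
    """Return the pin numbers present in an 8-bit GPIO port mask."""
    return [bit for bit in range(8) if mask & (1 << bit)]


# "mask": 255 -> pins 0-7,  "mask": 15 -> pins 0-3,  "mask": 3 -> pins 0-1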
# the pins of the peripherals are always on the same pins on all ports
xmega_peripheral_pins = \
{
"spi":
[
{ "id": "4", "name": "ss", "dir": "out" },
{ "id": "5", "name": "mosi", "dir": "out", "remap": False },
{ "id": "6", "name": "miso", "dir": "in" },
{ "id": "7", "name": "sck", "dir": "out", "remap": False },
{ "id": "7", "name": "mosi", "dir": "out", "remap": True },
{ "id": "5", "name": "sck", "dir": "out", "remap": True }
],
"usart":
[
{ "id": "1", "name": "xck", "dir": "out", "instance": "0", "remap": False },
{ "id": "2", "name": "rxd", "dir": "in", "instance": "0", "remap": False },
{ "id": "3", "name": "txd", "dir": "out", "instance": "0", "remap": False },
{ "id": "5", "name": "xck", "dir": "out", "instance": "0", "remap": True },
{ "id": "6", "name": "rxd", "dir": "in", "instance": "0", "remap": True },
{ "id": "7", "name": "txd", "dir": "out", "instance": "0", "remap": True },
{ "id": "5", "name": "xck", "dir": "out", "instance": "1" },
{ "id": "6", "name": "rxd", "dir": "in", "instance": "1" },
{ "id": "7", "name": "txd", "dir": "out", "instance": "1" }
],
"twi":
[
{ "id": "0", "name": "sda", "dir": "io" },
{ "id": "1", "name": "scl", "dir": "out" }
],
"tc":
[
{ "id": "0", "name": "OCA", "dir": "out", "instance": "0", "remap": False },
{ "id": "1", "name": "OCB", "dir": "out", "instance": "0", "remap": False },
{ "id": "2", "name": "OCC", "dir": "out", "instance": "0", "remap": False },
{ "id": "3", "name": "OCD", "dir": "out", "instance": "0", "remap": False },
{ "id": "4", "name": "OCA", "dir": "out", "instance": "0", "remap": True },
{ "id": "5", "name": "OCB", "dir": "out", "instance": "0", "remap": True },
{ "id": "6", "name": "OCC", "dir": "out", "instance": "0", "remap": True },
{ "id": "7", "name": "OCD", "dir": "out", "instance": "0", "remap": True },
{ "id": "4", "name": "OCA", "dir": "out", "instance": "1" },
{ "id": "5", "name": "OCB", "dir": "out", "instance": "1" }
],
"awex":
[
{ "id": "0", "name": "ALS", "dir": "out" },
{ "id": "1", "name": "AHS", "dir": "out" },
{ "id": "2", "name": "BLS", "dir": "out" },
{ "id": "3", "name": "BHS", "dir": "out" },
{ "id": "4", "name": "CLS", "dir": "out" },
{ "id": "5", "name": "CHS", "dir": "out" },
{ "id": "6", "name": "DLS", "dir": "out" },
{ "id": "7", "name": "DHS", "dir": "out" }
],
"usb":
[
{ "id": "0", "name": "DM", "dir": "io" },
{ "id": "1", "name": "DP", "dir": "io" }
]
}
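Because the peripheral signals sit on the same pin numbers on every XMEGA port, a generator can cross the two tables: the port masks above decide whether a pin physically exists, and the peripheral entries supply the signal name and direction. A self-contained sketch of that lookup (`available_signals` is an illustrative name, and `twi` is a tiny inline copy of the entry above):

```python
def available_signals(port_mask, peripheral_pins):
    """Filter a peripheral's pin list down to the pins that physically
    exist on a port, using the port's GPIO bit mask."""
    return [pin for pin in peripheral_pins
            if port_mask & (1 << int(pin["id"]))]


twi = [
    {"id": "0", "name": "sda", "dir": "io"},
    {"id": "1", "name": "scl", "dir": "out"},
]
full_port = available_signals(255, twi)   # full port: both TWI signals
narrow_port = available_signals(1, twi)   # only bit 0 set: just sda
```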
pins = \
[
{
"devices": ["atmega8", "atmega8a"],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" }
],
"spi":
[
{ "port": "B", "id": "3", "name": "miso", "dir": "in" },
{ "port": "B", "id": "4", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "5", "name": "sck", "dir": "out" },
{ "port": "B", "id": "2", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "5", "name": "scl", "dir": "out" },
{ "port": "C", "id": "4", "name": "sda", "dir": "io" }
]
},
{
"devices": ["atmega16", "atmega16a", "atmega32", "atmega32a"],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
{ "port": "B", "id": "2", "int": "2" }
],
"spi":
[
{ "port": "B", "id": "5", "name": "miso", "dir": "in" },
{ "port": "B", "id": "6", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "B", "id": "4", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "0", "name": "scl", "dir": "out" },
{ "port": "C", "id": "1", "name": "sda", "dir": "io" }
]
},
{
"devices": ["atmega64", "atmega64a", "atmega128", "atmega128a"],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" }
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "E", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "E", "id": "1", "name": "txd", "dir": "out" },
{ "port": "E", "id": "2", "name": "xck", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
]
},
{
"devices": ["atmega8u2", "atmega16u2", "atmega32u2"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "C", "id": "6", "int": "8" },
{ "port": "C", "id": "5", "int": "9" },
{ "port": "C", "id": "4", "int": "10" },
{ "port": "C", "id": "2", "int": "11" },
{ "port": "D", "id": "5", "int": "12" },
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "D", "id": "4", "int": "4" },
{ "port": "D", "id": "5", "int": "5" },
{ "port": "D", "id": "6", "int": "6" },
{ "port": "D", "id": "7", "int": "7" }
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"uartspi": []
},
{
"devices": ["atmega8hva", "atmega16hva"],
"pcint": [],
"extint":
[
{ "port": "C", "id": "0", "int": "0" },
{ "port": "B", "id": "2", "int": "1" },
{ "port": "B", "id": "3", "int": "2" },
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "C", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "C", "id": "0", "name": "txd", "dir": "out" }
]
},
{
"devices": ["atmega16c1", "atmega16m1", "atmega32c1", "atmega32m1", "atmega64c1", "atmega64m1"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "C", "id": "0", "int": "8" },
{ "port": "C", "id": "1", "int": "9" },
{ "port": "C", "id": "2", "int": "10" },
{ "port": "C", "id": "3", "int": "11" },
{ "port": "C", "id": "4", "int": "12" },
{ "port": "C", "id": "5", "int": "13" },
{ "port": "C", "id": "6", "int": "14" },
{ "port": "C", "id": "7", "int": "15" },
{ "port": "D", "id": "0", "int": "16" },
{ "port": "D", "id": "1", "int": "17" },
{ "port": "D", "id": "2", "int": "18" },
{ "port": "D", "id": "3", "int": "19" },
{ "port": "D", "id": "4", "int": "20" },
{ "port": "D", "id": "5", "int": "21" },
{ "port": "D", "id": "6", "int": "22" },
{ "port": "D", "id": "7", "int": "23" },
{ "port": "E", "id": "0", "int": "24" },
{ "port": "E", "id": "1", "int": "25" },
{ "port": "E", "id": "2", "int": "26" }
],
"extint":
[
{ "port": "D", "id": "6", "int": "0" },
{ "port": "B", "id": "2", "int": "1" },
{ "port": "B", "id": "5", "int": "2" },
{ "port": "C", "id": "0", "int": "3" },
],
"spi":
[
{ "port": "B", "id": "3", "name": "miso", "dir": "in" },
{ "port": "B", "id": "2", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "4", "name": "sck", "dir": "out" },
{ "port": "B", "id": "1", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "4", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" }
]
},
{
"devices": ["atmega16hvb", "atmega16hvbrevb", "atmega32hvb", "atmega32hvbrevb"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "B", "id": "0", "int": "4" },
{ "port": "B", "id": "1", "int": "5" },
{ "port": "B", "id": "2", "int": "6" },
{ "port": "B", "id": "3", "int": "7" },
{ "port": "B", "id": "4", "int": "8" },
{ "port": "B", "id": "5", "int": "9" },
{ "port": "B", "id": "6", "int": "10" },
{ "port": "B", "id": "7", "int": "11" }
],
"extint":
[
{ "port": "C", "id": "0", "int": "0" },
{ "port": "C", "id": "1", "int": "1" },
{ "port": "C", "id": "2", "int": "2" },
{ "port": "C", "id": "3", "int": "3" },
],
"spi":
[
{ "port": "B", "id": "6", "name": "miso", "dir": "in" },
{ "port": "B", "id": "7", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "5", "name": "sck", "dir": "out" },
{ "port": "B", "id": "4", "name": "ss", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "4", "name": "scl", "dir": "out" },
{ "port": "C", "id": "3", "name": "sda", "dir": "io" }
]
},
{
"devices": ["atmega32hve2", "atmega64hve2"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "B", "id": "0", "int": "2" },
{ "port": "B", "id": "1", "int": "3" },
{ "port": "B", "id": "2", "int": "4" },
{ "port": "B", "id": "3", "int": "5" },
{ "port": "B", "id": "4", "int": "6" },
{ "port": "B", "id": "5", "int": "7" },
{ "port": "B", "id": "6", "int": "8" },
{ "port": "B", "id": "7", "int": "9" },
],
"extint":
[
{ "port": "B", "id": "7", "int": "0" },
],
"spi":
[
{ "port": "B", "id": "6", "name": "miso", "dir": "in" },
{ "port": "B", "id": "7", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "5", "name": "sck", "dir": "out" },
{ "port": "B", "id": "4", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "B", "id": "1", "name": "rxd", "dir": "in" },
{ "port": "B", "id": "3", "name": "txd", "dir": "out" }
]
},
{
"devices": ["atmega16u4", "atmega32u4"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "6", "int": "6" },
],
"spi":
[
{ "port": "B", "id": "3", "name": "miso", "dir": "in" },
{ "port": "B", "id": "2", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
],
"uartspi": []
},
{
"devices": ["atmega48", "atmega48a", "atmega48p", "atmega48pa",
"atmega88", "atmega88a", "atmega88p", "atmega88pa",
"atmega168", "atmega168a", "atmega168p", "atmega168pa",
"atmega328", "atmega328a", "atmega328p", "atmega328pa"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "C", "id": "0", "int": "8" },
{ "port": "C", "id": "1", "int": "9" },
{ "port": "C", "id": "2", "int": "10" },
{ "port": "C", "id": "3", "int": "11" },
{ "port": "C", "id": "4", "int": "12" },
{ "port": "C", "id": "5", "int": "13" },
{ "port": "C", "id": "6", "int": "14" },
{ "port": "D", "id": "0", "int": "16" },
{ "port": "D", "id": "1", "int": "17" },
{ "port": "D", "id": "2", "int": "18" },
{ "port": "D", "id": "3", "int": "19" },
{ "port": "D", "id": "4", "int": "20" },
{ "port": "D", "id": "5", "int": "21" },
{ "port": "D", "id": "6", "int": "22" },
{ "port": "D", "id": "7", "int": "23" },
],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
],
"spi":
[
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "4", "name": "miso", "dir": "in" },
{ "port": "B", "id": "5", "name": "sck", "dir": "out" },
{ "port": "B", "id": "2", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" },
{ "port": "D", "id": "4", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "5", "name": "scl", "dir": "out" },
{ "port": "C", "id": "4", "name": "sda", "dir": "io" }
],
"uartspi": []
},
{
"devices": ["atmega64rfa1", "atmega64rfr2",
"atmega128rfa1", "atmega128rfr2",
"atmega256rfa1", "atmega256rfr2"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "E", "id": "0", "int": "8" },
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" },
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "E", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "E", "id": "1", "name": "txd", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
],
"uartspi": []
},
{
"devices": ["atmega162"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "C", "id": "0", "int": "8" },
{ "port": "C", "id": "1", "int": "9" },
{ "port": "C", "id": "2", "int": "10" },
{ "port": "C", "id": "3", "int": "11" },
{ "port": "C", "id": "4", "int": "12" },
{ "port": "C", "id": "5", "int": "13" },
{ "port": "C", "id": "6", "int": "14" },
{ "port": "C", "id": "7", "int": "15" },
],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
{ "port": "E", "id": "0", "int": "2" },
],
"spi":
[
{ "port": "B", "id": "5", "name": "miso", "dir": "in" },
{ "port": "B", "id": "6", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "B", "id": "4", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" },
{ "port": "D", "id": "4", "name": "xck", "dir": "out" }
],
"uart1":
[
{ "port": "B", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "B", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "2", "name": "xck", "dir": "out" }
]
},
{
"devices": ["atmega164", "atmega164a", "atmega164p", "atmega164pa",
"atmega324", "atmega324a", "atmega324p", "atmega324pa",
"atmega644", "atmega644a", "atmega644p", "atmega644pa",
"atmega1284", "atmega1284a", "atmega1284p", "atmega1284pa"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "B", "id": "4", "int": "12" },
{ "port": "B", "id": "5", "int": "13" },
{ "port": "B", "id": "6", "int": "14" },
{ "port": "B", "id": "7", "int": "15" },
{ "port": "C", "id": "0", "int": "16" },
{ "port": "C", "id": "1", "int": "17" },
{ "port": "C", "id": "2", "int": "18" },
{ "port": "C", "id": "3", "int": "19" },
{ "port": "C", "id": "4", "int": "20" },
{ "port": "C", "id": "5", "int": "21" },
{ "port": "C", "id": "6", "int": "22" },
{ "port": "C", "id": "7", "int": "23" },
{ "port": "D", "id": "0", "int": "24" },
{ "port": "D", "id": "1", "int": "25" },
{ "port": "D", "id": "2", "int": "26" },
{ "port": "D", "id": "3", "int": "27" },
{ "port": "D", "id": "4", "int": "28" },
{ "port": "D", "id": "5", "int": "29" },
{ "port": "D", "id": "6", "int": "30" },
{ "port": "D", "id": "7", "int": "31" },
],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
],
"spi":
[
{ "port": "B", "id": "5", "name": "miso", "dir": "in" },
{ "port": "B", "id": "6", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "B", "id": "4", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" },
{ "port": "B", "id": "0", "name": "xck", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "4", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "0", "name": "scl", "dir": "out" },
{ "port": "C", "id": "1", "name": "sda", "dir": "io" }
],
"uartspi": []
},
{
"devices": ["atmega165", "atmega165a", "atmega165p", "atmega165pa",
"atmega325", "atmega325a", "atmega325p", "atmega325pa",
"atmega645", "atmega645a", "atmega645p", "atmega645pa",
"atmega169", "atmega169a", "atmega169p", "atmega169pa",
"atmega329", "atmega329a", "atmega329p", "atmega329pa",
"atmega649", "atmega649a", "atmega649p", "atmega649pa"],
"pcint":
[
{ "port": "E", "id": "0", "int": "0" },
{ "port": "E", "id": "1", "int": "1" },
{ "port": "E", "id": "2", "int": "2" },
{ "port": "E", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "B", "id": "4", "int": "12" },
{ "port": "B", "id": "5", "int": "13" },
{ "port": "B", "id": "6", "int": "14" },
{ "port": "B", "id": "7", "int": "15" },
],
"extint":
[
{ "port": "D", "id": "1", "int": "0" },
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "E", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "E", "id": "1", "name": "txd", "dir": "out" },
{ "port": "E", "id": "2", "name": "xck", "dir": "out" }
],
"usi":
[
{ "port": "E", "id": "4", "name": "usck", "dir": "out" },
{ "port": "E", "id": "6", "name": "do", "dir": "out" },
{ "port": "E", "id": "5", "name": "di", "dir": "in" }
]
},
{
"devices": ["atmega3250", "atmega3250a", "atmega3250p", "atmega3250pa",
"atmega6450", "atmega6450a", "atmega6450p", "atmega6450pa",
"atmega3290", "atmega3290a", "atmega3290p", "atmega3290pa",
"atmega6490", "atmega6490a", "atmega6490p", "atmega6490pa"],
"pcint":
[
{ "port": "E", "id": "0", "int": "0" },
{ "port": "E", "id": "1", "int": "1" },
{ "port": "E", "id": "2", "int": "2" },
{ "port": "E", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "B", "id": "4", "int": "12" },
{ "port": "B", "id": "5", "int": "13" },
{ "port": "B", "id": "6", "int": "14" },
{ "port": "B", "id": "7", "int": "15" },
{ "port": "H", "id": "0", "int": "16" },
{ "port": "H", "id": "1", "int": "17" },
{ "port": "H", "id": "2", "int": "18" },
{ "port": "H", "id": "3", "int": "19" },
{ "port": "H", "id": "4", "int": "20" },
{ "port": "H", "id": "5", "int": "21" },
{ "port": "H", "id": "6", "int": "22" },
{ "port": "H", "id": "7", "int": "23" },
{ "port": "J", "id": "0", "int": "24" },
{ "port": "J", "id": "1", "int": "25" },
{ "port": "J", "id": "2", "int": "26" },
{ "port": "J", "id": "3", "int": "27" },
{ "port": "J", "id": "4", "int": "28" },
{ "port": "J", "id": "5", "int": "29" },
{ "port": "J", "id": "6", "int": "30" },
],
"extint":
[
{ "port": "D", "id": "1", "int": "0" },
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "E", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "E", "id": "1", "name": "txd", "dir": "out" },
{ "port": "E", "id": "2", "name": "xck", "dir": "out" }
],
"usi":
[
{ "port": "E", "id": "4", "name": "usck", "dir": "out" },
{ "port": "E", "id": "6", "name": "do", "dir": "out" },
{ "port": "E", "id": "5", "name": "di", "dir": "in" }
]
},
{
"devices": ["atmega1281", "atmega2561"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "E", "id": "0", "int": "8" },
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" },
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
],
"uartspi": []
},
{
"devices": ["atmega640", "atmega1280", "atmega2560"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "E", "id": "0", "int": "8" },
{ "port": "J", "id": "0", "int": "9" },
{ "port": "J", "id": "1", "int": "10" },
{ "port": "J", "id": "2", "int": "11" },
{ "port": "J", "id": "3", "int": "12" },
{ "port": "J", "id": "4", "int": "13" },
{ "port": "J", "id": "5", "int": "14" },
{ "port": "J", "id": "6", "int": "15" },
{ "port": "K", "id": "0", "int": "16" },
{ "port": "K", "id": "1", "int": "17" },
{ "port": "K", "id": "2", "int": "18" },
{ "port": "K", "id": "3", "int": "19" },
{ "port": "K", "id": "4", "int": "20" },
{ "port": "K", "id": "5", "int": "21" },
{ "port": "K", "id": "6", "int": "22" },
{ "port": "K", "id": "7", "int": "23" },
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" },
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "E", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "E", "id": "1", "name": "txd", "dir": "out" },
{ "port": "E", "id": "2", "name": "xck", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"uart2":
[
{ "port": "H", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "H", "id": "1", "name": "txd", "dir": "out" },
{ "port": "H", "id": "2", "name": "xck", "dir": "out" }
],
"uart3":
[
{ "port": "J", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "J", "id": "1", "name": "txd", "dir": "out" },
{ "port": "J", "id": "2", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
],
"uartspi": []
},
{
"devices": ["atmega8515"],
"pcint": [],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
{ "port": "E", "id": "0", "int": "2" },
],
"spi":
[
{ "port": "B", "id": "5", "name": "miso", "dir": "in" },
{ "port": "B", "id": "6", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "B", "id": "4", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "01", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" },
{ "port": "D", "id": "4", "name": "xck", "dir": "out" }
]
},
{
"devices": ["atmega8535"],
"pcint": [],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
],
"spi":
[
{ "port": "B", "id": "5", "name": "miso", "dir": "in" },
{ "port": "B", "id": "6", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "B", "id": "4", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" },
{ "port": "B", "id": "0", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "0", "name": "scl", "dir": "out" },
{ "port": "C", "id": "1", "name": "sda", "dir": "io" }
]
},
{
"devices": ["atmega644rfr2", "atmega1284rfr2", "atmega2564rfr2"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "F", "id": "0", "int": "8" },
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" },
],
"spi":
[
{ "port": "B", "id": "3", "name": "miso", "dir": "in" },
{ "port": "B", "id": "2", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "E", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "E", "id": "1", "name": "txd", "dir": "out" },
{ "port": "E", "id": "2", "name": "xck", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
],
"uartspi": []
},
# ATtiny devices
{
"devices": ["attiny4", "attiny5", "attiny9", "attiny10"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
],
"extint":
[
{ "port": "B", "id": "2", "int": "0" },
],
#"spi":
#[
# { "port": "B", "id": "0", "name": "miso", "dir": "in" },
# { "port": "B", "id": "1", "name": "mosi", "dir": "out" },
# { "port": "B", "id": "2", "name": "sck", "dir": "out" }
#]
},
{
"devices": ["attiny13", "attiny13a"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
],
"extint":
[
{ "port": "B", "id": "1", "int": "0" },
],
},
{
"devices": ["attiny20"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "4", "int": "8" },
{ "port": "B", "id": "5", "int": "9" },
{ "port": "B", "id": "6", "int": "10" },
{ "port": "B", "id": "7", "int": "11" },
],
"extint":
[
{ "port": "B", "id": "2", "int": "0" },
],
"spi":
[
{ "port": "A", "id": "1", "name": "miso", "dir": "in" },
{ "port": "A", "id": "2", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "B", "id": "6", "name": "ss", "dir": "out" }
],
"i2c":
[
{ "port": "A", "id": "7", "name": "scl", "dir": "out" },
{ "port": "B", "id": "1", "name": "sda", "dir": "io" }
]
},
{
"devices": ["attiny24", "attiny24a",
"attiny44", "attiny44a",
"attiny84", "attiny84a"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "4", "int": "8" },
{ "port": "B", "id": "5", "int": "9" },
{ "port": "B", "id": "6", "int": "10" },
{ "port": "B", "id": "7", "int": "11" },
],
"extint":
[
{ "port": "B", "id": "2", "int": "0" },
],
"spi":
[
{ "port": "B", "id": "6", "name": "miso", "dir": "in" },
{ "port": "B", "id": "5", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "4", "name": "sck", "dir": "out" }
],
"usi":
[
{ "port": "A", "id": "4", "name": "usck", "dir": "out" },
{ "port": "A", "id": "5", "name": "do", "dir": "out" },
{ "port": "A", "id": "6", "name": "di", "dir": "in" }
]
},
{
"devices": ["attiny25", "attiny45", "attiny85"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
],
"extint":
[
{ "port": "B", "id": "2", "int": "0" },
],
#"spi":
#[
# { "port": "B", "id": "1", "name": "miso", "dir": "in" },
# { "port": "B", "id": "2", "name": "mosi", "dir": "out" },
# { "port": "B", "id": "3", "name": "sck", "dir": "out" }
#],
"usi":
[
{ "port": "B", "id": "2", "name": "usck", "dir": "out" },
{ "port": "B", "id": "1", "name": "do", "dir": "out" },
{ "port": "B", "id": "0", "name": "di", "dir": "in" }
]
},
{
"devices": ["attiny26"],
"pcint":
[
{ "port": "A", "id": "3", "int": "1" },
{ "port": "A", "id": "6", "int": "1" },
{ "port": "A", "id": "7", "int": "1" },
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
],
"extint":
[
{ "port": "B", "id": "6", "int": "0" },
],
"usi":
[
{ "port": "B", "id": "2", "name": "usck", "dir": "out" },
{ "port": "B", "id": "1", "name": "do", "dir": "out" },
{ "port": "B", "id": "0", "name": "di", "dir": "in" }
]
},
{
"devices": ["attiny40"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "C", "id": "0", "int": "12" },
{ "port": "C", "id": "1", "int": "13" },
{ "port": "C", "id": "2", "int": "14" },
{ "port": "C", "id": "3", "int": "15" },
{ "port": "C", "id": "4", "int": "16" },
{ "port": "C", "id": "5", "int": "17" },
],
"extint":
[
{ "port": "C", "id": "2", "int": "0" },
],
"spi":
[
{ "port": "C", "id": "4", "name": "miso", "dir": "in" },
{ "port": "C", "id": "2", "name": "mosi", "dir": "out" },
{ "port": "C", "id": "1", "name": "sck", "dir": "out" },
{ "port": "C", "id": "0", "name": "ss", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "1", "name": "scl", "dir": "out" },
{ "port": "C", "id": "4", "name": "sda", "dir": "io" }
]
},
{
"devices": ["attiny43u"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "B", "id": "4", "int": "12" },
{ "port": "B", "id": "5", "int": "13" },
{ "port": "B", "id": "6", "int": "14" },
{ "port": "B", "id": "7", "int": "15" },
],
"extint":
[
{ "port": "B", "id": "7", "int": "0" },
],
"usi":
[
{ "port": "B", "id": "6", "name": "usck", "dir": "out" },
{ "port": "B", "id": "5", "name": "do", "dir": "out" },
{ "port": "B", "id": "4", "name": "di", "dir": "in" }
]
},
{
"devices": ["attiny48", "attiny88"],
"pcint":
[
{ "port": "A", "id": "0", "int": "24" },
{ "port": "A", "id": "1", "int": "25" },
{ "port": "A", "id": "2", "int": "26" },
{ "port": "A", "id": "3", "int": "27" },
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "C", "id": "0", "int": "8" },
{ "port": "C", "id": "1", "int": "9" },
{ "port": "C", "id": "2", "int": "10" },
{ "port": "C", "id": "3", "int": "11" },
{ "port": "C", "id": "4", "int": "12" },
{ "port": "C", "id": "5", "int": "13" },
{ "port": "C", "id": "6", "int": "14" },
{ "port": "C", "id": "7", "int": "15" },
{ "port": "D", "id": "0", "int": "16" },
{ "port": "D", "id": "1", "int": "17" },
{ "port": "D", "id": "2", "int": "18" },
{ "port": "D", "id": "3", "int": "19" },
{ "port": "D", "id": "4", "int": "20" },
{ "port": "D", "id": "5", "int": "21" },
{ "port": "D", "id": "6", "int": "22" },
{ "port": "D", "id": "7", "int": "23" },
],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
],
"spi":
[
{ "port": "B", "id": "3", "name": "miso", "dir": "in" },
{ "port": "B", "id": "4", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "5", "name": "sck", "dir": "out" },
{ "port": "B", "id": "2", "name": "ss", "dir": "out" }
],
"i2c":
[
{ "port": "C", "id": "5", "name": "scl", "dir": "out" },
{ "port": "C", "id": "4", "name": "sda", "dir": "io" }
]
},
{
"devices": ["attiny87", "attiny167"],
"extint":
[
{ "port": "B", "id": "6", "int": "0" },
{ "port": "A", "id": "3", "int": "1" },
],
"spi":
[
{ "port": "A", "id": "4", "name": "miso", "dir": "in" },
{ "port": "A", "id": "2", "name": "mosi", "dir": "out" },
{ "port": "A", "id": "5", "name": "sck", "dir": "out" },
{ "port": "A", "id": "6", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "A", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "A", "id": "1", "name": "txd", "dir": "out" }
],
"usi":
[
{ "port": "B", "id": "2", "name": "usck", "dir": "out" },
{ "port": "B", "id": "1", "name": "do", "dir": "out" },
{ "port": "B", "id": "0", "name": "di", "dir": "in" }
]
},
{
"devices": ["attiny261", "attiny261a",
"attiny461", "attiny461a",
"attiny861", "attiny861a"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "B", "id": "4", "int": "12" },
{ "port": "B", "id": "5", "int": "13" },
{ "port": "B", "id": "6", "int": "14" },
{ "port": "B", "id": "7", "int": "15" },
],
"extint":
[
{ "port": "B", "id": "6", "int": "0" },
{ "port": "A", "id": "2", "int": "1" },
],
"usi":
[
{ "port": "B", "id": "2", "name": "usck", "dir": "out" },
{ "port": "B", "id": "1", "name": "do", "dir": "out" },
{ "port": "B", "id": "0", "name": "di", "dir": "in" }
]
},
{
"devices": ["attiny828"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "B", "id": "4", "int": "12" },
{ "port": "B", "id": "5", "int": "13" },
{ "port": "B", "id": "6", "int": "14" },
{ "port": "B", "id": "7", "int": "15" },
{ "port": "C", "id": "0", "int": "16" },
{ "port": "C", "id": "1", "int": "17" },
{ "port": "C", "id": "2", "int": "18" },
{ "port": "C", "id": "3", "int": "19" },
{ "port": "C", "id": "4", "int": "20" },
{ "port": "C", "id": "5", "int": "21" },
{ "port": "C", "id": "6", "int": "22" },
{ "port": "C", "id": "7", "int": "23" },
{ "port": "D", "id": "0", "int": "24" },
{ "port": "D", "id": "1", "int": "25" },
{ "port": "D", "id": "2", "int": "26" },
{ "port": "D", "id": "3", "int": "27" },
],
"extint":
[
{ "port": "C", "id": "1", "int": "0" },
{ "port": "C", "id": "2", "int": "1" },
],
"spi":
[
{ "port": "D", "id": "0", "name": "miso", "dir": "in" },
{ "port": "D", "id": "1", "name": "mosi", "dir": "out" },
{ "port": "D", "id": "3", "name": "sck", "dir": "out" },
{ "port": "C", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "C", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "C", "id": "3", "name": "txd", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "3", "name": "scl", "dir": "out" },
{ "port": "D", "id": "0", "name": "sda", "dir": "io" }
],
"uartspi": []
},
{
"devices": ["attiny1634"],
"pcint":
[
{ "port": "A", "id": "0", "int": "0" },
{ "port": "A", "id": "1", "int": "1" },
{ "port": "A", "id": "2", "int": "2" },
{ "port": "A", "id": "3", "int": "3" },
{ "port": "A", "id": "4", "int": "4" },
{ "port": "A", "id": "5", "int": "5" },
{ "port": "A", "id": "6", "int": "6" },
{ "port": "A", "id": "7", "int": "7" },
{ "port": "B", "id": "0", "int": "8" },
{ "port": "B", "id": "1", "int": "9" },
{ "port": "B", "id": "2", "int": "10" },
{ "port": "B", "id": "3", "int": "11" },
{ "port": "C", "id": "0", "int": "12" },
{ "port": "C", "id": "1", "int": "13" },
],
"extint":
[
{ "port": "C", "id": "2", "int": "0" },
],
"uart0":
[
{ "port": "A", "id": "7", "name": "rxd", "dir": "in" },
{ "port": "B", "id": "0", "name": "txd", "dir": "out" }
],
"usi":
[
{ "port": "B", "id": "1", "name": "usck", "dir": "out" },
{ "port": "B", "id": "2", "name": "do", "dir": "out" },
{ "port": "C", "id": "1", "name": "di", "dir": "in" }
],
"uartspi": []
},
{
"devices": ["attiny2313a", "attiny2313",
"attiny4313", "attiny4313a"],
"pcint":
[
{ "port": "A", "id": "0", "int": "8" },
{ "port": "A", "id": "1", "int": "9" },
{ "port": "A", "id": "2", "int": "10" },
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "D", "id": "0", "int": "11" },
{ "port": "D", "id": "1", "int": "12" },
{ "port": "D", "id": "2", "int": "13" },
{ "port": "D", "id": "3", "int": "14" },
{ "port": "D", "id": "4", "int": "15" },
{ "port": "D", "id": "5", "int": "16" },
{ "port": "D", "id": "6", "int": "17" },
],
"extint":
[
{ "port": "D", "id": "2", "int": "0" },
{ "port": "D", "id": "3", "int": "1" },
],
"uart0":
[
{ "port": "D", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "1", "name": "txd", "dir": "out" }
],
"usi":
[
{ "port": "B", "id": "7", "name": "usck", "dir": "out" },
{ "port": "B", "id": "6", "name": "do", "dir": "out" },
{ "port": "B", "id": "5", "name": "di", "dir": "in" }
],
"uartspi": []
},
# AT90 devices
{
"devices": ["at90can32", "at90can64", "at90can128"],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" }
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "B", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "E", "id": "0", "name": "rxd", "dir": "in" },
{ "port": "E", "id": "1", "name": "txd", "dir": "out" },
{ "port": "E", "id": "2", "name": "xck", "dir": "out" }
],
"uart1":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
],
"can":
[
{ "port": "D", "id": "6", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "5", "name": "txd", "dir": "out" }
]
},
{
"devices": ["at90pwm1",
"at90pwm2", "at90pwm2b", "at90pwm216"],
"extint":
[
{ "port": "B", "id": "6", "int": "0" },
{ "port": "D", "id": "2", "int": "1" },
{ "port": "D", "id": "5", "int": "2" }
],
"spi":
[
{ "port": "B", "id": "1", "name": "miso", "dir": "in" },
{ "port": "B", "id": "0", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "D", "id": "3", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "4", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "0", "name": "xck", "dir": "out" }
]
},
{
"devices": ["at90pwm3", "at90pwm3b", "at90pwm316"],
"extint":
[
{ "port": "B", "id": "6", "int": "0" },
{ "port": "D", "id": "2", "int": "1" },
{ "port": "D", "id": "5", "int": "2" },
{ "port": "C", "id": "0", "int": "3" }
],
"spi":
[
{ "port": "B", "id": "1", "name": "miso", "dir": "in" },
{ "port": "B", "id": "0", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "7", "name": "sck", "dir": "out" },
{ "port": "D", "id": "3", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "4", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "0", "name": "xck", "dir": "out" }
]
},
{
"devices": ["at90pwm81", "at90pwm161"],
"extint":
[
{ "port": "B", "id": "2", "int": "0" },
{ "port": "B", "id": "5", "int": "1" },
{ "port": "E", "id": "0", "int": "2" }
],
"spi":
[
{ "port": "B", "id": "4", "name": "miso", "dir": "in" },
{ "port": "B", "id": "6", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "5", "name": "sck", "dir": "out" },
{ "port": "D", "id": "0", "name": "ss", "dir": "out" }
]
},
{
"devices": ["at90usb82", "at90usb162"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" },
{ "port": "C", "id": "6", "int": "8" },
{ "port": "C", "id": "5", "int": "9" },
{ "port": "C", "id": "4", "int": "10" },
{ "port": "C", "id": "2", "int": "11" },
{ "port": "D", "id": "5", "int": "12" }
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "C", "id": "7", "int": "4" },
{ "port": "D", "id": "4", "int": "5" },
{ "port": "D", "id": "6", "int": "6" },
{ "port": "D", "id": "7", "int": "7" }
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "D", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"uartspi": []
},
{
"devices": ["at90usb646", "at90usb1286",
"at90usb647", "at90usb1287"],
"pcint":
[
{ "port": "B", "id": "0", "int": "0" },
{ "port": "B", "id": "1", "int": "1" },
{ "port": "B", "id": "2", "int": "2" },
{ "port": "B", "id": "3", "int": "3" },
{ "port": "B", "id": "4", "int": "4" },
{ "port": "B", "id": "5", "int": "5" },
{ "port": "B", "id": "6", "int": "6" },
{ "port": "B", "id": "7", "int": "7" }
],
"extint":
[
{ "port": "D", "id": "0", "int": "0" },
{ "port": "D", "id": "1", "int": "1" },
{ "port": "D", "id": "2", "int": "2" },
{ "port": "D", "id": "3", "int": "3" },
{ "port": "E", "id": "4", "int": "4" },
{ "port": "E", "id": "5", "int": "5" },
{ "port": "E", "id": "6", "int": "6" },
{ "port": "E", "id": "7", "int": "7" }
],
"spi":
[
{ "port": "B", "id": "2", "name": "miso", "dir": "in" },
{ "port": "B", "id": "3", "name": "mosi", "dir": "out" },
{ "port": "B", "id": "1", "name": "sck", "dir": "out" },
{ "port": "D", "id": "0", "name": "ss", "dir": "out" }
],
"uart0":
[
{ "port": "D", "id": "2", "name": "rxd", "dir": "in" },
{ "port": "D", "id": "3", "name": "txd", "dir": "out" },
{ "port": "D", "id": "5", "name": "xck", "dir": "out" }
],
"i2c":
[
{ "port": "D", "id": "0", "name": "scl", "dir": "out" },
{ "port": "D", "id": "1", "name": "sda", "dir": "io" }
],
"uartspi": []
}
]
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Mon Mar 15 19:43:26 2021
@author: llothar
"""
from sens_tape import tape
import pandas as pd
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
plt.style.use(['science','no-latex'])  # requires the SciencePlots package
data = pd.read_csv('f9ad.csv')
drops = ['Unnamed: 0', 'Unnamed: 0.1', 'RHX_RT unitless', 'Pass Name unitless',
'nameWellbore', 'name','RGX_RT unitless',
'MWD Continuous Azimuth dega']
dfs = data.iloc[2000:10000]
index = 'Measured Depth m'
target = 'MWD Continuous Inclination dega'
fig, axs = plt.subplots(1, 2, sharey=True, figsize=(8,3))
index_dr = np.diff(dfs[index])
index_mean = np.mean(index_dr)
index_std = np.std(index_dr)
index_maxgap = np.max(index_dr)
h = 5
x = np.arange(np.min(dfs[index].to_numpy()),
np.max(dfs[index].to_numpy()),
index_maxgap*h)
from sklearn.neighbors import RadiusNeighborsRegressor
# raw = dfs['MWD Continuous Inclination dega'].interpolate().ffill().bfill().to_numpy()
# reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
# y = reg.predict(x.reshape(-1,1))
# plt.xlim(650,700)
# plt.plot(x,y)
# plt.plot(dfs[index].to_numpy(),raw)
# plt.show()
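The commented-out block above sketches the resampling idea; a minimal, self-contained version of the same pattern on synthetic data (a noisy sine on an irregular grid, not the f9ad.csv log) compares the two regressors used throughout this script:

```python
# Minimal sketch (synthetic data, hypothetical signal): resample an
# irregularly sampled series onto a uniform grid whose step is the
# largest gap in the index, with both neighbour regressors.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor, RadiusNeighborsRegressor

rng = np.random.default_rng(0)
x_raw = np.sort(rng.uniform(0, 10, 200))            # irregular sample positions
y_raw = np.sin(x_raw) + 0.1 * rng.standard_normal(200)

maxgap = np.max(np.diff(x_raw))                     # largest gap in the index
x_grid = np.arange(x_raw.min(), x_raw.max(), maxgap)

knn = KNeighborsRegressor(n_neighbors=5, weights='uniform')
rnr = RadiusNeighborsRegressor(radius=2 * maxgap, weights='distance')
y_knn = knn.fit(x_raw.reshape(-1, 1), y_raw).predict(x_grid.reshape(-1, 1))
y_rnr = rnr.fit(x_raw.reshape(-1, 1), y_raw).predict(x_grid.reshape(-1, 1))
```

Choosing the radius as a multiple of the largest index gap (as done below with `index_maxgap`) guarantees every grid point has at least one neighbour, so the RNR prediction never returns NaN.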
reg = RadiusNeighborsRegressor(radius=index_maxgap*1, weights='uniform')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[0].plot(x,y, c='blue', linewidth=1, label='r = 1 max step', linestyle="-")
reg = RadiusNeighborsRegressor(radius=index_maxgap*20, weights='uniform')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[0].plot(x,y, c='black', linewidth=1, label='r = 20 max step', linestyle="-")
reg = RadiusNeighborsRegressor(radius=index_maxgap*100, weights='uniform')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[0].plot(x,y, c='black', linewidth=1, label='r = 100 max step', linestyle="--")
raw_x = dfs[index].to_numpy()
axs[0].plot(raw_x,raw, c='red', linestyle=':', label='raw data')
axs[0].grid()
plt.tight_layout()
axs[0].set_xlim(650,690)
plt.ylim(0,60)
axs[0].legend()
axs[0].set_title('Uniform weight')
axs[0].set_ylabel('Rate of Penetration [m/h]')
axs[0].set_xlabel('Measured Depth [m]')
reg = RadiusNeighborsRegressor(radius=index_maxgap*1, weights='distance')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[1].plot(x,y, c='blue', linewidth=1, label='r = 1 max step', linestyle="-")
reg = RadiusNeighborsRegressor(radius=index_maxgap*20, weights='distance')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[1].plot(x,y, c='black', linewidth=1, label='r = 20 max step', linestyle="-")
reg = RadiusNeighborsRegressor(radius=index_maxgap*100, weights='distance')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[1].plot(x,y, c='black', linewidth=1, label='r = 100 max step', linestyle="--")
raw_x = dfs[index].to_numpy()
axs[1].plot(raw_x,raw, c='red', linestyle=':', label='raw data')
axs[1].grid()
plt.tight_layout()
axs[1].set_xlim(650,690)
plt.ylim(0,60)
axs[1].legend()
axs[1].set_title('Distance weight')
axs[1].set_xlabel('Measured Depth [m]')
plt.savefig('resampling_radius_rnr.pdf')
plt.show()
#%%
data = pd.read_csv('f9ad.csv')
drops = ['Unnamed: 0', 'Unnamed: 0.1', 'RHX_RT unitless', 'Pass Name unitless',
'nameWellbore', 'name','RGX_RT unitless',
'MWD Continuous Azimuth dega']
dfs = data.iloc[2000:10000]
index = 'Measured Depth m'
target = 'MWD Continuous Inclination dega'
fig, axs = plt.subplots(1, 2, sharey=True, figsize=(8,3))
index_dr = np.diff(dfs[index])
index_mean = np.mean(index_dr)
index_std = np.std(index_dr)
index_maxgap = np.max(index_dr)
h = 5
x = np.arange(np.min(dfs[index].to_numpy()),
np.max(dfs[index].to_numpy()),
index_maxgap*h)
from sklearn.neighbors import KNeighborsRegressor
# raw = dfs['MWD Continuous Inclination dega'].interpolate().ffill().bfill().to_numpy()
# reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
# y = reg.predict(x.reshape(-1,1))
# plt.xlim(650,700)
# plt.plot(x,y)
# plt.plot(dfs[index].to_numpy(),raw)
# plt.show()
reg = KNeighborsRegressor(n_neighbors=1, weights='uniform')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[0].plot(x,y, c='blue', linewidth=1, label='K = 1', linestyle="-")
reg = KNeighborsRegressor(n_neighbors=20, weights='uniform')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[0].plot(x,y, c='black', linewidth=1, label='K = 20', linestyle="-")
reg = KNeighborsRegressor(n_neighbors=100, weights='uniform')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[0].plot(x,y, c='black', linewidth=1, label='K = 100', linestyle="--")
raw_x = dfs[index].to_numpy()
axs[0].plot(raw_x,raw, c='red', linestyle=':', label='raw data')
axs[0].grid()
plt.tight_layout()
axs[0].set_xlim(650,690)
plt.ylim(0,60)
axs[0].legend()
axs[0].set_title('Uniform weight')
axs[0].set_ylabel('Rate of Penetration [m/h]')
axs[0].set_xlabel('Measured Depth [m]')
reg = KNeighborsRegressor(n_neighbors=1, weights='distance')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[1].plot(x,y, c='blue', linewidth=1, label='K = 1', linestyle="-")
reg = KNeighborsRegressor(n_neighbors=20, weights='distance')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[1].plot(x,y, c='black', linewidth=1, label='K = 20', linestyle="-")
reg = KNeighborsRegressor(n_neighbors=100, weights='distance')
raw = dfs['Rate of Penetration m/h'].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
y = reg.predict(x.reshape(-1,1))
axs[1].plot(x,y, c='black', linewidth=1, label='K = 100', linestyle="--")
raw_x = dfs[index].to_numpy()
axs[1].plot(raw_x,raw, c='red', linestyle=':', label='raw data')
axs[1].grid()
plt.tight_layout()
axs[1].set_xlim(650,690)
plt.ylim(0,60)
axs[1].legend()
axs[1].set_title('Distance weight')
axs[1].set_xlabel('Measured Depth [m]')
plt.savefig('resampling_radius_knn.pdf')
plt.show()
#%%
from shapely.geometry import Polygon
from shapely.geometry import LineString
from shapely.ops import unary_union, polygonize
data = pd.read_csv('f9ad.csv')
drops = ['Unnamed: 0',
'Pass Name unitless',
'MWD Magnetic Toolface dega',
'nameWellbore',
'name',
'IMP/ARC Attenuation Conductivity 40-in. at 2 MHz mS/m',
'ARC Annular Pressure kPa',
'MWD Collar RPM rpm',
'IMP/ARC Non-BHcorr Phase-Shift Resistivity 28-in. at 2 MHz ohm.m',
'IMP/ARC Phase-Shift Conductivity 40-in. at 2 MHz mS/m',
'Annular Temperature degC',
'IMP/ARC Non-BHcorr Phase-Shift Resistivity 40-in. at 2 MHz ohm.m',
'ARC Gamma Ray (BH corrected) gAPI',
'IMP/ARC Non-BHcorr Attenuation Resistivity 40-in. at 2 MHz ohm.m',
'MWD Stick-Slip PKtoPK RPM rpm',
'IMP/ARC Non-BHcorr Attenuation Resistivity 28-in. at 2 MHz ohm.m',
'IMP/ARC Phase-Shift Conductivity 28-in. at 2 MHz mS/m'
]
data = data.drop(drops, axis=1)
dfs = data.iloc[2000:10000]
index = 'Measured Depth m'
target = 'Rate of Penetration m/h' #'MWD Continuous Inclination dega'
index_dr = np.diff(dfs[index])
index_mean = np.mean(index_dr)
index_std = np.std(index_dr)
index_maxgap = np.max(index_dr)
h = 5
data_x = np.arange(np.min(dfs[index].to_numpy()),
np.max(dfs[index].to_numpy()),
index_maxgap*h)
#%%
for target in list(data):
# try:
areas = []
samples = np.arange(1,200,10)
weightss = ['uniform', 'distance']
for weights in weightss:
areas = []
for i in samples:
reg = RadiusNeighborsRegressor(radius=index_maxgap*i, weights=weights)
raw = dfs[target].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
data_y = reg.predict(data_x.reshape(-1,1))
x_y_curve1 = np.rot90([data_x,data_y])
x_y_curve2 = np.rot90([dfs[index].to_numpy(), raw])
            polygon_points = [] # an empty list that will collect the polygon vertices
for xyvalue in x_y_curve1:
polygon_points.append([xyvalue[0],xyvalue[1]]) #append all xy points for curve 1
for xyvalue in x_y_curve2[::-1]:
polygon_points.append([xyvalue[0],xyvalue[1]]) #append all xy points for curve 2 in the reverse order (from last point to first point)
for xyvalue in x_y_curve1[0:1]:
                polygon_points.append([xyvalue[0],xyvalue[1]]) # append the first point of curve 1 again so the polygon "closes"
polygon = Polygon(polygon_points)
area = polygon.area
x,y = polygon.exterior.xy
# original data
ls = LineString(np.c_[x, y])
# closed, non-simple
lr = LineString(ls.coords[:] + ls.coords[0:1])
lr.is_simple # False
mls = unary_union(lr)
            mls.geom_type # 'MultiLineString'
Area_cal =[]
for polygon in polygonize(mls):
Area_cal.append(polygon.area)
Area_poly = (np.asarray(Area_cal).sum())
areas.append(Area_poly)
plt.plot(samples,areas, label=f'RNR, {weights}')
from sklearn.neighbors import KNeighborsRegressor
ks = np.arange(1,200,10)
for weights in weightss:
areas = []
for i in ks:
reg = KNeighborsRegressor(n_neighbors=i, weights=weights)
raw = dfs[target].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
data_y = reg.predict(data_x.reshape(-1,1))
x_y_curve1 = np.rot90([data_x,data_y])
x_y_curve2 = np.rot90([dfs[index].to_numpy(), raw])
            polygon_points = [] # an empty list that will collect the polygon vertices
for xyvalue in x_y_curve1:
polygon_points.append([xyvalue[0],xyvalue[1]]) #append all xy points for curve 1
for xyvalue in x_y_curve2[::-1]:
polygon_points.append([xyvalue[0],xyvalue[1]]) #append all xy points for curve 2 in the reverse order (from last point to first point)
for xyvalue in x_y_curve1[0:1]:
                polygon_points.append([xyvalue[0],xyvalue[1]]) # append the first point of curve 1 again so the polygon "closes"
polygon = Polygon(polygon_points)
area = polygon.area
x,y = polygon.exterior.xy
# original data
ls = LineString(np.c_[x, y])
# closed, non-simple
lr = LineString(ls.coords[:] + ls.coords[0:1])
lr.is_simple # False
mls = unary_union(lr)
            mls.geom_type # 'MultiLineString'
Area_cal =[]
for polygon in polygonize(mls):
Area_cal.append(polygon.area)
Area_poly = (np.asarray(Area_cal).sum())
areas.append(Area_poly)
plt.plot(ks,areas, label=f'KNN, {weights}')
plt.legend()
plt.title(target)
plt.grid()
plt.show()
# except:
# print(f'{target} failed for some reason')
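The loop above scores each resampling by the area enclosed between the resampled and raw curves. A tiny self-contained check of that polygon construction, on two straight lines whose enclosed area is exactly 1:

```python
# Area-between-curves sketch: curve 1 is y = 0, curve 2 is y = 1, both on
# x in [0, 1], so the enclosed area is exactly 1.0.  The ring walks curve 1
# forward, curve 2 backward, then closes on the first point, mirroring the
# polygon construction used in the loop above.
import numpy as np
from shapely.geometry import LineString
from shapely.ops import unary_union, polygonize

x = np.linspace(0.0, 1.0, 5)
curve1 = np.c_[x, np.zeros_like(x)]   # y = 0
curve2 = np.c_[x, np.ones_like(x)]    # y = 1

ring = np.vstack([curve1, curve2[::-1], curve1[:1]])
mls = unary_union(LineString(ring))   # nodes self-intersections (a no-op here)
area = sum(p.area for p in polygonize([mls]))
print(area)  # 1.0
```

The `unary_union` + `polygonize` step matters for real curves: when the two curves cross, the naive ring self-intersects and `Polygon.area` cancels signed lobes, while polygonizing the noded ring sums their absolute areas.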
#%%
# no poly version
def myr2(x,y,data_x, data_y):
try:
x1 = np.max(data_x[data_x < x])
x2 = np.min(data_x[data_x > x])
loc1 = np.where(data_x == x1)
loc2 = np.where(data_x == x2)
y1 = data_y[loc1][-1]
y2 = data_y[loc2][0]
m = (y1-y2)/(x1-x2)
b = (x1*y2 - x2*y1)/(x1-x2)
y_inter = m * x + b
return np.power(y-y_inter, 2)
    except Exception:
        # bracketing points not found (x outside the sampled range)
        return 0
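`myr2` above scores a resampled point by its squared gap to the piecewise-linear interpolant of the raw data. A tiny worked check, with the interpolation re-defined inline so the snippet stands alone: raw data on the line y = 2x gives an interpolated value of 1 at x = 0.5, so the point (0.5, 3) should score (3 - 1)^2 = 4.

```python
# Self-contained copy of the squared-error-to-linear-interpolant logic in
# myr2: bracket x with the nearest raw samples, interpolate linearly,
# return the squared deviation.
import numpy as np

def sq_err_to_interp(x, y, data_x, data_y):
    x1 = np.max(data_x[data_x < x])          # nearest raw sample below x
    x2 = np.min(data_x[data_x > x])          # nearest raw sample above x
    y1 = data_y[np.where(data_x == x1)][-1]
    y2 = data_y[np.where(data_x == x2)][0]
    m = (y1 - y2) / (x1 - x2)
    b = (x1 * y2 - x2 * y1) / (x1 - x2)
    return (y - (m * x + b)) ** 2

data_x = np.array([0.0, 1.0, 2.0])
data_y = 2.0 * data_x                        # raw data on y = 2x
print(sq_err_to_interp(0.5, 3.0, data_x, data_y))  # 4.0
```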
n = 0
for target in list(data):
# try:
plt.figure(figsize=(5,5))
areas = []
samples = np.arange(1,31,1)
weightss = ['uniform', 'distance']
for weights in weightss:
areas = []
for i in samples:
reg = RadiusNeighborsRegressor(radius=index_maxgap*i, weights=weights)
raw = dfs[target].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
data_y = reg.predict(data_x.reshape(-1,1))
totals = []
for row in np.rot90([data_x,data_y]):
x = row[0]
y = row[1]
totals.append(myr2(x,y,dfs[index].to_numpy(), raw))
Area_poly = np.power((np.sum(totals)/len(totals)),0.5)
areas.append(Area_poly)
plt.plot(samples,areas, label=f'RNR, {weights}')
from sklearn.neighbors import KNeighborsRegressor
ks = np.arange(1,31,1)
for weights in weightss:
areas = []
for i in ks:
reg = KNeighborsRegressor(n_neighbors=i, weights=weights)
raw = dfs[target].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
data_y = reg.predict(data_x.reshape(-1,1))
totals = []
for row in np.rot90([data_x,data_y]):
x = row[0]
y = row[1]
totals.append(myr2(x,y,dfs[index].to_numpy(), raw))
Area_poly = np.power((np.sum(totals)/len(totals)),0.5)
areas.append(Area_poly)
plt.plot(ks,areas, label=f'KNN, {weights}')
    plt.xlabel('K / radius multiplier')
plt.ylabel('Error [RMS]')
plt.legend()
plt.title(target)
plt.grid()
plt.yscale('log')
plt.savefig(f'{n}.pdf')
n += 1
plt.show()
# except:
# print(f'{target} failed for some reason')
#%%
# no poly version, Riemann squared
def myr2multi(x_start, y_start, x_stop, y_stop, data_x, data_y, res):
try:
loc_results = []
x_range = np.linspace(x_start, x_stop, res+1)[:-1]
y_range = np.linspace(y_start, y_stop, res+1)[:-1]
for i in range(res):
x = x_range[i]
y = y_range[i]
x1 = np.max(data_x[data_x <= x])
x2 = np.min(data_x[data_x > x])
loc1 = np.where(data_x == x1)
loc2 = np.where(data_x == x2)
y1 = data_y[loc1][-1]
y2 = data_y[loc2][0]
m = (y1-y2)/(x1-x2)
b = (x1*y2 - x2*y1)/(x1-x2)
y_inter = m * x + b
loc_results.append(np.power(y-y_inter, 2))
return loc_results
    except Exception:
        # bracketing points not found (x outside the sampled range)
        print('oops')
        return 0
n = 0
res = 10
global_results = []
colors = ['red', 'green', 'blue', 'black']
linestyles = ['-','--', '-.', ':']
for target in list(data):
# try:
c = 0
local_result = [[],[],[],[]]
plt.figure(figsize=(4,4))
areas = []
samples = np.arange(1,31,1)
weightss = ['uniform', 'distance']
plotno = 0
for weights in weightss:
areas = []
for i in samples:
reg = RadiusNeighborsRegressor(radius=index_maxgap*i, weights=weights)
raw = dfs[target].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
data_y = reg.predict(data_x.reshape(-1,1))
totals = []
newdata = np.rot90([data_x,data_y])
for i in range(1,len(newdata)):
x_start = newdata[i-1][0]
y_start = newdata[i-1][1]
x_stop = newdata[i][0]
y_stop = newdata[i][1]
result = myr2multi(x_start, y_start,
x_stop, y_stop,
dfs[index].to_numpy(), raw,
res)
totals.append(result) # added /np.mean(raw)
totals = np.asarray(totals)
Area_poly = np.power((np.sum(totals)/totals.size),0.5)
areas.append(Area_poly)
plt.plot(samples,areas, label=f'RNR\n{weights}',
c = colors[c], linestyle = linestyles[c],linewidth=1.5 )
c += 1
local_result[plotno] = areas
plotno += 1
from sklearn.neighbors import KNeighborsRegressor
ks = np.arange(1,31,1)
for weights in weightss:
areas = []
for i in ks:
reg = KNeighborsRegressor(n_neighbors=i, weights=weights)
raw = dfs[target].interpolate().ffill().bfill().to_numpy()
reg.fit(dfs[index].to_numpy().reshape(-1,1),raw)
data_y = reg.predict(data_x.reshape(-1,1))
totals = []
newdata = np.rot90([data_x,data_y])
for i in range(1,len(newdata)):
x_start = newdata[i-1][0]
y_start = newdata[i-1][1]
x_stop = newdata[i][0]
y_stop = newdata[i][1]
totals.append(myr2multi(x_start, y_start,
x_stop, y_stop,
dfs[index].to_numpy(), raw,
res))
totals = np.asarray(totals)
Area_poly = np.power((np.sum(totals)/totals.size),0.5)
areas.append(Area_poly)
plt.plot(ks,areas, label=f'KNN\n{weights}',
c = colors[c], linestyle = linestyles[c],linewidth=1.5 )
c += 1
local_result[plotno] = areas
plotno += 1
local_result = local_result/np.min(local_result)
global_results.append(local_result)
    plt.xlabel('neighbor count / radius multiplier')
plt.ylabel('error [RMRS]')
plt.legend(bbox_to_anchor=(1.05, 1), loc='upper left')
plt.title(target)
plt.grid()
plt.yscale('log')
plt.savefig(f'multi_{n}.pdf')
n += 1
plt.show()
plt.plot(local_result[0])
plt.plot(local_result[1])
plt.plot(local_result[2])
plt.plot(local_result[3])
plt.show()
# except:
# print(f'{target} failed for some reason')
np.save('global_results.npy', global_results)
#%%
global_results = np.load('global_results.npy')
plt.figure(figsize=(4,4))
global_results = np.asarray(global_results)
methods_plot = [
'RNR\nuniform',
'RNR\ndistance',
'KNN\nuniform',
'KNN\ndistance'
]
colors = ['red', 'green', 'blue', 'black']
linestyles = ['-','--', '-.', ':']
for i in range(4):
plt.plot(np.nanmean(global_results[:,i,:], axis=0), label=methods_plot[i],
c=colors[i], linewidth=1.5, linestyle = linestyles[i])
plt.legend(bbox_to_anchor=(1.05, 1), loc='upper left')
plt.yscale('log')
ymax = 3.1
plt.yticks(np.arange(1,ymax,0.2), np.arange(100,ymax*100,20).astype(int))
plt.grid()
plt.xlabel('neighbor count / radius multiplier')
plt.ylabel('RMRS error compared\nto best selection [%]')
plt.ylim(1,3)
plt.xticks(np.arange(-1,31,5), np.arange(0,32,5))
plt.savefig('algocompare.pdf')
#%%
plt.figure(figsize=(5,4))
plt.rc('axes', axisbelow=True)
plt.grid(linewidth=1, color='gray')
x = np.arange(1,101,1)
y = 1/x
import matplotlib
cmap = matplotlib.colormaps['hsv']  # cm.get_cmap was removed in Matplotlib 3.9
n = 15
for i in range(n+1):
for j in range(i):
if i == n:
plt.bar(x=i,
height=y[j]/np.sum(y[:i]),
bottom=np.sum(y[:j])/np.sum(y[:i]),
color = cmap(j/(n+1)),
label=f'd = {j+1}',
edgecolor='black')
else:
plt.bar(x=i,
height=y[j]/np.sum(y[:i]),
bottom=np.sum(y[:j])/np.sum(y[:i]),
color = cmap(j/(n+2)),
edgecolor='black')
plt.xlim(0,n+1)
plt.xticks(np.arange(1,n+1,1), rotation=90)
plt.yticks(np.linspace(0,1,11), np.linspace(0,100,11).astype(int))
plt.legend(bbox_to_anchor=(1.05, 1), loc='upper left')
plt.xlabel('Radius Neighbour Regressor radius limit')
plt.ylabel('Datapoint weights, percent')
plt.tight_layout()
#plt.grid()
plt.savefig('Cumulative weights.pdf')
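The stacked bars above show, for each radius limit i, the share of total weight that a distance-weighted neighbourhood assigns to the point at distance d = j+1, i.e. (1/d) normalized by the sum of 1/d' over d' = 1..i. A quick sketch of that normalization:

```python
# Normalized inverse-distance weights for a radius limit of i points:
# w_d = (1/d) / sum_{d'=1..i} (1/d').  The shares sum to 100 % and decay
# with distance, which is what the stacked bars visualize.
import numpy as np

i = 15
d = np.arange(1, i + 1)
weights = (1.0 / d) / np.sum(1.0 / d)
assert np.isclose(weights.sum(), 1.0)
```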
#%%
plt.figure(figsize=(5,4))
plt.rc('axes', axisbelow=True)
plt.grid(linewidth=1, color='gray')
x = np.ones(100)
y = x
import matplotlib
cmap = matplotlib.colormaps['hsv']  # cm.get_cmap was removed in Matplotlib 3.9
n = 15
for i in range(n+1):
for j in range(i):
if i == n:
plt.bar(x=i,
height=y[j]/np.sum(y[:i]),
bottom=np.sum(y[:j])/np.sum(y[:i]),
color = cmap(j/(n+1)),
label=f'd = {j+1}',
edgecolor='black')
else:
plt.bar(x=i,
height=y[j]/np.sum(y[:i]),
bottom=np.sum(y[:j])/np.sum(y[:i]),
color = cmap(j/(n+2)),
edgecolor='black')
plt.xlim(0,n+1)
plt.xticks(np.arange(1,n+1,1), rotation=90)
plt.yticks(np.linspace(0,1,11), np.linspace(0,100,11).astype(int))
plt.legend(bbox_to_anchor=(1.05, 1), loc='upper left')
plt.xlabel('Radius Neighbour Regressor radius limit')
plt.ylabel('Datapoint weights, percent')
plt.tight_layout()
plt.savefig('Cumulative weights2.pdf')
#%%
import glob
filelist = (glob.glob("full_log*.npy"))
data_array = []
for file in filelist:
data = np.load(file, allow_pickle=True)
if len(data) > 0:
data_array.append(data)
data = np.vstack(data_array)
plt.figure(figsize=(4,2.5))
df = pd.DataFrame(data=data, columns=["method", "n", "param"])
df['n'] = df['n'].astype(int)
methods = ['KNN uniform', 'KNN distance', 'RNR uniform', 'RNR distance']
ns = np.arange(1,11,1)
ms = np.arange(0,4,1)
summary = []
for m in ms:
for n in ns:
dft = df[df['method'] == methods[m]]
dft = dft[dft['n'] == n]
summary.append([m,n,len(dft)])
summary = np.asarray(summary)
methods_plot = ['KNN\nuniform',
'KNN\ndistance',
'RNR\nuniform',
'RNR\ndistance']
scaler = 1.5
plt.scatter(x=summary[:,1], y=summary[:,0], s=summary[:,2]*scaler, c='steelblue')
plt.xticks(ns)
plt.yticks(ms, methods_plot)
sizes = np.arange(100,401,100)
sizes = np.hstack((1,sizes))
for s in sizes:
plt.scatter([],[],s=s*scaler, c='steelblue', label=f'{s}\n ')
plt.legend(title='winner count', bbox_to_anchor=(1.0, 1), loc='upper left')
plt.xlabel('Neighbor count / Radius multiplier')
#%%
import glob
filelist = (glob.glob("simann*.npy"))
data_array = []
for file in filelist:
data = np.load(file, allow_pickle=True)
if len(data) > 0:
data_array.append(data)
data = np.vstack(data_array)
plt.figure(figsize=(4,2.5))
df = pd.DataFrame(data=data, columns=["method", "n", "param"])
df['n'] = df['n'].astype(int)
methods = ['KNN uniform', 'KNN distance', 'RNR uniform', 'RNR distance']
ns = np.arange(1,11,1)
ms = np.arange(0,4,1)
summary = []
for m in ms:
for n in ns:
dft = df[df['method'] == methods[m]]
dft = dft[dft['n'] == n]
summary.append([m,n,len(dft)])
summary = np.asarray(summary)
methods_plot = ['KNN\nuniform',
'KNN\ndistance',
'RNR\nuniform',
'RNR\ndistance']
scaler = 1
plt.scatter(x=summary[:,1], y=summary[:,0], s=summary[:,2]*scaler,
c='steelblue', linewidth=0.5, edgecolors='black')
plt.xticks(ns)
plt.yticks(ms, methods_plot)
sizes = np.arange(100,1001,300)
sizes = np.hstack((1,sizes))
for s in sizes:
plt.scatter([],[],s=s*scaler, c='steelblue', label=f'{s}\n '
,linewidth=0.5, edgecolors='black')
plt.legend(title='winner count', bbox_to_anchor=(1.0, 1), loc='upper left')
plt.xlabel('Neighbor count / Radius multiplier')
plt.savefig('riemann.pdf')
#%%
hs = [1,2,3,4,5,6,8,10,12,15,20,30]
for h in hs:
res = np.load(f'resh{h}.npy', allow_pickle=True)
data = np.vstack(res)
plt.figure(figsize=(6,2.5))
df = pd.DataFrame(data=data, columns=["method", "n", "param"])
df['n'] = df['n'].astype(int)
methods = ['KNN distance', 'KNN uniform', 'RNR distance', 'RNR uniform']
ns = np.arange(1,31,1)
ms = np.arange(0,4,1)
summary = []
for m in ms:
for n in ns:
dft = df[df['method'] == methods[m]]
dft = dft[dft['n'] == n]
summary.append([m,n,len(dft)])
summary = np.asarray(summary)
xsum = []
for i in range(4):
xsum.append(np.sum(summary[summary[:,0]==i][:,2]))
methods_plot = [f'KNN\ndistance ({xsum[0]})',
f'KNN\nuniform ({xsum[1]})',
f'RNR\ndistance ({xsum[2]})',
f'RNR\nuniform ({xsum[3]})']
scaler = 4
plt.scatter(x=summary[:,1], y=summary[:,0], s=summary[:,2]*scaler,
c='steelblue', linewidth=0.5, edgecolors='black')
plt.xticks(ns)
plt.yticks(ms, methods_plot)
plt.ylim(-0.5,3.5)
sizes = np.arange(10,51,10)
sizes = np.hstack((1,sizes))
for s in sizes:
plt.scatter([],[],s=s*scaler, c='steelblue', label=f'{s}'
,linewidth=0.5, edgecolors='black')
plt.legend(title='winner count', bbox_to_anchor=(1.0, 1), loc='upper left')
plt.xlabel('Neighbor count / Radius multiplier')
#plt.title(f'h-step {h}')
plt.xticks(ns[::1],ns[::1],rotation=90)
plt.xlim(0,20)
plt.grid()
plt.savefig(f'h-step {h}.pdf')
plt.show()
#%%
res = np.load(f'resh{5}.npy', allow_pickle=True)
data = np.vstack(res)
plt.figure(figsize=(6,2.5))
df = pd.DataFrame(data=data, columns=["method", "n", "param"])
df['n'] = df['n'].astype(int)
methods = ['KNN distance', 'KNN uniform', 'RNR distance', 'RNR uniform']
ns = np.arange(1,31,1)
ms = np.arange(0,4,1)
summary = []
for m in ms:
for n in ns:
dft = df[df['method'] == methods[m]]
dft = dft[dft['n'] == n]
summary.append([m,n,len(dft)])
summary = np.asarray(summary)
a = summary
print(a[a[:, 2].argsort()])
from __future__ import print_function, division
"""...
"""
import petsc4py.PETSc as petsc
from six.moves import range
from mrpy.mr_utils import mesh
import numpy as np
import math
# TODO: document where these prediction coefficients come from
coef = np.zeros(shape=(6, 5), dtype=float)  # np.float was removed in NumPy 1.24
coef[1, 0] = -1./8
coef[2, 0] = -22./128
coef[2, 1] = 3./128
coef[3, 0] = -201./1024
coef[3, 1] = 11./256
coef[3, 2] = -5./1024
coef[4, 0] = -3461./16384
coef[4, 1] = 949./16384
coef[4, 2] = -185./16384
coef[4, 3] = 35./32768
coef[5, 0] = -29011./131072
coef[5, 1] = 569./8192
coef[5, 2] = -4661./262144
coef[5, 3] = 49./16384
coef[5, 4] = -63./262144
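A quick sanity check on the table above (re-built inline so the snippet stands alone): row s of `coef` carries exactly s nonzero prediction weights for the stencil of width s, with signs alternating from negative.

```python
# Rebuild the prediction-coefficient table and check its structure:
# row s has s nonzero entries, alternating in sign starting negative.
import numpy as np

coef = np.zeros((6, 5))
coef[1, 0] = -1. / 8
coef[2, :2] = [-22. / 128, 3. / 128]
coef[3, :3] = [-201. / 1024, 11. / 256, -5. / 1024]
coef[4, :4] = [-3461. / 16384, 949. / 16384, -185. / 16384, 35. / 32768]
coef[5, :5] = [-29011. / 131072, 569. / 8192, -4661. / 262144,
               49. / 16384, -63. / 262144]

for s in range(1, 6):
    assert np.count_nonzero(coef[s]) == s
    assert np.all(np.sign(coef[s, :s]) == [(-1) ** (m + 1) for m in range(s)])
```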
def nonzero_list_add(tree, nonzero_list, level, index_x=0, index_y=0, index_z=0):
"""...
"""
index = mesh.z_curve_index(tree.dimension, level, index_x, index_y, index_z)
if index in tree.tree_nodes and tree.nisleaf[index]:
nonzero_list[-1] += 1
elif index in tree.tree_nodes and not tree.nisleaf[index]:
        # the node is internal: recurse into its children
children_number = len(tree.nchildren[index])
for child_index in tree.nchildren[index]:
i = tree.nindex_x[child_index]
j = tree.nindex_y[child_index]
k = tree.nindex_z[child_index]
nonzero_list_add(tree, nonzero_list, level+1, i, j, k)
elif index not in tree.tree_nodes:
#the parent of the node is a leaf; we need to compute the prediction value
s = tree.stencil_prediction
if tree.dimension == 1:
i = int(math.floor(index_x/2)) # parent index_x
p = index_x % 2
# One more nonzero element for the parent of the node
nonzero_list_add(tree, nonzero_list, level-1, i)
for m in range(1, s+1):
if mesh.bc_compatible_local_indexes(tree, level-1, i+m) is not None:
im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+m)
# One more nonzero element for every node used for the prediction
nonzero_list_add(tree, nonzero_list, level-1, im)
if mesh.bc_compatible_local_indexes(tree, level-1, i-m) is not None:
im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-m)
# One more nonzero element for every node used for the prediction
nonzero_list_add(tree, nonzero_list, level-1, im)
elif tree.dimension == 2:
i = int(math.floor(index_x/2)) # parent index_x
j = int(math.floor(index_y/2)) # parent index_y
p = index_x % 2
q = index_y % 2
# One more nonzero element for the parent of the node
nonzero_list_add(tree, nonzero_list, level-1, i, j)
for m in range(1, s+1):
if mesh.bc_compatible_local_indexes(tree, level-1, i+m, j) is not None:
im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+m, j)
# One more nonzero element for every node used for the prediction
nonzero_list_add(tree, nonzero_list, level-1, im, jm)
                if mesh.bc_compatible_local_indexes(tree, level-1, i-m, j) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-m, j)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j+m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+m)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j-m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-m)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm)
        elif tree.dimension == 3:
            i = int(math.floor(index_x/2))  # parent index_x
            j = int(math.floor(index_y/2))  # parent index_y
            k = int(math.floor(index_z/2))  # parent index_z
            p = index_x % 2
            q = index_y % 2
            r = index_z % 2
            # One more nonzero element for the parent of the node
            nonzero_list_add(tree, nonzero_list, level-1, i, j, k)
            for m in range(1, s+1):
                if mesh.bc_compatible_local_indexes(tree, level-1, i+m, j, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+m, j, k)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                if mesh.bc_compatible_local_indexes(tree, level-1, i-m, j, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-m, j, k)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j+m, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+m, k)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j-m, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-m, k)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j, k+m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j, k+m)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j, k-m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j, k-m)
                    # One more nonzero element for every node used for the prediction
                    nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k+b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k+b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k-b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k-b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k+b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k+b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k-b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k-b)
                        # One more nonzero element for every node used for the prediction
                        nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    for c in range(1, s+1):
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k+c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k+c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k+c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k+c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k-c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k-c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k-c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k-c)
                            # One more nonzero element for every node used for the prediction
                            nonzero_list_add(tree, nonzero_list, level-1, im, jm, km)
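The unrolled branches above (parent, ±m along each axis, mixed-sign pairs, and mixed-sign triples) together enumerate every coarse-level offset in a (2s+1)-wide cube around the parent node. As a sanity sketch, the same set can be generated with `itertools.product` (`stencil_offsets` is an illustrative helper, not a function from this codebase):

```python
from itertools import product

def stencil_offsets(s, dim):
    # Parent (all-zero offset) plus every neighbour within s cells per
    # axis -- exactly the offsets the unrolled if-branches visit.
    return set(product(range(-s, s + 1), repeat=dim))

print(len(stencil_offsets(1, 2)))  # 9  -> the 3x3 block used in 2D for s = 1
print(len(stencil_offsets(1, 3)))  # 27 -> the 3x3x3 block used in 3D
```

Counting the cube this way makes it easy to bound how many potential nonzeros a single predicted node can contribute.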
def matrix_add(tree, matrix, row, value, level, index_x=0, index_y=0, index_z=0, add_to_col=0):
    """
    Add ``value`` to ``matrix[row, col]``, where ``col`` corresponds to the
    node at ``(level, index_x, index_y, index_z)``. If the node is refined,
    the value is split evenly over its children; if the node does not exist
    (its parent is a leaf), the value is redistributed over the coarse-level
    nodes of the prediction stencil.
    """
    index = mesh.z_curve_index(tree.dimension, level, index_x, index_y, index_z)
    if index in tree.tree_nodes and tree.nisleaf[index]:
        col = tree.nindex_tree_leaves[index]
        col += add_to_col
        matrix.setValue(row, col, value, True)
        # matrix[row, col] = matrix[row, col] + value
    elif index in tree.tree_nodes and not tree.nisleaf[index]:
        # the node is refined: split the value evenly over its children
        children_number = len(tree.nchildren[index])
        for child_index in tree.nchildren[index]:
            i = tree.nindex_x[child_index]
            j = tree.nindex_y[child_index]
            k = tree.nindex_z[child_index]
            matrix_add(tree, matrix, row, value*(1./children_number), level+1, i, j, k, add_to_col)
    elif index not in tree.tree_nodes:
        # the parent of the node is a leaf; we need to compute the prediction value
        s = tree.stencil_prediction
        if tree.dimension == 1:
            i = int(math.floor(index_x/2))  # parent index_x
            p = index_x % 2
            matrix_add(tree, matrix, row, value*1., level-1, i, add_to_col=add_to_col)
            for m in range(1, s+1):
                if mesh.bc_compatible_local_indexes(tree, level-1, i+m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+m)
                    matrix_add(tree, matrix, row, value*(-1)**p*coef[s, m-1], level-1, im, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i-m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-m)
                    matrix_add(tree, matrix, row, value*(-1)*(-1)**p*coef[s, m-1], level-1, im, add_to_col=add_to_col)
        elif tree.dimension == 2:
            i = int(math.floor(index_x/2))  # parent index_x
            j = int(math.floor(index_y/2))  # parent index_y
            p = index_x % 2
            q = index_y % 2
            matrix_add(tree, matrix, row, value*1., level-1, i, j, add_to_col=add_to_col)
            for m in range(1, s+1):
                if mesh.bc_compatible_local_indexes(tree, level-1, i+m, j) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+m, j)
                    matrix_add(tree, matrix, row, value*(-1)**p*coef[s, m-1], level-1, im, jm, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i-m, j) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-m, j)
                    matrix_add(tree, matrix, row, value*(-1)*(-1)**p*coef[s, m-1], level-1, im, jm, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j+m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+m)
                    matrix_add(tree, matrix, row, value*(-1)**q*coef[s, m-1], level-1, im, jm, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j-m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-m)
                    matrix_add(tree, matrix, row, value*(-1)*(-1)**q*coef[s, m-1], level-1, im, jm, add_to_col=add_to_col)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, add_to_col=add_to_col)
        elif tree.dimension == 3:
            i = int(math.floor(index_x/2))  # parent index_x
            j = int(math.floor(index_y/2))  # parent index_y
            k = int(math.floor(index_z/2))  # parent index_z
            p = index_x % 2
            q = index_y % 2
            r = index_z % 2
            matrix_add(tree, matrix, row, value*1., level-1, i, j, k, add_to_col=add_to_col)
            for m in range(1, s+1):
                if mesh.bc_compatible_local_indexes(tree, level-1, i+m, j, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+m, j, k)
                    matrix_add(tree, matrix, row, value*(-1)**p*coef[s, m-1], level-1, im, jm, km, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i-m, j, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-m, j, k)
                    matrix_add(tree, matrix, row, value*(-1)*(-1)**p*coef[s, m-1], level-1, im, jm, km, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j+m, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+m, k)
                    matrix_add(tree, matrix, row, value*(-1)**q*coef[s, m-1], level-1, im, jm, km, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j-m, k) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-m, k)
                    matrix_add(tree, matrix, row, value*(-1)*(-1)**q*coef[s, m-1], level-1, im, jm, km, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j, k+m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j, k+m)
                    matrix_add(tree, matrix, row, value*(-1)**r*coef[s, m-1], level-1, im, jm, km, add_to_col=add_to_col)
                if mesh.bc_compatible_local_indexes(tree, level-1, i, j, k-m) is not None:
                    im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j, k-m)
                    matrix_add(tree, matrix, row, value*(-1)*(-1)**r*coef[s, m-1], level-1, im, jm, km, add_to_col=add_to_col)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+q)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k+b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k+b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j, k-b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j, k-b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(p+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k+b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(q+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k+b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k+b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(q+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j+a, k-b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(q+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
                    if mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k-b) is not None:
                        im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i, j-a, k-b)
                        matrix_add(tree, matrix, row, value*(-1)*(-1)**(q+r)*coef[s, a-1]*coef[s, b-1], level-1, im, jm, km, add_to_col=add_to_col)
            for a in range(1, s+1):
                for b in range(1, s+1):
                    for c in range(1, s+1):
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k+c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k+c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k+c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k+c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k+c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j+b, k-c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j+b, k-c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i+a, j-b, k-c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
                        if mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k-c) is not None:
                            im, jm, km = mesh.bc_compatible_local_indexes(tree, level-1, i-a, j-b, k-c)
                            matrix_add(tree, matrix, row, value*(-1)**(p+q+r)*coef[s, a-1]*coef[s, b-1]*coef[s, c-1], level-1, im, jm, km, add_to_col=add_to_col)
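The dyadic index arithmetic used throughout `matrix_add` can be sketched in isolation (`parent_and_parity` is an illustrative helper, not part of the source): fine-level children `2i` and `2i+1` share the coarse parent `i`, and the parity selects the sign `(-1)**p` applied to each stencil correction.

```python
import math

def parent_and_parity(index_x):
    i = int(math.floor(index_x / 2))  # parent index, as computed above
    p = index_x % 2                   # 0: even child, 1: odd child
    return i, p

# Siblings share a parent but receive opposite-sign corrections:
assert parent_and_parity(6) == (3, 0)
assert parent_and_parity(7) == (3, 1)
assert (-1) ** parent_and_parity(6)[1] == -((-1) ** parent_and_parity(7)[1])
```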
# ---- bind_preference_grammar/BindPreferenceGrammarVisitor.py ----
# repo: jacopoMauro/abs_deployer (license: 0BSD), 4,397 bytes

# Generated from BindPreferenceGrammar.g4 by ANTLR 4.7
from antlr4 import *


# This class defines a complete generic visitor for a parse tree produced by BindPreferenceGrammarParser.

class BindPreferenceGrammarVisitor(ParseTreeVisitor):

    # Visit a parse tree produced by BindPreferenceGrammarParser#statement.
    def visitStatement(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#ApreferenceLocal.
    def visitApreferenceLocal(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#ApreferenceExpr.
    def visitApreferenceExpr(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#b_expr.
    def visitB_expr(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#b_term.
    def visitB_term(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#b_factor.
    def visitB_factor(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#relation.
    def visitRelation(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#expr.
    def visitExpr(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AexprQuantifier.
    def visitAexprQuantifier(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AexprInt.
    def visitAexprInt(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AexprBind.
    def visitAexprBind(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AexprSum.
    def visitAexprSum(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AexprUnaryArithmetic.
    def visitAexprUnaryArithmetic(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AexprBrackets.
    def visitAexprBrackets(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AobjIDID.
    def visitAobjIDID(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AobjIDVar.
    def visitAobjIDVar(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AobjIDScenario.
    def visitAobjIDScenario(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#AobjIDRE.
    def visitAobjIDRE(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#typeV.
    def visitTypeV(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#bool_binary_op.
    def visitBool_binary_op(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#arith_binary_op.
    def visitArith_binary_op(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#arith_unary_op.
    def visitArith_unary_op(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#comparison_op.
    def visitComparison_op(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#unaryOp.
    def visitUnaryOp(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#boolFact.
    def visitBoolFact(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#variable.
    def visitVariable(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by BindPreferenceGrammarParser#re.
    def visitRe(self, ctx):
        return self.visitChildren(ctx)
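Every generated method above defers to `visitChildren`, so subclasses only need to override the rules they care about. A minimal stand-in (plain Python classes, not the ANTLR runtime) shows the dispatch-with-default pattern this relies on:

```python
class Node:
    def __init__(self, name, *children):
        self.name, self.children = name, children

class MiniVisitor:
    def visit(self, node):
        # Dispatch to visit<Name> if the subclass defines it,
        # otherwise fall back to visiting the children.
        method = getattr(self, "visit" + node.name, self.visit_children)
        return method(node)

    def visit_children(self, node):
        return [self.visit(c) for c in node.children]

class LeafCounter(MiniVisitor):
    def visitLeaf(self, node):
        return 1

tree = Node("Expr", Node("Leaf"), Node("Expr", Node("Leaf"), Node("Leaf")))
print(LeafCounter().visit(tree))  # [1, [1, 1]]
```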
4d1da23bf5d5e46942de07a3564013bb3344a226 | 8,193 | py | Python | libs/plotting_tv.py | LiwenxuanNJU/TVpgGLM | d07f81cf3a404474b640777a3ab01b0a79ad9187 | [
"MIT"
] | 1 | 2018-03-19T06:12:48.000Z | 2018-03-19T06:12:48.000Z | libs/plotting_tv.py | LiwenxuanNJU/TVpgGLM | d07f81cf3a404474b640777a3ab01b0a79ad9187 | [
"MIT"
] | null | null | null | libs/plotting_tv.py | LiwenxuanNJU/TVpgGLM | d07f81cf3a404474b640777a3ab01b0a79ad9187 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Thu Apr 13 20:40:46 2017
@author: roger
"""
import numpy as np
def plot_glm(data,
             weights,
             adjacency,
             firingrates,
             std_firingrates=None,
             fig=None,
             axs=None,
             handles=None,
             title=None,
             figsize=(6, 3),
             W_lim=3,
             pltslice=slice(0, 500),
             data_index=0,
             N_to_plot=2):
    """
    Plot the parameters of the model
    :return:
    """
    Y = data
    W, A = weights, adjacency
    N = W.shape[0]

    # Do the imports here so that plotting stuff isn't loaded
    # unless it is necessary
    import matplotlib.pyplot as plt
    import matplotlib.gridspec as gridspec
    from mpl_toolkits.axes_grid1 import make_axes_locatable

    if handles is None:
        # If handles are not given, create a new plot
        handles = []
        fig = plt.figure(figsize=figsize)
        gs = gridspec.GridSpec(N_to_plot, 3)
        W_ax = fig.add_subplot(gs[:, 0])
        A_ax = fig.add_subplot(gs[:, 1])
        lam_axs = [fig.add_subplot(gs[i, 2]) for i in range(N_to_plot)]
        axs = (W_ax, A_ax, lam_axs)

        # Weight matrix
        h_W = W_ax.imshow(W[:, :, 0], vmin=-W_lim, vmax=W_lim, cmap="RdBu", interpolation="nearest")
        W_ax.set_xlabel("pre")
        W_ax.set_ylabel("post")
        W_ax.set_xticks(np.arange(N))
        W_ax.set_xticklabels(np.arange(N) + 1)
        W_ax.set_yticks(np.arange(N))
        W_ax.set_yticklabels(np.arange(N) + 1)
        W_ax.set_title("Weights")

        # Colorbar
        divider = make_axes_locatable(W_ax)
        cbax = divider.new_horizontal(size="5%", pad=0.05)
        fig.add_axes(cbax)
        plt.colorbar(h_W, cax=cbax)
        handles.append(h_W)

        # Adjacency matrix
        h_A = A_ax.imshow(A, vmin=0, vmax=1, cmap="Greys", interpolation="nearest")
        A_ax.set_xlabel("pre")
        A_ax.set_ylabel("post")
        A_ax.set_title("Adjacency")
        A_ax.set_xticks(np.arange(N))
        A_ax.set_xticklabels(np.arange(N) + 1)
        A_ax.set_yticks(np.arange(N))
        A_ax.set_yticklabels(np.arange(N) + 1)

        # Colorbar
        divider = make_axes_locatable(A_ax)
        cbax = divider.new_horizontal(size="5%", pad=0.05)
        fig.add_axes(cbax)
        plt.colorbar(h_A, cax=cbax)
        handles.append(h_A)

        # Plot the true and inferred rates
        for n in range(min(N, N_to_plot)):
            tn = np.where(Y[pltslice, n])[0]
            lam_axs[n].plot(tn, np.ones_like(tn), 'ko', markersize=4)

            # If given, plot the mean+-std of the firing rates
            if std_firingrates is not None:
                sausage_plot(np.arange(pltslice.start, pltslice.stop),
                             firingrates[pltslice, n],
                             std_firingrates[pltslice, n],
                             sgax=lam_axs[n],
                             alpha=0.5)

            h_fr = lam_axs[n].plot(firingrates[pltslice, n], label="True")[0]
            lam_axs[n].set_ylim(-0.05, 1.1)
            lam_axs[n].set_ylabel("$\lambda_{}(t)$".format(n + 1))
            if n == 0:
                lam_axs[n].set_title("Firing Rates")
            if n == min(N, N_to_plot) - 1:
                lam_axs[n].set_xlabel("Time")
            handles.append(h_fr)

        if title is not None:
            handles.append(fig.suptitle(title))

        plt.tight_layout()

    else:
        # If we are given handles, update the data
        handles[0].set_data(W[:, :, 0])
        handles[1].set_data(A)
        for n in range(min(N, N_to_plot)):
            handles[2 + n].set_data(np.arange(pltslice.start, pltslice.stop), firingrates[pltslice, n])
        if title is not None:
            handles[-1].set_text(title)

    plt.pause(0.001)
    return fig, axs, handles
def plot_tv_glm(data,
                weights,
                adjacency,
                firingrates,
                std_firingrates=None,
                fig=None,
                axs=None,
                handles=None,
                title=None,
                figsize=(6, 3),
                W_lim=3,
                pltslice=slice(0, 500),
                data_index=0,
                N_to_plot=2):
    """
    Plot the parameters of the time-varying model.
    The body is identical to ``plot_glm``, so delegate to it.
    :return:
    """
    return plot_glm(data, weights, adjacency, firingrates,
                    std_firingrates=std_firingrates, fig=fig, axs=axs,
                    handles=handles, title=title, figsize=figsize,
                    W_lim=W_lim, pltslice=pltslice, data_index=data_index,
                    N_to_plot=N_to_plot)
def sausage_plot(x, y, yerr, sgax=None, **kwargs):
    import matplotlib.pyplot as plt
    from matplotlib.patches import Polygon

    T = x.size
    assert x.shape == y.shape == yerr.shape == (T,)

    # Get axis
    if sgax is None:
        sgax = plt.gca()

    # Compute envelope
    env = np.zeros((T*2, 2))
    env[:, 0] = np.concatenate((x, x[::-1]))
    env[:, 1] = np.concatenate((y + yerr, y[::-1] - yerr[::-1]))

    # Add the patch
    sgax.add_patch(Polygon(env, **kwargs))
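The `Polygon` above is built from a 2T-by-2 vertex array: the upper band traced left to right, then the lower band traced right to left, which closes the shaded region. The array construction can be checked without matplotlib (`envelope` is an illustrative extraction of the logic, not a function from this file):

```python
import numpy as np

def envelope(x, y, yerr):
    T = x.size
    env = np.zeros((2 * T, 2))
    env[:, 0] = np.concatenate((x, x[::-1]))                      # out and back
    env[:, 1] = np.concatenate((y + yerr, y[::-1] - yerr[::-1]))  # upper, lower
    return env

env = envelope(np.arange(3.0), np.array([1.0, 2.0, 3.0]), np.full(3, 0.5))
print(env[:, 1].tolist())  # [1.5, 2.5, 3.5, 2.5, 1.5, 0.5]
```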
# ---- block_test.py ----
# repo: underdarkskies/x16rv2_hash (license: MIT), 650 bytes

import x16rv2_hash, os, sys, time, binascii
header = '000000505afb08ae157fcbf7a78c7aa53dad3968f8f2aa631afeb7b127b7d6e05c000000e1b449a57e01755387395d479f5be65e50b1e87538ef9654df1eb7ba6f5924a454de555d5abd641d0086f7f40101000000010000000000000000000000000000000000000000000000000000000000000000ffffffff2003d7ba050454de555d083a1591e8710000000d2f6e6f64655374726174756d2f00000000020000000000000000266a24aa21a9ede2f61c3f71d1defd3fa999dfa36953755c690689799962b48bebd836974e8cf90088526a740000001976a914eb0ff2678a9e5f1d2c0e8bd4812dc46043c669fc88ac00000000'
hashbin = x16rv2_hash.getPoWHash(binascii.unhexlify(header))[::-1]
print(binascii.hexlify(hashbin).decode())
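The trailing [::-1] reverses the digest because block-hash conventions display the bytes in the opposite order from the one the hash function returns. The reversal itself is plain stdlib (the digest below is a stand-in, not a real PoW hash):

```python
import binascii

digest = bytes([0x00, 0x01, 0x02, 0x03])          # stand-in for a 32-byte hash digest
shown = binascii.hexlify(digest[::-1]).decode()   # reverse byte order, then hex-encode
```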
# LimeSoup/parser/parser_paper_IOP.py (steerapi/LimeSoupMIT, MIT license)
import warnings
import re
import bs4
from LimeSoup.parser import tools as tl
"""There are two parsers, because we have two different formats of papers from IOP. The main difference is their notes' name are different. For example, for the paragraph title, Format 1 uses the note name "heading" whereas format 2 uses "title"
"""
class ParserPaper1:
def __init__(self, raw_xml, parser_type='lxml', debugging=False):
"""
:param raw_xml:
:param parser_type: can be 'html.parser', 'lxml', 'html5lib', 'lxml-xml'
:param debugging: True or False
"""
self.debugging = debugging
# parsers: 'html.parser', 'lxml', 'html5lib', 'lxml-xml'
self.soup = bs4.BeautifulSoup(raw_xml, parser_type)
self.parser_type = parser_type
self.title = []
self.keywords = []
self.data_sections = []
self.headings_sections = []
self.number_paragraphs_sections = []
if debugging:
self.soup_orig = self.soup
def deal_with_sections(self):
"""
Deal with the sections, parse tags that contains <'section_h#'>
Ex: <'section_h2'>
:return:
"""
self.data_sections = []
self.create_parser_sections(self.soup)
# self.data_sections = parse_section.data
# self.headings_sections = parse_section.heading
# self.number_paragraphs_sections = parse_section.number_paragraphs
# self.soup = parse_section.soup
# del parse_section
@staticmethod
def compile(pattern):
return re.compile(pattern)
def create_section(self, name='no_name_section', type_section='no_type', content=None):
# Avoid a mutable default argument: a shared list would leak state between calls
return {
'type': type_section,
'name': name,
'content': content if content is not None else []
}
def create_parser_sections(self, soup):
search_str = re.compile('sec-level[1-6]')
section_tags = soup.find_all(search_str)
# Get all sections
for tag in section_tags:
name = self.convert_to_text(tag.find('heading').text)
content = []
for p in tag.find_all('p', recursive=False):
# content_text=self.convert_to_text(p.text)
content.append(self.convert_to_text(p.text))
# content.append(content_text)
self.data_sections.append(self.create_section(
name=name,
type_section=tag.name,
content=content
))
# Nest data sections
for i in range(6, 1, -1):
did_nest = False
secname = "section_h{}".format(i)
# supersec_name = "section_h{}".format(i-1)
curr_sec_set = []
for j, sec in enumerate(reversed(self.data_sections)):
if sec['type'] == secname:
curr_sec_set.insert(0, sec)
elif (sec['type'] != secname) and curr_sec_set:
sec['content'].extend(curr_sec_set)
curr_sec_set = []
did_nest = True
if did_nest:
self.data_sections = [s for s in self.data_sections if s['type'] != 'section_h{}'.format(i)]
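The nesting pass above folds every section_h{i} dict (deepest level first) into the content list of the nearest shallower section preceding it. The same loop can be exercised on plain dicts, detached from BeautifulSoup:

```python
def nest_sections(sections):
    """Fold section_h{i} entries (i = 6..2) into the preceding shallower section."""
    for i in range(6, 1, -1):
        secname = 'section_h{}'.format(i)
        curr_set, did_nest = [], False
        for sec in reversed(sections):
            if sec['type'] == secname:
                curr_set.insert(0, sec)           # collect a run of level-i sections
            elif curr_set:
                sec['content'].extend(curr_set)   # attach the run to the shallower section
                curr_set, did_nest = [], True
        if did_nest:
            sections = [s for s in sections if s['type'] != secname]
    return sections

nested = nest_sections([
    {'type': 'section_h2', 'name': 'Results', 'content': ['intro paragraph']},
    {'type': 'section_h3', 'name': 'XRD', 'content': ['xrd paragraph']},
])
```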
@staticmethod
def create_soup(xml_xlm, parser_type='lxml'):
# parser_types = ['html.parser', 'lxml', 'html5lib', 'lxml-xml']
return bs4.BeautifulSoup(xml_xlm, parser_type)
def save_soup_to_file(self, filename='soup.xml', prettify=True):
"""
Save the soup to a file to be analysed. This can be used during the
debugging process.
:param filename: str that contains the name of the file
:param prettify: boolean to add spaces on children tags
:return: None - just save a file on disk
"""
with open(filename, 'w', encoding='utf-8') as fd_div:
if prettify:
fd_div.write(self.soup.prettify())
fd_div.write('\n')
else:
for item in self.soup:
fd_div.write(item)
fd_div.write('\n')
def get_title(self, rules):
try:
self.title = next(x for x in self.get(rules))
except StopIteration:
self.title = None
def get(self, rules):
results = list()
for rule in rules:
finds = self.soup.find_all(**rule)
for item in finds:
text = self.convert_to_text(item.get_text())
results.append(text)
item.extract()
return results
def parse_formula(self, rules):
for rule in rules:
finds = self.soup.find_all(**rule)
for item in finds:
label = item.find('id')
if label is not None:
label.string = ' ' + label.string + ' '
item.append(', ')
def get_keywords(self, rules):
self.keywords = []
for rule in rules:
for keyword in self.soup.find_all(**rule):
self.keywords.append(keyword.get_text().strip('\n'))
keyword.extract()
def remove_tags(self, rules):
"""
Remove tags from bs4 soup object using a list of bs4 rules to find_all()
:param rules: list() of dict() of rules of bs4 find_all()
:return: None
"""
for rule in rules:
[s.extract() for s in self.soup.find_all(**rule)]
def remove_tag(self, rules):
"""
Remove the first tag found in the bs4 soup object using
a list of bs4 rules for find_all().
:param rules: rules: list() of dict() of rules of bs4 find_all()
:return: None
"""
for rule in rules:
[s.extract() for s in self.soup.find_all(limit=1, **rule)]
@property
def headings_orig(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_heading_soup = self.soup_orig.find_all('sec')
list_heading = []
for item in list_heading_soup:
list_heading.append(item.get_text())
return list_heading
@property
def headings(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_heading_soup = self.soup.find_all('sec')
list_heading = []
for item in list_heading_soup:
list_heading.append(self.convert_to_text(item.get_text()))
return list_heading
@property
def paragraphs(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_paragraphs_soup = self.soup.find_all(name='p')
list_paragraphs = []
for item in list_paragraphs_soup:
if len(self.convert_to_text(item.get_text())) != 0:
item.string = self.convert_to_text(item.get_text())
list_paragraphs.append(item.get_text())
return list_paragraphs
@property
def paragraphs_orig(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_paragraphs_soup = self.soup_orig.find_all(name=re.compile('p'))
list_paragraphs = []
for item in list_paragraphs_soup:
list_paragraphs.append(item.get_text())
return list_paragraphs
def number_of_paragraphs_inside_parameters(self, parameters):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
soup_sec = self.soup_orig.find_all(parameters)
number_of_paragraphs_soup_sec = 0
for it in soup_sec:
number_of_paragraphs_soup_sec += len(list(
it.find_all('p', recursive=False)
))
# print(' number paragraphs inside div class section and sub: ',
# number_of_paragraphs_soup_sec)
def number_of_paragraphs_children(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
number_of_paragraphs_children = len(list(list(
self.soup_orig.children)[0].find_all('p', recursive=True)
)
)
# print(' Number of Paragraphs externo : ', number_of_paragraphs_children)
def create_tag_from_selection(self, rule, name_new_tag, name_section='Abstract'):
"""
Create a tag inside a bs4 soup object from a selection using a rule.
:param rule: a dict() of rules of bs4 find_all()
:param name_new_tag: new tag's name
:param name_section: create a <h2> tag with the name_section content
:return: None
"""
inside_tags = self.soup.find_all(**rule)
section = self.soup.new_tag('section_{}'.format(name_new_tag))
heading = self.soup.new_tag('h2')
heading.append(name_section)
section.append(heading)
for tag in inside_tags:
tag.wrap(section)
section.append(tag)
def create_abstract(self, rule):
"""
Create a section for the abstract
"""
abstract = self.soup.find(**rule)
if abstract is not None:
self.data_sections.insert(0, self.create_section(
name='Abstract',
type_section='abstract',
content=self.convert_to_text(abstract.get_text())
))
def get_abstract(self, rule):
"""
Get abstract when there is no body article
"""
abstract = self.soup.find(**rule)
if abstract is not None:
abstract_text = re.sub(r'(?<!\.)\n', '', abstract.get_text())
abstract_text = abstract_text.replace('Abstract', '')
abstract_text = abstract_text.replace('\n','')
abstract_text = abstract_text.replace(' ', '')
abstract_dict = {
'type': 'section_h2',
'name': 'Abstract',
'content': [abstract_text]
}
return abstract_dict
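The lookbehind (?<!\.)\n in get_abstract deletes only soft line breaks, i.e. newlines not preceded by a period; sentence-final breaks survive. For instance:

```python
import re

text = "Graphene is\ninteresting.\nMore text"
result = re.sub(r'(?<!\.)\n', '', text)
# the break after "is" is removed; the break after the period is kept
```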
def raw_text(self, rule):
"""
Get the unformatted text when IOP does not provide the section hierarchy.
"""
raw_text = self.soup.find(**rule)
if raw_text is not None:
raw_text_dict = {
'type': 'section_h2',
'name': 'Raw text',
'content': [raw_text.get_text()]
}
return raw_text_dict
def create_tag_to_paragraphs_inside_tag(self, rule, name_new_tag, name_section='Abstract'):
inside_tags_inter = self.soup.find_all(**rule)
if len(inside_tags_inter) == 0:
# self.save_soup_to_file('selction_found_nothing.xml')
# input('Section not created, selection found nothing')
return 'Section not created, number of paragraphs equal zero.'
inside_tags = inside_tags_inter[0].find_all(re.compile('para'), recursive=False)
#inside_tags = inside_tags_inter[0].find_all('p', recursive=False)
#inside_tags_ol = inside_tags_inter[0].find_all('ol', recursive=False)
#print(len(inside_tags_ol))
#inside_tags = inside_tags_p + inside_tags_ol
if len(inside_tags) == 0:
# self.save_soup_to_file('selction_found_nothing.xml')
# input('Section not created, number of paragraphs equal zero.')
return 'Section not created, number of paragraphs equal zero.'
section = self.soup.new_tag('section_{}'.format(name_new_tag))
heading = self.soup.new_tag('h2')
heading.append(name_section)
section.append(heading)
for tag in inside_tags:
tag_next_sibling = tag
while True:
tag_next_sibling = tag_next_sibling.next_sibling
if tag_next_sibling is None:
break
if tag_next_sibling.name is None:
continue
else:
break
tag.wrap(section)
section.append(tag)
if tag_next_sibling is None: break
if 'section_h' in tag_next_sibling.name:
break
def operation_tag_remove_space(self, rules):
for rule in rules:
tags = self.soup.find_all(**rule)
for tag in tags:
if tag is not None:
if tag.name is not None:
tag.string = tag.get_text().strip()
def create_tag_sections(self, rule=None):
"""
Create the standard tags (<section_h#>) using a rule to bs4 find_all()
:param rule:
:return:
"""
tags = self.soup.find_all('sec')  # Tags corresponding to section headings
for each_tag in tags:
# try:
tag_name_tmp = each_tag.find('id').string
#print('Tag:', each_tag.name, 'Label:', "%r"%tag_name_tmp)
# To be consistent with the html parser, the notation h1, h2, ..., h6 is kept.
tag_name = int(tag_name_tmp.count('.'))+2
section = self.soup.new_tag('section_h{}'.format(tag_name))
each_tag.wrap(section)
# except:
# section = self.soup.new_tag('section_h0')
# each_tag.wrap(section)
def rename_tag(self, rule, new_name='section_h4'):
tags = self.soup.find_all(**rule)
for tag in tags:
tag.name = new_name
def strip_tags(self, rules):
"""
Replace some tag with the children tag.
:param rules: list of rules for bs4 find_all()
:return: None
"""
tags = list()
for rule in rules:
for tag in self.soup.find_all(**rule):
tag.replace_with_children()
tags.append(tag.name)
return tags
def change_name_tag_sections(self):
tags = self.soup.find_all('sec')
for each_tag in tags:
try:
tag_name_tmp = each_tag['id']
# To be consistent with the xml parser, the notation h1, h2, ..., h6 is kept.
tag_name = int(tag_name_tmp.count('.'))+2
each_tag.name = 'section_h{}'.format(tag_name)
except KeyError:
each_tag.name = 'section_h2'
@staticmethod
def convert_to_text(text):
text = text.replace("\n", " ")
text = text.replace(" ?> ", " ")
text = text.replace(" []", " ")
text = text.replace(" [, ]", " ")
text = text.replace(" [, , ]", " ")
text = text.replace(" [, , , ]", " ")
text = text.replace(" [, , , ,]", " ")
text = ' '.join(str(text).split())
text = re.sub(r"\&(\w+?)gr;", r"\1", text)
return text
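convert_to_text also rewrites SGML-style Greek entities: the pattern \&(\w+?)gr; turns markers such as &agr; into the bare letter before "gr". Isolated from the class:

```python
import re

def clean(text):
    text = ' '.join(text.split())               # collapse runs of whitespace
    return re.sub(r"\&(\w+?)gr;", r"\1", text)  # '&agr;' -> 'a', '&bgr;' -> 'b'

cleaned = clean("the  &agr;-phase   and the &bgr;-phase")
```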
@property
def raw_xml(self):
return self.soup.prettify()
class ParserPaper2:
def __init__(self, raw_xml, parser_type='lxml', debugging=False):
"""
:param raw_xml:
:param parser_type: can be 'html.parser', 'lxml', 'html5lib', 'lxml-xml'
:param debugging: True or False
"""
self.debugging = debugging
# parsers: 'html.parser', 'lxml', 'html5lib', 'lxml-xml'
self.soup = bs4.BeautifulSoup(raw_xml, parser_type)
self.parser_type = parser_type
self.title = []
self.keywords = []
self.data_sections = []
self.headings_sections = []
self.number_paragraphs_sections = []
if debugging:
self.soup_orig = self.soup
def deal_with_sections(self):
"""
Deal with the sections, parse tags that contains <'section_h#'>
Ex: <'section_h2'>
:return:
"""
self.data_sections = []
self.create_parser_sections(self.soup)
# self.data_sections = parse_section.data
# self.headings_sections = parse_section.heading
# self.number_paragraphs_sections = parse_section.number_paragraphs
# self.soup = parse_section.soup
# del parse_section
@staticmethod
def compile(pattern):
return re.compile(pattern)
def create_section(self, name='no_name_section', type_section='no_type', content=None):
# Avoid a mutable default argument: a shared list would leak state between calls
return {
'type': type_section,
'name': name,
'content': content if content is not None else []
}
def create_parser_sections(self, soup):
search_str = re.compile('sec')
section_tags = soup.find_all(search_str)
# Get all sections
for tag in section_tags:
if tag.find('title'):
name = self.convert_to_text(tag.find('title').text)
content = []
for p in tag.find_all('p', recursive=False):
# content_text=self.convert_to_text(p.text)
content.append(self.convert_to_text(p.text))
# content.append(content_text)
self.data_sections.append(self.create_section(
name=name,
type_section=tag.name,
content=content
))
# Nest data sections
for i in range(6, 1, -1):
did_nest = False
secname = "section_h{}".format(i)
# supersec_name = "section_h{}".format(i-1)
curr_sec_set = []
for j, sec in enumerate(reversed(self.data_sections)):
if sec['type'] == secname:
curr_sec_set.insert(0, sec)
elif (sec['type'] != secname) and curr_sec_set:
sec['content'].extend(curr_sec_set)
curr_sec_set = []
did_nest = True
if did_nest:
self.data_sections = [s for s in self.data_sections if s['type'] != 'section_h{}'.format(i)]
@staticmethod
def create_soup(xml_xlm, parser_type='lxml'):
# parser_types = ['html.parser', 'lxml', 'html5lib', 'lxml-xml']
return bs4.BeautifulSoup(xml_xlm, parser_type)
def save_soup_to_file(self, filename='soup.xml', prettify=True):
"""
Save the soup to a file to be analysed. This can be used during the
debugging process.
:param filename: str that contains the name of the file
:param prettify: boolean to add spaces on children tags
:return: None - just save a file on disk
"""
with open(filename, 'w', encoding='utf-8') as fd_div:
if prettify:
fd_div.write(self.soup.prettify())
fd_div.write('\n')
else:
for item in self.soup:
fd_div.write(item)
fd_div.write('\n')
def get_title(self, rules):
try:
self.title = next(x for x in self.get(rules))
except StopIteration:
self.title = None
def get(self, rules):
results = list()
for rule in rules:
finds = self.soup.find_all(**rule)
for item in finds:
text = self.convert_to_text(item.get_text())
results.append(text)
item.extract()
return results
def parse_formula(self, rules):
for rule in rules:
finds = self.soup.find_all(**rule)
for item in finds:
label = item.find('id')
if label is not None:
label.string = ' ' + label.string + ' '
item.append(', ')
def get_keywords(self, rules):
self.keywords = []
for rule in rules:
for keyword in self.soup.find_all(**rule):
self.keywords.append(keyword.get_text().strip('\n'))
keyword.extract()
def remove_tags(self, rules):
"""
Remove tags from bs4 soup object using a list of bs4 rules to find_all()
:param rules: list() of dict() of rules of bs4 find_all()
:return: None
"""
for rule in rules:
[s.extract() for s in self.soup.find_all(**rule)]
def remove_tag(self, rules):
"""
Remove the first tag found in the bs4 soup object using
a list of bs4 rules for find_all().
:param rules: rules: list() of dict() of rules of bs4 find_all()
:return: None
"""
for rule in rules:
[s.extract() for s in self.soup.find_all(limit=1, **rule)]
@property
def headings_orig(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_heading_soup = self.soup_orig.find_all('sec')
list_heading = []
for item in list_heading_soup:
list_heading.append(item.get_text())
return list_heading
@property
def headings(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_heading_soup = self.soup.find_all('sec')
list_heading = []
for item in list_heading_soup:
list_heading.append(self.convert_to_text(item.get_text()))
return list_heading
@property
def paragraphs(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_paragraphs_soup = self.soup.find_all(name='p')
list_paragraphs = []
for item in list_paragraphs_soup:
if len(self.convert_to_text(item.get_text())) != 0:
item.string = self.convert_to_text(item.get_text())
list_paragraphs.append(item.get_text())
return list_paragraphs
@property
def paragraphs_orig(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
list_paragraphs_soup = self.soup_orig.find_all(name=re.compile('p'))
list_paragraphs = []
for item in list_paragraphs_soup:
list_paragraphs.append(item.get_text())
return list_paragraphs
def number_of_paragraphs_inside_parameters(self, parameters):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
soup_sec = self.soup_orig.find_all(parameters)
number_of_paragraphs_soup_sec = 0
for it in soup_sec:
number_of_paragraphs_soup_sec += len(list(
it.find_all('p', recursive=False)
))
# print(' number paragraphs inside div class section and sub: ',
# number_of_paragraphs_soup_sec)
def number_of_paragraphs_children(self):
if not self.debugging:
warnings.warn('Debugging mode has to be True when instantiating the class')
return None
number_of_paragraphs_children = len(list(list(
self.soup_orig.children)[0].find_all('p', recursive=True)
)
)
# print(' Number of Paragraphs externo : ', number_of_paragraphs_children)
def create_tag_from_selection(self, rule, name_new_tag, name_section='Abstract'):
"""
Create a tag inside a bs4 soup object from a selection using a rule.
:param rule: a dict() of rules of bs4 find_all()
:param name_new_tag: new tag's name
:param name_section: create a <h2> tag with the name_section content
:return: None
"""
inside_tags = self.soup.find_all(**rule)
section = self.soup.new_tag('section_{}'.format(name_new_tag))
heading = self.soup.new_tag('h2')
heading.append(name_section)
section.append(heading)
for tag in inside_tags:
tag.wrap(section)
section.append(tag)
def create_abstract(self, rule):
"""
Create a section for the abstract
"""
abstract = self.soup.find(**rule)
if abstract is not None:
self.data_sections.insert(0, self.create_section(
name='Abstract',
type_section='abstract',
content=self.convert_to_text(abstract.get_text())
))
def get_abstract(self, rule):
"""
Get abstract when there is no body article
"""
abstract = self.soup.find(**rule)
if abstract is not None:
abstract_text = re.sub(r'(?<!\.)\n', '', abstract.get_text())
abstract_text = abstract_text.replace('Abstract', '')
abstract_text = abstract_text.replace('\n','')
abstract_text = abstract_text.replace(' ', '')
abstract_dict = {
'type': 'section_h2',
'name': 'Abstract',
'content': [abstract_text]
}
return abstract_dict
def raw_text(self, rule):
"""
Get the unformatted text when IOP does not provide the section hierarchy.
"""
raw_text = self.soup.find(**rule)
if raw_text is not None:
raw_text_dict = {
'type': 'section_h2',
'name': 'Raw text',
'content': [raw_text.get_text()]
}
return raw_text_dict
def create_tag_to_paragraphs_inside_tag(self, rule, name_new_tag, name_section='Abstract'):
inside_tags_inter = self.soup.find_all(**rule)
if len(inside_tags_inter) == 0:
# self.save_soup_to_file('selction_found_nothing.xml')
# input('Section not created, selection found nothing')
return 'Section not created, number of paragraphs equal zero.'
inside_tags = inside_tags_inter[0].find_all(re.compile('para'), recursive=False)
#inside_tags = inside_tags_inter[0].find_all('p', recursive=False)
#inside_tags_ol = inside_tags_inter[0].find_all('ol', recursive=False)
#print(len(inside_tags_ol))
#inside_tags = inside_tags_p + inside_tags_ol
if len(inside_tags) == 0:
# self.save_soup_to_file('selction_found_nothing.xml')
# input('Section not created, number of paragraphs equal zero.')
return 'Section not created, number of paragraphs equal zero.'
section = self.soup.new_tag('section_{}'.format(name_new_tag))
heading = self.soup.new_tag('h2')
heading.append(name_section)
section.append(heading)
for tag in inside_tags:
tag_next_sibling = tag
while True:
tag_next_sibling = tag_next_sibling.next_sibling
if tag_next_sibling is None:
break
if tag_next_sibling.name is None:
continue
else:
break
tag.wrap(section)
section.append(tag)
if tag_next_sibling is None: break
if 'section_h' in tag_next_sibling.name:
break
def operation_tag_remove_space(self, rules):
for rule in rules:
tags = self.soup.find_all(**rule)
for tag in tags:
if tag is not None:
if tag.name is not None:
tag.string = tag.get_text().strip()
def create_tag_sections(self, rule=None):
"""
Create the standard tags (<section_h#>) using a rule to bs4 find_all()
:param rule:
:return:
"""
tags = self.soup.find_all('sec')  # Tags corresponding to section headings
for each_tag in tags:
# try:
tag_name_tmp = each_tag.find('id').string
#print('Tag:', each_tag.name, 'Label:', "%r"%tag_name_tmp)
# To be consistent with the html parser, the notation h1, h2, ..., h6 is kept.
tag_name = int(tag_name_tmp.count('.'))+2
section = self.soup.new_tag('section_h{}'.format(tag_name))
each_tag.wrap(section)
# except:
# section = self.soup.new_tag('section_h0')
# each_tag.wrap(section)
def rename_tag(self, rule, new_name='section_h4'):
tags = self.soup.find_all(**rule)
for tag in tags:
tag.name = new_name
def strip_tags(self, rules):
"""
Replace some tag with the children tag.
:param rules: list of rules for bs4 find_all()
:return: None
"""
tags = list()
for rule in rules:
for tag in self.soup.find_all(**rule):
tag.replace_with_children()
tags.append(tag.name)
return tags
def change_name_tag_sections(self):
tags = self.soup.find_all('sec')
for each_tag in tags:
try:
tag_name_tmp = each_tag['id']
# To be consistent with the xml parser, the notation h1, h2, ..., h6 is kept.
tag_name = int(tag_name_tmp.count('.'))+2
each_tag.name = 'section_h{}'.format(tag_name)
except KeyError:
each_tag.name = 'section_h2'
@staticmethod
def convert_to_text(text):
text = text.replace("\n", " ")
text = text.replace(" ?> ", " ")
text = text.replace(" []", " ")
text = text.replace(" [, ]", " ")
text = text.replace(" [, , ]", " ")
text = text.replace(" [, , , ]", " ")
text = text.replace(" [, , , ,]", " ")
text = ' '.join(str(text).split())
text = re.sub(r"\&(\w+?)gr;", r"\1", text)
return text
@property
def raw_xml(self):
return self.soup.prettify()
12a13e137d043b8285c417e7450ee03077c8481a | 13,174 | py | Python | ppr-api/tests/unit/api/test_drafts.py | pwei1018/ppr | 1fdd2f1ad33217045404d7b872d9fad41a4c7da6 | [
"Apache-2.0"
] | 4 | 2020-01-21T21:46:42.000Z | 2021-02-24T18:30:24.000Z | ppr-api/tests/unit/api/test_drafts.py | pwei1018/ppr | 1fdd2f1ad33217045404d7b872d9fad41a4c7da6 | [
"Apache-2.0"
] | 1,313 | 2019-10-18T22:48:16.000Z | 2022-03-30T17:42:47.000Z | ppr-api/tests/unit/api/test_drafts.py | pwei1018/ppr | 1fdd2f1ad33217045404d7b872d9fad41a4c7da6 | [
"Apache-2.0"
] | 201 | 2019-10-18T21:34:41.000Z | 2022-03-31T20:07:42.000Z | # Copyright © 2019 Province of British Columbia
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests to verify the drafts endpoint.
Test-Suite to ensure that the /drafts endpoint is working as expected.
"""
import copy
from http import HTTPStatus
from registry_schemas.example_data.ppr import DRAFT_FINANCING_STATEMENT, DRAFT_CHANGE_STATEMENT, \
DRAFT_AMENDMENT_STATEMENT
from ppr_api.services.authz import STAFF_ROLE, COLIN_ROLE, PPR_ROLE
from tests.unit.services.utils import create_header_account, create_header
# prep sample post, put draft statement data
SAMPLE_JSON_FINANCING = copy.deepcopy(DRAFT_FINANCING_STATEMENT)
SAMPLE_JSON_CHANGE = copy.deepcopy(DRAFT_CHANGE_STATEMENT)
SAMPLE_JSON_AMENDMENT = copy.deepcopy(DRAFT_AMENDMENT_STATEMENT)
def test_draft_create_invalid_type(session, client, jwt):
"""Assert that create draft with an invalid type returns a 404 error."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
json_data['type'] = 'INVALID_TYPE'
# test
rv = client.post('/api/v1/drafts',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.INTERNAL_SERVER_ERROR
def test_draft_create_valid_financing_201(session, client, jwt):
"""Assert that a valid draft financing statement returns a 201 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.post('/api/v1/drafts',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.CREATED
assert rv.json['financingStatement']['documentId']
# now delete draft
document_id = rv.json['financingStatement']['documentId']
rv2 = client.delete('/api/v1/drafts/' + document_id,
headers=create_header_account(jwt, [PPR_ROLE]))
# check delete
assert rv2.status_code == HTTPStatus.NO_CONTENT
def test_draft_create_valid_amendment_201(session, client, jwt):
"""Assert that a valid draft amendment statement returns a 201 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_AMENDMENT)
# test
rv = client.post('/api/v1/drafts',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.CREATED
assert rv.json['amendmentStatement']['documentId']
# now delete draft
document_id = rv.json['amendmentStatement']['documentId']
rv2 = client.delete('/api/v1/drafts/' + document_id,
headers=create_header_account(jwt, [PPR_ROLE]))
# check delete
assert rv2.status_code == HTTPStatus.NO_CONTENT
def test_draft_valid_change_201(session, client, jwt):
"""Assert that a valid draft change statement returns a 201 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_CHANGE)
# test
rv = client.post('/api/v1/drafts',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.CREATED
assert rv.json['changeStatement']['documentId']
# now delete draft
document_id = rv.json['changeStatement']['documentId']
rv2 = client.delete('/api/v1/drafts/' + document_id,
headers=create_header_account(jwt, [PPR_ROLE]))
# check delete
assert rv2.status_code == HTTPStatus.NO_CONTENT
def test_draft_get_list_200(session, client, jwt):
"""Assert that a get draft list for an account returns a 200 status."""
# setup
# test
rv = client.get('/api/v1/drafts',
headers=create_header_account(jwt, [PPR_ROLE]))
# check
assert rv.status_code == HTTPStatus.OK
def test_draft_valid_get_statement_200(session, client, jwt):
"""Assert that a valid get draft by document ID returns a 200 status."""
# setup
# test
rv = client.get('/api/v1/drafts/D-T-FS01',
headers=create_header_account(jwt, [PPR_ROLE]))
# check
assert rv.status_code == HTTPStatus.OK
def test_draft_invalid_get_statement_404(session, client, jwt):
"""Assert that a get draft by invalid document ID returns a 404 status."""
# setup
# test
rv = client.get('/api/v1/drafts/D0012345',
headers=create_header_account(jwt, [PPR_ROLE]))
# check
assert rv.status_code == HTTPStatus.NOT_FOUND
def test_draft_update_invalid_type_404(session, client, jwt):
"""Assert that an update draft financing statement request with an invalid type returns a 404."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
json_data['financingStatement']['type'] = 'XA'
# test
rv = client.put('/api/v1/drafts/D0034001',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.NOT_FOUND
def test_draft_update_valid_financing_200(session, client, jwt):
"""Assert that a valid draft financing statement update request returns a 200 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.put('/api/v1/drafts/D-T-FS01',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.OK
def test_draft_update_valid_amendment_200(session, client, jwt):
"""Assert that a valid draft amendment statement update request returns a 200 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_AMENDMENT)
# test
rv = client.put('/api/v1/drafts/D-T-AM01',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.OK
def test_draft_update_valid_change_200(session, client, jwt):
"""Assert that a valid draft change statement update request returns a 200 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_CHANGE)
# test
rv = client.put('/api/v1/drafts/D-T-CH01',
json=json_data,
headers=create_header_account(jwt, [PPR_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.OK
# def test_draft_delete_204(session, client, jwt):
# """Assert that a valid delete draft request returns a 204 status."""
# setup
# test
# rv = client.delete(f'/api/v1/drafts/TEST-FSD1',
# headers=create_header_account(jwt, [PPR_ROLE]))
# check
# assert rv.status_code == HTTPStatus.NO_CONTENT
def test_draft_delete_404(session, client, jwt):
"""Assert that an invalid delete draft document ID returns a 404 status."""
# setup
# test
rv = client.delete('/api/v1/drafts/X12345X',
headers=create_header_account(jwt, [PPR_ROLE]))
# check
assert rv.status_code == HTTPStatus.NOT_FOUND
def test_draft_create_nonstaff_missing_account_400(session, client, jwt):
"""Assert that a non-staff draft request with no account ID returns a 400 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.post('/api/v1/drafts',
json=json_data,
headers=create_header(jwt, [COLIN_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.BAD_REQUEST
def test_draft_create_staff_missing_account_400(session, client, jwt):
"""Assert that a staff draft request with no account ID returns a 400 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.post('/api/v1/drafts',
json=json_data,
headers=create_header(jwt, [PPR_ROLE, STAFF_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.BAD_REQUEST
def test_draft_create_nonstaff_unauthorized_401(session, client, jwt):
"""Assert that a non-ppr role draft request with an account ID returns a 401 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.post('/api/v1/drafts',
json=json_data,
headers=create_header_account(jwt, [COLIN_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.UNAUTHORIZED
def test_draft_list_nonstaff_missing_account_400(session, client, jwt):
"""Assert that a non-staff draft list request with no account ID returns a 400 status."""
# setup
# test
rv = client.get('/api/v1/drafts',
headers=create_header(jwt, [COLIN_ROLE]))
# check
assert rv.status_code == HTTPStatus.BAD_REQUEST
def test_draft_list_staff_missing_account_400(session, client, jwt):
"""Assert that a staff draft list request with no account ID returns a 400 status."""
# setup
# test
rv = client.get('/api/v1/drafts',
headers=create_header(jwt, [PPR_ROLE, STAFF_ROLE]))
# check
assert rv.status_code == HTTPStatus.BAD_REQUEST
def test_draft_list_nonstaff_unauthorized_401(session, client, jwt):
"""Assert that a non-ppr role draft list request with an account ID returns a 401 status."""
# setup
# test
rv = client.get('/api/v1/drafts',
headers=create_header_account(jwt, [COLIN_ROLE]))
# check
assert rv.status_code == HTTPStatus.UNAUTHORIZED
def test_draft_update_nonstaff_missing_account_400(session, client, jwt):
"""Assert that a non-staff update draft request with no account ID returns a 400 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.put('/api/v1/drafts/TEST-FSD1',
json=json_data,
headers=create_header(jwt, [COLIN_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.BAD_REQUEST
def test_draft_update_staff_missing_account_400(session, client, jwt):
"""Assert that a staff update draft request with no account ID returns a 400 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.put('/api/v1/drafts/TEST-FSD1',
json=json_data,
headers=create_header(jwt, [PPR_ROLE, STAFF_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.BAD_REQUEST
def test_draft_update_nonstaff_unauthorized_401(session, client, jwt):
"""Assert that a non-ppr role update draft request with an account ID returns a 401 status."""
# setup
json_data = copy.deepcopy(SAMPLE_JSON_FINANCING)
# test
rv = client.put('/api/v1/drafts/TEST-FSD1',
json=json_data,
headers=create_header_account(jwt, [COLIN_ROLE]),
content_type='application/json')
# check
assert rv.status_code == HTTPStatus.UNAUTHORIZED
def test_draft_get_nonstaff_missing_account_400(session, client, jwt):
"""Assert that a non-staff draft get request with no account ID returns a 400 status."""
# setup
# test
rv = client.get('/api/v1/drafts/TEST-FSD1',
headers=create_header(jwt, [COLIN_ROLE]))
# check
assert rv.status_code == HTTPStatus.BAD_REQUEST
def test_draft_get_staff_missing_account_200(session, client, jwt):
"""Assert that a staff draft get request with no account ID returns a 200 status."""
# setup
# test
rv = client.get('/api/v1/drafts/D-T-FS01',
headers=create_header(jwt, [PPR_ROLE, STAFF_ROLE]))
# check
assert rv.status_code == HTTPStatus.OK
def test_draft_get_nonstaff_unauthorized_401(session, client, jwt):
"""Assert that a non-ppr role draft get request with an account ID returns a 401 status."""
# setup
# test
rv = client.get('/api/v1/drafts/TEST-FSD1',
headers=create_header_account(jwt, [COLIN_ROLE]))
# check
assert rv.status_code == HTTPStatus.UNAUTHORIZED
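# Summary of the authorization matrix exercised by the tests above (an
# editorial summary derived from this module's assertions, not original code):
#   missing account ID         -> 400 BAD_REQUEST for staff and non-staff alike,
#                                 except staff GET /drafts/<id>, which is 200
#   account ID but no PPR role -> 401 UNAUTHORIZED
AUTH_EXPECTATIONS = {
    "missing_account": HTTPStatus.BAD_REQUEST,
    "non_ppr_role": HTTPStatus.UNAUTHORIZED,
}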
# --- faceDetection/__init__.py (repo: thisKK/-arcfaceV1-retinaface-, MIT) ---
from .face_detection.detector import RetinaFace
from .face_detection._version import __version__
# --- tests/components/pushbullet/test_notify.py (repo: erogleva/core, Apache-2.0) ---
"""The tests for the pushbullet notification platform."""
import json
from pushbullet import PushBullet
import pytest
import homeassistant.components.notify as notify
from homeassistant.setup import async_setup_component
from tests.async_mock import patch
from tests.common import assert_setup_component, load_fixture
@pytest.fixture
def mock_pushbullet():
"""Mock pushbullet."""
with patch.object(
PushBullet,
"_get_data",
return_value=json.loads(load_fixture("pushbullet_devices.json")),
):
yield
async def test_pushbullet_config(hass, mock_pushbullet):
"""Test setup."""
config = {
notify.DOMAIN: {
"name": "test",
"platform": "pushbullet",
"api_key": "MYFAKEKEY",
}
}
with assert_setup_component(1) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert handle_config[notify.DOMAIN]
async def test_pushbullet_config_bad(hass):
"""Test set up the platform with bad/missing configuration."""
config = {notify.DOMAIN: {"platform": "pushbullet"}}
with assert_setup_component(0) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert not handle_config[notify.DOMAIN]
async def test_pushbullet_push_default(hass, requests_mock, mock_pushbullet):
"""Test pushbullet push to default target."""
config = {
notify.DOMAIN: {
"name": "test",
"platform": "pushbullet",
"api_key": "MYFAKEKEY",
}
}
with assert_setup_component(1) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert handle_config[notify.DOMAIN]
requests_mock.register_uri(
"POST",
"https://api.pushbullet.com/v2/pushes",
status_code=200,
json={"mock_response": "Ok"},
)
data = {"title": "Test Title", "message": "Test Message"}
await hass.services.async_call(notify.DOMAIN, "test", data)
await hass.async_block_till_done()
assert requests_mock.called
assert requests_mock.call_count == 1
expected_body = {"body": "Test Message", "title": "Test Title", "type": "note"}
assert requests_mock.last_request.json() == expected_body
async def test_pushbullet_push_device(hass, requests_mock, mock_pushbullet):
"""Test pushbullet push to a single device target."""
config = {
notify.DOMAIN: {
"name": "test",
"platform": "pushbullet",
"api_key": "MYFAKEKEY",
}
}
with assert_setup_component(1) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert handle_config[notify.DOMAIN]
requests_mock.register_uri(
"POST",
"https://api.pushbullet.com/v2/pushes",
status_code=200,
json={"mock_response": "Ok"},
)
data = {
"title": "Test Title",
"message": "Test Message",
"target": ["device/DESKTOP"],
}
await hass.services.async_call(notify.DOMAIN, "test", data)
await hass.async_block_till_done()
assert requests_mock.called
assert requests_mock.call_count == 1
expected_body = {
"body": "Test Message",
"device_iden": "identity1",
"title": "Test Title",
"type": "note",
}
assert requests_mock.last_request.json() == expected_body
async def test_pushbullet_push_devices(hass, requests_mock, mock_pushbullet):
"""Test pushbullet push to multiple device targets."""
config = {
notify.DOMAIN: {
"name": "test",
"platform": "pushbullet",
"api_key": "MYFAKEKEY",
}
}
with assert_setup_component(1) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert handle_config[notify.DOMAIN]
requests_mock.register_uri(
"POST",
"https://api.pushbullet.com/v2/pushes",
status_code=200,
json={"mock_response": "Ok"},
)
data = {
"title": "Test Title",
"message": "Test Message",
"target": ["device/DESKTOP", "device/My iPhone"],
}
await hass.services.async_call(notify.DOMAIN, "test", data)
await hass.async_block_till_done()
assert requests_mock.called
assert requests_mock.call_count == 2
assert len(requests_mock.request_history) == 2
expected_body = {
"body": "Test Message",
"device_iden": "identity1",
"title": "Test Title",
"type": "note",
}
assert requests_mock.request_history[0].json() == expected_body
expected_body = {
"body": "Test Message",
"device_iden": "identity2",
"title": "Test Title",
"type": "note",
}
assert requests_mock.request_history[1].json() == expected_body
async def test_pushbullet_push_email(hass, requests_mock, mock_pushbullet):
"""Test pushbullet push to an email target."""
config = {
notify.DOMAIN: {
"name": "test",
"platform": "pushbullet",
"api_key": "MYFAKEKEY",
}
}
with assert_setup_component(1) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert handle_config[notify.DOMAIN]
requests_mock.register_uri(
"POST",
"https://api.pushbullet.com/v2/pushes",
status_code=200,
json={"mock_response": "Ok"},
)
data = {
"title": "Test Title",
"message": "Test Message",
"target": ["email/user@host.net"],
}
await hass.services.async_call(notify.DOMAIN, "test", data)
await hass.async_block_till_done()
assert requests_mock.called
assert requests_mock.call_count == 1
assert len(requests_mock.request_history) == 1
expected_body = {
"body": "Test Message",
"email": "user@host.net",
"title": "Test Title",
"type": "note",
}
assert requests_mock.request_history[0].json() == expected_body
async def test_pushbullet_push_mixed(hass, requests_mock, mock_pushbullet):
"""Test pushbullet push to mixed device and email targets."""
config = {
notify.DOMAIN: {
"name": "test",
"platform": "pushbullet",
"api_key": "MYFAKEKEY",
}
}
with assert_setup_component(1) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert handle_config[notify.DOMAIN]
requests_mock.register_uri(
"POST",
"https://api.pushbullet.com/v2/pushes",
status_code=200,
json={"mock_response": "Ok"},
)
data = {
"title": "Test Title",
"message": "Test Message",
"target": ["device/DESKTOP", "email/user@host.net"],
}
await hass.services.async_call(notify.DOMAIN, "test", data)
await hass.async_block_till_done()
assert requests_mock.called
assert requests_mock.call_count == 2
assert len(requests_mock.request_history) == 2
expected_body = {
"body": "Test Message",
"device_iden": "identity1",
"title": "Test Title",
"type": "note",
}
assert requests_mock.request_history[0].json() == expected_body
expected_body = {
"body": "Test Message",
"email": "user@host.net",
"title": "Test Title",
"type": "note",
}
assert requests_mock.request_history[1].json() == expected_body
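# The "target" strings above follow a "<kind>/<identifier>" convention
# ("device/DESKTOP", "email/user@host.net"). A minimal sketch of that split
# (hypothetical helper for illustration, not the platform's actual parsing):
def _split_target(target):
    kind, _, ident = target.partition("/")
    return kind, ident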
async def test_pushbullet_push_no_file(hass, requests_mock, mock_pushbullet):
"""Test that a pushbullet push with a nonexistent file attachment fails."""
config = {
notify.DOMAIN: {
"name": "test",
"platform": "pushbullet",
"api_key": "MYFAKEKEY",
}
}
with assert_setup_component(1) as handle_config:
assert await async_setup_component(hass, notify.DOMAIN, config)
await hass.async_block_till_done()
assert handle_config[notify.DOMAIN]
requests_mock.register_uri(
"POST",
"https://api.pushbullet.com/v2/pushes",
status_code=200,
json={"mock_response": "Ok"},
)
data = {
"title": "Test Title",
"message": "Test Message",
"target": ["device/DESKTOP", "device/My iPhone"],
"data": {"file": "not_a_file"},
}
assert not await hass.services.async_call(notify.DOMAIN, "test", data)
await hass.async_block_till_done()
# --- tools/sources/post_harvey.py (repo: jasmcaus/aws-disaster-response-ml, MIT) ---
harvey_urls = [
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-29/1040010030BE1200/1040010030BE1200.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-29/1040010031C9C900/1040010031C9C900.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-29/1040010032211E00/1040010032211E00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-29/10400100325CA800/10400100325CA800.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-29/1040010032658F00/1040010032658F00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/1020010065DCD300/1020010065DCD300.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/10200100682E8400/10200100682E8400.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/103001006EB42900/103001006EB42900.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/103001006F018C00/103001006F018C00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/103001006F884000/103001006F884000.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/103001006FAD6200/103001006FAD6200.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/103001006FCDF300/103001006FCDF300.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/105001000B920800/105001000B920800.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/105001000B920900/105001000B920900.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-30/105001000B920A00/105001000B920A00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-31/1020010065DF2700/1020010065DF2700.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-31/1020010066A97100/1020010066A97100.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-31/1020010067290D00/1020010067290D00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-31/1020010067E38000/1020010067E38000.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-31/103001006D44B500/103001006D44B500.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-31/105001000B95E100/105001000B95E100.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-08-31/105001000B95E200/105001000B95E200.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-01/1030010070C13600/1030010070C13600.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-02/103001006F3C8400/103001006F3C8400.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-02/1030010071D49400/1030010071D49400.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-02/10400100324DAE00/10400100324DAE00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-02/105001000B9D7E00/105001000B9D7E00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-02/105001000B9D7F00/105001000B9D7F00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-02/105001000B9D8000/105001000B9D8000.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-02/105001000B9D8100/105001000B9D8100.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/10200100643E0900/10200100643E0900.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1020010065114800/1020010065114800.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1020010066BD2E00/1020010066BD2E00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1020010066DDD400/1020010066DDD400.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1020010068D6F400/1020010068D6F400.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1020010068D84400/1020010068D84400.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1020010069036700/1020010069036700.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1020010069E6CC00/1020010069E6CC00.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1040010031278100/1040010031278100.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/10400100319BA200/10400100319BA200.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1040010032298700/1040010032298700.tif",
"https://opendata.digitalglobe.com/events/hurricane-harvey/post-event/2017-09-03/1040010032A12D00/1040010032A12D00.tif",
]
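# Small helper for working with the tile URLs above (hypothetical sketch; the
# actual download step would stream these large GeoTIFFs to disk rather than
# load them into memory):
def tile_name(url):
    # ".../1040010030BE1200/1040010030BE1200.tif" -> "1040010030BE1200.tif"
    return url.rsplit("/", 1)[-1]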
# --- PyAlgorithm/testing/data_structure_unittest.py (repo: allenliuzihao/PyAlgorithm, MIT) ---
import unittest
import sys
import random
sys.path.append("../Data_Structure")
import heap
def isHeap(heap, option):
index = 1
while index <= heap.size():
if 2 * index <= heap.size():
if (option and heap.getArray()[index] > heap.getArray()[2 * index]) or (not option and heap.getArray()[index] < heap.getArray()[2 * index]):
return False
if 2 * index + 1 <= heap.size():
if (option and heap.getArray()[index] > heap.getArray()[2 * index + 1]) or (not option and heap.getArray()[index] < heap.getArray()[2 * index + 1]):
return False
index += 1
return True
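# isHeap above assumes a 1-based array layout (getArray()[0] is unused): the
# children of node i sit at indices 2*i and 2*i + 1. A standalone version of
# the same min-heap check, shown for illustration only (hypothetical, written
# against a plain list rather than the Heap class):
def _is_min_heap_array(a):
    n = len(a) - 1  # a[0] is a dummy slot
    return all(a[i] <= a[c]
               for i in range(1, n + 1)
               for c in (2 * i, 2 * i + 1) if c <= n)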
class MinHeapTestCase(unittest.TestCase):
def setUp(self):
self.heap = heap.Heap(True)
def test_insert_extract_one_element_min_heap(self):
self.heap.insert(1)
self.assertEqual(self.heap.size(), 1,
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, True), "not a heap after insertion!")
extract = self.heap.extract()
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, True), "not a heap after extraction!")
self.assertEqual(extract, 1,
'element extracted from heap is different from element inserted')
def test_insert_extract_sorted_elements_min_heap(self):
a = [i for i in xrange(1, 1000)]
for i in a:
self.heap.insert(i)
self.assertEqual(self.heap.size(), len(a),
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, True), "not a heap after insertion!")
extract = []
while not self.heap.isEmpty():
extract.append(self.heap.extract())
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, True), "not a heap after extraction!")
self.assertEqual(extract, a,
'elements extracted are different from elements inserted!')
def test_insert_extract_reverse_sorted_elements_min_heap(self):
a = [i for i in reversed(xrange(1, 1000))]
for i in a:
self.heap.insert(i)
self.assertEqual(self.heap.size(), len(a),
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, True), "not a heap after insertion!")
extract = []
while not self.heap.isEmpty():
extract.append(self.heap.extract())
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, True), "not a heap after extraction!")
extract.reverse()
self.assertEqual(extract, a,
'elements extracted are different from elements inserted!')
def test_insert_extract_random_elements_min_heap(self):
a = random.sample(range(100000), 10000)
for i in a:
self.heap.insert(i)
self.assertEqual(self.heap.size(), len(a),
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, True), "not a heap after insertion!")
extract = []
while not self.heap.isEmpty():
extract.append(self.heap.extract())
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, True), "not a heap after extraction!")
a.sort()
self.assertEqual(extract, a,
'elements extracted are different from elements inserted!')
def test_remove_element_from_empty_min_heap(self):
self.assertRaises(KeyError, self.heap.remove, (1))
def test_remove_element_from_min_heap_with_one_element(self):
self.heap.insert(1)
removed = self.heap.remove(1)
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, True), "not a heap after removal!")
self.assertEqual(removed, 1,
'elements removed are different from elements inserted!')
def test_remove_min_element_from_min_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
removed = self.heap.remove(1)
self.assertEqual(self.heap.size(), 998,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, True), "not a heap after removal!")
self.assertEqual(removed, 1, "removed element is different from the target for removal")
self.heap.emptyHeap()
def test_remove_max_element_from_min_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
removed = self.heap.remove(999)
self.assertEqual(self.heap.size(), 998,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, True), "not a heap after removal!")
self.assertEqual(removed, 999, "removed element is different from the target for removal")
self.heap.emptyHeap()
def test_remove_middle_element_from_min_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
removed = self.heap.remove(500)
self.assertEqual(self.heap.size(), 998,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, True), "not a heap after removal!")
self.assertEqual(removed, 500, "removed element is different from the target for removal")
self.heap.emptyHeap()
def test_remove_all_elements_from_min_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
for i in xrange(1, 1000):
removed = self.heap.remove(i)
self.assertEqual(self.heap.size(), 999 - i,
'incorrect heap size after removal of element: ' + str(i) + '!')
self.assertTrue(isHeap(self.heap, True), "not a heap after removal of element: " + str(i) + '!')
self.assertEqual(removed, i, "removed element is different from the target for removal. Element: " + str(i))
class MaxHeapTestCase(unittest.TestCase):
def setUp(self):
self.heap = heap.Heap(False)
def test_insert_extract_one_element_max_heap(self):
self.heap.insert(1)
self.assertEqual(self.heap.size(), 1,
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, False), "not a heap after insertion!")
extract = self.heap.extract()
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, False), "not a heap after extraction!")
self.assertEqual(extract, 1,
'element extracted from heap is different from element inserted')
def test_insert_extract_sorted_elements_max_heap(self):
a = [i for i in xrange(1, 1000)]
for i in a:
self.heap.insert(i)
self.assertEqual(self.heap.size(), len(a),
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, False), "not a heap after insertion!")
extract = []
while not self.heap.isEmpty():
extract.append(self.heap.extract())
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, False), "not a heap after extraction!")
extract.sort()
self.assertEqual(extract, a,
'elements extracted are different from elements inserted!')
def test_insert_extract_reverse_sorted_elements_max_heap(self):
a = [i for i in reversed(xrange(1, 1000))]
for i in a:
self.heap.insert(i)
self.assertEqual(self.heap.size(), len(a),
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, False), "not a heap after insertion!")
extract = []
while not self.heap.isEmpty():
extract.append(self.heap.extract())
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, False), "not a heap after extraction!")
self.assertEqual(extract, a,
'elements extracted are different from elements inserted!')
def test_insert_extract_random_elements_max_heap(self):
a = random.sample(range(100000), 10000)
for i in a:
self.heap.insert(i)
self.assertEqual(self.heap.size(), len(a),
'incorrect heap size after insertion!')
self.assertTrue(isHeap(self.heap, False), "not a heap after insertion!")
extract = []
while not self.heap.isEmpty():
extract.append(self.heap.extract())
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after extraction!')
self.assertTrue(isHeap(self.heap, False), "not a heap after extraction!")
a.sort()
extract.sort()
self.assertEqual(extract, a,
'elements extracted are different from elements inserted!')
def test_remove_element_from_empty_max_heap(self):
self.assertRaises(KeyError, self.heap.remove, (1))
def test_remove_element_from_max_heap_with_one_element(self):
self.heap.insert(1)
removed = self.heap.remove(1)
self.assertEqual(self.heap.size(), 0,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, False), "not a heap after removal!")
self.assertEqual(removed, 1,
'elements removed are different from elements inserted!')
def test_remove_min_element_from_max_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
removed = self.heap.remove(1)
self.assertEqual(self.heap.size(), 998,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, False), "not a heap after removal!")
self.assertEqual(removed, 1, "removed element is different from the target for removal")
self.heap.emptyHeap()
def test_remove_max_element_from_max_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
removed = self.heap.remove(999)
self.assertEqual(self.heap.size(), 998,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, False), "not a heap after removal!")
self.assertEqual(removed, 999, "removed element is different from the target for removal")
self.heap.emptyHeap()
def test_remove_middle_element_from_max_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
removed = self.heap.remove(500)
self.assertEqual(self.heap.size(), 998,
'incorrect heap size after removal!')
self.assertTrue(isHeap(self.heap, False), "not a heap after removal!")
self.assertEqual(removed, 500, "removed element is different from the target for removal")
self.heap.emptyHeap()
def test_remove_all_elements_from_max_heap(self):
for i in xrange(1, 1000):
self.heap.insert(i)
for i in xrange(1, 1000):
removed = self.heap.remove(i)
self.assertEqual(self.heap.size(), 999 - i,
'incorrect heap size after removal of element: ' + str(i) + '!')
self.assertTrue(isHeap(self.heap, False), "not a heap after removal of element: " + str(i) + '!')
self.assertEqual(removed, i, "removed element is different from the target for removal. Element: " + str(i))
if __name__ == '__main__':
suite = unittest.TestLoader().loadTestsFromTestCase(MaxHeapTestCase)
unittest.TextTestRunner(verbosity=2).run(suite)
suite = unittest.TestLoader().loadTestsFromTestCase(MinHeapTestCase)
unittest.TextTestRunner(verbosity=2).run(suite)
# --- syslogng_pushbullet/destination.py (repo: Soft/syslogng-pushbullet, MIT) ---
# -*- coding: utf-8 -*-
class Destination(object):
    """Destination base"""

    def init(self, args):
        return True

    def deinit(self):
        pass

    def open(self):
        return True

    def is_opened(self):
        return True

    def close(self):
        return True

    def send(self, message):
        return True
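A concrete destination overrides only the hooks it needs. As a sketch (`FileDestination` is a hypothetical example, not part of this package; it is shown standalone here, though in the package it would extend `Destination`), a destination that appends each message to a file could look like:

```python
class FileDestination(object):
    """Hypothetical destination that appends each message to a file.
    Mirrors the Destination hooks: open/is_opened/send/close."""

    def __init__(self, path):
        self.path = path
        self.fp = None

    def open(self):
        # Open the log file in append mode.
        self.fp = open(self.path, "a")
        return True

    def is_opened(self):
        return self.fp is not None and not self.fp.closed

    def send(self, message):
        # One message per line; returning True signals success.
        self.fp.write(str(message) + "\n")
        return True

    def close(self):
        if self.fp is not None:
            self.fp.close()
        return True
```

Each hook returns `True` on success, matching the base-class contract above.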
# File: cohesivenet/api/vns3/network_edge_plugins_api.py (cohesive/python-cohesivenet-sdk, MIT)
# coding: utf-8
"""
VNS3 Controller API
Cohesive networks VNS3 API providing complete control of your network's addresses, routes, rules and edge # noqa: E501
The version of the OpenAPI document: 4.8
Contact: solutions@cohesive.net
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
import time
from cohesivenet import Logger
from cohesivenet.api_builder import VersionRouter
from cohesivenet.exceptions import ApiException
########################
# Container System API
########################
def delete_container(api_client, uuid, **kwargs):  # noqa: E501
    """delete_container  # noqa: E501

    Delete a container  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.delete_container(client, uuid, async_req=True)

    :param async_req bool: execute request asynchronously
    :param str uuid: uuid of resource (required)
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()

    collection_formats = {}
    path_params = {"uuid": uuid}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}
    body_params = None

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/containers/{uuid}",
        "DELETE",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def delete_container_image(api_client, uuid, force=False, **kwargs):  # noqa: E501
    """delete_container_image  # noqa: E501

    Delete container image  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.delete_container_image(client, uuid, async_req=True)

    :param async_req bool: execute request asynchronously
    :param str uuid: uuid of resource (required)
    :param bool force: force operation with cleanup of running containers
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["force"]

    collection_formats = {}
    path_params = {"uuid": uuid}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}

    body_params = {}
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        body_params[param] = local_var_params[param]

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/images/{uuid}",
        "DELETE",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def get_container_logs(api_client, uuid, lines=None, **kwargs):  # noqa: E501
    """get_container_logs  # noqa: E501

    Fetch containers log messages  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.get_container_logs(uuid, lines, async_req=True)

    :param async_req bool: execute request asynchronously
    :param str uuid: uuid of resource (required)
    :param int lines: Number of log lines to fetch (required)
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["lines"]

    collection_formats = {}
    path_params = {"uuid": uuid}

    query_params = []
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        query_params.append((param, local_var_params[param]))  # noqa: E501

    header_params = {}
    form_params = []
    local_var_files = {}
    body_params = None

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/containers/{uuid}/logs",
        "GET",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def get_container_system_ips(api_client, **kwargs):  # noqa: E501
    """get_container_system_ips  # noqa: E501

    Retrieve IP address list for current container network configuration  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.get_container_system_ips(client, async_req=True)

    :param async_req bool: execute request asynchronously
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()

    collection_formats = {}
    path_params = {}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}
    body_params = None

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/ip_addresses",
        "GET",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def get_container_images(api_client, uuid=None, **kwargs):  # noqa: E501
    """get_container_images  # noqa: E501

    Get list of existing container system images  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.get_container_images(async_req=True)

    :param async_req bool: execute request asynchronously
    :param str uuid: UUID for image to limit search
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["uuid"]

    collection_formats = {}
    path_params = {}

    query_params = []
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        query_params.append((param, local_var_params[param]))  # noqa: E501

    header_params = {}
    form_params = []
    local_var_files = {}
    body_params = None

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/images",
        "GET",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def get_running_containers(
    api_client, show_all=None, uuid=None, **kwargs
):  # noqa: E501
    """get_running_containers  # noqa: E501

    Provides description information for one or all allocated containers  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.get_running_containers(client, async_req=True)

    :param async_req bool: execute request asynchronously
    :param bool show_all: Boolean for full list output
    :param str uuid: uuid of container to limit search
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["show_all", "uuid"]

    collection_formats = {}
    path_params = {}

    query_params = []
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        query_params.append((param, local_var_params[param]))  # noqa: E501

    header_params = {}
    form_params = []
    local_var_files = {}
    body_params = None

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/containers",
        "GET",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def get_container_system_status(api_client, **kwargs):  # noqa: E501
    """get_container_system_status  # noqa: E501

    Retrieve status of container system  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.get_container_system_status(client, async_req=True)

    :param async_req bool: execute request asynchronously
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()

    collection_formats = {}
    path_params = {}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}
    body_params = None

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system",
        "GET",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def post_action_container_system(api_client, action=None, **kwargs):  # noqa: E501
    """post_action_container_system  # noqa: E501

    Start or Stop container system  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.post_action_container_system(client, action="start", async_req=True)

    :param action str: start or stop
    :param async_req bool: execute request asynchronously
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["action"]

    collection_formats = {}
    path_params = {}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}

    body_params = {}
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        body_params[param] = local_var_params[param]

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # HTTP header `Content-Type`
    header_params["Content-Type"] = api_client.select_header_content_type(  # noqa: E501
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system",
        "POST",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def post_commit_container(
    api_client, uuid, name=None, description=None, **kwargs
):  # noqa: E501
    """post_commit_container  # noqa: E501

    Creates a new container image from a running container  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.post_commit_container(client, uuid, name=name, async_req=True)

    :param str uuid: uuid of resource (required)
    :param str name: (required)
    :param str description:
    :param async_req bool: execute request asynchronously
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["name", "description"]

    collection_formats = {}
    path_params = {"uuid": uuid}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}

    body_params = {}
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        body_params[param] = local_var_params[param]

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # HTTP header `Content-Type`
    header_params["Content-Type"] = api_client.select_header_content_type(  # noqa: E501
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/containers/{uuid}/commit",
        "POST",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
# TODO add more validation for OR
def post_create_container_image(
    api_client,
    name=None,
    url=None,
    buildurl=None,
    localbuild=None,
    localimage=None,
    imagefile=None,
    buildfile=None,
    description=None,
    **kwargs
):  # noqa: E501
    """post_create_container_image  # noqa: E501

    Create new container image  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.post_create_container_image(post_create_container_image, async_req=True)

    :param name str: Name of the image (required)
    :param url str: URL of the image file to be imported
    :param buildurl str: OR URL of a dockerfile that will be used to build the image
    :param localbuild str: OR Local build file to create new image
    :param localimage str: OR Local image to tag
    :param imagefile str: image file upload
    :param buildfile str: Dockerfile or zipped context file upload
    :param description str:
    :param async_req bool: execute request asynchronously
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = [
        "name",
        "url",
        "buildurl",
        "localbuild",
        "localimage",
        "imagefile",
        "buildfile",
        "description",
    ]

    collection_formats = {}
    path_params = {}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}

    body_params = {}
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        body_params[param] = local_var_params[param]

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    if imagefile:
        local_var_files["imagefile"] = imagefile
    if buildfile:
        local_var_files["buildfile"] = buildfile

    # HTTP header `Content-Type`
    if local_var_files:
        header_params["Content-Type"] = api_client.select_header_content_type(  # noqa: E501
            ["multipart/form-data"]
        )  # noqa: E501
        form_params = body_params
        body_params = {}
    else:
        header_params["Content-Type"] = api_client.select_header_content_type(  # noqa: E501
            ["application/json"]
        )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/images",
        "POST",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def post_start_container(
    api_client,
    uuid=None,
    image_uuid=None,
    name=None,
    ipaddress=None,
    description=None,
    command=None,
    environment=None,
    **kwargs
):  # noqa: E501
    """post_start_container  # noqa: E501

    Create (allocate) a new container or start an existing one  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.post_start_container(start_container_request, async_req=True)

    :param uuid str: required OR
    :param image_uuid str: required OR
    :param ipaddress str:
    :param name str:
    :param description str:
    :param command str:
    :param environment str:
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = [
        "uuid",
        "image_uuid",
        "name",
        "ipaddress",
        "description",
        "command",
        "environment",
    ]

    collection_formats = {}
    path_params = {}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}

    body_params = {}
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        body_params[param] = local_var_params[param]

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # HTTP header `Content-Type`
    header_params["Content-Type"] = api_client.select_header_content_type(  # noqa: E501
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/containers",
        "POST",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def put_configure_container_system(api_client, network=None, **kwargs):  # noqa: E501
    """put_configure_container_system  # noqa: E501

    Configures the container network.  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.put_configure_container_system(client, network="192.168.4.0/24", async_req=True)

    :param network str:
    :param async_req bool: execute request asynchronously
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["network"]

    collection_formats = {}
    path_params = {}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}

    body_params = {}
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        body_params[param] = local_var_params[param]

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # HTTP header `Content-Type`
    header_params["Content-Type"] = api_client.select_header_content_type(  # noqa: E501
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system",
        "PUT",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def put_update_container_image(
    api_client, uuid, name=None, description=None, **kwargs
):  # noqa: E501
    """put_update_container_image  # noqa: E501

    Update container image  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.put_update_container_image(uuid, name=name, async_req=True)

    :param str uuid: uuid of resource (required)
    :param str name: (required)
    :param str description:
    :param async_req bool: execute request asynchronously
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()
    request_params = ["name", "description"]

    collection_formats = {}
    path_params = {"uuid": uuid}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}

    body_params = {}
    for param in [p for p in request_params if local_var_params.get(p) is not None]:
        body_params[param] = local_var_params[param]

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # HTTP header `Content-Type`
    header_params["Content-Type"] = api_client.select_header_content_type(  # noqa: E501
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/images/{uuid}",
        "PUT",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def put_stop_container(api_client, uuid, **kwargs):  # noqa: E501
    """put_stop_container  # noqa: E501

    Stops a running container  # noqa: E501
    This method makes a synchronous HTTP request by default. To make an
    asynchronous HTTP request, please pass async_req=True

    >>> response = await api.put_stop_container(uuid, async_req=True)

    :param async_req bool: execute request asynchronously
    :param str uuid: uuid of resource (required)
    :param _return_http_data_only: response data without head status code
        and headers
    :param _preload_content: if False, the urllib3.HTTPResponse object will
        be returned without reading/decoding response
        data. Default is True.
    :param _request_timeout: timeout setting for this request. If one
        number provided, it will be total request
        timeout. It can also be a pair (tuple) of
        (connection, read) timeouts.
    :return: APIResponse or awaitable if async
    """
    local_var_params = locals()

    collection_formats = {}
    path_params = {"uuid": uuid}
    query_params = []
    header_params = {}
    form_params = []
    local_var_files = {}
    body_params = None

    # HTTP header `Accept`
    header_params["Accept"] = api_client.select_header_accept(
        ["application/json"]
    )  # noqa: E501

    # Authentication setting
    auth_settings = ["ApiTokenAuth", "basicAuth"]  # noqa: E501

    return api_client.call_api(
        "/container_system/containers/{uuid}",
        "PUT",
        path_params,
        query_params,
        header_params,
        body=body_params,
        post_params=form_params,
        files=local_var_files,
        response_type="object",  # noqa: E501
        auth_settings=auth_settings,
        async_req=local_var_params.get("async_req"),
        _return_http_data_only=local_var_params.get(
            "_return_http_data_only"
        ),  # noqa: E501
        _preload_content=local_var_params.get("_preload_content", True),
        _request_timeout=local_var_params.get("_request_timeout"),
        collection_formats=collection_formats,
    )
def post_export_image(api_client, uuid, name=None, **kwargs): # noqa: E501
"""post_export_image # noqa: E501
Create exported container image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.post_export_image(uuid, async_req=True)
:param str uuid: uuid of resource (required)
:param str name: name for file (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = ["name"]
collection_formats = {}
path_params = {"uuid": uuid}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/container_system/containers/{uuid}/exports",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
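# The optional-body pattern above (copy only the request params the caller
# actually supplied into the JSON body) recurs throughout this module. A
# minimal self-contained sketch of the pattern; `build_body` is an
# illustrative helper, not part of the generated client:

```python
def build_body(local_var_params, request_params):
    """Copy only the optional request params the caller actually supplied."""
    return {
        p: local_var_params[p]
        for p in request_params
        if local_var_params.get(p) is not None
    }
```

Params left at their `None` default are omitted from the body rather than sent as JSON nulls.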
def wait_for_container_system_state(
api_client, running=True, sleep_time=2.0, timeout=30.0
):
"""Poll container system status until it reports the expected running state
Arguments:
api_client {VNS3Client}
Keyword Arguments:
running {Boolean} (default: {True})
sleep_time {float} (default: {2.0})
timeout {float} (default: {30.0})
Raises:
ApiException: on timeout
Returns:
Boolean
"""
expected_running_state = "true" if running else "false"
start_time = time.time()
while time.time() - start_time < timeout:
call_data = get_container_system_status(api_client)
running_state = str(call_data.response.running).lower()
if running_state == expected_running_state:
return True
time.sleep(sleep_time)
raise ApiException(
"Timeout: Failed to assert container system state running=%s [timeout=%s seconds, host=%s]"
% (expected_running_state, timeout, api_client.host_uri)
)
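# wait_for_container_system_state is an instance of a generic
# poll-until-timeout loop. A self-contained sketch of the pattern;
# `poll_until` is a hypothetical helper, not exported by this module:

```python
import time


def poll_until(predicate, timeout=30.0, sleep_time=2.0):
    """Call `predicate` until it returns truthy or `timeout` seconds pass."""
    start = time.time()
    while time.time() - start < timeout:
        if predicate():
            return True
        time.sleep(sleep_time)
    return False
```

Unlike the function above, the sketch returns False on timeout instead of raising, so the caller decides how to handle it.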
def assert_container_system_state(api_client, running, sleep_time=2.0, timeout=30.0):
"""Assert container system is either stopped or running
Arguments:
api_client {VNS3Client}
running {Boolean}
Keyword Arguments:
sleep_time {float} (default: {2.0})
timeout {float} (default: {30.0})
Raises:
ApiException:
Returns:
Boolean or raises ApiException
"""
action = "start" if running else "stop"
expected_running_state = "true" if running else "false"
expected_in_progress = "starting" if running else "stopping"
action_data = post_action_container_system(api_client, action=action)
response_state = str(action_data.response.running).lower()
if response_state == expected_running_state:
return True
assert response_state == expected_in_progress, (
"Unexpected container system state %r (expected %r or %r)"
% (response_state, expected_running_state, expected_in_progress)
)
return wait_for_container_system_state(
api_client, running, sleep_time=sleep_time, timeout=timeout
)
def restart_container_network(api_client, sleep_time=2.0, timeout=30.0, **kwargs):
"""Restart the container network
Raises:
ApiException: Timeout exception
Returns:
Boolean
"""
start_time = time.time()
assert_container_system_state(
api_client, running=False, sleep_time=sleep_time, timeout=timeout
)
remaining_time = timeout - (time.time() - start_time)
assert_container_system_state(
api_client, running=True, sleep_time=sleep_time, timeout=remaining_time
)
return True
def wait_for_image_import(api_client, import_uuid, timeout=60.0, sleep_time=1.0):
"""Poll for image availability with a UUID
Arguments:
import_uuid {str} - UUID received on import
Keyword Arguments:
timeout {float}
sleep_time {float}
Raises:
ApiException: Raise if timeout or UUID does not exist
Returns:
Boolean
"""
start_time = time.time()
while time.time() - start_time < timeout:
try:
resp_data = get_container_images(api_client, uuid=import_uuid)
except ApiException as e:
if e.status == 500:
Logger.debug(
"API server error [500] waiting for image. Likely due to resource contention. Continuing polling.",
host=api_client.host_uri,
)
time.sleep(sleep_time)
continue
raise e
images = resp_data.response.images
if images is None:
raise ApiException("No images returned. Is container system running?")
elif len(images) == 0:
raise ApiException("Import UUID not found: %s" % import_uuid)
image_status = images[0]["status"]
if image_status == "Ready":
return True
time.sleep(sleep_time)
raise ApiException(
"API timeout [timeout=%s seconds] [Import image uuid=%s]"
% (timeout, import_uuid)
)
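# The 500-handling inside wait_for_image_import generalizes to a
# retry-on-transient-error wrapper. An illustrative sketch, assuming the
# caller supplies the `should_retry` predicate (e.g. "is this an
# ApiException with status 500?"); not part of the generated client:

```python
import time


def retry_transient(fn, should_retry, attempts=3, sleep_time=0.0):
    """Call `fn`, retrying when `should_retry(exc)` is True; re-raise otherwise."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            if attempt + 1 < attempts and should_retry(exc):
                time.sleep(sleep_time)
                continue
            raise
```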
#########################################################
# Plugins API - BETA
#
# Plugins are Cohesive's container abstraction. They
# correspond to Container images and running containers
# while providing more management capabilities to the end
# user. We expect some sysadmins will manage plugins
# without the container expertise one might expect
# of our container system users.
#########################################################
def get_plugins(api_client, *args, **kwargs): # noqa: E501
"""get_plugins # noqa: E501
Get list of plugin images.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugins(async_req=True)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = []
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugins",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin(api_client, id, *args, **kwargs): # noqa: E501
"""get_plugin # noqa: E501
Get plugin image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin(id, async_req=True)
:param async_req bool: execute request asynchronously
:param int id: ID for Plugin image (required)
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugins/{id}",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin_instances(api_client, *args, **kwargs): # noqa: E501
"""get_plugin_instances # noqa: E501
Get list of running plugin instances.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instances(async_req=True)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = []
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin_instance(api_client, id, *args, **kwargs): # noqa: E501
"""get_plugin_instance # noqa: E501
Get running plugin instance # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instance(id, async_req=True)
:param async_req bool: execute request asynchronously
:param int id: ID for Plugin instance (running container) (required)
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def post_commit_plugin_instance(
api_client, id, name=None, description=None, **kwargs
): # noqa: E501
"""post_commit_plugin_instance # noqa: E501
Create new plugin image from a running plugin instance.
This will create a new container image. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.post_commit_plugin_instance(id, name="New Image", async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param str name: Name for new plugin image (required)
:param str description: Description for new plugin image
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = ["name", "description"]
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/commit",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def post_create_manager_config(
api_client,
id,
log_files=None,
configuration_files=None,
ports=None,
process_manager=None,
executables=None,
**kwargs
): # noqa: E501
"""post_create_manager_config # noqa: E501
Create a new plugin manager configuration. If no params are provided,
a vanilla example configuration will be created. See docs for param specs:
https://docs.cohesive.net/docs/network-edge-plugins/plugin-manager/ # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.post_create_manager_config(id, async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param list log_files: List of Log File objects
:param list configuration_files: List of Configuration File objects
:param list ports: List of Port objects
:param dict process_manager: Process Manager object
:param list executables: List of Executable objects
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = [
"log_files",
"configuration_files",
"ports",
"process_manager",
"executables",
]
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/manager",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def put_update_manager_config(
api_client,
id,
log_files=None,
configuration_files=None,
ports=None,
process_manager=None,
executables=None,
**kwargs
): # noqa: E501
"""put_update_manager_config # noqa: E501
Update plugin manager configuration. See docs for param specs:
https://docs.cohesive.net/docs/network-edge-plugins/plugin-manager/ # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.put_update_manager_config(id, async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param list log_files: List of Log File objects
:param list configuration_files: List of Configuration File objects
:param list ports: List of Port objects
:param dict process_manager: Process Manager object
:param list executables: List of Executable objects
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = [
"log_files",
"configuration_files",
"ports",
"process_manager",
"executables",
]
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/manager",
"PUT",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin_instance_log_files(api_client, id, *args, **kwargs): # noqa: E501
"""get_plugin_instance_log_files # noqa: E501
Get plugin instance log file configurations defined in manager configuration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instance_log_files(id, async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/logs",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin_instance_log_content(
api_client, id, file_slug, lines=None, **kwargs
): # noqa: E501
"""get_plugin_instance_log_content # noqa: E501
Read plugin instance log file content # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instance_log_content(id, 0, async_req=True)
:param async_req bool: execute request asynchronously
:param int id: ID for Plugin instance (running container) (required)
:param str file_slug: Either the name of the log file (name key) or the index in the log files list. (required)
:param int lines: Number of log lines to return (default 25)
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id, "slug": file_slug}
query_params = ["lines"]
for param in [p for p in query_params if local_var_params.get(p) is not None]:
query_params.append((param, local_var_params[param])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/logs/{slug}/content",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
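# call_api expects query params as (name, value) tuples, and the functions
# above must not leave the bare param names in the list (hence the separate
# request_params list). The pattern can be sketched as a helper; this is
# illustrative only, not part of the generated client:

```python
def build_query_params(local_var_params, optional_params):
    """Build (name, value) tuples for the optional params the caller supplied."""
    return [
        (p, local_var_params[p])
        for p in optional_params
        if local_var_params.get(p) is not None
    ]
```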
def get_plugin_instance_config_files(api_client, id, **kwargs): # noqa: E501
"""get_plugin_instance_config_files # noqa: E501
Get plugin instance configuration file configs defined in plugin manager config # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instance_config_files(id, async_req=True)
:param async_req bool: execute request asynchronously
:param int id: ID for Plugin instance (running container) (required)
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/configurations",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def revert_plugin_instance_config_file(
api_client, id, file_slug, version=None, **kwargs
): # noqa: E501
"""revert_plugin_instance_config_file # noqa: E501
Revert plugin instance config file to a previous version # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.revert_plugin_instance_config_file(id, 0, version=2, async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param str file_slug: Either the name of the config file (name key) or the index in the config files list. (required)
:param int version: Version of file to revert to (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
request_params = ["version"]
path_params = {"id": id, "slug": file_slug}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/configurations/{slug}/revert",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin_instance_config_content(
api_client, id, file_slug, version=None, **kwargs
): # noqa: E501
"""get_plugin_instance_config_content # noqa: E501
Read plugin instance config file content # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instance_config_content(id, 0, async_req=True)
:param async_req bool: execute request asynchronously
:param int id: ID for Plugin instance (running container) (required)
:param str file_slug: Either the name of the config file (name key) or the index in the config files list. (required)
:param int version: Version of file to retrieve (defaults to current)
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id, "slug": file_slug}
query_params = ["version"]
for param in [p for p in query_params if local_var_params.get(p) is not None]:
query_params.append((param, local_var_params[param])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/configurations/{slug}/content",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def put_plugin_instance_config_content(
api_client, id, file_slug, content=None, **kwargs
): # noqa: E501
"""put_plugin_instance_config_content # noqa: E501
Update plugin instance config file contents
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.put_plugin_instance_config_content(id, file_slug, content="...", async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param str file_slug: Either the name of the config file (name key) or the index in the config files list. (required)
:param str content: File contents as string (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = ["content"]
collection_formats = {}
path_params = {"id": id, "slug": file_slug}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/configurations/{slug}/content",
"PUT",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
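Every generated wrapper above builds its request body the same way: optional keyword arguments listed in `request_params` are copied into `body_params` only when the caller actually supplied them. A minimal standalone sketch of that filtering pattern (the helper name is illustrative, not part of the generated client):

```python
def build_body_params(request_params, local_var_params):
    # Copy only the request parameters that were supplied (i.e. not None).
    return {
        p: local_var_params[p]
        for p in request_params
        if local_var_params.get(p) is not None
    }

body = build_body_params(["content"], {"content": "key: value", "timeout": None})
# body == {"content": "key: value"} — unset parameters are dropped from the body
```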
def delete_plugin_instance_config_version(
api_client, id, file_slug, version, *args, **kwargs
): # noqa: E501
"""delete_plugin_instance_config_version # noqa: E501
Delete plugin instance config file version # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.delete_plugin_instance_config_version(id, 0, 4, async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param str file_slug: Either the name of the config file (name key) or the index in the config files list. (required)
:param int version: Version of file to delete (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id, "slug": file_slug, "version": version}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/configurations/{slug}/versions/{version}",
"DELETE",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin_instance_processes(api_client, id, *args, **kwargs): # noqa: E501
"""get_plugin_instance_processes # noqa: E501
Get plugin instance processes defined in the process_manager section
of the manager configuration file # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instance_processes(id, async_req=True)
:param async_req bool: execute request asynchronously
:param int id: ID for Plugin instance (running container) (required)
:param _return_http_data_only: return response data only, without
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/processes",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def run_plugin_instance_process_action(
api_client, id, process=None, action=None, timeout=None, **kwargs
): # noqa: E501
"""run_plugin_instance_process_action # noqa: E501
Run plugin instance process action for a process defined in process manager config # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.run_plugin_instance_process_action(id, process="main", action="status", async_req=True)
:param async_req bool: execute request asynchronously
:param int id: ID for Plugin instance (running container) (required)
:param str process: Name of the process. Should be listed in subprocesses list of config. (required)
:param str action: Action to take. See documentation for supported actions for your process manager. (required)
:param int timeout: timeout in seconds for the command, defaults to 10
:param _return_http_data_only: return response data only, without
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
request_params = ["process", "action", "timeout"]
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/processes/action",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def run_plugin_instance_executable_command(
api_client, id, command=None, executable_path=None, timeout=None, **kwargs
): # noqa: E501
"""run_plugin_instance_executable_command # noqa: E501
Run command for a plugin instance executable # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.run_plugin_instance_executable_command(id, command="stop", async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param str command: The command to run. (A key in the commands object) (required)
:param str executable_path: Path to executable file in container (required if more than one executable is defined in manager config)
:param int timeout: timeout in seconds for the command, defaults to 2
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
request_params = ["command", "executable_path", "timeout"]
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/commands/execute",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_plugin_instance_firewall(api_client, id, *args, **kwargs): # noqa: E501
"""get_plugin_instance_firewall # noqa: E501
Get list of firewall rules that are related to this plugin instance
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_plugin_instance_firewall(id, async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = []
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/firewall",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def put_plugin_instance_firewall_rule(
api_client,
id,
preset=None,
host_port=None,
container_port=None,
protocol=None,
**kwargs
): # noqa: E501
"""put_plugin_instance_firewall_rule # noqa: E501
Create a firewall rule for this plugin
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.put_plugin_instance_firewall_rule(id, preset="internet", async_req=True)
:param int id: ID for Plugin instance (running container) (required)
:param str preset: One of ssh, internet or port_map (required)
:param int host_port: VNS3 port. Required for preset "port_map"
:param int container_port: Plugin port to map VNS3 port to. Required for preset "port_map"
:param str protocol: Protocol for port map. Required for preset "port_map"
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = ["preset", "host_port", "container_port", "protocol"]
collection_formats = {}
path_params = {"id": id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/plugin-instances/{id}/firewall",
"PUT",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
class NetworkEdgePluginsApiRouter(VersionRouter):
function_library = {
"delete_container": {"4.8.4-5.1.5": delete_container},
"delete_container_image": {"4.8.4-5.1.5": delete_container_image},
"get_container_logs": {"4.8.4-5.1.5": get_container_logs},
"get_container_system_ips": {"4.8.4-5.1.5": get_container_system_ips},
"get_container_system_images": {"4.8.4-5.1.5": get_container_images},
"get_container_images": {"4.8.4-5.1.5": get_container_images},
"get_container_system_running_containers": {
"4.8.4-5.1.5": get_running_containers
},
"get_running_containers": {"4.8.4-5.1.5": get_running_containers},
"get_container_system_status": {"4.8.4-5.1.5": get_container_system_status},
"post_action_container_system": {"4.8.4-5.1.5": post_action_container_system},
"post_commit_container": {"4.8.4-5.1.5": post_commit_container},
"post_create_container_image": {"4.8.4-5.1.5": post_create_container_image},
"post_start_container": {"4.8.4-5.1.5": post_start_container},
"put_configure_container_system": {
"4.8.4-5.1.5": put_configure_container_system
},
"put_update_container_image": {"4.8.4-5.1.5": put_update_container_image},
"put_stop_container": {"4.8.4-5.1.5": put_stop_container},
"wait_for_container_system_state": {
"4.8.4-5.1.5": wait_for_container_system_state
},
"restart_container_network": {"4.8.4-5.1.5": restart_container_network},
"wait_for_image_import": {"4.8.4-5.1.5": wait_for_image_import},
"post_export_image": {"4.9.1-5.1.5": post_export_image}, # plugins
"get_plugins": {"5.0.0-5.1.5": get_plugins},
"get_plugin": {"5.0.0-5.1.5": get_plugin},
"post_commit_plugin_instance": {"5.0.0-5.1.5": post_commit_plugin_instance},
"get_plugin_instances": {"5.0.0-5.1.5": get_plugin_instances},
"get_plugin_instance": {"5.0.0-5.1.5": get_plugin_instance},
"post_create_manager_config": {"5.0.0-5.1.5": post_create_manager_config},
"put_update_manager_config": {"5.0.0-5.1.5": put_update_manager_config},
"get_plugin_instance_log_files": {"5.0.0-5.1.5": get_plugin_instance_log_files},
"get_plugin_instance_log_content": {
"5.0.0-5.1.5": get_plugin_instance_log_content
},
"get_plugin_instance_config_files": {
"5.0.0-5.1.5": get_plugin_instance_config_files
},
"revert_plugin_instance_config_file": {
"5.0.0-5.1.5": revert_plugin_instance_config_file
},
"get_plugin_instance_config_content": {
"5.0.0-5.1.5": get_plugin_instance_config_content
},
"put_plugin_instance_config_content": {
"5.0.0-5.1.5": put_plugin_instance_config_content
},
"delete_plugin_instance_config_version": {
"5.0.0-5.1.5": delete_plugin_instance_config_version
},
"get_plugin_instance_processes": {"5.0.0-5.1.5": get_plugin_instance_processes},
"run_plugin_instance_process_action": {
"5.0.0-5.1.5": run_plugin_instance_process_action
},
"run_plugin_instance_executable_command": {
"5.0.0-5.1.5": run_plugin_instance_executable_command
},
"get_plugin_instance_firewall": {"5.0.0-5.1.5": get_plugin_instance_firewall},
"put_plugin_instance_firewall_rule": {
"5.0.0-5.1.5": put_plugin_instance_firewall_rule
},
}
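`function_library` maps each operation name to implementations keyed by version ranges such as `"5.0.0-5.1.5"`. The `VersionRouter` base class is not shown in this excerpt, but dispatch presumably selects the implementation whose range contains the controller version; a hedged sketch of that lookup (helper names are illustrative, not the actual `VersionRouter` API):

```python
def _version_key(version):
    # "5.1.5" -> (5, 1, 5) so that tuple comparison orders versions correctly
    return tuple(int(part) for part in version.split("."))

def resolve_function(function_library, name, version):
    # Return the implementation whose "low-high" range contains `version`.
    for version_range, func in function_library[name].items():
        low, high = version_range.split("-")
        if _version_key(low) <= _version_key(version) <= _version_key(high):
            return func
    raise KeyError(f"No implementation of {name} for version {version}")

library = {"get_plugins": {"5.0.0-5.1.5": "get_plugins_impl"}}
resolve_function(library, "get_plugins", "5.0.3")  # -> "get_plugins_impl"
```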
# --- dataset file boundary: tests/commands/test_deploy.py — repo wpride/opta (Apache-2.0) ---
from click.testing import CliRunner
from pytest import fixture
from pytest_mock import MockFixture
from opta.cli import cli
from opta.layer import Layer
@fixture(scope="module", autouse=True)
def mock_is_service_config(module_mocker: MockFixture) -> None:
module_mocker.patch("opta.commands.deploy.is_service_config", return_value=True)
def test_deploy_basic(mocker: MockFixture) -> None:
mocked_os_path_exists = mocker.patch("opta.utils.os.path.exists")
mocked_os_path_exists.return_value = True
mock_push = mocker.patch(
"opta.commands.deploy._push", return_value=("local_digest", "local_tag")
)
mock_apply = mocker.patch("opta.commands.deploy._apply")
mocked_layer_class = mocker.patch("opta.commands.deploy.Layer")
mocked_layer = mocker.Mock(spec=Layer)
mocked_layer_class.load_from_yaml.return_value = mocked_layer
mock_terraform_outputs = mocker.patch(
"opta.commands.deploy.Terraform.get_outputs",
return_value={"docker_repo_url": "blah"},
)
runner = CliRunner()
result = runner.invoke(cli, ["deploy", "-i", "local_image:local_tag"])
assert result.exit_code == 0
mock_push.assert_called_once_with(
image="local_image:local_tag", config="opta.yml", env=None, tag=None
)
mock_terraform_outputs.assert_called_once_with(mocked_layer)
mock_apply.assert_called_once_with(
config="opta.yml",
env=None,
refresh=False,
image_tag=None,
test=False,
auto_approve=False,
image_digest="local_digest",
detailed_plan=False,
)
mock_terraform_outputs.assert_called_once_with(mocker.ANY)
def test_deploy_auto_approve(mocker: MockFixture) -> None:
mocked_os_path_exists = mocker.patch("opta.utils.os.path.exists")
mocked_os_path_exists.return_value = True
mock_push = mocker.patch(
"opta.commands.deploy._push", return_value=("local_digest", "local_tag")
)
mock_apply = mocker.patch("opta.commands.deploy._apply")
mocked_layer_class = mocker.patch("opta.commands.deploy.Layer")
mocked_layer = mocker.Mock(spec=Layer)
mocked_layer_class.load_from_yaml.return_value = mocked_layer
mock_terraform_outputs = mocker.patch(
"opta.commands.deploy.Terraform.get_outputs",
return_value={"docker_repo_url": "blah"},
)
runner = CliRunner()
result = runner.invoke(
cli, ["deploy", "-i", "local_image:local_tag", "--auto-approve"]
)
assert result.exit_code == 0
mock_push.assert_called_once_with(
image="local_image:local_tag", config="opta.yml", env=None, tag=None
)
mock_terraform_outputs.assert_called_once_with(mocked_layer)
mock_apply.assert_called_once_with(
config="opta.yml",
env=None,
refresh=False,
image_tag=None,
test=False,
auto_approve=True,
image_digest="local_digest",
detailed_plan=False,
)
mock_terraform_outputs.assert_called_once_with(mocker.ANY)
def test_deploy_all_flags(mocker: MockFixture) -> None:
mocked_os_path_exists = mocker.patch("opta.utils.os.path.exists")
mocked_os_path_exists.return_value = True
mock_push = mocker.patch(
"opta.commands.deploy._push", return_value=("local_digest", "latest")
)
mock_apply = mocker.patch("opta.commands.deploy._apply")
mocked_layer_class = mocker.patch("opta.commands.deploy.Layer")
mocked_layer = mocker.Mock(spec=Layer)
mocked_layer_class.load_from_yaml.return_value = mocked_layer
mock_terraform_outputs = mocker.patch(
"opta.commands.deploy.Terraform.get_outputs",
return_value={"docker_repo_url": "blah"},
)
runner = CliRunner()
result = runner.invoke(
cli,
[
"deploy",
"--image",
"local_image:local_tag",
"--config",
"app/opta.yml",
"--env",
"staging",
"--tag",
"latest",
],
)
assert result.exit_code == 0
mock_push.assert_called_once_with(
image="local_image:local_tag", config="app/opta.yml", env="staging", tag="latest"
)
mock_terraform_outputs.assert_called_once_with(mocked_layer)
mock_apply.assert_called_once_with(
config="app/opta.yml",
env="staging",
refresh=False,
image_tag=None,
test=False,
auto_approve=False,
image_digest="local_digest",
detailed_plan=False,
)
def test_deploy_ecr_apply(mocker: MockFixture) -> None:
mocked_os_path_exists = mocker.patch("opta.utils.os.path.exists")
mocked_os_path_exists.return_value = True
mock_push = mocker.patch(
"opta.commands.deploy._push", return_value=("local_digest", "latest")
)
mock_apply = mocker.patch("opta.commands.deploy._apply")
mocked_layer_class = mocker.patch("opta.commands.deploy.Layer")
mocked_layer = mocker.Mock(spec=Layer)
mocked_layer_class.load_from_yaml.return_value = mocked_layer
mock_terraform_outputs = mocker.patch(
"opta.commands.deploy.Terraform.get_outputs", return_value={},
)
runner = CliRunner()
result = runner.invoke(
cli,
[
"deploy",
"--image",
"local_image:local_tag",
"--config",
"app/opta.yml",
"--env",
"staging",
"--tag",
"latest",
],
)
assert result.exit_code == 0
mock_push.assert_called_once_with(
image="local_image:local_tag", config="app/opta.yml", env="staging", tag="latest"
)
mock_terraform_outputs.assert_called_once_with(mocked_layer)
mock_apply.assert_has_calls(
[
mocker.call(
config="app/opta.yml",
env="staging",
refresh=False,
image_tag=None,
test=False,
auto_approve=False,
stdout_logs=False,
detailed_plan=False,
),
mocker.call(
config="app/opta.yml",
env="staging",
refresh=False,
image_tag=None,
test=False,
auto_approve=False,
image_digest="local_digest",
detailed_plan=False,
),
]
)
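The four tests above rebuild the same `_apply` keyword dictionary by hand each time. A small helper like the following (hypothetical, not part of the opta test suite) would centralize the defaults and keep each test down to its actual differences:

```python
def expected_apply_kwargs(**overrides):
    # Defaults mirror the keyword arguments repeated in the assertions above.
    kwargs = dict(
        config="opta.yml",
        env=None,
        refresh=False,
        image_tag=None,
        test=False,
        auto_approve=False,
        image_digest="local_digest",
        detailed_plan=False,
    )
    kwargs.update(overrides)
    return kwargs

# usage: mock_apply.assert_called_once_with(**expected_apply_kwargs(config="app/opta.yml", env="staging"))
```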
# --- dataset file boundary: code/report-specific/qAgentsPlaygrounds/QLearningAgents.py — repo ekarais/RLFM (MIT) ---
from agents import Buyer, Seller, MarketAgent
#from environments import MarketEnvironment
import random
import math
import numpy as np
class QLearningBuyer(Buyer):
def __init__(self, agent_id: str, reservation_price: float, default_price: float, n_states=11, alpha=0.1, gamma=0.1, epsilon=0.1):
"""
A q-learning buyer agent that extends the market agent
The agent has a discrete number of states, which correspond to its offer at the timestep
:param agent_id: a unique id that differentiates this agent to other agents
:param reservation_price: the reservation price, or maximum price that this agent is
willing to buy
:param default_price: the default price, or starting price that this agent is
willing to buy at the first timestep. Smaller than the reservation price
"""
assert reservation_price > default_price,"Buyer Default Price must be smaller than the Reservation Price!"
super().__init__(agent_id, reservation_price)
self.n_states = n_states
self.default_price = default_price
self.offers = np.linspace(default_price, reservation_price, n_states)
self.q_table = np.zeros((n_states, n_states))
#Starting state is the default price index
self.state = 0
self.next_state = 0
self.alpha = alpha
self.gamma = gamma
self.epsilon = epsilon
#Whether or not the agent has had a deal in this market episode.
self.done = False
#List for storing rewards in each episode, not used as info for the learning process
self.rewards = []
def get_offer(self, previous: float, offers: dict, verbose=False, greedy=False):
"""
With Information provided by the Market Setting, decide on a new offer that the agent
believes will succeed
"""
#If greedy: always exploit, never expore
if(greedy):
eps = 0
else:
eps = self.epsilon
if(random.uniform(0,1) < eps):
self.next_state = random.randint(0,self.n_states-1)
if verbose: print(f'{self.agent_id} Exploring: next:{self.next_state}, previous:{self.state} ')
else:
#self.next_state = np.argmax(self.q_table[self.state])
self.next_state = np.random.choice(np.where(self.q_table[self.state] == self.q_table[self.state].max())[0])
if verbose: print(f'{self.agent_id} Exploiting: next:{self.next_state}, previous:{self.state} ')
new = self.offers[self.next_state]
new_offer = {self.agent_id: new}
offers.update(new_offer)
def update_table(self, reward, verbose=False):
'''
Update q table of agent according to the reward received from market step
'''
if not self.done:
old_value = self.q_table[self.state, self.next_state]
next_max = np.max(self.q_table[self.next_state])
new_value = (1 - self.alpha) * old_value + self.alpha * (reward + self.gamma * next_max)
self.q_table[self.state,self.next_state] = new_value
# If Agent has just had a deal, update self.done
if ((not self.done) and (reward != 0)):
#self.done = True
if verbose: print(f'Agent done! Reward is {reward}')
self.state = self.next_state
class QLearningSeller(Seller):
def __init__(self, agent_id: str, reservation_price: float, default_price: float, n_states=11, alpha=0.1, gamma=0.1, epsilon=0.1):
"""
A q-learning seller agent that extends the market agent
The agent has a discrete number of states, which correspond to its offer at the timestep
:param agent_id: a unique id that differentiates this agent to other agents
:param reservation_price: the reservation price, or minimum price that this agent is
willing to sell
:param default_price: the default price, or starting price that this agent is
willing to sell at the first timestep. Greater than the reservation price
"""
assert reservation_price < default_price,"Seller Default Price must be greater than the Reservation Price!"
super().__init__(agent_id, reservation_price)
self.n_states = n_states
self.default_price = default_price
self.offers = np.linspace(reservation_price, default_price, n_states)
self.q_table = np.zeros((n_states, n_states))
#Starting state is the default price index
self.state = n_states - 1
self.next_state = n_states - 1
self.alpha = alpha
self.gamma = gamma
self.epsilon = epsilon
#Whether or not the agent has had a deal in this market episode.
self.done = False
#List for storing rewards in each episode, not used as info for the learning process
self.rewards = []
def get_offer(self, previous: float, offers: dict, verbose=False, greedy=False):
"""
With Information provided by the Market Setting, decide on a new offer that the agent
believes will succeed
"""
#If greedy: always exploit, never expore
if(greedy):
eps = 0
else:
eps = self.epsilon
if(random.uniform(0,1) < eps):
self.next_state = random.randint(0,self.n_states-1)
if verbose: print(f'{self.agent_id} Exploring: next:{self.next_state}, previous:{self.state} ')
else:
#self.next_state = np.argmax(self.q_table[self.state])
self.next_state = np.random.choice(np.where(self.q_table[self.state] == self.q_table[self.state].max())[0])
if verbose: print(f'{self.agent_id} Exploiting: next:{self.next_state}, previous:{self.state} ')
new = self.offers[self.next_state]
new_offer = {self.agent_id: new}
offers.update(new_offer)
def update_table(self, reward, verbose=False):
'''
Update q table of agent according to the reward received from market step
'''
if not self.done:
old_value = self.q_table[self.state, self.next_state]
next_max = np.max(self.q_table[self.next_state])
new_value = (1 - self.alpha) * old_value + self.alpha * (reward + self.gamma * next_max)
if verbose: print(f'Updating: new value of q[{self.state}][{self.next_state}]={new_value}')
self.q_table[self.state,self.next_state] = new_value
# If Agent has just had a deal, update self.done
if ((not self.done) and (reward != 0)):
#self.done = True
if verbose: print(f'Agent done!')
self.state = self.next_state | 47.65035 | 135 | 0.633695 | 939 | 6,814 | 4.466454 | 0.149095 | 0.043872 | 0.071292 | 0.040057 | 0.908202 | 0.885074 | 0.885074 | 0.879113 | 0.861946 | 0.833572 | 0 | 0.007299 | 0.276196 | 6,814 | 143 | 136 | 47.65035 | 0.843066 | 0.299237 | 0 | 0.790123 | 0 | 0 | 0.117712 | 0.049213 | 0 | 0 | 0 | 0 | 0.024691 | 1 | 0.074074 | false | 0 | 0.049383 | 0 | 0.148148 | 0.08642 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# --- dataset file boundary: tree_methods.py — repo Simbold/PyFinQ (CC0-1.0) ---
import numpy as np
import numba
@numba.jit(nopython=True, parallel=True)
def generate_full_tree(spot, m, u, d):
# spot: underlyings spot price
# m: integer number of steps
# u: up value
# d: down value
# returns a full price tree as [(m+1)x(m+1)] numpy array
tree = np.zeros((m+1, m+1))
tree[0, 0] = spot
print("generating tree")
for j in range(1, m+1):
tree[j, j] = tree[j-1, j-1] * d
tree[0:j, j] = tree[0:j, j-1] * u
return tree
@numba.jit(nopython=True, parallel=True)
def generate_tree_state(spot, m, u, d):
# spot: underlyings spot price
# m: integer number of steps
# u: up value
# d: down value
# returns tree state at maturity as [(m+1),] array
tree = np.zeros(m + 1)
tree[0] = spot
print("generate tree state at maturity")
for j in range(1, m + 1):
tree[j] = tree[j-1] * d
tree[0:j] = tree[0:j] * u
return tree
@numba.jit(nopython=True, parallel=True)
def european_state_iterator(v, m, q, r, dt):
# v: an [(m+1),] array of the option value state at maturity
# m: integer number of steps
# q: Q probability of up
# r: risk free rate
    # dt: time step, T/m (maturity over number of steps)
# returns value of the option
for j in range(m, 0, -1):
v = (q * v[0:j] + (1 - q) * v[1:j + 1]) * np.exp(-r * dt)
if j == round(m * 0.5):
print("Backwards iteration at: 50%")
return v[0]
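As a usage sketch: with the Cox-Ross-Rubinstein parameterization of `u`, `d` and the risk-neutral probability `q` (an assumption — the module leaves these choices to the caller), the terminal state built as in `generate_tree_state` combined with the backward induction above prices a European call. It is written here in plain NumPy so it runs without numba:

```python
import numpy as np

def crr_european_call(spot, strike, r, sigma, mT, m):
    # Cox-Ross-Rubinstein up/down factors and risk-neutral probability
    dt = mT / m
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)
    # terminal stock prices, same layout as generate_tree_state:
    # tree[i] = spot * u**(m - i) * d**i
    tree = spot * u ** np.arange(m, -1, -1) * d ** np.arange(0, m + 1)
    v = np.maximum(tree - strike, 0.0)  # call payoff at maturity
    # backward induction, mirroring european_state_iterator
    for j in range(m, 0, -1):
        v = (q * v[0:j] + (1 - q) * v[1:j + 1]) * np.exp(-r * dt)
    return v[0]
```

With spot = strike = 100, r = 5%, sigma = 20%, one year to maturity and 500 steps, this converges to the Black-Scholes call value of roughly 10.45.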
@numba.jit(nopython=True, parallel=True)
def european_full_iterator(v, m, q, r, dt):
# v: an initial [(m+1)x(m+1)] numpy array of the option value with v[:, -1] option value state at maturity
# m: integer number of steps
# q: Q probability of up
# r: risk free rate
    # dt: time step, T/m (maturity over number of steps)
# returns [(m+1)x(m+1)] numpy array of full value tree of the option
for j in range(m, 0, -1):
v[0:j, j - 1] = (q * v[0:j, j] + (1 - q) * v[1:j + 1, j]) * np.exp(-r * dt)
if j == round(m * 0.5):
print("Backwards iteration at: 50%")
    return v
@numba.jit(nopython=True)
def conditional_state_iterator(v, tree, m, q, r, dt, condition, cond1=0, cond2=0, cond3=0):
    # v: an [(m+1),] array of the option value state at maturity
    # tree: full price tree as [(m+1)x(m+1)] numpy array
    # m: integer number of steps
    # q: Q probability of up
    # r: risk free rate
    # dt: time step, T/m (maturity over number of steps)
    # condition: condition function on how to manipulate the state (for example american early exercise or barrier etc.)
    # cond1, cond2, cond3: extra arguments forwarded to condition (for example the strike)
    # returns value of the option
    for j in range(m, 0, -1):
        for i in range(0, j):
            v[i] = condition((q * v[i] + (1 - q) * v[i + 1]) * np.exp(-r * dt), tree[i, j - 1], cond1, cond2, cond3)
        v = v[0:j]
        if j == round(m * 0.5):
            print("Backwards iteration at: 50%")
    return v[0]
@numba.jit(nopython=True, parallel=True)
def conditional_full_iterator(v, tree, m, q, r, dt, condition, cond1, cond2, cond3):
    # v: an initial [(m+1)x(m+1)] numpy array of the option value with v[:, -1] the option value state at maturity
    # tree: full price tree as [(m+1)x(m+1)] numpy array
    # m: integer number of steps
    # q: Q probability of up
    # r: risk free rate
    # dt: time step, T/m (maturity over number of steps)
    # condition: condition function on how to manipulate the state (for example american early exercise or barrier etc.)
    # cond1, cond2, cond3: extra arguments forwarded to condition (for example the strike)
    # returns [(m+1)x(m+1)] numpy array of the full value tree of the option
    for j in range(m, 0, -1):
        for i in numba.prange(0, j):
            v[i, j - 1] = condition((q * v[i, j] + (1 - q) * v[i + 1, j]) * np.exp(-r * dt), tree[i, j - 1], cond1, cond2, cond3)
        if j == round(m * 0.5):
            print("Backwards iteration at: 50%")
    return v
@numba.jit(nopython=True)
def american_put_condition(v, s, strike, cond2=0, cond3=0):
# function to be passed to conditional iterator for american put
vm = np.maximum(v, np.maximum(strike - s, 0))
return vm
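The conditional iterator combined with `american_put_condition` amounts to the following backward induction — sketched in plain NumPy (without numba) under an assumed CRR parameterization, and stepping the price state back one period at a time by dividing by `u` rather than storing the full tree:

```python
import numpy as np

def crr_american_put(spot, strike, r, sigma, mT, m):
    # CRR parameterization (an assumption; the module leaves u, d, q to the caller)
    dt = mT / m
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)
    # terminal stock prices: tree[i] = spot * u**(m - i) * d**i
    tree = spot * u ** np.arange(m, -1, -1) * d ** np.arange(0, m + 1)
    v = np.maximum(strike - tree, 0.0)  # put payoff at maturity
    for j in range(m, 0, -1):
        tree = tree[0:j] / u            # price state one step earlier
        cont = (q * v[0:j] + (1 - q) * v[1:j + 1]) * np.exp(-r * dt)
        # american_put_condition: max(continuation value, intrinsic value)
        v = np.maximum(cont, np.maximum(strike - tree, 0.0))
    return v[0]
```

At spot = strike = 100, r = 5%, sigma = 20%, one year and 500 steps, the early-exercise check lifts the value visibly above the corresponding European put (about 5.57), since the optimal policy exercises deep in the money.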
# Copyright 2017 FUJITSU LIMITED
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import copy
from unittest import mock
from osc_lib.tests import utils as osctestutils
from ironicclient import exc
from ironicclient.osc.v1 import baremetal_volume_target as bm_vol_target
from ironicclient.tests.unit.osc.v1 import fakes as baremetal_fakes
class TestBaremetalVolumeTarget(baremetal_fakes.TestBaremetal):
def setUp(self):
super(TestBaremetalVolumeTarget, self).setUp()
self.baremetal_mock = self.app.client_manager.baremetal
self.baremetal_mock.reset_mock()
class TestCreateBaremetalVolumeTarget(TestBaremetalVolumeTarget):
def setUp(self):
super(TestCreateBaremetalVolumeTarget, self).setUp()
self.baremetal_mock.volume_target.create.return_value = (
baremetal_fakes.FakeBaremetalResource(
None,
copy.deepcopy(baremetal_fakes.VOLUME_TARGET),
loaded=True,
))
# Get the command object to test
self.cmd = (
bm_vol_target.CreateBaremetalVolumeTarget(self.app, None))
def test_baremetal_volume_target_create(self):
arglist = [
'--node', baremetal_fakes.baremetal_uuid,
'--type',
baremetal_fakes.baremetal_volume_target_volume_type,
'--boot-index',
baremetal_fakes.baremetal_volume_target_boot_index,
'--volume-id',
baremetal_fakes.baremetal_volume_target_volume_id,
'--uuid', baremetal_fakes.baremetal_volume_target_uuid,
]
verifylist = [
('node_uuid', baremetal_fakes.baremetal_uuid),
('volume_type',
baremetal_fakes.baremetal_volume_target_volume_type),
('boot_index',
baremetal_fakes.baremetal_volume_target_boot_index),
('volume_id',
baremetal_fakes.baremetal_volume_target_volume_id),
('uuid', baremetal_fakes.baremetal_volume_target_uuid),
]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
args = {
'node_uuid': baremetal_fakes.baremetal_uuid,
'volume_type':
baremetal_fakes.baremetal_volume_target_volume_type,
'boot_index':
baremetal_fakes.baremetal_volume_target_boot_index,
'volume_id':
baremetal_fakes.baremetal_volume_target_volume_id,
'uuid': baremetal_fakes.baremetal_volume_target_uuid,
}
self.baremetal_mock.volume_target.create.assert_called_once_with(
**args)
def test_baremetal_volume_target_create_without_uuid(self):
arglist = [
'--node', baremetal_fakes.baremetal_uuid,
'--type',
baremetal_fakes.baremetal_volume_target_volume_type,
'--boot-index',
baremetal_fakes.baremetal_volume_target_boot_index,
'--volume-id',
baremetal_fakes.baremetal_volume_target_volume_id,
]
verifylist = [
('node_uuid', baremetal_fakes.baremetal_uuid),
('volume_type',
baremetal_fakes.baremetal_volume_target_volume_type),
('boot_index',
baremetal_fakes.baremetal_volume_target_boot_index),
('volume_id',
baremetal_fakes.baremetal_volume_target_volume_id),
]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
args = {
'node_uuid': baremetal_fakes.baremetal_uuid,
'volume_type':
baremetal_fakes.baremetal_volume_target_volume_type,
'boot_index':
baremetal_fakes.baremetal_volume_target_boot_index,
'volume_id':
baremetal_fakes.baremetal_volume_target_volume_id,
}
self.baremetal_mock.volume_target.create.assert_called_once_with(
**args)
def test_baremetal_volume_target_create_extras(self):
arglist = [
'--node', baremetal_fakes.baremetal_uuid,
'--type',
baremetal_fakes.baremetal_volume_target_volume_type,
'--boot-index',
baremetal_fakes.baremetal_volume_target_boot_index,
'--volume-id',
baremetal_fakes.baremetal_volume_target_volume_id,
'--extra', 'key1=value1',
'--extra', 'key2=value2',
]
verifylist = [
('node_uuid', baremetal_fakes.baremetal_uuid),
('volume_type',
baremetal_fakes.baremetal_volume_target_volume_type),
('boot_index',
baremetal_fakes.baremetal_volume_target_boot_index),
('volume_id',
baremetal_fakes.baremetal_volume_target_volume_id),
('extra', ['key1=value1', 'key2=value2'])
]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
args = {
'node_uuid': baremetal_fakes.baremetal_uuid,
'volume_type':
baremetal_fakes.baremetal_volume_target_volume_type,
'boot_index':
baremetal_fakes.baremetal_volume_target_boot_index,
'volume_id':
baremetal_fakes.baremetal_volume_target_volume_id,
'extra': baremetal_fakes.baremetal_volume_target_extra,
}
self.baremetal_mock.volume_target.create.assert_called_once_with(
**args)
def _test_baremetal_volume_target_missing_param(self, missing):
argdict = {
'--node': baremetal_fakes.baremetal_uuid,
'--type':
baremetal_fakes.baremetal_volume_target_volume_type,
'--boot-index':
baremetal_fakes.baremetal_volume_target_boot_index,
'--volume-id':
baremetal_fakes.baremetal_volume_target_volume_id,
'--uuid': baremetal_fakes.baremetal_volume_target_uuid,
}
arglist = []
for k, v in argdict.items():
if k not in missing:
arglist += [k, v]
verifylist = None
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_create_missing_node(self):
self._test_baremetal_volume_target_missing_param(['--node'])
def test_baremetal_volume_target_create_missing_type(self):
self._test_baremetal_volume_target_missing_param(['--type'])
def test_baremetal_volume_target_create_missing_boot_index(self):
self._test_baremetal_volume_target_missing_param(['--boot-index'])
def test_baremetal_volume_target_create_missing_volume_id(self):
self._test_baremetal_volume_target_missing_param(['--volume-id'])
def test_baremetal_volume_target_create_invalid_boot_index(self):
arglist = [
'--node', baremetal_fakes.baremetal_uuid,
'--type',
baremetal_fakes.baremetal_volume_target_volume_type,
'--boot-index', 'string',
'--volume-id',
baremetal_fakes.baremetal_volume_target_volume_id,
]
verifylist = None
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_create_negative_boot_index(self):
arglist = [
'--node', baremetal_fakes.baremetal_uuid,
'--type',
baremetal_fakes.baremetal_volume_target_volume_type,
'--boot-index', '-1',
'--volume-id',
baremetal_fakes.baremetal_volume_target_volume_id,
]
verifylist = [
('node_uuid', baremetal_fakes.baremetal_uuid),
('volume_type',
baremetal_fakes.baremetal_volume_target_volume_type),
('boot_index', -1),
('volume_id',
baremetal_fakes.baremetal_volume_target_volume_id),
]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.assertRaises(exc.CommandError, self.cmd.take_action, parsed_args)
class TestShowBaremetalVolumeTarget(TestBaremetalVolumeTarget):
def setUp(self):
super(TestShowBaremetalVolumeTarget, self).setUp()
self.baremetal_mock.volume_target.get.return_value = (
baremetal_fakes.FakeBaremetalResource(
None,
copy.deepcopy(baremetal_fakes.VOLUME_TARGET),
loaded=True))
self.cmd = (
bm_vol_target.ShowBaremetalVolumeTarget(self.app, None))
def test_baremetal_volume_target_show(self):
arglist = ['vvv-tttttt-vvvv']
verifylist = [('volume_target',
baremetal_fakes.baremetal_volume_target_uuid)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
args = ['vvv-tttttt-vvvv']
self.baremetal_mock.volume_target.get.assert_called_once_with(
*args, fields=None)
collist = ('boot_index', 'extra', 'node_uuid', 'properties', 'uuid',
'volume_id', 'volume_type')
self.assertEqual(collist, columns)
datalist = (
baremetal_fakes.baremetal_volume_target_boot_index,
baremetal_fakes.baremetal_volume_target_extra,
baremetal_fakes.baremetal_uuid,
baremetal_fakes.baremetal_volume_target_properties,
baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_volume_target_volume_id,
baremetal_fakes.baremetal_volume_target_volume_type,
)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_show_no_options(self):
arglist = []
verifylist = []
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_show_fields(self):
arglist = ['vvv-tttttt-vvvv', '--fields', 'uuid', 'volume_id']
verifylist = [('fields', [['uuid', 'volume_id']]),
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid)]
fake_vt = copy.deepcopy(baremetal_fakes.VOLUME_TARGET)
fake_vt.pop('node_uuid')
fake_vt.pop('volume_type')
fake_vt.pop('boot_index')
fake_vt.pop('extra')
fake_vt.pop('properties')
self.baremetal_mock.volume_target.get.return_value = (
baremetal_fakes.FakeBaremetalResource(
None,
fake_vt,
loaded=True))
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
args = ['vvv-tttttt-vvvv']
fields = ['uuid', 'volume_id']
self.baremetal_mock.volume_target.get.assert_called_once_with(
*args, fields=fields)
collist = ('uuid', 'volume_id')
self.assertEqual(collist, columns)
datalist = (
baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_volume_target_volume_id,
)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_show_fields_multiple(self):
arglist = ['vvv-tttttt-vvvv', '--fields', 'uuid', 'volume_id',
'--fields', 'volume_type']
verifylist = [('fields', [['uuid', 'volume_id'], ['volume_type']]),
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid)]
fake_vt = copy.deepcopy(baremetal_fakes.VOLUME_TARGET)
fake_vt.pop('node_uuid')
fake_vt.pop('boot_index')
fake_vt.pop('extra')
fake_vt.pop('properties')
self.baremetal_mock.volume_target.get.return_value = (
baremetal_fakes.FakeBaremetalResource(
None,
fake_vt,
loaded=True))
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
args = ['vvv-tttttt-vvvv']
fields = ['uuid', 'volume_id', 'volume_type']
self.baremetal_mock.volume_target.get.assert_called_once_with(
*args, fields=fields)
collist = ('uuid', 'volume_id', 'volume_type')
self.assertEqual(collist, columns)
datalist = (
baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_volume_target_volume_id,
baremetal_fakes.baremetal_volume_target_volume_type,
)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_show_invalid_fields(self):
arglist = ['vvv-tttttt-vvvv', '--fields', 'uuid', 'invalid']
verifylist = None
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
class TestListBaremetalVolumeTarget(TestBaremetalVolumeTarget):
def setUp(self):
super(TestListBaremetalVolumeTarget, self).setUp()
self.baremetal_mock.volume_target.list.return_value = [
baremetal_fakes.FakeBaremetalResource(
None,
copy.deepcopy(baremetal_fakes.VOLUME_TARGET),
loaded=True)
]
self.cmd = (
bm_vol_target.ListBaremetalVolumeTarget(self.app, None))
def test_baremetal_volume_target_list(self):
arglist = []
verifylist = []
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
kwargs = {
'marker': None,
'limit': None}
self.baremetal_mock.volume_target.list.assert_called_once_with(
**kwargs)
collist = (
"UUID",
"Node UUID",
"Driver Volume Type",
"Boot Index",
"Volume ID")
self.assertEqual(collist, columns)
datalist = ((baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_uuid,
baremetal_fakes.baremetal_volume_target_volume_type,
baremetal_fakes.baremetal_volume_target_boot_index,
baremetal_fakes.baremetal_volume_target_volume_id),)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_list_node(self):
arglist = ['--node', baremetal_fakes.baremetal_uuid]
verifylist = [('node', baremetal_fakes.baremetal_uuid)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
kwargs = {
'node': baremetal_fakes.baremetal_uuid,
'marker': None,
'limit': None}
self.baremetal_mock.volume_target.list.assert_called_once_with(
**kwargs)
collist = (
"UUID",
"Node UUID",
"Driver Volume Type",
"Boot Index",
"Volume ID")
self.assertEqual(collist, columns)
datalist = ((baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_uuid,
baremetal_fakes.baremetal_volume_target_volume_type,
baremetal_fakes.baremetal_volume_target_boot_index,
baremetal_fakes.baremetal_volume_target_volume_id),)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_list_long(self):
arglist = ['--long']
verifylist = [('detail', True)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
kwargs = {
'detail': True,
'marker': None,
'limit': None,
}
self.baremetal_mock.volume_target.list.assert_called_with(**kwargs)
collist = ('UUID', 'Node UUID', 'Driver Volume Type', 'Properties',
'Boot Index', 'Extra', 'Volume ID', 'Created At',
'Updated At')
self.assertEqual(collist, columns)
datalist = ((baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_uuid,
baremetal_fakes.baremetal_volume_target_volume_type,
baremetal_fakes.baremetal_volume_target_properties,
baremetal_fakes.baremetal_volume_target_boot_index,
baremetal_fakes.baremetal_volume_target_extra,
baremetal_fakes.baremetal_volume_target_volume_id,
'',
''),)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_list_fields(self):
arglist = ['--fields', 'uuid', 'boot_index']
verifylist = [('fields', [['uuid', 'boot_index']])]
fake_vt = copy.deepcopy(baremetal_fakes.VOLUME_TARGET)
fake_vt.pop('volume_type')
fake_vt.pop('extra')
fake_vt.pop('properties')
fake_vt.pop('volume_id')
fake_vt.pop('node_uuid')
self.baremetal_mock.volume_target.list.return_value = [
baremetal_fakes.FakeBaremetalResource(
None,
fake_vt,
loaded=True)
]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
kwargs = {
'detail': False,
'marker': None,
'limit': None,
'fields': ('uuid', 'boot_index')
}
self.baremetal_mock.volume_target.list.assert_called_with(**kwargs)
collist = ('UUID', 'Boot Index')
self.assertEqual(collist, columns)
datalist = ((baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_volume_target_boot_index),)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_list_fields_multiple(self):
arglist = ['--fields', 'uuid', 'boot_index', '--fields', 'extra']
verifylist = [('fields', [['uuid', 'boot_index'], ['extra']])]
fake_vt = copy.deepcopy(baremetal_fakes.VOLUME_TARGET)
fake_vt.pop('volume_type')
fake_vt.pop('properties')
fake_vt.pop('volume_id')
fake_vt.pop('node_uuid')
self.baremetal_mock.volume_target.list.return_value = [
baremetal_fakes.FakeBaremetalResource(
None,
fake_vt,
loaded=True)
]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
columns, data = self.cmd.take_action(parsed_args)
kwargs = {
'detail': False,
'marker': None,
'limit': None,
'fields': ('uuid', 'boot_index', 'extra')
}
self.baremetal_mock.volume_target.list.assert_called_with(**kwargs)
collist = ('UUID', 'Boot Index', 'Extra')
self.assertEqual(collist, columns)
datalist = ((baremetal_fakes.baremetal_volume_target_uuid,
baremetal_fakes.baremetal_volume_target_boot_index,
baremetal_fakes.baremetal_volume_target_extra),)
self.assertEqual(datalist, tuple(data))
def test_baremetal_volume_target_list_invalid_fields(self):
arglist = ['--fields', 'uuid', 'invalid']
verifylist = [('fields', [['uuid', 'invalid']])]
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_list_marker(self):
arglist = ['--marker', baremetal_fakes.baremetal_volume_target_uuid]
verifylist = [
('marker', baremetal_fakes.baremetal_volume_target_uuid)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
kwargs = {
'marker': baremetal_fakes.baremetal_volume_target_uuid,
'limit': None}
self.baremetal_mock.volume_target.list.assert_called_once_with(
**kwargs)
def test_baremetal_volume_target_list_limit(self):
arglist = ['--limit', '10']
verifylist = [('limit', 10)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
kwargs = {
'marker': None,
'limit': 10}
self.baremetal_mock.volume_target.list.assert_called_once_with(
**kwargs)
def test_baremetal_volume_target_list_sort(self):
arglist = ['--sort', 'boot_index']
verifylist = [('sort', 'boot_index')]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
kwargs = {
'marker': None,
'limit': None}
self.baremetal_mock.volume_target.list.assert_called_once_with(
**kwargs)
def test_baremetal_volume_target_list_sort_desc(self):
arglist = ['--sort', 'boot_index:desc']
verifylist = [('sort', 'boot_index:desc')]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
kwargs = {
'marker': None,
'limit': None}
self.baremetal_mock.volume_target.list.assert_called_once_with(
**kwargs)
def test_baremetal_volume_target_list_exclusive_options(self):
arglist = ['--fields', 'uuid', '--long']
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, [])
def test_baremetal_volume_target_list_negative_limit(self):
arglist = ['--limit', '-1']
verifylist = [('limit', -1)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.assertRaises(exc.CommandError,
self.cmd.take_action,
parsed_args)
class TestDeleteBaremetalVolumeTarget(TestBaremetalVolumeTarget):
def setUp(self):
super(TestDeleteBaremetalVolumeTarget, self).setUp()
self.cmd = bm_vol_target.DeleteBaremetalVolumeTarget(self.app, None)
def test_baremetal_volume_target_delete(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid]
verifylist = [('volume_targets',
[baremetal_fakes.baremetal_volume_target_uuid])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.delete.assert_called_with(
baremetal_fakes.baremetal_volume_target_uuid)
def test_baremetal_volume_target_delete_multiple(self):
fake_volume_target_uuid2 = 'vvv-tttttt-tttt'
arglist = [baremetal_fakes.baremetal_volume_target_uuid,
fake_volume_target_uuid2]
verifylist = [('volume_targets',
[baremetal_fakes.baremetal_volume_target_uuid,
fake_volume_target_uuid2])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.delete.assert_has_calls(
[mock.call(baremetal_fakes.baremetal_volume_target_uuid),
mock.call(fake_volume_target_uuid2)])
self.assertEqual(
2, self.baremetal_mock.volume_target.delete.call_count)
def test_baremetal_volume_target_delete_no_options(self):
arglist = []
verifylist = []
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_delete_error(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid]
verifylist = [('volume_targets',
[baremetal_fakes.baremetal_volume_target_uuid])]
self.baremetal_mock.volume_target.delete.side_effect = (
exc.NotFound())
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.assertRaises(exc.ClientException,
self.cmd.take_action,
parsed_args)
self.baremetal_mock.volume_target.delete.assert_called_with(
baremetal_fakes.baremetal_volume_target_uuid)
def test_baremetal_volume_target_delete_multiple_error(self):
fake_volume_target_uuid2 = 'vvv-tttttt-tttt'
arglist = [baremetal_fakes.baremetal_volume_target_uuid,
fake_volume_target_uuid2]
verifylist = [('volume_targets',
[baremetal_fakes.baremetal_volume_target_uuid,
fake_volume_target_uuid2])]
self.baremetal_mock.volume_target.delete.side_effect = [
None, exc.NotFound()]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.assertRaises(exc.ClientException,
self.cmd.take_action,
parsed_args)
self.baremetal_mock.volume_target.delete.assert_has_calls(
[mock.call(baremetal_fakes.baremetal_volume_target_uuid),
mock.call(fake_volume_target_uuid2)])
self.assertEqual(
2, self.baremetal_mock.volume_target.delete.call_count)
class TestSetBaremetalVolumeTarget(TestBaremetalVolumeTarget):
def setUp(self):
super(TestSetBaremetalVolumeTarget, self).setUp()
self.cmd = (
bm_vol_target.SetBaremetalVolumeTarget(self.app, None))
def test_baremetal_volume_target_set_node_uuid(self):
new_node_uuid = 'xxx-xxxxxx-zzzz'
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--node', new_node_uuid]
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('node_uuid', new_node_uuid)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/node_uuid', 'value': new_node_uuid, 'op': 'add'}])
def test_baremetal_volume_target_set_volume_type(self):
new_type = 'fibre_channel'
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--type', new_type]
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('volume_type', new_type)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/volume_type', 'value': new_type, 'op': 'add'}])
def test_baremetal_volume_target_set_boot_index(self):
new_boot_idx = '3'
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--boot-index', new_boot_idx]
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('boot_index', int(new_boot_idx))]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/boot_index', 'value': int(new_boot_idx), 'op': 'add'}])
def test_baremetal_volume_target_set_negative_boot_index(self):
new_boot_idx = '-3'
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--boot-index', new_boot_idx]
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('boot_index', int(new_boot_idx))]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.assertRaises(exc.CommandError, self.cmd.take_action, parsed_args)
def test_baremetal_volume_target_set_invalid_boot_index(self):
new_boot_idx = 'string'
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--boot-index', new_boot_idx]
verifylist = None
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_set_volume_id(self):
new_volume_id = 'new-volume-id'
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--volume-id', new_volume_id]
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('volume_id', new_volume_id)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/volume_id', 'value': new_volume_id, 'op': 'add'}])
def test_baremetal_volume_target_set_volume_type_and_volume_id(self):
new_volume_type = 'fibre_channel'
new_volume_id = 'new-volume-id'
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--type', new_volume_type,
'--volume-id', new_volume_id]
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('volume_type', new_volume_type),
('volume_id', new_volume_id)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/volume_type', 'value': new_volume_type, 'op': 'add'},
{'path': '/volume_id', 'value': new_volume_id, 'op': 'add'}])
def test_baremetal_volume_target_set_extra(self):
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--extra', 'foo=bar']
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('extra', ['foo=bar'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/extra/foo', 'value': 'bar', 'op': 'add'}])
def test_baremetal_volume_target_set_multiple_extras(self):
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--extra', 'key1=val1', '--extra', 'key2=val2']
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('extra', ['key1=val1', 'key2=val2'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/extra/key1', 'value': 'val1', 'op': 'add'},
{'path': '/extra/key2', 'value': 'val2', 'op': 'add'}])
def test_baremetal_volume_target_set_property(self):
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--property', 'foo=bar']
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('properties', ['foo=bar'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/properties/foo', 'value': 'bar', 'op': 'add'}])
def test_baremetal_volume_target_set_multiple_properties(self):
arglist = [
baremetal_fakes.baremetal_volume_target_uuid,
'--property', 'key1=val1', '--property', 'key2=val2']
verifylist = [
('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('properties', ['key1=val1', 'key2=val2'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/properties/key1', 'value': 'val1', 'op': 'add'},
{'path': '/properties/key2', 'value': 'val2', 'op': 'add'}])
def test_baremetal_volume_target_set_no_options(self):
arglist = []
verifylist = []
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_set_no_property(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid]
verifylist = [('volume_target',
baremetal_fakes.baremetal_volume_target_uuid)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_not_called()
class TestUnsetBaremetalVolumeTarget(TestBaremetalVolumeTarget):
def setUp(self):
super(TestUnsetBaremetalVolumeTarget, self).setUp()
self.cmd = bm_vol_target.UnsetBaremetalVolumeTarget(self.app, None)
def test_baremetal_volume_target_unset_extra(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid,
'--extra', 'key1']
verifylist = [('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('extra', ['key1'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/extra/key1', 'op': 'remove'}])
def test_baremetal_volume_target_unset_multiple_extras(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid,
'--extra', 'key1', '--extra', 'key2']
verifylist = [('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('extra', ['key1', 'key2'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/extra/key1', 'op': 'remove'},
{'path': '/extra/key2', 'op': 'remove'}])
def test_baremetal_volume_target_unset_property(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid,
'--property', 'key11']
verifylist = [('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('properties', ['key11'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/properties/key11', 'op': 'remove'}])
def test_baremetal_volume_target_unset_multiple_properties(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid,
'--property', 'key11', '--property', 'key22']
verifylist = [('volume_target',
baremetal_fakes.baremetal_volume_target_uuid),
('properties', ['key11', 'key22'])]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_called_once_with(
baremetal_fakes.baremetal_volume_target_uuid,
[{'path': '/properties/key11', 'op': 'remove'},
{'path': '/properties/key22', 'op': 'remove'}])
def test_baremetal_volume_target_unset_no_options(self):
arglist = []
verifylist = []
self.assertRaises(osctestutils.ParserException,
self.check_parser,
self.cmd, arglist, verifylist)
def test_baremetal_volume_target_unset_no_property(self):
arglist = [baremetal_fakes.baremetal_volume_target_uuid]
verifylist = [('volume_target',
baremetal_fakes.baremetal_volume_target_uuid)]
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
self.cmd.take_action(parsed_args)
self.baremetal_mock.volume_target.update.assert_not_called()
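The assertions in these tests check that `take_action` calls `volume_target.update` with a JSON Patch (RFC 6902) list of operations. A minimal sketch of how such a patch list is built, independent of the CLI plumbing (the helper name here is illustrative, not the plugin's actual function):

```python
def build_patch(op, prefix, values):
    """Build a JSON Patch list like the ones the tests above assert on.

    op: "add" or "remove".
    prefix: attribute root, e.g. "extra" or "properties".
    values: ["key=value", ...] for add, ["key", ...] for remove.
    """
    patch = []
    for item in values:
        if op == "add":
            key, _, value = item.partition("=")
            patch.append({"path": f"/{prefix}/{key}", "value": value, "op": "add"})
        else:
            patch.append({"path": f"/{prefix}/{item}", "op": "remove"})
    return patch

# Mirrors the expected call in test_baremetal_volume_target_unset_multiple_extras:
print(build_patch("remove", "extra", ["key1", "key2"]))
```

The `--extra` and `--property` options simply differ in the `prefix` used when the patch paths are assembled.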
# --- sdk/python/pulumi_aws/transfer/access.py (chivandikwa/pulumi-aws, ECL-2.0 / Apache-2.0) ---
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['AccessArgs', 'Access']
@pulumi.input_type
class AccessArgs:
def __init__(__self__, *,
external_id: pulumi.Input[str],
server_id: pulumi.Input[str],
home_directory: Optional[pulumi.Input[str]] = None,
home_directory_mappings: Optional[pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]]] = None,
home_directory_type: Optional[pulumi.Input[str]] = None,
policy: Optional[pulumi.Input[str]] = None,
posix_profile: Optional[pulumi.Input['AccessPosixProfileArgs']] = None,
role: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Access resource.
:param pulumi.Input[str] external_id: The SID of a group in the directory connected to the Transfer Server (e.g., `S-1-1-12-1234567890-123456789-1234567890-1234`)
:param pulumi.Input[str] server_id: The Server ID of the Transfer Server (e.g., `s-12345678`)
:param pulumi.Input[str] home_directory: The landing directory (folder) for a user when they log in to the server using their SFTP client. It should begin with a `/`. The first item in the path is the name of the home bucket (accessible as `${Transfer:HomeBucket}` in the policy) and the rest is the home directory (accessible as `${Transfer:HomeDirectory}` in the policy). For example, `/example-bucket-1234/username` would set the home bucket to `example-bucket-1234` and the home directory to `username`.
:param pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]] home_directory_mappings: Logical directory mappings that specify what S3 paths and keys should be visible to your user and how you want to make them visible. See Home Directory Mappings below.
:param pulumi.Input[str] home_directory_type: The type of landing directory (folder) you mapped for your users' home directory. Valid values are `PATH` and `LOGICAL`.
:param pulumi.Input['AccessPosixProfileArgs'] posix_profile: Specifies the full POSIX identity, including user ID (Uid), group ID (Gid), and any secondary group IDs (SecondaryGids), that controls your users' access to your Amazon EFS file systems. See Posix Profile below.
:param pulumi.Input[str] role: Amazon Resource Name (ARN) of an IAM role that allows the service to control your user’s access to your Amazon S3 bucket.
"""
pulumi.set(__self__, "external_id", external_id)
pulumi.set(__self__, "server_id", server_id)
if home_directory is not None:
pulumi.set(__self__, "home_directory", home_directory)
if home_directory_mappings is not None:
pulumi.set(__self__, "home_directory_mappings", home_directory_mappings)
if home_directory_type is not None:
pulumi.set(__self__, "home_directory_type", home_directory_type)
if policy is not None:
pulumi.set(__self__, "policy", policy)
if posix_profile is not None:
pulumi.set(__self__, "posix_profile", posix_profile)
if role is not None:
pulumi.set(__self__, "role", role)
@property
@pulumi.getter(name="externalId")
def external_id(self) -> pulumi.Input[str]:
"""
The SID of a group in the directory connected to the Transfer Server (e.g., `S-1-1-12-1234567890-123456789-1234567890-1234`)
"""
return pulumi.get(self, "external_id")
@external_id.setter
def external_id(self, value: pulumi.Input[str]):
pulumi.set(self, "external_id", value)
@property
@pulumi.getter(name="serverId")
def server_id(self) -> pulumi.Input[str]:
"""
The Server ID of the Transfer Server (e.g., `s-12345678`)
"""
return pulumi.get(self, "server_id")
@server_id.setter
def server_id(self, value: pulumi.Input[str]):
pulumi.set(self, "server_id", value)
@property
@pulumi.getter(name="homeDirectory")
def home_directory(self) -> Optional[pulumi.Input[str]]:
"""
The landing directory (folder) for a user when they log in to the server using their SFTP client. It should begin with a `/`. The first item in the path is the name of the home bucket (accessible as `${Transfer:HomeBucket}` in the policy) and the rest is the home directory (accessible as `${Transfer:HomeDirectory}` in the policy). For example, `/example-bucket-1234/username` would set the home bucket to `example-bucket-1234` and the home directory to `username`.
"""
return pulumi.get(self, "home_directory")
@home_directory.setter
def home_directory(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "home_directory", value)
@property
@pulumi.getter(name="homeDirectoryMappings")
def home_directory_mappings(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]]]:
"""
Logical directory mappings that specify what S3 paths and keys should be visible to your user and how you want to make them visible. See Home Directory Mappings below.
"""
return pulumi.get(self, "home_directory_mappings")
@home_directory_mappings.setter
def home_directory_mappings(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]]]):
pulumi.set(self, "home_directory_mappings", value)
@property
@pulumi.getter(name="homeDirectoryType")
def home_directory_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of landing directory (folder) you mapped for your users' home directory. Valid values are `PATH` and `LOGICAL`.
"""
return pulumi.get(self, "home_directory_type")
@home_directory_type.setter
def home_directory_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "home_directory_type", value)
@property
@pulumi.getter
def policy(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "policy")
@policy.setter
def policy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "policy", value)
@property
@pulumi.getter(name="posixProfile")
def posix_profile(self) -> Optional[pulumi.Input['AccessPosixProfileArgs']]:
"""
Specifies the full POSIX identity, including user ID (Uid), group ID (Gid), and any secondary group IDs (SecondaryGids), that controls your users' access to your Amazon EFS file systems. See Posix Profile below.
"""
return pulumi.get(self, "posix_profile")
@posix_profile.setter
def posix_profile(self, value: Optional[pulumi.Input['AccessPosixProfileArgs']]):
pulumi.set(self, "posix_profile", value)
@property
@pulumi.getter
def role(self) -> Optional[pulumi.Input[str]]:
"""
Amazon Resource Name (ARN) of an IAM role that allows the service to control your user’s access to your Amazon S3 bucket.
"""
return pulumi.get(self, "role")
@role.setter
def role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "role", value)
@pulumi.input_type
class _AccessState:
def __init__(__self__, *,
external_id: Optional[pulumi.Input[str]] = None,
home_directory: Optional[pulumi.Input[str]] = None,
home_directory_mappings: Optional[pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]]] = None,
home_directory_type: Optional[pulumi.Input[str]] = None,
policy: Optional[pulumi.Input[str]] = None,
posix_profile: Optional[pulumi.Input['AccessPosixProfileArgs']] = None,
role: Optional[pulumi.Input[str]] = None,
server_id: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Access resources.
:param pulumi.Input[str] external_id: The SID of a group in the directory connected to the Transfer Server (e.g., `S-1-1-12-1234567890-123456789-1234567890-1234`)
:param pulumi.Input[str] home_directory: The landing directory (folder) for a user when they log in to the server using their SFTP client. It should begin with a `/`. The first item in the path is the name of the home bucket (accessible as `${Transfer:HomeBucket}` in the policy) and the rest is the home directory (accessible as `${Transfer:HomeDirectory}` in the policy). For example, `/example-bucket-1234/username` would set the home bucket to `example-bucket-1234` and the home directory to `username`.
:param pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]] home_directory_mappings: Logical directory mappings that specify what S3 paths and keys should be visible to your user and how you want to make them visible. See Home Directory Mappings below.
:param pulumi.Input[str] home_directory_type: The type of landing directory (folder) you mapped for your users' home directory. Valid values are `PATH` and `LOGICAL`.
:param pulumi.Input['AccessPosixProfileArgs'] posix_profile: Specifies the full POSIX identity, including user ID (Uid), group ID (Gid), and any secondary group IDs (SecondaryGids), that controls your users' access to your Amazon EFS file systems. See Posix Profile below.
:param pulumi.Input[str] role: Amazon Resource Name (ARN) of an IAM role that allows the service to control your user’s access to your Amazon S3 bucket.
:param pulumi.Input[str] server_id: The Server ID of the Transfer Server (e.g., `s-12345678`)
"""
if external_id is not None:
pulumi.set(__self__, "external_id", external_id)
if home_directory is not None:
pulumi.set(__self__, "home_directory", home_directory)
if home_directory_mappings is not None:
pulumi.set(__self__, "home_directory_mappings", home_directory_mappings)
if home_directory_type is not None:
pulumi.set(__self__, "home_directory_type", home_directory_type)
if policy is not None:
pulumi.set(__self__, "policy", policy)
if posix_profile is not None:
pulumi.set(__self__, "posix_profile", posix_profile)
if role is not None:
pulumi.set(__self__, "role", role)
if server_id is not None:
pulumi.set(__self__, "server_id", server_id)
@property
@pulumi.getter(name="externalId")
def external_id(self) -> Optional[pulumi.Input[str]]:
"""
The SID of a group in the directory connected to the Transfer Server (e.g., `S-1-1-12-1234567890-123456789-1234567890-1234`)
"""
return pulumi.get(self, "external_id")
@external_id.setter
def external_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "external_id", value)
@property
@pulumi.getter(name="homeDirectory")
def home_directory(self) -> Optional[pulumi.Input[str]]:
"""
The landing directory (folder) for a user when they log in to the server using their SFTP client. It should begin with a `/`. The first item in the path is the name of the home bucket (accessible as `${Transfer:HomeBucket}` in the policy) and the rest is the home directory (accessible as `${Transfer:HomeDirectory}` in the policy). For example, `/example-bucket-1234/username` would set the home bucket to `example-bucket-1234` and the home directory to `username`.
"""
return pulumi.get(self, "home_directory")
@home_directory.setter
def home_directory(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "home_directory", value)
@property
@pulumi.getter(name="homeDirectoryMappings")
def home_directory_mappings(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]]]:
"""
Logical directory mappings that specify what S3 paths and keys should be visible to your user and how you want to make them visible. See Home Directory Mappings below.
"""
return pulumi.get(self, "home_directory_mappings")
@home_directory_mappings.setter
def home_directory_mappings(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['AccessHomeDirectoryMappingArgs']]]]):
pulumi.set(self, "home_directory_mappings", value)
@property
@pulumi.getter(name="homeDirectoryType")
def home_directory_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of landing directory (folder) you mapped for your users' home directory. Valid values are `PATH` and `LOGICAL`.
"""
return pulumi.get(self, "home_directory_type")
@home_directory_type.setter
def home_directory_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "home_directory_type", value)
@property
@pulumi.getter
def policy(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "policy")
@policy.setter
def policy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "policy", value)
@property
@pulumi.getter(name="posixProfile")
def posix_profile(self) -> Optional[pulumi.Input['AccessPosixProfileArgs']]:
"""
Specifies the full POSIX identity, including user ID (Uid), group ID (Gid), and any secondary group IDs (SecondaryGids), that controls your users' access to your Amazon EFS file systems. See Posix Profile below.
"""
return pulumi.get(self, "posix_profile")
@posix_profile.setter
def posix_profile(self, value: Optional[pulumi.Input['AccessPosixProfileArgs']]):
pulumi.set(self, "posix_profile", value)
@property
@pulumi.getter
def role(self) -> Optional[pulumi.Input[str]]:
"""
Amazon Resource Name (ARN) of an IAM role that allows the service to control your user’s access to your Amazon S3 bucket.
"""
return pulumi.get(self, "role")
@role.setter
def role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "role", value)
@property
@pulumi.getter(name="serverId")
def server_id(self) -> Optional[pulumi.Input[str]]:
"""
The Server ID of the Transfer Server (e.g., `s-12345678`)
"""
return pulumi.get(self, "server_id")
@server_id.setter
def server_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_id", value)
class Access(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
external_id: Optional[pulumi.Input[str]] = None,
home_directory: Optional[pulumi.Input[str]] = None,
home_directory_mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['AccessHomeDirectoryMappingArgs']]]]] = None,
home_directory_type: Optional[pulumi.Input[str]] = None,
policy: Optional[pulumi.Input[str]] = None,
posix_profile: Optional[pulumi.Input[pulumi.InputType['AccessPosixProfileArgs']]] = None,
role: Optional[pulumi.Input[str]] = None,
server_id: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides an AWS Transfer Access resource.
## Example Usage
### Basic S3
```python
import pulumi
import pulumi_aws as aws
example = aws.transfer.Access("example",
external_id="S-1-1-12-1234567890-123456789-1234567890-1234",
server_id=aws_transfer_server["example"]["id"],
role=aws_iam_role["example"]["arn"],
home_directory=f"/{aws_s3_bucket['example']['id']}/")
```
### Basic EFS
```python
import pulumi
import pulumi_aws as aws
test = aws.transfer.Access("test",
external_id="S-1-1-12-1234567890-123456789-1234567890-1234",
server_id=aws_transfer_server["test"]["id"],
role=aws_iam_role["test"]["arn"],
home_directory=f"/{aws_efs_file_system['test']['id']}/",
posix_profile=aws.transfer.AccessPosixProfileArgs(
gid=1000,
uid=1000,
))
```
## Import
Transfer Accesses can be imported using the `server_id` and `external_id`, e.g.,
```sh
$ pulumi import aws:transfer/access:Access example s-12345678/S-1-1-12-1234567890-123456789-1234567890-1234
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] external_id: The SID of a group in the directory connected to the Transfer Server (e.g., `S-1-1-12-1234567890-123456789-1234567890-1234`)
:param pulumi.Input[str] home_directory: The landing directory (folder) for a user when they log in to the server using their SFTP client. It should begin with a `/`. The first item in the path is the name of the home bucket (accessible as `${Transfer:HomeBucket}` in the policy) and the rest is the home directory (accessible as `${Transfer:HomeDirectory}` in the policy). For example, `/example-bucket-1234/username` would set the home bucket to `example-bucket-1234` and the home directory to `username`.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['AccessHomeDirectoryMappingArgs']]]] home_directory_mappings: Logical directory mappings that specify what S3 paths and keys should be visible to your user and how you want to make them visible. See Home Directory Mappings below.
:param pulumi.Input[str] home_directory_type: The type of landing directory (folder) you mapped for your users' home directory. Valid values are `PATH` and `LOGICAL`.
:param pulumi.Input[pulumi.InputType['AccessPosixProfileArgs']] posix_profile: Specifies the full POSIX identity, including user ID (Uid), group ID (Gid), and any secondary group IDs (SecondaryGids), that controls your users' access to your Amazon EFS file systems. See Posix Profile below.
:param pulumi.Input[str] role: Amazon Resource Name (ARN) of an IAM role that allows the service to control your user’s access to your Amazon S3 bucket.
:param pulumi.Input[str] server_id: The Server ID of the Transfer Server (e.g., `s-12345678`)
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: AccessArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides an AWS Transfer Access resource.
## Example Usage
### Basic S3
```python
import pulumi
import pulumi_aws as aws
example = aws.transfer.Access("example",
external_id="S-1-1-12-1234567890-123456789-1234567890-1234",
server_id=aws_transfer_server["example"]["id"],
role=aws_iam_role["example"]["arn"],
home_directory=f"/{aws_s3_bucket['example']['id']}/")
```
### Basic EFS
```python
import pulumi
import pulumi_aws as aws
test = aws.transfer.Access("test",
external_id="S-1-1-12-1234567890-123456789-1234567890-1234",
server_id=aws_transfer_server["test"]["id"],
role=aws_iam_role["test"]["arn"],
home_directory=f"/{aws_efs_file_system['test']['id']}/",
posix_profile=aws.transfer.AccessPosixProfileArgs(
gid=1000,
uid=1000,
))
```
## Import
Transfer Accesses can be imported using the `server_id` and `external_id`, e.g.,
```sh
$ pulumi import aws:transfer/access:Access example s-12345678/S-1-1-12-1234567890-123456789-1234567890-1234
```
:param str resource_name: The name of the resource.
:param AccessArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(AccessArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
external_id: Optional[pulumi.Input[str]] = None,
home_directory: Optional[pulumi.Input[str]] = None,
home_directory_mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['AccessHomeDirectoryMappingArgs']]]]] = None,
home_directory_type: Optional[pulumi.Input[str]] = None,
policy: Optional[pulumi.Input[str]] = None,
posix_profile: Optional[pulumi.Input[pulumi.InputType['AccessPosixProfileArgs']]] = None,
role: Optional[pulumi.Input[str]] = None,
server_id: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = AccessArgs.__new__(AccessArgs)
if external_id is None and not opts.urn:
raise TypeError("Missing required property 'external_id'")
__props__.__dict__["external_id"] = external_id
__props__.__dict__["home_directory"] = home_directory
__props__.__dict__["home_directory_mappings"] = home_directory_mappings
__props__.__dict__["home_directory_type"] = home_directory_type
__props__.__dict__["policy"] = policy
__props__.__dict__["posix_profile"] = posix_profile
__props__.__dict__["role"] = role
if server_id is None and not opts.urn:
raise TypeError("Missing required property 'server_id'")
__props__.__dict__["server_id"] = server_id
super(Access, __self__).__init__(
'aws:transfer/access:Access',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
external_id: Optional[pulumi.Input[str]] = None,
home_directory: Optional[pulumi.Input[str]] = None,
home_directory_mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['AccessHomeDirectoryMappingArgs']]]]] = None,
home_directory_type: Optional[pulumi.Input[str]] = None,
policy: Optional[pulumi.Input[str]] = None,
posix_profile: Optional[pulumi.Input[pulumi.InputType['AccessPosixProfileArgs']]] = None,
role: Optional[pulumi.Input[str]] = None,
server_id: Optional[pulumi.Input[str]] = None) -> 'Access':
"""
Get an existing Access resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] external_id: The SID of a group in the directory connected to the Transfer Server (e.g., `S-1-1-12-1234567890-123456789-1234567890-1234`)
:param pulumi.Input[str] home_directory: The landing directory (folder) for a user when they log in to the server using their SFTP client. It should begin with a `/`. The first item in the path is the name of the home bucket (accessible as `${Transfer:HomeBucket}` in the policy) and the rest is the home directory (accessible as `${Transfer:HomeDirectory}` in the policy). For example, `/example-bucket-1234/username` would set the home bucket to `example-bucket-1234` and the home directory to `username`.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['AccessHomeDirectoryMappingArgs']]]] home_directory_mappings: Logical directory mappings that specify what S3 paths and keys should be visible to your user and how you want to make them visible. See Home Directory Mappings below.
:param pulumi.Input[str] home_directory_type: The type of landing directory (folder) you mapped for your users' home directory. Valid values are `PATH` and `LOGICAL`.
:param pulumi.Input[pulumi.InputType['AccessPosixProfileArgs']] posix_profile: Specifies the full POSIX identity, including user ID (Uid), group ID (Gid), and any secondary group IDs (SecondaryGids), that controls your users' access to your Amazon EFS file systems. See Posix Profile below.
:param pulumi.Input[str] role: Amazon Resource Name (ARN) of an IAM role that allows the service to control your user’s access to your Amazon S3 bucket.
:param pulumi.Input[str] server_id: The Server ID of the Transfer Server (e.g., `s-12345678`)
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _AccessState.__new__(_AccessState)
__props__.__dict__["external_id"] = external_id
__props__.__dict__["home_directory"] = home_directory
__props__.__dict__["home_directory_mappings"] = home_directory_mappings
__props__.__dict__["home_directory_type"] = home_directory_type
__props__.__dict__["policy"] = policy
__props__.__dict__["posix_profile"] = posix_profile
__props__.__dict__["role"] = role
__props__.__dict__["server_id"] = server_id
return Access(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="externalId")
def external_id(self) -> pulumi.Output[str]:
"""
The SID of a group in the directory connected to the Transfer Server (e.g., `S-1-1-12-1234567890-123456789-1234567890-1234`)
"""
return pulumi.get(self, "external_id")
@property
@pulumi.getter(name="homeDirectory")
def home_directory(self) -> pulumi.Output[Optional[str]]:
"""
The landing directory (folder) for a user when they log in to the server using their SFTP client. It should begin with a `/`. The first item in the path is the name of the home bucket (accessible as `${Transfer:HomeBucket}` in the policy) and the rest is the home directory (accessible as `${Transfer:HomeDirectory}` in the policy). For example, `/example-bucket-1234/username` would set the home bucket to `example-bucket-1234` and the home directory to `username`.
"""
return pulumi.get(self, "home_directory")
@property
@pulumi.getter(name="homeDirectoryMappings")
def home_directory_mappings(self) -> pulumi.Output[Optional[Sequence['outputs.AccessHomeDirectoryMapping']]]:
"""
Logical directory mappings that specify what S3 paths and keys should be visible to your user and how you want to make them visible. See Home Directory Mappings below.
"""
return pulumi.get(self, "home_directory_mappings")
@property
@pulumi.getter(name="homeDirectoryType")
def home_directory_type(self) -> pulumi.Output[Optional[str]]:
"""
The type of landing directory (folder) you mapped for your users' home directory. Valid values are `PATH` and `LOGICAL`.
"""
return pulumi.get(self, "home_directory_type")
@property
@pulumi.getter
def policy(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "policy")
@property
@pulumi.getter(name="posixProfile")
def posix_profile(self) -> pulumi.Output[Optional['outputs.AccessPosixProfile']]:
"""
Specifies the full POSIX identity, including user ID (Uid), group ID (Gid), and any secondary group IDs (SecondaryGids), that controls your users' access to your Amazon EFS file systems. See Posix Profile below.
"""
return pulumi.get(self, "posix_profile")
@property
@pulumi.getter
def role(self) -> pulumi.Output[Optional[str]]:
"""
Amazon Resource Name (ARN) of an IAM role that allows the service to control your user’s access to your Amazon S3 bucket.
"""
return pulumi.get(self, "role")
@property
@pulumi.getter(name="serverId")
def server_id(self) -> pulumi.Output[str]:
"""
The Server ID of the Transfer Server (e.g., `s-12345678`)
"""
return pulumi.get(self, "server_id")
# --- test.py (dkhokhlov/conv2d, MIT) ---
import numpy as np
import conv
print('\n======================== Test 1')
image = np.array(
[[[
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0]
]]], dtype='float32') # N=1, Ci=1, H=4, W=4
kernel = np.ones((3, 1, 3, 3), dtype='float32') # Co, Ci, K, K
output = np.zeros((1, 3, 4, 4), dtype='float32') # N, Co, H, W
conv.conv2d(image, kernel, output)
print(image)
print('------')
print(kernel)
print('------')
print(output)
print('\n======================== Test 2')
image = np.array(
[[[
[1, 1, 1, 1],
[1, 1, 1, 1],
[1, 1, 1, 1],
[1, 1, 1, 1]
]]], dtype='float32') # N=1, Ci=1, H=4, W=4
kernel = np.ones((1, 1, 1, 1), dtype='float32') # Co, Ci, K, K
output = np.zeros((1, 1, 4, 4), dtype='float32') # N, Co, H, W
conv.conv2d(image, kernel, output)
print(image)
print('------')
print(kernel)
print('------')
print(output)
print('\n======================== Test 3')
kernel = np.ones((1, 1, 1, 1), dtype='float32') # Co, Ci, K, K
output = np.zeros((1, 1, 4, 4), dtype='float32') # N, Co, H, W
conv.conv2d(image, kernel, output)
print(image)
print('------')
print(kernel)
print('------')
print(output)
print('\n======================== Test 4')
image = np.array(
[[[
[0, 1, 2, 3],
[4, 5, 6, 7],
[8, 9, 10, 11],
[12, 13, 14, 15]
]]], dtype='float32') # N=1, Ci=1, H=4, W=4
kernel = np.ones((3, 1, 3, 3), dtype='float32') # Co, Ci, K, K
output = np.zeros((1, 3, 4, 4), dtype='float32') # N, Co, H, W
conv.conv2d(image, kernel, output, (0., 0.01))
print(image)
print('------')
print(kernel)
print('------')
print(output)
print('------')
print('bias', (0., 0.01))
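For reference, here is a plain-NumPy sketch of what a same-padded, stride-1 NCHW conv2d like the one exercised above computes. This is an assumed semantics: the tests do not pin down the extension's padding behavior or the meaning of the `(0., 0.01)` bias tuple, so the scalar `bias` parameter below is illustrative only.

```python
import numpy as np

def conv2d_ref(image, kernel, bias=0.0):
    """Naive zero-padded, stride-1 2-D convolution over NCHW tensors.

    image:  (N, Ci, H, W); kernel: (Co, Ci, K, K) with odd K.
    Returns (N, Co, H, W). Assumed semantics -- see the note above.
    """
    N, Ci, H, W = image.shape
    Co, _, K, _ = kernel.shape
    pad = K // 2
    padded = np.pad(image, ((0, 0), (0, 0), (pad, pad), (pad, pad)))
    out = np.zeros((N, Co, H, W), dtype=image.dtype)
    for n in range(N):
        for co in range(Co):
            for y in range(H):
                for x in range(W):
                    # Elementwise product of the K x K window across all
                    # input channels, summed, plus the (assumed) bias.
                    out[n, co, y, x] = np.sum(
                        padded[n, :, y:y + K, x:x + K] * kernel[co]) + bias
    return out

# Test 2 above: a 1x1 kernel of ones over an all-ones image reproduces the image.
img = np.ones((1, 1, 4, 4), dtype='float32')
ker = np.ones((1, 1, 1, 1), dtype='float32')
print(conv2d_ref(img, ker))
```

With a 3x3 kernel of ones over an all-ones image, interior outputs are 9 and corner outputs are 4, which is a quick way to sanity-check the zero-padding assumption against `conv.conv2d`.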
# --- accounting/migrations/0001_initial.py (venkat0708/BalajiVV, MIT) ---
# -*- coding: utf-8 -*-
# Generated by Django 1.10.7 on 2018-02-23 20:24
from __future__ import unicode_literals
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('customers', '0009_staff'),
('products', '0004_auto_20180219_0114'),
('booking', '0005_auto_20180224_0154'),
]
operations = [
migrations.CreateModel(
name='Bill',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_date', models.DateTimeField(auto_now_add=True)),
('updated_date', models.DateTimeField(auto_now=True)),
('generated_date', models.DateField(verbose_name='date invoice generated')),
('due_date', models.DateField(verbose_name='date payment is expected')),
('status', models.CharField(choices=[('CREATED', 'Created'), ('CONFIRMED', 'Confirmed'), ('PARTIAL_PAYMENT', 'Partially Paid'), ('RECEIVED', 'Received'), ('CLOSED', 'Closed')], default='CREATED', max_length=15)),
('amount', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('paid', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('booked_service', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='billed_services', to='booking.Booked_Service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Commission',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_date', models.DateTimeField(auto_now_add=True)),
('updated_date', models.DateTimeField(auto_now=True)),
('generated_date', models.DateField(verbose_name='date invoice generated')),
('due_date', models.DateField(verbose_name='date payment is expected')),
('status', models.CharField(choices=[('CREATED', 'Created'), ('CONFIRMED', 'Confirmed'), ('PARTIAL_PAYMENT', 'Partially Paid'), ('RECEIVED', 'Received'), ('CLOSED', 'Closed')], default='CREATED', max_length=15)),
('amount', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('paid', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('booked_service', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='commissions', to='booking.Booked_Service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Commission_Structure',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_date', models.DateTimeField(auto_now_add=True)),
('updated_date', models.DateTimeField(auto_now=True)),
('amount', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(100000, message='Amount should be less than 100000')])),
('service', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='commissions', to='products.Service')),
('staff', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='commissions', to='customers.Staff')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Invoice',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_date', models.DateTimeField(auto_now_add=True)),
('updated_date', models.DateTimeField(auto_now=True)),
('generated_date', models.DateField(verbose_name='date invoice generated')),
('due_date', models.DateField(verbose_name='date payment is expected')),
('status', models.CharField(choices=[('CREATED', 'Created'), ('CONFIRMED', 'Confirmed'), ('PARTIAL_PAYMENT', 'Partially Paid'), ('RECEIVED', 'Received'), ('CLOSED', 'Closed')], default='CREATED', max_length=15)),
('amount', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('paid', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('customer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='invoices', to='customers.Customer')),
('event', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='invoice', to='booking.Event')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Payin',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_date', models.DateTimeField(auto_now_add=True)),
('updated_date', models.DateTimeField(auto_now=True)),
('date', models.DateField(verbose_name='payment date')),
('time', models.TimeField(verbose_name='payment time')),
('amount', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('mode', models.CharField(choices=[('BANK', 'Bank'), ('CHEQUE', 'Cheque'), ('DD', 'Demand Draft'), ('CASH', 'Cash')], default='CASH', max_length=15)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Payout',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_date', models.DateTimeField(auto_now_add=True)),
('updated_date', models.DateTimeField(auto_now=True)),
('date', models.DateField(verbose_name='payment date')),
('time', models.TimeField(verbose_name='payment time')),
('amount', models.IntegerField(default=500, validators=[django.core.validators.MinValueValidator(10, message='Amount should be greater than 10'), django.core.validators.MaxValueValidator(10000000, message='Amount should be less than 10000000')])),
('mode', models.CharField(choices=[('BANK', 'Bank'), ('CHEQUE', 'Cheque'), ('DD', 'Demand Draft'), ('CASH', 'Cash')], default='CASH', max_length=15)),
],
options={
'abstract': False,
},
),
migrations.AddField(
model_name='invoice',
name='payins',
field=models.ManyToManyField(related_name='Payins', to='accounting.Payin'),
),
migrations.AddField(
model_name='commission',
name='payouts',
field=models.ManyToManyField(related_name='Payouts', to='accounting.Payout'),
),
migrations.AddField(
model_name='commission',
name='staff',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='staff_commissions', to='customers.Staff'),
),
migrations.AddField(
model_name='bill',
name='payouts',
field=models.ManyToManyField(related_name='billed_Payouts', to='accounting.Payout'),
),
migrations.AddField(
model_name='bill',
name='vendor',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='vendor', to='customers.Vendor'),
),
]
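A side note on the generated validators: every `amount` and `paid` field pairs `MinValueValidator(10)` with the message "Amount should be greater than 10", although Django's `MinValueValidator` accepts the bound itself and rejects only values strictly below it. A plain-Python sketch of the effective range check these validators enforce:

```python
AMOUNT_MIN, AMOUNT_MAX = 10, 10_000_000

def validate_amount(value):
    """Mirror the migration's amount validators: accept 10..10000000 inclusive."""
    if value < AMOUNT_MIN:
        raise ValueError('Amount should be greater than %d' % AMOUNT_MIN)
    if value > AMOUNT_MAX:
        raise ValueError('Amount should be less than %d' % AMOUNT_MAX)
    return value
```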
# -*- coding: utf-8 -*-
"""Cisco Identity Services Engine Certificates API wrapper.
Copyright (c) 2021 Cisco and/or its affiliates.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
from __future__ import (
absolute_import,
division,
print_function,
unicode_literals,
)
from builtins import *
from past.builtins import basestring
from ...restsession import RestSession
from ...utils import (
check_type,
dict_from_items_with_values,
apply_path_params,
dict_of_str,
get_next_page,
)
class Certificates(object):
"""Identity Services Engine Certificates API (version: 3.0.0).
Wraps the Identity Services Engine Certificates
API and exposes the API as native Python
methods that return native Python objects.
"""
def __init__(self, session, object_factory, request_validator):
"""Initialize a new Certificates
object with the provided RestSession.
Args:
session(RestSession): The RESTful session object to be used for
API calls to the Identity Services Engine service.
Raises:
TypeError: If the parameter types are incorrect.
"""
check_type(session, RestSession)
super(Certificates, self).__init__()
self._session = session
self._object_factory = object_factory
self._request_validator = request_validator
def get_csrs(self,
filter=None,
filter_type=None,
page=None,
size=None,
sort=None,
sort_by=None,
headers=None,
**query_parameters):
""" This API supports Filtering, Sorting and Pagination.
Filtering and Sorting supported on below mentioned
attributes: friendlyName subject timeStamp
Supported Date Format: yyyy-MM-dd HH:mm:ss.SSS
Supported Operators: EQ, NEQ, GT and LT .
Args:
page(int): page query parameter. Page number.
size(int): size query parameter. Number of objects
returned per page.
sort(basestring): sort query parameter. sort type asc or
desc.
sort_by(basestring): sortBy query parameter. sort column
by which objects need to be sorted.
filter(basestring, list, set, tuple): filter query
parameter. Simple filtering
should be available through the filter
query string parameter. The structure of
a filter is a triplet of field operator
and value separated with dots. More than
one filter can be sent. The logical
operator common to ALL filter criteria
will be by default AND, and can be
changed by using the "filterType=or"
query string parameter. Each resource
Data model description should specify if
an attribute is a filtered field.
OPERATOR DESCRIPTION EQ
Equals NEQ Not Equals GT
Greater Than LT Less Than
STARTSW Starts With NSTARTSW
Not Starts With ENDSW Ends With
NENDSW Not Ends With CONTAINS
Contains NCONTAINS Not Contains
.
filter_type(basestring): filterType query parameter. The
logical operator common to ALL filter
criteria will be by default AND, and can
be changed by using the parameter.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(page, (int, basestring, list))
check_type(size, (int, basestring, list))
check_type(sort, basestring)
check_type(sort_by, basestring)
check_type(filter, (basestring, list, set, tuple))
check_type(filter_type, basestring)
_params = {
'page':
page,
'size':
size,
'sort':
sort,
'sortBy':
sort_by,
'filter':
filter,
'filterType':
filter_type,
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
e_url = ('/api/v1/certs/certificate-signing-request')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.get(endpoint_full_url, params=_params)
return self._object_factory('bpm_eeef18d70b159f788b717e301dd3643_v3_0_0', _api_response)
def get_csrs_generator(self,
filter=None,
filter_type=None,
page=None,
size=None,
sort=None,
sort_by=None,
headers=None,
**query_parameters):
""" This API supports Filtering, Sorting and Pagination.
Filtering and Sorting supported on below mentioned
attributes: friendlyName subject timeStamp
Supported Date Format: yyyy-MM-dd HH:mm:ss.SSS
Supported Operators: EQ, NEQ, GT and LT .
Args:
page(int): page query parameter. Page number.
size(int): size query parameter. Number of objects
returned per page.
sort(basestring): sort query parameter. sort type asc or
desc.
sort_by(basestring): sortBy query parameter. sort column
by which objects need to be sorted.
filter(basestring, list, set, tuple): filter query
parameter. Simple filtering
should be available through the filter
query string parameter. The structure of
a filter is a triplet of field operator
and value separated with dots. More than
one filter can be sent. The logical
operator common to ALL filter criteria
will be by default AND, and can be
changed by using the "filterType=or"
query string parameter. Each resource
Data model description should specify if
an attribute is a filtered field.
OPERATOR DESCRIPTION EQ
Equals NEQ Not Equals GT
Greater Than LT Less Than
STARTSW Starts With NSTARTSW
Not Starts With ENDSW Ends With
NENDSW Not Ends With CONTAINS
Contains NCONTAINS Not Contains
.
filter_type(basestring): filterType query parameter. The
logical operator common to ALL filter
criteria will be by default AND, and can
be changed by using the parameter.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
Generator: A generator object containing the following object.
+ RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
yield from get_next_page(
self.get_csrs, dict(
filter=filter,
filter_type=filter_type,
page=page,
size=size,
sort=sort,
sort_by=sort_by,
headers=headers,
**query_parameters
),
access_next_list=["nextPage", "href"],
access_resource_list=["response"])
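    # The filter format described in the docstrings above -- "a triplet of
    # field operator and value separated with dots", with filterType='or'
    # switching the default AND -- can be illustrated with a small helper.
    # build_filter_params is a hypothetical name for this sketch, not part
    # of the SDK; get_csrs already accepts the resulting values directly.

A sketch of that helper as standalone code:

```python
def build_filter_params(triplets, filter_type=None):
    """Build query-parameter values from (field, operator, value) triplets.

    Operators per the docstring: EQ, NEQ, GT, LT, STARTSW, NSTARTSW,
    ENDSW, NENDSW, CONTAINS, NCONTAINS.
    """
    params = {'filter': ['{0}.{1}.{2}'.format(f, op, v)
                         for f, op, v in triplets]}
    if filter_type:
        params['filterType'] = filter_type  # 'or' replaces the default AND
    return params
```

The resulting `params['filter']` list is what one would pass as the `filter` argument of `get_csrs`, e.g. `get_csrs(filter=['friendlyName.CONTAINS.node'], filter_type='or')`.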
def generate_csr(self,
allow_wild_card_cert=None,
certificate_policies=None,
digest_type=None,
hostnames=None,
key_length=None,
key_type=None,
portal_group_tag=None,
san_dir=None,
san_dns=None,
san_ip=None,
san_uri=None,
subject_city=None,
subject_common_name=None,
subject_country=None,
subject_org=None,
subject_org_unit=None,
subject_state=None,
used_for=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
""" Generate a certificate signing request for Multi-Use, Admin,
EAP Authentication, RADIUS DTLS, PxGrid, SAML, Portal
and IMS Services. Following Parameters are present in
POST request body PARAMETER DESCRIPTION
EXAMPLE hostnames List of ise node hostnames
for which CSRs should be generated [ise-host1, ise-
host2] allowWildCardCert Allow use of WildCards
in certificates false keyLength Length of the
Key used for CSR generation (required) 512
keyType Type of key used for CSR generation either RSA
or ECDSA(required) RSA digestType Hash
algorithm used for signing CSR(required) SHA-256
usedFor Certificate Usage(required) MULTI-USE
subjectCommonName Certificate common
name(CN)(required) $FQDN$ subjectOrgUnit
Certificate organizational unit(OU) Engineering
subjectOrg Certificate organization (O) Cisco
subjectCity Certificate city or locality (L) San
Jose subjectState Certificate state (ST)
California subjectCountry Certificate country ( C)
US sanDNS Array of SAN(Subject Alternative Name)
DNS entries(optional) [ise.example.com] sanIP
Array of SAN IP entries(optional) [1.1.1.1] sanURI
Array of SAN URI entries(optional) [https://1.1.1.1]
sanDir Array of SAN DIR entries(optional)
[CN=AAA,DC=COM,C=IL] portalGroupTag Portal Group
Tag when using certificate for PORTAL service Default
Portal Certificate Group NOTE: For
allowWildCardCert to be false, the below mentioned
parameter is mandatory: hostnames When Certificate
is selected to be used for Portal Service, the below
mentioned parameter is mandatory: portalGroupTag .
Args:
allow_wild_card_cert(boolean): allowWildCardCert,
property of the request body.
certificate_policies(string): certificatePolicies,
property of the request body.
digest_type(string): digestType, property of the request
body. Available values are 'SHA-256',
'SHA-384' and 'SHA-512'.
hostnames(list): hostnames, property of the request body
(list of strings).
key_length(string): keyLength, property of the request
body. Available values are '512',
'1024', '2048' and '4096'.
key_type(string): keyType, property of the request body.
Available values are 'RSA' and 'ECDSA'.
portal_group_tag(string): portalGroupTag, property of
the request body.
san_dns(list): sanDNS, property of the request body
(list of strings).
san_dir(list): sanDir, property of the request body
(list of strings).
san_ip(list): sanIP, property of the request body (list
of strings).
san_uri(list): sanURI, property of the request body
(list of strings).
subject_city(string): subjectCity, property of the
request body.
subject_common_name(string): subjectCommonName, property
of the request body.
subject_country(string): subjectCountry, property of the
request body.
subject_org(string): subjectOrg, property of the request
body.
subject_org_unit(string): subjectOrgUnit, property of
the request body.
subject_state(string): subjectState, property of the
request body.
used_for(string): usedFor, property of the request body.
Available values are 'MULTI-USE',
'ADMIN', 'EAP-AUTH', 'DTLS-AUTH',
'PORTAL', 'PXGRID', 'SAML' and 'IMS'.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'allowWildCardCert':
allow_wild_card_cert,
'certificatePolicies':
certificate_policies,
'digestType':
digest_type,
'hostnames':
hostnames,
'keyLength':
key_length,
'keyType':
key_type,
'portalGroupTag':
portal_group_tag,
'sanDNS':
san_dns,
'sanDir':
san_dir,
'sanIP':
san_ip,
'sanURI':
san_uri,
'subjectCity':
subject_city,
'subjectCommonName':
subject_common_name,
'subjectCountry':
subject_country,
'subjectOrg':
subject_org,
'subjectOrgUnit':
subject_org_unit,
'subjectState':
subject_state,
'usedFor':
used_for,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_e39868ea7aec5efcaaf55009699eda5d_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/certificate-signing-request')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.post(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_e39868ea7aec5efcaaf55009699eda5d_v3_0_0', _api_response)
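    # The docstring's parameter table maps directly onto the request body
    # generate_csr assembles. A sketch of such a body, with illustrative
    # values only (hostnames, subject fields and SANs are placeholders):

The same body as standalone code:

```python
# Request body generate_csr would assemble from its keyword arguments,
# following the docstring's parameter table (all values illustrative).
csr_body = {
    'hostnames': ['ise-host1', 'ise-host2'],
    'allowWildCardCert': False,     # hostnames are mandatory when False
    'keyLength': '2048',            # '512', '1024', '2048' or '4096'
    'keyType': 'RSA',               # 'RSA' or 'ECDSA'
    'digestType': 'SHA-256',        # 'SHA-256', 'SHA-384' or 'SHA-512'
    'usedFor': 'MULTI-USE',
    'subjectCommonName': '$FQDN$',
    'subjectOrgUnit': 'Engineering',
    'subjectOrg': 'Cisco',
    'subjectCity': 'San Jose',
    'subjectState': 'California',
    'subjectCountry': 'US',
    'sanDNS': ['ise.example.com'],
}
```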
def export_csr(self,
hostname,
id,
dirpath=None,
save_file=None,
headers=None,
**query_parameters):
"""The response of this API carries a CSR corresponding to the
requested ID.
Args:
hostname(basestring): hostname path parameter. The
hostname to which the CSR belongs.
id(basestring): id path parameter. The ID of the CSR to
be exported.
dirpath(basestring): Directory absolute path. Defaults to
os.getcwd().
save_file(bool): Enable or disable automatic file creation of
raw response.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
urllib3.response.HTTPResponse: HTTP Response container. For more
information check the `urllib3 documentation <https://urllib3.readthedocs.io/en/latest/reference/urllib3.response.html>`_
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
DownloadFailure: If was not able to download the raw
response to a file.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(hostname, basestring,
may_be_none=False)
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'hostname': hostname,
'id': id,
}
e_url = ('/api/v1/certs/certificate-signing-'
+ 'request/export/{hostname}/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers,
stream=True, dirpath=dirpath, save_file=save_file)
else:
_api_response = self._session.get(endpoint_full_url, params=_params,
stream=True, dirpath=dirpath, save_file=save_file)
return self._object_factory('bpm_ec26ec11d92356a594a6efa55ccb9be7_v3_0_0', _api_response)
def generate_intermediate_ca_csr(self,
headers=None,
**query_parameters):
"""CSR Generation for Intermediate Certificates.
Args:
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
e_url = ('/api/v1/certs/certificate-signing-request/intermediate-'
+ 'ca')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.post(endpoint_full_url, params=_params)
return self._object_factory('bpm_bf95f099207a5b6599e04c47c22789c0_v3_0_0', _api_response)
def get_csr_by_id(self,
host_name,
id,
headers=None,
**query_parameters):
"""This API displays details of a Certificate Signing Request of a
particular node based on a given HostName and ID.
Args:
host_name(basestring): hostName path parameter. Name of
the host whose CSRs should be
returned.
id(basestring): id path parameter. The ID of the
Certificate Signing Request returned.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(host_name, basestring,
may_be_none=False)
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'hostName': host_name,
'id': id,
}
e_url = ('/api/v1/certs/certificate-signing-'
+ 'request/{hostName}/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.get(endpoint_full_url, params=_params)
return self._object_factory('bpm_b8104a50fc565ae9a756d6d0152e0e5b_v3_0_0', _api_response)
def delete_csr_by_id(self,
host_name,
id,
headers=None,
**query_parameters):
"""This API deletes a Certificate Signing Request of a particular
node based on a given HostName and ID.
Args:
host_name(basestring): hostName path parameter. Name of
the host whose CSRs should be
deleted.
id(basestring): id path parameter. The ID of the
Certificate Signing Request to be
deleted.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(host_name, basestring,
may_be_none=False)
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'hostName': host_name,
'id': id,
}
e_url = ('/api/v1/certs/certificate-signing-'
+ 'request/{hostName}/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.delete(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.delete(endpoint_full_url, params=_params)
return self._object_factory('bpm_bf792ec664fa5202beb776556908b0c1_v3_0_0', _api_response)
def regenerate_ise_root_ca(self,
remove_existing_ise_intermediate_csr=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
"""This API will initiate regeneration of ISE root CA certificate
chain. Response contains id which can be used to track
the status. Setting
"removeExistingISEIntermediateCSR" to true will remove
existing ISE Intermediate CSR.
Args:
remove_existing_ise_intermediate_csr(boolean): Setting
this attribute to true will remove
existing ISE Intermediate CSR, property
of the request body.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'removeExistingISEIntermediateCSR':
remove_existing_ise_intermediate_csr,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_e6d1b224e058288a8c4d70be72c9a6_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/ise-root-ca/regenerate')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.post(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_e6d1b224e058288a8c4d70be72c9a6_v3_0_0', _api_response)
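The payload assembly above renames snake_case arguments to the camelCase keys the API expects, then drops unset (None) entries via dict_from_items_with_values before validation. A minimal standalone sketch of that pruning step (drop_none is a hypothetical stand-in, not part of the SDK):

```python
def drop_none(mapping):
    # Keep only entries whose value is not None, mirroring how the
    # SDK's dict_from_items_with_values prunes unset optional
    # arguments before the payload is validated and sent.
    return {k: v for k, v in mapping.items() if v is not None}


# Snake_case arguments are mapped to the camelCase keys the API
# expects; unset ones disappear from the body entirely.
payload = drop_none({
    'removeExistingISEIntermediateCSR': True,
})
print(payload)  # {'removeExistingISEIntermediateCSR': True}
```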
def renew_certificates(self,
cert_type=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
"""This API will initiate regeneration of certificates. Response
contains id which can be used to track the status.
Args:
cert_type(string): certType, property of the request
body. Available values are 'OCSP' and
'IMS'.
headers(dict): Dictionary of HTTP Headers to send with the
Request.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'certType':
cert_type,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_c288192f954309b4b35aa612ff226_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/renew-certificate')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.post(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_c288192f954309b4b35aa612ff226_v3_0_0', _api_response)
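The methods above choose between sending the body as `data` (raw XML) or `json` (a serializable dict) based on the session's Content-Type header. A simplified, self-contained sketch of that dispatch (build_request_params is a hypothetical helper, not an SDK function):

```python
def build_request_params(headers, payload):
    # Mirror the dispatch used above: when the session Content-Type
    # is XML the body is passed through `data`, otherwise the dict
    # is passed through `json` and serialized by the HTTP client.
    is_xml_payload = 'application/xml' in headers.get('Content-Type', '')
    return {'data': payload} if is_xml_payload else {'json': payload}


print(build_request_params({'Content-Type': 'application/json'},
                           {'certType': 'OCSP'}))
# {'json': {'certType': 'OCSP'}}
```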
def bind_csr(self,
admin=None,
allow_extended_validity=None,
allow_out_of_date_cert=None,
allow_replacement_of_certificates=None,
allow_replacement_of_portal_group_tag=None,
data=None,
eap=None,
host_name=None,
id=None,
ims=None,
name=None,
portal=None,
portal_group_tag=None,
pxgrid=None,
radius=None,
saml=None,
validate_certificate_extensions=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
""" Bind CA Signed Certificate. NOTE: This API requires an
existing Certificate Signing Request, and the root
certificate must already be trusted. NOTE: The
certificate may have a validity period longer than 398
days. It may be untrusted by many browsers. NOTE:
Request Parameters accepting True and False as input can
be replaced by 1 and 0 respectively. Following
Parameters are used in POST body PARAMETER
DESCRIPTION EXAMPLE name Friendly name of
the certificate. Signed Certificate data
Plain-text contents of the certificate file (required)
Signed Certificate in escaped format
allowExtendedValidity Allow the certificates greater
than validity of 398 days (required) false
allowOutOfDateCert Allow out of date certificates
(required) false allowReplacementOfCertificates
Allow Replacement of certificates (required) false
allowReplacementOfPortalGroupTag Allow Replacement of
Portal Group Tag (required) false admin Use
certificate to authenticate the ISE Admin Portal false
eap Use certificate for EAP protocols that use SSL/TLS
tunneling false radius Use certificate for
RADSec server false pxgrid Use certificate for
the pxGrid Controller false ims Use
certificate for the ISE Messaging Service false
saml Use certificate for SAML Signing false
portal Use certificate for portal false
portalGroupTag Portal Group Tag for using certificate
with portal role Default Portal Certificate Group
validateCertificateExtensions Validate Certificate
Extensions false Following Roles can be used
in any combinations ROLE DEFAULT WARNING
Admin False Enabling Admin role for this certificate
will cause an application server restart on the selected
node. Note: Make sure required Certificate Chain is
imported under Trusted Certificates EAP
Authentication False Only one system certificate can
be used for EAP. Assigning EAP to this certificate will
remove the assignment from another certificate. Note:
Make sure required Certificate Chain is imported under
Trusted Certificates RADIUS DTLS False Only
one system certificate can be used for DTLS. Assigning
DTLS to this certificate will remove the assignment from
another certificate. Note: Make sure required
Certificate Chain is imported under Trusted Certificates
SAML False SAML cannot be used with other Usage.
Enabling SAML will uncheck all other Usage. Note: Make
sure required Certificate Chain is imported under
Trusted Certificates .
Args:
admin(boolean): Use certificate to authenticate the ISE
Admin Portal, property of the request
body.
allow_extended_validity(boolean): Allow import of
certificates with validity greater than
398 days, property of the request body.
allow_out_of_date_cert(boolean): Allow out of date
certificates (required), property of the
request body.
allow_replacement_of_certificates(boolean): Allow
Replacement of certificates (required),
property of the request body.
allow_replacement_of_portal_group_tag(boolean): Allow
Replacement of Portal Group Tag
(required), property of the request
body.
data(string): Signed Certificate in escaped format,
property of the request body.
eap(boolean): Use certificate for EAP protocols that use
SSL/TLS tunneling, property of the
request body.
host_name(string): Name of Host whose CSR ID has been
provided, property of the request body.
id(string): ID of the generated CSR, property of the
request body.
ims(boolean): Use certificate for the ISE Messaging
Service, property of the request body.
name(string): Friendly Name of the certificate, property
of the request body.
portal(boolean): Use for portal, property of the request
body.
portal_group_tag(string): Set Group tag, property of the
request body.
pxgrid(boolean): Use certificate for the pxGrid
Controller, property of the request
body.
radius(boolean): Use certificate for the RADSec server,
property of the request body.
saml(boolean): Use certificate for SAML Signing,
property of the request body.
validate_certificate_extensions(boolean): Validate
Certificate Extensions, property of the
request body.
headers(dict): Dictionary of HTTP Headers to send with the
Request.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'admin':
admin,
'allowExtendedValidity':
allow_extended_validity,
'allowOutOfDateCert':
allow_out_of_date_cert,
'allowReplacementOfCertificates':
allow_replacement_of_certificates,
'allowReplacementOfPortalGroupTag':
allow_replacement_of_portal_group_tag,
'data':
data,
'eap':
eap,
'hostName':
host_name,
'id':
id,
'ims':
ims,
'name':
name,
'portal':
portal,
'portalGroupTag':
portal_group_tag,
'pxgrid':
pxgrid,
'radius':
radius,
'saml':
saml,
'validateCertificateExtensions':
validate_certificate_extensions,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_b94d7d3f0ed5d0b938151ae2cae9fa4_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/signed-certificate/bind')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.post(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_b94d7d3f0ed5d0b938151ae2cae9fa4_v3_0_0', _api_response)
def export_system_certificate(self,
export=None,
id=None,
password=None,
dirpath=None,
save_file=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
""" Export System Certificate. Following Parameters are used
in POST body PARAMETER DESCRIPTION EXAMPLE
id ID of a System Certificate(required) <
SYSTEM_CERT_ID > export One of the below option
is required "CERTIFICATE" : Export only Certificate
without Private Key "CERTIFICATE_WITH_PRIVATE_KEY" :
Export both Certificate and Private Key(
"certificatePassword" is required)
CERTIFICATE_WITH_PRIVATE_KEY password
Certificate Password (required if "export" :
CERTIFICATE_WITH_PRIVATE_KEY ) Password Constraints:
Alphanumeric Minimum of 8 Characters Maximum of 100
Characters Passw*** NOTE: The response
of this API carries a ZIP file containing the
certificate and private key if "export" :
"CERTIFICATE_WITH_PRIVATE_KEY" in the request. If
"export" : "CERTIFICATE" in request body, the response
carries a ZIP file containing only the certificate.
WARNING: Exporting a private key is not a secure
operation. It could lead to possible exposure of the
private key. .
Args:
export(string): export, property of the request body.
Available values are 'CERTIFICATE' and
'CERTIFICATE_WITH_PRIVATE_KEY'.
id(string): id, property of the request body.
password(string): password, property of the request
body.
dirpath(basestring): Directory absolute path. Defaults to
os.getcwd().
save_file(bool): Enable or disable automatic file creation of
raw response.
headers(dict): Dictionary of HTTP Headers to send with the
Request.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
urllib3.response.HTTPResponse: HTTP Response container. For more
information check the `urllib3 documentation <https://urllib3.readthedocs.io/en/latest/reference/urllib3.response.html>`_
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
DownloadFailure: If it was not able to download the raw
response to a file.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'export':
export,
'id':
id,
'password':
password,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_dbe47028859573988880de76fec0936_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/system-certificate/export')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers,
stream=True, dirpath=dirpath, save_file=save_file,
**request_params)
else:
_api_response = self._session.post(endpoint_full_url, params=_params,
stream=True, dirpath=dirpath, save_file=save_file,
**request_params)
return self._object_factory('bpm_dbe47028859573988880de76fec0936_v3_0_0', _api_response)
def import_system_certificate(self,
admin=None,
allow_extended_validity=None,
allow_out_of_date_cert=None,
allow_replacement_of_certificates=None,
allow_replacement_of_portal_group_tag=None,
allow_sha1_certificates=None,
allow_wild_card_certificates=None,
data=None,
eap=None,
ims=None,
name=None,
password=None,
portal=None,
portal_group_tag=None,
private_key_data=None,
pxgrid=None,
radius=None,
saml=None,
validate_certificate_extensions=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
""" Import an X509 certificate as a system certificate. NOTE:
The certificate may have a validity period longer than
398 days. It may be untrusted by many browsers. NOTE:
Request Parameters accepting True and False as input can
be replaced by 1 and 0 respectively. Following
Parameters are used in POST body PARAMETER
DESCRIPTION EXAMPLE name Friendly name of
the certificate. System Certificate password
Password of the certificate to be imported (required).
Passw*** data Plain-text contents of the
certificate file (required) System Certificate in
escaped format privateKeyData Plain-text
contents of the private key file (required) System
Certificate Private Key in escaped format
allowOutOfDateCert Allow out of date certificates
(required) false allowSHA1Certificates Allow
SHA1 based certificates (required) false
allowExtendedValidity Allow the certificates greater
than validity of 398 days (required) false admin
Use certificate to authenticate the ISE Admin Portal
false eap Use certificate for EAP protocols that
use SSL/TLS tunneling false radius Use
certificate for RADSec server false pxgrid Use
certificate for the pxGrid Controller false ims
Use certificate for the ISE Messaging Service false
saml Use certificate for SAML Signing false
portal Use certificate for portal false
portalGroupTag Portal Group Tag for using certificate
with portal role Default Portal Certificate Group
allowReplacementOfPortalGroupTag Allow Replacement of
Portal Group Tag (required) false
allowWildCardCertificates Allow use of WildCards in
certificates false validateCertificateExtensions
Validate Certificate Extensions false
Following Roles can be used in any combinations
ROLE DEFAULT WARNING Admin False
Enabling Admin role for this certificate will cause an
application server restart on the selected node. Note:
Make sure required Certificate Chain is imported under
Trusted Certificates EAP Authentication False
Only one system certificate can be used for EAP.
Assigning EAP to this certificate will remove the
assignment from another certificate. Note: Make sure
required Certificate Chain is imported under Trusted
Certificates RADIUS DTLS False Only one system
certificate can be used for DTLS. Assigning DTLS to this
certificate will remove the assignment from another
certificate. Note: Make sure required Certificate Chain
is imported under Trusted Certificates SAML
False SAML cannot be used with other Usage. Enabling
SAML will uncheck all other Usage. Note: Make sure
required Certificate Chain is imported under Trusted
Certificates .
Args:
admin(boolean): Use certificate to authenticate the ISE
Admin Portal, property of the request
body.
allow_extended_validity(boolean): Allow import of
certificates with validity greater than
398 days, property of the request body.
allow_out_of_date_cert(boolean): Allow out of date
certificates (required), property of the
request body.
allow_replacement_of_certificates(boolean): Allow
Replacement of certificates (required),
property of the request body.
allow_replacement_of_portal_group_tag(boolean): Allow
Replacement of Portal Group Tag
(required), property of the request
body.
allow_sha1_certificates(boolean): Allow SHA1 based
certificates (required), property of the
request body.
allow_wild_card_certificates(boolean): Allow Wildcard
Certificates, property of the request
body.
data(string): Certificate Content (required), property
of the request body.
eap(boolean): Use certificate for EAP protocols that use
SSL/TLS tunneling, property of the
request body.
ims(boolean): Use certificate for the ISE Messaging
Service, property of the request body.
name(string): Name of the certificate, property of the
request body.
password(string): Certificate Password (required).,
property of the request body.
portal(boolean): Use for portal, property of the request
body.
portal_group_tag(string): Set Group tag, property of the
request body.
private_key_data(string): Private Key data (required),
property of the request body.
pxgrid(boolean): Use certificate for the pxGrid
Controller, property of the request
body.
radius(boolean): Use certificate for the RADSec server,
property of the request body.
saml(boolean): Use certificate for SAML Signing,
property of the request body.
validate_certificate_extensions(boolean): Validate
Certificate Extensions, property of the
request body.
headers(dict): Dictionary of HTTP Headers to send with the
Request.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'admin':
admin,
'allowExtendedValidity':
allow_extended_validity,
'allowOutOfDateCert':
allow_out_of_date_cert,
'allowReplacementOfCertificates':
allow_replacement_of_certificates,
'allowReplacementOfPortalGroupTag':
allow_replacement_of_portal_group_tag,
'allowSHA1Certificates':
allow_sha1_certificates,
'allowWildCardCertificates':
allow_wild_card_certificates,
'data':
data,
'eap':
eap,
'ims':
ims,
'name':
name,
'password':
password,
'portal':
portal,
'portalGroupTag':
portal_group_tag,
'privateKeyData':
private_key_data,
'pxgrid':
pxgrid,
'radius':
radius,
'saml':
saml,
'validateCertificateExtensions':
validate_certificate_extensions,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_e6c7251a8508597f1b7ae61cbf953_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/system-certificate/import')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.post(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_e6c7251a8508597f1b7ae61cbf953_v3_0_0', _api_response)
def get_system_certificates(self,
host_name,
filter=None,
filter_type=None,
page=None,
size=None,
sort=None,
sort_by=None,
headers=None,
**query_parameters):
""" This API supports Filtering, Sorting and Pagination.
Filtering and Sorting supported on below mentioned
attributes: friendlyName issuedTo issuedBy
validFrom Supported Date Format: yyyy-MM-dd HH:mm:ss
Supported Operators: EQ, NEQ, GT and LT
expirationDate Supported Date Format: yyyy-MM-dd
HH:mm:ss Supported Operators: EQ, NEQ, GT and LT
.
Args:
host_name(basestring): hostName path parameter. Name of
the host of which system certificates
should be returned.
page(int): page query parameter. Page number.
size(int): size query parameter. Number of objects
returned per page.
sort(basestring): sort query parameter. Sort type: asc or
desc.
sort_by(basestring): sortBy query parameter. Column by
which objects need to be sorted.
filter(basestring, list, set, tuple): filter query
parameter. Simple filtering is available
through the filter query string parameter.
The structure of a filter is a triplet of
field, operator and value separated with
dots. More than one filter can be sent. The
logical operator common to ALL filter
criteria is AND by default, and can be
changed by using the "filterType=or" query
string parameter. Each resource data model
description should specify whether an
attribute is a filterable field. Supported
operators: EQ (Equals), NEQ (Not Equals),
GT (Greater Than), LT (Less Than), STARTSW
(Starts With), NSTARTSW (Not Starts With),
ENDSW (Ends With), NENDSW (Not Ends With),
CONTAINS (Contains), NCONTAINS (Not
Contains).
filter_type(basestring): filterType query parameter. The
logical operator common to ALL filter
criteria will be by default AND, and can
be changed by using the parameter.
headers(dict): Dictionary of HTTP Headers to send with the
Request.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(page, (int, basestring, list))
check_type(size, (int, basestring, list))
check_type(sort, basestring)
check_type(sort_by, basestring)
check_type(filter, (basestring, list, set, tuple))
check_type(filter_type, basestring)
check_type(host_name, basestring,
may_be_none=False)
_params = {
'page':
page,
'size':
size,
'sort':
sort,
'sortBy':
sort_by,
'filter':
filter,
'filterType':
filter_type,
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'hostName': host_name,
}
e_url = ('/api/v1/certs/system-certificate/{hostName}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.get(endpoint_full_url, params=_params)
return self._object_factory('bpm_a56f5c5f739a83e8806da16be5_v3_0_0', _api_response)
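The filter query parameter described above takes dot-separated field.OPERATOR.value triplets. A small illustrative helper for composing them (make_filter is hypothetical, not part of the SDK):

```python
def make_filter(field, operator, value):
    # Compose the field.OPERATOR.value triplet expected by the
    # `filter` query parameter; several triplets may be combined,
    # with ANDed semantics by default (or OR via filterType=or).
    return '{0}.{1}.{2}'.format(field, operator, value)


filters = [
    make_filter('friendlyName', 'CONTAINS', 'Default'),
    make_filter('expirationDate', 'LT', '2025-01-01 00:00:00'),
]
print(filters)
```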
def get_system_certificates_generator(self,
host_name,
filter=None,
filter_type=None,
page=None,
size=None,
sort=None,
sort_by=None,
headers=None,
**query_parameters):
""" This API supports Filtering, Sorting and Pagination.
Filtering and Sorting supported on below mentioned
attributes: friendlyName issuedTo issuedBy
validFrom Supported Date Format: yyyy-MM-dd HH:mm:ss
Supported Operators: EQ, NEQ, GT and LT
expirationDate Supported Date Format: yyyy-MM-dd
HH:mm:ss Supported Operators: EQ, NEQ, GT and LT
.
Args:
host_name(basestring): hostName path parameter. Name of
the host of which system certificates
should be returned.
page(int): page query parameter. Page number.
size(int): size query parameter. Number of objects
returned per page.
sort(basestring): sort query parameter. Sort type: asc or
desc.
sort_by(basestring): sortBy query parameter. Column by
which objects need to be sorted.
filter(basestring, list, set, tuple): filter query
parameter. Simple filtering is available
through the filter query string parameter.
The structure of a filter is a triplet of
field, operator and value separated with
dots. More than one filter can be sent. The
logical operator common to ALL filter
criteria is AND by default, and can be
changed by using the "filterType=or" query
string parameter. Each resource data model
description should specify whether an
attribute is a filterable field. Supported
operators: EQ (Equals), NEQ (Not Equals),
GT (Greater Than), LT (Less Than), STARTSW
(Starts With), NSTARTSW (Not Starts With),
ENDSW (Ends With), NENDSW (Not Ends With),
CONTAINS (Contains), NCONTAINS (Not
Contains).
filter_type(basestring): filterType query parameter. The
logical operator common to ALL filter
criteria will be by default AND, and can
be changed by using the parameter.
headers(dict): Dictionary of HTTP Headers to send with the
Request.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
Generator: A generator object containing the following object.
+ RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
yield from get_next_page(
self.get_system_certificates, dict(
host_name=host_name,
filter=filter,
filter_type=filter_type,
page=page,
size=size,
sort=sort,
sort_by=sort_by,
headers=headers,
**query_parameters
),
access_next_list=["nextPage", "href"],
access_resource_list=["response"])
def get_system_certificate_by_id(self,
host_name,
id,
headers=None,
**query_parameters):
"""This API displays details of a System Certificate of a
particular node based on a given HostName and ID.
Args:
host_name(basestring): hostName path parameter. Name of
the host of which system certificates
should be returned.
id(basestring): id path parameter. The id of the system
certificate.
headers(dict): Dictionary of HTTP Headers to send with the
Request.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
pass
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(host_name, basestring,
may_be_none=False)
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'hostName': host_name,
'id': id,
}
e_url = ('/api/v1/certs/system-certificate/{hostName}/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.get(endpoint_full_url, params=_params)
return self._object_factory('bpm_f36e90115b05416a71506061fed7e5c_v3_0_0', _api_response)
def update_system_certificate(self,
host_name,
id,
admin=None,
allow_replacement_of_portal_group_tag=None,
description=None,
eap=None,
expiration_ttl_period=None,
expiration_ttl_units=None,
ims=None,
name=None,
portal=None,
portal_group_tag=None,
pxgrid=None,
radius=None,
renew_self_signed_certificate=None,
saml=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
""" Update a System Certificate. NOTE: Renewing a certificate
will cause an application server restart on the selected
node. NOTE: Request Parameters accepting True and
False as input can be replaced by 1 and 0 respectively.
Following Parameters are used in POST body
PARAMETER DESCRIPTION EXAMPLE name
Friendly name of the certificate. System Certificate
description Description of the Certificate Default
Description admin Use certificate to authenticate
the ISE Admin Portal false eap Use certificate
for EAP protocols that use SSL/TLS tunneling false
radius Use certificate for RADSec server false
pxgrid Use certificate for the pxGrid Controller
false ims Use certificate for the ISE Messaging
Service false saml Use certificate for SAML
Signing false portal Use certificate for
portal false portalGroupTag Portal Group Tag
for using certificate with portal role Default Portal
Certificate Group allowReplacementOfPortalGroupTag
Allow Replacement of Portal Group Tag (required) false
renewSelfSignedCertificate Renew Self Signed
Certificate false expirationTTLPeriod
Expiration Period 365 expirationTTLUnits
Expiration Units in one of the below formats days /
weeks / months / years days Following
Roles can be used in any combinations ROLE
DEFAULT WARNING Admin False Enabling
Admin role for this certificate will cause an
application server restart on the selected node. Note:
Make sure required Certificate Chain is imported under
Trusted Certificates EAP Authentication False
Only one system certificate can be used for EAP.
Assigning EAP to this certificate will remove the
assignment from another certificate. Note: Make sure
required Certificate Chain is imported under Trusted
Certificates RADIUS DTLS False Only one system
certificate can be used for DTLS. Assigning DTLS to this
certificate will remove the assignment from another
certificate. Note: Make sure required Certificate Chain
is imported under Trusted Certificates SAML
False SAML cannot be used with other Usage. Enabling
SAML will uncheck all other Usage. Note: Make sure
required Certificate Chain is imported under Trusted
Certificates .
Args:
admin(boolean): Use certificate to authenticate the ISE
Admin Portal, property of the request
body.
allow_replacement_of_portal_group_tag(boolean): Allow
Replacement of Portal Group Tag
(required), property of the request
body.
description(string): Description of System Certificate,
property of the request body.
eap(boolean): Use certificate for EAP protocols that use
SSL/TLS tunneling, property of the
request body.
expiration_ttl_period(integer): expirationTTLPeriod,
property of the request body.
expiration_ttl_units(string): expirationTTLUnits,
property of the request body. Available
values are 'days', 'weeks', 'months' and
'years'.
ims(boolean): Use certificate for the ISE Messaging
Service, property of the request body.
name(string): Name of the certificate, property of the
request body.
portal(boolean): Use for portal, property of the request
body.
portal_group_tag(string): Set Group tag, property of the
request body.
pxgrid(boolean): Use certificate for the pxGrid
Controller, property of the request
body.
radius(boolean): Use certificate for the RADSec server,
property of the request body.
renew_self_signed_certificate(boolean): Renew Self
Signed Certificate, property of the
request body.
saml(boolean): Use certificate for SAML Signing,
property of the request body.
id(basestring): id path parameter. The ID of the System
Certificate to be updated.
host_name(basestring): hostName path parameter. Name of
Host whose certificate needs to be
updated.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
if 'Content-Type' in headers:
check_type(headers.get('Content-Type'),
basestring, may_be_none=False)
if 'Accept' in headers:
check_type(headers.get('Accept'),
basestring, may_be_none=False)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
check_type(id, basestring,
may_be_none=False)
check_type(host_name, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'id': id,
'hostName': host_name,
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'admin':
admin,
'allowReplacementOfPortalGroupTag':
allow_replacement_of_portal_group_tag,
'description':
description,
'eap':
eap,
'expirationTTLPeriod':
expiration_ttl_period,
'expirationTTLUnits':
expiration_ttl_units,
'ims':
ims,
'name':
name,
'portal':
portal,
'portalGroupTag':
portal_group_tag,
'pxgrid':
pxgrid,
'radius':
radius,
'renewSelfSignedCertificate':
renew_self_signed_certificate,
'saml':
saml,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_fb9c22ad9a5eddb590c85abdab460b_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/system-certificate/{hostName}/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.put(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.put(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_fb9c22ad9a5eddb590c85abdab460b_v3_0_0', _api_response)
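# Usage sketch (illustrative only, not generated code): enable the portal
# role on a system certificate. The `api` client, host name, id and tag
# below are hypothetical placeholders for an authenticated SDK instance.
#
#     resp = api.certificates.update_system_certificate(
#         host_name='ise-node-1.example.com',
#         id='hypothetical-system-cert-id',
#         portal=True,
#         portal_group_tag='Default Portal Certificate Group')
#     # resp.response holds the parsed body; note that some role changes
#     # trigger an application server restart on the selected node.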
def delete_system_certificate_by_id(self,
host_name,
id,
headers=None,
**query_parameters):
"""This API deletes a System Certificate of a particular node based
on a given HostName and ID.
Args:
host_name(basestring): hostName path parameter. Name of
the host from which the System
Certificate needs to be deleted.
id(basestring): id path parameter. The ID of the System
Certificate to be deleted.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(host_name, basestring,
may_be_none=False)
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'hostName': host_name,
'id': id,
}
e_url = ('/api/v1/certs/system-certificate/{hostName}/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.delete(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.delete(endpoint_full_url, params=_params)
return self._object_factory('bpm_dc2eec65ad680a3c5de47cd87c8_v3_0_0', _api_response)
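# Usage sketch (illustrative only, not generated code): delete a system
# certificate of a particular node. The `api` client and both argument
# values are hypothetical placeholders.
#
#     resp = api.certificates.delete_system_certificate_by_id(
#         host_name='ise-node-1.example.com',
#         id='hypothetical-system-cert-id')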
def get_trusted_certificates(self,
filter=None,
filter_type=None,
page=None,
size=None,
sort=None,
sort_by=None,
headers=None,
**query_parameters):
""" This API supports Filtering, Sorting and Pagination.
Filtering and Sorting supported on below mentioned
attributes: friendlyName subject issuedTo
issuedBy validFrom Supported Date Format: yyyy-MM-
dd HH:mm:ss Supported Operators: EQ, NEQ, GT and LT
expirationDate Supported Date Format: yyyy-MM-dd
HH:mm:ss Supported Operators: EQ, NEQ, GT and LT
status Allowed values: enabled, disabled Supported
Operators: EQ, NEQ .
Args:
page(int): page query parameter. Page number.
size(int): size query parameter. Number of objects
returned per page.
sort(basestring): sort query parameter. Sort order, asc
or desc.
sort_by(basestring): sortBy query parameter. Column by
which objects need to be sorted.
filter(basestring, list, set, tuple): filter query
parameter. Simple filtering
should be available through the filter
query string parameter. The structure of
a filter is a triplet of field operator
and value separated with dots. More than
one filter can be sent. The logical
operator common to ALL filter criteria
will be by default AND, and can be
changed by using the "filterType=or"
query string parameter. Each resource
Data model description should specify if
an attribute is a filtered field.
Supported operators: EQ (Equals), NEQ (Not
Equals), GT (Greater Than), LT (Less Than),
STARTSW (Starts With), NSTARTSW (Not Starts
With), ENDSW (Ends With), NENDSW (Not Ends
With), CONTAINS (Contains), NCONTAINS (Not
Contains).
filter_type(basestring): filterType query parameter. The
logical operator common to ALL filter
criteria will be by default AND, and can
be changed by using the parameter.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(page, (int, basestring, list))
check_type(size, (int, basestring, list))
check_type(sort, basestring)
check_type(sort_by, basestring)
check_type(filter, (basestring, list, set, tuple))
check_type(filter_type, basestring)
_params = {
'page':
page,
'size':
size,
'sort':
sort,
'sortBy':
sort_by,
'filter':
filter,
'filterType':
filter_type,
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
e_url = ('/api/v1/certs/trusted-certificate')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.get(endpoint_full_url, params=_params)
return self._object_factory('bpm_c654a18faf1b5571ac5ba61145d298c4_v3_0_0', _api_response)
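# Usage sketch (illustrative only, not generated code): each filter is a
# "field.operator.value" triplet, e.g. 'friendlyName.CONTAINS.ISE';
# several triplets can be passed, joined with OR via filter_type='or'.
# The `api` client and filter values below are hypothetical.
#
#     resp = api.certificates.get_trusted_certificates(
#         filter=['status.EQ.enabled', 'friendlyName.STARTSW.Default'],
#         filter_type='or',
#         page=1, size=100,
#         sort='asc', sort_by='friendlyName')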
def get_trusted_certificates_generator(self,
filter=None,
filter_type=None,
page=None,
size=None,
sort=None,
sort_by=None,
headers=None,
**query_parameters):
""" This API supports Filtering, Sorting and Pagination.
Filtering and Sorting supported on below mentioned
attributes: friendlyName subject issuedTo
issuedBy validFrom Supported Date Format: yyyy-MM-
dd HH:mm:ss Supported Operators: EQ, NEQ, GT and LT
expirationDate Supported Date Format: yyyy-MM-dd
HH:mm:ss Supported Operators: EQ, NEQ, GT and LT
status Allowed values: enabled, disabled Supported
Operators: EQ, NEQ .
Args:
page(int): page query parameter. Page number.
size(int): size query parameter. Number of objects
returned per page.
sort(basestring): sort query parameter. Sort order, asc
or desc.
sort_by(basestring): sortBy query parameter. Column by
which objects need to be sorted.
filter(basestring, list, set, tuple): filter query
parameter. Simple filtering
should be available through the filter
query string parameter. The structure of
a filter is a triplet of field operator
and value separated with dots. More than
one filter can be sent. The logical
operator common to ALL filter criteria
will be by default AND, and can be
changed by using the "filterType=or"
query string parameter. Each resource
Data model description should specify if
an attribute is a filtered field.
Supported operators: EQ (Equals), NEQ (Not
Equals), GT (Greater Than), LT (Less Than),
STARTSW (Starts With), NSTARTSW (Not Starts
With), ENDSW (Ends With), NENDSW (Not Ends
With), CONTAINS (Contains), NCONTAINS (Not
Contains).
filter_type(basestring): filterType query parameter. The
logical operator common to ALL filter
criteria will be by default AND, and can
be changed by using the parameter.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
Generator: A generator object containing the following object.
+ RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
yield from get_next_page(
self.get_trusted_certificates, dict(
filter=filter,
filter_type=filter_type,
page=page,
size=size,
sort=sort,
sort_by=sort_by,
headers=headers,
**query_parameters
),
access_next_list=["nextPage", "href"],
access_resource_list=["response"])
def export_trusted_certificate(self,
id,
dirpath=None,
save_file=None,
headers=None,
**query_parameters):
"""The response of this API carries a trusted certificate file
mapped to the requested id.
Args:
id(basestring): id path parameter. The ID of the Trusted
Certificate to be exported.
dirpath(basestring): Directory absolute path. Defaults to
os.getcwd().
save_file(bool): Enable or disable automatic file creation of
raw response.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
urllib3.response.HTTPResponse: HTTP Response container. For more
information check the `urllib3 documentation <https://urllib3.readthedocs.io/en/latest/reference/urllib3.response.html>`_
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
DownloadFailure: If it was not able to download the raw
response to a file.
"""
check_type(headers, dict)
if headers is not None:
if 'Content-Type' in headers:
check_type(headers.get('Content-Type'),
basestring, may_be_none=False)
if 'Accept' in headers:
check_type(headers.get('Accept'),
basestring, may_be_none=False)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'id': id,
}
e_url = ('/api/v1/certs/trusted-certificate/export/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers,
stream=True, dirpath=dirpath, save_file=save_file)
else:
_api_response = self._session.get(endpoint_full_url, params=_params,
stream=True, dirpath=dirpath, save_file=save_file)
return self._object_factory('bpm_b62a711ce705542b5d1d92b7d3ca431_v3_0_0', _api_response)
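# Usage sketch (illustrative only, not generated code): export a trusted
# certificate and let the session stream the file to disk. The `api`
# client, id and directory below are hypothetical placeholders.
#
#     resp = api.certificates.export_trusted_certificate(
#         id='hypothetical-trusted-cert-id',
#         dirpath='/tmp', save_file=True)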
def import_trust_certificate(self,
allow_basic_constraint_cafalse=None,
allow_out_of_date_cert=None,
allow_sha1_certificates=None,
data=None,
description=None,
name=None,
trust_for_certificate_based_admin_auth=None,
trust_for_cisco_services_auth=None,
trust_for_client_auth=None,
trust_for_ise_auth=None,
validate_certificate_extensions=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
""" Import an X509 certificate as a trust certificate. NOTE:
Request Parameters accepting True and False as input can
be replaced by 1 and 0 respectively. Following
Parameters are used in POST body PARAMETER
DESCRIPTION EXAMPLE name Friendly name of
the certificate Trust Certificate description
Description of the certificate Passw*** data
Plain-text contents of the certificate file (required)
Trust Certificate in escaped format
allowOutOfDateCert Allow out of date certificates
(required) false allowSHA1Certificates Allow
SHA1 based certificates (required) false
trustForIseAuth Trust for authentication within ISE
false trustForClientAuth Trust for client
authentication and Syslog false
trustForCertificateBasedAdminAuth Trust for
Certificate based Admin authentication false
trustForCiscoServicesAuth Trust for authentication of
Cisco Services false
validateCertificateExtensions Validate extensions for
trust certificate false NOTE: If name is not
set, a default name of the following format will be
generated: common-name#issuer#nnnnn where
"nnnnn" is a unique number. You can always change the
friendly name later by editing the certificate.
You must choose how this certificate will be trusted in
ISE. The objective here is to distinguish between
certificates that are used for trust within an ISE
deployment and public certificates that are used to
trust Cisco services. Typically, you will not want to
use a given certificate for both purposes.
Trusted For Usage Authentication within ISE
Use "trustForIseAuth":true if the certificate is used
for trust within ISE, such as for secure communication
between ISE nodes Client authentication and Syslog
Use "trustForClientAuth":true if the certificate is to
be used for authentication of endpoints that contact ISE
over the EAP protocol. Also check this box if
certificate is used to trust a Syslog server. Make sure
to have keyCertSign bit asserted under KeyUsage
extension for this certificate. Note: "" can be set
true only if the "trustForIseAuth" has been set true.
Certificate based admin authentication Use
"trustForCertificateBasedAdminAuth":true if the
certificate is used for trust within ISE, such as for
secure communication between ISE nodes Note:
"trustForCertificateBasedAdminAuth" can be set true only
if "trustForIseAuth" and "trustForClientAuth" are true.
Authentication of Cisco Services Use
"trustForCiscoServicesAuth":true if the certificate is
to be used for trusting external Cisco services, such as
Feed Service. .
Args:
allow_basic_constraint_cafalse(boolean): Allow
Certificates with Basic Constraints CA
Field as False (required), property of
the request body.
allow_out_of_date_cert(boolean): Allow out of date
certificates (required), property of the
request body.
allow_sha1_certificates(boolean): Allow SHA1 based
certificates (required), property of the
request body.
data(string): Certificate content (required), property
of the request body.
description(string): Description of the certificate,
property of the request body.
name(string): Name of the certificate, property of the
request body.
trust_for_certificate_based_admin_auth(boolean): Trust
for Certificate based Admin
authentication, property of the request
body.
trust_for_cisco_services_auth(boolean): Trust for
authentication of Cisco Services,
property of the request body.
trust_for_client_auth(boolean): Trust for client
authentication and Syslog, property of
the request body.
trust_for_ise_auth(boolean): Trust for authentication
within ISE, property of the request
body.
validate_certificate_extensions(boolean): Validate trust
certificate extension, property of the
request body.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'allowBasicConstraintCAFalse':
allow_basic_constraint_cafalse,
'allowOutOfDateCert':
allow_out_of_date_cert,
'allowSHA1Certificates':
allow_sha1_certificates,
'data':
data,
'description':
description,
'name':
name,
'validateCertificateExtensions':
validate_certificate_extensions,
'trustForCertificateBasedAdminAuth':
trust_for_certificate_based_admin_auth,
'trustForCiscoServicesAuth':
trust_for_cisco_services_auth,
'trustForClientAuth':
trust_for_client_auth,
'trustForIseAuth':
trust_for_ise_auth,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_c8cd2f618b655d988ce626e579486596_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/trusted-certificate/import')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.post(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.post(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_c8cd2f618b655d988ce626e579486596_v3_0_0', _api_response)
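# Usage sketch (illustrative only, not generated code): import a CA
# certificate in PEM form. The `api` client and file path are
# hypothetical placeholders; `data` and the three allow_* flags are
# required by the endpoint.
#
#     with open('/tmp/ca.pem') as f:
#         pem_data = f.read()
#     resp = api.certificates.import_trust_certificate(
#         data=pem_data,
#         name='Example Root CA',
#         allow_basic_constraint_cafalse=True,
#         allow_out_of_date_cert=False,
#         allow_sha1_certificates=False,
#         trust_for_ise_auth=True)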
def get_trusted_certificate_by_id(self,
id,
headers=None,
**query_parameters):
"""This API can displays details of a Trust Certificate based on a
given ID.
Args:
id(basestring): id path parameter. The id of the trust
certificate.
headers(dict): Dictionary of HTTP Headers to send with the Request
.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
if 'Content-Type' in headers:
check_type(headers.get('Content-Type'),
basestring, may_be_none=False)
if 'Accept' in headers:
check_type(headers.get('Accept'),
basestring, may_be_none=False)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'id': id,
}
e_url = ('/api/v1/certs/trusted-certificate/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.get(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.get(endpoint_full_url, params=_params)
return self._object_factory('bpm_f8f4956d29b821fa9bbf23266_v3_0_0', _api_response)
def update_trusted_certificate(self,
id,
authenticate_before_crl_received=None,
automatic_crl_update=None,
automatic_crl_update_period=None,
automatic_crl_update_units=None,
crl_distribution_url=None,
crl_download_failure_retries=None,
crl_download_failure_retries_units=None,
description=None,
download_crl=None,
enable_ocsp_validation=None,
enable_server_identity_check=None,
ignore_crl_expiration=None,
name=None,
non_automatic_crl_update_period=None,
non_automatic_crl_update_units=None,
reject_if_no_status_from_ocs_p=None,
reject_if_unreachable_from_ocs_p=None,
selected_ocsp_service=None,
status=None,
trust_for_certificate_based_admin_auth=None,
trust_for_cisco_services_auth=None,
trust_for_client_auth=None,
trust_for_ise_auth=None,
headers=None,
payload=None,
active_validation=True,
**query_parameters):
""" Update a trusted certificate present in ISE trust store.
Following Parameters are used in PUT request body
PARAMETER DESCRIPTION EXAMPLE name
Friendly name of the certificate(required) Trust
Certificate status Status of the certificate
Enabled description Description of the
certificate Certificate for secure connection to
cisco.com trustForIseAuth Trust for
authentication within ISE false
trustForClientAuth Trust for client authentication and
Syslog false trustForCertificateBasedAdminAuth
Trust for Certificate based Admin authentication false
trustForCiscoServicesAuth Trust for authentication of
Cisco Services false enableOCSPValidation
Switch to enable/disable OCSP Validation false
selectedOCSPService Name of selected OCSP Service
INTERNAL_OCSP_SERVICE rejectIfNoStatusFromOCSP
Switch to reject certificate if there is no status from
OCSP false rejectIfUnreachableFromOCSP Switch
to reject certificate if unreachable from OCSP false
downloadCRL Switch to enable/disable download of CRL
false crlDistributionUrl Certificate Revocation
List Distribution URL automaticCRLUpdate
Switch to enable/disable automatic CRL update false
automaticCRLUpdatePeriod Automatic CRL update period
5 automaticCRLUpdateUnits Unit of time for
automatic CRL update Minutes
nonAutomaticCRLUpdatePeriod Non automatic CRL update
period 1 nonAutomaticCRLUpdateUnits Unit of
time of non automatic CRL update Hours
crlDownloadFailureRetries If CRL download fails, wait
time before retry 10
crlDownloadFailureRetriesUnits Unit of time before
retry if CRL download fails Minutes
enableServerIdentityCheck Switch to enable/disable
verification if HTTPS or LDAP server certificate name
fits the configured server URL false
authenticateBeforeCRLReceived Switch to enable/disable
CRL Verification if CRL is not Received false
ignoreCRLExpiration Switch to enable/disable ignore
CRL Expiration false Trusted For
Usage Authentication within ISE Use
"trustForIseAuth":true if the certificate is used for
trust within ISE, such as for secure communication
between ISE nodes Client authentication and Syslog
Use "trustForClientAuth":true if the certificate is to
be used for authentication of endpoints that contact ISE
over the EAP protocol. Also check this box if
certificate is used to trust a Syslog server. Make sure
to have keyCertSign bit asserted under KeyUsage
extension for this certificate. Note:
"trustForClientAuth" can be set true only if the
"trustForIseAuth" has been set true. Certificate
based admin authentication Use
"trustForCertificateBasedAdminAuth":true if the
certificate is used for trust within ISE, such as for
secure communication between ISE nodes Note:
"trustForCertificateBasedAdminAuth" can be set true only
if "trustForIseAuth" and "trustForClientAuth" are true.
Authentication of Cisco Services Use
"trustForCiscoServicesAuth":true if the certificate is
to be used for trusting external Cisco services, such as
Feed Service. OCSP Configuration Usage
Validation against OCSP service Use
"enableOCSPValidation":true to validate the certificate
against OCSP service mentioned in the field
selectedOCSPService OCSP Service name Use
"selectedOCSPService":"Name of OCSP Service" Name of
the OCSP service against which the certificate should be
validated Note: "selectedOCSPService" value will on be
used if "enableOCSPValidation" has been set true.
Reject the request if OCSP returns UNKNOWN status Use
"rejectIfNoStatusFromOCSP":true to reject the
certificate if the OCSP service returns UNKNOWN status
Note: "rejectIfNoStatusFromOCSP:true" can be used only
if "enableOCSPValidation" has been set true.
Reject the request if OCSP Responder is unreachable
Use "rejectIfUnreachableFromOCSP":true to reject the
certificate if the OCSP service is unreachable. Note:
"rejectIfUnreachableFromOCSP:true" can be used only if
"enableOCSPValidation" has been set true.
Certificate Revocation List Configuration Usage
Validation against CRL Use "downloadCRL":true to
validate the certificate against CRL downloaded from URL
mentioned in the field crlDistributionUrl CRL
distribution url Use "crlDistributionUrl" to specify
the URL from where the CRL should be downloaded Note:
"crlDistributionUrl" value will only be used if
"downloadCRL" has been set true. Retrieve CRL time
Use "automaticCRLUpdate":true and
automaticCRLUpdatePeriod, automaticCRLUpdatePeriod to
set the time before which CRL is automatically retrieved
prior to expiration Use "nonAutomaticCRLUpdatePeriod,
nonAutomaticCRLUpdateUnits to set the time period for
CRL retrieval in loop. Note: All the above fields can
be used only if "downloadCRL" has been set true.
If download fails Use "crlDownloadFailureRetries" and
"crlDownloadFailureRetriesUnits" to set retry time
period if CRL download fails Note:
"crlDownloadFailureRetries" and
"crlDownloadFailureRetriesUnits" can be used only if
"downloadCRL" has been set true. Enable Server
Identity Check Use "enableServerIdentityCheck":true
to verify that HTTPS or LDAPS server certificate name
fits the configured server URL Note:
"enableServerIdentityCheck:true" can be used only if
"downloadCRL" has been set true. Bypass CRL
Verification if CRL is not Received Use
"authenticateBeforeCRLReceived":true to bypass CRL
Verification if CRL is not Received Note:
"authenticateBeforeCRLReceived:true" can be used only if
"downloadCRL" has been set true. Ignore that CRL
is not yet valid or has expired Use
"ignoreCRLExpiration":true to ignore if CRL is not yet
valid or expired Note: "ignoreCRLExpiration:true" can
be used only if "downloadCRL" has been set true.
Note: boolean properties accept integers values as
well, with 0 considered as false and other values being
considered as true .
Args:
authenticate_before_crl_received(boolean): Switch to
enable/disable CRL Verification if CRL
is not Received, property of the request
body.
automatic_crl_update(boolean): Switch to enable/disable
automatic CRL update, property of the
request body.
automatic_crl_update_period(integer): Automatic CRL
update period, property of the request
body.
automatic_crl_update_units(string): Unit of time for
automatic CRL update, property of the
request body. Available values are
'Minutes', 'Hours', 'Days' and 'Weeks'.
crl_distribution_url(string): CRL Distribution URL,
property of the request body.
crl_download_failure_retries(integer): If CRL download
fails, wait time before retry, property
of the request body.
crl_download_failure_retries_units(string): Unit of time
before retry if CRL download fails,
property of the request body. Available
values are 'Minutes', 'Hours', 'Days'
and 'Weeks'.
description(string): Description for trust certificate,
property of the request body.
download_crl(boolean): Switch to enable/disable download
of CRL, property of the request body.
enable_ocsp_validation(boolean): Switch to
enable/disable OCSP Validation, property
of the request body.
enable_server_identity_check(boolean): Switch to
enable/disable verification that the HTTPS or
LDAPS server certificate name matches the
configured server URL, property of the
request body.
ignore_crl_expiration(boolean): Switch to enable/disable
ignore CRL Expiration, property of the
request body.
name(string): Friendly name of the certificate, property
of the request body.
non_automatic_crl_update_period(integer): Non automatic
CRL update period, property of the
request body.
non_automatic_crl_update_units(string): Unit of time of
non automatic CRL update, property of
the request body. Available values are
'Minutes', 'Hours', 'Days' and 'Weeks'.
reject_if_no_status_from_ocs_p(boolean): Switch to
reject certificate if there is no status
from OCSP, property of the request body.
reject_if_unreachable_from_ocs_p(boolean): Switch to
reject certificate if unreachable from
OCSP, property of the request body.
selected_ocsp_service(string): Name of selected OCSP
Service, property of the request body.
status(string): status, property of the request body.
Available values are 'Enabled' and
'Disabled'.
trust_for_certificate_based_admin_auth(boolean): Trust
for Certificate based Admin
authentication, property of the request
body.
trust_for_cisco_services_auth(boolean): Trust for
authentication of Cisco Services,
property of the request body.
trust_for_client_auth(boolean): Trust for client
authentication and Syslog, property of
the request body.
trust_for_ise_auth(boolean): Trust for authentication
within ISE, property of the request
body.
id(basestring): id path parameter. The id of the trust
certificate.
headers(dict): Dictionary of HTTP Headers to send with the Request.
payload(dict): A JSON serializable Python object to send in the
body of the Request.
active_validation(bool): Enable/Disable payload validation.
Defaults to True.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
if 'Content-Type' in headers:
check_type(headers.get('Content-Type'),
basestring, may_be_none=False)
if 'Accept' in headers:
check_type(headers.get('Accept'),
basestring, may_be_none=False)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
is_xml_payload = 'application/xml' in _headers.get('Content-Type', [])
if active_validation and is_xml_payload:
check_type(payload, basestring)
if active_validation and not is_xml_payload:
check_type(payload, dict)
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'id': id,
}
if is_xml_payload:
_payload = payload
else:
_payload = {
'authenticateBeforeCRLReceived':
authenticate_before_crl_received,
'automaticCRLUpdate':
automatic_crl_update,
'automaticCRLUpdatePeriod':
automatic_crl_update_period,
'automaticCRLUpdateUnits':
automatic_crl_update_units,
'crlDistributionUrl':
crl_distribution_url,
'crlDownloadFailureRetries':
crl_download_failure_retries,
'crlDownloadFailureRetriesUnits':
crl_download_failure_retries_units,
'description':
description,
'downloadCRL':
download_crl,
'enableOCSPValidation':
enable_ocsp_validation,
'enableServerIdentityCheck':
enable_server_identity_check,
'ignoreCRLExpiration':
ignore_crl_expiration,
'name':
name,
'nonAutomaticCRLUpdatePeriod':
non_automatic_crl_update_period,
'nonAutomaticCRLUpdateUnits':
non_automatic_crl_update_units,
'rejectIfNoStatusFromOCSP':
reject_if_no_status_from_ocs_p,
'rejectIfUnreachableFromOCSP':
reject_if_unreachable_from_ocs_p,
'selectedOCSPService':
selected_ocsp_service,
'status':
status,
'trustForCertificateBasedAdminAuth':
trust_for_certificate_based_admin_auth,
'trustForCiscoServicesAuth':
trust_for_cisco_services_auth,
'trustForClientAuth':
trust_for_client_auth,
'trustForIseAuth':
trust_for_ise_auth,
}
_payload.update(payload or {})
_payload = dict_from_items_with_values(_payload)
if active_validation and not is_xml_payload:
self._request_validator('jsd_cb625d5ad0ad76b93282f5818a_v3_0_0')\
.validate(_payload)
e_url = ('/api/v1/certs/trusted-certificate/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
request_params = {'data': _payload} if is_xml_payload else {'json': _payload}
if with_custom_headers:
_api_response = self._session.put(endpoint_full_url, params=_params,
headers=_headers,
**request_params)
else:
_api_response = self._session.put(endpoint_full_url, params=_params,
**request_params)
return self._object_factory('bpm_cb625d5ad0ad76b93282f5818a_v3_0_0', _api_response)
def delete_trusted_certificate_by_id(self,
id,
headers=None,
**query_parameters):
"""This API deletes a Trust Certificate from Trusted Certificate
Store based on a given ID.
Args:
id(basestring): id path parameter. The ID of the Trusted
Certificate to be deleted.
headers(dict): Dictionary of HTTP Headers to send with the Request.
**query_parameters: Additional query parameters (provides
support for parameters that may be added in the future).
Returns:
RestResponse: REST response with following properties:
- headers(MyDict): response headers.
- response(MyDict): response body as a MyDict object. Access the object's properties by using the dot notation
or the bracket notation.
- content(bytes): representation of the request's response
- text(str): representation of the request's response
Raises:
TypeError: If the parameter types are incorrect.
MalformedRequest: If the request body created is invalid.
ApiError: If the Identity Services Engine cloud returns an error.
"""
check_type(headers, dict)
if headers is not None:
if 'Content-Type' in headers:
check_type(headers.get('Content-Type'),
basestring, may_be_none=False)
if 'Accept' in headers:
check_type(headers.get('Accept'),
basestring, may_be_none=False)
with_custom_headers = False
_headers = self._session.headers or {}
if headers:
_headers.update(dict_of_str(headers))
with_custom_headers = True
check_type(id, basestring,
may_be_none=False)
_params = {
}
_params.update(query_parameters)
_params = dict_from_items_with_values(_params)
path_params = {
'id': id,
}
e_url = ('/api/v1/certs/trusted-certificate/{id}')
endpoint_full_url = apply_path_params(e_url, path_params)
if with_custom_headers:
_api_response = self._session.delete(endpoint_full_url, params=_params,
headers=_headers)
else:
_api_response = self._session.delete(endpoint_full_url, params=_params)
return self._object_factory('bpm_c578ef80918b5d038024d126cd6e3b8d_v3_0_0', _api_response)
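Both endpoints substitute the ``id`` path parameter into the URL template before issuing the request. A sketch of what an ``apply_path_params``-style substitution presumably does (assumed behavior, not the SDK's exact implementation; the UUID is a made-up example):

```python
def apply_path_params(url_template, path_params):
    """Fill the {placeholder} fields of a URL template from a dict of path parameters."""
    return url_template.format(**path_params)

url = apply_path_params('/api/v1/certs/trusted-certificate/{id}',
                        {'id': 'deadbeef-0000-4000-8000-000000000000'})
# url == '/api/v1/certs/trusted-certificate/deadbeef-0000-4000-8000-000000000000'
```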
# _dev/journal/_journal_adjustment.py (nicolossus/pylfi, MIT license)
class JournalBase:
    """Journal."""
    pass


class JournalAdjustment(JournalBase):
    pass
'''
def __init__(self):
self.accepted_parameters = {}
self.parameter_names = []
self.parameter_names_tex = []
self.labels = []
self.distances = []
self.sumstats = []
self._n_parameters = 0
self.configuration = {}
self.sampler_summary = {}
self._journal_started = False
def _start_journal(self):
self._journal_started = True
def _add_config(self, simulator, inference_scheme, distance, n_simulations, epsilon):
self.configuration["Simulator model"] = simulator.__name__
self.configuration["Inference scheme"] = inference_scheme
self.configuration["Distance metric"] = distance.__name__
self.configuration["Number of simulations"] = n_simulations
self.configuration["Epsilon"] = epsilon
def _add_parameter_names(self, priors):
for parameter in priors:
name = parameter.name
tex = parameter.tex
self.parameter_names.append(name)
self.accepted_parameters[name] = []
self.parameter_names_tex.append(tex)
self._n_parameters += 1
if tex is None:
self.labels.append(name)
else:
self.labels.append(tex)
def _add_accepted_parameters(self, thetas):
for parameter_name, theta in zip(self.parameter_names, thetas):
self.accepted_parameters[parameter_name].append(theta)
def _add_distance(self, distance):
self.distances.append(distance)
def _add_sumstat(self, sumstat):
self.sumstats.append(sumstat)
def _add_sampler_summary(self, number_of_simulations, accepted_count):
accept_ratio = accepted_count / number_of_simulations
# number of parameters estimated
self.sampler_summary["Number of simulations"] = number_of_simulations
self.sampler_summary["Number of accepted simulations"] = accepted_count
self.sampler_summary["Acceptance ratio"] = accept_ratio
# posterior means
# uncertainty
@property
def get_accepted_parameters(self):
return self.accepted_parameters
def _get_params_as_arrays(self):
"""
Transform data of accepted parameters to 1D arrays
"""
samples = self.get_accepted_parameters
if len(self.parameter_names) > 1:
params = (np.asarray(samples[name], float).squeeze() if np.asarray(
samples[name], float).ndim > 1 else np.asarray(samples[name], float) for name in self.parameter_names)
else:
samples = np.asarray(samples[self.parameter_names[0]], float)
params = samples.squeeze() if samples.ndim > 1 else samples
return params
def params_as_arrays(self):
*data, = self._get_params_as_arrays()
return data
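`_get_params_as_arrays` turns the per-parameter lists of accepted samples into flat 1D NumPy arrays, squeezing away a singleton trailing dimension when one is present. The same transformation in isolation (a sketch with made-up sample data):

```python
import numpy as np

accepted = {'mu': [[1.0], [2.0], [3.0]],  # samples stored with an extra dimension
            'sigma': [0.5, 0.7]}          # samples stored flat

def params_as_arrays(samples, names):
    """Return one 1D float array per parameter name."""
    arrays = []
    for name in names:
        arr = np.asarray(samples[name], float)
        # squeeze away a singleton trailing dimension, as the journal does
        arrays.append(arr.squeeze() if arr.ndim > 1 else arr)
    return arrays

mu, sigma = params_as_arrays(accepted, ['mu', 'sigma'])
```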
def _set_point_estimate_statistic(self, statistic):
if statistic == 'mean':
pass
def _sample_point_estimates(self, statistic):
"""
Calculate point estimate of inferred parameters.
In statistics, point estimation involves the use of sample data to
calculate a single value (known as a point estimate since it identifies
a point in some parameter space) which is to serve as a "best guess" or
"best estimate" of an unknown population parameter (for example,
the population mean).
https://en.wikipedia.org/wiki/Point_estimation
"""
*samples, = self._get_params_as_arrays()
if self._n_parameters == 1:
point_estimates = [np.mean(samples)]
else:
point_estimates = [np.mean(sample) for sample in samples]
return point_estimates
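The point estimate computed here is just the sample mean of each parameter's accepted values. In isolation, with illustrative numbers:

```python
import numpy as np

accepted_samples = {'mu': [1.0, 2.0, 3.0], 'sigma': [0.4, 0.6]}

# one posterior-mean point estimate per parameter
point_estimates = [np.mean(np.asarray(accepted_samples[name], float))
                   for name in ('mu', 'sigma')]
# point_estimates -> [2.0, 0.5]
```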
@property
def get_distances(self):
check_journal_status(self._journal_started)
return self.distances
@property
def get_number_of_simulations(self):
check_journal_status(self._journal_started)
return self.sampler_summary["Number of simulations"]
@property
def get_number_of_accepted_simulations(self):
check_journal_status(self._journal_started)
return self.sampler_summary["Number of accepted simulations"]
@property
def get_acceptance_ratio(self):
check_journal_status(self._journal_started)
return self.sampler_summary["Acceptance ratio"]
def _samples(self, name):
pass
def _kde(self):
pass
def _add_histplot(self, data, ax, index, density, point_estimates, true_vals_bool, true_parameter_values):
n_bins = self._freedman_diaconis_rule(data)
ax.hist(data, density=density, histtype='bar', edgecolor=None,
color='steelblue', alpha=0.5, bins=n_bins, label="Accepted samples")
ax.axvline(
point_estimates[index], color='b', label="Point estimate")
if true_vals_bool:
ax.axvline(
true_parameter_values[index], color='r', linestyle='--', label="Groundtruth")
ax.set_xlabel(self.labels[index])
ax.set_title("Histogram of accepted " + self.labels[index])
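`_add_histplot` delegates the bin count to a `_freedman_diaconis_rule` helper whose body is not shown here. The standard Freedman-Diaconis rule sets the bin width to 2*IQR/n^(1/3); a sketch of such a helper (an assumption about what the missing method computes):

```python
import numpy as np

def freedman_diaconis_rule(data):
    """Number of histogram bins via the Freedman-Diaconis rule."""
    data = np.asarray(data, float)
    iqr = np.subtract(*np.percentile(data, [75, 25]))  # interquartile range
    if iqr == 0:
        return 1
    bin_width = 2.0 * iqr / len(data) ** (1.0 / 3.0)
    return int(np.ceil((data.max() - data.min()) / bin_width))

n_bins = freedman_diaconis_rule(np.linspace(0.0, 1.0, 1000))
```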
def histplot(self, bins=10, rug=False, point_estimate='mean', show=True, dpi=120, path_to_save=None, true_parameter_values=None, **kwargs):
"""
histogram(s) of sampled parameter(s)
point estimate : mean, median, mode, None
The Mode value is the value that appears the most number of times
The median value is the value in the middle, after you have sorted all the values
The mean value is the average value
"""
N = self._n_parameters
# run checks
check_journal_status(self._journal_started)
if point_estimate is not None:
check_point_estimate_input(point_estimate)
point_estimates = self._sample_point_estimates(point_estimate)
true_vals_bool = False
if true_parameter_values is not None:
check_true_parameter_values(N, true_parameter_values)
true_vals_bool = True
# get sampled parameters
*data, = self._get_params_as_arrays()
fig = plt.figure(figsize=(8, 6), tight_layout=True, dpi=dpi)
self._set_plot_style()
if N == 1:
ax = plt.subplot(111)
legend_position = 0
index = 0
self._add_histplot(
data, ax, index, True, point_estimates, true_vals_bool, true_parameter_values)
else:
if N == 2 or N == 4:
cols = 2
legend_position = 1
else:
cols = 3
legend_position = 2
rows = int(np.ceil(N / cols))
gs = gridspec.GridSpec(ncols=cols, nrows=rows, figure=fig)
for index, data in enumerate(data):
ax = fig.add_subplot(gs[index])
self._add_histplot(
data, ax, index, density, point_estimates, true_vals_bool, true_parameter_values)
handles, labels = plt.gca().get_legend_handles_labels()
if true_vals_bool:
order = [2, 0, 1]
else:
order = [1, 0]
plt.legend([handles[idx] for idx in order],
[labels[idx] for idx in order],
loc='center left',
bbox_to_anchor=(1.04, 0.5),
fancybox=True,
borderaxespad=0.1,
ncol=1
)
if path_to_save is not None:
fig.savefig(path_to_save, dpi=dpi)
if show:
plt.show()
def adjusted_histplot():
# regression adjusted
pass
def kdeplot(self, kernel="gaussian"):
ax[1].plot(x, kernel.evaluate(x), label="approximate posterior")
pass
def distplot(self, kde=True, kde_kwds=None, ax=None):
"""
"""
if ax is None:
ax = plt.gca()
pass
def posterior_kde(self, kernel="gaussian"):
pass
@property
def summary(self):
pass
@property
def print_summary(self):
pass
def save(self, filename):
"""
Stores the journal to disk.
Parameters
----------
filename: string
the location of the file to store the current object to.
"""
with open(filename, 'wb') as output:
pickle.dump(self, output, -1)
def load(self, filename):
with open(filename, 'rb') as input:
journal = pickle.load(input)
return journal
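`save` and `load` persist the journal with pickle using exactly this dump/load pattern. A self-contained round trip showing the same idea (with a plain dict standing in for the journal and a temporary file instead of a caller-supplied path):

```python
import os
import pickle
import tempfile

record = {'Number of simulations': 1000, 'Acceptance ratio': 0.05}

fd, filename = tempfile.mkstemp()
os.close(fd)
with open(filename, 'wb') as output:
    pickle.dump(record, output, -1)   # -1 selects the highest pickle protocol
with open(filename, 'rb') as handle:
    restored = pickle.load(handle)
os.remove(filename)
```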
def kde_fit(self):
# move to _adjustment
# wrap kde class instance
# to be called from posterior_plot
pass
def run_lra(self):
# move to _adjustment
# is_performed = True # when already run, do not run again
pass
'''
'''
@staticmethod
def run_lra(
theta: torch.Tensor,
x: torch.Tensor,
observation: torch.Tensor,
sample_weight=None,
) -> torch.Tensor:
"""Return parameters adjusted with linear regression adjustment.
Implementation as in Beaumont et al. 2002: https://arxiv.org/abs/1707.01254
"""
theta_adjusted = theta
for parameter_idx in range(theta.shape[1]):
regression_model = LinearRegression(fit_intercept=True)
regression_model.fit(
X=x,
y=theta[:, parameter_idx],
sample_weight=sample_weight,
)
theta_adjusted[:, parameter_idx] += regression_model.predict(
observation.reshape(1, -1)
)
theta_adjusted[:, parameter_idx] -= regression_model.predict(x)
return theta_adjusted
'''
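The quoted `run_lra` above shifts each accepted parameter by the difference between a fitted linear regression's prediction at the observed summary statistics and its predictions at the simulated ones. The same adjustment with NumPy only (a sketch: least squares with an explicit intercept column stands in for sklearn's `LinearRegression`, and the data are made up):

```python
import numpy as np

def run_lra(theta, x, observation):
    """Linear regression adjustment of accepted parameters (Beaumont-style)."""
    n = x.shape[0]
    design = np.hstack([np.ones((n, 1)), x])     # intercept column + summary stats
    obs_row = np.hstack([[1.0], observation])
    adjusted = theta.astype(float).copy()
    for j in range(theta.shape[1]):
        coef, *_ = np.linalg.lstsq(design, theta[:, j], rcond=None)
        # shift each sample towards the prediction at the observation
        adjusted[:, j] += obs_row @ coef - design @ coef
    return adjusted

theta = np.array([[1.0], [2.0], [3.0]])
x = np.array([[0.0], [1.0], [2.0]])              # perfectly linear: theta = 1 + x
adjusted = run_lra(theta, x, np.array([1.0]))
# every adjusted sample collapses to the prediction at the observation, 2.0
```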
# tests/ocd_frontend/__init__.py (ngi-nix/poliscoops, CC-BY-4.0 license)
import json
import mock
import random
from unittest import TestCase as UnittestTestCase
from pprint import pprint
import requests
from flask import url_for, current_app
from flask_testing import TestCase
from ocd_frontend.rest import tasks
from .mixins import OcdRestTestCaseMixin
class RestApiSearchTestCase(OcdRestTestCaseMixin, TestCase):
endpoint_url = 'api.search'
endpoint_url_args = {}
required_indexes = [
'ori_test_combined_index'
]
def test_valid_search(self):
"""Tests if a valid search request responds with a JSON and
status 200 OK."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de'}))
self.assert_ok_json(response)
def test_missing_query(self):
"""Tests if a 200 response is returned when the required
``query`` attribute is missing."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'not-a-query': 'de'}))
self.assert_ok(response)
def test_sort_option_is_accepted(self):
"""Tests if valid use of the ``sort`` option results in a
JSON response with a 200 OK."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
sort_field = random.choice(current_app.config['SORTABLE_FIELDS']['items'])
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'sort': sort_field}))
self.assert_ok_json(response)
def test_sort_order_option_is_accepted(self):
"""Test if valid use of the ``sort`` and ``order`` options
result in a JSON response with a 200 OK."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
available_sort_fields = current_app.config['SORTABLE_FIELDS']['items']
try:
available_sort_fields.remove('start_date')
available_sort_fields.remove('end_date')
except ValueError:
pass
sort_field = random.choice(available_sort_fields)
sort_order = random.choice(['asc', 'desc'])
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'order': sort_order,
'sort': sort_field}))
self.assert_ok_json(response)
def test_sort_option_with_invalid_field(self):
"""Tests if sorting on an invalid field results in a response
with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'sort': 'not-a-sort-field'}))
self.assert_bad_request_json(response)
def test_sort_option_with_invalid_order(self):
"""Test if supplying an invalid order option results in a
response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
sort_field = random.choice(current_app.config['SORTABLE_FIELDS']['items'])
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'order': 'upsidedown',
'sort': sort_field}))
self.assert_bad_request_json(response)
def test_facets(self):
"""Test if requesting facets results in a 200 OK, and if the
facets are actually present in the response."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
available_facets = current_app.config['AVAILABLE_FACETS']
facet_keys = random.sample(available_facets['items'].keys(), 1)
facets = {fk: available_facets['items'][fk] for fk in facet_keys}
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_ok_json(response)
self.assertIn(u'facets', response.json)
for fk in facet_keys:
self.assertIn(fk, response.json.get(u'facets', {}))
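`test_facets` builds its request body by sampling facet names from the app's `AVAILABLE_FACETS` configuration and sending their definitions back as the `facets` attribute. The same construction in isolation (a made-up facet config stands in for the real one):

```python
import json
import random

# hypothetical stand-in for current_app.config['AVAILABLE_FACETS']
available_facets = {'items': {'classification': {}, 'source': {}, 'rights': {}}}

facet_keys = random.sample(sorted(available_facets['items']), 1)
facets = {fk: available_facets['items'][fk] for fk in facet_keys}
body = json.dumps({'query': 'de', 'facets': facets})
```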
def test_invalid_facet_option_value(self):
"""Tests if requesting a facet with invalid value (not dict)
results in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
facets = {
'rights': []
}
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_bad_request_json(response)
def test_not_available_facet(self):
"""Tests if requesting a facet that is not available results
in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
facets = {
'rights-that-are-not-a-facet': {}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_bad_request_json(response)
def test_facet_size(self):
"""Tests if valid use of the facet ``size`` attribute results in
a 200 OK JSON response."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
facets = {
'classification': {
'size': 10
}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_ok_json(response)
def test_invalid_facet_size(self):
"""Tests if supplying an invalid facet ``size`` value results in
a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
facets = {
'source': {
'size': 'abc'
}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_bad_request_json(response)
# def test_datetime_facet(self):
# """Tests if valid use of the ``date`` facet results in a 200 OK
# JSON response."""
# url = url_for(self.endpoint_url, **self.endpoint_url_args)
#
# facets = {
# 'date': {
# 'interval': 'month'
# }
# }
#
# response = self.post(url, content_type='application/json',
# data=json.dumps({'query': 'de',
# 'facets': facets}))
# self.assert_ok_json(response)
# self.assertEqual(response.json['facets']['date']['_type'],
# 'date_histogram')
def test_datetime_facet_interval_not_string(self):
"""Test if supplying an invalid interval type (i.e. integer)
results in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
facets = {
'date': {
'date_histogram': {
'interval': 123
}
}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_bad_request_json(response)
def test_datetime_facet_interval_not_allowed(self):
"""Tests if supplying an invalid interval size results in
a response with a status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
facets = {
'date': {
'date_histogram': {
'interval': 'millennium'
}
}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_bad_request_json(response)
def test_facet_should_be_dict(self):
"""Tests if supplying a list as facet request description
results in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
facets = ['some facet']
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de',
'facets': facets}))
self.assert_bad_request_json(response)
def test_from(self):
"""Test if setting the ``from`` attribute responds with JSON
and status 200 OK."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de', 'from': 10}))
self.assert_ok_json(response)
def test_invalid_value_from(self):
"""Tests if supplying an invalid data type for the ``from``
attribute results in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de', 'from': 'abc'}))
self.assert_bad_request_json(response)
def test_negative_value_from(self):
"""Test if supplying a negative value for the ``from`` attribute
results in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de', 'from': -1}))
self.assert_bad_request_json(response)
def test_size(self):
"""Test if supplying a valid value for the ``size`` attribute
results in a 200 OK JSON response."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de', 'size': 10}))
self.assert_ok_json(response)
def test_invalid_value_size(self):
"""Test if supplying an invalid type for the ``size`` attribute
results in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de', 'size': 'abc'}))
self.assert_bad_request_json(response)
def test_negative_value_size(self):
"""Test if supplying a negative value for the ``size`` attribute
results in a response with status code 400."""
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de', 'size': -1}))
self.assert_bad_request_json(response)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_search_logging_called_if_enabled(self, mocked_log_task):
"""Test if the event log storage function is called when usage
logging is enabled."""
# Enable usage logging for this test
self.app.config['USAGE_LOGGING_ENABLED'] = True
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de'}))
self.assertTrue(mocked_log_task.called)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_search_logging_not_called_if_disabled(self, mocked_log_task):
"""Test if the event log storage function is not called when
usage logging is disabled."""
# Make sure usage logging is disabled
self.app.config['USAGE_LOGGING_ENABLED'] = False
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
url = url_for(self.endpoint_url, **self.endpoint_url_args)
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de'}))
self.assertFalse(mocked_log_task.called)
class RestApiSearchSourceTestCase(RestApiSearchTestCase):
endpoint_url = 'api.search_source'
endpoint_url_args = {'source_id': 'test_collection_index'}
required_indexes = [
'ori_test_collection_index'
]
def test_nonexistent_source_id(self):
"""Test if supplying a nonexistent ``source_id`` returns a 404
JSON response."""
url = url_for(self.endpoint_url, source_id='i-do-not-exist')
response = self.post(url, content_type='application/json',
data=json.dumps({'query': 'de'}))
self.assert_not_found_request_json(response)
class RestApiSearchSimilarTestCase(OcdRestTestCaseMixin, TestCase):
required_indexes = [
'ori_test_combined_index',
'ori_test_collection_index'
]
def test_valid_search(self):
"""Tests if a valid search request responds with a JSON and
status 200 OK."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({}))
self.assert_ok_json(response)
def test_valid_search_source(self):
"""Tests if a valid similar-search request within a specific
source index responds with JSON and status 200 OK."""
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.similar', source_id='test_collection_index',
object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({}))
self.assert_ok_json(response)
def test_search_nonexistent_source(self):
"""Test if finding similar objects within a source index that
doesn't exist returns a 404 JSON response (with the appropriate
error message)."""
source_id = 'i-do-not-exist'
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.similar', source_id=source_id, object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({}))
self.assert_not_found_request_json(response)
self.assertEqual(response.json['error'],
'Source \'%s\' does not exist' % source_id)
def test_sort_option_is_accepted(self):
"""Tests if valid use of the ``sort`` option results in a
JSON response with a 200 OK."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
# Copy the list so the shared config value is not mutated between tests
available_sort_fields = [f for f in current_app.config['SORTABLE_FIELDS']['items']
if f not in ('start_date', 'end_date')]
sort_field = random.choice(available_sort_fields)
response = self.post(url, content_type='application/json',
data=json.dumps({'sort': sort_field}))
self.assert_ok_json(response)
def test_sort_order_option_is_accepted(self):
"""Test if valid use of the ``sort`` and ``order`` options
result in a JSON response with a 200 OK."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
# Copy the list so the shared config value is not mutated between tests
available_sort_fields = [f for f in current_app.config['SORTABLE_FIELDS']['items']
if f not in ('start_date', 'end_date')]
sort_field = random.choice(available_sort_fields)
sort_order = random.choice(['asc', 'desc'])
response = self.post(url, content_type='application/json',
data=json.dumps({'order': sort_order,
'sort': sort_field}))
self.assert_ok_json(response)
def test_sort_option_with_invalid_field(self):
"""Tests if sorting on an invalid field results in a response
with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({'sort': 'not-a-sort-field'}))
self.assert_bad_request_json(response)
def test_sort_option_with_invalid_order(self):
"""Test if supplying an invalid order option results in a
response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
# Copy the list so the shared config value is not mutated between tests
available_sort_fields = [f for f in current_app.config['SORTABLE_FIELDS']['items']
if f not in ('start_date', 'end_date')]
sort_field = random.choice(available_sort_fields)
response = self.post(url, content_type='application/json',
data=json.dumps({'order': 'upsidedown',
'sort': sort_field}))
self.assert_bad_request_json(response)
# def test_facets(self):
# """Test if requesting facets results in a 200 OK, and if the
# facets are actually present in the response."""
# doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
# url = url_for('api.similar', object_id=doc_id)
#
# available_facets = current_app.config['AVAILABLE_FACETS']
# facet_keys = random.sample(available_facets.keys(), 3)
# facets = {fk: available_facets[fk] for fk in facet_keys}
#
# response = self.post(url, content_type='application/json',
# data=json.dumps({'facets': facets}))
#
# self.assert_ok_json(response)
# self.assertIn('facets', response.json)
# for fk in facet_keys:
# self.assertIn(fk, response.json.get('facets', {}))
def test_not_available_facet(self):
"""Tests if requesting a facet that is not available results
in a response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
facets = {
'rights-that-are-not-a-facet': {
'terms': {
'field': 'meta.rights'
}
}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'facets': facets}))
self.assert_bad_request_json(response)
def test_facet_size(self):
"""Tests if valid use of the facet ``size`` attribute results in
a 200 OK JSON response."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
facets = {
'classification': {
'size': 10
}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'facets': facets}))
self.assert_ok_json(response)
def test_invalid_facet_size(self):
"""Tests if supplying an invalid facet ``size`` value results in
a response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
facets = {
'rights': {
'size': 'abc'
}
}
response = self.post(url, content_type='application/json',
data=json.dumps({'facets': facets}))
self.assert_bad_request_json(response)
# def test_datetime_facet(self):
# """Tests if valid use of the ``date`` facet results in a 200 OK
# JSON response."""
# doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
# url = url_for('api.similar', object_id=doc_id)
#
# facets = {
# 'date': {
# 'date_histogram': {
# 'field': 'date',
# 'interval': 'month'
# }
# }
# }
#
# response = self.post(url, content_type='application/json',
# data=json.dumps({'facets': facets}))
# self.assert_ok_json(response)
# self.assertEqual(response.json['facets']['date']['_type'],
# 'date_histogram')
#
# def test_datetime_facet_interval_not_string(self):
# """Test if supplying an invalid interval type (i.e. integer)
# results in a response with status code 400."""
# doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
# url = url_for('api.similar', object_id=doc_id)
#
# facets = {
# 'date': {
# 'date_histogram': {
# 'field': 'date',
# 'interval': 123
# }
# }
# }
#
# response = self.post(url, content_type='application/json',
# data=json.dumps({'facets': facets}))
# self.assert_bad_request_json(response)
#
# def test_datetime_facet_interval_not_allowed(self):
# """Tests if supplying an invalid interval size results in
# a response with a status code 400."""
# doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
# url = url_for('api.similar', object_id=doc_id)
#
# facets = {
# 'date': {
# 'date_histogram': {
# 'field': 'date',
# 'interval': 'millennium'
# }
# }
# }
#
# response = self.post(url, content_type='application/json',
# data=json.dumps({'facets': facets}))
# self.assert_bad_request_json(response)
def test_facet_should_be_dict(self):
"""Tests if supplying a list as facet request description
results in a response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
facets = ['some facet']
response = self.post(url, content_type='application/json',
data=json.dumps({'facets': facets}))
self.assert_bad_request_json(response)
def test_from(self):
"""Test if setting the ``from`` attribute responds with JSON
and status 200 OK."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({'from': 10}))
self.assert_ok_json(response)
def test_invalid_value_from(self):
"""Tests if supplying an invalid data type for the ``from``
attribute results in a response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({'from': 'abc'}))
self.assert_bad_request_json(response)
def test_negative_value_from(self):
"""Test if supplying a negative value for the ``from`` attribute
results in a response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({'from': -1}))
self.assert_bad_request_json(response)
def test_size(self):
"""Test if supplying a valid value for the ``size`` attribute
results in a 200 OK JSON response."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({'size': 10}))
self.assert_ok_json(response)
def test_invalid_value_size(self):
"""Test if supplying an invalid type for the ``size`` attribute
results in a response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({'size': 'abc'}))
self.assert_bad_request_json(response)
def test_negative_value_size(self):
"""Test if supplying a negative value for the ``size`` attribute
results in a response with status code 400."""
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({'size': -1}))
self.assert_bad_request_json(response)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_search_logging_called_if_enabled(self, mocked_log_task):
"""Test if the event log storage function is called when usage
logging is enabled."""
# Enable usage logging for this test
self.app.config['USAGE_LOGGING_ENABLED'] = True
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({}))
self.assertTrue(mocked_log_task.called)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_search_logging_not_called_if_disabled(self, mocked_log_task):
"""Test if the event log storage function is not called when
usage logging is disabled."""
# Make sure usage logging is disabled
self.app.config['USAGE_LOGGING_ENABLED'] = False
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_combined_index']['item'][0]
url = url_for('api.similar', object_id=doc_id)
response = self.post(url, content_type='application/json',
data=json.dumps({}))
self.assertFalse(mocked_log_task.called)
class RestApiSourcesTestCase(OcdRestTestCaseMixin, TestCase):
required_indexes = [
'ori_test_combined_index'
]
def test_response_format(self):
"""Tests if listing the sources returns a 200 OK JSON response in
which each source has ``id`` and ``organizations`` attributes."""
url = url_for('api.list_sources')
response = self.get(url)
self.assert_ok_json(response)
self.assertIn('sources', response.json)
source_attrs = response.json['sources'][0].keys()
self.assertIn('id', source_attrs)
self.assertIn('organizations', source_attrs)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_logging_called_if_enabled(self, mocked_log_task):
"""Test if the event log storage function is called when usage
logging is enabled."""
# Enable usage logging for this test
self.app.config['USAGE_LOGGING_ENABLED'] = True
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
url = url_for('api.list_sources')
response = self.get(url)
self.assertTrue(mocked_log_task.called)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_logging_not_called_if_disabled(self, mocked_log_task):
"""Test if the event log storage function is not called when
usage logging is disabled."""
# Make sure usage logging is disabled
self.app.config['USAGE_LOGGING_ENABLED'] = False
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
url = url_for('api.list_sources')
response = self.get(url)
self.assertFalse(mocked_log_task.called)
class RestApiGetObjectTestCase(OcdRestTestCaseMixin, TestCase):
required_indexes = [
'ori_test_collection_index'
]
def test_get_existing_object(self):
"""Test getting an index document."""
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.get_object', source_id='test_collection_index',
object_id=doc_id)
response = self.get(url)
self.assert_ok_json(response)
def test_get_nonexistent_object(self):
"""Test if getting an object that doesn't exist returns a 404
JSON response (with the appropriate error message)."""
url = url_for('api.get_object', source_id='test_collection_index',
object_id='i-do-not-exist')
response = self.get(url)
self.assert_not_found_request_json(response)
self.assertEqual(response.json['error'], 'Document not found.')
def test_get_nonexistent_source(self):
"""Test if getting an object from a source index that doesn't
exist returns a 404 JSON response (with the appropriate error
message)."""
source_id = 'i-do-not-exist'
url = url_for('api.get_object', source_id=source_id,
object_id='i-do-not-exist')
response = self.get(url)
self.assert_not_found_request_json(response)
self.assertEqual(response.json['error'],
'Source \'%s\' does not exist' % source_id)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_logging_called_if_enabled(self, mocked_log_task):
"""Test if the event log storage function is called when usage
logging is enabled."""
# Enable usage logging for this test
self.app.config['USAGE_LOGGING_ENABLED'] = True
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.get_object', source_id='test_collection_index',
object_id=doc_id)
response = self.get(url)
self.assertTrue(mocked_log_task.called)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_search_logging_not_called_if_disabled(self, mocked_log_task):
"""Test if the event log storage function is not called when
usage logging is disabled."""
# Make sure usage logging is disabled
self.app.config['USAGE_LOGGING_ENABLED'] = False
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.get_object', source_id='test_collection_index',
object_id=doc_id)
response = self.get(url)
self.assertFalse(mocked_log_task.called)
class RestApiGetObjectSourceTestCase(OcdRestTestCaseMixin, TestCase):
required_indexes = [
'ori_test_collection_index'
]
def test_get_existing_object(self):
"""Test getting an index document."""
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.get_object_source',
source_id='test_collection_index',
doc_type='items', object_id=doc_id)
response = self.get(url)
self.assert_ok_json(response)
def test_get_nonexistent_object(self):
"""Test if getting an object that doesn't exist returns a 404
JSON response (with the appropriate error message)."""
url = url_for('api.get_object_source',
source_id='test_collection_index',
object_id='i-do-not-exist')
response = self.get(url)
self.assert_not_found_request_json(response)
self.assertEqual(response.json['error'], 'Document not found.')
def test_get_nonexistent_source(self):
"""Test if getting an object from a source index that doesn't
exist returns a 404 JSON response (with the appropriate error
message)."""
source_id = 'i-do-not-exist'
url = url_for('api.get_object_source', source_id=source_id,
object_id='i-do-not-exist')
response = self.get(url)
self.assert_not_found_request_json(response)
self.assertEqual(response.json['error'],
'Source \'%s\' does not exist' % source_id)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_logging_called_if_enabled(self, mocked_log_task):
"""Test if the event log storage function is called when usage
logging is enabled."""
# Enable usage logging for this test
self.app.config['USAGE_LOGGING_ENABLED'] = True
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.get_object_source',
source_id='test_collection_index', object_id=doc_id)
response = self.get(url)
self.assertTrue(mocked_log_task.called)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_search_logging_not_called_if_disabled(self, mocked_log_task):
"""Test if the event log storage function is not called when
usage logging is disabled."""
# Make sure usage logging is disabled
self.app.config['USAGE_LOGGING_ENABLED'] = False
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.get_object_source',
source_id='test_collection_index', object_id=doc_id)
response = self.get(url)
self.assertFalse(mocked_log_task.called)
class RestApiGetObjectStatsTestCase(OcdRestTestCaseMixin, TestCase):
required_indexes = [
'ori_test_usage_logging_index',
'ori_test_collection_index'
]
def test_get_existing_object(self):
"""Test getting the stats of an indexed document."""
doc_id = self.doc_ids['ori_test_collection_index']['items'][0]
url = url_for('api.get_object_stats',
source_id='test_collection_index', object_id=doc_id)
response = self.get(url)
self.assert_ok_json(response)
def test_get_nonexistent_object(self):
"""Test if getting an object that doesn't exist returns a 404
JSON response."""
url = url_for('api.get_object_stats',
source_id='test_collection_index',
object_id='i-do-not-exist')
response = self.get(url)
self.assert_not_found_request_json(response)
def test_get_nonexistent_source(self):
"""Test if getting an object from a source index that doesn't
exist returns a 404 JSON response."""
url = url_for('api.get_object_stats', source_id='i-do-not-exist',
object_id='i-do-not-exist')
response = self.get(url)
self.assert_not_found_request_json(response)
class RestApiResolveTestCase(OcdRestTestCaseMixin, TestCase):
required_indexes = [
'ori_test_resolver_index'
]
def test_successful_resolve(self):
"""Test if a valid URL resolves and returns a redirect with the
correct status, location and content type."""
doc_id = self.doc_ids['ori_test_resolver_index']['url'][0]
url = url_for('api.resolve', url_id=doc_id)
response = self.get(url, follow_redirects=False)
self.assert_status_code(response, 302)
self.assert_content_type(response, 'text/html; charset=utf-8')
self.assertIn('location', response.headers)
self.assertTrue(response.headers['location'].startswith('http://'))
def test_resolve_not_whitelisted_content_type(self):
"""Test that a resolve document with an incorrent content_type resolves
to the original url"""
doc_id = self.doc_ids['ori_test_resolver_index']['url'][1]
url = url_for('api.resolve', url_id=doc_id)
response = self.get(url, follow_redirects=False)
self.assert_status_code(response, 302)
self.assert_content_type(response, 'text/html; charset=utf-8')
self.assertIn('location', response.headers)
self.assertTrue(response.headers['location'].startswith('http://'))
def test_successful_thumbnail_resolve(self):
"""Test if a valid URL resolves and returns a redirect to a thumbnailed
image.
"""
doc_id = self.doc_ids['ori_test_resolver_index']['url'][0]
url = url_for('api.resolve', url_id=doc_id, size='large')
response = self.get(url, follow_redirects=False)
self.assert_status_code(response, 302)
self.assert_content_type(response, 'text/html; charset=utf-8')
self.assertIn('location', response.headers)
self.assertIn('large', response.headers['location'])
self.assertIn(self.app.config.get('THUMBNAIL_URL'), response.headers['location'])
def test_invalid_thumbnail_size_json(self):
"""Test if a request with an invalid thumbnail size returns a 400 with
proper content type"""
doc_id = self.doc_ids['ori_test_resolver_index']['url'][0]
url = url_for('api.resolve', url_id=doc_id, size='humongous')
response = self.get(url, follow_redirects=False)
self.assert_bad_request(response)
self.assert_content_type(response, 'application/json')
self.assertEqual(response.json.get('status'), 'error')
self.assertIn('appropriate thumbnail size', response.json.get('error'))
def test_invalid_thumbnail_size_html(self):
"""Test if a request with an invalid thumbnail size returns a 400 with
proper content type"""
doc_id = self.doc_ids['ori_test_resolver_index']['url'][0]
url = url_for('api.resolve', url_id=doc_id, size='humongous')
response = self.get(url, follow_redirects=False,
content_type='text/html')
self.assert_bad_request(response)
self.assert_content_type(response, 'text/html; charset=utf-8')
self.assertIn('<html><body>You did not provide an appropriate '
'thumbnail size', response.data)
def test_invalid_resolve_json(self):
"""Tests if a request to resolve an invalid URL results in a
404 response with the proper content type."""
url = url_for('api.resolve', url_id='i-do-not-exist')
response = self.get(url, follow_redirects=False,
content_type='application/json')
self.assert_not_found_request_json(response)
def test_invalid_resolve_html(self):
"""Tests if a request to resolve an invalid URL results in a
404 response with the proper content type."""
url = url_for('api.resolve', url_id='i-do-not-exist')
response = self.get(url, follow_redirects=False,
content_type='text/html')
self.assert_not_found(response)
self.assert_content_type(response, 'text/html; charset=utf-8')
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_logging_called_if_enabled(self, mocked_log_task):
"""Test if the event log storage function is called when usage
logging is enabled."""
# Enable usage logging for this test
self.app.config['USAGE_LOGGING_ENABLED'] = True
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_resolver_index']['url'][0]
url = url_for('api.resolve', url_id=doc_id)
response = self.get(url, follow_redirects=False)
self.assertTrue(mocked_log_task.called)
@mock.patch('ocd_frontend.rest.tasks.log_event.delay')
def test_search_logging_not_called_if_disabled(self, mocked_log_task):
"""Test if the event log storage function is not called when
usage logging is disabled."""
# Make sure usage logging is disabled
self.app.config['USAGE_LOGGING_ENABLED'] = False
# Make sure the Celery task doesn't get executed
mocked_log_task.return_value = lambda *args, **kwargs: None
doc_id = self.doc_ids['ori_test_resolver_index']['url'][0]
url = url_for('api.resolve', url_id=doc_id)
response = self.get(url, follow_redirects=False)
self.assertFalse(mocked_log_task.called)
class LogEventTaskTestCase(UnittestTestCase):
default_args = {
'user_agent': 'abc',
'referer': 'def',
'user_ip': '127.0.0.1',
'created_at': '2015-01-01',
'event_type': 'get_object'
}
def test_unknown_event_raises_exception(self):
# Copy so the shared default_args dict is not mutated across tests
task_args = dict(self.default_args)
task_args['event_type'] = 'unknown-test-event'
self.assertRaises(ValueError, tasks.log_event, **task_args)

# petstore/test_datasets/user_data.py
user_add = [
('{"id":9910,"username":"user9910","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9910","phone":"9910","userStatus":1}', 200),
('{"id":9910,"username":"user9910","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9910","phone":"9910","userStatus":1}', 400),
('{"id":9911,"username":"user9911","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9911","phone":"9911","userStatus":2}', 200),
('{"id":9911,"username":"user9911","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9911","phone":"9911","userStatus":2}', 400),
('{"id":9912,"username":"user9912","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9912","phone":"9912","userStatus":3}', 200),
('{"id":9912,"username":"user9912","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9912","phone":"9912","userStatus":3}', 400),
]
user_add_ids = [f"Create user [Data: {item[0]}], expected code={item[1]}" for item in user_add]
user_login = [("user9910", "9910", 200), ("user9911", "9911", 200), ("user9912", "9912", 200),
("user9910", "", 400), ("user9911", "", 400), ("user9912", "", 400),
]
user_login_ids = [f"Login with user name [{item[0]}] and password [{item[1]}], expected code={item[2]}" for item in
user_login]
user_delete = [
('user9910', 200), ('user9910', 400),
('user9911', 200), ('user9911', 400),
('user9912', 200), ('user9912', 400),
]
user_delete_ids = [f"Delete user [{item[0]}], expected code={item[1]}" for item in user_delete]
user_find = [
('{"id":9910,"username":"user9910","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9910","phone":"9910","userStatus":1}', 200),
('{"id":9911,"username":"user9911","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9911","phone":"9911","userStatus":2}', 200),
('{"id":9912,"username":"user9912","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9912","phone":"9912","userStatus":3}', 200),
('{"id":10,"username":"theUser","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9910","phone":"9910","userStatus":1}', 200),
]
user_find_ids = [f"Find user [Data: {item[0]}], expected code={item[1]}" for item in user_find]
user_update = [
("user9910", '{"id":9910,"username":"user9910new","firstName":"John","lastName":"James",'
'"email":"john@email.com","password":"9910","phone":"99100099","userStatus":1}', 200),
("user9911", '{"id":9911,"username":"user9911new","firstName":"John","lastName":"James",'
'"email":"john@email.com","password":"9911","phone":"99110099","userStatus":2}', 200),
("user9912", '{"id":9912,"username":"user9912new","firstName":"John","lastName":"James",'
'"email":"john@email.com","password":"9912","phone":"99120099","userStatus":3}', 200),
]
user_update_ids = [f"Update user [{item[0]}], expected code={item[2]}" for item in user_update]
user_add_list = [
(['{"id":19910,"username":"user9910","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9910","phone":"9910","userStatus":1}',
'{"id":19911,"username":"user9911","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9911","phone":"9911","userStatus":2}',
'{"id":19912,"username":"user9912","firstName":"John","lastName":"James","email":"john@email.com",'
'"password":"9912","phone":"9912","userStatus":3}'], 200),
]
user_add_list_ids = [f"Create user list [Data: {item[0]}], expected code={item[1]}" for item in user_add_list]
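These (payload, expected-status) tuples are shaped for pytest's ``parametrize``. A minimal, self-contained sketch of how a test might consume them follows; ``create_user`` is a hypothetical stand-in for the real petstore API call, and the dataset copy below is abbreviated:

```python
import json

# Abbreviated copy of the dataset shape used above: a creation attempt
# followed by a duplicate that is expected to be rejected.
user_add = [
    ('{"id":9910,"username":"user9910","password":"9910","userStatus":1}', 200),
    ('{"id":9910,"username":"user9910","password":"9910","userStatus":1}', 400),
]

# Human-readable test ids, built the same way as user_add_ids above
user_add_ids = [f"Create user [Data: {data}], expected code={code}"
                for data, code in user_add]


def create_user(payload):
    """Hypothetical stand-in for POST /user: the first insert of an id
    succeeds; duplicates are rejected with 400."""
    body = json.loads(payload)
    if body["id"] in create_user.seen:
        return 400
    create_user.seen.add(body["id"])
    return 200


create_user.seen = set()

# With pytest this loop would be written as
# @pytest.mark.parametrize("data, code", user_add, ids=user_add_ids)
for data, code in user_add:
    assert create_user(data) == code
```

The ids list is what makes failures readable in the pytest report: each case is labelled with its payload and expected status instead of an opaque index.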
67eab1fb6e1cd3579d62b3603ad476f1f5623417 | 4,940 | py | Python | src/parameters/back_end_params.py | asf174/TopDownNvidia | 6147ba1029b4880879e33b2af381046fccbf8a45 | [
"Unlicense"
] | null | null | null | src/parameters/back_end_params.py | asf174/TopDownNvidia | 6147ba1029b4880879e33b2af381046fccbf8a45 | [
"Unlicense"
] | null | null | null | src/parameters/back_end_params.py | asf174/TopDownNvidia | 6147ba1029b4880879e33b2af381046fccbf8a45 | [
"Unlicense"
] | 1 | 2021-09-20T19:26:25.000Z | 2021-09-20T19:26:25.000Z | """
Class with all the parameters of the BackEnd class
and its subclasses.
@author: Alvaro Saiz (UC)
@date: Jul 2021
@version: 1.0
"""
class BackEndParameters:

    C_BACK_END_NAME: str = "BACK-END"
    C_BACK_END_DESCRIPTION: str = ("It analyzes the parts of the GPU architecture where the BackEnd produces a bottleneck,\n"
                                   + "which leads to IPC losses. In this part, we analyze aspects related to the 'execution' stage\n"
                                   + "of the instructions, such as limitations imposed by the functional units, memory limits, etc.\n")

    # NVPROF metrics/arguments
    C_BACK_END_NVPROF_L1_METRICS: str = ("stall_memory_dependency,stall_constant_memory_dependency,stall_pipe_busy," +
                                         "stall_memory_throttle,stall_exec_dependency")
    C_BACK_END_NVPROF_L1_EVENTS: str = ""

    C_BACK_END_NVPROF_L2_METRICS: str = ("stall_memory_dependency,stall_constant_memory_dependency,stall_pipe_busy," +
                                         "stall_memory_throttle,stall_exec_dependency")
    C_BACK_END_NVPROF_L2_EVENTS: str = ""

    C_BACK_END_NVPROF_L3_METRICS: str = ("stall_memory_dependency,stall_constant_memory_dependency,stall_pipe_busy," +
                                         "stall_memory_throttle,stall_exec_dependency")
    C_BACK_END_NVPROF_L3_EVENTS: str = ""

    # NSIGHT metrics
    C_BACK_END_NSIGHT_L1_METRICS: str = ("smsp__warp_issue_stalled_long_scoreboard_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_imc_miss_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_math_pipe_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_mio_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_drain_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_lg_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_short_scoreboard_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_wait_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_tex_throttle_per_warp_active.pct")
    C_BACK_END_NSIGHT_L2_METRICS: str = ("smsp__warp_issue_stalled_long_scoreboard_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_imc_miss_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_math_pipe_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_mio_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_drain_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_lg_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_short_scoreboard_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_wait_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_tex_throttle_per_warp_active.pct")
    C_BACK_END_NSIGHT_L3_METRICS: str = ("smsp__warp_issue_stalled_long_scoreboard_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_imc_miss_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_math_pipe_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_mio_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_drain_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_lg_throttle_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_short_scoreboard_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_wait_per_warp_active.pct," +
                                         "smsp__warp_issue_stalled_tex_throttle_per_warp_active.pct")
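The constants above keep the metric names in the comma-separated form the profilers expect on the command line. A small sketch of how such a string might be turned into an `nvprof` argument vector (`build_nvprof_command` and the `./app` target are hypothetical, not part of this file):

```python
# Same comma-separated shape as C_BACK_END_NVPROF_L1_METRICS above.
C_BACK_END_NVPROF_L1_METRICS = ("stall_memory_dependency,stall_constant_memory_dependency,"
                                "stall_pipe_busy,stall_memory_throttle,stall_exec_dependency")

def build_nvprof_command(metrics, program):
    """Assemble an nvprof argument vector from a comma-separated metric string."""
    cmd = ["nvprof"]
    if metrics:  # empty EVENTS/METRICS strings contribute no flag
        cmd += ["--metrics", metrics]
    cmd.append(program)
    return cmd

command = build_nvprof_command(C_BACK_END_NVPROF_L1_METRICS, "./app")
```

The same pattern would apply to the Nsight Compute metric strings with the corresponding CLI flag.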
# --- src/game/moves_test.py (marmelab/quixo-python, MIT) ---
import unittest
from game.moves import move_row, move_col, is_movable_tile, move_tile
class TestBoardMethods(unittest.TestCase):
    def test_move_row(self):
        init_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 1, 1],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        expected_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 1, 1, 1],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        board = move_row(init_board, 1, 0, 4, 1)
        self.assertEqual(board, expected_board)

    def test_move_row_tile_existing(self):
        init_board = [
            [0, 0, 0, 0, 0],
            [1, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        expected_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        board = move_row(init_board, 1, 0, 4, 1)
        self.assertEqual(board, expected_board)

    def test_move_row_tile_existing_end(self):
        init_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        expected_board = [
            [0, 0, 0, 0, 0],
            [1, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        board = move_row(init_board, 1, 4, 0, 1)
        self.assertEqual(board, expected_board)

    def test_move_col(self):
        init_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0],
            [0, 1, 0, 0, 0]
        ]
        expected_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0],
            [0, 1, 0, 0, 0],
            [0, 1, 0, 0, 0]
        ]
        board = move_col(init_board, 1, 0, 4, 1)
        self.assertEqual(board, expected_board)

    def test_move_col_tile_existing(self):
        init_board = [
            [0, 1, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        expected_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0]
        ]
        board = move_col(init_board, 1, 0, 4, 1)
        self.assertEqual(board, expected_board)

    def test_move_col_tile_existing_end(self):
        init_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0]
        ]
        expected_board = [
            [0, 1, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0]
        ]
        board = move_col(init_board, 1, 4, 0, 1)
        self.assertEqual(board, expected_board)

    def test_is_movable_tile(self):
        acceptable = is_movable_tile(0, 1)
        self.assertTrue(acceptable)
        unacceptable = is_movable_tile(1, 1)
        self.assertFalse(unacceptable)

    def test_move_tile(self):
        init_board = [
            [0, 1, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0],
            [0, -1, 0, 0, 0]
        ]
        expected_board = [
            [0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0],
            [0, -1, 0, 0, 0],
            [0, 1, 0, 0, 0]
        ]
        tile_start = (0, 1)
        tile_end = (4, 1)
        board = move_tile(init_board, tile_start, tile_end, 1)
        self.assertEqual(board, expected_board)
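For reference, one minimal `move_row` consistent with the row cases above: pop the tile at the start column and re-insert the player's tile at the end column. This is a sketch inferred from the tests, not necessarily how `game.moves` is actually written:

```python
def move_row(board, row, start, end, player):
    """Pop the tile at (row, start) and insert the player's tile at (row, end)."""
    new_board = [r[:] for r in board]  # copy so the caller's board is untouched
    new_board[row].pop(start)
    new_board[row].insert(end, player)
    return new_board

# Mirrors test_move_row_tile_existing: a tile at (1, 0) pushed to (1, 4).
init_board = [[0] * 5 for _ in range(5)]
init_board[1][0] = 1
moved = move_row(init_board, 1, 0, 4, 1)
```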
# --- libot.py (Livin21/LiBot, MIT) ---
import tensorflow as tf
from tensorflow.models.rnn.translate import data_utils
from tensorflow.models.rnn.translate import seq2seq_model
# --- alphabet.py (mtsarkar2000/Onuronon-Animation, MIT) ---
alphabet = {
'A': ((0,0),(0.5,1),(0.75,0.5),(0.25,0.5),(0.75,0.5),(1,0)),
'B': ((0,0),(0,1),(0.625,1),(0.75,0.875),(0.75,0.625),(0.625,0.5),(0,0.5),(0.625,0.5),(0.75,0.375),(0.75,0.125),(0.625,0),(0,0)),
'C': ((0.75,0.125),(0.625,0),(0.125,0),(0,0.125),(0,0.875),(0.125,1),(0.625,1),(0.75,0.875)),
'D': ((0,0),(0,1),(0.625,1),(0.75,0.875),(0.75,0.125),(0.625,0),(0,0)),
'E': ((0.75,0),(0,0),(0,0.5),(0.75,0.5),(0,0.5),(0,1),(0.75,1)),
'F': ((0,0),(0,0.5),(0.75,0.5),(0,0.5),(0,1),(0.75,1)),
'G': ((0.75,0.5),(0.625,0.5),(0.75,0.5),(0.75,0.125),(0.625,0),(0.125,0),(0,0.125),(0,0.875),(0.125,1),(0.625,1),(0.75,0.875)),
'H': ((0,0),(0,1),(0,0.5),(0.75,0.5),(0.75,1),(0.75,0)),
'I': ((0,0),(0.25,0),(0.125,0),(0.125,1),(0,1),(0.25,1)),
'J': ((0,0.125),(0.125,0),(0.375,0),(0.5,0.125),(0.5,1)),
'K': ((0,0),(0,1),(0,0.5),(0.75,1),(0,0.5),(0.75,0)),
'L': ((0,0),(0,1),(0,0),(0.75,0)),
'M': ((0,0),(0,1),(0.5,0),(1,1),(1,0)),
'N': ((0,0),(0,1),(0.75,0),(0.75,1)),
'O': ((0.75,0.125),(0.625,0),(0.125,0),(0,0.125),(0,0.875),(0.125,1),(0.625,1),(0.75,0.875),(0.75,0.125)),
'P': ((0,0),(0,1),(0.625,1),(0.75,0.875),(0.75,0.625),(0.625,0.5),(0,0.5)),
'Q': ((0.75,0.125),(0.625,0),(0.125,0),(0,0.125),(0,0.875),(0.125,1),(0.625,1),(0.75,0.875),(0.75,0.125),(0.875,0)),
'R': ((0,0),(0,1),(0.625,1),(0.75,0.875),(0.75,0.625),(0.625,0.5),(0,0.5),(0.625,0.5),(0.875,0)),
'S': ((0,0.125),(0.125,0),(0.625,0),(0.75,0.125),(0.75,0.375),(0.675,0.5),(0.125,0.5),(0,0.625),(0,0.875),(0.125,1),(0.625,1),(0.75,0.875)),
'T': ((0,1),(0.5,1),(0.5,0),(0.5,1),(1,1)),
'U': ((0,1),(0,0.125),(0.125,0),(0.625,0),(0.75,0.125),(0.75,1)),
'V': ((0,1),(0.375,0),(0.75,1)),
'W': ((0,1),(0.25,0),(0.5,1),(0.75,0),(1,1)),
'X': ((0,0),(0.375,0.5),(0,1),(0.375,0.5),(0.75,1),(0.375,0.5),(0.75,0)),
'Y': ((0,1),(0.375,0.5),(0.375,0),(0.375,0.5),(0.75,1)),
'Z': ((0,1),(0.75,1),(0,0),(0.75,0)),
}
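Each letter above maps to a polyline of pen positions inside a unit box. A sketch of scaling and offsetting one letter's stroke into drawing coordinates (`place_letter` is a hypothetical helper; the drawing backend itself is out of scope here):

```python
# Abbreviated copy of the table above: a letter is a polyline in a unit box.
alphabet = {'L': ((0, 0), (0, 1), (0, 0), (0.75, 0))}

def place_letter(letter, scale=100, offset=(0, 0)):
    """Scale a letter's unit-box stroke points into drawing coordinates."""
    ox, oy = offset
    return [(ox + x * scale, oy + y * scale) for x, y in alphabet[letter]]

points = place_letter('L', scale=10, offset=(5, 5))
```

A renderer would then draw line segments between consecutive points.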
# --- dlkit/abstract_osid/relationship/managers.py (UOC/dlkit, MIT) ---
"""Implementations of relationship abstract base class managers."""
# pylint: disable=invalid-name
# Method names comply with OSID specification.
# pylint: disable=no-init
# Abstract classes do not define __init__.
# pylint: disable=too-few-public-methods
# Some interfaces are specified as 'markers' and include no methods.
# pylint: disable=too-many-public-methods
# Number of methods are defined in specification
# pylint: disable=too-many-ancestors
# Inheritance defined in specification
# pylint: disable=too-many-arguments
# Argument signature defined in specification.
# pylint: disable=duplicate-code
# All apparent duplicates have been inspected. They aren't.
import abc
class RelationshipProfile:
    """The relationship profile describes the interoperability among relationship services."""

    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def supports_visible_federation(self):
        """Tests if any family federation is exposed.

        Federation is exposed when a specific family may be identified,
        selected and used to create a lookup or admin session.
        Federation is not exposed when a set of families appears as a
        single family.

        :return: ``true`` if visible federation is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_lookup(self):
        """Tests if looking up relationships is supported.

        :return: ``true`` if relationship lookup is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_query(self):
        """Tests if querying relationships is supported.

        :return: ``true`` if relationship query is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_search(self):
        """Tests if searching relationships is supported.

        :return: ``true`` if relationship search is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_admin(self):
        """Tests if a relationship administrative service is supported.

        :return: ``true`` if relationship administration is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_notification(self):
        """Tests if a relationship notification service is supported.

        :return: ``true`` if relationship notification is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_family(self):
        """Tests if a relationship family cataloging service is supported.

        :return: ``true`` if relationship families are supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_family_assignment(self):
        """Tests if a relationship cataloging service is supported.

        A relationship cataloging service maps relationships to
        families.

        :return: ``true`` if relationship families are supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_smart_family(self):
        """Tests if a relationship smart family cataloging service is supported.

        :return: ``true`` if relationship smart families are supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_family_lookup(self):
        """Tests if looking up families is supported.

        :return: ``true`` if family lookup is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_family_query(self):
        """Tests if querying families is supported.

        :return: ``true`` if family query is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_family_search(self):
        """Tests if searching families is supported.

        :return: ``true`` if family search is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_family_admin(self):
        """Tests if a family administrative service is supported.

        :return: ``true`` if family administration is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_family_notification(self):
        """Tests if a family notification service is supported.

        :return: ``true`` if family notification is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_family_hierarchy(self):
        """Tests for the availability of a family hierarchy traversal service.

        :return: ``true`` if family hierarchy traversal is available, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented in all
        providers.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_family_hierarchy_design(self):
        """Tests for the availability of a family hierarchy design service.

        :return: ``true`` if family hierarchy design is available, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_batch(self):
        """Tests for the availability of a relationship batch service.

        :return: ``true`` if a relationship batch service is available, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def supports_relationship_rules(self):
        """Tests if a relationship rules service is supported.

        :return: ``true`` if relationship rules service is supported, ``false`` otherwise
        :rtype: ``boolean``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def get_relationship_record_types(self):
        """Gets the supported ``Relationship`` record types.

        :return: a list containing the supported ``Relationship`` record types
        :rtype: ``osid.type.TypeList``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # osid.type.TypeList

    relationship_record_types = property(fget=get_relationship_record_types)

    @abc.abstractmethod
    def supports_relationship_record_type(self, relationship_record_type):
        """Tests if the given ``Relationship`` record type is supported.

        :param relationship_record_type: a ``Type`` indicating a ``Relationship`` record type
        :type relationship_record_type: ``osid.type.Type``
        :return: ``true`` if the given ``Type`` is supported, ``false`` otherwise
        :rtype: ``boolean``
        :raise: ``NullArgument`` -- ``relationship_record_type`` is ``null``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def get_relationship_search_record_types(self):
        """Gets the supported ``Relationship`` search record types.

        :return: a list containing the supported ``Relationship`` search record types
        :rtype: ``osid.type.TypeList``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # osid.type.TypeList

    relationship_search_record_types = property(fget=get_relationship_search_record_types)

    @abc.abstractmethod
    def supports_relationship_search_record_type(self, relationship_search_record_type):
        """Tests if the given ``Relationship`` search record type is supported.

        :param relationship_search_record_type: a ``Type`` indicating a ``Relationship`` search record type
        :type relationship_search_record_type: ``osid.type.Type``
        :return: ``true`` if the given search record type is supported, ``false`` otherwise
        :rtype: ``boolean``
        :raise: ``NullArgument`` -- ``relationship_search_record_type`` is ``null``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def get_family_record_types(self):
        """Gets the supported ``Family`` record types.

        :return: a list containing the supported ``Family`` types
        :rtype: ``osid.type.TypeList``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # osid.type.TypeList

    family_record_types = property(fget=get_family_record_types)

    @abc.abstractmethod
    def supports_family_record_type(self, family_record_type):
        """Tests if the given ``Family`` record type is supported.

        :param family_record_type: a ``Type`` indicating a ``Family`` record type
        :type family_record_type: ``osid.type.Type``
        :return: ``true`` if the given ``Type`` is supported, ``false`` otherwise
        :rtype: ``boolean``
        :raise: ``NullArgument`` -- ``family_record_type`` is ``null``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean

    @abc.abstractmethod
    def get_family_search_record_types(self):
        """Gets the supported ``Family`` search record types.

        :return: a list containing the supported ``Family`` search record types
        :rtype: ``osid.type.TypeList``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # osid.type.TypeList

    family_search_record_types = property(fget=get_family_search_record_types)

    @abc.abstractmethod
    def supports_family_search_record_type(self, family_search_record_type):
        """Tests if the given ``Family`` search record type is supported.

        :param family_search_record_type: a ``Type`` indicating a ``Family`` search record type
        :type family_search_record_type: ``osid.type.Type``
        :return: ``true`` if the given ``Type`` is supported, ``false`` otherwise
        :rtype: ``boolean``
        :raise: ``NullArgument`` -- ``family_search_record_type`` is ``null``

        *compliance: mandatory -- This method must be implemented.*

        """
        return # boolean
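A consumer is expected to probe these `supports_*` capability queries before requesting the corresponding sessions. A minimal sketch with a hypothetical stand-in profile (a real profile would come from a dlkit runtime, not shown here):

```python
class StubRelationshipProfile:
    """Hypothetical stand-in answering two of the capability queries above."""

    def supports_relationship_lookup(self):
        return True

    def supports_relationship_search(self):
        return False

def available_services(profile):
    """Collect the names of the services the profile claims to support."""
    checks = {
        "lookup": profile.supports_relationship_lookup,
        "search": profile.supports_relationship_search,
    }
    return [name for name, check in checks.items() if check()]

services = available_services(StubRelationshipProfile())
```

With a real profile, a caller would only invoke `get_relationship_search_session()` (below) when the corresponding check returns ``true``.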
class RelationshipManager:
    """The relationship manager provides access to relationship sessions and provides interoperability tests for various aspects of this service.

    The sessions included in this manager are:

      * ``RelationshipLookupSession:`` a session to retrieve and examine
        relationships
      * ``RelationshipQuerySession:`` a session to query relationships
      * ``RelationshipSearchSession:`` a session to search for
        relationships
      * ``RelationshipAdminSession:`` a session to manage relationships
      * ``RelationshipNotificationSession:`` a session to receive
        notifications pertaining to relationship changes
      * ``RelationshipFamilySession:`` a session to look up relationship
        to family mappings
      * ``RelationshipFamilyAssignmentSession:`` a session to manage
        relationship to family catalog mappings
      * ``RelationshipSmartFamilySession:`` a session to manage dynamic
        relationship families
      * ``FamilyLookupSession:`` a session to retrieve families
      * ``FamilyQuerySession:`` a session to query families
      * ``FamilySearchSession:`` a session to search for families
      * ``FamilyAdminSession:`` a session to create and delete families
      * ``FamilyNotificationSession:`` a session to receive
        notifications pertaining to family changes
      * ``FamilyHierarchySession:`` a session to traverse a hierarchy of
        families
      * ``FamilyHierarchyDesignSession:`` a session to manage a family
        hierarchy

    """

    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def get_relationship_lookup_session(self):
        """Gets the ``OsidSession`` associated with the relationship lookup service.

        :return: a ``RelationshipLookupSession``
        :rtype: ``osid.relationship.RelationshipLookupSession``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_lookup()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_lookup()`` is ``true``.*

        """
        return # osid.relationship.RelationshipLookupSession

    relationship_lookup_session = property(fget=get_relationship_lookup_session)

    @abc.abstractmethod
    def get_relationship_lookup_session_for_family(self, family_id):
        """Gets the ``OsidSession`` associated with the relationship lookup service for the given family.

        :param family_id: the ``Id`` of the family
        :type family_id: ``osid.id.Id``
        :return: a ``RelationshipLookupSession``
        :rtype: ``osid.relationship.RelationshipLookupSession``
        :raise: ``NotFound`` -- no ``Family`` found by the given ``Id``
        :raise: ``NullArgument`` -- ``family_id`` is ``null``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_lookup()`` or ``supports_visible_federation()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_lookup()`` and
        ``supports_visible_federation()`` are ``true``*

        """
        return # osid.relationship.RelationshipLookupSession

    @abc.abstractmethod
    def get_relationship_query_session(self):
        """Gets the ``OsidSession`` associated with the relationship query service.

        :return: a ``RelationshipQuerySession``
        :rtype: ``osid.relationship.RelationshipQuerySession``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_query()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_query()`` is ``true``.*

        """
        return # osid.relationship.RelationshipQuerySession

    relationship_query_session = property(fget=get_relationship_query_session)

    @abc.abstractmethod
    def get_relationship_query_session_for_family(self, family_id):
        """Gets the ``OsidSession`` associated with the relationship query service for the given family.

        :param family_id: the ``Id`` of the family
        :type family_id: ``osid.id.Id``
        :return: a ``RelationshipQuerySession``
        :rtype: ``osid.relationship.RelationshipQuerySession``
        :raise: ``NotFound`` -- no ``Family`` found by the given ``Id``
        :raise: ``NullArgument`` -- ``family_id`` is ``null``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_query()`` or ``supports_visible_federation()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_query()`` and
        ``supports_visible_federation()`` are ``true``*

        """
        return # osid.relationship.RelationshipQuerySession

    @abc.abstractmethod
    def get_relationship_search_session(self):
        """Gets the ``OsidSession`` associated with the relationship search service.

        :return: a ``RelationshipSearchSession``
        :rtype: ``osid.relationship.RelationshipSearchSession``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_search()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_search()`` is ``true``.*

        """
        return # osid.relationship.RelationshipSearchSession

    relationship_search_session = property(fget=get_relationship_search_session)

    @abc.abstractmethod
    def get_relationship_search_session_for_family(self, family_id):
        """Gets the ``OsidSession`` associated with the relationship search service for the given family.

        :param family_id: the ``Id`` of the ``Family``
        :type family_id: ``osid.id.Id``
        :return: a ``RelationshipSearchSession``
        :rtype: ``osid.relationship.RelationshipSearchSession``
        :raise: ``NotFound`` -- no family found by the given ``Id``
        :raise: ``NullArgument`` -- ``family_id`` is ``null``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_search()`` or ``supports_visible_federation()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_search()`` and
        ``supports_visible_federation()`` are ``true``*

        """
        return # osid.relationship.RelationshipSearchSession

    @abc.abstractmethod
    def get_relationship_admin_session(self):
        """Gets the ``OsidSession`` associated with the relationship administration service.

        :return: a ``RelationshipAdminSession``
        :rtype: ``osid.relationship.RelationshipAdminSession``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_admin()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_admin()`` is ``true``.*

        """
        return # osid.relationship.RelationshipAdminSession

    relationship_admin_session = property(fget=get_relationship_admin_session)

    @abc.abstractmethod
    def get_relationship_admin_session_for_family(self, family_id):
        """Gets the ``OsidSession`` associated with the relationship administration service for the given family.

        :param family_id: the ``Id`` of the ``Family``
        :type family_id: ``osid.id.Id``
        :return: a ``RelationshipAdminSession``
        :rtype: ``osid.relationship.RelationshipAdminSession``
        :raise: ``NotFound`` -- no family found by the given ``Id``
        :raise: ``NullArgument`` -- ``family_id`` is ``null``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_admin()`` or ``supports_visible_federation()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_admin()`` and
        ``supports_visible_federation()`` are ``true``*

        """
        return # osid.relationship.RelationshipAdminSession

    @abc.abstractmethod
    def get_relationship_notification_session(self, relationship_receiver):
        """Gets the ``OsidSession`` associated with the relationship notification service.

        :param relationship_receiver: the receiver
        :type relationship_receiver: ``osid.relationship.RelationshipReceiver``
        :return: a ``RelationshipNotificationSession``
        :rtype: ``osid.relationship.RelationshipNotificationSession``
        :raise: ``NullArgument`` -- ``relationship_receiver`` is ``null``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_notification()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_notification()`` is ``true``.*

        """
        return # osid.relationship.RelationshipNotificationSession

    @abc.abstractmethod
    def get_relationship_notification_session_for_family(self, relationship_receiver, family_id):
        """Gets the ``OsidSession`` associated with the relationship notification service for the given family.

        :param relationship_receiver: the receiver
        :type relationship_receiver: ``osid.relationship.RelationshipReceiver``
        :param family_id: the ``Id`` of the ``Family``
        :type family_id: ``osid.id.Id``
        :return: a ``RelationshipNotificationSession``
        :rtype: ``osid.relationship.RelationshipNotificationSession``
        :raise: ``NotFound`` -- no family found by the given ``Id``
        :raise: ``NullArgument`` -- ``relationship_receiver`` or ``family_id`` is ``null``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_notification()`` or ``supports_visible_federation()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_notification()`` and
        ``supports_visible_federation()`` are ``true``*

        """
        return # osid.relationship.RelationshipNotificationSession

    @abc.abstractmethod
    def get_relationship_family_session(self):
        """Gets the ``OsidSession`` to lookup relationship/family mappings.

        :return: a ``RelationshipFamilySession``
        :rtype: ``osid.relationship.RelationshipFamilySession``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_family()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_family()`` is ``true``.*

        """
        return # osid.relationship.RelationshipFamilySession

    relationship_family_session = property(fget=get_relationship_family_session)

    @abc.abstractmethod
    def get_relationship_family_assignment_session(self):
        """Gets the ``OsidSession`` associated with assigning relationships to families.

        :return: a ``RelationshipFamilyAssignmentSession``
        :rtype: ``osid.relationship.RelationshipFamilyAssignmentSession``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_family_assignment()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_family_assignment()`` is ``true``.*

        """
        return # osid.relationship.RelationshipFamilyAssignmentSession

    relationship_family_assignment_session = property(fget=get_relationship_family_assignment_session)

    @abc.abstractmethod
    def get_relationship_smart_family_session(self, family_id):
        """Gets the ``OsidSession`` to manage dynamic families of relationships.

        :param family_id: the ``Id`` of the ``Family``
        :type family_id: ``osid.id.Id``
        :return: a ``RelationshipSmartFamilySession``
        :rtype: ``osid.relationship.RelationshipSmartFamilySession``
        :raise: ``NotFound`` -- no family found by the given ``Id``
        :raise: ``NullArgument`` -- ``family_id`` is ``null``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_relationship_smart_family()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_relationship_smart_family()`` is ``true``.*

        """
        return # osid.relationship.RelationshipSmartFamilySession

    @abc.abstractmethod
    def get_family_lookup_session(self):
        """Gets the ``OsidSession`` associated with the family lookup service.

        :return: a ``FamilyLookupSession``
        :rtype: ``osid.relationship.FamilyLookupSession``
        :raise: ``OperationFailed`` -- unable to complete request
        :raise: ``Unimplemented`` -- ``supports_family_lookup()`` is ``false``

        *compliance: optional -- This method must be implemented if
        ``supports_family_lookup()`` is ``true``.*

        """
        return # osid.relationship.FamilyLookupSession

    family_lookup_session = property(fget=get_family_lookup_session)

    @abc.abstractmethod
    def get_family_query_session(self):
        """Gets the ``OsidSession`` associated with the family query service.

        :return: a ``FamilyQuerySession``
        :rtype: ``osid.relationship.FamilyQuerySession``
        :raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_query()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_query()`` is ``true``.*
"""
return # osid.relationship.FamilyQuerySession
family_query_session = property(fget=get_family_query_session)
@abc.abstractmethod
def get_family_search_session(self):
"""Gets the ``OsidSession`` associated with the family search service.
:return: a ``FamilySearchSession``
:rtype: ``osid.relationship.FamilySearchSession``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_search()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_search()`` is ``true``.*
"""
return # osid.relationship.FamilySearchSession
family_search_session = property(fget=get_family_search_session)
@abc.abstractmethod
def get_family_admin_session(self):
"""Gets the ``OsidSession`` associated with the family administrative service.
:return: a ``FamilyAdminSession``
:rtype: ``osid.relationship.FamilyAdminSession``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_admin()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_admin()`` is ``true``.*
"""
return # osid.relationship.FamilyAdminSession
family_admin_session = property(fget=get_family_admin_session)
@abc.abstractmethod
def get_family_notification_session(self, family_receiver):
"""Gets the ``OsidSession`` associated with the family notification service.
:param family_receiver: the receiver
:type family_receiver: ``osid.relationship.FamilyReceiver``
:return: a ``FamilyNotificationSession``
:rtype: ``osid.relationship.FamilyNotificationSession``
:raise: ``NullArgument`` -- ``family_receiver`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_notification()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_notification()`` is ``true``.*
"""
return # osid.relationship.FamilyNotificationSession
@abc.abstractmethod
def get_family_hierarchy_session(self):
"""Gets the ``OsidSession`` associated with the family hierarchy service.
:return: a ``FamilyHierarchySession`` for families
:rtype: ``osid.relationship.FamilyHierarchySession``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_hierarchy()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_hierarchy()`` is ``true``.*
"""
return # osid.relationship.FamilyHierarchySession
family_hierarchy_session = property(fget=get_family_hierarchy_session)
@abc.abstractmethod
def get_family_hierarchy_design_session(self):
"""Gets the ``OsidSession`` associated with the family hierarchy design service.
:return: a ``HierarchyDesignSession`` for families
:rtype: ``osid.relationship.FamilyHierarchyDesignSession``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_hierarchy_design()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_hierarchy_design()`` is ``true``.*
"""
return # osid.relationship.FamilyHierarchyDesignSession
family_hierarchy_design_session = property(fget=get_family_hierarchy_design_session)
@abc.abstractmethod
def get_relationship_batch_manager(self):
"""Gets the relationship batch manager.
:return: a ``RelationshipBatchManager``
:rtype: ``osid.relationship.batch.RelationshipBatchManager``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_batch()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_batch()`` is ``true``.*
"""
return # osid.relationship.batch.RelationshipBatchManager
relationship_batch_manager = property(fget=get_relationship_batch_manager)
@abc.abstractmethod
def get_relationship_rules_manager(self):
"""Gets the relationship rules manager.
:return: a ``RelationshipRulesManager``
:rtype: ``osid.relationship.rules.RelationshipRulesManager``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_rules()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_rules()`` is ``true``.*
"""
return # osid.relationship.rules.RelationshipRulesManager
relationship_rules_manager = property(fget=get_relationship_rules_manager)
class RelationshipProxyManager:
"""The relationship manager provides access to relationship sessions and provides interoperability tests for various aspects of this service.
Methods in this manager support the passing of a Proxy. The sessions
included in this manager are:
* ``RelationshipLookupSession:`` a session to retrieve and examine
relationships
* ``RelationshipQuerySession:`` a session to query relationships
* ``RelationshipSearchSession:`` a session to search for
relationships
* ``RelationshipAdminSession:`` a session to manage relationships
* ``RelationshipNotificationSession:`` a session to receive
notifications pertaining to relationship changes
* ``RelationshipFamilySession:`` a session to look up relationship
to family mappings
* ``RelationshipFamilyAssignmentSession:`` a session to manage
relationship to family catalog mappings
* ``RelationshipSmartFamilySession:`` a session to manage dynamic
relationship families
* ``FamilyLookupSession:`` a session to retrieve families
* ``FamilyQuerySession:`` a session to query families
* ``FamilySearchSession:`` a session to search for families
* ``FamilyAdminSession:`` a session to create and delete families
* ``FamilyNotificationSession:`` a session to receive
notifications pertaining to family changes
* ``FamilyHierarchySession:`` a session to traverse a hierarchy of
families
* ``FamilyHierarchyDesignSession:`` a session to manage a family
hierarchy
"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def get_relationship_lookup_session(self, proxy):
"""Gets the ``OsidSession`` associated with the relationship lookup service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipLookupSession``
:rtype: ``osid.relationship.RelationshipLookupSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_lookup()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_lookup()`` is ``true``.*
"""
return # osid.relationship.RelationshipLookupSession
@abc.abstractmethod
def get_relationship_lookup_session_for_family(self, family_id, proxy):
"""Gets the ``OsidSession`` associated with the relationship lookup service for the given family.
:param family_id: the ``Id`` of the family
:type family_id: ``osid.id.Id``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipLookupSession``
:rtype: ``osid.relationship.RelationshipLookupSession``
:raise: ``NotFound`` -- no ``Family`` found by the given ``Id``
:raise: ``NullArgument`` -- ``family_id`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_lookup()`` or ``supports_visible_federation()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_lookup()`` and
``supports_visible_federation()`` are ``true``*
"""
return # osid.relationship.RelationshipLookupSession
@abc.abstractmethod
def get_relationship_query_session(self, proxy):
"""Gets the ``OsidSession`` associated with the relationship query service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipQuerySession``
:rtype: ``osid.relationship.RelationshipQuerySession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_query()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_query()`` is ``true``.*
"""
return # osid.relationship.RelationshipQuerySession
@abc.abstractmethod
def get_relationship_query_session_for_family(self, family_id, proxy):
"""Gets the ``OsidSession`` associated with the relationship query service for the given family.
:param family_id: the ``Id`` of the family
:type family_id: ``osid.id.Id``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipQuerySession``
:rtype: ``osid.relationship.RelationshipQuerySession``
:raise: ``NotFound`` -- no ``Family`` found by the given ``Id``
:raise: ``NullArgument`` -- ``family_id`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_query()`` or ``supports_visible_federation()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_query()`` and
``supports_visible_federation()`` are ``true``*
"""
return # osid.relationship.RelationshipQuerySession
@abc.abstractmethod
def get_relationship_search_session(self, proxy):
"""Gets the ``OsidSession`` associated with the relationship search service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipSearchSession``
:rtype: ``osid.relationship.RelationshipSearchSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_search()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_search()`` is ``true``.*
"""
return # osid.relationship.RelationshipSearchSession
@abc.abstractmethod
def get_relationship_search_session_for_family(self, family_id, proxy):
"""Gets the ``OsidSession`` associated with the relationship search service for the given family.
:param family_id: the ``Id`` of the family
:type family_id: ``osid.id.Id``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipSearchSession``
:rtype: ``osid.relationship.RelationshipSearchSession``
:raise: ``NotFound`` -- no ``Family`` found by the given ``Id``
:raise: ``NullArgument`` -- ``family_id`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_search()`` or ``supports_visible_federation()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_search()`` and
``supports_visible_federation()`` are ``true``*
"""
return # osid.relationship.RelationshipSearchSession
@abc.abstractmethod
def get_relationship_admin_session(self, proxy):
"""Gets the ``OsidSession`` associated with the relationship administration service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipAdminSession``
:rtype: ``osid.relationship.RelationshipAdminSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_admin()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_admin()`` is ``true``.*
"""
return # osid.relationship.RelationshipAdminSession
@abc.abstractmethod
def get_relationship_admin_session_for_family(self, family_id, proxy):
"""Gets the ``OsidSession`` associated with the relationship administration service for the given family.
:param family_id: the ``Id`` of the family
:type family_id: ``osid.id.Id``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipAdminSession``
:rtype: ``osid.relationship.RelationshipAdminSession``
:raise: ``NotFound`` -- no ``Family`` found by the given ``Id``
:raise: ``NullArgument`` -- ``family_id`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_admin()`` or ``supports_visible_federation()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_admin()`` and
``supports_visible_federation()`` are ``true``*
"""
return # osid.relationship.RelationshipAdminSession
@abc.abstractmethod
def get_relationship_notification_session(self, relationship_receiver, proxy):
"""Gets the ``OsidSession`` associated with the relationship notification service.
:param relationship_receiver: the receiver
:type relationship_receiver: ``osid.relationship.RelationshipReceiver``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipNotificationSession``
:rtype: ``osid.relationship.RelationshipNotificationSession``
:raise: ``NullArgument`` -- ``relationship_receiver`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_notification()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_notification()`` is ``true``.*
"""
return # osid.relationship.RelationshipNotificationSession
@abc.abstractmethod
def get_relationship_notification_session_for_family(self, relationship_receiver, family_id, proxy):
"""Gets the ``OsidSession`` associated with the relationship notification service for the given family.
:param relationship_receiver: the receiver
:type relationship_receiver: ``osid.relationship.RelationshipReceiver``
:param family_id: the ``Id`` of the family
:type family_id: ``osid.id.Id``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipNotificationSession``
:rtype: ``osid.relationship.RelationshipNotificationSession``
:raise: ``NotFound`` -- no ``Family`` found by the given ``Id``
:raise: ``NullArgument`` -- ``relationship_receiver, family_id`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_notification()`` or ``supports_visible_federation()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_notification()`` and
``supports_visible_federation()`` are ``true``*
"""
return # osid.relationship.RelationshipNotificationSession
@abc.abstractmethod
def get_relationship_family_session(self, proxy):
"""Gets the ``OsidSession`` to lookup relationship/family mappings.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipFamilySession``
:rtype: ``osid.relationship.RelationshipFamilySession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_family()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_family()`` is ``true``.*
"""
return # osid.relationship.RelationshipFamilySession
@abc.abstractmethod
def get_relationship_family_assignment_session(self, proxy):
"""Gets the ``OsidSession`` associated with assigning relationships to families.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipFamilyAssignmentSession``
:rtype: ``osid.relationship.RelationshipFamilyAssignmentSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_family_assignment()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_family_assignment()`` is ``true``.*
"""
return # osid.relationship.RelationshipFamilyAssignmentSession
@abc.abstractmethod
def get_relationship_smart_family_session(self, family_id, proxy):
"""Gets the ``OsidSession`` to manage dynamic families of retlationships.
:param family_id: the ``Id`` of the ``Family``
:type family_id: ``osid.id.Id``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``RelationshipSmartFamilySession``
:rtype: ``osid.relationship.RelationshipSmartFamilySession``
:raise: ``NotFound`` -- no family found by the given ``Id``
:raise: ``NullArgument`` -- ``family_id`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_smart_family()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_smart_family()`` is ``true``.*
"""
return # osid.relationship.RelationshipSmartFamilySession
@abc.abstractmethod
def get_family_lookup_session(self, proxy):
"""Gets the ``OsidSession`` associated with the family lookup service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``FamilyLookupSession``
:rtype: ``osid.relationship.FamilyLookupSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_lookup()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_lookup()`` is ``true``.*
"""
return # osid.relationship.FamilyLookupSession
@abc.abstractmethod
def get_family_query_session(self, proxy):
"""Gets the ``OsidSession`` associated with the family query service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``FamilyQuerySession``
:rtype: ``osid.relationship.FamilyQuerySession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_query()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_query()`` is ``true``.*
"""
return # osid.relationship.FamilyQuerySession
@abc.abstractmethod
def get_family_search_session(self, proxy):
"""Gets the ``OsidSession`` associated with the family search service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``FamilySearchSession``
:rtype: ``osid.relationship.FamilySearchSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_search()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_search()`` is ``true``.*
"""
return # osid.relationship.FamilySearchSession
@abc.abstractmethod
def get_family_admin_session(self, proxy):
"""Gets the ``OsidSession`` associated with the family administrative service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``FamilyAdminSession``
:rtype: ``osid.relationship.FamilyAdminSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_admin()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_admin()`` is ``true``.*
"""
return # osid.relationship.FamilyAdminSession
@abc.abstractmethod
def get_family_notification_session(self, family_receiver, proxy):
"""Gets the ``OsidSession`` associated with the family notification service.
:param family_receiver: the receiver
:type family_receiver: ``osid.relationship.FamilyReceiver``
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``FamilyNotificationSession``
:rtype: ``osid.relationship.FamilyNotificationSession``
:raise: ``NullArgument`` -- ``family_receiver`` or ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_notification()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_notification()`` is ``true``.*
"""
return # osid.relationship.FamilyNotificationSession
@abc.abstractmethod
def get_family_hierarchy_session(self, proxy):
"""Gets the ``OsidSession`` associated with the family hierarchy service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``FamilyHierarchySession`` for families
:rtype: ``osid.relationship.FamilyHierarchySession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_hierarchy()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_hierarchy()`` is ``true``.*
"""
return # osid.relationship.FamilyHierarchySession
@abc.abstractmethod
def get_family_hierarchy_design_session(self, proxy):
"""Gets the ``OsidSession`` associated with the family hierarchy design service.
:param proxy: a proxy
:type proxy: ``osid.proxy.Proxy``
:return: a ``HierarchyDesignSession`` for families
:rtype: ``osid.relationship.FamilyHierarchyDesignSession``
:raise: ``NullArgument`` -- ``proxy`` is ``null``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_family_hierarchy_design()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_family_hierarchy_design()`` is ``true``.*
"""
return # osid.relationship.FamilyHierarchyDesignSession
@abc.abstractmethod
def get_relationship_batch_proxy_manager(self):
"""Gets the relationship batch proxy manager.
:return: a ``RelationshipBatchProxyManager``
:rtype: ``osid.relationship.batch.RelationshipBatchProxyManager``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_batch()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_batch()`` is ``true``.*
"""
return # osid.relationship.batch.RelationshipBatchProxyManager
relationship_batch_proxy_manager = property(fget=get_relationship_batch_proxy_manager)
@abc.abstractmethod
def get_relationship_rules_proxy_manager(self):
"""Gets the relationship rules proxy manager.
:return: a ``RelationshipRulesProxyManager``
:rtype: ``osid.relationship.rules.RelationshipRulesProxyManager``
:raise: ``OperationFailed`` -- unable to complete request
:raise: ``Unimplemented`` -- ``supports_relationship_rules()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_relationship_rules()`` is ``true``.*
"""
return # osid.relationship.rules.RelationshipRulesProxyManager
relationship_rules_proxy_manager = property(fget=get_relationship_rules_proxy_manager)
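The compliance notes above all follow one contract: each `get_*_session` / `get_*_manager` method is optional, must work when its matching `supports_*()` test returns true, and may raise `Unimplemented` when it returns false. A self-contained sketch of that pattern (the class and exception names here are stand-ins for illustration, not part of the OSID API):

```python
class Unimplemented(Exception):
    """Stand-in for the OSID ``Unimplemented`` error type (name assumed)."""


class SketchRelationshipManager:
    """Hypothetical provider illustrating the compliance contract:
    ``get_x_session()`` must be implemented when ``supports_x()`` is true,
    and may raise ``Unimplemented`` when it is false."""

    def supports_relationship_lookup(self):
        return True

    def supports_relationship_search(self):
        return False

    def get_relationship_lookup_session(self):
        # A real provider would return a RelationshipLookupSession object here.
        return "RelationshipLookupSession"

    def get_relationship_search_session(self):
        raise Unimplemented("supports_relationship_search() is false")


# Consumers probe capabilities before requesting sessions:
mgr = SketchRelationshipManager()
sessions = [mgr.get_relationship_lookup_session()] if mgr.supports_relationship_lookup() else []
```

Calling an unsupported getter without probing first is what triggers the `Unimplemented` outcome documented in each `:raise:` field above.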
# File: RecoMuon/MuonIsolationProducers/python/caloExtractorByAssociatorBlocks_cff.py
# From: ckamtsikis/cmssw (Apache-2.0)
import FWCore.ParameterSet.Config as cms
# -*-TCL-*-
from RecoMuon.MuonIsolationProducers.trackAssociatorBlocks_cff import *
MIsoCaloExtractorByAssociatorTowersBlock = cms.PSet(
MIsoTrackAssociatorTowers,
Noise_HE = cms.double(0.2),
DR_Veto_H = cms.double(0.1),
Noise_EE = cms.double(0.1),
UseRecHitsFlag = cms.bool(False),
NoiseTow_EE = cms.double(0.15),
Threshold_HO = cms.double(0.5),
Noise_EB = cms.double(0.025),
Noise_HO = cms.double(0.2),
CenterConeOnCalIntersection = cms.bool(False),
DR_Max = cms.double(0.5),
PropagatorName = cms.string('SteppingHelixPropagatorAny'),
ServiceParameters = cms.PSet(
Propagators = cms.untracked.vstring( 'SteppingHelixPropagatorAny' ),
RPCLayers = cms.bool( False ),
UseMuonNavigation = cms.untracked.bool( False )
),
Threshold_E = cms.double(0.2),
Noise_HB = cms.double(0.2),
PrintTimeReport = cms.untracked.bool(False),
NoiseTow_EB = cms.double(0.04),
Threshold_H = cms.double(0.5),
DR_Veto_E = cms.double(0.07),
DepositLabel = cms.untracked.string('Cal'),
ComponentName = cms.string('CaloExtractorByAssociator'),
DR_Veto_HO = cms.double(0.1),
DepositInstanceLabels = cms.vstring('ecal',
'hcal',
'ho')
)
MIsoCaloExtractorByAssociatorHitsBlock = cms.PSet(
MIsoTrackAssociatorHits,
Noise_HE = cms.double(0.2),
DR_Veto_H = cms.double(0.1),
Noise_EE = cms.double(0.1),
UseRecHitsFlag = cms.bool(True),
NoiseTow_EE = cms.double(0.15),
Threshold_HO = cms.double(0.1),
Noise_EB = cms.double(0.025),
Noise_HO = cms.double(0.2),
CenterConeOnCalIntersection = cms.bool(False),
DR_Max = cms.double(0.5),
PropagatorName = cms.string('SteppingHelixPropagatorAny'),
ServiceParameters = cms.PSet(
Propagators = cms.untracked.vstring( 'SteppingHelixPropagatorAny' ),
RPCLayers = cms.bool( False ),
UseMuonNavigation = cms.untracked.bool( False )
),
Threshold_E = cms.double(0.025),
Noise_HB = cms.double(0.2),
NoiseTow_EB = cms.double(0.04),
PrintTimeReport = cms.untracked.bool(False),
Threshold_H = cms.double(0.1),
DR_Veto_E = cms.double(0.07),
DepositLabel = cms.untracked.string('Cal'),
ComponentName = cms.string('CaloExtractorByAssociator'),
DR_Veto_HO = cms.double(0.1),
DepositInstanceLabels = cms.vstring('ecal',
'hcal',
'ho')
)
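The two extractor blocks above are near-duplicates: they differ only in the embedded track-associator block, `UseRecHitsFlag`, the thresholds, and the order of the time-report flag. A sketch of how the duplication could be reduced with the standard CMSSW `clone()` idiom (this is a hypothetical refactor, not how the shipped file is written; the embedded associator block would still need to be swapped separately):

```python
import FWCore.ParameterSet.Config as cms

# Hypothetical: derive the hits block from the towers block, overriding
# only the parameters that actually differ between the two PSets above.
MIsoCaloExtractorByAssociatorHitsBlock = MIsoCaloExtractorByAssociatorTowersBlock.clone(
    UseRecHitsFlag = True,
    Threshold_HO = 0.1,
    Threshold_E = 0.025,
    Threshold_H = 0.1,
)
```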
# File: tqdm_multi_thread/__init__.py
# From: boydfd/tqdm_multi_thread (MIT)
from .tqdm_multi_thread import TqdmMultiThread
from .tqdm_multi_thread_factory import TqdmMultiThreadFactory
# File: nominals/testing/test_integration/test_journal.py
# From: rossm6/accounts (MIT)
from datetime import date, datetime, timedelta
from itertools import chain
from json import loads
from accountancy.helpers import sort_multiple
from accountancy.testing.helpers import create_formset_data, create_header
from controls.models import FinancialYear, ModuleSettings, Period
from django.contrib.auth import get_user_model
from django.shortcuts import reverse
from django.test import RequestFactory, TestCase
from django.utils import timezone
from nominals.helpers import (create_nominal_journal,
create_nominal_journal_without_nom_trans,
create_vat_transactions)
from nominals.models import (Nominal, NominalHeader, NominalLine,
NominalTransaction)
from vat.models import Vat, VatTransaction
"""
These tests just check that the nominal module uses the accountancy general classes correctly.
The testing of these general classes is done in the purchase ledger.
"""
HEADER_FORM_PREFIX = "header"
LINE_FORM_PREFIX = "line"
match_form_prefix = "match"
DATE_INPUT_FORMAT = '%d-%m-%Y'
MODEL_DATE_INPUT_FORMAT = '%Y-%m-%d'
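The two format constants above separate what the UI form posts from what model-level comparisons use. A quick self-contained check of what each produces (the example date is arbitrary):

```python
from datetime import datetime

DATE_INPUT_FORMAT = '%d-%m-%Y'        # format posted by the forms under test
MODEL_DATE_INPUT_FORMAT = '%Y-%m-%d'  # format used against model/DB values

d = datetime(2020, 1, 31)
assert d.strftime(DATE_INPUT_FORMAT) == '31-01-2020'
assert d.strftime(MODEL_DATE_INPUT_FORMAT) == '2020-01-31'
```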
class CreateJournal(TestCase):
@classmethod
def setUpTestData(cls):
cls.user = get_user_model().objects.create_superuser(username="dummy", password="dummy")
cls.factory = RequestFactory()
cls.ref = "test journal"
cls.date = datetime.now().strftime(DATE_INPUT_FORMAT)
cls.due_date = (datetime.now() + timedelta(days=31)
).strftime(DATE_INPUT_FORMAT)
cls.model_date = datetime.now().strftime(MODEL_DATE_INPUT_FORMAT)
cls.model_due_date = (datetime.now() + timedelta(days=31)
).strftime(MODEL_DATE_INPUT_FORMAT)
fy = FinancialYear.objects.create(financial_year=2020)
cls.period = Period.objects.create(fy=fy, period="01", fy_and_period="202001", month_start=date(2020,1,31))
cls.description = "a line description"
# ASSETS
assets = Nominal.objects.create(name="Assets")
current_assets = Nominal.objects.create(
parent=assets, name="Current Assets")
cls.bank_nominal = Nominal.objects.create(
parent=current_assets, name="Bank Account")
cls.debtors_nominal = Nominal.objects.create(
parent=current_assets, name="Trade Debtors")
# LIABILITIES
liabilities = Nominal.objects.create(name="Liabilities")
current_liabilities = Nominal.objects.create(
parent=liabilities, name="Current Liabilities")
cls.vat_nominal = Nominal.objects.create(
parent=current_liabilities, name="Vat")
cls.vat_code = Vat.objects.create(
code="1", name="standard rate", rate=20)
cls.url = reverse("nominals:create")
ModuleSettings.objects.create(
cash_book_period=cls.period,
nominals_period=cls.period,
purchases_period=cls.period,
sales_period=cls.period
)
# CORRECT USAGE
# Can request the create journal view with the t=nj GET parameter
def test_get_request_with_query_parameter(self):
self.client.force_login(self.user)
response = self.client.get(self.url + "?t=nj")
self.assertEqual(response.status_code, 200)
# This HTML fragment is before the selectize widget does its thing
self.assertContains(
response,
'<select name="header-type" class="transaction-type-select form-control form-control-sm" id="id_header-type">'
'<option value="nj" selected>Journal</option>'
'</select>',
html=True
)
# CORRECT USAGE
# Can request the create journal view without the GET parameter
def test_get_request_without_query_parameter(self):
self.client.force_login(self.user)
response = self.client.get(self.url)
self.assertEqual(response.status_code, 200)
# This HTML fragment is before the selectize widget does its thing
self.assertContains(
response,
'<select name="header-type" class="transaction-type-select form-control form-control-sm" id="id_header-type">'
'<option value="nj" selected>Journal</option>'
'</select>',
html=True
)
# CORRECT USAGE
# Each line contains non-zero goods and vat
def test_create_journal(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": 120,
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": self.description,
"goods": 100,
"nominal": self.bank_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": 20
}
)
line_forms.append(
{
"description": self.description,
"goods": -100,
"nominal": self.debtors_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": -20
}
)
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 302)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
'nj',
)
self.assertEqual(
header.ref,
self.ref,
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.goods,
100
)
self.assertEqual(
header.vat,
20
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
lines = NominalLine.objects.all()
nominal_transactions = NominalTransaction.objects.all()
vat_transactions = VatTransaction.objects.all().order_by("line")
self.assertEqual(
len(lines),
2
)
self.assertEqual(
len(vat_transactions),
2
)
debit = lines[0]
credit = lines[1]
# DEBIT
self.assertEqual(
debit.description,
self.description
)
self.assertEqual(
debit.goods,
100
)
self.assertEqual(
debit.nominal,
self.bank_nominal
)
self.assertEqual(
debit.vat_code,
self.vat_code
)
self.assertEqual(
debit.vat,
20
)
self.assertEqual(
debit.goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
debit.vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
debit.vat_transaction,
vat_transactions[0]
)
# CREDIT
self.assertEqual(
credit.description,
self.description
)
self.assertEqual(
credit.goods,
-100
)
self.assertEqual(
credit.nominal,
self.debtors_nominal
)
self.assertEqual(
credit.vat_code,
self.vat_code
)
self.assertEqual(
credit.vat,
-20
)
self.assertEqual(
credit.goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
credit.vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
credit.vat_transaction,
vat_transactions[1]
)
self.assertEqual(
len(nominal_transactions),
4
)
# debit goods
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
debit.pk
)
self.assertEqual(
nominal_transactions[0].nominal,
self.bank_nominal
)
self.assertEqual(
nominal_transactions[0].value,
100
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
self.period
)
self.assertEqual(
nominal_transactions[0].type,
"nj"
)
self.assertEqual(
nominal_transactions[0].field,
"g"
)
# debit vat
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
debit.pk
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
20
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
self.period
)
self.assertEqual(
nominal_transactions[1].type,
"nj"
)
self.assertEqual(
nominal_transactions[1].field,
"v"
)
# credit goods
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
credit.pk
)
self.assertEqual(
nominal_transactions[2].nominal,
self.debtors_nominal
)
self.assertEqual(
nominal_transactions[2].value,
-100
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
self.period
)
self.assertEqual(
nominal_transactions[2].type,
"nj"
)
self.assertEqual(
nominal_transactions[2].field,
"g"
)
# credit vat
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
credit.pk
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
-20
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
self.period
)
self.assertEqual(
nominal_transactions[3].type,
"nj"
)
self.assertEqual(
nominal_transactions[3].field,
"v"
)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
# CORRECT USAGE
# Each line contains goods only
def test_create_journal_with_goods_only_and_no_vat(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": 120,
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": self.description,
"goods": 120,
"nominal": self.bank_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": 0
}
)
line_forms.append(
{
"description": self.description,
"goods": -120,
"nominal": self.debtors_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": -0
}
)
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 302)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
'nj',
)
self.assertEqual(
header.ref,
self.ref,
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.goods,
120
)
self.assertEqual(
header.vat,
0
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
lines = NominalLine.objects.all()
nominal_transactions = NominalTransaction.objects.all()
vat_transactions = VatTransaction.objects.all().order_by("line")
self.assertEqual(
len(lines),
2
)
debit = lines[0]
credit = lines[1]
# DEBIT
self.assertEqual(
debit.description,
self.description
)
self.assertEqual(
debit.goods,
120
)
self.assertEqual(
debit.nominal,
self.bank_nominal
)
self.assertEqual(
debit.vat_code,
self.vat_code
)
self.assertEqual(
debit.vat,
0
)
self.assertEqual(
debit.goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
debit.vat_nominal_transaction,
None
)
self.assertEqual(
debit.vat_transaction,
vat_transactions[0]
)
# CREDIT
self.assertEqual(
credit.description,
self.description
)
self.assertEqual(
credit.goods,
-120
)
self.assertEqual(
credit.nominal,
self.debtors_nominal
)
self.assertEqual(
credit.vat_code,
self.vat_code
)
self.assertEqual(
credit.vat,
0
)
self.assertEqual(
credit.goods_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
credit.vat_nominal_transaction,
None
)
self.assertEqual(
credit.vat_transaction,
vat_transactions[1]
)
self.assertEqual(
len(nominal_transactions),
2
)
# debit goods
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
debit.pk
)
self.assertEqual(
nominal_transactions[0].nominal,
self.bank_nominal
)
self.assertEqual(
nominal_transactions[0].value,
120
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
self.period
)
self.assertEqual(
nominal_transactions[0].type,
"nj"
)
self.assertEqual(
nominal_transactions[0].field,
"g"
)
# credit goods
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
credit.pk
)
self.assertEqual(
nominal_transactions[1].nominal,
self.debtors_nominal
)
self.assertEqual(
nominal_transactions[1].value,
-120
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
self.period
)
self.assertEqual(
nominal_transactions[1].type,
"nj"
)
self.assertEqual(
nominal_transactions[1].field,
"g"
)
self.assertEqual(
len(vat_transactions),
2
)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
# CORRECT USAGE
# Each line contains vat only
def test_create_journal_with_vat_only_and_no_goods(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": 120,
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": self.description,
"goods": 0,
"nominal": self.bank_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": 120
}
)
line_forms.append(
{
"description": self.description,
"goods": 0,
"nominal": self.debtors_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": -120
}
)
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 302)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
'nj',
)
self.assertEqual(
header.ref,
self.ref,
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.goods,
0
)
self.assertEqual(
header.vat,
0
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
lines = NominalLine.objects.all()
nominal_transactions = NominalTransaction.objects.all()
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(lines),
2
)
debit = lines[0]
credit = lines[1]
# DEBIT
self.assertEqual(
debit.description,
self.description
)
self.assertEqual(
debit.goods,
0
)
self.assertEqual(
debit.nominal,
self.bank_nominal
)
self.assertEqual(
debit.vat_code,
self.vat_code
)
self.assertEqual(
debit.vat,
120
)
self.assertEqual(
debit.goods_nominal_transaction,
None
)
self.assertEqual(
debit.vat_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
debit.vat_transaction,
vat_transactions[0]
)
# CREDIT
self.assertEqual(
credit.description,
self.description
)
self.assertEqual(
credit.goods,
0
)
self.assertEqual(
credit.nominal,
self.debtors_nominal
)
self.assertEqual(
credit.vat_code,
self.vat_code
)
self.assertEqual(
credit.vat,
-120
)
self.assertEqual(
credit.goods_nominal_transaction,
None
)
self.assertEqual(
credit.vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
credit.vat_transaction,
vat_transactions[1]
)
self.assertEqual(
len(nominal_transactions),
2
)
# debit vat
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
debit.pk
)
self.assertEqual(
nominal_transactions[0].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[0].value,
120
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
self.period
)
self.assertEqual(
nominal_transactions[0].type,
"nj"
)
self.assertEqual(
nominal_transactions[0].field,
"v"
)
# credit vat
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
credit.pk
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
-120
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
self.period
)
self.assertEqual(
nominal_transactions[1].type,
"nj"
)
self.assertEqual(
nominal_transactions[1].field,
"v"
)
self.assertEqual(
len(vat_transactions),
2
)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
# INCORRECT USAGE
def test_create_journal_without_total(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": '',
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": self.description,
"goods": 100,
"nominal": self.bank_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": 20
}
)
line_forms.append(
{
"description": self.description,
"goods": -100,
"nominal": self.debtors_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": -20
}
)
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 200)
header = NominalHeader.objects.all()
lines = NominalLine.objects.all()
self.assertEqual(
len(header),
0
)
self.assertEqual(
len(lines),
0
)
self.assertContains(
response,
'<li class="py-1">No total entered. This should be the total value of the debit side of the journal i.e. the total of the positive values</li>',
html=True
)
self.assertEqual(
len(VatTransaction.objects.all()),
0
)
# INCORRECT USAGE
def test_create_journal_where_debits_do_not_equal_total_entered(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": 120,
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": self.description,
"goods": 100,
"nominal": self.bank_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": 20
}
)
line_forms.append(
{
"description": self.description,
"goods": -50,
"nominal": self.debtors_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": -10
}
)
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 200)
header = NominalHeader.objects.all()
lines = NominalLine.objects.all()
self.assertEqual(
len(header),
0
)
self.assertEqual(
len(lines),
0
)
self.assertContains(
response,
'<li class="py-1">Debits and credits must total zero. Total debits entered i.e. '
'positives values entered is 120.00, and total credits entered i.e. negative values entered, is -60.00. This gives a non-zero total of 60.00</li>',
html=True
)
self.assertEqual(
len(VatTransaction.objects.all()),
0
)
# INCORRECT USAGE
def test_create_journal_without_header_total_and_no_analysis(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": '',
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 200)
header = NominalHeader.objects.all()
lines = NominalLine.objects.all()
self.assertEqual(
len(header),
0
)
self.assertEqual(
len(lines),
0
)
self.assertContains(
response,
'<li class="py-1">No total entered. This should be the total value of the debit side of the journal i.e. the total of the positive values</li>',
html=True
)
self.assertEqual(
len(VatTransaction.objects.all()),
0
)
# INCORRECT USAGE
def test_create_journal_with_header_total_and_no_analysis(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": 120,
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 200)
header = NominalHeader.objects.all()
lines = NominalLine.objects.all()
self.assertEqual(
len(header),
0
)
self.assertEqual(
len(lines),
0
)
self.assertContains(
response,
'<li class="py-1">The total of the debits does not equal the total you entered.</li>',
html=True
)
self.assertEqual(
len(VatTransaction.objects.all()),
0
)
# INCORRECT USAGE
def test_create_journal_with_header_which_is_credit_total(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": -120,
"period": self.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 200)
header = NominalHeader.objects.all()
lines = NominalLine.objects.all()
self.assertEqual(
len(header),
0
)
self.assertEqual(
len(lines),
0
)
self.assertContains(
response,
'<li class="py-1">The total of the debits does not equal the total you entered.</li>',
html=True
)
self.assertEqual(
len(VatTransaction.objects.all()),
0
)
# INCORRECT USAGE
def test_create_journal_without_vat_type_but_vat_is_analysed(self):
self.client.force_login(self.user)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": "nj",
"ref": self.ref,
"date": self.date,
"total": '',
"period": self.period.pk,
"vat_type": ""
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": self.description,
"goods": 100,
"nominal": self.bank_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": 20
}
)
line_forms.append(
{
"description": self.description,
"goods": -100,
"nominal": self.debtors_nominal.pk,
"vat_code": self.vat_code.pk,
"vat": -20
}
)
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
data.update(line_data)
response = self.client.post(self.url, data)
self.assertEqual(response.status_code, 200)
header = NominalHeader.objects.all()
lines = NominalLine.objects.all()
self.assertEqual(
len(header),
0
)
self.assertEqual(
len(lines),
0
)
self.assertContains(
response,
'<li class="py-1">If you want to analyse the vat you need to state at the top of the page whether it is input or output</li>',
html=True
)
self.assertEqual(
len(VatTransaction.objects.all()),
0
)
class EditJournal(TestCase):
@classmethod
def setUpTestData(cls):
cls.user = get_user_model().objects.create_superuser(username="dummy", password="dummy")
cls.ref = "test journal"
cls.date = datetime.now().strftime(DATE_INPUT_FORMAT)
cls.due_date = (datetime.now() + timedelta(days=31)).strftime(DATE_INPUT_FORMAT)
cls.model_date = datetime.now().strftime(MODEL_DATE_INPUT_FORMAT)
cls.model_due_date = (datetime.now() + timedelta(days=31)).strftime(MODEL_DATE_INPUT_FORMAT)
fy = FinancialYear.objects.create(financial_year=2020)
cls.fy = fy
cls.period = Period.objects.create(fy=fy, period="01", fy_and_period="202001", month_start=date(2020, 1, 31))
cls.description = "a line description"
assets = Nominal.objects.create(name="Assets")
current_assets = Nominal.objects.create(
parent=assets, name="Current Assets")
cls.bank_nominal = Nominal.objects.create(
parent=current_assets, name="Bank Account")
cls.debtors_nominal = Nominal.objects.create(
parent=current_assets, name="Trade Debtors")
# LIABILITIES
liabilities = Nominal.objects.create(name="Liabilities")
current_liabilities = Nominal.objects.create(
parent=liabilities, name="Current Liabilities")
cls.vat_nominal = Nominal.objects.create(
parent=current_liabilities, name="Vat")
cls.vat_code = Vat.objects.create(
code="1", name="standard rate", rate=20)
ModuleSettings.objects.create(
cash_book_period=cls.period,
nominals_period=cls.period,
purchases_period=cls.period,
sales_period=cls.period
)
# CORRECT USAGE
# Can request the edit journal view for an existing journal
def test_get_request_with_query_parameter(self):
self.client.force_login(self.user)
header, lines, nominal_transactions = create_nominal_journal({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 120,
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
self.vat_nominal
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.vat_type,
"o"
)
lines = NominalLine.objects.all()
create_vat_transactions(header, lines)
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(vat_transactions),
2
)
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(nominal_transactions),
4
)
# DEBITS
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
header.period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[1].module,
"NL",
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
header.period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
# CREDITS
self.assertEqual(
nominal_transactions[2].module,
"NL"
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
header.period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
header.period
)
self.assertEqual(
nominal_transactions[3].type,
header.type
)
url = reverse("nominals:edit", kwargs={"pk": header.pk})
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
# This HTML fragment is before the selectize widget does its thing
self.assertContains(
response,
'<select name="header-type" class="transaction-type-select form-control form-control-sm" disabled id="id_header-type">'
'<option value="nj" selected>Journal</option>'
'</select>',
html=True
)
# CORRECT USAGE
# HALVE THE GOODS AND VAT ON EACH LINE
def test_edit_journal(self):
self.client.force_login(self.user)
header, lines, nominal_transactions = create_nominal_journal({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 120,
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
self.vat_nominal
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
lines = NominalLine.objects.all()
create_vat_transactions(header, lines)
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(vat_transactions),
2
)
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
lines[0].vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
lines[0].vat_transaction,
vat_transactions[0]
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
lines[1].vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
lines[1].vat_transaction,
vat_transactions[1]
)
# DEBIT NOM TRANS
self.assertEqual(
len(nominal_transactions),
4
)
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
header.period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[1].module,
"NL"
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
header.period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
# CREDIT NOM TRANS
self.assertEqual(
nominal_transactions[2].module,
"NL",
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
header.period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
header.period
)
self.assertEqual(
nominal_transactions[3].type,
header.type
)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": header.type,
"ref": header.ref,
"date": header.date.strftime(DATE_INPUT_FORMAT),
"total": 60,
"period": header.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": lines[0].description,
"goods": 50,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 10
}
)
line_forms[0]["id"] = lines[0].pk
line_forms.append(
{
"description": lines[1].description,
"goods": -50,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": -10
}
)
line_forms[1]["id"] = lines[1].pk
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
line_data["line-INITIAL_FORMS"] = 2
data.update(line_data)
url = reverse("nominals:edit", kwargs={"pk": header.pk})
response = self.client.post(url, data)
self.assertEqual(response.status_code, 302)
# POST EDIT ...
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
60
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(vat_transactions),
2
)
lines = NominalLine.objects.all()
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
50
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
10
)
self.assertEqual(
lines[0].goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
lines[0].vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
lines[0].vat_transaction,
vat_transactions[0]
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-50
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-10
)
self.assertEqual(
lines[1].goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
lines[1].vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
lines[1].vat_transaction,
vat_transactions[1]
)
# DEBIT NOM TRANS
self.assertEqual(
len(nominal_transactions),
4
)
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
header.period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[1].module,
"NL"
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
header.period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
# CREDIT NOM TRANS
self.assertEqual(
nominal_transactions[2].module,
"NL",
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
header.period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
header.period
)
self.assertEqual(
nominal_transactions[3].type,
header.type
)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
# CORRECT USAGE
def test_edit_journal_by_adding_two_new_lines(self):
self.client.force_login(self.user)
header, line, nominal_transactions = create_nominal_journal({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 120,
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
self.vat_nominal
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
lines = NominalLine.objects.all()
create_vat_transactions(header, lines)
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(vat_transactions),
2
)
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
lines[0].vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
lines[0].vat_transaction,
vat_transactions[0]
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
lines[1].vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
lines[1].vat_transaction,
vat_transactions[1]
)
# DEBIT NOM TRANS
self.assertEqual(
len(nominal_transactions),
4
)
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
header.period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[1].module,
"NL"
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
header.period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
# CREDIT NOM TRANS
self.assertEqual(
nominal_transactions[2].module,
"NL",
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
header.period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
header.period
)
self.assertEqual(
nominal_transactions[3].type,
header.type
)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": header.type,
"ref": header.ref,
"date": header.date.strftime(DATE_INPUT_FORMAT),
"total": 240,
"period": header.period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": lines[0].description,
"goods": 100,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 20
}
)
line_forms[0]["id"] = lines[0].pk
line_forms.append(
{
"description": lines[1].description,
"goods": -100,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": -20
}
)
line_forms[1]["id"] = lines[1].pk
# NEW LINES
line_forms.append(
{
"description": lines[0].description,
"goods": 100,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 20
}
)
line_forms[2]["id"] = ""
line_forms.append(
{
"description": lines[1].description,
"goods": -100,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": -20
}
)
line_forms[3]["id"] = ""
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
line_data["line-INITIAL_FORMS"] = 2
data.update(line_data)
url = reverse("nominals:edit", kwargs={"pk": header.pk})
response = self.client.post(url, data)
self.assertEqual(response.status_code, 302)
# POST EDIT ...
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
240
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
vat_transactions = VatTransaction.objects.all().order_by("line")
self.assertEqual(
len(vat_transactions),
4
)
lines = NominalLine.objects.all().order_by("pk")
nominal_transactions = NominalTransaction.objects.all().order_by("pk")
self.assertEqual(
len(lines),
4
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
lines[0].vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
lines[0].vat_transaction,
vat_transactions[0]
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
lines[1].vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
lines[1].vat_transaction,
vat_transactions[1]
)
# NEW LINES
self.assertEqual(
lines[2].description,
"line 1"
)
self.assertEqual(
lines[2].goods,
100
)
self.assertEqual(
lines[2].nominal,
self.bank_nominal
)
self.assertEqual(
lines[2].vat_code,
self.vat_code
)
self.assertEqual(
lines[2].vat,
20
)
self.assertEqual(
lines[2].goods_nominal_transaction,
nominal_transactions[4]
)
self.assertEqual(
lines[2].vat_nominal_transaction,
nominal_transactions[5]
)
self.assertEqual(
lines[2].vat_transaction,
vat_transactions[2]
)
self.assertEqual(
lines[3].description,
"line 2"
)
self.assertEqual(
lines[3].goods,
-100
)
self.assertEqual(
lines[3].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[3].vat_code,
self.vat_code
)
self.assertEqual(
lines[3].vat,
-20
)
self.assertEqual(
lines[3].goods_nominal_transaction,
nominal_transactions[6]
)
self.assertEqual(
lines[3].vat_nominal_transaction,
nominal_transactions[7]
)
self.assertEqual(
lines[3].vat_transaction,
vat_transactions[3]
)
# DEBIT NOM TRANS
self.assertEqual(
len(nominal_transactions),
8
)
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
header.period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[0].field,
"g"
)
self.assertEqual(
nominal_transactions[1].module,
"NL"
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
header.period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
self.assertEqual(
nominal_transactions[1].field,
"v"
)
self.assertEqual(
nominal_transactions[4].module,
"NL",
)
self.assertEqual(
nominal_transactions[4].header,
header.pk
)
self.assertEqual(
nominal_transactions[4].line,
lines[2].pk,
)
self.assertEqual(
nominal_transactions[4].nominal,
lines[2].nominal
)
self.assertEqual(
nominal_transactions[4].value,
lines[2].goods
)
self.assertEqual(
nominal_transactions[4].ref,
header.ref
)
self.assertEqual(
nominal_transactions[4].period,
header.period
)
self.assertEqual(
nominal_transactions[4].type,
header.type
)
self.assertEqual(
nominal_transactions[4].field,
"g"
)
self.assertEqual(
nominal_transactions[5].module,
"NL"
)
self.assertEqual(
nominal_transactions[5].header,
header.pk
)
self.assertEqual(
nominal_transactions[5].line,
lines[2].pk,
)
self.assertEqual(
nominal_transactions[5].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[5].value,
lines[2].vat
)
self.assertEqual(
nominal_transactions[5].ref,
header.ref
)
self.assertEqual(
nominal_transactions[5].period,
header.period
)
self.assertEqual(
nominal_transactions[5].type,
header.type
)
self.assertEqual(
nominal_transactions[5].field,
"v"
)
# CREDIT NOM TRANS
self.assertEqual(
nominal_transactions[2].module,
"NL",
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
header.period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[2].field,
"g"
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
header.period
)
        self.assertEqual(
            nominal_transactions[3].type,
            header.type
        )
        self.assertEqual(
            nominal_transactions[3].field,
            "v"
        )
self.assertEqual(
nominal_transactions[6].module,
"NL",
)
self.assertEqual(
nominal_transactions[6].header,
header.pk
)
self.assertEqual(
nominal_transactions[6].line,
lines[3].pk,
)
self.assertEqual(
nominal_transactions[6].nominal,
lines[3].nominal
)
self.assertEqual(
nominal_transactions[6].value,
lines[3].goods
)
self.assertEqual(
nominal_transactions[6].ref,
header.ref
)
self.assertEqual(
nominal_transactions[6].period,
header.period
)
self.assertEqual(
nominal_transactions[6].type,
header.type
)
self.assertEqual(
nominal_transactions[6].field,
"g"
)
self.assertEqual(
nominal_transactions[7].module,
"NL"
)
self.assertEqual(
nominal_transactions[7].header,
header.pk
)
self.assertEqual(
nominal_transactions[7].line,
lines[3].pk,
)
self.assertEqual(
nominal_transactions[7].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[7].value,
lines[3].vat
)
self.assertEqual(
nominal_transactions[7].ref,
header.ref
)
self.assertEqual(
nominal_transactions[7].period,
header.period
)
        self.assertEqual(
            nominal_transactions[7].type,
            header.type
        )
        self.assertEqual(
            nominal_transactions[7].field,
            "v"
        )
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
# INCORRECT USAGE
# START OFF WITH FOUR LINES AND THEN ZERO OUT BOTTOM TWO
def test_edit_journal_by_zeroing_out_bottom_two_lines(self):
self.client.force_login(self.user)
header, line, nominal_transactions = create_nominal_journal({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 240,
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
},
{
"line_no": 3,
"description": "line 3",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 4,
"description": "line 4",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
self.vat_nominal
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
240
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
lines = NominalLine.objects.all().order_by("pk")
create_vat_transactions(header, lines)
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(vat_transactions),
4
)
nominal_transactions = NominalTransaction.objects.all().order_by("pk")
self.assertEqual(
len(lines),
4
)
        debit_nom_trans = (
            list(nominal_transactions[:2]) + list(nominal_transactions[4:6])
        )
debit_vat_trans = [vat_transactions[0], vat_transactions[2]]
debit_lines = lines[::2]
for i, line in enumerate(debit_lines):
self.assertEqual(
line.line_no,
(2 * i) + 1
)
self.assertEqual(
line.goods,
100
)
self.assertEqual(
line.nominal,
self.bank_nominal
)
self.assertEqual(
line.vat_code,
self.vat_code
)
self.assertEqual(
line.vat,
20
)
self.assertEqual(
line.goods_nominal_transaction,
debit_nom_trans[(2 * i) + 0]
)
self.assertEqual(
line.vat_nominal_transaction,
debit_nom_trans[(2 * i) + 1]
)
self.assertEqual(
line.vat_transaction,
debit_vat_trans[i]
)
        credit_nom_trans = (
            list(nominal_transactions[2:4]) + list(nominal_transactions[6:])
        )
credit_vat_trans = [vat_transactions[1], vat_transactions[3]]
credit_lines = lines[1::2]
for i, line in enumerate(credit_lines):
self.assertEqual(
line.line_no,
(2 * i) + 2
)
self.assertEqual(
line.goods,
-100
)
self.assertEqual(
line.nominal,
self.debtors_nominal
)
self.assertEqual(
line.vat_code,
self.vat_code
)
self.assertEqual(
line.vat,
-20
)
self.assertEqual(
line.goods_nominal_transaction,
credit_nom_trans[(2 * i) + 0]
)
self.assertEqual(
line.vat_nominal_transaction,
credit_nom_trans[(2 * i) + 1]
)
self.assertEqual(
line.vat_transaction,
credit_vat_trans[i]
)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": header.type,
"ref": header.ref,
"date": header.date.strftime(DATE_INPUT_FORMAT),
"total": 240,
"period": header.period.pk,
"vat_type": header.vat_type
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": lines[0].description,
"goods": 100,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 20
}
)
line_forms[0]["id"] = lines[0].pk
line_forms.append(
{
"description": lines[1].description,
"goods": -100,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": -20
}
)
line_forms[1]["id"] = lines[1].pk
line_forms.append(
{
"description": lines[0].description,
"goods": 0,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 0
}
)
line_forms[2]["id"] = lines[2].pk
line_forms.append(
{
"description": lines[1].description,
"goods": 0,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": 0
}
)
line_forms[3]["id"] = lines[3].pk
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
line_data["line-INITIAL_FORMS"] = 4
data.update(line_data)
url = reverse("nominals:edit", kwargs={"pk": header.pk})
response = self.client.post(url, data)
self.assertEqual(response.status_code, 200)
self.assertContains(
response,
'<li class="py-1">Goods and Vat cannot both be zero.</li>',
html=True
)
# CORRECT USAGE
# START OFF WITH FOUR LINES AND THEN MARK BOTTOM TWO AS DELETED
def test_edit_journal_by_deleting_bottom_two_lines(self):
self.client.force_login(self.user)
header, line, nominal_transactions = create_nominal_journal({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 120,
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
},
{
"line_no": 3,
"description": "line 3",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 4,
"description": "line 4",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
self.vat_nominal
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
lines = NominalLine.objects.all().order_by("pk")
create_vat_transactions(header, lines)
vat_transactions = VatTransaction.objects.all()
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
self.assertEqual(
len(vat_transactions),
4
)
nominal_transactions = NominalTransaction.objects.all().order_by("pk")
self.assertEqual(
len(lines),
4
)
        debit_nom_trans = (
            list(nominal_transactions[:2]) + list(nominal_transactions[4:6])
        )
        debit_vat_trans = [vat_transactions[0], vat_transactions[2]]
debit_lines = lines[::2]
for i, line in enumerate(debit_lines):
self.assertEqual(
line.line_no,
(2 * i) + 1
)
self.assertEqual(
line.goods,
100
)
self.assertEqual(
line.nominal,
self.bank_nominal
)
self.assertEqual(
line.vat_code,
self.vat_code
)
self.assertEqual(
line.vat,
20
)
self.assertEqual(
line.goods_nominal_transaction,
debit_nom_trans[(2 * i) + 0]
)
self.assertEqual(
line.vat_nominal_transaction,
debit_nom_trans[(2 * i) + 1]
)
self.assertEqual(
line.vat_transaction,
debit_vat_trans[i]
)
        credit_nom_trans = (
            list(nominal_transactions[2:4]) + list(nominal_transactions[6:])
        )
        credit_vat_trans = [vat_transactions[1], vat_transactions[3]]
credit_lines = lines[1::2]
for i, line in enumerate(credit_lines):
self.assertEqual(
line.line_no,
(2 * i) + 2
)
self.assertEqual(
line.goods,
-100
)
self.assertEqual(
line.nominal,
self.debtors_nominal
)
self.assertEqual(
line.vat_code,
self.vat_code
)
self.assertEqual(
line.vat,
-20
)
self.assertEqual(
line.goods_nominal_transaction,
credit_nom_trans[(2 * i) + 0]
)
self.assertEqual(
line.vat_nominal_transaction,
credit_nom_trans[(2 * i) + 1]
)
self.assertEqual(
line.vat_transaction,
credit_vat_trans[i]
)
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": header.type,
"ref": header.ref,
"date": header.date.strftime(DATE_INPUT_FORMAT),
"total": 120,
"period": header.period.pk,
"vat_type": header.vat_type
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": lines[0].description,
"goods": 100,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 20
}
)
line_forms[0]["id"] = lines[0].pk
line_forms.append(
{
"description": lines[1].description,
"goods": -100,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": -20
}
)
line_forms[1]["id"] = lines[1].pk
line_forms.append(
{
"description": lines[0].description,
"goods": 0,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 0
}
)
line_forms[2]["id"] = lines[2].pk
line_forms[2]["DELETE"] = "yes"
line_forms.append(
{
"description": lines[1].description,
"goods": 0,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": 0
}
)
line_forms[3]["id"] = lines[3].pk
line_forms[3]["DELETE"] = "yes"
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
line_data["line-INITIAL_FORMS"] = 4
data.update(line_data)
url = reverse("nominals:edit", kwargs={"pk": header.pk})
response = self.client.post(url, data)
self.assertEqual(response.status_code, 302)
# POST EDIT ...
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
lines = NominalLine.objects.all().order_by("pk")
vat_transactions = VatTransaction.objects.all().order_by("line")
self.assertEqual(
len(vat_transactions),
2
)
nominal_transactions = NominalTransaction.objects.all().order_by("pk")
self.assertEqual(
len(lines),
2
)
self.assertEqual(
len(nominal_transactions),
4
)
debit_nom_trans = nominal_transactions[:2]
debit_line = lines[0]
self.assertEqual(
debit_line.line_no,
1
)
self.assertEqual(
debit_line.goods,
100
)
self.assertEqual(
debit_line.nominal,
self.bank_nominal
)
self.assertEqual(
debit_line.vat_code,
self.vat_code
)
self.assertEqual(
debit_line.vat,
20
)
self.assertEqual(
debit_line.goods_nominal_transaction,
debit_nom_trans[0]
)
self.assertEqual(
debit_line.vat_nominal_transaction,
debit_nom_trans[1]
)
self.assertEqual(
debit_line.vat_transaction,
vat_transactions[0]
)
credit_nom_trans = nominal_transactions[2:]
credit_line = lines[1]
self.assertEqual(
credit_line.line_no,
2
)
self.assertEqual(
credit_line.goods,
-100
)
self.assertEqual(
credit_line.nominal,
self.debtors_nominal
)
self.assertEqual(
credit_line.vat_code,
self.vat_code
)
self.assertEqual(
credit_line.vat,
-20
)
self.assertEqual(
credit_line.goods_nominal_transaction,
credit_nom_trans[0]
)
self.assertEqual(
credit_line.vat_nominal_transaction,
credit_nom_trans[1]
)
self.assertEqual(
credit_line.vat_transaction,
vat_transactions[1]
)
        total = sum(tran.value for tran in nominal_transactions)
        self.assertEqual(total, 0)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
def test_edit_header_only(self):
"""
A change in the header only might still mean the lines and nominal and vat transactions need updating
E.g. the period is changed.
"""
self.client.force_login(self.user)
header, line, nominal_transactions = create_nominal_journal({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 120,
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
self.vat_nominal
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
lines = NominalLine.objects.all()
create_vat_transactions(header, lines)
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(vat_transactions),
2
)
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
lines[0].vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
lines[0].vat_transaction,
vat_transactions[0]
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
lines[1].vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
lines[1].vat_transaction,
vat_transactions[1]
)
# DEBIT NOM TRANS
self.assertEqual(
len(nominal_transactions),
4
)
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
header.period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[1].module,
"NL"
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
header.period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
# CREDIT NOM TRANS
self.assertEqual(
nominal_transactions[2].module,
"NL",
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
header.period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
header.period
)
self.assertEqual(
nominal_transactions[3].type,
header.type
)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
header.period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
new_period = Period.objects.create(fy=self.fy, fy_and_period="202002", period="02", month_start=date(2020,2,29))
data = {}
header_data = create_header(
HEADER_FORM_PREFIX,
{
"type": header.type,
"ref": header.ref,
"date": header.date.strftime(DATE_INPUT_FORMAT),
"total": 120,
"period": new_period.pk,
"vat_type": "o"
}
)
data.update(header_data)
line_forms = []
line_forms.append(
{
"description": lines[0].description,
"goods": 100,
"nominal": lines[0].nominal_id,
"vat_code": lines[0].vat_code_id,
"vat": 20
}
)
line_forms[0]["id"] = lines[0].pk
line_forms.append(
{
"description": lines[1].description,
"goods": -100,
"nominal": lines[1].nominal_id,
"vat_code": lines[1].vat_code_id,
"vat": -20
}
)
line_forms[1]["id"] = lines[1].pk
line_data = create_formset_data(LINE_FORM_PREFIX, line_forms)
line_data["line-INITIAL_FORMS"] = 2
data.update(line_data)
url = reverse("nominals:edit", kwargs={"pk": header.pk})
response = self.client.post(url, data)
self.assertEqual(response.status_code, 302)
# POST EDIT ...
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
new_period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.vat_type,
"o"
)
# NOM LINES
vat_transactions = VatTransaction.objects.all()
self.assertEqual(
len(vat_transactions),
2
)
lines = NominalLine.objects.all()
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
lines[0].vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
lines[0].vat_transaction,
vat_transactions[0]
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
lines[1].vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
lines[1].vat_transaction,
vat_transactions[1]
)
# DEBIT NOM TRANS
self.assertEqual(
len(nominal_transactions),
4
)
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
new_period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[1].module,
"NL"
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
new_period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
# CREDIT NOM TRANS
self.assertEqual(
nominal_transactions[2].module,
"NL",
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
new_period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
new_period
)
self.assertEqual(
nominal_transactions[3].type,
header.type
)
for i, vat_tran in enumerate(vat_transactions):
self.assertEqual(
vat_tran.header,
header.pk
)
self.assertEqual(
vat_tran.line,
lines[i].pk
)
self.assertEqual(
vat_tran.module,
"NL"
)
self.assertEqual(
vat_tran.ref,
header.ref
)
self.assertEqual(
vat_tran.period,
new_period
)
self.assertEqual(
vat_tran.date,
header.date
)
self.assertEqual(
vat_tran.field,
"v"
)
self.assertEqual(
vat_tran.tran_type,
header.type
)
self.assertEqual(
vat_tran.vat_type,
"o"
)
self.assertEqual(
vat_tran.vat_code,
lines[i].vat_code
)
self.assertEqual(
vat_tran.vat_rate,
lines[i].vat_code.rate
)
self.assertEqual(
vat_tran.goods,
lines[i].goods
)
self.assertEqual(
vat_tran.vat,
lines[i].vat
)
class VoidJournal(TestCase):
@classmethod
def setUpTestData(cls):
cls.user = get_user_model().objects.create_superuser(username="dummy", password="dummy")
cls.ref = "test journal"
cls.date = datetime.now().strftime(DATE_INPUT_FORMAT)
cls.due_date = (datetime.now() + timedelta(days=31)
).strftime(DATE_INPUT_FORMAT)
cls.model_date = datetime.now().strftime(MODEL_DATE_INPUT_FORMAT)
cls.model_due_date = (datetime.now() + timedelta(days=31)
).strftime(MODEL_DATE_INPUT_FORMAT)
fy = FinancialYear.objects.create(financial_year=2020)
cls.period = Period.objects.create(fy=fy, period="01", fy_and_period="202001", month_start=date(2020,1,31))
cls.description = "a line description"
assets = Nominal.objects.create(name="Assets")
current_assets = Nominal.objects.create(
parent=assets, name="Current Assets")
cls.bank_nominal = Nominal.objects.create(
parent=current_assets, name="Bank Account")
cls.debtors_nominal = Nominal.objects.create(
parent=current_assets, name="Trade Debtors")
# LIABILITIES
liabilities = Nominal.objects.create(name="Liabilities")
current_liabilities = Nominal.objects.create(
parent=liabilities, name="Current Liabilities")
cls.vat_nominal = Nominal.objects.create(
parent=current_assets, name="Vat")
cls.vat_code = Vat.objects.create(
code="1", name="standard rate", rate=20)
ModuleSettings.objects.create(
cash_book_period=cls.period,
nominals_period=cls.period,
purchases_period=cls.period,
sales_period=cls.period
)
# INCORRECT USAGE
def test_void_journal_already_voided(self):
self.client.force_login(self.user)
header, line = create_nominal_journal_without_nom_trans({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 120,
"status": "v",
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.status,
'v'
)
# NOM LINES
lines = NominalLine.objects.all()
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(nominal_transactions),
0
)
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
None
)
self.assertEqual(
lines[0].vat_nominal_transaction,
None
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
None
)
self.assertEqual(
lines[1].vat_nominal_transaction,
None
)
data = {}
data["void-id"] = header.id
url = reverse("nominals:void", kwargs={"pk": header.pk})
response = self.client.post(url, data)
self.assertEqual(response.status_code, 200)
        content = response.content.decode("utf-8")
json_content = loads(content)
self.assertEqual(
json_content["success"],
False
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.status,
'v'
)
# NOM LINES
lines = NominalLine.objects.all()
nominal_transactions = NominalTransaction.objects.all()
self.assertEqual(
len(nominal_transactions),
0
)
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
None
)
self.assertEqual(
lines[0].vat_nominal_transaction,
None
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
None
)
self.assertEqual(
lines[1].vat_nominal_transaction,
None
)
# CORRECT USAGE
# JUST HALF THE GOODS AND VAT
def test_void_journal(self):
self.client.force_login(self.user)
header, line, nominal_transactions = create_nominal_journal({
"header": {
"type": "nj",
"ref": "test journal",
"period": self.period,
"date": self.model_date,
"total": 120,
"vat_type": "o"
},
"lines": [
{
"line_no": 1,
"description": "line 1",
"goods": 100,
"nominal": self.bank_nominal,
"vat_code": self.vat_code,
"vat": 20
},
{
"line_no": 2,
"description": "line 2",
"goods": -100,
"nominal": self.debtors_nominal,
"vat_code": self.vat_code,
"vat": -20
}
],
},
self.vat_nominal
)
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.status,
'c'
)
# NOM LINES
lines = NominalLine.objects.all()
create_vat_transactions(header, lines)
vat_transactions = VatTransaction.objects.all().order_by("pk")
self.assertEqual(
len(vat_transactions),
2
)
nominal_transactions = NominalTransaction.objects.all().order_by("pk")
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
nominal_transactions[0]
)
self.assertEqual(
lines[0].vat_nominal_transaction,
nominal_transactions[1]
)
self.assertEqual(
lines[0].vat_transaction,
vat_transactions[0]
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
nominal_transactions[2]
)
self.assertEqual(
lines[1].vat_nominal_transaction,
nominal_transactions[3]
)
self.assertEqual(
lines[1].vat_transaction,
vat_transactions[1]
)
# DEBIT NOM TRANS
self.assertEqual(
len(nominal_transactions),
4
)
self.assertEqual(
nominal_transactions[0].module,
"NL",
)
self.assertEqual(
nominal_transactions[0].header,
header.pk
)
self.assertEqual(
nominal_transactions[0].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[0].nominal,
lines[0].nominal
)
self.assertEqual(
nominal_transactions[0].value,
lines[0].goods
)
self.assertEqual(
nominal_transactions[0].ref,
header.ref
)
self.assertEqual(
nominal_transactions[0].period,
header.period
)
self.assertEqual(
nominal_transactions[0].type,
header.type
)
self.assertEqual(
nominal_transactions[1].module,
"NL"
)
self.assertEqual(
nominal_transactions[1].header,
header.pk
)
self.assertEqual(
nominal_transactions[1].line,
lines[0].pk,
)
self.assertEqual(
nominal_transactions[1].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[1].value,
lines[0].vat
)
self.assertEqual(
nominal_transactions[1].ref,
header.ref
)
self.assertEqual(
nominal_transactions[1].period,
header.period
)
self.assertEqual(
nominal_transactions[1].type,
header.type
)
# CREDIT NOM TRANS
self.assertEqual(
nominal_transactions[2].module,
"NL",
)
self.assertEqual(
nominal_transactions[2].header,
header.pk
)
self.assertEqual(
nominal_transactions[2].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[2].nominal,
lines[1].nominal
)
self.assertEqual(
nominal_transactions[2].value,
lines[1].goods
)
self.assertEqual(
nominal_transactions[2].ref,
header.ref
)
self.assertEqual(
nominal_transactions[2].period,
header.period
)
self.assertEqual(
nominal_transactions[2].type,
header.type
)
self.assertEqual(
nominal_transactions[3].module,
"NL"
)
self.assertEqual(
nominal_transactions[3].header,
header.pk
)
self.assertEqual(
nominal_transactions[3].line,
lines[1].pk,
)
self.assertEqual(
nominal_transactions[3].nominal,
self.vat_nominal
)
self.assertEqual(
nominal_transactions[3].value,
lines[1].vat
)
self.assertEqual(
nominal_transactions[3].ref,
header.ref
)
self.assertEqual(
nominal_transactions[3].period,
header.period
)
self.assertEqual(
nominal_transactions[3].type,
header.type
)
data = {}
data["void-id"] = header.id
url = reverse("nominals:void", kwargs={"pk": header.pk})
response = self.client.post(url, data)
self.assertEqual(response.status_code, 200)
        content = response.content.decode("utf-8")
json_content = loads(content)
self.assertEqual(
json_content["success"],
True
)
self.assertEqual(
json_content["href"],
reverse("nominals:transaction_enquiry")
)
# POST VOID
header = NominalHeader.objects.all()
self.assertEqual(
len(header),
1
)
header = header[0]
self.assertEqual(
header.type,
"nj"
)
self.assertEqual(
header.ref,
"test journal"
)
self.assertEqual(
header.period,
self.period
)
self.assertEqual(
header.total,
120
)
self.assertEqual(
header.status,
'v'
)
# NOM LINES
lines = NominalLine.objects.all()
self.assertEqual(
len(lines),
2
)
self.assertEqual(
lines[0].description,
"line 1"
)
self.assertEqual(
lines[0].goods,
100
)
self.assertEqual(
lines[0].nominal,
self.bank_nominal
)
self.assertEqual(
lines[0].vat_code,
self.vat_code
)
self.assertEqual(
lines[0].vat,
20
)
self.assertEqual(
lines[0].goods_nominal_transaction,
None
)
self.assertEqual(
lines[0].vat_nominal_transaction,
None
)
self.assertEqual(
lines[0].vat_transaction,
None
)
self.assertEqual(
lines[1].description,
"line 2"
)
self.assertEqual(
lines[1].goods,
-100
)
self.assertEqual(
lines[1].nominal,
self.debtors_nominal
)
self.assertEqual(
lines[1].vat_code,
self.vat_code
)
self.assertEqual(
lines[1].vat,
-20
)
self.assertEqual(
lines[1].goods_nominal_transaction,
None
)
self.assertEqual(
lines[1].vat_nominal_transaction,
None
)
self.assertEqual(
lines[1].vat_transaction,
None
)
self.assertEqual(
len(NominalTransaction.objects.all()),
0
)
self.assertEqual(
len(
VatTransaction.objects.all()
),
0
        )
# ---- Wayload/__init__.py (repo: Walker-00/Wayload, license: MIT) ----
from payload.hide import hd
from payload.extract import apk
from payload.extract import png
from payload.extract import jpg
from payload.extract import jpeg
from payload.extract import gif
# ---- metodebibliotek/__init__.py (repo: statisticsnorway/metodebibliotek, license: Apache-2.0) ----
from metodebibliotek import MakeUrl
from metodebibliotek import GetKlass
# ---- odziez/employees/migrations/0003_auto_20191029_2037.py (repo: szymanskirafal/odziez, license: MIT) ----
# Generated by Django 2.1.8 on 2019-10-29 20:37
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('employees', '0002_auto_20190709_1756'),
]
operations = [
migrations.AlterField(
model_name='employee',
name='body_size',
field=models.CharField(choices=[('XL', 'XL'), ('L', 'L'), ('M', 'M'), ('S', 'S')], max_length=2, verbose_name='body_size'),
),
migrations.AlterField(
model_name='employee',
name='colar',
field=models.PositiveSmallIntegerField(verbose_name='colar'),
),
migrations.AlterField(
model_name='employee',
name='height',
field=models.PositiveSmallIntegerField(verbose_name='height'),
),
migrations.AlterField(
model_name='employee',
name='job',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='employees.Job', verbose_name='job'),
),
migrations.AlterField(
model_name='employee',
name='name',
field=models.CharField(max_length=15, verbose_name='name'),
),
migrations.AlterField(
model_name='employee',
name='sex',
field=models.CharField(choices=[('W', 'Kobieta'), ('M', 'Mężczyzna')], max_length=1, verbose_name='sex'),
),
migrations.AlterField(
model_name='employee',
name='shoe_size',
field=models.PositiveSmallIntegerField(verbose_name='shoe_size'),
),
migrations.AlterField(
model_name='employee',
name='surname',
field=models.CharField(max_length=40, verbose_name='surname'),
),
migrations.AlterField(
model_name='employee',
name='width_waist',
field=models.PositiveSmallIntegerField(verbose_name='width_waist'),
),
migrations.AlterField(
model_name='job',
name='position',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='employees.Position', verbose_name='position'),
),
migrations.AlterField(
model_name='job',
name='size_of_job',
field=models.DecimalField(decimal_places=2, max_digits=3, verbose_name='size_of_job'),
),
migrations.AlterField(
model_name='job',
name='work_place',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='employees.WorkPlace', verbose_name='work_place'),
),
migrations.AlterField(
model_name='manager',
name='body_size',
field=models.CharField(choices=[('XL', 'XL'), ('L', 'L'), ('M', 'M'), ('S', 'S')], max_length=2, verbose_name='body_size'),
),
migrations.AlterField(
model_name='manager',
name='colar',
field=models.PositiveSmallIntegerField(verbose_name='colar'),
),
migrations.AlterField(
model_name='manager',
name='height',
field=models.PositiveSmallIntegerField(verbose_name='height'),
),
migrations.AlterField(
model_name='manager',
name='job',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='employees.Job', verbose_name='job'),
),
migrations.AlterField(
model_name='manager',
name='name',
field=models.CharField(max_length=15, verbose_name='name'),
),
migrations.AlterField(
model_name='manager',
name='sex',
field=models.CharField(choices=[('W', 'Kobieta'), ('M', 'Mężczyzna')], max_length=1, verbose_name='sex'),
),
migrations.AlterField(
model_name='manager',
name='shoe_size',
field=models.PositiveSmallIntegerField(verbose_name='shoe_size'),
),
migrations.AlterField(
model_name='manager',
name='surname',
field=models.CharField(max_length=40, verbose_name='surname'),
),
migrations.AlterField(
model_name='manager',
name='width_waist',
field=models.PositiveSmallIntegerField(verbose_name='width_waist'),
),
migrations.AlterField(
model_name='position',
name='description',
field=models.CharField(max_length=300, verbose_name='description'),
),
migrations.AlterField(
model_name='position',
name='name',
field=models.CharField(max_length=150, verbose_name='name'),
),
migrations.AlterField(
model_name='supervisor',
name='email',
field=models.EmailField(max_length=254, verbose_name='email'),
),
migrations.AlterField(
model_name='supervisor',
name='name',
field=models.CharField(max_length=50, verbose_name='job'),
),
migrations.AlterField(
model_name='supervisor',
name='user',
field=models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL, verbose_name='job'),
),
migrations.AlterField(
model_name='workplace',
name='city',
field=models.CharField(max_length=50, verbose_name='city'),
),
migrations.AlterField(
model_name='workplace',
name='email',
field=models.EmailField(max_length=254, verbose_name='email'),
),
migrations.AlterField(
model_name='workplace',
name='name',
field=models.CharField(max_length=150, verbose_name='name'),
),
migrations.AlterField(
model_name='workplace',
name='phone',
field=models.CharField(max_length=13, verbose_name='phone'),
),
migrations.AlterField(
model_name='workplace',
name='postal_code',
field=models.CharField(max_length=8, verbose_name='postal_code'),
),
migrations.AlterField(
model_name='workplace',
name='street',
field=models.CharField(max_length=50, verbose_name='street'),
),
]
| 37.130682 | 135 | 0.570161 | 615 | 6,535 | 5.873171 | 0.147967 | 0.177187 | 0.221484 | 0.256921 | 0.857697 | 0.83278 | 0.699336 | 0.668051 | 0.62237 | 0.62237 | 0 | 0.01501 | 0.296557 | 6,535 | 175 | 136 | 37.342857 | 0.77072 | 0.006886 | 0 | 0.83432 | 1 | 0 | 0.124075 | 0.003545 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.017751 | 0 | 0.035503 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# ---- src/plotting_helper/plotting_helper.py (repo: davidghobson1/martian, license: MIT) ----
"""Plotting Helper
This module allows for quick and easy plot creation using matplotlib.
Although this module doesn't provide nearly as much flexibility as matplotlib directly,
it does allow you to setup hassle-free plots quickly without having to type out a plethora
of matplotlib plot and formatting commands.
This module is therefore mainly intended as a time-saver and for general data exploration.
For more flexibility and custom plot creation, refer to matplotlib directly.
Filename: plotting_helper.py
Maintainers: David Hobson, Saruggan Thiruchelvan
Last Updated: December 21, 2021
"""
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.ticker import MaxNLocator
##############################################################################################################
## Individual Plotting Functions
##############################################################################################################
def plot_vertical(x_values, y_values, title="", x_label="", y_label="", colour='tab:red', figsize=None, width=0.75, x_label_orientation='horizontal', plot_filepath="", save=False):
"""
Creates a bar chart.
plot_vertical(x_values, y_values, title, x_label, y_label, colour, figsize, width, x_label_orientation, plot_filepath, save)
x_values: list
The x-values to plot.
y_values: list
The y-values to plot.
title: str; optional (default="")
The title for the plot.
x_label: str: optional (default="")
The label for the x-axis.
y_label: str: optional (default="")
The label for the y-axis.
colour: str; optional (default='tab:red')
The colour of the bars.
figsize: (int, int); optional (default=None)
The size of the figure. (Width x Height).
width: float; optional (default=0.75)
The width of each bar. Represents the percentage of available space the bar will take up.
Must be between 0 and 1.
x_label_orientation: str or float; optional (default='horizontal')
        The orientation of the labels on the x-axis. This can be 'vertical' (or 'v'), 'horizontal', or a
        float giving the rotation in degrees.
plot_filepath: str; optional (default="")
The path to save the file to.
save: bool; optional (default=False)
True if the file should be saved to the specified filepath.
"""
if figsize is None:
fig, ax = plt.subplots()
else:
fig, ax = plt.subplots(figsize=figsize)
# plot bar graph
ax.bar(x_values, y_values, width, color=colour)
# format the title, x-axis, y-axis
ax.set_title(title)
# re-orient the x-axis if applicable
if x_label_orientation == 'vertical' or x_label_orientation == 'v':
plt.xticks(rotation="vertical") # makes x-names vertical
    elif isinstance(x_label_orientation, (int, float)):
plt.xticks(rotation=x_label_orientation) # makes x-names slanted at given angle
ax.set_xlabel(x_label)
ax.set_ylabel(y_label)
ax.yaxis.set_major_locator(MaxNLocator(integer=True)) # makes y-axis integer valued
if save:
plt.savefig(plot_filepath, bbox_inches="tight")
plt.show()
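# Added illustration (not part of the original module): a self-contained sketch
# of how the x_label_orientation option used by plot_vertical maps onto a
# matplotlib rotation value -- 'vertical'/'v' become 90 degrees, numeric values
# pass through unchanged, and anything else is treated as horizontal (0 degrees).
def _rotation_for(x_label_orientation):
    """Return the xtick rotation (in degrees) implied by an x_label_orientation value."""
    if x_label_orientation in ("vertical", "v"):
        return 90.0
    if isinstance(x_label_orientation, (int, float)):
        return float(x_label_orientation)
    return 0.0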
def plot_horizontal(x_values, y_values, title="", x_label="", y_label="", colour='tab:red', figsize=None, plot_filepath="", save=False):
"""
Creates a horizontal bar chart.
plot_horizontal(x_values, y_values, title, x_label, y_label, colour, figsize, plot_filepath, save)
x_values: list
The x-values to plot.
y_values: list
The y-values to plot.
title: str; optional (default="")
The title for the plot.
x_label: str: optional (default="")
The label for the x-axis.
y_label: str: optional (default="")
The label for the y-axis.
colour: str; optional (default='tab:red')
The colour of the bars.
figsize: (int, int); optional (default=None)
The size of the figure. (Width x Height).
plot_filepath: str; optional (default="")
The path to save the file to.
save: bool; optional (default=False)
True if the file should be saved to the specified filepath.
"""
if figsize is None:
fig, ax = plt.subplots()
else:
fig, ax = plt.subplots(figsize=figsize)
y_positions = np.arange(len(y_values)) # set the positions along the y-axis
# plot the bars
ax.barh(y_positions, x_values, align='center', color=colour)
# format the plot
ax.set_title(title)
ax.set_yticks(y_positions)
ax.set_yticklabels(y_values)
ax.invert_yaxis() # labels read top-to-bottom
ax.set_xlabel(x_label)
ax.set_ylabel(y_label)
ax.xaxis.set_major_locator(MaxNLocator(integer=True)) # makes x-axis integer valued
# save if desired
if save:
plt.savefig(plot_filepath, bbox_inches="tight")
plt.show()
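# Added illustration (not part of the original module): a self-contained sketch
# of the barh recipe that plot_horizontal wraps. It uses hypothetical data and
# the non-interactive Agg backend, and returns the y-tick labels so the result
# can be inspected without opening a window.
def _horizontal_bar_sketch(labels=("alpha", "beta", "gamma"), values=(3, 5, 2)):
    import matplotlib
    matplotlib.use("Agg")  # headless backend; safe to run without a display
    import matplotlib.pyplot as plt
    import numpy as np
    fig, ax = plt.subplots()
    y_positions = np.arange(len(labels))  # one slot per label along the y-axis
    ax.barh(y_positions, values, align="center")
    ax.set_yticks(y_positions)
    ax.set_yticklabels(labels)
    ax.invert_yaxis()  # labels read top-to-bottom, as in plot_horizontal
    tick_texts = [t.get_text() for t in ax.get_yticklabels()]
    plt.close(fig)
    return tick_texts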
##############################################################################################################
## Group Functions
##############################################################################################################
def plot_groups_stacked(group_data, group_names, title="", x_label="", y_label="", colours=None, figsize=None, width=0.75, orientation='vertical', x_label_orientation='horizontal', plot_filepath="", save=False):
    """
    Create a stacked bar chart. That is, a bar chart where the data for the same x-value from different groups is stacked into the same bar.
    plot_groups_stacked(group_data, group_names, title, x_label, y_label, colours, figsize, width, orientation, x_label_orientation, plot_filepath, save)
    group_data: {str: [float]}
        The data for each group.
        The keys of the dict are the x-values to be plotted.
        The values correspond to the list of y-values; one for each group. These lists must be the same length as the group_names argument.
    group_names: [str]
        The list of the group names. These are shown in the legend.
    title: str; optional (default="")
        The title for the plot.
    x_label: str; optional (default="")
        The label for the x-axis.
    y_label: str; optional (default="")
        The label for the y-axis.
    colours: [str]
        The list of bar colours for each of the groups. This must be the same length as the group_names argument.
    figsize: (int, int); optional (default=None)
        The size of the figure. (Width x Height).
    width: float; optional (default=0.75)
        The width of each bar. Only applicable to vertical bar charts. Represents the percentage of available space the bar will take up.
        Must be between 0 and 1.
    orientation: str; optional (default='vertical')
        The orientation of bars on the bar chart. Must be either 'vertical' or 'v' (for vertical bars) or 'horizontal' or 'h'
        for horizontal bars.
    x_label_orientation: str or float; optional (default='horizontal')
        The orientation of the labels on the x-axis. Only applicable to vertical bar charts. This can either be 'vertical' or 'horizontal' or can be a
        float indicating the degree of rotation.
    plot_filepath: str; optional (default="")
        The path to save the file to.
    save: bool; optional (default=False)
        True if the file should be saved to the specified filepath.
    """
    if not isinstance(colours, list):
        print("Error: The colours argument needs to be a list, and have the same length as group_names")
        return
    elif len(colours) != len(group_names):
        print("Error: The colours argument needs to have the same length as group_names")
        return
    if orientation == 'vertical' or orientation == 'v':
        return __plot_groups_stacked_vertical(group_data, group_names, title=title, x_label=x_label, y_label=y_label, colours=colours, figsize=figsize, width=width, x_label_orientation=x_label_orientation, plot_filepath=plot_filepath, save=save)
    elif orientation == 'horizontal' or orientation == 'h':
        return __plot_groups_stacked_horizontal(group_data, group_names, title=title, x_label=x_label, y_label=y_label, colours=colours, figsize=figsize, plot_filepath=plot_filepath, save=save)
    else:
        print("Orientation can only be either 'vertical' or 'horizontal' (or 'v' or 'h' for short).")
        return
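The shape contract between `group_data`, `group_names`, and `colours` is the easiest thing to get wrong when calling `plot_groups_stacked`. A minimal, dependency-free sketch of a well-formed call's inputs (the data here is hypothetical; no plotting is involved):

```python
# Hypothetical data matching the documented contract:
# each dict value holds one y-value per group, in group order.
group_data = {"apples": [3.0, 1.0], "pears": [2.0, 4.0]}
group_names = ["2020", "2021"]
colours = ["tab:blue", "tab:orange"]

# The validation performed by plot_groups_stacked boils down to:
assert isinstance(colours, list) and len(colours) == len(group_names)
# And each label's value list must line up with the groups:
assert all(len(v) == len(group_names) for v in group_data.values())
```

With inputs shaped like this, `plot_groups_stacked(group_data, group_names, colours=colours)` would dispatch to the vertical helper by default.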
def __plot_groups_stacked_vertical(group_data, group_names, title="", x_label="", y_label="", colours=None, figsize=None, width=0.75, x_label_orientation='horizontal', plot_filepath="", save=False):
    """
    <PRIVATE> Helper function to create a stacked vertical bar chart.
    __plot_groups_stacked_vertical(group_data, group_names, title, x_label, y_label, colours, figsize, width, x_label_orientation, plot_filepath, save)
    group_data: {str: [float]}
        The data for each group.
        The keys of the dict are the x-values to be plotted.
        The values correspond to the list of y-values; one for each group. These lists must be the same length as
        the group_names argument.
    group_names: [str]
        The list of the group names. These are shown in the legend.
    title: str; optional (default="")
        The title for the plot.
    x_label: str; optional (default="")
        The label for the x-axis.
    y_label: str; optional (default="")
        The label for the y-axis.
    colours: [str]
        The list of bar colours for each of the groups.
    figsize: (int, int); optional (default=None)
        The size of the figure. (Width x Height).
    width: float; optional (default=0.75)
        The width of each bar. Represents the percentage of available space the bar will take up.
        Must be between 0 and 1.
    x_label_orientation: str or float; optional (default='horizontal')
        The orientation of the labels on the x-axis. This can either be 'vertical' or 'horizontal' or can be a
        float indicating the degree of rotation.
    plot_filepath: str; optional (default="")
        The path to save the file to.
    save: bool; optional (default=False)
        True if the file should be saved to the specified filepath.
    """
    # get the values for the plot
    labels = group_data.keys()  # x-values
    data = np.array(list(group_data.values()))  # y-values for each group as an array
    data_cum = data.cumsum(axis=1)  # cumulative sum of the y-values for each group (needed to stack the bars)
    # choose a figure size
    if figsize is None:
        fig, ax = plt.subplots()
    else:
        fig, ax = plt.subplots(figsize=figsize)
    # adjust the colours
    if not isinstance(colours, list):
        if isinstance(colours, str):
            colours = [colours for i in group_names]
        else:
            print("The colours argument needs to be a string or a list. Using the default red colour.")
            colours = ['tab:red' for i in group_names]
    # create and plot each bar
    for i in range(len(group_names)):
        ax.bar(labels, data[:, i], width, color=colours[i], bottom=data_cum[:, i] - data[:, i], label=group_names[i])
    # re-orient the x-axis if applicable
    if x_label_orientation == 'vertical' or x_label_orientation == 'v':
        plt.xticks(rotation="vertical")  # makes x-names vertical
    elif isinstance(x_label_orientation, (int, float)):
        plt.xticks(rotation=x_label_orientation)  # makes x-names slanted at the given angle
    # format the plot
    ax.set_title(title)
    ax.set_xlabel(x_label)
    ax.set_ylabel(y_label)
    ax.yaxis.set_major_locator(MaxNLocator(integer=True))  # makes y-axis integer valued
    ax.legend()
    # optionally save the figure
    if save:
        plt.savefig(plot_filepath, bbox_inches="tight")
    plt.show()
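The `cumsum` bookkeeping is the whole stacking trick: each group's segment starts where the previous groups' running total left off, which is what the `bottom=data_cum[:, i] - data[:, i]` expression computes. The same arithmetic for a single label, sketched without numpy using `itertools.accumulate` (the numbers are hypothetical):

```python
from itertools import accumulate

# One label's y-values, one entry per group (hypothetical data).
row = [3.0, 2.0, 4.0]
cum = list(accumulate(row))                  # running totals, like one row of data_cum
bottoms = [c - v for c, v in zip(cum, row)]  # where each stacked segment starts
tops = cum                                   # where each stacked segment ends
print(bottoms)  # [0.0, 3.0, 5.0]
```

Each segment's `bottom` is the sum of all earlier groups, so the bars tile the column with no gaps or overlaps.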
def __plot_groups_stacked_horizontal(group_data, group_names, title="", x_label="", y_label="", colours=None, figsize=None, plot_filepath='', save=False):
    """
    <PRIVATE> Helper function to create a horizontal stacked bar chart.
    __plot_groups_stacked_horizontal(group_data, group_names, title, x_label, y_label, colours, figsize, plot_filepath, save)
    group_data: {str: [float]}
        The data for each group.
        The keys of the dict are the x-values to be plotted.
        The values correspond to the list of y-values; one for each group. These lists must be the same length as
        the group_names argument.
    group_names: [str]
        The list of the group names. These are shown in the legend.
    title: str; optional (default="")
        The title for the plot.
    x_label: str; optional (default="")
        The label for the x-axis.
    y_label: str; optional (default="")
        The label for the y-axis.
    colours: [str]
        The list of bar colours for each of the groups. This must be the same length as the group_names argument.
    figsize: (int, int); optional (default=None)
        The size of the figure. (Width x Height).
    plot_filepath: str; optional (default="")
        The path to save the file to.
    save: bool; optional (default=False)
        True if the file should be saved to the specified filepath.
    """
    # get the values for the groups
    labels = list(group_data.keys())  # x-values
    data = np.array(list(group_data.values()))  # y-values for each group as an array
    data_cum = data.cumsum(axis=1)  # cumulative sum of the y-values for each group (needed to stack the bars)
    # choose a figure size
    if figsize is None:
        fig, ax = plt.subplots()
    else:
        fig, ax = plt.subplots(figsize=figsize)
    # adjust the colours
    if not isinstance(colours, list):
        if isinstance(colours, str):
            colours = [colours for i in group_names]
        else:
            print("The colours argument needs to be a string or a list. Using the default red colour.")
            colours = ['tab:red' for i in group_names]
    # invert the axes
    ax.invert_yaxis()
    # format the plot
    ax.set_title(title)
    ax.set_xlabel(x_label)
    ax.set_ylabel(y_label)
    ax.xaxis.set_major_locator(MaxNLocator(integer=True))  # makes x-axis integer valued
    ax.set_xlim(0, np.sum(data, axis=1).max())
    # add the stacked bars, one group at a time
    for i, (colname, colour) in enumerate(zip(group_names, colours)):
        widths = data[:, i]
        starts = data_cum[:, i] - widths
        ax.barh(labels, widths, left=starts, height=0.5,
                label=colname, color=colour)
    ax.legend(loc='best')
    # optionally save the figure
    if save:
        plt.savefig(plot_filepath, bbox_inches="tight")
    plt.show()
def plot_groups_clustered(group_data, group_names, title="", x_label="", y_label="", colours=None, figsize=None, plot_filepath="", save=False):
    """
    Creates a clustered bar plot. That is, a bar chart with multiple adjacent bars corresponding to data from different groups.
    plot_groups_clustered(group_data, group_names, title, x_label, y_label, colours, figsize, plot_filepath, save)
    group_data: {str: [float]}
        The data for each group.
        The keys of the dict are the x-values to be plotted.
        The values correspond to the list of y-values; one for each group. These lists must be the same length as the group_names argument.
    group_names: [str]
        The list of the group names. These are shown in the legend.
    title: str; optional (default="")
        The title for the plot.
    x_label: str; optional (default="")
        The label for the x-axis.
    y_label: str; optional (default="")
        The label for the y-axis.
    colours: [str]
        The list of bar colours for each of the groups. This must be the same length as the group_names argument.
    figsize: (int, int); optional (default=None)
        The size of the figure. (Width x Height).
    plot_filepath: str; optional (default="")
        The path to save the file to.
    save: bool; optional (default=False)
        True if the file should be saved to the specified filepath.
    """
    if not isinstance(colours, list):
        print("Error: The colours argument needs to be a list, and have the same length as group_names")
        return
    elif len(colours) != len(group_names):
        print("Error: The colours argument needs to have the same length as group_names")
        return
    labels = group_data.keys()
    # group the values by group as opposed to by label
    group_percentages = [[group_data[label][i] for label in group_data] for i in range(len(group_names))]
    N = len(labels)
    x_values = np.arange(N)  # array of x-values where each new cluster starts
    width = 0.15  # width of each individual bar
    fig = plt.figure(figsize=figsize)
    # plot the results
    for i in range(len(group_names)):
        # plot the results for one group
        plt.bar(x_values + i*width, group_percentages[i], width, color=colours[i], label=group_names[i])
    # set the title, y-axis, x-axis, and legend
    plt.title(title)
    plt.xlabel(x_label)
    plt.ylabel(y_label)
    plt.xticks(x_values + (len(group_names) - 1)*width / 2, tuple(labels), rotation=45)
    plt.legend(loc='best')
    # optionally save the image
    if save:
        plt.savefig(plot_filepath, bbox_inches="tight")
    plt.show()
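The offset arithmetic in the loop above places group i's bars at `x + i*width`, and the `plt.xticks(...)` call re-centres each label under its whole cluster rather than under the first bar. The same arithmetic, sketched with hypothetical numbers and no plotting:

```python
# Three labels, two groups, bar width 0.15 (mirroring the constant above).
N, n_groups, width = 3, 2, 0.15
x_values = list(range(N))                        # cluster origins, like np.arange(N)
bars_group1 = [x + 1 * width for x in x_values]  # positions of the second group's bars
# centre of each cluster = origin + (n_groups - 1) * width / 2
tick_centres = [x + (n_groups - 1) * width / 2 for x in x_values]
```

So the second group's bars sit 0.15 to the right of each cluster origin, and the tick labels sit halfway between the two bars.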
def plot_group_results_individually(group_results, title_template="", x_label="", y_label="", colours='tab:red', figsize=None, width=0.75, orientation='vertical', x_label_orientation='horizontal', plot_filepath_template="", save=False):
    """
    Plot multiple bar charts for different groups at once. One bar chart is created for each group.
    plot_group_results_individually(group_results, title_template, x_label, y_label, colours, figsize, width, orientation, x_label_orientation, plot_filepath_template, save)
    group_results: {str: ([object], [object])}
        The data for the groups.
        The keys are the group names.
        The values are tuples containing the x- and y-values for that group (each as a list). ([x-values], [y-values]).
    title_template: str; optional (default="")
        The template for the titles of the plots. Anywhere there is an asterisk (*) in the title template, that character will be replaced by the group name.
    x_label: str; optional (default="")
        The label for the x-axis.
    y_label: str; optional (default="")
        The label for the y-axis.
    colours: str or [str]
        The colour, or list of colours, to be used for the plots. If one colour is specified, it will be used as the colour for all the plots. If a list is used,
        it need not have the same length as the number of groups. The colours will be cycled through each of the plots.
    figsize: (int, int); optional (default=None)
        The size of the figure. (Width x Height).
    width: float; optional (default=0.75)
        The width of each bar. Only applicable for vertical bar charts. Represents the percentage of available space the bar will take up.
        Must be between 0 and 1.
    orientation: str; optional (default='vertical')
        The orientation of bars in the bar charts. Must be either 'vertical' or 'v' (for vertical bars) or 'horizontal' or 'h' for horizontal bars.
    x_label_orientation: str or float; optional (default='horizontal')
        The orientation of the labels on the x-axis. Only applicable for vertical bar charts. This can either be 'vertical' or 'horizontal' or can be a
        float indicating the degree of rotation.
    plot_filepath_template: str; optional (default="")
        The template for the filepaths of the plots. Anywhere there is an asterisk (*) in the template, that character will be replaced by the group name.
    save: bool; optional (default=False)
        True if the file should be saved to the specified filepath.
    """
    if orientation == 'vertical' or orientation == 'v':
        return __plot_group_results_individually_vertical(group_results, title_template=title_template, x_label=x_label, y_label=y_label, colours=colours, width=width, x_label_orientation=x_label_orientation, figsize=figsize, plot_filepath_template=plot_filepath_template, save=save)
    elif orientation == 'horizontal' or orientation == 'h':
        return __plot_group_results_individually_horizontal(group_results, title_template=title_template, x_label=x_label, y_label=y_label, colours=colours, figsize=figsize, plot_filepath_template=plot_filepath_template, save=save)
    else:
        print("Orientation can only be either 'vertical' or 'horizontal' (or 'v' or 'h' for short).")
        return
def __plot_group_results_individually_vertical(group_results, title_template="", x_label="", y_label="", colours=None, figsize=None, width=0.75, x_label_orientation='horizontal', plot_filepath_template="", save=False):
    """
    <PRIVATE> Helper function to plot multiple vertical bar charts for different groups at once.
    __plot_group_results_individually_vertical(group_results, title_template, x_label, y_label, colours, figsize, width, x_label_orientation, plot_filepath_template, save)
    group_results: {str: ([object], [object])}
        The data for the groups.
        The keys are the group names.
        The values are tuples containing the x- and y-values for that group (each as a list). ([x-values], [y-values]).
    title_template: str; optional (default="")
        The template for the titles of the plots. Anywhere there is an asterisk (*) in the title template, that character will be replaced by the group name.
    x_label: str; optional (default="")
        The label for the x-axis.
    y_label: str; optional (default="")
        The label for the y-axis.
    colours: str or [str]
        The colour, or list of colours, to be used for the plots. If one colour is specified, it will be used as the colour for all the plots. If a list is used,
        it need not have the same length as the number of groups. The colours will be cycled through each of the plots.
    figsize: (int, int); optional (default=None)
        The size of the figure. (Width x Height).
    width: float; optional (default=0.75)
        The width of each bar. Represents the percentage of available space the bar will take up.
        Must be between 0 and 1.
    x_label_orientation: str or float; optional (default='horizontal')
        The orientation of the labels on the x-axis. This can either be 'vertical' or 'horizontal' or can be a
        float indicating the degree of rotation.
    plot_filepath_template: str; optional (default="")
        The template for the filepaths of the plots. Anywhere there is an asterisk (*) in the template, that character will be replaced by the group name.
    save: bool; optional (default=False)
        True if the file should be saved to the specified filepath.
    """
    group_names = list(group_results.keys())
    results = list(group_results.values())
    if not isinstance(colours, list):
        if isinstance(colours, str):
            colours = [colours]
        else:
            print("The colours argument needs to be a string or a list. Using the default red colour.")
            colours = ['tab:red']
    # plot results for each group
    for i in range(len(group_results)):
        plot_vertical(x_values=results[i][0], y_values=results[i][1], title=title_template.replace('*', group_names[i]), x_label=x_label, y_label=y_label, colour=colours[i % len(colours)], figsize=figsize, width=width, x_label_orientation=x_label_orientation, plot_filepath=plot_filepath_template.replace('*', group_names[i]), save=save)
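Both `title_template` and `plot_filepath_template` rely on a plain single-character substitution: every asterisk is replaced by the group name via `str.replace`. A one-line sketch with a hypothetical template and group name:

```python
# Hypothetical template and group name; mirrors title_template.replace('*', group_name).
title_template = "Results for *"
group_name = "Group A"
title = title_template.replace('*', group_name)
print(title)  # Results for Group A
```

One consequence of using `str.replace` is that every asterisk in the template is substituted, not just the first.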
def __plot_group_results_individually_horizontal(group_results, title_template="", x_label="", y_label="", colours=None, figsize=None, plot_filepath_template="", save=False):
    """
    <PRIVATE> Helper function to plot multiple horizontal bar charts for different groups at once.
    __plot_group_results_individually_horizontal(group_results, title_template, x_label, y_label, colours, figsize, plot_filepath_template, save)
    group_results: {str: ([object], [object])}
        The data for the groups.
        The keys are the group names.
        The values are tuples containing the x- and y-values for that group (each as a list). ([x-values], [y-values]).
    title_template: str; optional (default="")
        The template for the titles of the plots. Anywhere there is an asterisk (*) in the title template, that character will be replaced by the group name.
    x_label: str; optional (default="")
        The label for the x-axis.
    y_label: str; optional (default="")
        The label for the y-axis.
    colours: str or [str]
        The colour, or list of colours, to be used for the plots. If one colour is specified, it will be used as the colour for all the plots. If a list is used,
        it need not have the same length as the number of groups. The colours will be cycled through each of the plots.
    figsize: (int, int); optional (default=None)
        The size of the figure. (Width x Height).
    plot_filepath_template: str; optional (default="")
        The template for the filepaths of the plots. Anywhere there is an asterisk (*) in the template, that character will be replaced by the group name.
    save: bool; optional (default=False)
        True if the file should be saved to the specified filepath.
    """
    group_names = list(group_results.keys())
    results = list(group_results.values())
    if not isinstance(colours, list):
        if isinstance(colours, str):
            colours = [colours]
        else:
            print("The colours argument needs to be a string or a list. Using the default red colour.")
            colours = ['tab:red']
    # plot the results for each group (note the x/y swap: the y-values become the bar lengths)
    for i in range(len(results)):
        plot_horizontal(x_values=results[i][1], y_values=results[i][0], title=title_template.replace('*', group_names[i]), x_label=x_label, y_label=y_label, colour=colours[i % len(colours)], figsize=figsize, plot_filepath=plot_filepath_template.replace('*', group_names[i]), save=save)
# --- File: python/oneflow/nn/modules/comparison.py (repo: Warmchay/oneflow, licence: Apache-2.0) ---
"""
Copyright 2020 The OneFlow Authors. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import oneflow as flow
from oneflow.framework.tensor import register_tensor_op
@register_tensor_op("eq")
def eq_op(input, other):
"""
Computes element-wise equality.
The second argument can be a number or a tensor whose shape is broadcastable with the first argument.
Args:
input (oneflow.Tensor): the tensor to compare
other (oneflow.Tensor, float or int): the target to compare
Returns:
- A boolean tensor that is True where :attr:`input` is equal to :attr:`other` and False elsewhere
For example:
.. code-block:: python
>>> import oneflow as flow
>>> import numpy as np
>>> input = flow.Tensor(np.array([2, 3, 4, 5]), dtype=flow.float32)
>>> other = flow.Tensor(np.array([2, 3, 4, 1]), dtype=flow.float32)
>>> y = flow.eq(input, other)
>>> y
tensor([1, 1, 1, 0], dtype=oneflow.int8)
"""
return flow._C.equal(input, other)
@register_tensor_op("ne")
def ne_op(input, other):
"""
Computes element-wise not equality.
The second argument can be a number or a tensor whose shape is broadcastable with the first argument.
Args:
input (oneflow.Tensor): the tensor to compare
other (oneflow.Tensor, float or int): the target to compare
Returns:
- A boolean tensor that is True where :attr:`input` is not equal to :attr:`other` and False elsewhere
For example:
.. code-block:: python
>>> import oneflow as flow
>>> import numpy as np
>>> input = flow.Tensor(np.array([2, 3, 4, 5]), dtype=flow.float32)
>>> other = flow.Tensor(np.array([2, 3, 4, 1]), dtype=flow.float32)
>>> y = flow.ne(input, other)
>>> y
tensor([0, 0, 0, 1], dtype=oneflow.int8)
"""
return flow._C.not_equal(input, other)
def greater_op(input, other):
    """Returns the truth value of :math:`input > other` element-wise.

    Args:
        input (oneflow.Tensor): A Tensor
        other (oneflow.Tensor): A Tensor

    Returns:
        oneflow.Tensor: A Tensor with int8 type.

    For example:

    .. code-block:: python

        >>> import numpy as np
        >>> import oneflow as flow
        >>> input1 = flow.Tensor(np.random.randn(2, 6, 5, 3), dtype=flow.float32)
        >>> input2 = flow.Tensor(np.random.randn(2, 6, 5, 3), dtype=flow.float32)
        >>> out = flow.gt(input1, input2).shape
        >>> out
        flow.Size([2, 6, 5, 3])

    """
    return flow._C.greater(input, other)
@register_tensor_op("gt")
def greater_op_tensor(input, other):
"""
gt() -> Tensor
See :func:`oneflow.gt`
"""
return greater_op(input, other)
def greater_equal_op(input, other):
    """Returns the truth value of :math:`input >= other` element-wise.

    Args:
        input (oneflow.Tensor): A Tensor
        other (oneflow.Tensor): A Tensor

    Returns:
        oneflow.Tensor: A Tensor with int8 type.

    For example:

    .. code-block:: python

        >>> import numpy as np
        >>> import oneflow as flow
        >>> input1 = flow.Tensor(np.array([1, 2, 3]).astype(np.float32), dtype=flow.float32)
        >>> input2 = flow.Tensor(np.array([1, 1, 4]).astype(np.float32), dtype=flow.float32)
        >>> out = flow.ge(input1, input2)
        >>> out
        tensor([1, 1, 0], dtype=oneflow.int8)

    """
    return flow._C.greater_equal(input, other)
@register_tensor_op("ge")
def greater_equal_op_tensor(input, other):
"""
ge() -> Tensor
See :func:`oneflow.ge`
"""
return greater_equal_op(input, other)
@register_tensor_op("lt")
def less_op(input, other):
"""Returns the truth value of :math:`input < other` element-wise.
Args:
input (oneflow.Tensor): A Tensor
other (oneflow.Tensor): A Tensor
Returns:
oneflow.Tensor: A Tensor with int8 type.
For example:
.. code-block:: python
>>> import numpy as np
>>> import oneflow as flow
>>> input1 = flow.Tensor(np.array([1, 2, 3]).astype(np.float32), dtype=flow.float32)
>>> input2 = flow.Tensor(np.array([1, 2, 4]).astype(np.float32), dtype=flow.float32)
>>> out = flow.lt(input1, input2)
>>> out
tensor([0, 0, 1], dtype=oneflow.int8)
"""
return flow._C.less(input, other)
@register_tensor_op("le")
def less_equal_op(input, other):
"""Returns the truth value of :math:`input <= other` element-wise.
Args:
input (oneflow.Tensor): A Tensor
other (oneflow.Tensor): A Tensor
Returns:
oneflow.Tensor: A Tensor with int8 type.
For example:
.. code-block:: python
>>> import numpy as np
>>> import oneflow as flow
>>> input1 = flow.Tensor(np.array([1, 2, 3]).astype(np.float32), dtype=flow.float32)
>>> input2 = flow.Tensor(np.array([1, 1, 4]).astype(np.float32), dtype=flow.float32)
>>> out = flow.le(input1, input2)
>>> out
tensor([1, 0, 1], dtype=oneflow.int8)
"""
return flow._C.less_equal(input, other)
@register_tensor_op("ne")
def ne_op(input, other):
"""
Computes element-wise not equality.
The second argument can be a number or a tensor whose shape is broadcastable with the first argument.
Args:
input (oneflow.Tensor): the tensor to compare
other (oneflow.Tensor, float or int): the target to compare
Returns:
- A boolean tensor that is True where :attr:`input` is not equal to :attr:`other` and False elsewhere
For example:
.. code-block:: python
>>> import oneflow as flow
>>> import numpy as np
>>> input = flow.Tensor(np.array([2, 3, 4, 5]), dtype=flow.float32)
>>> other = flow.Tensor(np.array([2, 3, 4, 1]), dtype=flow.float32)
>>> y = flow.ne(input, other)
>>> y
tensor([0, 0, 0, 1], dtype=oneflow.int8)
"""
return flow._C.not_equal(input, other)
if __name__ == "__main__":
import doctest
doctest.testmod(raise_on_error=True)
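Each of the wrappers above delegates to a `flow._C` kernel that compares element-wise and returns an int8 tensor of 0s and 1s. Without OneFlow installed, the 1-D semantics are easy to mirror in plain Python (hypothetical lists standing in for tensors; this sketch ignores broadcasting, which the real ops also support):

```python
# Mirrors flow.eq / flow.ne on 1-D data: elementwise compare, 0/1 results,
# matching the doctest outputs above.
a = [2, 3, 4, 5]
b = [2, 3, 4, 1]
eq = [int(x == y) for x, y in zip(a, b)]  # -> [1, 1, 1, 0], like the flow.eq doctest
ne = [int(x != y) for x, y in zip(a, b)]  # -> [0, 0, 0, 1], like the flow.ne doctest
```

The remaining ops (`gt`, `ge`, `lt`, `le`) follow the same pattern with `>`, `>=`, `<`, and `<=`.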
# --- File: quantarhei/builders/__init__.py (repo: slamavl/quantarhei, licence: MIT) ---
# -*- coding: utf-8 -*-
from .modes import Mode
from .molecules import Molecule
from .aggregates import Aggregate
from .pdb import PDBFile
from .aggregate_states import ElectronicState
from .aggregate_states import VibronicState
# --- File: mitty/analysis/__init__.py (repo: ozemsbg/Mitty, licence: Apache-2.0) ---
from mitty.analysis.bamtoolz import *
from mitty.analysis.bamfilters import *
from mitty.analysis.aaftoolz import *
# --- File: test/test_spellcheck.py (repo: kratel/nyu_appsec_a2, licence: MIT) ---
"""
Tests the spellcheck module of the spellcheckapp.
Makes use of flask's test client to perform integration tests.
"""
import os
import pathlib
import sys
import tempfile
import unittest
from unittest.mock import patch
import app
import bs4
parent_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(parent_dir)
beautifulsoup = bs4.BeautifulSoup
spellcheck_path = './spell_check.out'
wordlist_path = 'wordlist.txt'
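The tests below repeatedly scrape the CSRF token out of a rendered form with BeautifulSoup (`soup.find_all('input', id='csrf_token')[0]['value']`). The same extraction can be sketched with only the standard library's `html.parser`, which is handy when bs4 is unavailable (the HTML snippet here is hypothetical):

```python
from html.parser import HTMLParser


class CSRFTokenParser(HTMLParser):
    """Collects the value of the first <input id="csrf_token"> it sees."""

    def __init__(self):
        super().__init__()
        self.token = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input" and attrs.get("id") == "csrf_token" and self.token is None:
            self.token = attrs.get("value")


html = '<form><input id="csrf_token" type="hidden" value="abc123"></form>'  # hypothetical page
parser = CSRFTokenParser()
parser.feed(html)
print(parser.token)  # abc123
```

BeautifulSoup is still the more robust choice for real responses (it tolerates malformed markup); this is only a dependency-free equivalent of the lookup the helpers perform.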
class TestAuth(unittest.TestCase):
"""Groups Spellcheck tests to use the same test client."""
def setUp(self):
"""
Runs before each test.
Creates test flask client, using a test config.
Creates temporary sqlite file.
"""
db_fd, database_name = tempfile.mkstemp()
test_config = {"SECRET_KEY": 'test',
"TESTING": True,
"SQLALCHEMY_DATABASE_URI": 'sqlite:///' + database_name,
"SQLALCHEMY_TRACK_MODIFICATIONS": False,
"SPELLCHECK": spellcheck_path,
"WORDLIST": wordlist_path,
"SESSION_COOKIE_HTTPONLY": True,
"SESSION_COOKIE_SAMESITE": 'Lax',
"REMEMBER_COOKIE_HTTPONLY": True}
base_app = app.create_app(test_config)
self.app = base_app.test_client()
self.db_fd = db_fd
self.database_name = database_name
self.base_app = base_app
def tearDown(self):
"""Tears down the test client and removes the sqlite file."""
os.close(self.db_fd)
os.unlink(self.database_name)
# Helper Funcs
def register(self, uname, pword, csrf_token=""):
"""Helper function to issue a register request."""
pdata = {"username": uname,
"password": pword,
"csrf_token": csrf_token}
return self.app.post(
'/register',
data=pdata,
follow_redirects=True
)
def login(self, uname, pword, mfa="", csrf_token=""):
"""Helper function to issue a login request."""
if mfa:
pdata = {"username": uname,
"password": pword,
"mfa": mfa,
"csrf_token": csrf_token}
else:
pdata = {"username": uname,
"password": pword,
"csrf_token": csrf_token}
return self.app.post(
'/login',
data=pdata,
follow_redirects=True
)
def logout(self):
"""Helper function to issue a logout request."""
return self.app.get(
'/logout',
follow_redirects=True
)
def spell_check_text(self, inputtext="", csrf_token=""):
"""Helper function to issue a spell check submission."""
if inputtext:
pdata = {"inputtext": inputtext,
"csrf_token": csrf_token}
else:
pdata = {"csrf_token": csrf_token}
return self.app.post(
'/spell_check',
data=pdata,
follow_redirects=True
)
def query_user_spell_history(self, userquery=None, csrf_token=""):
"""Helper function to issue a request to view spell check history."""
if userquery:
pdata = {"userquery": userquery,
"csrf_token": csrf_token}
else:
pdata = {"csrf_token": csrf_token}
return self.app.post(
'/history',
data=pdata,
follow_redirects=True
)
# Tests Start
def test_spell_check_get_no_login(self):
"""Tests that spell_check page will redirect to login if a guest user visits."""
response = self.app.get('/spell_check', follow_redirects=True)
self.assertEqual(response.status_code, 200)
soup = beautifulsoup(response.data, 'html.parser')
results = soup.find_all('title')
# Should have redirected to login page
self.assertTrue(any(("Log In" in s.text) for s in results))

    def test_spell_check_get_with_login(self):
        """Tests that spell_check page is retrieved successfully while logged in."""
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")

    def test_spell_check_csrf(self):
        """Tests that csrf token is required for spell_check form submission."""
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        self.assertGreater(len(soup.find_all('input', id='csrf_token')), 0, "No csrf token found in input.")

    @patch('subprocess.Popen')
    @patch('tempfile.TemporaryFile', unittest.mock.mock_open(read_data=b''))
    def test_mock_spell_check_basic_input(self, subproc):
        """Mocks the spell check executable and tests that a submission of correctly spelled words returns an element with id 'no_misspelled'."""  # noqa: E501
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")
        # Setup mocks
        subproc.return_value = unittest.mock.MagicMock()
        # Submit some text to spell checker
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        inputtext = " Some correct words"
        response = self.spell_check_text(inputtext=inputtext, csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        self.assertTrue(subproc.called)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('p', id="textout")
        self.assertTrue(inputtext in results.text)
        results = soup.find('p', id="no_misspelled")
        self.assertTrue(results)

    @patch('subprocess.Popen')
    @patch('tempfile.TemporaryFile', unittest.mock.mock_open(read_data=b'flkfkef\nlkferf\n'))
    def test_mock_spell_check_basic_misspelled_input(self, subproc):
        """Mocks the spell check executable and tests that a submission containing misspelled words returns the list of misspelled words."""  # noqa: E501
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")
        # Setup mocks
        subproc.return_value = unittest.mock.MagicMock()
        # Submit some text to spell checker
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        inputtext = " Some incorrect words "
        misspelled_words = "flkfkef lkferf"
        inputtext += misspelled_words
        response = self.spell_check_text(inputtext=inputtext, csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        self.assertTrue(subproc.called)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('p', id="textout")
        self.assertEqual(inputtext, results.text.replace("'", ''))
        results = soup.find('p', id="misspelled")
        misspelled_words_out = results.text.split(", ")
        self.assertEqual(len(misspelled_words_out), 2, "Did not return expected number of misspelled words")
        for word in misspelled_words_out:
            self.assertTrue(word in misspelled_words, "Expected misspelled word not returned.")

    @patch('subprocess.Popen')
    @patch('tempfile.TemporaryFile', unittest.mock.mock_open(read_data=b'flkfkef\nlkferf\n'))
    def test_mock_spell_check_history(self, subproc):
        """Mocks the spell check executable and tests that spell check history renders correctly."""
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")
        # Setup mocks
        subproc.return_value = unittest.mock.MagicMock()
        # Submit some text to spell checker
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        inputtext = " Some incorrect words "
        misspelled_words = "flkfkef lkferf"
        inputtext += misspelled_words
        response = self.spell_check_text(inputtext=inputtext, csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        self.assertTrue(subproc.called)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('p', id="textout")
        self.assertEqual(inputtext, results.text.replace("'", ''))
        results = soup.find('p', id="misspelled")
        misspelled_words_out = results.text.split(", ")
        self.assertEqual(len(misspelled_words_out), 2, "Did not return expected number of misspelled words")
        for word in misspelled_words_out:
            self.assertTrue(word in misspelled_words, "Expected misspelled word not returned.")
        # Now test that the spell check history page stored our query.
        response = self.app.get('/history', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('h3', id="numqueries")
        self.assertEqual("1", results.text)
        results = soup.find('a', id="query1")
        self.assertEqual("Query 1", results.text)

    @patch('subprocess.Popen')
    @patch('tempfile.TemporaryFile', unittest.mock.mock_open(read_data=b'flkfkef\nlkferf\n'))
    def test_mock_admin_spell_check_history(self, subproc):
        """Mocks the spell check executable and tests that the admin view of spell check history renders correctly. Also tests that an admin can query other users' histories."""  # noqa: E501
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")
        # Setup mocks
        subproc.return_value = unittest.mock.MagicMock()
        # Submit some text to spell checker
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        inputtext = " Some incorrect words "
        misspelled_words = "flkfkef lkferf"
        inputtext += misspelled_words
        response = self.spell_check_text(inputtext=inputtext, csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        self.assertTrue(subproc.called)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('p', id="textout")
        self.assertEqual(inputtext, results.text.replace("'", ''))
        results = soup.find('p', id="misspelled")
        misspelled_words_out = results.text.split(", ")
        self.assertEqual(len(misspelled_words_out), 2, "Did not return expected number of misspelled words")
        for word in misspelled_words_out:
            self.assertTrue(word in misspelled_words, "Expected misspelled word not returned.")
        # Logout
        response = self.logout()
        self.assertEqual(response.status_code, 200)
        # Login as the default admin
        response = self.app.get('/login', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='replaceme', pword='replaceme', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        # The admin's own history should be empty
        response = self.app.get('/history', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('h3', id="numqueries")
        self.assertEqual("0", results.text)
        # Query another user's history
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        username_to_query = "temp1234"
        response = self.query_user_spell_history(userquery=username_to_query, csrf_token=csrf_token)
        soup = beautifulsoup(response.data, 'html.parser')
        self.assertEqual(response.status_code, 200)
        results = soup.find('h3', id="numqueries")
        self.assertEqual("1", results.text)
        results = soup.find('a', id="query1")
        self.assertEqual("Query 1", results.text)
        results = soup.find('h2')
        self.assertTrue('temp1234' in results.text)

    @patch('subprocess.Popen')
    @patch('tempfile.TemporaryFile', unittest.mock.mock_open(read_data=b'flkfkef\nlkferf\n'))
    def test_mock_spell_check_query(self, subproc):
        """Mocks the spell check executable and tests that a spell check query page is created and renders correctly."""
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")
        # Setup mocks
        subproc.return_value = unittest.mock.MagicMock()
        # Submit some text to spell checker
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        inputtext = " Some incorrect words "
        misspelled_words = "flkfkef lkferf"
        inputtext += misspelled_words
        response = self.spell_check_text(inputtext=inputtext, csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        self.assertTrue(subproc.called)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('p', id="textout")
        self.assertEqual(inputtext, results.text.replace("'", ''))
        results = soup.find('p', id="misspelled")
        misspelled_words_out = results.text.split(", ")
        self.assertEqual(len(misspelled_words_out), 2, "Did not return expected number of misspelled words")
        for word in misspelled_words_out:
            self.assertTrue(word in misspelled_words, "Expected misspelled word not returned.")
        # Now test that the spell check history page stored our query.
        response = self.app.get('/history', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('h3', id="numqueries")
        self.assertEqual("1", results.text)
        results = soup.find('a', id="query1")
        query_url = results['href']
        self.assertEqual("Query 1", results.text)
        # Now test that the url was generated and we can access it.
        response = self.app.get(query_url, follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('td', id="querytext")
        self.assertEqual(inputtext, results.text.replace("'", ''))
        results = soup.find('td', id="queryresults")
        misspelled_words_out = results.text.split(", ")
        self.assertEqual(len(misspelled_words_out), 2, "Query result page did not store expected number of misspelled words")
        for word in misspelled_words_out:
            self.assertTrue(word in misspelled_words, "Expected misspelled word not stored in query page.")

    @unittest.skipIf(((not pathlib.Path(spellcheck_path).exists()) or (not pathlib.Path(wordlist_path).exists())), 'Spellcheck executable or wordlist not in appropriate path.')  # noqa: E501
    def test_spell_check_basic_input(self):
        """Tests that a spell check submission of correctly spelled words returns an element with id 'no_misspelled'."""
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")
        # Submit some text to spell checker
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        inputtext = " Some correct words"
        response = self.spell_check_text(inputtext=inputtext, csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('p', id="textout")
        self.assertTrue(inputtext in results.text)
        results = soup.find('p', id="no_misspelled")
        self.assertTrue(results)

    @unittest.skipIf(((not pathlib.Path(spellcheck_path).exists()) or (not pathlib.Path(wordlist_path).exists())), 'Spellcheck executable or wordlist not in appropriate path.')  # noqa: E501
    def test_spell_check_basic_misspelled_input(self):
        """Tests that a spell check submission containing misspelled words returns the list of misspelled words."""
        # Register a user
        response = self.app.get('/register', follow_redirects=True)
        soup = beautifulsoup(response.data, 'html.parser')
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.register(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='success')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Registration success" in s.text for s in results))
        # Login as a user
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        response = self.login(uname='temp1234', pword='temp1234', csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all(id='result')
        self.assertGreater(len(results), 0, "No flash messages received")
        self.assertTrue(any("Login success" in s.text for s in results))
        # Now get the spell_check page
        response = self.app.get('/spell_check', follow_redirects=True)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find_all('title')
        self.assertTrue(any(("Spell Checker - Submission" in s.text) for s in results))
        self.assertGreater(len(soup.find_all('textarea', id="inputtext")), 0, "No textarea with id 'inputtext' found.")
        # Submit some text to spell checker
        csrf_token = soup.find_all('input', id='csrf_token')[0]['value']
        inputtext = " Some incorrect words "
        misspelled_words = "flkfkef lkferf"
        inputtext += misspelled_words
        response = self.spell_check_text(inputtext=inputtext, csrf_token=csrf_token)
        self.assertEqual(response.status_code, 200)
        soup = beautifulsoup(response.data, 'html.parser')
        results = soup.find('p', id="textout")
        self.assertEqual(inputtext, results.text.replace("'", ''))
        results = soup.find('p', id="misspelled")
        misspelled_words_out = results.text.split(", ")
        self.assertEqual(len(misspelled_words_out), 2, "Did not return expected number of misspelled words")
        for word in misspelled_words_out:
            self.assertTrue(word in misspelled_words, "Expected misspelled word not returned.")


if __name__ == '__main__':
    unittest.main()